The amorality of Web 2.0

This post, along with seventy-eight others, is collected in the book Utopia Is Creepy.

From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the physical world, the Web presents a readymade Promised Land. On the Internet, we’re all bodiless, symbols speaking to symbols in symbols. The early texts of Web metaphysics, many written by thinkers associated with or influenced by the post-60s New Age movement, are rich with a sense of impending spiritual release; they describe the passage into the cyber world as a process of personal and communal unshackling, a journey that frees us from traditional constraints on our intelligence, our communities, our meager physical selves. We become free-floating netizens in a more enlightened, almost angelic, realm.

But as the Web matured during the late 1990s, the dreams of a digital awakening went unfulfilled. The Net turned out to be more about commerce than consciousness, more a mall than a commune. And when the new millennium arrived, it brought not a new age but a dispiritingly commonplace popping of a bubble of earthly greed. Somewhere along the way, the moneychangers had taken over the temple. The Internet had transformed many things, but it had not transformed us. We were the same as ever.

The New New Age

But the yearning for a higher consciousness didn’t burst with the bubble. Web 1.0 may have turned out to be spiritual vaporware, but now we have the hyper-hyped upgrade: Web 2.0. In a profile of Internet savant Tim O’Reilly in the current issue of Wired, Steven Levy writes that “the idea of collective consciousness is becoming manifest in the Internet.” He quotes O’Reilly: “The Internet today is so much an echo of what we were talking about at [New Age HQ] Esalen in the ’70s – except we didn’t know it would be technology-mediated.” Levy then asks, “Could it be that the Internet – or what O’Reilly calls Web 2.0 – is really the successor to the human potential movement?”

Levy’s article appears in the afterglow of Kevin Kelly’s sweeping “We Are the Web” in Wired’s August issue. Kelly, erstwhile prophet of the Long Boom, surveys the development of the World Wide Web, from the Netscape IPO ten years ago, and concludes that it has become a “magic window” that provides a “spookily godlike” perspective on existence. “I doubt angels have a better view of humanity,” he writes.

But that’s only the beginning. In the future, according to Kelly, the Web will grant us not only the vision of gods but also their power. The Web is becoming “the OS for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies but our minds … We will live inside this thing.”

The revelation continues:

There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

You and I are alive at this moment.

We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.

Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.

This isn’t the language of exposition. It’s the language of rapture.

The Cult of the Amateur

Now, lest you dismiss me as a mere cynic, if not a fallen angel, let me make clear that I’m all for seeking transcendence, whether it’s by going to church or living in a hut in the woods or sitting at the feet of the Maharishi or gazing into the glittering pixels of an LCD screen. One gathers one’s manna where one finds it. And if there’s a higher consciousness to be found, then by all means let’s get elevated. My problem is this: When we view the Web in religious terms, when we imbue it with our personal yearning for transcendence, we can no longer see it objectively. By necessity, we have to look at the Internet as a moral force, not as a simple collection of inanimate hardware and software. No decent person wants to worship an amoral conglomeration of technology.

And so all the things that Web 2.0 represents – participation, collectivism, virtual communities, amateurism – become unarguably good things, things to be nurtured and applauded, emblems of progress toward a more enlightened state. But is it really so? Is there a counterargument to be made? Might, on balance, the practical effect of Web 2.0 on society and culture be bad, not good? To see Web 2.0 as a moral force is to turn a deaf ear to such questions.

Let me bring the discussion down to a brass tack. If you read anything about Web 2.0, you’ll inevitably find praise heaped upon Wikipedia as a glorious manifestation of “the age of participation.” Wikipedia is an open-source encyclopedia; anyone who wants to contribute can add an entry or edit an existing one. O’Reilly, in a new essay on Web 2.0, says that Wikipedia marks “a profound change in the dynamics of content creation” – a leap beyond the Web 1.0 model of Britannica Online. To Kevin Kelly, Wikipedia shows how the Web is allowing us to pool our individual brains into a great collective mind. It’s a harbinger of the Machine.

In theory, Wikipedia is a beautiful thing – it has to be a beautiful thing if the Web is leading us to a higher consciousness. In reality, though, Wikipedia isn’t very good at all. Certainly, it’s useful – I regularly consult it to get a quick gloss on a subject. But at a factual level it’s unreliable, and the writing is often appalling. I wouldn’t depend on it as a source, and I certainly wouldn’t recommend it to a student writing a research paper.

Take, for instance, this section from Wikipedia’s biography of Bill Gates, excerpted verbatim:

Gates married Melinda French on January 1, 1994. They have three children, Jennifer Katharine Gates (born April 26, 1996), Rory John Gates (born May 23, 1999) and Phoebe Adele Gates (born September 14, 2002).

In 1994, Gates acquired the Codex Leicester, a collection of writings by Leonardo da Vinci; as of 2003 it was on display at the Seattle Art Museum.

In 1997, Gates was the victim of a bizarre extortion plot by Chicago resident Adam Quinn Pletcher. Gates testified at the subsequent trial. Pletcher was convicted and sentenced in July 1998 to six years in prison. In February 1998 Gates was attacked by Noël Godin with a cream pie. In July 2005, he solicited the services of famed lawyer Hesham Foda.

According to Forbes, Gates contributed money to the 2004 presidential campaign of George W. Bush. According to the Center for Responsive Politics, Gates is cited as having contributed at least $33,335 to over 50 political campaigns during the 2004 election cycle.

Excuse me for stating the obvious, but this is garbage, an incoherent hodge-podge of dubious factoids (who the heck is “famed lawyer Hesham Foda”?) that adds up to something far less than the sum of its parts.

Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim:

Her nickname as a youth—Lady Jane—was one she reportedly disliked. She traveled to Communist Russia in 1964 and was impressed by the people, who welcomed her warmly as Henry’s daughter. In the mid-1960s she bought a farm outside of Paris, had it renovated and personally started a garden. She visited Andy Warhol’s Factory in 1966. About her 1971 Oscar win, her father Henry said: “How in hell would you like to have been in this business as long as I and have one of your kids win an Oscar before you do?” Jane was on the cover of Life magazine, March 29, 1968.

While early she had grown both distant from and critical of her father for much of her young life, in 1980, she bought the play “On Golden Pond” for the purpose of acting alongside her father—hoping he might win the Oscar that had eluded him throughout his career. He won, and when she accepted the Oscar on his behalf, she said it was “the happiest night of my life.” Director and first husband Roger Vadim once said about her: “Living with Jane was difficult in the beginning … she had so many, how do you say, ‘bachelor habits.’ Too much organization. Time is her enemy. She cannot relax. Always there is something to do.” Vadim also said, “There is also in Jane a basic wish to carry things to the limit.”

This is worse than bad, and it is, unfortunately, representative of the slipshod quality of much of Wikipedia. Remember, this emanation of collective intelligence is not just a couple of months old. It’s been around for nearly five years and has been worked over by many thousands of diligent contributors. At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. When will the great Wikipedia get good? Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?

The promoters of Web 2.0 venerate the amateur and distrust the professional. We see it in their unalloyed praise of Wikipedia, and we see it in their worship of open-source software and myriad other examples of democratic creativity. Perhaps nowhere, though, is their love of amateurism so apparent as in their promotion of blogging as an alternative to what they call “the mainstream media.” Here’s O’Reilly: “While mainstream media may see individual blogs as competitors, what is really unnerving is that the competition is with the blogosphere as a whole. This is not just a competition between sites, but a competition between business models. The world of Web 2.0 is also the world of what Dan Gillmor calls ‘we, the media,’ a world in which ‘the former audience,’ not a few people in a back room, decides what’s important.”

I’m all for blogs and blogging. (I’m writing this, ain’t I?) But I’m not blind to the limitations and the flaws of the blogosphere – its superficiality, its emphasis on opinion over reporting, its echolalia, its tendency to reinforce rather than challenge ideological extremism and segregation. Now, all the same criticisms can (and should) be hurled at segments of the mainstream media. And yet, at its best, the mainstream media is able to do things that are different from – and, yes, more important than – what bloggers can do. Those despised “people in a back room” can fund in-depth reporting and research. They can underwrite projects that can take months or years to reach fruition – or that may fail altogether. They can hire and pay talented people who would not be able to survive as sole proprietors on the Internet. They can employ editors and proofreaders and other unsung protectors of quality work. They can place, with equal weight, opposing ideologies on the same page. Forced to choose between reading blogs and subscribing to, say, the New York Times, the Financial Times, the Atlantic, and the Economist, I will choose the latter. I will take the professionals over the amateurs.

But I don’t want to be forced to make that choice.

Scary Economics

And so, having gone on for so long, I at long last come to my point. The Internet is changing the economics of creative work – or, to put it more broadly, the economics of culture – and it’s doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it’s created by amateurs rather than professionals, it’s free. And free trumps quality all the time. So what happens to those poor saps who write encyclopedias for a living? They wither and die. The same thing happens when blogs and other free on-line content go up against old-fashioned newspapers and magazines. Of course the mainstream media sees the blogosphere as a competitor. It is a competitor. And, given the economics of the competition, it may well turn out to be a superior competitor. The layoffs we’ve recently seen at major newspapers may just be the beginning, and those layoffs should be cause not for self-satisfied snickering but for despair. Implicit in the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for one can’t imagine anything more frightening.

In “We Are the Web,” Kelly writes that “because of the ease of creation and dissemination, online culture is the culture.” I hope he’s wrong, but I fear he’s right – or will come to be right.

Like it or not, Web 2.0, like Web 1.0, is amoral. It’s a set of technologies – a machine, not a Machine – that alters the forms and economics of production and consumption. It doesn’t care whether its consequences are good or bad. It doesn’t care whether it brings us to a higher consciousness or a lower one. It doesn’t care whether it burnishes our culture or dulls it. It doesn’t care whether it leads us into a golden age or a dark one. So let’s can the millennialist rhetoric and see the thing for what it is, not what we wish it would be.

193 thoughts on “The amorality of Web 2.0”

  1. angry_squirrel

    Congratulations, you have presented the most long-winded defense of traditional liberal media I have yet seen in a blog. Although not all traditional media is left-biased, you cite those that are, and you feel sad about a revolution that many of us open-minded libertarians are elated about. Not all bloggers negate facts and research, and most Wikipedia entries are of useful quality. But your story cries out “my heart aches for the days of media control and bias,” and for you I present the world’s smallest violin.

    Read our blogs and weep!

  2. AccMan Pro

    Information Central

    I don’t usually have a lot of time for Andrew Orlowski but on this occasion I give him 100%. He’s penned a vicious vibrant analysis of Wikipedia and its quality. His point is well made and should serve as a warning to those enamoured of this technolog…

  3. Mark Evans

    Web 2.0 – Stop the Insanity

    Has the high-tech community lost its mind over Web 2.0? The growing roar over what Web 2.0 represents, whether it’s good or bad, whether new companies are being created just to flip, blah, blah, blah.

  4. | Michael Boyle's weblog

    Several links pointed me

    to Rough Type, a blog by Nicholas Carr in which you’ll find a good article about the new Web (aka Web 2.0), and in particular about Wikipedia: The amorality of Web 2.0. It’s mostly a good, if muddled, piece, and in the comments you can see the (many) e…

  5. Read/Write Web

    There is no cult of the amateur, Mr Carr

    Nicholas Carr, a real journalist, has a blog post that argues that Web 2.0 is amoral. That’s a relatively uninteresting academic argument though. Of more practical import is his rage against the "cult of the amateur". But I find it…

  6. The Integrative Stream

    Tech or Transcendence?

    Nicholas Carr has begun to make a nice career for himself calling spades spades and asking good questions about whether accepted orthodoxy in business and technology should be accepted quite so readily. He made a splash with a Harvard Business Review…

  7. Chris Tolles


    Nice piece — while there’s a lot to talk about with regard to the sanctimoniousness of many of the folks involved in the industry, I’d like to point out a couple of things around the economics here.

    Specifically, there are some antecedents to your thinking around the implications of good things being beaten by poorer-quality, cheaper competition in the “Worse is Better” essay by Richard Gabriel as well as Clay Christensen’s Innovator’s Dilemma.

    When my co-founders and I launched the Open Directory Project, the volunteer-created web directory, and put the entire directory out for free, it changed the economics of the industry. You couldn’t charge for a “better” directory – but it emerged that you could charge people to be *in* your directory (a la Yahoo and Looksmart).

    While I’m amused at the sharp stick you’re using to poke at the starry eyed idealists here, I don’t think the world that’s emerging is *worse* for it.

    Wikipedia ends up being more up to date than a traditional encyclopedia, not only cheaper. If you want the *real* scoop on the media industry, you read, ’cause they have the news that mainstream guys won’t have for weeks.

    And what about all the kum ba ya nonsense around these changes? Well, the idealism is part of the package here — and something you need to consider when you’re building and marketing products, or managing your career. If someone’s going to go out and harness the public to create a competitor, you might want to take it seriously. If every one of those people *believes* in what they’re doing, it is a force to be reckoned with, whether or not they are right, “good”, or “bad”. If they believe they are building *The* machine, it’s a very different amount of effort than if everyone thinks they’re working as part of a machine.

    Chris Tolles

    Open Directory Project

    VP Marketing

  8. William Pietri

    I’d agree that you should always keep an eye on utopians, and that technologies bear watching for what they are, not what we wish them to be. The telegraph and the television both prompted similarly rapturous visions. (For those interested in this, by the way, Tom Standage’s book The Victorian Internet is fascinating.)

    But I think beating up on Wikipedia is a little premature, for two reasons. First, I’m not sure the first edition of the Britannica, a copy of which I have next to me on the shelf, would stand up to that level of examination either. To me it seems limited, quirky, and a bit hodge-podge. The third edition, finished nearly 30 years later, was seven times the size. And even then one wonders how accurate and polished it was given that a single person still wrote “many of the scientific treatises and histories, and almost all minor articles.”

    And second, people carping about particular articles in Wikipedia would do well to remember what makes Wikipedia different: you too can fix it. I understand the author is making a broader point here, so perhaps he has an excuse. But most people I’ve heard fret or grumble about Wikipedia could have solved the problem in the amount of time they spent bellyaching.

  9. larry borsato

    The Web 2.0 Religion.

    The hottest story on the web today is Nicholas Carr’s post, The amorality of Web 2.0:Like it or not, Web 2.0, like Web 1.0, is amoral. It’s a set of technologies – a machine, not a Machine – that alters…

  10. Anonymous

    Eheh, the funny thing is how the Jane Fonda Wikipedia entry was completely rewritten after this blog post. Just go to the history and compare the version from October 3 with the current one. The full irony is that the modifications can be used by both sides of the debate equally well.

    I do find it disingenuous to offer as proof a comparison with a centennial encyclopedia that takes more years between revised editions than the wiki has been in existence.

    I also find grating the insistence on so-called amateurism. It’s just a framing word, chosen as rhetoric lovers are wont to do, so that the debate is waged on a battlefield advantageous to those pushing its use. If you accept the term, then Web 2.0 has already lost. Most people actually part of the movement prefer the term openness, which reflects that anyone, professional or not, can participate.

    There’s also the irony that many in-depth pieces found in the mainstream media are the work of independent journalists and freelancers. I guess, as per Nick’s argument, we should not trust them. They may, after all, not follow the NYT editorial line properly. ;-)


  11.

    5 Earth-Shattering Things You Should Know About Ajax And Probably Don’t

    Ajax is hotter right now than Nicholas Carr’s backside after coming out of Tim O’Reilly’s Web 2.0 woodshed.

  12. Anonymous

    The homogenous Web 2.0 “Ads by Goooooogle” featuring “contextually relevant” sales pitches for the likes of “The Enlightened Business” contradicts the arguments presented.

  13. Gotta Love It

    Stairway to Heaven

    I just got around to seeing Nicholas Carr’s blog entry of a couple weeks ago, “The amorality of Web 2.0”. It provides an interesting counter-point to the Web 2.0 love fest that has swept through the blog-world recently. I would…

  14. charlest

    I fail to see what is wrong with the Wikipedia entries on Bill Gates and Jane Fonda. For the purposes of the average Joe wanting to know a little more about these personalities without needing an authoritative or comprehensive account, it does exactly what it says on the tin. What exactly is anyone’s objection to the entries? Tone, focus, intellectual snobbery?

  15. The PC Doctor

    Wikipedia has problems

    So finally Wikipedia admits to quality problems. And serious quality problems at that.

    About time. While I admire the concept behind Wikipedia, I can’t help but feel that the project went seriously astray years ago and at…

  16. pitsch

    Go further, and require each of them to make a contribution: you will see how many things are still missing, and you will be obliged to get the assistance of a large number of men who belong to different classes, priceless men, but to whom the gates of the academies are nonetheless closed because of their social station. All the members of these learned societies are more than is needed for a single object of human science; all the societies together are not sufficient for a science of man in general.

    Denis Diderot (~1777) Encyclopédie, Article on


  17. IB Weblog

    wikipedia | web 2.0, with quality problems

    This is being talked about:

    “This is worse than bad, and it is, unfortunately, representative of the slipshod quality of much of Wikipedia.” The reference is to the entry on Jane Fonda in the American Wikipedia, which appalled Nicholas G. Carr, who…

  18. Jud

    The general theme of the piece is, it seems to me, inarguable. Summarized and oversimplified quite a lot, the message is essentially that there is no special magic in open collaboration as a working method. I.e., you can have 100 monkeys clattering away on keyboards in an open environment, and mathematics may tell you that they will eventually produce the collected works of Shakespeare, but it takes less time and less trash is produced if you have Shakespeare do it.

    Of course Wikipedia contributors are quite a bit smarter than monkeys and Britannica authors aren’t Shakespeares, but the general principle holds – garbage in, garbage out, whether produced in an open collaborative environment or a closed proprietary one.

    There are arguments on each side maintaining that the open collaborative process or the closed proprietary process guarantees to an extent against GIGO. These arguments are flawed. The basic flaw can be expressed in a very general way by this “Dragnet” snippet reproduced from my unreliable memory (the dialogue may not be quoted with entire accuracy, but the general idea is there):

    Utopian Kid: “We’re trying to build a perfect world!”

    Sgt. Joe Friday: “Can’t have a perfect world.”

    UK: “Why not?!”

    SGF: “No perfect people.”

    Advocates of the open collaborative method argue for what Mr. Carr refers to as “collective intelligence,” and has been referred to concerning Linux as the “many eyes” principle – that is, open collaboration ensures that there will be many eyes to see and fix any mistakes; therefore, even if there is GI, it will be caught before it has been O for too long. However, mere open collaboration doesn’t ensure the quality of the “eyes,” so who knows whether they will be interested enough in the subject to look at the GI in the first place; to know it is G in the second place; and have the expertise to fix it, or at least know the right person to do so, in the third place?

    Advocates of the closed proprietary method argue, explicitly or implicitly, that people paying for expertise want to make sure they get value for it, whether they are hiring people to produce content or paying for its use. Unfortunately, nearly all of us are likely to be able to cite examples of people we hired, even after research, references, etc., who produced bad work, and items we purchased that didn’t do what they were supposed to.

    Another problem inherent in the production of any content, whether by the open collaborative or closed proprietary method, paradoxically most affects precisely those issues that are most complex and critical. A general description can be found in this excerpt from a message to a mailing list written by Mr. Poul-Henning Kamp, describing the origin of the term “bikeshedding”:

    “[C. Northcote] Parkinson shows how you can go into the board of directors and get approval for building a multi-million or even billion dollar atomic power plant, but if you want to build a bike shed you will be tangled up in endless discussions.

    “Parkinson explains that this is because an atomic plant is so vast, so expensive and so complicated that people cannot grasp it, and rather than try, they fall back on the assumption that somebody else checked all the details before it got this far. Richard P. Feynman gives a couple of interesting, and very much to the point, examples relating to Los Alamos in his books.

    “A bike shed on the other hand. Anyone can build one of those over a weekend, and still have time to watch the game on TV. So no matter how well prepared, no matter how reasonable you are with your proposal, somebody will seize the chance to show that he is doing his job, that he is paying attention, that he is here.”

    Kamp was discussing problems with a change he wanted to make in the source code of FreeBSD, a computer operating system that is produced through a relatively open collaborative method. (Anyone can look at the source code and submit requested changes; the change requests are vetted and implemented or not by a smaller group of “committers.”) Decisions regarding the building of atomic power plants for the most part are, and decisions regarding the Manhattan Project certainly were, made by closed groups. Both systems are equally subject to the tendency to leave the really big complicated stuff to “the experts” (and to assume, perhaps erroneously, that the experts have in fact looked at these issues), while spending lots of time discussing smaller stuff that may be less important, but hey, at least everyone gets the chance to express an opinion.

    I don’t think anyone can reasonably object to the point that open collaboration *by itself* is no guarantor of quality – but neither is the closed proprietary method. One needn’t think very long and hard before coming up with a number of examples of supposed factual content produced by very august proprietary shops that turned out to be simply made-up (e.g., fairly recent scandals at the Washington Post and New York Times).

    So are the proprietary outlets indeed doomed by competition from free collaborative sources, especially since there is no inherent guarantee of quality in either method of producing information?

    I think the answer will be determined by how rapidly each adapts to the market for reliable information. Parents whose children receive failing grades on reports written from Wikipedia information (making the not-necessarily-justified assumptions that Wikipedia was inaccurate and teachers caught the errors), or other parents who hear of such stories, may well be inspired to shell out for the Britannica. Of course, any ensuing success for the Britannica will be short-lived if the Britannica turns out to have errors itself (again assuming teachers capable of catching them). As discussed above, I don’t think we can assume in the first instance that either Wikipedia or the Britannica will be relatively free from errors;* all we can say is that their respective chances for success depend on whether Wikipedia offers reliable information, and whether Britannica offers information that is sufficiently more reliable than Wikipedia to justify its price.

    *I imagine that I hear you saying, “But the jobs of the Britannica folks depend on this!” Yep. Did lives depend on decisions made at Los Alamos?

  19. phil jones


    “I would argue, in fact, that the overall quality of an encyclopedia is best judged by its weakest entries rather than its best. What’s the worth of an unreliable reference work?”

    “Now, all the same criticisms can (and should) be hurled at segments of the mainstream media. And yet, at its best, the mainstream media is able to do things that are different from – and, yes, more important than – what bloggers can do.”

    So why should we judge the mainstream by its best, but the amateurs by their weakest?

    Sure, mainstream media *can* produce quality, well researched material, tending away from extremism. But so can blogs.

    You want to argue a stronger case: that it’s more likely for a medium *funded* by people buying content (or by advertisers) to produce quality than a pack of amateurs.

    I wouldn’t be so sure:

    If there’s an argument to be made that the market for information is better than the gift economy, it’s that readers/customers *recognise* quality and switch their custom and attention to it.

    But it’s increasingly looking as though mainstream media has discovered that it doesn’t really need to produce much more than sensationalism, opinion and loose facts in order to win customers.

    And if you’re going to base your faith in what customers recognise anyway, then why not cut out the middleman and trust the readers to recognise it on Wikipedia and in the blogosphere? Are the blogs any worse than the media in that hard-to-measure but most important metric, “attention paid quality”?

  20. Der Haken

    Geek Reading: Web 2.0

    […] In the eyes of the Rough Type, Web 2.0 stands for participation, collectivism, virtual communities, amateurism. This is basically what I think Web 2.0 represents. His article is very long as well, but gives you a whole lot of insight as well as…

Comments are closed.