The amorality of Web 2.0

From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the physical world, the Web presents a readymade Promised Land. On the Internet, we’re all bodiless, symbols speaking to symbols in symbols. The early texts of Web metaphysics, many written by thinkers associated with or influenced by the post-60s New Age movement, are rich with a sense of impending spiritual release; they describe the passage into the cyber world as a process of personal and communal unshackling, a journey that frees us from traditional constraints on our intelligence, our communities, our meager physical selves. We become free-floating netizens in a more enlightened, almost angelic, realm.

But as the Web matured during the late 1990s, the dreams of a digital awakening went unfulfilled. The Net turned out to be more about commerce than consciousness, more a mall than a commune. And when the new millennium arrived, it brought not a new age but a dispiritingly commonplace popping of a bubble of earthly greed. Somewhere along the way, the moneychangers had taken over the temple. The Internet had transformed many things, but it had not transformed us. We were the same as ever.

The New New Age

But the yearning for a higher consciousness didn’t burst with the bubble. Web 1.0 may have turned out to be spiritual vaporware, but now we have the hyper-hyped upgrade: Web 2.0. In a profile of Internet savant Tim O’Reilly in the current issue of Wired, Steven Levy writes that “the idea of collective consciousness is becoming manifest in the Internet.” He quotes O’Reilly: “The Internet today is so much an echo of what we were talking about at [New Age HQ] Esalen in the ’70s – except we didn’t know it would be technology-mediated.” Levy then asks, “Could it be that the Internet – or what O’Reilly calls Web 2.0 – is really the successor to the human potential movement?”

Levy’s article appears in the afterglow of Kevin Kelly’s sweeping “We Are the Web” in Wired’s August issue. Kelly, erstwhile prophet of the Long Boom, surveys the development of the World Wide Web, from the Netscape IPO ten years ago, and concludes that it has become a “magic window” that provides a “spookily godlike” perspective on existence. “I doubt angels have a better view of humanity,” he writes.

But that’s only the beginning. In the future, according to Kelly, the Web will grant us not only the vision of gods but also their power. The Web is becoming “the OS for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies but our minds … We will live inside this thing.”

The revelation continues:

There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

You and I are alive at this moment.

We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.

Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.

This isn’t the language of exposition. It’s the language of rapture.

The Cult of the Amateur

Now, lest you dismiss me as a mere cynic, if not a fallen angel, let me make clear that I’m all for seeking transcendence, whether it’s by going to church or living in a hut in the woods or sitting at the feet of the Maharishi or gazing into the glittering pixels of an LCD screen. One gathers one’s manna where one finds it. And if there’s a higher consciousness to be found, then by all means let’s get elevated. My problem is this: When we view the Web in religious terms, when we imbue it with our personal yearning for transcendence, we can no longer see it objectively. By necessity, we have to look at the Internet as a moral force, not as a simple collection of inanimate hardware and software. No decent person wants to worship an amoral conglomeration of technology.

And so all the things that Web 2.0 represents – participation, collectivism, virtual communities, amateurism – become unarguably good things, things to be nurtured and applauded, emblems of progress toward a more enlightened state. But is it really so? Is there a counterargument to be made? Might, on balance, the practical effect of Web 2.0 on society and culture be bad, not good? To see Web 2.0 as a moral force is to turn a deaf ear to such questions.

Let me bring the discussion down to a brass tack. If you read anything about Web 2.0, you’ll inevitably find praise heaped upon Wikipedia as a glorious manifestation of “the age of participation.” Wikipedia is an open-source encyclopedia; anyone who wants to contribute can add an entry or edit an existing one. O’Reilly, in a new essay on Web 2.0, says that Wikipedia marks “a profound change in the dynamics of content creation” – a leap beyond the Web 1.0 model of Britannica Online. To Kevin Kelly, Wikipedia shows how the Web is allowing us to pool our individual brains into a great collective mind. It’s a harbinger of the Machine.

In theory, Wikipedia is a beautiful thing – it has to be a beautiful thing if the Web is leading us to a higher consciousness. In reality, though, Wikipedia isn’t very good at all. Certainly, it’s useful – I regularly consult it to get a quick gloss on a subject. But at a factual level it’s unreliable, and the writing is often appalling. I wouldn’t depend on it as a source, and I certainly wouldn’t recommend it to a student writing a research paper.

Take, for instance, this section from Wikipedia’s biography of Bill Gates, excerpted verbatim:

Gates married Melinda French on January 1, 1994. They have three children, Jennifer Katharine Gates (born April 26, 1996), Rory John Gates (born May 23, 1999) and Phoebe Adele Gates (born September 14, 2002).

In 1994, Gates acquired the Codex Leicester, a collection of writings by Leonardo da Vinci; as of 2003 it was on display at the Seattle Art Museum.

In 1997, Gates was the victim of a bizarre extortion plot by Chicago resident Adam Quinn Pletcher. Gates testified at the subsequent trial. Pletcher was convicted and sentenced in July 1998 to six years in prison. In February 1998 Gates was attacked by Noël Godin with a cream pie. In July 2005, he solicited the services of famed lawyer Hesham Foda.

According to Forbes, Gates contributed money to the 2004 presidential campaign of George W. Bush. According to the Center for Responsive Politics, Gates is cited as having contributed at least $33,335 to over 50 political campaigns during the 2004 election cycle.

Excuse me for stating the obvious, but this is garbage, an incoherent hodge-podge of dubious factoids (who the heck is “famed lawyer Hesham Foda”?) that adds up to something far less than the sum of its parts.

Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim:

Her nickname as a youth—Lady Jane—was one she reportedly disliked. She traveled to Communist Russia in 1964 and was impressed by the people, who welcomed her warmly as Henry’s daughter. In the mid-1960s she bought a farm outside of Paris, had it renovated and personally started a garden. She visited Andy Warhol’s Factory in 1966. About her 1971 Oscar win, her father Henry said: “How in hell would you like to have been in this business as long as I and have one of your kids win an Oscar before you do?” Jane was on the cover of Life magazine, March 29, 1968.

While early she had grown both distant from and critical of her father for much of her young life, in 1980, she bought the play “On Golden Pond” for the purpose of acting alongside her father—hoping he might win the Oscar that had eluded him throughout his career. He won, and when she accepted the Oscar on his behalf, she said it was “the happiest night of my life.” Director and first husband Roger Vadim once said about her: “Living with Jane was difficult in the beginning … she had so many, how do you say, ‘bachelor habits.’ Too much organization. Time is her enemy. She cannot relax. Always there is something to do.” Vadim also said, “There is also in Jane a basic wish to carry things to the limit.”

This is worse than bad, and it is, unfortunately, representative of the slipshod quality of much of Wikipedia. Remember, this emanation of collective intelligence is not just a couple of months old. It’s been around for nearly five years and has been worked over by many thousands of diligent contributors. At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. When will the great Wikipedia get good? Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?

The promoters of Web 2.0 venerate the amateur and distrust the professional. We see it in their unalloyed praise of Wikipedia, and we see it in their worship of open-source software and myriad other examples of democratic creativity. Perhaps nowhere, though, is their love of amateurism so apparent as in their promotion of blogging as an alternative to what they call “the mainstream media.” Here’s O’Reilly: “While mainstream media may see individual blogs as competitors, what is really unnerving is that the competition is with the blogosphere as a whole. This is not just a competition between sites, but a competition between business models. The world of Web 2.0 is also the world of what Dan Gillmor calls ‘we, the media,’ a world in which ‘the former audience,’ not a few people in a back room, decides what’s important.”

I’m all for blogs and blogging. (I’m writing this, ain’t I?) But I’m not blind to the limitations and the flaws of the blogosphere – its superficiality, its emphasis on opinion over reporting, its echolalia, its tendency to reinforce rather than challenge ideological extremism and segregation. Now, all the same criticisms can (and should) be hurled at segments of the mainstream media. And yet, at its best, the mainstream media is able to do things that are different from – and, yes, more important than – what bloggers can do. Those despised “people in a back room” can fund in-depth reporting and research. They can underwrite projects that can take months or years to reach fruition – or that may fail altogether. They can hire and pay talented people who would not be able to survive as sole proprietors on the Internet. They can employ editors and proofreaders and other unsung protectors of quality work. They can place, with equal weight, opposing ideologies on the same page. Forced to choose between reading blogs and subscribing to, say, the New York Times, the Financial Times, the Atlantic, and the Economist, I will choose the latter. I will take the professionals over the amateurs.

But I don’t want to be forced to make that choice.

Scary Economics

And so, having gone on for so long, I at long last come to my point. The Internet is changing the economics of creative work – or, to put it more broadly, the economics of culture – and it’s doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it’s created by amateurs rather than professionals, it’s free. And free trumps quality all the time. So what happens to those poor saps who write encyclopedias for a living? They wither and die. The same thing happens when blogs and other free on-line content go up against old-fashioned newspapers and magazines. Of course the mainstream media sees the blogosphere as a competitor. It is a competitor. And, given the economics of the competition, it may well turn out to be a superior competitor. The layoffs we’ve recently seen at major newspapers may just be the beginning, and those layoffs should be cause not for self-satisfied snickering but for despair. Implicit in the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for one can’t imagine anything more frightening.

In “We Are the Web,” Kelly writes that “because of the ease of creation and dissemination, online culture is the culture.” I hope he’s wrong, but I fear he’s right – or will come to be right.

Like it or not, Web 2.0, like Web 1.0, is amoral. It’s a set of technologies – a machine, not a Machine – that alters the forms and economics of production and consumption. It doesn’t care whether its consequences are good or bad. It doesn’t care whether it brings us to a higher consciousness or a lower one. It doesn’t care whether it burnishes our culture or dulls it. It doesn’t care whether it leads us into a golden age or a dark one. So let’s can the millennialist rhetoric and see the thing for what it is, not what we wish it would be.


193 Responses to The amorality of Web 2.0

  1. John Gauntt

    If you think about it, those who would infuse some designed progress toward perfection in Web 2.0 of TCP/IP have something in common with the Dover, PA school board who want to legislate positive design and purpose in the much older web of DNA/RNA. I agree that the web is amoral. And I believe that is its virtue.

  2. ordaj

    “They can employ editors and proofreaders and other unsung protectors of quality work.”

    This is one of the main problems with software today, free or proprietary. A lot of it gets slapped together and rushed to release, riddled with bugs and quality issues – and don’t even get me started on usability issues.

    Where’s the fire?

  3. Wayne

    “I agree that the web is amoral. And I believe that is its virtue.”

    Hear, hear.

    This process will claim some of the “unsung protectors” as it were, some of whom deserve better. When, really, has it ever been different?

    But in its amorality, this medium will also uncover that person who, BUT FOR the right credentials, might have contributed mightily with his or her ideas to the betterment of us all. It’s as close to color blind as society gets.

    The point is not found in all the drek. It’s finding the jewel in the trash.

  4. A superb post by Nick!

    ‘Echolalia’ — there’s a lookup word. I used Dictionary.com.

    Moral or otherwise, the Web 1 or 2 or whathaveyou is nice. I learn from it. I add to it. But I agree that it’s as dangerous, stupid, sad or coherent as one wants it to be.

    Kelly is a fuzz-brain. Tim is not.

  5. Shouvik

    Open Source and Wikipedia are not the same. And faith in open source does not mean shunning the professional for the amateur. How does Nick explain the high quality of the Apache Web Server and the robustness of PHP? Wikipedia is a mess, I agree, but that is due to its own model.

    Let me raise something else here. Skills and professions are not things that existed only because the Web (1.0, 2.0 or X.X) was not there, and they are not things that will cease to exist when the Web is ubiquitous. The human race found that there are classes of people who can do certain jobs better than the rest, or who are chosen to specialize in a job. That is the reason why some people became carpenters, some soldiers and some priests. If the job performed by specialists becomes too easy, that specialization goes away. Economic reward is proportional to the need for specialization.

    The Web does not make people expert content creators. That specialization will remain, and people will get paid for it. People will not go for a fairy-tale Wikipedia; they will always go for Harry Potter.

    Layoffs in the media houses may be explained as a shrinking necessity for average content creators. Quality will always be rewarded economically.

  6. Nick

    Shouvik, I didn’t mean to imply (and I’m sorry if I did) that open source software is of poor quality. (I’ve written often of the critical importance of open source to the future of IT.) I was simply saying that the veneration of open source efforts, often at the expense of traditional for-pay software development, is one manifestation of the cult of the amateur.

    As for your claim that “quality will always be rewarded economically,” I think you’re being much too complacent.

  7. Asay: ‘The Amorality of Web 2.0’ (Nick Carr)

    It’s so hard to find intelligent contrarians these days. I experienced that firsthand today and yesterday at LinuxWorld UK, where you were cheered for saying inane but popular things like, “My dream is to bless the world with Linux desktops…

  8. It is always interesting to read exceptionally well-crafted content — especially when you don’t totally agree (or disagree) with the point of view of the author.

    I am on a team of graduate students at MIT who are looking at the “business side of Web 2.0”. As such, we are trying to determine if there really is a “there there”. The jury is still out … but at some level, it’s starting to feel like the 1990s again …

  9. Everything you’ve written here is a valid opinion, and commercial encyclopedias are doomed anyway because (as Microsoft is finding out with Linux) it’s hard to compete with free. (I eagerly await EB putting out TCO studies on Wikipedia.)

    Speaking as someone who’s highly involved in it (I write stuff, I’m an administrator, I’m on the Arbitration Committee, I’m a mailing list moderator, I do media interviews), Wikipedia is of mediocre quality with some really good bits. If you hit the “Random page” link twenty times, you’ll end up mostly with sketchy three-paragraph stub articles.

    That said, the good bits are fantastic. Although articles good enough to make “Featured Article” status (which are indeed excellent) tend to be hideously esoteric; somehow getting more general articles up to that sort of quality is not facilitated at present.

    Encyclopedia Britannica is an amazing work. It’s of consistent high quality, it’s one of the great books in the English language and it’s doomed. Brilliant but pricey has difficulty competing economically with free and apparently adequate (see http://en.wikipedia.org/wiki/Worse_is_better – this story plays out over and over again in the computing field and is the essence of “disruptive technology”). They could release the entire EB under an open content license, but they have shareholders who might want a word about that.

    So if we want a good encyclopedia in ten years, it’s going to have to be a good Wikipedia. So those who care about getting a good encyclopedia are going to have to work out how to make Wikipedia better, or there won’t be anything.

    I’ve made some efforts in this direction – pushing toward a page-rating feature, a “Rate this page” tab at the top, which, unlike an editorial committee, will actually scale with the contributor base and will highlight areas in need of attention. (See http://meta.wikimedia.org/wiki/Article_validation_feature and http://meta.wikimedia.org/wiki/En_validation_topics – the feature is currently waiting on an implementation the lead developer thinks won’t kill the database.) Recent discussion on the WikiEN-L mailing list has also included proposals for a scalable article rating system.

    Wikipedia is likely to be it by first-mover advantage and network effect. Think about what you can do to ensure there is a good encyclopedia in ten years.

  10. This is a very interesting piece. I think that your main point — that the Internet is not what it has been hailed or condemned as — is very true.

    I only wonder where the Web is headed. Perhaps it can only be as perfect as its creators.

    When you subscribe to any blog you know what you are getting into. Generally blogs are opinionated, with facts spilled in here and there. You do your own due diligence before coming to any conclusion. It is only when professionals start misrepresenting information that it becomes unethical. The case in point is the Fox News article on OpenDocument file formats, in which it raps a Massachusetts official. On its initial release, the article conveniently failed to disclose that it was sponsored by Microsoft. Cases like this lead people to cheer for open source at the expense of proprietary systems. I guess people prefer being amoral to being unethical.

  12. ta bu shi da yu

    The problem I have with your examples on Wikipedia is that you haven’t picked the best examples. If you review the featured articles, would you reach the same conclusion?

  13. Nick

    I chose the two entries I used (Gates and Fonda) at random, and they were the first two I looked at – I didn’t try to find the worst examples possible, in other words; I simply took the first two I went to. (I wanted to choose subjects that most people would have some familiarity with.) I have looked at a lot of other entries previously and since, and many are every bit as bad as the two I featured. You’re right, though, that there are very good entries, and I suppose I could have searched for a couple of those and featured them. But an encyclopedia can’t just have a small percentage of good entries and be considered a success. I would argue, in fact, that the overall quality of an encyclopedia is best judged by its weakest entries rather than its best. What’s the worth of an unreliable reference work?

  14. Do Ants Have Souls?

    Nick Carr writes a brilliant piece on people’s quasi-religious fervor over Web 2.0. He quotes from Kevin Kelly’s We Are The Web (this is from the last page of the article): There is only one time in the history of each

  15. “I would argue, in fact, that the overall quality of an encyclopedia is best judged by its weakest entries rather than its best. What’s the worth of an unreliable reference work?”

    Those are really two separate things. Given that Wikipedia lets you see inside the sausage factory, judge it by the results of “Random link” and articles about things you do know (as you did). On the second point, that too many articles are unreferenced is a problem we’re at work on – that’s actually a different problem than quality of writing and coverage, IMO.

  16. Gautam

    It’s strange … you write a blog to criticize Web 2.0, when blogs – a Web 2.0 application – are the reason you are heard and read over the Internet!

  17. Do computers make us smarter?

    With all the talk lately about the new net revolution, Web 2.0, and all of that (e.g., point and counter-point), it is interesting to throw some actual research into the mix. Lowell Monke’s recent article in Orion Magazine does just that.

    [R]e…

  18. The Cult of the Amateur – Really?

    Nicholas Carr has a post on the amateur nature of the collective consciousness of the Internet. It’s ironic. He points to the slipshod quality of much of Wikipedia to highlight the flaws of the blogosphere including echolalia, tendency to reinforce…

  19. JS

    “In theory, Wikipedia is a beautiful thing.”

    In theory? I thought that in theory it wouldn’t work at all…

  20. Ian Woollard

    I don’t know about this theory that Wikipedia is a heap of junk, arrived at by comparing it with the EB.

    I mean, does anyone *have* a copy of the EB from its fifth year? Who says that that was better than Wikipedia?

    I mean, aren’t we comparing a very new and very ambitious encyclopedia with a two-hundred-year-old encyclopedia, and expecting *rather* too much? As in, what gives?

  21. When Kevin Kelly turned up again, I knew that the resurgence of tech was real. Enough real value was being created to allow hucksters and frauds to make a living again.

    The question is, are we empowering the esoteric at the expense of the authoritative? Maybe the answer to that is yes, so far. Maybe that dream of authority was always a myth anyway. The Britannica was a status purchase for the middle class, unread and displayed on the shelves. Why should it be our yardstick?

    The web is a dreadfully imperfect tool. But let’s ask whether our technology serves human needs better, not whether it matches a previous dream of human knowledge.

  22. Kevin Kelly

    Nice piece! I don’t disagree with most of it. Except for the very last paragraph, which is really the only paragraph dealing with the point of your story. That the web, machines, and maybe even technology are not moral.

    I’ve changed my mind on this. I used to think technology was neutral — just a tool — you could use it for good or evil. Pretty standard belief for us nerds. But in spending the last three years trying to figure out what the greater meaning of technology is I’ve reluctantly concluded that technology is a moral force (for the good). I’ll need a whole book to make that argument (if I can) and that is what I am working on.

    But you have to agree it is an important and vital question. I hope you continue your investigation of it.

  23. Anonymous

    Does the EB even have entries for Bill Gates and Jane Fonda? Not that I think it should, but why are these metrics for comparing encyclopedic performance?

    And I’d answer my own question but the town’s library has been shut down and I don’t feel like spending $49.95 to find out. Seeing how the on-line edition only has 73,000 articles in it, I’m doubtful.

  24. The Cult of the Amateur

    Nicholas Carr has an interesting piece on the Web 2.0 phenomenon, the vision of the web as a sort of collective consciousness that will fundamentally change human culture, and even the very concept of human intelligence.

    This is a very interesting p…

    I think everyone in the Wikipedia community is trying very hard to make the quality “good,” as you say, and Wikipedia certainly responds to input such as this. You might be happy to know that both articles you mentioned have since been added to cleanup projects, in addition to broader discussions about ways to improve writing quality.

    At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?

    I certainly am not of the opinion that Wikipedia is some transcendent work beyond descriptions of good or bad, but I think this point might be looked at more closely. A work of whatever size that is edited and written over a period of time by a collection of people who, in all probability, have varying mastery of English will inevitably read like bad writing. It takes another person to come in and combine all the probably factually correct information into sentence structures that are pleasing to read; wikis sometimes call this refactoring. It is a difficult and time-consuming process, as you can imagine, but one Wikipedia is trying to make more appealing for editors.

  26. Why Wikipedia is not Web 2.0 and salvation might come even if it is amoral

    The argument, however nicely put, seems to twist things in order to give room to some ideas of yours, not to rebuff the apostles of an electronic collective consciousness.

    Your argument seems to go like this: People believe in an eventual transformation of the current information networks (by new organization principles) into something like a collective mind, which works by making individual (amateurish) contributions part of a greater mental structure (unlike classical culture, which apparently is about singular performances). The believers in that electronic transcendence have to subscribe to a view which makes the bits and pieces of the new system into a moral, normative agglomerate, and thus good. The epitome of that vision is Wikipedia. Wikipedia is not good, because it is unprofessional. Thus, the transcendence does not work and the believers are rebutted. (Forgive me my inaccuracies.)

    What is wrong with this argument of yours?

    1. People speculating about collective consciousness are a small minority among the web visionaries, and they are not generally identical with wikipedians.

    2. A collective mind vision does not necessarily entail a moral view. Even if it did, it would not mean that the individual aspects would have to embody “goodness”. The individual aspects have to realize suitability.

    3. Good in the sense of moral things is different from good in the sense of “professional quality”. In the case of Wikipedia, your claim should not be mixed with moral goodness, but should specifically apply to suitability for Wikipedia’s purpose (which is being an authoritative reference).

    4. You state that Wikipedia is not professional enough, and what you really mean is that it does not implement the proper principles to serve its purpose (good quality reference). The conclusion from this, however, is not that collaborative intelligence does not work, but that Wikipedia’s constraints do not result in the required qualities.

    5. The question stemming from the argument does not apply to an antagonism between professional and amateurish contribution, but whether the principles that pertain to professional work might be implemented “outside” the experts within a distributed, self-organizing information structure. This is indeed an interesting question, which might lead us even into a Strong AI debate.

  27. Susan Littlemore

    I wholeheartedly share Nicholas’s scepticism on Wikipedia in particular. The ability of anyone to insert factoids that survive for long enough to be read by the unwary, and the replacement of the traditional ‘getting it right’ principle with ‘not getting it too obviously wrong’ add up to an abandonment of intellectual rigour for the sake of ease: fast food taking over in the intellectual as well as physical realm. For more discussion see http://forum.atimes.com/topic.asp?TOPIC_ID=4083

  28. John Quiggin

    I’m not convinced by the critique of Wikipedia. I don’t go to an encyclopedia for fine writing, and I don’t think that fixing bad writing should be a high priority for a venture that’s still only five years old. At this point, the main priority ought still to be adding more information and ensuring the accuracy of what’s there. It’s my impression that Wikipedia compares pretty well with the competition on both scores.

    A more convincing test would be to pick a sample and show that there were significant errors or omissions (relative to the competition). Of course, these would be fixed quickly but the lesson would stand.

  29. Web 2.0: The Triumph of Amateur Hour?

    I have never, ever understood the cult of professionalism adhered to by some journalists (bloggers have no journalism degrees, bloggers are bad amateurs, bloggers threaten professionals, so must be crushed). Nor have I been on the bandwagon to eviscerate th…

  30. Fantastic post! Some of the brains behind “Web 2.0” are really into the whole celebrity/rock-star mentality, desperately seeking 15 seconds of fame, usually for “re-creating the wheel” (RSS vs. XHTML). If the internet & computers are supposed to be making my life easier, how come I worship “the glittering pixels of an LCD screen” for endless hours a day?

  31. Hi,

    Your argument doesn’t hold because your premises are contradictory.

    Premise 1) Peer communities produce low ‘quality’ (aka value) goods.

    Premise 2) These goods are substitutes for traditional goods in the same market.

    Conclusion) Demand shifts inwards (“scary economics”).

    If peer communities produce only low ‘quality’ (aka value) goods, demand will not shift inwards, because these goods are not substitutes for (high-value) traditionally produced goods –

    unless we invoke some kind of unrealistic deus ex machina, like huge elasticity.

    In fact, the only way your conclusion holds is to *invert* your first premise: if peer communities, in fact, produce high value goods.

    These are then substitutes for traditional goods, and demand shifts inwards without us having to resort to anything else.

    You can’t have it both ways… :)
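The substitution logic above can be made concrete with a toy model. Everything below is an illustrative assumption (the surplus rule, the prices, the taste distribution), not anything taken from the thread:

```python
# A toy illustration of the substitution argument above: a free peer-produced
# good only pulls demand away from the priced professional good when consumers
# value the two comparably. All values and prices are made up for illustration.

def professional_demand(consumers, pro_price, pro_value, peer_value):
    """Each consumer picks the option with the higher surplus (value - price).
    The peer-produced good is free, so its surplus is just its perceived value."""
    demand = 0
    for taste in consumers:  # taste scales how much a consumer cares about quality
        pro_surplus = taste * pro_value - pro_price
        peer_surplus = taste * peer_value  # price is zero
        if pro_surplus > peer_surplus and pro_surplus > 0:
            demand += 1
    return demand

consumers = [i / 10 for i in range(1, 101)]  # tastes from 0.1 to 10.0

base = professional_demand(consumers, pro_price=5, pro_value=10, peer_value=0)
low = professional_demand(consumers, pro_price=5, pro_value=10, peer_value=1)   # low-quality peer good
high = professional_demand(consumers, pro_price=5, pro_value=10, peer_value=9)  # comparable peer good

# With a low-value peer good, demand is unchanged (premise 1 blocks premise 2);
# only a high-value peer good shifts demand for the professional good inward.
```

In this sketch the inward shift appears only when the first premise is inverted, exactly as the commenter argues: the free good must be a genuine substitute before it cannibalizes any demand.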

  32. nicholas carr on “the amorality of web 2.0”

    This has caused a slight stir, and is refreshing insofar as it questions the giddy optimism surrounding web 2.0. Nicholas Carr writes about business and technology and was an editor of the Harvard Business Review. He published this on…

  33. Nick

    First, thanks to everyone who has taken the time to comment on my post. Here are some brief comments on your comments:

    John Gauntt and Wayne: I, too, prefer my technology to be amoral, and the web’s “color-blindness” is indeed a great strength.

    Venkat: I’m not defending the shortcomings of traditional media, which are many and growing. My question is whether the economic pressures caused by the web will in the end make those shortcomings even worse.

    David Gerard, JS, Ian Woollard, John Quiggin, Judson Dunn: Wikipedia can choose to judge itself by whatever criteria it chooses, but it promotes itself as “the free encyclopedia.” Therefore, it’s only fair that others judge it by the standards of reliability we would expect from other encyclopedias. Do you really think that most of the millions of people who consult Wikipedia (including many young students) make the effort to look inside “the sausage factory” to see how it works and what its inherent flaws are? Of course they don’t. They use it like they’d use any other encyclopedia, unaware that at any given moment any given entry can include factual errors, omissions and distortions. What the Wikipedia community should do is put a warning notice at the top of every page: “WARNING: This page may include factual errors.” Why don’t you include such a warning where real users would see it?

    Neil K.: Yes, that’s a good question. I wish we could (in your terms) empower the esoteric without destroying the authoritative. My fear is that we can’t.

    Kevin Kelly: Thanks. I look forward to your book.

    Anonymous: You write: “the town’s library has been shut down and I don’t feel like spending $49.95 [on on-line Encyclopedia Britannica] to find out [if it even has entries on Bill Gates and Jane Fonda]. Seeing how the on-line edition only has 73,000 articles in it, I’m doubtful.” The fact that the on-line edition of EB is of poorer quality than the print edition – and that your local library has closed down – underscores my fear about how the economics of the Web may weaken the general culture rather than strengthen it.

    Gruber: Good points. I do think that collective effort can produce excellent results in some circumstances and mediocre results in others, and that the difference can often be traced to constraints on collectivism. More precisely, collectivism works best when there’s some form of hierarchical control over the end product (as in Linux) and works less well in the absence of such control (as in Wikipedia). “Collective intelligence” is a misnomer, in other words; much of the intelligence ultimately depends not on collectivism but on having smart people at the center. Pure democracies produce crappy results.

    umair: Thanks for raising these issues. But you’re starting from a faulty assumption when you posit “quality” and “value” as being synonymous. They’re not.

  34. Nicholas Carr on The amorality of Web 2.0

  35. bob stein

    frankly, when you peel away the sensitive “gee i’m just concerned about our future” strokes, all you’ve got here is one more apologist for the status quo and his job in particular.

    particularly grating is the dishonesty of the piece which introduces a quote from the Wikipedia: “Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim: . . .” well i went to the Wikipedia and it turns out the quote is from a section entitled “early years” not Jane Fonda’s Life. the actual Wikipedia entry on Fonda is quite extensive, far outstripping the pathetic Britannica entry which i quote in full:

    “Jane Seymour Fonda American motion-picture actress who was also noted for her political activism.

    The daughter of actor Henry Fonda, she left Vassar College after two years and lived in New York City. She studied acting under Lee Strasberg at the Actors Studio there in 1958 and worked as a model. Her acting career began with appearances in the Broadway play There Was a Little Girl (1960) and the motion picture Tall Story (1960), and she went on to appear in comic roles in numerous films in the 1960s, including Cat Ballou (1965) and Barefoot In the Park (1967). Her subsequent, more substantial roles were in such socially conscious films as They Shoot Horses, Don’t They? (1969), Klute (1971), Coming Home (1978), and The China Syndrome (1979). She received Academy Awards for best actress for her performances in Klute and Coming Home. She costarred with her father in the film On Golden Pond (1981).

    In the 1970s and ’80s Fonda was active on behalf of left-wing political causes. She was an outspoken opponent of the Vietnam War who journeyed to Hanoi in 1972 to denounce the U.S. bombing campaigns there. In the 1980s she devised a popular exercise program for women while continuing to appear in motion pictures. She was married three times, to the French film director Roger Vadim, to the American politician Tom Hayden, and to the American broadcasting entrepreneur Ted Turner.”

    be honest, in this case, which would you rather have, “the amateur” or “the professional” version?

    i don’t mean to mindlessly promote the wonders of collaborative effort but i don’t think it helps to make such a lame case for professionalism.

    one more point . . . i’m not a big fan of kelly’s rapturous presentation, but his basic point, that we’re inventing the future and we really should do as good a job as possible, seems spot-on to me.

  36. Publishing to the People (Samizdata)

    Ben at if:book posted inviting discussion around some issues in a long post by Nicholas Carr. Ben’s focus on Carr’s post was around the web being a competitor to traditional media. The issue he centers around is, essentially, will Web

  37. Anonymous

    Nick, I think this is a great post, and it is always good to hear criticism. Don’t mind what “Web 2.0” fans say out there; keep posting your opinion.

    Personally I do think the concept of “Web 2.0” has grown into a sort of fanaticism which reminds me of 98/99 (not to say com…sm or mar..st). I am really looking forward to a good study on the “business models” of “Web 2.0”; hopefully the guys from MIT (earlier post) will have something soon for the rest of the world. Up to now there are two business models: “sell to another company” (possibly to one that thinks it is way behind on the wave) and advertising – and we know already how that rollercoaster ride goes…..

    Just for quality comparison here is Bill Gates entry on Encarta:

    Gates, William Henry, III, born in 1955, American business executive, who serves as chairman and chief software architect of Microsoft Corporation, the leading computer software company in the United States. Gates cofounded Microsoft in 1975 with high school friend Paul Allen. The company’s success made Gates one of the most influential figures in the computer industry and, eventually, one of the richest people in the world.

    (Microsoft is the publisher of Encarta Encyclopedia.)

    Born in Seattle, Washington, Gates attended public school through the sixth grade. In the seventh grade he entered Seattle’s exclusive Lakeside School, where he met Allen. Gates was first introduced to computers and programming languages in 1968, when he was in the eighth grade. That year Lakeside bought a teletype machine that connected to a mainframe computer over phone lines. At the time, the school was one of the few that provided students with access to a computer.

    Soon afterward, Gates, Allen, and other students convinced a local computer company to give them free access to its PDP-10, a new minicomputer made by Digital Equipment Corporation. In exchange for the computer time, the students tried to find flaws in the system. Gates spent much of his free time on the PDP-10 learning programming languages such as BASIC, Fortran, and LISP. In 1972 Gates and Allen founded Traf-O-Data, a company that designed and built computerized car-counting machines for traffic analysis. The project introduced them to the programmable 8008 microprocessor from Intel Corporation.

    While attending Harvard University in Cambridge, Massachusetts, in 1975, Gates teamed with Allen to develop a version of the BASIC programming language for the Altair 8800, the first personal computer. They licensed the software to the manufacturer of the Altair, Micro Instrumentation and Telemetry Systems (MITS), and formed Microsoft (originally Micro-soft) to develop versions of BASIC for other computer companies. Gates decided to drop out of Harvard in his junior year to devote his time to Microsoft. In 1980 Microsoft closed a pivotal deal with International Business Machines Corporation (IBM) to provide the operating system for the IBM PC personal computer. As part of the deal, Microsoft retained the right to license the operating system to other companies. The success of the IBM PC made the operating system, MS-DOS, an industry standard. Microsoft’s revenues skyrocketed as other computer makers licensed MS-DOS and demand for personal computers surged. In 1986 Microsoft offered its stock to the public; by 1987 rapid appreciation of the stock had made Gates, 31, the youngest ever self-made billionaire. In the 1990s, as Microsoft’s Windows operating system and Office application software achieved worldwide market dominance, Gates amassed a fortune worth tens of billions of dollars. Alongside his successes, however, Gates was accused of using his company’s power to stifle competition. In 2000 a federal judge found Microsoft guilty of violating antitrust laws and ordered it split into two companies. An appeals court overturned the breakup order in 2001 but upheld the judge’s ruling that Microsoft had abused its power to protect its Windows monopoly. In November 2001 Microsoft reached a settlement with the U.S. Justice Department and nine states, and a year later, the settlement was upheld by a federal district court judge. (For more information on the history of Microsoft, see Microsoft Corporation.)

    Gates has made personal investments in other high-technology companies. He sits on the board of one biotechnology company and has invested in a number of others. In 1989 he founded Corbis Corporation, which now owns the largest collection of digital images in the world.

    In the late 1990s Gates became more involved in philanthropy. With his wife he established the Bill & Melinda Gates Foundation, which, ranked by assets, quickly became one of the largest foundations in the world. Gates has also authored two books: The Road Ahead (1995; revised, 1996), which details his vision of technology’s role in society, and Business @ the Speed of Thought (1999), which discusses the role technology can play in running a business.

    In 1998 Gates appointed an executive vice president of Microsoft, Steve Ballmer, to the position of president, but Gates continued to serve as Microsoft’s chairman and chief executive officer (CEO). In 2000 Gates transferred the title of CEO to Ballmer. While remaining chairman, Gates also took on the title of chief software architect to focus on the development of new products and technologies.

  38. Steve Button

    When will the great Wikipedia get good?

    OK. Here’s a thought. Why not have ratings on the articles (a bit like Amazon) and allow people to comment on the articles in Wikipedia? Then you could have a starting point for which articles need improvement.

    You could also use a “Was this review helpful to you” button, to weed out bad reviewers (or to give more attention to useful reviewers).

    Perhaps they do this already, but I haven’t seen it yet? I’ve used WP dozens of times in the past, so if it is there and I haven’t noticed it, then it needs to be more obvious IMHO.

    Steve Button
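The scheme Steve sketches could look something like the following. This is a hypothetical sketch: the class and method names are invented for illustration and correspond to nothing Wikipedia actually exposes.

```python
# Hypothetical sketch of the rating scheme described above: readers rate
# articles 1-5 (as on Amazon), and "was this review helpful?" votes weight
# each rater's influence on an article's score.

from collections import defaultdict

class ArticleRatings:
    def __init__(self):
        # article -> list of (rater, stars) pairs
        self.ratings = defaultdict(list)
        # rater -> [helpful votes received, total votes received]
        self.helpful = defaultdict(lambda: [0, 0])

    def rate(self, article, rater, stars):
        self.ratings[article].append((rater, stars))

    def vote_helpful(self, rater, was_helpful):
        self.helpful[rater][1] += 1
        if was_helpful:
            self.helpful[rater][0] += 1

    def weight(self, rater):
        # Laplace-smoothed helpfulness, so unknown raters still count a little
        helpful, total = self.helpful[rater]
        return (helpful + 1) / (total + 2)

    def score(self, article):
        # Weighted average: raters judged helpful move the score more
        votes = self.ratings[article]
        if not votes:
            return None
        total_weight = sum(self.weight(rater) for rater, _ in votes)
        return sum(self.weight(rater) * stars for rater, stars in votes) / total_weight

    def worst_articles(self, n=10):
        # The "starting point for which articles need improvement"
        scored = [(article, self.score(article)) for article in self.ratings]
        return sorted(scored, key=lambda pair: pair[1])[:n]
```

The design choice here is that a reviewer's credibility is earned from helpfulness votes rather than assumed, which addresses the "weed out bad reviewers" part of the suggestion.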

  39. Amorality, Egalitarianism, and the Bazaar

    Courtesy of The Register, I came across Nicholas Carr’s blog today. And more specifically this post.

    He’s got some great ideas which I’m not going to attempt to summarise in any great detail because he says it much better than I coul…

  40. Anonymous

    I am not sure ratings would solve the problem with Wikipedia. There are two main reasons for that:

    – the reader might not know the entry contained non-factual data (and therefore cannot rate it)

    – most readers don’t take the time to rate features or products.

    I believe they tried that in search engines and it did not work out well (and offering money made it worse). Check out the “Dr. Daniel E. Rose” webcast at Berkeley; he talks about it at some point. http://webcast.berkeley.edu/courses/archive.php?seriesid=1906978252

  41. Joining The Cult of the Amateur

    Nicholas Carr has written a quite interesting article on Web 2.0 and what he believes to be the Cult of the Amateur. Basically, he means that because stuff is free, like in Wikipedia or OpenSource (his words, not mine), it will always be used and re…

  42. choi li akiro singh santos

    Finally some Sanity amid the Hype:

    Great comments. I too have used Wikipedia often and have found the writing to be uneven at best. Nonetheless, I find that it is a good starting point: the collections of EXTERNAL LINKS for a given topic are usually better than what you can get from a Search Engine.

    The triumph of Amateurs at the expense of Professionals is scary. Might it be possible that we don’t have a zero-sum game here? The bloggers have been able to correct mistakes that have eluded the “editors” of the NYTimes, the Economist, etc. Without professional publications, I doubt that bloggers will have that much to write about: I suspect that the practice of linking to original content from other bloggers will not be sufficient. Remember, most blog discussions start from an article in a professional publication.

    What is most interesting is that the so-called Web 2.0 companies have no Business Model. The most famous example to date is Flickr, a great site that Yahoo! purchased. Could it have survived on its own? I’m not sure about that. Dave Winer recently noted:

    “I wasn’t at Web 2.0 last week, but I know some of the jargon that developed there. People were walking around saying this is the Flickr of that, and that is the Flickr of this.”

    Let the hype continue ….

  43. Frapazoid

    The more things change, the more they stay the same.

    You know, there used to be this huge network of intelligence and communication where anyone could post their ideas and thoughts.

    Yeah, it was called the “internet” last time too.

    Web 2.0 isn’t some super machine or anything.

    The term “Web 2.0” doesn’t even mean anything at all! This is just a fad term to lure venture capitalists who aren’t doing their research.

    Every time I see someone talking about “Web 2.0”, I have to ask, HAS THE ENTIRE WORLD GONE MAD?!

    There is nothing new about the web! It has been gradually developing for the past decade. There is no such thing as “Web 2.0”.

  44. The Web 2.0 open/closed debate explodes

    A simmering debate about how much companies like Google and Oodle are allowed to duplicate from other sources has exploded into full force. We woke up this morning to an emotional torrent on multiple sides. The significance of this stems beyond Google …

  45. Wikipedia under fire

    It is hardly fair to search out the worst entries you can find and then present them as typical. I doubt any encyclopaedia would look good on that basis.

  46. cyou

    The internet is a huge garbage heap of formatted text and other file formats. This is nothing new. You may read a few books by Clifford Stoll. Wikipedia is just another piece of crap.

    Why? Do they pay people? Do they have any research experience?

    Blogs are the amateur form of a newspaper article.

  47. Sandy Borthick

    If you’ve got the attention span to read a whole book about the real world (not more specious bloviating about the “blogosphere”), I suggest ‘The World is Flat’ by Thomas Friedman. Web 2.0 may end up being Bubble 2.0, but only if everyone piles on and pays too much in the portal consolidation getting underway. In the longer run, say 3-5 years out, I doubt that amateur blogging (this decade’s CB radios) will satisfy many people’s desire for concise coverage, informed commentary and/or authoritative in-depth reporting — any more than one-off hacks and ‘perpetual betas’ (one of the Web 2.0 articles of faith) will successfully address corporate requirements for stable enterprise-level software.

    p.s. Nick — Just think of Web 2.0 as a style thang, and you won’t be so bummed out about it! That reminds me of another topical read: ‘Boomeritis’ by Ken Wilber.

  48. Technology and Morality

    A fascinating essay about the Internet’s Second Coming and its spiritual, ethical and cultural consequences. Something of a counterpoint to the prevailing viewpoint and very shrewd in parts.