The amorality of Web 2.0

This post, along with seventy-eight others, is collected in the book Utopia Is Creepy.

From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the physical world, the Web presents a readymade Promised Land. On the Internet, we’re all bodiless, symbols speaking to symbols in symbols. The early texts of Web metaphysics, many written by thinkers associated with or influenced by the post-60s New Age movement, are rich with a sense of impending spiritual release; they describe the passage into the cyber world as a process of personal and communal unshackling, a journey that frees us from traditional constraints on our intelligence, our communities, our meager physical selves. We become free-floating netizens in a more enlightened, almost angelic, realm.

But as the Web matured during the late 1990s, the dreams of a digital awakening went unfulfilled. The Net turned out to be more about commerce than consciousness, more a mall than a commune. And when the new millennium arrived, it brought not a new age but a dispiritingly commonplace popping of a bubble of earthly greed. Somewhere along the way, the moneychangers had taken over the temple. The Internet had transformed many things, but it had not transformed us. We were the same as ever.

The New New Age

But the yearning for a higher consciousness didn’t burst with the bubble. Web 1.0 may have turned out to be spiritual vaporware, but now we have the hyper-hyped upgrade: Web 2.0. In a profile of Internet savant Tim O’Reilly in the current issue of Wired, Steven Levy writes that “the idea of collective consciousness is becoming manifest in the Internet.” He quotes O’Reilly: “The Internet today is so much an echo of what we were talking about at [New Age HQ] Esalen in the ’70s – except we didn’t know it would be technology-mediated.” Levy then asks, “Could it be that the Internet – or what O’Reilly calls Web 2.0 – is really the successor to the human potential movement?”

Levy’s article appears in the afterglow of Kevin Kelly’s sweeping “We Are the Web” in Wired’s August issue. Kelly, erstwhile prophet of the Long Boom, surveys the development of the World Wide Web, from the Netscape IPO ten years ago, and concludes that it has become a “magic window” that provides a “spookily godlike” perspective on existence. “I doubt angels have a better view of humanity,” he writes.

But that’s only the beginning. In the future, according to Kelly, the Web will grant us not only the vision of gods but also their power. The Web is becoming “the OS for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies but our minds … We will live inside this thing.”

The revelation continues:

There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

You and I are alive at this moment.

We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.

Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.

This isn’t the language of exposition. It’s the language of rapture.

The Cult of the Amateur

Now, lest you dismiss me as a mere cynic, if not a fallen angel, let me make clear that I’m all for seeking transcendence, whether it’s by going to church or living in a hut in the woods or sitting at the feet of the Maharishi or gazing into the glittering pixels of an LCD screen. One gathers one’s manna where one finds it. And if there’s a higher consciousness to be found, then by all means let’s get elevated. My problem is this: When we view the Web in religious terms, when we imbue it with our personal yearning for transcendence, we can no longer see it objectively. By necessity, we have to look at the Internet as a moral force, not as a simple collection of inanimate hardware and software. No decent person wants to worship an amoral conglomeration of technology.

And so all the things that Web 2.0 represents – participation, collectivism, virtual communities, amateurism – become unarguably good things, things to be nurtured and applauded, emblems of progress toward a more enlightened state. But is it really so? Is there a counterargument to be made? Might, on balance, the practical effect of Web 2.0 on society and culture be bad, not good? To see Web 2.0 as a moral force is to turn a deaf ear to such questions.

Let me bring the discussion down to a brass tack. If you read anything about Web 2.0, you’ll inevitably find praise heaped upon Wikipedia as a glorious manifestation of “the age of participation.” Wikipedia is an open-source encyclopedia; anyone who wants to contribute can add an entry or edit an existing one. O’Reilly, in a new essay on Web 2.0, says that Wikipedia marks “a profound change in the dynamics of content creation” – a leap beyond the Web 1.0 model of Britannica Online. To Kevin Kelly, Wikipedia shows how the Web is allowing us to pool our individual brains into a great collective mind. It’s a harbinger of the Machine.

In theory, Wikipedia is a beautiful thing – it has to be a beautiful thing if the Web is leading us to a higher consciousness. In reality, though, Wikipedia isn’t very good at all. Certainly, it’s useful – I regularly consult it to get a quick gloss on a subject. But at a factual level it’s unreliable, and the writing is often appalling. I wouldn’t depend on it as a source, and I certainly wouldn’t recommend it to a student writing a research paper.

Take, for instance, this section from Wikipedia’s biography of Bill Gates, excerpted verbatim:

Gates married Melinda French on January 1, 1994. They have three children, Jennifer Katharine Gates (born April 26, 1996), Rory John Gates (born May 23, 1999) and Phoebe Adele Gates (born September 14, 2002).

In 1994, Gates acquired the Codex Leicester, a collection of writings by Leonardo da Vinci; as of 2003 it was on display at the Seattle Art Museum.

In 1997, Gates was the victim of a bizarre extortion plot by Chicago resident Adam Quinn Pletcher. Gates testified at the subsequent trial. Pletcher was convicted and sentenced in July 1998 to six years in prison. In February 1998 Gates was attacked by Noël Godin with a cream pie. In July 2005, he solicited the services of famed lawyer Hesham Foda.

According to Forbes, Gates contributed money to the 2004 presidential campaign of George W. Bush. According to the Center for Responsive Politics, Gates is cited as having contributed at least $33,335 to over 50 political campaigns during the 2004 election cycle.

Excuse me for stating the obvious, but this is garbage, an incoherent hodge-podge of dubious factoids (who the heck is “famed lawyer Hesham Foda”?) that adds up to something far less than the sum of its parts.

Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim:

Her nickname as a youth—Lady Jane—was one she reportedly disliked. She traveled to Communist Russia in 1964 and was impressed by the people, who welcomed her warmly as Henry’s daughter. In the mid-1960s she bought a farm outside of Paris, had it renovated and personally started a garden. She visited Andy Warhol’s Factory in 1966. About her 1971 Oscar win, her father Henry said: “How in hell would you like to have been in this business as long as I and have one of your kids win an Oscar before you do?” Jane was on the cover of Life magazine, March 29, 1968.

While early she had grown both distant from and critical of her father for much of her young life, in 1980, she bought the play “On Golden Pond” for the purpose of acting alongside her father—hoping he might win the Oscar that had eluded him throughout his career. He won, and when she accepted the Oscar on his behalf, she said it was “the happiest night of my life.” Director and first husband Roger Vadim once said about her: “Living with Jane was difficult in the beginning … she had so many, how do you say, ‘bachelor habits.’ Too much organization. Time is her enemy. She cannot relax. Always there is something to do.” Vadim also said, “There is also in Jane a basic wish to carry things to the limit.”

This is worse than bad, and it is, unfortunately, representative of the slipshod quality of much of Wikipedia. Remember, this emanation of collective intelligence is not just a couple of months old. It’s been around for nearly five years and has been worked over by many thousands of diligent contributors. At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. When will the great Wikipedia get good? Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?

The promoters of Web 2.0 venerate the amateur and distrust the professional. We see it in their unalloyed praise of Wikipedia, and we see it in their worship of open-source software and myriad other examples of democratic creativity. Perhaps nowhere, though, is their love of amateurism so apparent as in their promotion of blogging as an alternative to what they call “the mainstream media.” Here’s O’Reilly: “While mainstream media may see individual blogs as competitors, what is really unnerving is that the competition is with the blogosphere as a whole. This is not just a competition between sites, but a competition between business models. The world of Web 2.0 is also the world of what Dan Gillmor calls ‘we, the media,’ a world in which ‘the former audience,’ not a few people in a back room, decides what’s important.”

I’m all for blogs and blogging. (I’m writing this, ain’t I?) But I’m not blind to the limitations and the flaws of the blogosphere – its superficiality, its emphasis on opinion over reporting, its echolalia, its tendency to reinforce rather than challenge ideological extremism and segregation. Now, all the same criticisms can (and should) be hurled at segments of the mainstream media. And yet, at its best, the mainstream media is able to do things that are different from – and, yes, more important than – what bloggers can do. Those despised “people in a back room” can fund in-depth reporting and research. They can underwrite projects that can take months or years to reach fruition – or that may fail altogether. They can hire and pay talented people who would not be able to survive as sole proprietors on the Internet. They can employ editors and proofreaders and other unsung protectors of quality work. They can place, with equal weight, opposing ideologies on the same page. Forced to choose between reading blogs and subscribing to, say, the New York Times, the Financial Times, the Atlantic, and the Economist, I will choose the latter. I will take the professionals over the amateurs.

But I don’t want to be forced to make that choice.

Scary Economics

And so, having gone on for so long, I at long last come to my point. The Internet is changing the economics of creative work – or, to put it more broadly, the economics of culture – and it’s doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it’s created by amateurs rather than professionals, it’s free. And free trumps quality all the time. So what happens to those poor saps who write encyclopedias for a living? They wither and die. The same thing happens when blogs and other free on-line content go up against old-fashioned newspapers and magazines. Of course the mainstream media sees the blogosphere as a competitor. It is a competitor. And, given the economics of the competition, it may well turn out to be a superior competitor. The layoffs we’ve recently seen at major newspapers may just be the beginning, and those layoffs should be cause not for self-satisfied snickering but for despair. Implicit in the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for one can’t imagine anything more frightening.

In “We Are the Web,” Kelly writes that “because of the ease of creation and dissemination, online culture is the culture.” I hope he’s wrong, but I fear he’s right – or will come to be right.

Like it or not, Web 2.0, like Web 1.0, is amoral. It’s a set of technologies – a machine, not a Machine – that alters the forms and economics of production and consumption. It doesn’t care whether its consequences are good or bad. It doesn’t care whether it brings us to a higher consciousness or a lower one. It doesn’t care whether it burnishes our culture or dulls it. It doesn’t care whether it leads us into a golden age or a dark one. So let’s can the millennialist rhetoric and see the thing for what it is, not what we wish it would be.

193 thoughts on “The amorality of Web 2.0”

  1. Steve

    Nicholas’ “The amorality of Web 2.0” (October 03, 2005) was well done. I would reinforce, however, that “professional journalistic media” (aka newspapers) bring it on themselves when they lose readership due to:

    a. Blatantly partial/biased reporting

    b. Printed product loaded with ads and wire service stories

  2. theQview

    Inefficient Market for Facts

    Wikipedia is often cited as the example of the power of user-created collaborative content and the power of collective intelligence. As Nicholas Carr points out in The amorality of Web 2.0: If you read anything about Web 2.0, you’ll

  3. bill

    With any luck at all, the layoffs at the newspapers will continue and avalanche until their ultimate demise. Lacing every story with slanted political tripe is not the way to sell newspapers.

    The first thing any intelligently managed business does is stay out of politics. You run the risk of losing half your customer base before you even get started.

  4. Crystal Shards

    Web 2.0 – The WWC

    After reading a quick news post about cracks in web 2.0 on Wired News, I was led to a particularly enlightening essay about what exactly Web 2.0 is. This term has been travelling around the net for some time now and I had been curious as to what exac…

  5. Marek Jakubik

    When I hear writers and journalists cry about Web 2.0, Wikipedia and the like, I think: that’s how The Grand Chefs must have felt when McDonald’s (quelle horreur!) entered their territory. Nothing new here, democratization began many centuries ago and (only) now it is touching the information. What a concept for the Information Age, I say.

    Evolution works this way because we, people, have learned, accepted and assigned privilege to Choice. In almost all disciplines of human endeavor we favor Choice. That’s why we can eat substandard food, buy substandard merchandise or even provide non-standard medical treatment to ourselves (when we choose to heal ourselves or with the help of various alternate sources).

    However, since we also recognize that such a laissez-faire approach may at times be dangerous, we do provide for exceptions. In many fields, such as education, medicine, banking, insurance, etc., we regulate. To operate in those fields one must pass rigorous tests.

    So, back to Information: are The i-Chefs calling for regulating? Let them. Fat chance of that but they do have a choice. They can (still) move to China.

    I, meanwhile, might consume another scrap of (say) Mr. Carr’s writing and say: Vive le Choice! As for Jane Fonda’s tortured life stories I really wonder: Why does anybody care?

  6. CS182

    The Web 2 cult — time for heretics to apply some sense?

    Rough Type: Nicholas Carr’s Blog: The amorality of Web 2.0…

  7. NilsNet Blog

    Journalism: Who needs it? I do.

    I recently listened to an IT Conversations podcast featuring journalists Dan Gillmor, Jeff Jarvis, and Jay Rosen talking about how blogging is changing journalism. I love what they said in a perfect world, but my reality is

  8. Stuffola

    Journalism 2.0: It’s Not the Meat, It’s the Motion

    A while back I linked to Nicholas Carr, who had some interesting things to say about the rise of Web 2.0. For a guy who appreciates diversity of opinion, Jeff Jarvis seemed quick to dismiss Carr as an elitist curmudgeon…

  9. erik

    I can only agree with you, Nick. I have lately discussed the future design of social software at my blog at Stanford (http://fellows.rdvp.org/eriksundelof/blog/). What is a bit scary these days is the unconditional trust people put in systems like Wikipedia and Delicious. People who say that we should not compare Wikipedia to Britannica clearly haven’t read the true intentions behind them.

    More generally about the subject: I am a true believer in technology and think there might be a way to really make a difference, but as you said, “The Internet had transformed many things, but it had not transformed us. We were the same as ever.” This is an essential message that tends to get lost when people discuss new technologies. We have to be ready for them and able to handle them. The sad part is that the absolute majority has to be, or else we (might) end up losing in the end.

    I am a strong believer in the power of the people, but I still believe there are dangers in complete freedom – by which I am not at all saying we should not have the freedom that Web 2.0 is said to represent. I think it is just sad that people, as you say, put a bit too strong a belief in the idea that technology will change the world.

    Technology is a tool, nothing else. If we want to save the world we just have to do it ourselves, probably with the help of technology. Technology by itself does very little. :)

    I am a supertech, but nevertheless I (and all other (super)techs) have to reflect on the awareness and readiness of the users that we develop for.

    Web 2.0 is better than Web 1.0, but that does not mean it is perfect. It is far from perfect, but it is a leap in the right direction.

  10. Net

    The Amoral web 2.0

    Nicholas Carr has written a very sharp piece on what he calls the amorality of Web 2.0. From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the

  11. IPcentral Weblog

    Navel-Gazing

    Given yesterday’s disappointing vote failing to formally exempt bloggers from free speech-limiting campaign finance rules (see a compelling breakdown of the vote in David Carney’s Tech Law Journal if you have a subscription) I thought I’d do a bit of…

  12. SiliconBeat

    Craig to invest in Wikipedia?

    Here’s a notable interview in Grade the News with Craig Newmark, founder of the San Francisco online classifieds site Craigslist. In it, he discusses his views on the problem of mainstream media, and the promise of citizen journalism. Craig a…

  13. Ed Byrne

    I don’t believe that free trumps quality every time – if that were the case the world would be a vastly different place, and premium brands (like the Apple I’m typing this on) would have no market.

    However, free content does often trump better, paid content – if the paid content is only marginally better. The Britannica is a wonderful encyclopaedia, but Wikipedia isn’t a million miles off it – and a good web search can fill in Wikipedia’s blanks and provide enough information.

    What’s more, while paying for information doesn’t really offend or affect me in any way, the fantastic thing about the Internet (1.0 or 2.0 or later) is that diverse cultures and economies can access it – so while free is fine for me, it’s absolutely essential to people who live on dollars a week, and anything other than free means ‘inaccessible’.

    Web 2.0 is no more or less amoral than Web 1.0. Business is business, and shareholders and capitalism demand profit. But that doesn’t mean the web isn’t ALSO what Kevin Kelly calls it – which I believe can co-exist with the business side of the net.

    If ‘information is power’, then regardless of the technology’s amorality, the Internet has empowered an awful lot of people who would otherwise not be so.

  14. Nick

    Ed: “All the time,” in common speech, doesn’t mean “every time”; it means “frequently.” If I say “People jaywalk all the time,” it doesn’t mean that people jaywalk every time they cross a street; it means that jaywalking is commonplace. That’s what I meant, too.

  15. Writing and the Digital Life

    The changing economics of creative work

    This article by Nicholas Carr at his blog Rough Type is now a month old – sorry for the delay in linking to it – but it’s a must-read critique of Web 2.0, the Wikipedia debate, and issues we discuss

  16. Ed Byrne

    Sorry Nick, I didn’t mean to nitpick on a single phrase – you’re right of course, and I agree with you.

    What I’m really trying to say is that between Kevin Kelly’s utopian view and your kind of cynical view is a median where the web actually exists.

    Big Business always has its own interests at heart, often to the cost of the people – that’s true on-line and in the real world. That’s the amorality of the web. And then there are people who want to make a difference – donating to charity, working with poor, homeless and underprivileged people and nations. These people have a role on-line as well; they’re the web’s moral fiber.

  17. Mark Evans

    Microsoft’s Web 2.0 Manifesto

    There is going to be a huge amount of chatter within the blogosphere – and, hopefully, the mainstream media – about the “Live” e-mails sent out by Bill Gates and Microsoft CTO
