This post, along with seventy-eight others, is collected in the book Utopia Is Creepy.
From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the physical world, the Web presents a readymade Promised Land. On the Internet, we’re all bodiless, symbols speaking to symbols in symbols. The early texts of Web metaphysics, many written by thinkers associated with or influenced by the post-60s New Age movement, are rich with a sense of impending spiritual release; they describe the passage into the cyber world as a process of personal and communal unshackling, a journey that frees us from traditional constraints on our intelligence, our communities, our meager physical selves. We become free-floating netizens in a more enlightened, almost angelic, realm.
But as the Web matured during the late 1990s, the dreams of a digital awakening went unfulfilled. The Net turned out to be more about commerce than consciousness, more a mall than a commune. And when the new millennium arrived, it brought not a new age but a dispiritingly commonplace popping of a bubble of earthly greed. Somewhere along the way, the moneychangers had taken over the temple. The Internet had transformed many things, but it had not transformed us. We were the same as ever.
The New New Age
But the yearning for a higher consciousness didn’t burst with the bubble. Web 1.0 may have turned out to be spiritual vaporware, but now we have the hyper-hyped upgrade: Web 2.0. In a profile of Internet savant Tim O’Reilly in the current issue of Wired, Steven Levy writes that “the idea of collective consciousness is becoming manifest in the Internet.” He quotes O’Reilly: “The Internet today is so much an echo of what we were talking about at [New Age HQ] Esalen in the ’70s – except we didn’t know it would be technology-mediated.” Levy then asks, “Could it be that the Internet – or what O’Reilly calls Web 2.0 – is really the successor to the human potential movement?”
Levy’s article appears in the afterglow of Kevin Kelly’s sweeping “We Are the Web” in Wired’s August issue. Kelly, erstwhile prophet of the Long Boom, surveys the development of the World Wide Web, from the Netscape IPO ten years ago, and concludes that it has become a “magic window” that provides a “spookily godlike” perspective on existence. “I doubt angels have a better view of humanity,” he writes.
But that’s only the beginning. In the future, according to Kelly, the Web will grant us not only the vision of gods but also their power. The Web is becoming “the OS for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies but our minds … We will live inside this thing.”
The revelation continues:
There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.
You and I are alive at this moment.
We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.
Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.
This isn’t the language of exposition. It’s the language of rapture.
The Cult of the Amateur
Now, lest you dismiss me as a mere cynic, if not a fallen angel, let me make clear that I’m all for seeking transcendence, whether it’s by going to church or living in a hut in the woods or sitting at the feet of the Maharishi or gazing into the glittering pixels of an LCD screen. One gathers one’s manna where one finds it. And if there’s a higher consciousness to be found, then by all means let’s get elevated. My problem is this: When we view the Web in religious terms, when we imbue it with our personal yearning for transcendence, we can no longer see it objectively. By necessity, we have to look at the Internet as a moral force, not as a simple collection of inanimate hardware and software. No decent person wants to worship an amoral conglomeration of technology.
And so all the things that Web 2.0 represents – participation, collectivism, virtual communities, amateurism – become unarguably good things, things to be nurtured and applauded, emblems of progress toward a more enlightened state. But is it really so? Is there a counterargument to be made? Might, on balance, the practical effect of Web 2.0 on society and culture be bad, not good? To see Web 2.0 as a moral force is to turn a deaf ear to such questions.
Let me bring the discussion down to brass tacks. If you read anything about Web 2.0, you’ll inevitably find praise heaped upon Wikipedia as a glorious manifestation of “the age of participation.” Wikipedia is an open-source encyclopedia; anyone who wants to contribute can add an entry or edit an existing one. O’Reilly, in a new essay on Web 2.0, says that Wikipedia marks “a profound change in the dynamics of content creation” – a leap beyond the Web 1.0 model of Britannica Online. To Kevin Kelly, Wikipedia shows how the Web is allowing us to pool our individual brains into a great collective mind. It’s a harbinger of the Machine.
In theory, Wikipedia is a beautiful thing – it has to be a beautiful thing if the Web is leading us to a higher consciousness. In reality, though, Wikipedia isn’t very good at all. Certainly, it’s useful – I regularly consult it to get a quick gloss on a subject. But at a factual level it’s unreliable, and the writing is often appalling. I wouldn’t depend on it as a source, and I certainly wouldn’t recommend it to a student writing a research paper.
Take, for instance, this section from Wikipedia’s biography of Bill Gates, excerpted verbatim:
Gates married Melinda French on January 1, 1994. They have three children, Jennifer Katharine Gates (born April 26, 1996), Rory John Gates (born May 23, 1999) and Phoebe Adele Gates (born September 14, 2002).
In 1994, Gates acquired the Codex Leicester, a collection of writings by Leonardo da Vinci; as of 2003 it was on display at the Seattle Art Museum.
In 1997, Gates was the victim of a bizarre extortion plot by Chicago resident Adam Quinn Pletcher. Gates testified at the subsequent trial. Pletcher was convicted and sentenced in July 1998 to six years in prison. In February 1998 Gates was attacked by Noël Godin with a cream pie. In July 2005, he solicited the services of famed lawyer Hesham Foda.
According to Forbes, Gates contributed money to the 2004 presidential campaign of George W. Bush. According to the Center for Responsive Politics, Gates is cited as having contributed at least $33,335 to over 50 political campaigns during the 2004 election cycle.
Excuse me for stating the obvious, but this is garbage, an incoherent hodge-podge of dubious factoids (who the heck is “famed lawyer Hesham Foda”?) that adds up to something far less than the sum of its parts.
Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim:
Her nickname as a youth—Lady Jane—was one she reportedly disliked. She traveled to Communist Russia in 1964 and was impressed by the people, who welcomed her warmly as Henry’s daughter. In the mid-1960s she bought a farm outside of Paris, had it renovated and personally started a garden. She visited Andy Warhol’s Factory in 1966. About her 1971 Oscar win, her father Henry said: “How in hell would you like to have been in this business as long as I and have one of your kids win an Oscar before you do?” Jane was on the cover of Life magazine, March 29, 1968.
While early she had grown both distant from and critical of her father for much of her young life, in 1980, she bought the play “On Golden Pond” for the purpose of acting alongside her father—hoping he might win the Oscar that had eluded him throughout his career. He won, and when she accepted the Oscar on his behalf, she said it was “the happiest night of my life.” Director and first husband Roger Vadim once said about her: “Living with Jane was difficult in the beginning … she had so many, how do you say, ‘bachelor habits.’ Too much organization. Time is her enemy. She cannot relax. Always there is something to do.” Vadim also said, “There is also in Jane a basic wish to carry things to the limit.”
This is worse than bad, and it is, unfortunately, representative of the slipshod quality of much of Wikipedia. Remember, this emanation of collective intelligence is not just a couple of months old. It’s been around for nearly five years and has been worked over by many thousands of diligent contributors. At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. When will the great Wikipedia get good? Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?
The promoters of Web 2.0 venerate the amateur and distrust the professional. We see it in their unalloyed praise of Wikipedia, and we see it in their worship of open-source software and myriad other examples of democratic creativity. Perhaps nowhere, though, is their love of amateurism so apparent as in their promotion of blogging as an alternative to what they call “the mainstream media.” Here’s O’Reilly: “While mainstream media may see individual blogs as competitors, what is really unnerving is that the competition is with the blogosphere as a whole. This is not just a competition between sites, but a competition between business models. The world of Web 2.0 is also the world of what Dan Gillmor calls ‘we, the media,’ a world in which ‘the former audience,’ not a few people in a back room, decides what’s important.”
I’m all for blogs and blogging. (I’m writing this, ain’t I?) But I’m not blind to the limitations and the flaws of the blogosphere – its superficiality, its emphasis on opinion over reporting, its echolalia, its tendency to reinforce rather than challenge ideological extremism and segregation. Now, all the same criticisms can (and should) be hurled at segments of the mainstream media. And yet, at its best, the mainstream media is able to do things that are different from – and, yes, more important than – what bloggers can do. Those despised “people in a back room” can fund in-depth reporting and research. They can underwrite projects that can take months or years to reach fruition – or that may fail altogether. They can hire and pay talented people who would not be able to survive as sole proprietors on the Internet. They can employ editors and proofreaders and other unsung protectors of quality work. They can place, with equal weight, opposing ideologies on the same page. Forced to choose between reading blogs and subscribing to, say, the New York Times, the Financial Times, the Atlantic, and the Economist, I will choose the latter. I will take the professionals over the amateurs.
But I don’t want to be forced to make that choice.
Scary Economics
And so, having gone on for so long, I at long last come to my point. The Internet is changing the economics of creative work – or, to put it more broadly, the economics of culture – and it’s doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it’s created by amateurs rather than professionals, it’s free. And free trumps quality all the time. So what happens to those poor saps who write encyclopedias for a living? They wither and die. The same thing happens when blogs and other free on-line content go up against old-fashioned newspapers and magazines. Of course the mainstream media sees the blogosphere as a competitor. It is a competitor. And, given the economics of the competition, it may well turn out to be a superior competitor. The layoffs we’ve recently seen at major newspapers may just be the beginning, and those layoffs should be cause not for self-satisfied snickering but for despair. Implicit in the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for one can’t imagine anything more frightening.
In “We Are the Web,” Kelly writes that “because of the ease of creation and dissemination, online culture is the culture.” I hope he’s wrong, but I fear he’s right – or will come to be right.
Like it or not, Web 2.0, like Web 1.0, is amoral. It’s a set of technologies – a machine, not a Machine – that alters the forms and economics of production and consumption. It doesn’t care whether its consequences are good or bad. It doesn’t care whether it brings us to a higher consciousness or a lower one. It doesn’t care whether it burnishes our culture or dulls it. It doesn’t care whether it leads us into a golden age or a dark one. So let’s can the millennialist rhetoric and see the thing for what it is, not what we wish it would be.
Why we need professional encyclopedias
When I discovered Wikipedia I was impressed. I was impressed because I had low expectations. “Look how much these amateurs…
Web 2.0?
Finally! Here’s something meaningful about this “Web 2.0” fad. And yes, I did AJAX before you knew what that was, so don’t bother commenting! ;-)
Sorry, I fell asleep somewhere in between the Matrix and the talk of Angels.
Any chance you could do a new article with the key points laid out, rather than this ramble? I can’t actually work out what you’re saying.
I personally think Web 2.0 is a good definition (this is against Web 2.0, right?)
Sorry, you expect a lot more from a user-submitted encyclopedia with almost a MILLION articles?
It is commendable in its own right for being a big player in interaction.
Oh please… yes, you’re right; we’re all going to die because of Web 2.0… *sigh*… Web 2.0 is not going to take us anywhere; it will be us who does so. Just because we class the current web age we’re in as Web 2.0 doesn’t mean it changes what’s going to happen in the future. The ‘Ice Age’ was not named first and then it suddenly got cold. The same goes for the web: we didn’t name the current age ‘Web 2.0’ and suddenly every web app became AJAXy.
Scary Economics
And so, having gone on for so long, I at long last come to my point. The Internet is changing the economics of creative work – or, to put it more broadly, the economics of culture –…
[Essays] Paul Graham’s Web 2.0 Essay
Paul Graham’s latest essay covers the latest catchphrase: “Web 2.0”. It opens with this paragraph:
While I agree with most of the points you make regarding Wikipedia and the cult of the amateur, one should also point out the difference in quality between the first edition of the EB and the seminal 1911 edition: how long did it take for the “experts” to get it right? Polishing it required dedication and a commitment to excellence, but how long did it take to reach the level where only incremental improvements were needed?
Naked agility: corporate blogging as reciprocal branding
In an honest attempt at starting a “naked conversation”, I would like to share some of the considerations I have when posting to an Accenture branded blog like this one. Being an Expressive Driver, the thought came to my mind…
Wikipedia, poster child for Web 2.0 flaws
Another arrow got fired at wikipedia.org recently in USA Today, with an op-ed piece by John Seigenthaler Sr., who writes about his outrage on finding an entry in the collaborative encyclopedia that described him as playing a role in the assassination…
This is an amazing piece of writing. I’m sorry I just found it about 10 minutes ago. Your opening paragraph coupled with your use of “if not a fallen angel” makes me think that you are familiar with the Gnostics. Specifically that trait in humans that leads us to think that there is just one, pardon the expression, kernel of knowledge that will bring us to perfection.
Needless to say – living here in Seattle, I’m just a bit jaundiced about the whole Web 2.0 thing.
And BTW – John Seigenthaler *Jr.* used to do the weekend news on the ABC affil here in Seattle.
Comparing Wikipedia and Britannica
This week’s issue of Nature has a news feature that reports on an experiment our journalists conducted to discover how reliable the scientific content of Wikipedia is compared to Britannica’s. They sent 50 pairs of Wikipedia and Britannica articles on…
Jimmy Wales’ Wikipedia comes close to Britannica in terms of the accuracy of its science entries, a Nature investigation finds.
Link: http://www.nature.com/nature/journal/v438/n7070/full/438900a.html
Whilst many bloggers are “amateurs”, the work they produce is often of a higher quality than that produced by “professionals”.
Take the SCO vs IBM case.
None of the professionals with the capacity to fund “in-depth reporting and research” did so. They did no reporting of their own, instead relaying claims from analysts with no knowledge of code or Unix.
The “amateurs” like Eric Raymond of OSI and Pamela Jones of Groklaw were the people doing the serious investigative work.
Y’know, I’ve read all this discussion. Most of the points I’d like to make were already done.
However, I do remember Britannica and I still have the edition, where an article about lemmings endorsed the Disney version that lemmings march to their deaths every year – they just happened to believe Disney’s manufactured documentary was real (hint – it wasn’t). So much for peer review, eh?
Indeed the web is amoral. And that’s great. Because morals, like it or not, are not set in stone – just as pure democracy and pure communism are thoroughly flawed and, ultimately, worse for their countries than any middle ground, any strict set of morals is doomed to fail with time, as is any enterprise that will not adapt, as is any species that will not change to accommodate its environment. Flexibility, rather than stagnation.
Yes, the Internet is amoral – and better for it.
The examples you use to discredit Wikipedia are not representative of its content at large. Why didn’t you pick excerpts from science articles, where the real information is?
Hasn’t it crossed your mind that nobody cares about Bill Gates and Jane Fonda? This is precisely why Wikipedia plays a role: it skims real information from garbage.
This article is not honest. It wouldn’t be accepted in Wikipedia…
Nicholas, this is what modernism is all about: structural differentiation and cultural generalization. Sociologists have been writing about this for decades; it just appears as though the Internet is accelerating this evolution. It is not the cause, it is the accelerator.
Collective intelligence of the kind found in the “online world” can only be of average quality. The concept of mass collaboration might seem alluring, but the fact is that “too many cooks spoil the broth”. The truth about something – anything – does not depend on how many minds collaborated and concurred; it comes down to a single mind (others copy and re-hash).
Great article: in-depth, well-written, and accurate. I have also long been a “Web 2.0” skeptic, for the same reasons you enumerated.
A lot of people have a tendency to venerate the “hip” new technology, the emerging cyberworld, et cetera. They fail to give any reason why these things are so world-changingly novel. Consider what Google (to use one example) actually does: one who knows anything about the PageRank algorithm will understand that it merely bolsters the status quo. Which only means that the “new media” of the blogosphere, if mediated by Google or like technology, will eventually crystallize into the old. Unfortunately, the merits of “old media” have not been replicated adequately by Google, Wikipedia, and the blogosphere.
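(To make that “status quo” point concrete, here is a minimal, hypothetical sketch of the basic PageRank power iteration – the function and the toy link graph are illustrative assumptions, not Google’s actual code. Rank flows along existing links, so pages that are already widely linked simply accumulate more of it.)

```python
# A minimal PageRank power iteration over a toy link graph (illustrative only).
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline; the rest is redistributed along links.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Three obscure pages all link to the already-popular page "a"; after iteration,
# "a" holds most of the rank -- popularity compounds.
print(pagerank({"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a"]}))
```

Run on that toy graph, the already-popular page ends up with roughly two-thirds of the total rank, which is the rich-get-richer dynamic at issue.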
On the latter, I don’t buy for a second that the blogosphere is nearly as significant as people make it out to be. The blog is a medium of communication – a weak, sometimes useful one – but not a social revolution. Within five years, I suspect that this concept of the “blog”, stripped of novelty, will lose its “hipness” and the genre will refold itself into the blander and more inclusive “webpage” category.
I always find myself amazed by the techno-enthusiasts these days: they seem to predict virtual miracles without paying any thought to limitations of resources and feasibility. Who will pay for the “Over-mind”, who will administer it, and how do we know that we can trust them? (Anyone familiar with Wikipedia knows that its administrators are often poor-to-mediocre contributors with long editing histories, people who got in on the gig first and played nice.) If Wikipedia foreshadows an emerging social reality, the future doesn’t look good.
I’d say, overall, that one of Wikipedia’s greatest flaws is its perpetual adolescence. Many of the articles are in a neverending state of rough-draft truthiness; work that would be dismissed as garbage at Britannica is taken (unless the editor is disliked by others who frequent said article, in which case he is reverted without thought given to his work) as a valuable contribution – a start, a partial solution. This makes the production process more rapid – more people can contribute – but also leaves a lot of crappy partial solutions (half-started articles, incoherent writing) lying around.
More to the point, Wikipedia fails because it has no successful mechanism for discriminating between what is good and what is bad. It merely throws all of it up there, leaving the reader to decide. This system melts down when good and bad come into direct conflict, and the dispute is resolved via “community consensus” – that is, whoever has a longer edit history wins.
An Internet and two “Webs” later, we’re still the same humanity and not nearly competent to administer what some expect from mass communication.
http://www.corante.com/many/archives/2005/10/20/nick_carrs_amorality.php#59480
Submitted for anyone’s perusal, is the above link. A link that grazes the border of the tiffany twisted spaces of what I’d like to call The Corante Zone.
But seriously, what in the blazes is everyone getting their panties into a bunch over? It’s, it’s not Oz, people! It’s not like we finally found a place where we can dump our persona and not have to worry about the person. Geeze, someone makes a reference in a black book with the letters B I B L E about “the flesh being weak”, and everyone has to run around like chickens with their brains severed trying to find a place to immortalize themselves.
Maybe that’s what it really boils down to. All the Boomers who brought us Web 1.0 and Dot-Bomb (thanks folks, no really, you didn’t have to, and here I didn’t get you anything) are getting older and they’re wondering, “OK, how do I leave my mark, what can I pass on to those who come after me?” Sorry, even that was a little too soppy sweet for me. If this was their REAL motivator, to make things “just work” (sorry SJ, but I hope you don’t mind my borrowing that), they wouldn’t be running around all over THE valley and the nooks and craggy crannies of Seattle, trying to find angels that peddle in greenbacks instead of sounding off just like the “Barons of the media” have.
Such smart people, such pomposity (did I spell that correctly? Hmmm.). Keep hammering away, Nicholas.
Then you have a very poor imagination; and in my opinion the above is an embarrassment to Web 2.0. But as you’ve ably noted, it is hardly the only one… Last time I checked, traditional and new media are composed of people. Quality and bias will certainly vary.
Nicolas, new technologies are immoral, from some point of view…
Nicholas Carr is Having a Go at Web2.0
by: Yann Gourvennec. You already knew about Nicholas Carr’s famous article ‘IT Doesn’t Matter’, which we have already commented on at Visionarymarketing.com. Carr made his point very clear about what he thought about how strategic IT was …
This is an interesting argument and one I have not given enough thought to in the past. However, to regard independent amateur media on the web as inaccurate and ignore the corporate/political bias of commercial media is just ignorant. As someone who works in the media, I have been exposed to exactly how full of shit commercial media can be at its worst.
what a load of wank