The amorality of Web 2.0

This post, along with seventy-eight others, is collected in the book Utopia Is Creepy.

From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the physical world, the Web presents a readymade Promised Land. On the Internet, we’re all bodiless, symbols speaking to symbols in symbols. The early texts of Web metaphysics, many written by thinkers associated with or influenced by the post-60s New Age movement, are rich with a sense of impending spiritual release; they describe the passage into the cyber world as a process of personal and communal unshackling, a journey that frees us from traditional constraints on our intelligence, our communities, our meager physical selves. We become free-floating netizens in a more enlightened, almost angelic, realm.

But as the Web matured during the late 1990s, the dreams of a digital awakening went unfulfilled. The Net turned out to be more about commerce than consciousness, more a mall than a commune. And when the new millennium arrived, it brought not a new age but a dispiritingly commonplace popping of a bubble of earthly greed. Somewhere along the way, the moneychangers had taken over the temple. The Internet had transformed many things, but it had not transformed us. We were the same as ever.

The New New Age

But the yearning for a higher consciousness didn’t burst with the bubble. Web 1.0 may have turned out to be spiritual vaporware, but now we have the hyper-hyped upgrade: Web 2.0. In a profile of Internet savant Tim O’Reilly in the current issue of Wired, Steven Levy writes that “the idea of collective consciousness is becoming manifest in the Internet.” He quotes O’Reilly: “The Internet today is so much an echo of what we were talking about at [New Age HQ] Esalen in the ’70s – except we didn’t know it would be technology-mediated.” Levy then asks, “Could it be that the Internet – or what O’Reilly calls Web 2.0 – is really the successor to the human potential movement?”

Levy’s article appears in the afterglow of Kevin Kelly’s sweeping “We Are the Web” in Wired’s August issue. Kelly, erstwhile prophet of the Long Boom, surveys the development of the World Wide Web since the Netscape IPO ten years ago and concludes that it has become a “magic window” that provides a “spookily godlike” perspective on existence. “I doubt angels have a better view of humanity,” he writes.

But that’s only the beginning. In the future, according to Kelly, the Web will grant us not only the vision of gods but also their power. The Web is becoming “the OS for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies but our minds … We will live inside this thing.”

The revelation continues:

There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

You and I are alive at this moment.

We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.

Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.

This isn’t the language of exposition. It’s the language of rapture.

The Cult of the Amateur

Now, lest you dismiss me as a mere cynic, if not a fallen angel, let me make clear that I’m all for seeking transcendence, whether it’s by going to church or living in a hut in the woods or sitting at the feet of the Maharishi or gazing into the glittering pixels of an LCD screen. One gathers one’s manna where one finds it. And if there’s a higher consciousness to be found, then by all means let’s get elevated. My problem is this: When we view the Web in religious terms, when we imbue it with our personal yearning for transcendence, we can no longer see it objectively. By necessity, we have to look at the Internet as a moral force, not as a simple collection of inanimate hardware and software. No decent person wants to worship an amoral conglomeration of technology.

And so all the things that Web 2.0 represents – participation, collectivism, virtual communities, amateurism – become unarguably good things, things to be nurtured and applauded, emblems of progress toward a more enlightened state. But is it really so? Is there a counterargument to be made? Might, on balance, the practical effect of Web 2.0 on society and culture be bad, not good? To see Web 2.0 as a moral force is to turn a deaf ear to such questions.

Let me bring the discussion down to a brass tack. If you read anything about Web 2.0, you’ll inevitably find praise heaped upon Wikipedia as a glorious manifestation of “the age of participation.” Wikipedia is an open-source encyclopedia; anyone who wants to contribute can add an entry or edit an existing one. O’Reilly, in a new essay on Web 2.0, says that Wikipedia marks “a profound change in the dynamics of content creation” – a leap beyond the Web 1.0 model of Britannica Online. To Kevin Kelly, Wikipedia shows how the Web is allowing us to pool our individual brains into a great collective mind. It’s a harbinger of the Machine.

In theory, Wikipedia is a beautiful thing – it has to be a beautiful thing if the Web is leading us to a higher consciousness. In reality, though, Wikipedia isn’t very good at all. Certainly, it’s useful – I regularly consult it to get a quick gloss on a subject. But at a factual level it’s unreliable, and the writing is often appalling. I wouldn’t depend on it as a source, and I certainly wouldn’t recommend it to a student writing a research paper.

Take, for instance, this section from Wikipedia’s biography of Bill Gates, excerpted verbatim:

Gates married Melinda French on January 1, 1994. They have three children, Jennifer Katharine Gates (born April 26, 1996), Rory John Gates (born May 23, 1999) and Phoebe Adele Gates (born September 14, 2002).

In 1994, Gates acquired the Codex Leicester, a collection of writings by Leonardo da Vinci; as of 2003 it was on display at the Seattle Art Museum.

In 1997, Gates was the victim of a bizarre extortion plot by Chicago resident Adam Quinn Pletcher. Gates testified at the subsequent trial. Pletcher was convicted and sentenced in July 1998 to six years in prison. In February 1998 Gates was attacked by Noël Godin with a cream pie. In July 2005, he solicited the services of famed lawyer Hesham Foda.

According to Forbes, Gates contributed money to the 2004 presidential campaign of George W. Bush. According to the Center for Responsive Politics, Gates is cited as having contributed at least $33,335 to over 50 political campaigns during the 2004 election cycle.

Excuse me for stating the obvious, but this is garbage, an incoherent hodge-podge of dubious factoids (who the heck is “famed lawyer Hesham Foda”?) that adds up to something far less than the sum of its parts.

Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim:

Her nickname as a youth—Lady Jane—was one she reportedly disliked. She traveled to Communist Russia in 1964 and was impressed by the people, who welcomed her warmly as Henry’s daughter. In the mid-1960s she bought a farm outside of Paris, had it renovated and personally started a garden. She visited Andy Warhol’s Factory in 1966. About her 1971 Oscar win, her father Henry said: “How in hell would you like to have been in this business as long as I and have one of your kids win an Oscar before you do?” Jane was on the cover of Life magazine, March 29, 1968.

While early she had grown both distant from and critical of her father for much of her young life, in 1980, she bought the play “On Golden Pond” for the purpose of acting alongside her father—hoping he might win the Oscar that had eluded him throughout his career. He won, and when she accepted the Oscar on his behalf, she said it was “the happiest night of my life.” Director and first husband Roger Vadim once said about her: “Living with Jane was difficult in the beginning … she had so many, how do you say, ‘bachelor habits.’ Too much organization. Time is her enemy. She cannot relax. Always there is something to do.” Vadim also said, “There is also in Jane a basic wish to carry things to the limit.”

This is worse than bad, and it is, unfortunately, representative of the slipshod quality of much of Wikipedia. Remember, this emanation of collective intelligence is not just a couple of months old. It’s been around for nearly five years and has been worked over by many thousands of diligent contributors. At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. When will the great Wikipedia get good? Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?

The promoters of Web 2.0 venerate the amateur and distrust the professional. We see it in their unalloyed praise of Wikipedia, and we see it in their worship of open-source software and myriad other examples of democratic creativity. Perhaps nowhere, though, is their love of amateurism so apparent as in their promotion of blogging as an alternative to what they call “the mainstream media.” Here’s O’Reilly: “While mainstream media may see individual blogs as competitors, what is really unnerving is that the competition is with the blogosphere as a whole. This is not just a competition between sites, but a competition between business models. The world of Web 2.0 is also the world of what Dan Gillmor calls ‘we, the media,’ a world in which ‘the former audience,’ not a few people in a back room, decides what’s important.”

I’m all for blogs and blogging. (I’m writing this, ain’t I?) But I’m not blind to the limitations and the flaws of the blogosphere – its superficiality, its emphasis on opinion over reporting, its echolalia, its tendency to reinforce rather than challenge ideological extremism and segregation. Now, all the same criticisms can (and should) be hurled at segments of the mainstream media. And yet, at its best, the mainstream media is able to do things that are different from – and, yes, more important than – what bloggers can do. Those despised “people in a back room” can fund in-depth reporting and research. They can underwrite projects that can take months or years to reach fruition – or that may fail altogether. They can hire and pay talented people who would not be able to survive as sole proprietors on the Internet. They can employ editors and proofreaders and other unsung protectors of quality work. They can place, with equal weight, opposing ideologies on the same page. Forced to choose between reading blogs and subscribing to, say, the New York Times, the Financial Times, the Atlantic, and the Economist, I will choose the latter. I will take the professionals over the amateurs.

But I don’t want to be forced to make that choice.

Scary Economics

And so, having gone on for so long, I at long last come to my point. The Internet is changing the economics of creative work – or, to put it more broadly, the economics of culture – and it’s doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it’s created by amateurs rather than professionals, it’s free. And free trumps quality all the time. So what happens to those poor saps who write encyclopedias for a living? They wither and die. The same thing happens when blogs and other free on-line content go up against old-fashioned newspapers and magazines. Of course the mainstream media sees the blogosphere as a competitor. It is a competitor. And, given the economics of the competition, it may well turn out to be a superior competitor. The layoffs we’ve recently seen at major newspapers may just be the beginning, and those layoffs should be cause not for self-satisfied snickering but for despair. Implicit in the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for one can’t imagine anything more frightening.

In “We Are the Web,” Kelly writes that “because of the ease of creation and dissemination, online culture is the culture.” I hope he’s wrong, but I fear he’s right – or will come to be right.

Like it or not, Web 2.0, like Web 1.0, is amoral. It’s a set of technologies – a machine, not a Machine – that alters the forms and economics of production and consumption. It doesn’t care whether its consequences are good or bad. It doesn’t care whether it brings us to a higher consciousness or a lower one. It doesn’t care whether it burnishes our culture or dulls it. It doesn’t care whether it leads us into a golden age or a dark one. So let’s can the millennialist rhetoric and see the thing for what it is, not what we wish it would be.

193 thoughts on “The amorality of Web 2.0”

  1. Gruber

    Why Wikipedia is not Web 2.0 and salvation might come even if it is amoral

    The argument, however nicely put, seems to twist things in order to give room to some ideas of yours, not to rebuff the apostles of an electronic collective consciousness.

    Your argument seems to go like this: People believe in an eventual transformation of the current information networks (by new organization principles) into something like a collective mind, which works by making individual (amateurish) contribution part of a greater mental structure (unlike classical culture, which apparently is about singular performances). The believers in that electronic transcendence have to subscribe to a view which makes the bits and pieces of the new system into a moral, normative agglomerate, and thus good. The epitome of that vision is Wikipedia. Wikipedia is not good, because it is unprofessional. Thus, the transcendence does not work and the believers are rebutted. (Forgive me my inaccuracies.)

    What is wrong with this argument of yours?

    1. People speculating about collective consciousness are a small minority among the web visionaries, and they are not generally identical with wikipedians.

    2. A collective mind vision does not necessarily entail a moral view. Even if it did, it would not mean that the individual aspects would have to embody “goodness”. The individual aspects have to realize suitability.

    3. Good in the sense of moral things is different from good in the sense of “professional quality”. In the case of Wikipedia, your claim should not be mixed up with moral goodness, but should apply specifically to suitability for Wikipedia’s purpose (which is being an authoritative reference).

    4. You state that Wikipedia is not professional enough, and what you really mean is that it does not implement the proper principles to serve its purpose (good quality reference). The conclusion from this, however, is not that collaborative intelligence does not work, but that Wikipedia’s constraints do not result in the required qualities.

    5. The question stemming from the argument is not about an antagonism between professional and amateur contribution, but about whether the principles that pertain to professional work might be implemented “outside” the experts within a distributed, self-organizing information structure. This is indeed an interesting question, which might lead us even into a Strong AI debate.

  2. Susan Littlemore

    I wholeheartedly share Nicholas’s scepticism on Wikipedia in particular. The ability of anyone to insert factoids that survive for long enough to be read by the unwary, and the replacement of the traditional ‘getting it right’ principle with ‘not getting it too obviously wrong’ add up to an abandonment of intellectual rigour for the sake of ease: fast food taking over in the intellectual as well as physical realm. For more discussion see http://forum.atimes.com/topic.asp?TOPIC_ID=4083

  3. John Quiggin

    I’m not convinced by the critique of Wikipedia. I don’t go to an encyclopedia for fine writing, and I don’t think that fixing bad writing should be a high priority for a venture that’s still only five years old. At this point, the main priority ought still to be adding more information and ensuring the accuracy of what’s there. It’s my impression that Wikipedia compares pretty well with the competition on both scores.

    A more convincing test would be to pick a sample and show that there were significant errors or omissions (relative to the competition). Of course, these would be fixed quickly but the lesson would stand.

  4. Stuffola

    Web 2.0: The Triumph of Amateur Hour?

    I have never, ever understood the cult of professionalism adhered to by some journalists (bloggers have no journalism degrees, bloggers bad amateurs, bloggers threaten professionals, so must be crushed). Nor have I been on the bandwagon to eviscerate th…

  5. Sean Gephardt

    Fantastic post! Some of the brains behind “Web 2.0” are really into the whole celebrity/rock star mentality, desperately seeking 15 seconds of fame, usually for “re-creating the wheel” (RSS vs. XHTML). If the internet & computers are supposed to be making my life easier, how come I worship “the glittering pixels of an LCD screen” for endless hours a day?

  6. umair

    Hi,

    Your argument doesn’t hold because your premises are contradictory.

    Premise 1) Peer communities produce low ‘quality’ (aka value) goods.

    Premise 2) These goods are substitutes for traditional goods in the same market.

    Conclusion) Demand shifts inwards (“scary economics”).

    If peer communities produce only low ‘quality’ (aka value) goods, demand will not shift inwards, because these goods are not substitutes for (high value) traditionally produced goods –

    unless we invoke some kind of unrealistic deus ex machina, like huge elasticity.

    In fact, the only way your conclusion holds is to *invert* your first premise: if peer communities, in fact, produce high value goods.

    These are then substitutes for traditional goods, and demand shifts inwards without us having to resort to anything else.

    You can’t have it both ways… :)
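
    To make this concrete, here is a toy sketch of the substitution logic – a hypothetical linear demand model with made-up numbers, not anything taken from the post itself. The free, peer-produced good pulls demand away from the paid, professionally produced good only in proportion to how good a substitute it actually is.

    # Hypothetical illustration only: linear demand for the paid good, where the
    # free peer-produced good draws demand away in proportion to its substitutability.
    def demand_for_paid_good(price, substitutability,
                             baseline=100.0, price_sensitivity=2.0,
                             max_pull_of_free_good=80.0):
        # substitutability: 0.0 = the free good is no substitute, 1.0 = a perfect substitute
        pull = max_pull_of_free_good * substitutability
        return max(baseline - price_sensitivity * price - pull, 0.0)

    if __name__ == "__main__":
        for s in (0.1, 0.5, 0.9):  # low-, mid-, high-quality peer substitute
            print(f"substitutability={s:.1f} -> demand={demand_for_paid_good(10.0, s):.1f}")

    With these assumed numbers, a low-quality substitute (0.1) barely dents demand for the paid good, while a high-quality one (0.9) collapses it – which is why the “scary economics” conclusion requires the peer-produced good to be treated as good enough by its audience.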

  7. if:book

    nicholas carr on “the amorality of web 2.0”

    This has caused a slight stir, and is refreshing insofar as it questions the giddy optimism surrounding web 2.0. Nicholas Carr writes about business and technology and was an editor of the Harvard Business Review. He published this on…

  8. Nick

    First, thanks to everyone who has taken the time to comment on my post. Here are some brief comments on your comments:

    John Gauntt and Wayne: I, too, prefer my technology to be amoral, and the web’s “color-blindness” is indeed a great strength.

    Venkat: I’m not defending the shortcomings of traditional media, which are many and growing. My question is whether the economic pressures caused by the web will in the end make those shortcomings even worse.

    David Gerard, JS, Ian Woollard, John Quiggin, Judson Dunn: Wikipedia can choose to judge itself by whatever criteria it likes, but it promotes itself as “the free encyclopedia.” Therefore, it’s only fair that others judge it by the standards of reliability we would expect from other encyclopedias. Do you really think that most of the millions of people who consult Wikipedia (including many young students) make the effort to look inside “the sausage factory” to see how it works and what its inherent flaws are? Of course they don’t. They use it like they’d use any other encyclopedia, unaware that at any given moment any given entry can include factual errors, omissions and distortions. What the Wikipedia community should do is put a warning notice on the top of every page: “WARNING: This page may include factual errors.” Why don’t you include such a warning where real users would see it?

    Neil K.: Yes, that’s a good question. I wish we could (in your terms) empower the esoteric without destroying the authoritative. My fear is that we can’t.

    Kevin Kelly: Thanks. I look forward to your book.

    Anonymous: You write: “the town’s library has been shut down and I don’t feel like spending $49.95 [on on-line Encyclopedia Britannica] to find out [if it even has entries on Bill Gates and Jane Fonda]. Seeing how the on-line edition only has 73,000 articles in it, I’m doubtful.” The fact that the on-line edition of EB is of poorer quality than the print edition – and that your local library has closed down – underscores my fear about how the economics of the Web may weaken the general culture rather than strengthen it.

    Gruber: Good points. I do think that collective effort can produce excellent results in some circumstances and mediocre results in others, and that the difference can often be traced to constraints on collectivism. More precisely, collectivism works best when there’s some form of hierarchical control over the end product (as in Linux) and works less well in the absence of such control (as in Wikipedia). “Collective intelligence” is a misnomer, in other words; much of the intelligence ultimately depends not on collectivism but on having smart people at the center. Pure democracies produce crappy results.

    umair: Thanks for raising these issues. But you’re starting from a faulty assumption when you posit “quality” and “value” as being synonymous. They’re not.

  9. bob stein

    frankly, when you peel away the sensitive “gee i’m just concerned about our future” strokes, all you’ve got here is one more apologist for the status quo and his job in particular.

    particularly grating is the dishonesty of the piece which introduces a quote from the Wikipedia: “Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim: . . .” well i went to the Wikipedia and it turns out the quote is from a section entitled “early years” not Jane Fonda’s Life. the actual Wikipedia entry on Fonda is quite extensive far outstripping the pathetic Britannica entry which i quote in full:

    “Jane Seymour Fonda American motion-picture actress who was also noted for her political activism.

    The daughter of actor Henry Fonda, she left Vassar College after two years and lived in New York City. She studied acting under Lee Strasberg at the Actors Studio there in 1958 and worked as a model. Her acting career began with appearances in the Broadway play There Was a Little Girl (1960) and the motion picture Tall Story (1960), and she went on to appear in comic roles in numerous films in the 1960s, including Cat Ballou (1965) and Barefoot In the Park (1967). Her subsequent, more substantial roles were in such socially conscious films as They Shoot Horses, Don’t They? (1969), Klute (1971), Coming Home (1978), and The China Syndrome (1979). She received Academy Awards for best actress for her performances in Klute and Coming Home. She costarred with her father in the film On Golden Pond (1981).

    In the 1970s and ’80s Fonda was active on behalf of left-wing political causes. She was an outspoken opponent of the Vietnam War who journeyed to Hanoi in 1972 to denounce the U.S. bombing campaigns there. In the 1980s she devised a popular exercise program for women while continuing to appear in motion pictures. She was married three times, to the French film director Roger Vadim, to the American politician Tom Hayden, and to the American broadcasting entrepreneur Ted Turner.”

    be honest, in this case, which would you rather have, “the amateur” or “the professional” version?

    i don’t mean to mindlessly promote the wonders of collaborative effort but i don’t think it helps to make such a lame case for professionalism.

    one more point . . . i’m not a big fan of kelly’s rapturous presentation, but his basic point, that we’re inventing the future and we really should do as good a job as possible, seems spot-on to me.

  10. J. LeRoy

    Publishing to the People (Samizdata)

    Ben at if:book posted inviting discussion around some issues in a long post by Nicholas Carr. Ben’s focus was on Carr’s view of the web as a competitor to traditional media. The issue he centers on is, essentially, will Web

  11. Anonymous

    Nick, I think this is a great post and it is always good to hear criticism. Don’t mind what the “Web 2.0” fans say out there, keep posting your opinion.

    Personally I do think the concept of “Web 2.0” has grown into a sort of fanaticism which reminds me of 98/99 (not to say com…sm or mar..st). I am really looking forward to a good study on the “business models” of “Web 2.0”; hopefully the guys from MIT (earlier post) will have something soon for the rest of the world. Up to now there seem to be two business models, “sell to another company” (possibly to one that thinks it is way behind on the wave) and advertising – and we already know how that roller coaster ride goes…..

    Just for quality comparison, here is the Bill Gates entry on Encarta:

    Gates, William Henry, III, born in 1955, American business executive, who serves as chairman and chief software architect of Microsoft Corporation, the leading computer software company in the United States. Gates cofounded Microsoft in 1975 with high school friend Paul Allen. The company’s success made Gates one of the most influential figures in the computer industry and, eventually, one of the richest people in the world.

    (Microsoft is the publisher of Encarta Encyclopedia.)

    Born in Seattle, Washington, Gates attended public school through the sixth grade. In the seventh grade he entered Seattle’s exclusive Lakeside School, where he met Allen. Gates was first introduced to computers and programming languages in 1968, when he was in the eighth grade. That year Lakeside bought a teletype machine that connected to a mainframe computer over phone lines. At the time, the school was one of the few that provided students with access to a computer.

    Soon afterward, Gates, Allen, and other students convinced a local computer company to give them free access to its PDP-10, a new minicomputer made by Digital Equipment Corporation. In exchange for the computer time, the students tried to find flaws in the system. Gates spent much of his free time on the PDP-10 learning programming languages such as BASIC, Fortran, and LISP. In 1972 Gates and Allen founded Traf-O-Data, a company that designed and built computerized car-counting machines for traffic analysis. The project introduced them to the programmable 8008 microprocessor from Intel Corporation.

    While attending Harvard University in Cambridge, Massachusetts, in 1975, Gates teamed with Allen to develop a version of the BASIC programming language for the Altair 8800, the first personal computer. They licensed the software to the manufacturer of the Altair, Micro Instrumentation and Telemetry Systems (MITS), and formed Microsoft (originally Micro-soft) to develop versions of BASIC for other computer companies. Gates decided to drop out of Harvard in his junior year to devote his time to Microsoft. In 1980 Microsoft closed a pivotal deal with International Business Machines Corporation (IBM) to provide the operating system for the IBM PC personal computer. As part of the deal, Microsoft retained the right to license the operating system to other companies. The success of the IBM PC made the operating system, MS-DOS, an industry standard. Microsoft’s revenues skyrocketed as other computer makers licensed MS-DOS and demand for personal computers surged. In 1986 Microsoft offered its stock to the public; by 1987 rapid appreciation of the stock had made Gates, 31, the youngest ever self-made billionaire. In the 1990s, as Microsoft’s Windows operating system and Office application software achieved worldwide market dominance, Gates amassed a fortune worth tens of billions of dollars. Alongside his successes, however, Gates was accused of using his company’s power to stifle competition. In 2000 a federal judge found Microsoft guilty of violating antitrust laws and ordered it split into two companies. An appeals court overturned the breakup order in 2001 but upheld the judge’s ruling that Microsoft had abused its power to protect its Windows monopoly. In November 2001 Microsoft reached a settlement with the U.S. Justice Department and nine states, and a year later, the settlement was upheld by a federal district court judge. (For more information on the history of Microsoft, see Microsoft Corporation.)

    Gates has made personal investments in other high-technology companies. He sits on the board of one biotechnology company and has invested in a number of others. In 1989 he founded Corbis Corporation, which now owns the largest collection of digital images in the world.

    In the late 1990s Gates became more involved in philanthropy. With his wife he established the Bill & Melinda Gates Foundation, which, ranked by assets, quickly became one of the largest foundations in the world. Gates has also authored two books: The Road Ahead (1995; revised, 1996), which details his vision of technology’s role in society, and Business @ the Speed of Thought (1999), which discusses the role technology can play in running a business.

    In 1998 Gates appointed an executive vice president of Microsoft, Steve Ballmer, to the position of president, but Gates continued to serve as Microsoft’s chairman and chief executive officer (CEO). In 2000 Gates transferred the title of CEO to Ballmer. While remaining chairman, Gates also took on the title of chief software architect to focus on the development of new products and technologies.

  12. Steve Button

    When will the great Wikipedia get good?

    OK Here’s a thought. Why not have ratings on the articles (a bit like Amazon) and allow people to comment on the articles in Wikipedia. Then you could have a starting point for which articles need improvement.

    You could also use a “Was this review helpful to you” button, to weed out bad reviewers (or to give more attention to useful reviewers).

    Perhaps they do this already, but I haven’t seen it yet? I’ve used WP dozens of times in the past, so if it is there and I haven’t noticed it, then it needs to be more obvious IMHO.

    Steve Button

  13. s h a u n l o g

    Amorality, Egalitarianism, and the Bazaar

    Courtesy of The Register, I came across Nicholas Carr’s blog today. And more specifically this post.

    He’s got some great ideas which I’m not going to attempt to summarise in any great detail because he says it much better than I coul…

  14. Anonymous

    I am not sure ratings would solve the problem of Wikipedia. There are two main reasons for that:

    – the reader might not know the entry contains inaccurate data (and therefore cannot rate it)

    – most readers don’t take the time to rate features, products.

    I believe they tried that in search engines and it did not work out well (and offering money made it worse). Check out the “Dr. Daniel E. Rose” webcast at Berkeley; he talks about it at some point. http://webcast.berkeley.edu/courses/archive.php?seriesid=1906978252

  15. blog.thiesen.org

    Joining The Cult of the Amateur

    Nicholas Carr has written a quite interesting article on Web 2.0 and what he believes to be the Cult of the Amateur. Basically, he means that because things like Wikipedia or OpenSource (his words, not mine) are free, they will always be used and re…

  16. choi li akiro singh santos

    Finally some Sanity amid the Hype:

    Great comments. I too have used Wikipedia often and have found the writing to be uneven at best. Nonetheless, I find that it is a good starting point: the collection of EXTERNAL LINKS for a given topic is usually better than what you can get from a Search Engine.

    The triumph of Amateurs at the expense of Professionals is scary. Might it be possible that we may not have a zero-sum game here? The bloggers have been able to correct mistakes that have eluded the “editors” of the NYTimes, the Economist, etc. Without professional publications, I doubt if bloggers will have that much to write about: I suspect that the practice of linking to original content from other bloggers will not be sufficient. Remember, most blog discussions start from an article in a professional publication.

    What is most interesting is that the so-called Web 2.0 companies have no Business Model. The most famous example to date is Flickr, a great site that Yahoo! purchased. Could it have survived on its own? I’m not sure about that. Dave Winer recently noted:

    “I wasn’t at Web 2.0 last week, but I know some of the jargon that developed there. People were walking around saying this is the Flickr of that, and that is the Flickr of this.”

    Let the hype continue ….

  17. Frapazoid

    The more things change, the more they stay the same.

    You know, there used to be this huge network of intelligence and communication where anyone could post their ideas and thoughts.

    Yeah, it was called the “internet” last time too.

    Web 2.0 isn’t some super machine or anything.

    The term “Web 2.0” doesn’t even mean anything at all! This is just a fad term to lure venture capitalists who aren’t doing their research.

    Every time I see someone talking about “Web 2.0”, I have to ask, HAS THE ENTIRE WORLD GONE MAD?!

    There is nothing new about the web! It has been gradually developing for the past decade. There is no such thing as “Web 2.0”.

  18. SiliconBeat

    The Web 2.0 open/closed debate explodes

    A simmering debate about how much companies like Google and Oodle are allowed to duplicate from other sources has exploded into full force. We woke up this morning to an emotional torrent on multiple sides. The significance of this stems beyond Google …

  19. cyou

    The internet is a huge garbage heap of formatted text and other file formats. This is not something new. You may read a few books by Clifford Stoll. Wikipedia is just another piece of crap.

    Why? Do they pay people? Do they have any research experience?

    Blogs are the amateur form of a newspaper article.

  20. Sandy Borthick

    If you’ve got the attention span to read a whole book about the real world (not more specious bloviating about the “blogosphere”), I suggest ‘The World is Flat’ by Thomas Friedman. Web 2.0 may end up being Bubble 2.0, but only if everyone piles on and pays too much in the portal consolidation getting underway. In the longer run, say 3-5 years out, I doubt that amateur blogging (this decade’s CB radios) will satisfy many people’s desire for concise coverage, informed commentary and/or authoritative depth reporting — any more than one-off hacks and ‘perpetual betas’ (one of the Web 2.0 articles of faith) will successfully address corporate requirements for stable enterprise-level software.

    p.s. Nick — Just think of Web 2.0 as a style thang, and you won’t be so bummed out about it! That reminds me of another topical read: ‘Boomeritis’ by Ken Wilber.

  21. The Ant Nest

    Technology and Morality

    A fascinating essay about the Internet’s Second Coming and its spiritual, ethical and cultural consequences. Something of a counterpoint to the prevailing viewpoint and very shrewd in parts.
