The amorality of Web 2.0

This post, along with seventy-eight others, is collected in the book Utopia Is Creepy.

From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the physical world, the Web presents a readymade Promised Land. On the Internet, we’re all bodiless, symbols speaking to symbols in symbols. The early texts of Web metaphysics, many written by thinkers associated with or influenced by the post-60s New Age movement, are rich with a sense of impending spiritual release; they describe the passage into the cyber world as a process of personal and communal unshackling, a journey that frees us from traditional constraints on our intelligence, our communities, our meager physical selves. We become free-floating netizens in a more enlightened, almost angelic, realm.

But as the Web matured during the late 1990s, the dreams of a digital awakening went unfulfilled. The Net turned out to be more about commerce than consciousness, more a mall than a commune. And when the new millennium arrived, it brought not a new age but a dispiritingly commonplace popping of a bubble of earthly greed. Somewhere along the way, the moneychangers had taken over the temple. The Internet had transformed many things, but it had not transformed us. We were the same as ever.

The New New Age

But the yearning for a higher consciousness didn’t burst with the bubble. Web 1.0 may have turned out to be spiritual vaporware, but now we have the hyper-hyped upgrade: Web 2.0. In a profile of Internet savant Tim O’Reilly in the current issue of Wired, Steven Levy writes that “the idea of collective consciousness is becoming manifest in the Internet.” He quotes O’Reilly: “The Internet today is so much an echo of what we were talking about at [New Age HQ] Esalen in the ’70s – except we didn’t know it would be technology-mediated.” Levy then asks, “Could it be that the Internet – or what O’Reilly calls Web 2.0 – is really the successor to the human potential movement?”

Levy’s article appears in the afterglow of Kevin Kelly’s sweeping “We Are the Web” in Wired’s August issue. Kelly, erstwhile prophet of the Long Boom, surveys the development of the World Wide Web, from the Netscape IPO ten years ago, and concludes that it has become a “magic window” that provides a “spookily godlike” perspective on existence. “I doubt angels have a better view of humanity,” he writes.

But that’s only the beginning. In the future, according to Kelly, the Web will grant us not only the vision of gods but also their power. The Web is becoming “the OS for a megacomputer that encompasses the Internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network. This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies but our minds … We will live inside this thing.”

The revelation continues:

There is only one time in the history of each planet when its inhabitants first wire up its innumerable parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

You and I are alive at this moment.

We should marvel, but people alive at such times usually don’t. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilization.

Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era. In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognized as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network. From this embryonic neural net was born a collaborative interface for our civilization, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.

This isn’t the language of exposition. It’s the language of rapture.

The Cult of the Amateur

Now, lest you dismiss me as a mere cynic, if not a fallen angel, let me make clear that I’m all for seeking transcendence, whether it’s by going to church or living in a hut in the woods or sitting at the feet of the Maharishi or gazing into the glittering pixels of an LCD screen. One gathers one’s manna where one finds it. And if there’s a higher consciousness to be found, then by all means let’s get elevated. My problem is this: When we view the Web in religious terms, when we imbue it with our personal yearning for transcendence, we can no longer see it objectively. By necessity, we have to look at the Internet as a moral force, not as a simple collection of inanimate hardware and software. No decent person wants to worship an amoral conglomeration of technology.

And so all the things that Web 2.0 represents – participation, collectivism, virtual communities, amateurism – become unarguably good things, things to be nurtured and applauded, emblems of progress toward a more enlightened state. But is it really so? Is there a counterargument to be made? Might, on balance, the practical effect of Web 2.0 on society and culture be bad, not good? To see Web 2.0 as a moral force is to turn a deaf ear to such questions.

Let me bring the discussion down to a brass tack. If you read anything about Web 2.0, you’ll inevitably find praise heaped upon Wikipedia as a glorious manifestation of “the age of participation.” Wikipedia is an open-source encyclopedia; anyone who wants to contribute can add an entry or edit an existing one. O’Reilly, in a new essay on Web 2.0, says that Wikipedia marks “a profound change in the dynamics of content creation” – a leap beyond the Web 1.0 model of Britannica Online. To Kevin Kelly, Wikipedia shows how the Web is allowing us to pool our individual brains into a great collective mind. It’s a harbinger of the Machine.

In theory, Wikipedia is a beautiful thing – it has to be a beautiful thing if the Web is leading us to a higher consciousness. In reality, though, Wikipedia isn’t very good at all. Certainly, it’s useful – I regularly consult it to get a quick gloss on a subject. But at a factual level it’s unreliable, and the writing is often appalling. I wouldn’t depend on it as a source, and I certainly wouldn’t recommend it to a student writing a research paper.

Take, for instance, this section from Wikipedia’s biography of Bill Gates, excerpted verbatim:

Gates married Melinda French on January 1, 1994. They have three children, Jennifer Katharine Gates (born April 26, 1996), Rory John Gates (born May 23, 1999) and Phoebe Adele Gates (born September 14, 2002).

In 1994, Gates acquired the Codex Leicester, a collection of writings by Leonardo da Vinci; as of 2003 it was on display at the Seattle Art Museum.

In 1997, Gates was the victim of a bizarre extortion plot by Chicago resident Adam Quinn Pletcher. Gates testified at the subsequent trial. Pletcher was convicted and sentenced in July 1998 to six years in prison. In February 1998 Gates was attacked by Noël Godin with a cream pie. In July 2005, he solicited the services of famed lawyer Hesham Foda.

According to Forbes, Gates contributed money to the 2004 presidential campaign of George W. Bush. According to the Center for Responsive Politics, Gates is cited as having contributed at least $33,335 to over 50 political campaigns during the 2004 election cycle.

Excuse me for stating the obvious, but this is garbage, an incoherent hodge-podge of dubious factoids (who the heck is “famed lawyer Hesham Foda”?) that adds up to something far less than the sum of its parts.

Here’s Wikipedia on Jane Fonda’s life, again excerpted verbatim:

Her nickname as a youth—Lady Jane—was one she reportedly disliked. She traveled to Communist Russia in 1964 and was impressed by the people, who welcomed her warmly as Henry’s daughter. In the mid-1960s she bought a farm outside of Paris, had it renovated and personally started a garden. She visited Andy Warhol’s Factory in 1966. About her 1971 Oscar win, her father Henry said: “How in hell would you like to have been in this business as long as I and have one of your kids win an Oscar before you do?” Jane was on the cover of Life magazine, March 29, 1968.

While early she had grown both distant from and critical of her father for much of her young life, in 1980, she bought the play “On Golden Pond” for the purpose of acting alongside her father—hoping he might win the Oscar that had eluded him throughout his career. He won, and when she accepted the Oscar on his behalf, she said it was “the happiest night of my life.” Director and first husband Roger Vadim once said about her: “Living with Jane was difficult in the beginning … she had so many, how do you say, ‘bachelor habits.’ Too much organization. Time is her enemy. She cannot relax. Always there is something to do.” Vadim also said, “There is also in Jane a basic wish to carry things to the limit.”

This is worse than bad, and it is, unfortunately, representative of the slipshod quality of much of Wikipedia. Remember, this emanation of collective intelligence is not just a couple of months old. It’s been around for nearly five years and has been worked over by many thousands of diligent contributors. At this point, it seems fair to ask exactly when the intelligence in “collective intelligence” will begin to manifest itself. When will the great Wikipedia get good? Or is “good” an old-fashioned concept that doesn’t apply to emergent phenomena like communal on-line encyclopedias?

The promoters of Web 2.0 venerate the amateur and distrust the professional. We see it in their unalloyed praise of Wikipedia, and we see it in their worship of open-source software and myriad other examples of democratic creativity. Perhaps nowhere, though, is their love of amateurism so apparent as in their promotion of blogging as an alternative to what they call “the mainstream media.” Here’s O’Reilly: “While mainstream media may see individual blogs as competitors, what is really unnerving is that the competition is with the blogosphere as a whole. This is not just a competition between sites, but a competition between business models. The world of Web 2.0 is also the world of what Dan Gillmor calls ‘we, the media,’ a world in which ‘the former audience,’ not a few people in a back room, decides what’s important.”

I’m all for blogs and blogging. (I’m writing this, ain’t I?) But I’m not blind to the limitations and the flaws of the blogosphere – its superficiality, its emphasis on opinion over reporting, its echolalia, its tendency to reinforce rather than challenge ideological extremism and segregation. Now, all the same criticisms can (and should) be hurled at segments of the mainstream media. And yet, at its best, the mainstream media is able to do things that are different from – and, yes, more important than – what bloggers can do. Those despised “people in a back room” can fund in-depth reporting and research. They can underwrite projects that can take months or years to reach fruition – or that may fail altogether. They can hire and pay talented people who would not be able to survive as sole proprietors on the Internet. They can employ editors and proofreaders and other unsung protectors of quality work. They can place, with equal weight, opposing ideologies on the same page. Forced to choose between reading blogs and subscribing to, say, the New York Times, the Financial Times, the Atlantic, and the Economist, I will choose the latter. I will take the professionals over the amateurs.

But I don’t want to be forced to make that choice.

Scary Economics

And so, having gone on for so long, I at long last come to my point. The Internet is changing the economics of creative work – or, to put it more broadly, the economics of culture – and it’s doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it’s created by amateurs rather than professionals, it’s free. And free trumps quality all the time. So what happens to those poor saps who write encyclopedias for a living? They wither and die. The same thing happens when blogs and other free on-line content go up against old-fashioned newspapers and magazines. Of course the mainstream media sees the blogosphere as a competitor. It is a competitor. And, given the economics of the competition, it may well turn out to be a superior competitor. The layoffs we’ve recently seen at major newspapers may just be the beginning, and those layoffs should be cause not for self-satisfied snickering but for despair. Implicit in the ecstatic visions of Web 2.0 is the hegemony of the amateur. I for one can’t imagine anything more frightening.

In “We Are the Web,” Kelly writes that “because of the ease of creation and dissemination, online culture is the culture.” I hope he’s wrong, but I fear he’s right – or will come to be right.

Like it or not, Web 2.0, like Web 1.0, is amoral. It’s a set of technologies – a machine, not a Machine – that alters the forms and economics of production and consumption. It doesn’t care whether its consequences are good or bad. It doesn’t care whether it brings us to a higher consciousness or a lower one. It doesn’t care whether it burnishes our culture or dulls it. It doesn’t care whether it leads us into a golden age or a dark one. So let’s can the millennialist rhetoric and see the thing for what it is, not what we wish it would be.

193 thoughts on “The amorality of Web 2.0”

  1. JPL

    What I find interesting is that many people here see mainstream or commercial media as biased by default and the collaborative community as earnest and true by default.

    The sheer amount of open-source and blogging/wiki advocacy in the comments in this blog is in my opinion a clear indication that there is a bias here in the “Web 2.0” sphere also – different from typical political/commercial bias, yes, maybe even benevolent, but a bias nonetheless. The road to hell is indeed paved with good intentions.

  2. Doctor Early

    Had this been an open-source site, I would have been able to edit the author’s misspelling of millennium (second paragraph, line three), as well as millennialist (final sentence of the article)!

    All jokes aside, I agree with the quality-control issue regarding the reliability, or lack thereof, of the information listed on free, open-source sites such as Wikipedia.

    However, I believe none of us should be obtaining any substantial amount of facts from any one source regardless.

  3. meregistered

    While my comments come far too late to be likely to actually be read, I have a few points I’d like to make in reference to this well-written and timely blog entry:

    1. I marginally agree with the quoted author who suggests that the current changes relating to the internet may be quite historic in scope. I do not really agree with his time frame, though; I would extend it from the moment people started adopting Tim Berners-Lee’s technology until about five years from now. I also do not really agree with his fervor.

    2. Wikipedia has been found to be very close in ACCURACY to Britannica (slightly less accurate). I completely concur that the quality is much lower. Unfortunately, I care about data, not specifically how that data is presented. I don’t think I’m alone. Additionally, the best accuracy can be found when consulting multiple data sources.

    3. Blogs have their place and news agencies have theirs. Unfortunately, I think the news agencies are missing their opportunity to compete with blogs properly (as many businesses do when the market changes). News agencies can learn to use the web to increase readership and revenue if they will study it, understand it, and learn from it.

    The RIAA is a good example of a business too inflexible and willingly ignorant to properly compete in a changing market.

    I hope the news agencies will not languish in the same ignorant state.

  4. Pat

    I would prefer reading what is in Wikipedia to “googling around in the trash”. Encyclopedia Britannica is for the British! – Wikipedia is for the world.

    Now there is the interesting question: are you an expert or an amateur, and in which field, and where is the border between amateur and expert?

    Are there millions of amateurs or millions of experts sitting in front of the microsoftblackboxes?

    If you don’t confess that you are an amateur and think you are an expert, contribute something! As long as the earning of bread is guaranteed, everybody can afford to give some virtual donation for free.

    Amorality:

    The information on the Internet is the product of the tool used – the tool is stupid, but the resulting product is either good or bad.

  5. Prokofy Neva

    I wonder how Nicholas Carr is thinking about this topic now. It’s odd; I do feel there is something spiritual about the Web 2.0 phenomenon, but it will be many things to many people, not One Mind and secular-humanist orthodoxy and the usual rigid paganist beliefs one finds on the Internet. It’s a space where spirituality is possible, that’s all. Not always going to be so grand.

    What I’m intrigued by, looking back at this post of a year ago now, is the idea that Web 2.0 was going to be so democratizing and so participatory and so You – as Time Magazine says in its “person of the year” cover story. The celebration of the amateur, supposedly. Except…it’s nothing like that. What we’ve seen of the virtual worlds like Second Life, or the even more complex Multiverse, is that programmers rule them with an iron hand. Code is law. Only the most talented and superior in skills can win. The Snow Crash notion of elitists making and running the world is being enacted.

    The idea that just below that 1 percent of top content creators is another 10-15 percent of amateurs happy to work at their amateur level is a certain kind of fiction sustaining the entire thing. It’s what enables Second Life’s maker Linden Lab to get paid huge sums in tier for land (server space) by people wishing to “build their dream.”

    There’s also the obvious point that when you have a lot of amateurs just putting up their cat’s picture on MySpace, a lot of sectarian idiots on Wikipedia, or even a bunch of very smug and superior programmers and graphic artists on SL, what do you have? You do not necessarily have a *better* culture or world. You merely have a *synthetic* culture that is…linked up.

    Would you *want* to be networked into a great mediocre hive mind like that? We may all be looking for a place to hide from it soon.

  6. Joseph80

    I believe that Web 2.0 is not the cause of the amorality of which you speak, but a tool for a revolution that has been brewing for quite some time.

    As an example, we “amateurs” are tired of professional encyclopedias just ignoring the fact that there were such things as “Meganthropus” in the hominid chain. Sure, Wikipedia has some erroneous entries on that example (the gigantic hominid never lived in Australia), but at least Wikipedia has SOMETHING on it, and we can research to verify. Britannica gives us virtually nothing, and so we have nothing to research.

    We artists and entrepreneurs are tired of the hegemony of the government and people like Bill Gates. Musicians are fed up with artists with only half of their talent getting all the national props and money because they knew someone important or were at the right place at the right time, while the innovators and true geniuses starve in the slums. We’re tired of marketers and corporate stiffs judging our work instead of aesthetes, other artists, and fans.

    Authors are tired of waiting six months to a year for publishers to get back to us, and not being able to submit elsewhere while we wait – especially when a POD can be on the market within 30 days of submission and we know it. We’re tired of getting only 10% royalties, knowing that someone who isn’t even bothering to market our book is making the bulk of the proceeds from it.

    We’re tired of being judged by the political agendas of New York critics when we know that Amazon customer reviews mean a whole lot more. After all, New York critics get books for free; they don’t buy them. It’s the consumer who buys them, therefore we find it more important to please the consumer.

    As a cabinet designer, I’m personally tired of companies like Cabinetvision and 20/20 charging unfathomable amounts of money for software that took less design time and has less creative/design power than a copy of “Neverwinter Nights,” which retails for only $45.

    They can keep their support, if that’s their excuse for planning their retirements off of a single sale.

    Now we’ve got Web 2.0 and great “amateur” programs like cabinetplanner. And guess what? Cabinetplanner is more accurate than either of the two and probably costs about 2% of either program’s price.

    We’ll be glad to see “professionals” like Cabinetvision and 20/20 go bankrupt, along with all of the major labels in music and the major publishers in printing. They stole from us, and it’s their just deserts.

    Maybe we’ll oversaturate the market with “amateur” offerings, but maybe then consumers will have to actually LISTEN to a sample or READ an excerpt in order to choose a purchase, instead of getting their entertainment spoon fed to them by the media.

    The demise of true journalism and newspapers may become a reality, and that is somewhat sad. But it’s partly the media’s fault, having become an ugly parody of what it once was, given to excessive hype and sensationalism, and demonstrating gross bias toward the Hollywood Agenda (which is so far out of touch with real people’s lives that it would be completely laughable if it weren’t so tragic).

    And think how many trees we’ll be saving when all those presses stop printing all that birdcage lining and package stuffing.

  7. SallyF

    Some of the responders find fault with your use of “amorality” since all they see is the root “moral” and equate it with religious, political and social dogma. That is not what Nick is talking about. He is talking about economic shifts and how that, in terms of knowledge and quality, might unnecessarily cause the extinction of some valuable species. A prime example would be Jimmy Wales’ stated goal of “burying” the best encyclopedia in the world. In a September 2006 debate with Dale Hoiberg (“Will Wikipedia Mean the End of Traditional Encyclopedias?”), Wales remarks that Hoiberg’s view that EB’s editorial model has little to learn from Wikipedia’s will be a “fitting epitaph”. How vicious and unnecessary (it shows what a poor winner Wales can be). Look at the differences in style, with Wales emphasizing the “size” of Wikipedia and little else of importance.

    There is little doubt in my mind that Wikipedia will gut much of the lower-income and middle-income consumer market that EB has. But clearly Wikipedia is changing its own practices in the direction of EB, Citizendium and even Baidu Baike. What is to stop Wikipedia editors from skimming the cream of EB’s annual new results by simply comparing EB articles and Wikipedia articles and easily “catching up” with EB year after year in a manner that dodges copyright via easy re-writing of any new EB results? Wikipedia already has a project about “missing” encyclopedic articles that eases the task of reaching the goal of being “better” than EB by having an entry for each of EB’s entries. Think of whales or other large mammals (the pun with Wales…ignore me): once they are gone, you will miss them and you are going to have a rough time getting them back. When you burn down the ancient Library of Alexandria, it is worth asking how many years you have just set all of civilization back in terms of knowledge and potential progress. It is much easier to break off and digest an existing big valuable piece of an organization than it is to build it up again. If Jimmy Wales ever succeeds in his destructive goal, he will have, in terms of Eric S. Raymond’s most successful book, torn down the Cathedral (or as I like to say, the Library), and you will have to settle for the kind of books that you can buy in the Bazaar. My advice: do not forget how Jimmy made his money in the 1990s. Sigh. You can go look up “the internet is for porn” on YouTube yourself. I am amused by those vids also, but only as entertainment.

    That is what Nick means when he talks about “amorality”: economic amorality with no vision of society beyond the next three to six months. Maybe that, and the later Web 2.0 sellouts by the “plantation owners” (the property owners of Web 2.0 organizations) of Web 2.0 communities, such as what just happened with Newsvine and seems likely with Facebook. Web 2.0 is “different”, but not in the way that Steve Jobs asks us to “think different”. Jobs asks us to think creatively. Web 2.0 is not inherently creative. Once you get beyond the social aspects of any of its “communities” (and their politics), there remain some sites that have a mission, a project and some product. That product might be free, but it is often of inferior quality. You get what you pay for. I support the Wikipedia project, but I recognize that it is going to harm existing encyclopedias that are better now than Wikipedia will ever be. Like Microsoft, Wikipedia will clone what already works well and demonetize that existing marketplace sector by providing an inferior product. Like Microsoft, Wikipedia might even “get it right” eventually for some of its content, but it will never be “Britannica or better”. In any transition like Web 2.0, there are always two sides.

  8. K.Reid.Johnston

    Nicholas Carr says “…if we want a good encyclopedia in ten years, it’s going to have to be a good Wikipedia.”

    I heartily agree, and I’m involved in an example of this now. We’re not using the public Wikipedia, though, just the wiki software. Our current challenge is preserving the intellectual capital of a large bespoke IT infrastructure at a time when the group we’re conserving is undergoing a dangerous amount of churn (we see this a lot).

    A specialist wiki put together for the group of remaining specialists, if well organised, allows for fast capture before, and refinement after, the quickly evaporating intellectual capital has run out. At least you can capture something on a page in a hurry.

    You don’t see this sort of thing on the radar much because it’s too cheap & easy to notice, but a good specialist wiki in use by a small team of specialists can reshape a corporate transition at the point where it’s most vulnerable.

  9. Joseph Pally

    Accessibility is more important than accuracy.

    Practicality is more important than perfection.

    Usefulness is more important than theory.

    Wikipedia solves the “availability of information” problem so nicely.

    Balanced with the moderation exerted by 6.6 billion others (according to Wikipedia) – rather than a few elites.

    Readable or not, it is very useful.

    I hope Jimmy Wales gets a Nobel Prize.

  10. caglar kaya

    It seems to me that one’s position is undermined when a politically charged counterexample is used, particularly when such a counterexample was not absolutely necessary to prove one’s point.

  11. Michael

    What an excellent article and mostly wonderfully insightful replies. Great minds at work here.

    It is interesting to note that a Web 2.0 blog – this one – seems to have spurred the improvement and updating of the two articles you highlighted. You have a long road ahead to highlight all the other bad ones … ;)

    Seriously though, I totally agree there needs to be a large disclaimer pointing out that the ‘facts could be wrong’ above all articles. This can also be said of most printed articles!

    The good news for me is, as information flows ever more freely, even inaccurate information, it is a step further away from the ability for the few to control the many. Gutenberg and Caxton helped remove the church’s control over mankind and Web 2.0 is helping free us from Fox et al.

    I loved the extract from Encarta on Gates. If ever there was a selective re-writing of history, that was it. Therein lies the beauty of Web 2.0: for all its faults, Gates can’t make up his own version of history there.

  12. ssavage

    This blog is Web 2.0.

    I like reading this blog because it is written by someone who thinks and writes well.

    Because it is widely viewed as being important, and because it attracts informed and thoughtful comments, it is a useful place for me to further refine and understand my perspective (opinion).

    Perspective, and hence truth, is a multidimensional entity. Some of us view this blog as quality, some view the Wikipedia entry on Jane Fonda as quality, while others view it all as garbage.

    The Web involves us; it evolves us. Of course it is an amoral machine, but it is not a machine that is controlled. It is a machine that is us. Its scale is http://www.ibiblio.org/lunarbin/worldpop

    Not more and Not less.

    So when you say “So let’s can the millennialist rhetoric and see the thing for what it is, not what we wish it would be.”

    You are both right and wrong. It depends on our perspective, as do all things.

    Does it extend our vision? Of course it does. Does it enhance our understanding? Of course it does. Does it do that for every individual, whether or not they have ever seen a computer? Of course it does! Has there ever been anything as powerful? Of course NOT.

    Can it do more (increase our collective consciousness, bring agreement, change the direction of political discourse)? Of course it can. Look at the election!! See anything different there? :)

    Is it good? Are we? Of course it is / we are.

    Try not to be silly.

    Go figure.

  13. Camus

    Surely the essential point of Wikipedia is that it is a work in progress. If you find these entries so awful, why don’t you edit them? Obviously the contributors ought to provide accurate and reliable information, but this is the objective of the whole project – to define and refine entries until they reflect the state of the art in digital information.
