In an important essay at First Monday, Paul Duguid takes a hard and rigorous look at whether, and to what extent, web-based peer production can produce quality work.
“Two ideas are often invoked,” he writes, “to defend the quality of peer production.” The first of these “laws of quality,” borrowed from open-source software development, is Linus’s Law: “Given enough eyeballs, all bugs are shallow.” Duguid points out, as others have before him, that applying this law outside the sphere of software production is problematic. The “stern gatekeeper” of software quality – the program has to run – is absent in the production of most cultural works. The second law of quality is what Duguid dubs “Graham’s Law,” after Paul Graham, who “claims that ‘The method of ensuring quality’ in peer production is ‘Darwinian … People just produce whatever they want; the good stuff spreads, and the bad gets ignored.’” This law, too, is problematic, argues Duguid. It reflects
an optimistic faith that the “truth will conquer.” While this optimism has roots in Milton’s Areopagitica, it is perhaps a particularly American, democratic belief, enshrined in the First Amendment. Such optimism no doubt makes good political principle, but it does not dictate political process. Freedom of speech is not the same as the freedom to replace others’ versions of the truth with your own. The authors of the U.S. Constitution and the Bill of Rights may have believed that open debate leads to political truth, [but] they did not believe that the Constitution would improve were it changed at the whim of each citizen’s changing view of truth. Consequently, the U.S. Constitution has significant built-in inertia … As this example may suggest, Graham’s implication that continuous tinkering only makes things better is highly suspect. It is hard to see why entropy would be indefinitely suspended by peer production. In areas of “cultural production,” in particular, progress is not necessarily linear, and neither the latest (nor the earliest) version of a work [is] always the best …
Rather than taking the laws on faith, we need to ask in which cases the laws work, in which they do not, and if they do not, why not. So we need to look at cases where the laws have failed to work and then to ask — in general, systemic rather than individual, particularistic terms — why.
Duguid goes on to examine the products of two well-established and seemingly fairly simple peer-production processes on the Internet: Gracenote (for compiling information about the contents of compact discs) and Project Gutenberg (for publishing online versions of out-of-copyright texts). While finding the products of both these projects “immensely useful,” he also documents, painstakingly, that they have deep and persistent flaws: “both suffer from problems of quality that are not addressed by what I have called the laws of quality – the general faith that popular sites that are open to improvement [will] iron out problems and continuously improve.”
He then examines Wikipedia, a more complex project in peer production. He documents the many flaws that bedevil the online encyclopedia, from plagiarism to the use of unreliable sources to sloppy writing, and shows how Wikipedia, “despite its creed of continuous improvement, can defy Graham’s Law” and evolve toward lower rather than higher quality as edits pile up. Duguid focuses his analysis on two entries, for the early English novelists Laurence Sterne and Daniel Defoe, and he admits that these entries “do appear to fall into a backwater of Wikipedia. Thus it may seem unfair to choose these as examples to illustrate aspects of the whole.” But he then makes a crucial point about assessing an encyclopedia’s quality:
I suggested earlier, however, that judging overall quality from the less- rather than the more-frequented parts, the weak rather than the strong links, is not a bad idea. After all, how is the ordinary user to know when he or she has landed in a backwater? With Linus’s Law in mind, we should acknowledge that the eyeballs that consult encyclopedia entries are, in the default case, quite unlike those beta testing or developing code and quite unsuited to recognizing or characterizing any but the most obvious errors. To use an Open Source program is in itself often an acknowledgment of a certain level of skill. To turn to the encyclopedia is, by contrast, more likely a confession of ignorance. If I want or need to find out about Defoe, then I’m not likely to be in a position to critique an entry on him.
“Editing,” Duguid writes, “is a hard task and needs to attract people prepared to think through the salient issues. Wikipedia is very sensitive to malice. It needs to be as sensitive to ineptitude.”
In the end, Duguid concludes that the two “laws of quality” underpinning today’s peer-production projects are insufficient. “If we are to rely on peer production in multiple different spheres of information production,” he says, “we need to look for other ways to assure quality.” But one comes away from this excellent paper wondering whether, once these “other ways” of quality assurance are imposed on a process, it would still qualify as “peer production.” As Duguid eloquently demonstrates, quality doesn’t just happen; it’s not an emergent phenomenon. It’s imposed on a work by people who know what they’re doing. Quality – true quality – may thus be incompatible with the democratic ideal that lies at the heart of what we call peer production.