Wired editor Chris Anderson offers a spirited defense of internet “systems” like Wikipedia, Google, and the blogosphere. Criticism of these systems, he argues, stems largely from our incapacity to comprehend their “alien logic.” Built on the mathematical laws of probability, they “are statistically optimized to excel over time and large numbers.” They sacrifice “perfection at the microscale for optimization at the macroscale.” Our “mammalian minds,” by contrast, are engineered not to apprehend the wonders of the vast, probabilistically determined whole but to focus on the quality of the individual pieces. We’re prisoners of the microscale: “We want to know whether an encyclopedia entry is right or wrong. We want to know that there’s a wise hand (ideally human) guiding Google’s results. We want to trust what we read.”
Google in particular, Chris writes, “seems both omniscient and inscrutable. It makes connections that you or I might not, because they emerge naturally from math on a scale we can’t comprehend. Google is arguably the first company to be born with the alien intelligence of the Web’s large-N statistics hard-wired into its DNA. That’s why it’s so successful, and so seemingly unstoppable.”
Maybe it’s just the Christmas season, but all this talk of omniscience and inscrutability and the insufficiency of our mammalian brains brings to mind the classic explanation for why God’s ways remain mysterious to mere mortals: “Man’s finite mind is incapable of comprehending the infinite mind of God.” Chris presents the web’s alien intelligence as something of a secular godhead, a higher power beyond human understanding. Noting that “the weave of statistical mechanics” is “the only logic that such really large systems understand,” he concludes on a prayerful note: “Perhaps someday we will, too.” In the meantime, we must have faith.
I confess: I’m an unbeliever. My mammalian mind remains mired in the earthly muck of doubt. It’s not that I think Chris is wrong about the workings of “probabilistic systems.” I’m sure he’s right. Where I have a problem is in his implicit trust that the optimization of the system, the achievement of the mathematical perfection of the macroscale, is something to be desired. “Optimization” is, in itself, a neutral term. The optimization of a complex mathematical, or economic, system may make things better for us, or it may make things worse. It may improve society, or degrade it. We may not be able to apprehend the ends, but that doesn’t mean the ends are going to be good.
In a comment on Chris’s post, a fellow named Brock takes issue with the idea that Wikipedia is a probabilistic system. The value of Wikipedia, he says, lies not in the whole but in the individual entries, and the quality of those entries is determined not by statistics but by the work of individuals: “Wikipedia is wrong when a single person is wrong.” Chris counters that, even with Wikipedia, the whole matters: “The main point I was making about Wikipedia was not that any single entry is probabilistic, but that the *entire encyclopedia* is probabilistic. Your odds of getting a substantive, up-to-date and accurate entry for any given subject are excellent on Wikipedia, even if every individual entry isn’t excellent.” He then provides a hypothetical illustration:
To put it another way, the quality range in Britannica goes from, say, 5 to 9, with an average of 7. Wikipedia goes from 0 to 10, with an average of, say, 5. But given that Wikipedia has ten times as many entries as Britannica, your chances of finding a reasonable entry on the topic you’re looking for are actually higher on Wikipedia. That doesn’t mean that any given entry will be better, only that the overall value of Wikipedia is higher than Britannica when you consider it from this statistical perspective.
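Chris’s argument can be made concrete with a little arithmetic. The sketch below is a toy model only: the quality ranges are his hypothetical numbers, the topic universe and entry counts are invented for illustration, and quality is assumed to be uniformly distributed within each range.

```python
# Toy model of Chris Anderson's hypothetical: Britannica quality runs 5-9,
# Wikipedia 0-10, and Wikipedia has ten times the coverage. All figures
# below are illustrative assumptions, not measurements of either work.

N_TOPICS = 1_000_000            # hypothetical universe of topics you might look up
BRITANNICA_ENTRIES = 65_000     # hypothetical coverage
WIKIPEDIA_ENTRIES = 650_000     # "ten times as many entries"

def p_reasonable(entries, q_min, q_max, threshold=5):
    """Probability that a randomly chosen topic has an entry whose quality
    meets the threshold, assuming quality is uniform on [q_min, q_max]."""
    coverage = entries / N_TOPICS
    if threshold <= q_min:
        p_good = 1.0                                      # every entry clears the bar
    else:
        p_good = max(0.0, (q_max - threshold) / (q_max - q_min))
    return coverage * p_good

print(p_reasonable(BRITANNICA_ENTRIES, 5, 9))   # 0.065 -- every entry clears 5, but coverage is thin
print(p_reasonable(WIKIPEDIA_ENTRIES, 0, 10))   # 0.325 -- only half clear 5, but coverage is broad
```

Under these made-up numbers, your odds of finding a “reasonable” entry on an arbitrary topic are five times better on Wikipedia, even though any given Britannica entry is more reliable: exactly the macroscale-versus-microscale trade Chris describes.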
OK, but what are the broader consequences? Might not this statistical optimization of “value” at the macroscale be a recipe for mediocrity at the microscale – the scale, it’s worth remembering, that defines our own individual lives and the culture that surrounds us? By providing a free, easily and universally accessible information source at an average quality level of 5, will Wikipedia slowly erode the economic incentives to produce an alternative source with a quality level of 9 or 8 or 7? Will blogging do the same for the dissemination of news? Does Google-surfing, in the end, make us smarter or dumber, broader or narrower? Can we really put our trust in an alien logic’s ability to create a world to our liking? Do we want to be optimized?
Over a virtual Bethlehem rises a virtual star, and in the manger we find Kevin Kelly’s Machine, conjuring thoughts beyond our ken. Is it Our Savior or a mathematically perfected Rough Beast?