Monthly Archives: June 2006

Why do you think they call it “lock-in”?

Dave Winer writes, “The only criteria for winning that should be tolerated, by anyone, are features, performance and price. Lock-in is not an honorable or sustainable way to win.” I don’t think it’s that simple. It may not be honorable, but as far as “ways to win” go, lock-in is actually extraordinarily sustainable – much more so, in fact, than features, performance, and price, which all tend to get neutralized more quickly than lock-in does. Many of the greatest franchises in the history of the computer industry, from the IBM mainframe to Windows and Office to HP’s ink cartridges to eBay to the iPod and iTunes, have been sustained by lock-in. And that’s going to continue to be true. As Berkeley economists Carl Shapiro and Hal Varian have written, “The ‘friction-free’ economy is a fiction; look for more lock-in, not less, as the information age progresses.” In many cases – though certainly not all – customers actually like to be locked in; it can make things simpler, and simpler, for most people, is better.

Nothing lasts forever, of course, but if you’re looking for a durable business strategy, lock-in is mighty hard to beat.

The very long tail of spam

A search engine optimizer named Alex reports on the perverse consequences of search-based advertising. He points to a site that apparently has managed to get billions – yes, billions – of spam pages indexed by Google after being launched just 18 days ago. Loaded with AdSense ads and content swiped from other sites, it has already become one of the web’s top 7,000 sites, as ranked by Alexa. “I wonder,” writes Alex, “how much that one person is earning per day with billions and billions of pages indexed and ranking?” Well, if you have a billion pages (all automatically generated, of course) and each pulls in a penny a year, that would earn you a cool ten million bucks per annum, or about 27 grand per day. It’s not hard to see the incentive, is it?
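The back-of-the-envelope revenue math above can be checked in a few lines. This is a sketch of the same illustrative assumptions the paragraph makes (a round billion pages, one cent per page per year), not actual AdSense figures:

```python
# Back-of-envelope check of the spam-revenue arithmetic above.
# Assumptions are illustrative, not measured: exactly one billion
# auto-generated pages, each earning one penny per year.
pages = 1_000_000_000
revenue_per_page_per_year = 0.01  # one cent, in dollars

annual = pages * revenue_per_page_per_year
daily = annual / 365

print(f"${annual:,.0f} per year")  # $10,000,000 per year
print(f"${daily:,.0f} per day")    # $27,397 per day
```

Ten million a year, or roughly 27 grand a day, just as the paragraph says.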

If you’re interested in getting into this business, Alex provides a nifty set of instructions.

A bureaucracy of sorts

Wikipedia “is not the experiment in freewheeling collective creativity it might seem to be,” writes Katie Hafner in today’s New York Times.

So what is Wikipedia?

“At its core,” Hafner says, “Wikipedia is not just a reference work but also an online community that has built itself a bureaucracy of sorts – one that, in response to well-publicized problems with some entries, has recently grown more elaborate. It has a clear power structure that gives volunteer administrators the authority to exercise editorial control, delete unsuitable articles and protect those that are vulnerable to vandalism.”

Hafner goes on to quote Lotus founder and open-source advocate Mitch Kapor, who says that Wikipedia “can tell us a lot about the future of knowledge creation, which will depend much less on individual heroism and more on collaboration.” She also quotes Wikipedia cofounder and chief executive Jimmy Wales, who says that the online encyclopedia’s imposition of restrictions on the editing of certain articles “is a tool for quality control, but it hardly defines Wikipedia. What does define Wikipedia is the volunteer community and the open participation.” Regarding the establishment of editorial rules, Wales says: “It’s not always obvious when something becomes policy. One way is when I say it is.” And she quotes me: “As Wikipedia has tried to improve its quality, it’s beginning to look more and more like an editorial structure. To say that great work can be created by an army of amateurs with very little control is a distortion of what Wikipedia really is.”

Wikipedia’s other cofounder, Larry Sanger, has written a response to Hafner’s article, offering a very different perspective on Wikipedia’s “bureaucracy of sorts.” Sanger, who’s no longer associated with the encyclopedia, takes issue with the assumption that the new editorial rules that Wikipedia has recently adopted make the publication “more responsible and more carefully controlled.” This assumption, he says, “is very badly wrong.” According to Sanger, the bureaucracy is a dysfunctional one:

I was seeing a bureaucratic sort of attitude develop just as I was leaving in 2002: people began, to my strenuous objections, to track how long they’d been with the project, how many edits they’d made, and they began to use this data as bludgeons in their disputes with each other. (I was very much opposed to the rule of the “I’ve been here longest”; I was always in favor of a meritocracy of real expertise, rather than a meritocracy of those who knew how to game the Wikipedia system.) People who happened to waste inordinate amounts of time on Wikipedia were very often looked upon by other Wikipedians as authorities, no matter how trollish or nutty they were. In fact, I suspect it helped (and still does help) to appear just slightly off-kilter. Straight shooters and people who rely exclusively on rational argument and genuine intellectual authority (based on actual study and expertise) are too often – not always, but too often – shouted down by pretentious mediocrities who no doubt resent the challenge to their personal authority. This “rule of the most persistent” then naturally ossified into a bureaucracy. That’s how it was (and still is) possible for teenagers and ideologues to gain substantial authority in the system, authority which they might then lord over everyone, regardless of actual level of intellectual attainment.

Sanger criticizes my perception of an “editorial structure” emerging at Wikipedia: “Wikipedia’s plethora of bureaucratic levels and rules really does bother me precisely because it is a bureaucracy. But it is impossible to take Wikipedia’s bureaucracy seriously qua responsible editorial structure. If Wikipedia must have a bureaucracy, at least it could be a bureaucracy of people who possess genuine editorial skill and who lack ideological drums to beat.”

What is Wikipedia? To be honest, I’m not exactly sure. But I do think that Mitch Kapor might be right in saying that it “can tell us a lot about the future of knowledge creation.” What it tells us, though, is a lot more complicated than he seems to want to believe.

UPDATE: Jimmy Wales, responding to the New York Times article, fantasizes about the Times having an “edit this page” button so that he could rewrite the piece to better fit with his view of things.

Because we can

“When we say we want all the world’s video, we really do want all the world’s video.” So says Google’s Hunter Walk in an interview with Beet TV. Beet’s Andy Plesser, noting that YouTube and Yahoo limit video uploads to 100 megabytes, comments: “We were stunned to hear from Hunter that Google Video DOES NOT LIMIT THE LENGTH OF FILE SIZE … The implications of this are enormous. It boggles the mind to consider the amount of storage needed for such an undertaking.” Not only that, but Google encourages people to upload video in the highest quality (i.e., largest size) possible: “The higher the quality of the video you can provide to Google Video, the better your video will look on our site!”

Now even granting that we’d all like to see more of such Top 100 Google videos as Fart on TV, Caged & Feathered, I’m the Juggernaut Bitch and the #1 Girl Caught Cheating, the storage and bandwidth requirements for handling unlimited amounts of video do indeed boggle the mind. It certainly explains that new Columbia River powerplant (maybe “dataplant” is a better term).

But to what end? Can somebody tell me, with a straight face, the economic, social or cultural justification for “wanting all the world’s video”? Sure, Google and YouTube are encouraging people to invest huge amounts of time in producing and consuming silly fart and tickle videos, but how many silly fart and tickle videos does the world really need? A petabyte worth? A yottabyte? Whatever the ideal amount is, I think we’re probably nearly there – though I suppose once you push the price of silly fart and tickle videos to zero, demand scales to infinity.

No, the only explanation I can see is some newfangled form of megalomania, fed by a messianic ideology, a mountain of cash and too little sleep.

UPDATE: Ian Betteridge says it’s all to feed the AI. And he’s probably right.

UPDATE 2: Chris Gulker guesstimates that Google is currently serving 2,000 users per server. Anybody have a sense of what effect storing, searching and streaming all the world’s video would have on the company’s computing requirements?

The new powerplants

For anyone still doubting that computing is becoming a utility, check out this New York Times report on the huge new “powerplant” Google is quietly constructing near the Columbia River on the Oregon-Washington border. The massive complex includes “a computing center as big as two football fields, with twin cooling plants protruding four stories into the sky.” Both Microsoft and Yahoo are also building new computing powerplants nearby, drawn by the area’s cheap electricity and good network connections. What we’re seeing is a shift in the pattern of capital investment into information technology – away from individual users toward central utilities. It’s a shift that will play out slowly over the course of many years, as the new utilities expand their capacity and capabilities, and that will reshape the entire IT business.

Have face, will travel

So Microsoft’s self-styled human face is now some other company’s human face. This must be the first corporate human face transplant ever attempted. Will it take? Or will the new body reject the used puss? And what does it say about this whole human face business when a person proclaims himself to be a company’s human face and then, when a better offer comes along, tears himself from the old noggin and stitches himself to the new one? That seems a little untoward to me. If I were in a punny mood, I just might call it a mugging.

A company should probably be a little nervous about letting some blogger set up shop as its human face. The earnings the blogger pulls in through the attention economy may accrue more to his own bottom line than the firm’s. As Doc Searls puts it in a post titled “companies are schwag”: “what matters most in the long run is who you are. Not who you work for.” Which is something a company might want to keep in mind when choosing a face, or even a mask, for that matter.

Hear no evil?

Is Tim O’Reilly having a crisis of faith?

Not long ago, the big-time tech publisher and conference impresario was talking up Web 2.0 as a means of achieving a “technology-mediated” higher consciousness. But a shadow seems to have fallen across O’Reilly’s optimism. In a commencement speech last month, he cautioned, “If history is any guide, the democratization promised by Web 2.0 will eventually be succeeded by new monopolies [which] will have enormous power over our lives – and may use it for good or ill.” A couple of weeks ago, after being mugged in absentia by a hysterical blog mob, O’Reilly said the experience “has shaken my faith in the collective intelligence of the blogosphere.”

Yesterday, O’Reilly posted a blog entry titled “Big Brother Is Listening” (the title was changed today to “The AI Starts Work on Its Ears”) about an ambient-audio identification technology Google is developing that will let the company “capture TV sound with a laptop PC to identify the show that is the source of the sound and to use that information to immediately return personalized Internet content to the PC.” In a paper on the technology, two Google engineers refer to it as “audio fingerprinting.”

Says O’Reilly, warily: “What I find most interesting about this technology is not its current intended use, but all its possible unintended uses! What does it mean when our computers start to get an independent sensorium? How much more thought-provoking does this become when you think about Google as an emergent AI.” He then quotes at length from George Dyson’s disturbing essay “Turing’s Cathedral,” which sketched out the possible dark side of Google’s boy-genius tinkering. O’Reilly calls Dyson’s scenario “seemingly far-fetched but naggingly plausible.”
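For readers curious how “audio fingerprinting” can identify a show from ambient sound, here is a toy sketch of the general idea. It is not the method in the Google engineers’ paper (which uses a more sophisticated hashing scheme); it just illustrates the core principle: reduce audio to a compact sequence of dominant-frequency features, then compare sequences. All names and parameters are invented for illustration:

```python
import cmath
import math

def dominant_bin(frame):
    """Index of the largest-magnitude DFT bin (naive O(n^2) DFT)."""
    n = len(frame)
    best_k, best_mag = 0, -1.0
    for k in range(1, n // 2):  # skip DC; positive frequencies only
        s = sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k

def fingerprint(signal, frame_size=64):
    """Fingerprint = the dominant-frequency bin of each successive frame."""
    return [dominant_bin(signal[i:i + frame_size])
            for i in range(0, len(signal) - frame_size + 1, frame_size)]

def similarity(fp_a, fp_b):
    """Fraction of frames whose dominant bins agree."""
    matches = sum(a == b for a, b in zip(fp_a, fp_b))
    return matches / max(len(fp_a), len(fp_b))

if __name__ == "__main__":
    # A pure tone, a noisy copy of it, and an unrelated tone.
    tone = [math.sin(2 * math.pi * 5 * t / 64) for t in range(512)]
    noisy = [x + 0.1 * math.sin(7.3 * t) for t, x in enumerate(tone)]
    other = [math.sin(2 * math.pi * 13 * t / 64) for t in range(512)]
    print(similarity(fingerprint(tone), fingerprint(noisy)))  # high: same source
    print(similarity(fingerprint(tone), fingerprint(other)))  # low: different source
```

A real system would hash richer spectral features and look them up against a database of broadcast audio, but the match-the-fingerprint logic is the same: the noisy living-room recording still fingerprints close to the original broadcast.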

I wonder if O’Reilly will become the next great tech apostate, following in the footsteps of Bill Joy.