The metabolic thing

The Washington Post today has an exposé on the restrooms in Google’s headquarters: “Every bathroom stall on the company campus holds a Japanese high-tech commode with a heated seat. If a flush is not enough, a wireless button on the door activates a bidet and drying.” Tacked up beside that button on the stall door is a piece of paper that “features a geek quiz that changes every few weeks and asks technical questions about testing programming code for bugs.”

I’m reminded, for some reason, of what Danny Hillis, the parallel-processing pioneer whose work paved the way for Google’s computer system, said about mankind: “We’re the metabolic thing, which is the monkey that walks around, and we’re the intelligent thing, which is a set of ideas and culture. And those two things have coevolved together, because they helped each other. But they’re fundamentally different things. What’s valuable about us, what’s good about humans, is the idea thing. It’s not the animal thing.”

A few years back, when Google’s founders still felt free to express their true ambitions, Sergey Brin said to Newsweek, “Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off. Between that and today, there’s plenty of space to cover.” And, certainly, if you had an artificial brain that was smarter than your brain, you’d no longer need the monkey that walks around.

Those Japanese commodes are nice, but it’s important to remember that they’re merely transitional devices. We’ll know that Google has truly fulfilled its vision when the Googleplex no longer needs toilets at all.

Zeitgeist

Here, courtesy of Wikimedia, is a list of the 30 most visited pages in Wikipedia in August:

1. Home page

2. Wikipedia

3. United States

4. List of big-bust models and performers

5. JonBenet Ramsey

6. List of sex positions

7. Wiki

8. Hurricane Katrina

9. Pluto

10. List of female porn stars

11. Irukandji jellyfish

12. Pornography

13. Wii

14. World Wrestling Entertainment roster

15. Jeff Hardy [professional wrestler]

16. Pokemon

17. September 11, 2001 attacks

18. Celebrity sex tape

19. Neighbours [Australian soap opera]

20. Warren Jeffs [polygamist cult leader]

21. C programming language

22. Sasuke Uchiha [fictional anime character]

23. Volkswagen Type 2

24. Priyanka Chopra [Miss World 2000]

25. Morocco

26. Nicole Scherzinger [lead singer of the Pussycat Dolls]

27. United States Air Force

28. Batman

29. List of gay porn stars [a model of wikipedian comprehensiveness]

30. Tupac Shakur

I wonder if “list of big-bust models and performers” and “list of gay porn stars” will be included in the version of Wikipedia that’s being loaded onto those $100 MIT laptops being sent to Third World schoolkids. Oh well, as Kevin Kelly said about the Web, “I doubt angels have a better view of humanity.”

The rebound that never came

In early 2003, in my article “IT Doesn’t Matter,” I had the temerity to suggest that companies should “spend less” on information technology, treating it as a cost of doing business rather than a means of gaining competitive advantage. The suggestion was roundly attacked by the tech industry, which reacted to the article with a kind of collective conniption fit. At the time, you’ll remember, IT spending was in the doldrums, with companies nursing their hangovers from the great IT investment binge of the 90s. The general assumption, among IT vendors in particular, was that the softness in spending was just a blip, that buyers would soon start ratcheting up their outlays again. “This is a cyclical event,” Tom Siebel confidently told the Wall Street Journal. A rebound was just around the corner.

Well, more than three years have passed, Tom Siebel’s company is kaput, and the rebound remains around the corner. It’s become clear that the slowdown in IT spending is not a passing cyclical event but a secular trend, a reflection of a basic change in the way companies view information technology. That fact was underscored last month when InformationWeek published the latest edition of its annual survey of IT spending among the “InformationWeek 500” – the companies that it identifies as the most innovative users of IT. The survey reveals not only that IT budgets haven’t jumped since 2003 but that they’ve continued to erode. Between 2003 and 2006, IT spending as a percentage of revenue fell on average from 3.66% to 3.21%. Of the 21 industry sectors tracked by InformationWeek, only 5 saw an increase in IT spending as a percentage of sales over the last three years. In absolute terms, IT expenditures have dropped as well, from an average of $353 million in 2003 to $304 million today. Of those shrinking budgets, moreover, the percentage devoted to purchases of new hardware and software slipped from 37% in 2003 to 34.5% today.

Remember, these are the most innovative users of technology, the ones that set the pace for everyone else.
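To put the erosion in plain terms: a quick back-of-the-envelope calculation, using only the four data points cited above from the InformationWeek survey (the little Python sketch below is my own illustration, not anything from the article), works out to relative declines of roughly 12 percent and 14 percent over the three years.

```python
# Back-of-the-envelope check on the InformationWeek 500 figures cited above.
# The four data points come from the survey; everything else is illustrative.

def relative_decline(before: float, after: float) -> float:
    """Return the percentage drop from `before` to `after`."""
    return (before - after) / before * 100

# IT spending as a percentage of revenue, 2003 vs. 2006
pct_drop = relative_decline(3.66, 3.21)
# Average IT budget, in millions of dollars, 2003 vs. 2006
abs_drop = relative_decline(353, 304)

print(f"Share of revenue: down {pct_drop:.1f}%")  # down 12.3%
print(f"Absolute budget:  down {abs_drop:.1f}%")  # down 13.9%
```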

What’s perhaps most revealing about the InformationWeek study is the trend it reveals in IT spending among IT vendors themselves. They’ve actually been reducing their IT outlays as a percentage of revenue even more aggressively than the average. The most innovative tech companies spent 4.4% of sales on IT in 2003, 4.0% in 2004, 3.5% in 2005, and just 3% in 2006. As the magazine reports, “The IT industry has been engaged in a multipronged attack on improving operational efficiencies. Not surprisingly, it’s a leader in the use of relatively new technologies such as multicore processors, server blades, and virtualization software to create more cost-effective deployment strategies.” The clear implication is that, as other companies begin to capitalize on these same advances, they, too, should be able to achieve even greater reductions in the amount of money they devote to information technology.

Earlier this month, Silicon.com asked a group of UK CIOs whether they agreed with my contention that companies should be spending less on IT. Two-thirds of them thought I was right. One termed the idea “just common sense for businesses.” It’s not the first time that heresy has turned into dogma.

Trailer park computing

In a recent post on his blog, Sun CEO Jonathan Schwartz coyly hinted at a rethinking of the corporate data center. “Now I understand that IT infrastructure has to be put somewhere,” he wrote. “But the whole concept of a datacenter is a bit of an anachronism. We certainly don’t put power generators in precious city center real estate, or put them on pristine raised flooring with luxuriant environmentals, or surround them with glass and dramatic lighting to host tours for customers … Surely it’s time we all started revisiting some basic assumptions.”

It wasn’t hard to see that Schwartz had something up his sleeve.

Today, in addition to announcing an expanded push into data center virtualization, Sun is revealing that a year from now it plans to begin selling readymade data centers in shipping containers at a starting price of a half million bucks a pop. Designed by supercomputing genius Danny Hillis, the data-center-in-a-box will, Schwartz told the New York Times’s John Markoff, “be attractive to customers that need to expand computing capacity quickly.”

The container, designed to hold up to 245 server computers, can be plopped anywhere that has water and electricity hookups. “Once plugged in,” reports Markoff, “it requires just five minutes to be ready to run applications.”

Welcome to trailer park computing.

The containerized data center is one more manifestation of the fundamental shift that is transforming corporate computing – the shift from the Second Age client-server model of fragmented, custom-built computing components to the Third Age model of standardized, utility-class infrastructure. As this shift plays out, the center of corporate computing will move from the personal computer upstream to the data center. And, inevitably, what happened to the PC – standardization and commoditization – will happen to the data center as well. What is Sun’s data-center-in-a-box but an early example of the data center as a standardized commodity, an off-the-shelf, turnkey black box? Indeed, the initiative’s codename is Project Blackbox – and the prototype container that Sun is showing off is painted black.

The effort reflects Hillis’s belief that computing is fated to become a utility, writes Markoff:

Long an advocate of the concept of utility computing, analogous to the way electricity is currently delivered, Mr. Hillis said he realized that large companies were wasting significant time assembling their own systems from small building blocks. “It struck me that everyone is rolling their own in-house and doing manufacturing in-house,” he said. “We realized that this obviously is something that is shippable.”

In many ways, the containerized data center resembles the standardized electricity-generation system that Thomas Edison sold to factories at the end of the 19th century and the beginning of the 20th. Manufacturers bought a lot of those systems to replace their complex, custom-built hydraulic or steam systems for generating mechanical power. Edison’s off-the-shelf powerplant turned out to be a transitional product – though a very lucrative one. Once the distribution network – the electric grid – had matured, factories abandoned their private generating stations altogether, choosing to get their power for a monthly fee from utilities, the ultimate black boxes.

Something similar will happen – is happening – with computing, but how exactly computing assets end up being divided between companies and utilities remains to be seen. In the meantime, commodity data centers, in various physical and virtual forms, should prove increasingly popular with companies looking to radically simplify their computing infrastructure and reduce the single biggest cost of corporate computing today: labor.

UPDATE: Dan Farber covers the launch of the Blackbox, while Jonathan Schwartz makes Sun’s marketing pitch and Greg Papadopoulos puts the machine into the context of data-center evolution. Blackfriars calls it “the ultimate computing commoditization play,” saying it “changes the economics” of data center construction. Techdirt is skeptical about the size of the market: “The sweet spot of companies for whom this will be ideal seems small. Its impact on Sun’s business won’t be as significant as what it represents, the continuing commoditization of corporate infrastructure.” Sun’s Tim Bray writes, “I have no idea how big the market is. But I’m glad we built it, because it is just totally drop-dead fucking cool.” (Question for Miss Manners: Is it kosher for a corporate blogger to use the f-word?)

Excuse me while I blog

Blog. Blog.

Say it five times in a row, preferably out loud: Blog. Blog. Blog. Blog. Blog. Has there ever been an uglier word? You don’t say it so much as you expectorate it. As though it carried some foul toxin that you had to get out of your mouth as quickly as possible. Blog! I think it must have snuck into the language in disguise. Clearly, it was meant to mean something very different. I’d guess it was intended to be a piece of low slang referring to some coarse bodily function.

Like: “Can we pull over at the next rest area? I really have to blog.”

Or: “The baby was up all night blogging.”

Or: “Oh, Christ, I think I just stepped in a blog.”

But somehow it escaped its scatological destiny and managed to hitch itself, like a tick, to a literary form. Who’s to blame? According to Wikipedia, which, needless to say, comes up as the first result when you google blog, Peter Merholz is the man whose name shall live in infamy. While Jorn Barger introduced the term “web log” – on December 17, 1997, to be precise – it was Merholz who “jokingly broke the word ‘weblog’ into the phrase ‘we blog’ in the sidebar of his blog Peterme.com in April or May of 1999. This was quickly adopted as both a noun and verb.” A passing act of silliness for which we all must now suffer. Thank you, Peter Merholz.

It doesn’t seem fair. No other literary pursuit is saddled with such a gruesome name. No one feels ridiculous saying “I am a novelist” or “I am a reporter” or “I am an essayist.” Hell, you can even say “I am an advertising copywriter,” and it sounds fairly respectable. But “I am a blogger”? Even when you say it to yourself, you can hear the sniggers in the background.

Imagine that you, a blogger, have just become engaged to some lovely person, and you are now meeting that lovely person’s lovely parents for the first time. You’re sitting on the sofa in their living room, sipping a Cape Codder.

“So,” they ask, “what do you do?”

A tremor of shame flows through you. You try to say “I am a blogger,” but you can’t. It lodges in your throat and won’t budge. Panicked, you take refuge in circumlocution: “Well, I kind of, like, write, um, little commentaries that I, like, publish on the Internet.”

“Little commentaries?”

“Yeah, you know, like, commentaries.”

“About what?”

“Well, generally, they’re commentaries that comment on other commentaries.”

“How fascinating.”

You’re getting deeper into the mire, but you can’t stop yourself. “Yeah. Usually it starts with some news story, and then I and a whole bunch of other people, other commentarians, will start commenting on it, and it’ll just go from there. I mean, imagine that there’s this news story and that a whole bunch of mushrooms start sprouting off it. Well, I’m one of those mushrooms.”

Face it: even “fungus” is a nicer word than “blog.” In fact, if I had the opportunity to rename blogs, I think I would call them fungs. Granted, it’s not exactly a model of mellifluousness either, but at least its auditory connotations tend more toward the sexual than the excretory. “I fung.” “I am a funger.” Such phrases would encounter no obstacle in passing through my lips.

But “I am a blogger”? Sorry. Can’t do it. It sounds too much like a confession. It sounds like something you’d say while sitting in a circle of strangers in a windowless, linoleum-floored room in the basement of a medical clinic. And then you’d start sobbing, covering your face with your hands. And then the fat woman sitting next to you would put her hand on your back. “It’s all right,” she’d say. “We’re all bloggers here.”

Easy as pie

“News is not like the symphony,” writes Dave Winer, “it’s like cooking dinner.” He’s “totally sure,” he says, that he knows how the future of news will play out:

In ten years news will be gathered by all of us. The editorial decisions will be made collectively, and there will be people whose taste we trust who we will turn to to tell us which stories to pay attention to … The role of gatekeeper will be distributed, as will the role of reporter. Very few people, if any, will earn a living doing this, much as most of us don’t earn a living by cooking dinner, but we do it anyway, cause you gotta eat.

Yesterday, the Associated Press reported on the many journalists who have been killed covering the Iraq war:

Western journalists covering the war in Iraq face sniper fire, roadside bombs, kidnappers and a host of other dangers. Their Iraqi colleagues must cope with even greater risks, including families attacked in retribution for sensitive reporting, and arrest on suspicion of links to the violence journalists cover.

At least 85 journalists – mostly Iraqis – have been killed since the U.S.-led invasion in March 2003 – more than in either Vietnam or World War II. The security situation is getting progressively worse, and 2006 has been the deadliest year yet, with at least 25 journalists killed to date.

A week ago today, the Russian reporter Anna Politkovskaya was murdered. She was “shot in the chest as she was getting out of an elevator, then shot in the head.” The same day, two German reporters were murdered inside the tent they had pitched on the side of a road in Afghanistan. Last year, 47 reporters were killed while doing their jobs. The year before that, the death toll was 53.

“It’s easier for readers to become reporters,” Winer says, “than it is for reporters to become readers.”

Thanks for the insight.

United States vs. Google

Every era of computing has its defining antitrust case. In 1969, at the height of the mainframe age’s go-go years, the Justice Department filed its United States vs. IBM lawsuit, claiming that Big Blue had an unfair monopoly over the computer industry. At the time, IBM held a 70 percent share of the mainframe market (including services and software as well as machines).

In 1994, with the PC age in full flower, the Justice Department threatened Microsoft with an antitrust suit over the company’s practice of bundling products into its ubiquitous Windows operating system. Three years later, when Microsoft tightened the integration of its Internet Explorer browser into Windows, the government acted, filing its United States vs. Microsoft suit.

With Google this week taking over YouTube, it seems like an opportune time to look forward to the prospect – entirely speculative, of course – of what could be the defining antitrust case of the Internet era: United States vs. Google.

That may seem far-fetched at this point. In contrast to IBM and Microsoft, whose fierce competitiveness made them good villains, Google seems an unlikely monopolist. It’s a happy-face company, childlike even, which has gone out of its way to portray itself as the Good Witch to Microsoft’s Bad Witch, as the Silicon Valley Skywalker to the Redmond Vader. And yet, however pure its intentions, Google has already managed to seize a remarkable degree of control over the Internet. According to recent ComScore figures, it holds a dominant 44 percent share of the web search market – more than its next two competitors, Yahoo and Microsoft, combined – and its share rises to 50 percent if you include AOL searches, which are subcontracted to Google. An RBC Capital Markets analyst recently predicted that Google’s share will reach 70 percent. “The question, really,” he wrote, “comes down to, ‘How long could it take?’”

Google’s AdSense ad-serving system, tightly integrated with the search engine, is even more dominant. It accounts for 62 percent of the market for search-based ads. That gives the company substantial control over the money flows throughout the vast non-retailing sector of the commercial internet.

With the YouTube buy, Google seizes a commanding 43 percent share of the web’s crowded and burgeoning video market. In a recent interview, YouTube CEO Chad Hurley said that his business enjoys a “natural network effect” that should allow its share to continue to rise strongly. “We have the most content because we have the largest audience and that’s going to continue to drive each other,” he said. “Both sides, both the content coming in and the audience we’re creating. And it’s very similar again to the eBay issue where they had an auction product that gained critical mass.”

Google has been less successful in building up its own content and services businesses, but it’s a fabulously profitable company, thanks to its AdSense money-printing machine, and it can easily afford to acquire other attractive content and services companies. It can also afford, following the lead of Microsoft in the formative years of the PC market, to launch a slew of products across many different categories and let them chip away at their respective markets – which is exactly what it’s been doing. Moreover, its dominance in ad-serving enables it to cut exclusive advertising and search deals with major sites like MySpace, expanding its influence over users and hamstringing the competition.

Google’s corporate pronouncements are carefully, and, by all accounts, sincerely, aimed at countering fears that it is building a competition- and innovation-squelching empire. But its actions often belie its rhetoric. Its founders said they had no interest in launching an internet portal, but then they launched an internet portal. They said they wanted customers to leap off Google’s property as quickly as possible, but then they began cranking out more and more applications and sites aimed at keeping customers on Google’s property as long as possible. The company’s heart may be in the right place, but its economic interests lie elsewhere. And public companies aren’t known for being led by their hearts.

Nothing’s written in stone, of course. Someone could come up with a new and more attractive method of navigating the web that would quickly undermine the foundation of Google’s entire business. But it’s useful to remember that the commercial internet, and particularly Web 2.0, is all about scale, and right now scale is very much on Google’s side. Should Google’s dominance and power continue to grow, it would inevitably have a chilling effect on innovation and hence competition, and the public would suffer. At that point, the big unasked question would start being asked: should companies be able to compete in both the search/ad business and the content/services business, or should competition in those businesses be kept separate? If there is ultimately a defining antitrust case in the internet era, it is that question that will likely be at its core.