The art of defensive blogging

When it comes to CEO blogs, I’ve long been a skeptic. For one thing, most CEOs are godawful writers. For another, they don’t have much spare time. And for a third, a blog usually just ends up making a top executive a better target. As is so often the case in life, it’s better to hold your tongue.

But an article about AOL’s Ted Leonsis in today’s Washington Post is giving me pause. Leonsis was miffed “to see that whenever he typed his name into Google’s search box, the results were a hodgepodge of news stories.” He wasn’t in control of the message, and for a top-dog executive, losing control over how you’re viewed is a very dangerous thing. So he decided that he would “figure out a way to manipulate Google’s complicated search engine to put the information he wanted people to see at the top of his results.” He quickly realized that blogging would be a great way to accomplish that goal. He knew that a blog by a bigwig like himself would attract a lot of links from other bloggers, and thus lift his blog toward the top of search-engine rankings. To magnify the effect, he wisely started dropping into his postings the names of all the celebrities he meets, as well as plenty of links to other popular blogs, both of which draw still more links to his blog.
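What Leonsis is exploiting is the link-counting logic at the heart of PageRank: a page that accumulates inbound links from other well-linked pages rises in the rankings. Here’s a toy sketch of that logic in Python – the link graph is invented for illustration, and Google’s production algorithm is, of course, vastly more elaborate:

```python
# Toy PageRank power iteration. The link graph is invented;
# Google's real algorithm is far more elaborate.
links = {
    "celebrity_blog": ["ceo_blog"],
    "popular_blog": ["ceo_blog", "celebrity_blog"],
    "news_site": ["popular_blog"],
    "ceo_blog": ["popular_blog"],  # his outbound links; in practice these drew links back
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}
damping = 0.85

for _ in range(50):  # iterate until the ranks settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# Pages with inbound links from well-ranked pages float to the top.
print(sorted(rank.items(), key=lambda kv: -kv[1]))
```

Start everyone from equal footing, and after a few dozen iterations the heavily linked-to blog sits near the top of the list – which is the whole point of dropping celebrity names and courting links.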

Bingo. Now, his blog and his official biography are the first things you see when you google his name. “My job is done!” he says, with well-deserved pride.

Leonsis is what you might call a defensive blogger. His main goal isn’t to enter into a “conversation” with the AOL “community,” but just to gain more control over the results that show up when people google him. In fact – and this really turns the whole corporate blogging ethos on its pointy little head – Leonsis is blogging not to increase the flow of information but to narrow it, for his own professional benefit.

Now I realize that a lot of people out in the blogosphere will take offense at what Leonsis is doing. He’s not exactly taking a ride on the old cluetrain here. But you have to admit that, from a business perspective, it’s a brilliant strategy. It’s exactly what Machiavelli would have done if there’d been a blogosphere around back in the early 16th century. Other executives who want to gain more control over how they’re portrayed on the Net are going to have to give this idea a hard look. The best defense may be a good blog.

We are the Distillery

If Yale professor Yochai Benkler woke up with a nasty hangover this morning, we can forgive him. He was probably out late last night at some posh New Haven nightclub celebrating the latest, and some might say greatest, manifestation of his beloved “social production” model of creating cultural goods. The Ladybank Company of Distillers, in Ladybank, Scotland, has announced it is pioneering the communal, Internet-enabled production of Scotch whisky. “As a ‘co-creation’ company,” explains a press release, “Ladybank enables a group of like-minded people to create a product, service or even a community that is free from the normal rules of commerce, because it is driven by their shared passion and shaped by their lifestyle choices.”

The company is setting up an “online boardroom” to facilitate the harnessing of collective booze-making intelligence. Speaking proudly of a growing “virtual community of whisky lovers,” James Thomson, the founder of this wikipedia of tipple, says, “At Ladybank we believe the community spirit we have created among the members will really inform what we do as a business and our online presence will also encourage members to engage with the Ladybank community and exchange their thoughts on how the project should progress.” On its blog, the company says that its “real foundations” are not its physical plants but “the people we have and how they are behind the project and interacting with it.” We are the Web. And now we are the Distillery, too.

No word yet on whether they’ll open source their recipes.

Welcome Web 3.0!

Web 2.0 is so over. First came the tepid reviews of the third annual 2.0 boondoggle. “If you were looking to learn something new,” sniffed GigaOm’s Liz Gannes, “this week’s Web 2.0 Summit was not the place to be.” Wrote a jaded Scott Karp, “there were few revelations, few moments where you had the exhilarating experience of seeing something that was about to change the world. Every conversation I had began with discussing the underwhelming nature of Web 2.0.” “I didn’t come away from the conference having learned much,” confessed Richard MacManus, who felt the highlight of the event “was seeing Lou Reed play live.” It was Lou himself, though, who put it most bluntly, telling the Web 2.0ers, “You got 20 minutes.”

But the nail in the coffin comes in tomorrow’s New York Times, which features a big article by John Markoff on – yes! – Web 3.0. Formerly known as the semantic web, but now rebranded for mass consumption, Web 3.0 promises yet another Internet revolution. It would, Markoff writes, “provide the foundation for systems that can reason in a human fashion … In its current state, the Web is often described as being in the Lego phase, with all of its different parts capable of connecting to one another. Those who envision the next phase, Web 3.0, see it as an era when machines will start to do seemingly intelligent things.”

Personally, I’m overjoyed that Web 3.0 is coming. When dogcrap 2.0 sites like PayPerPost and ReviewMe start getting a lot of attention, you know you’re seeing the butt end of a movement. (There’s a horrible metaphor trying to get out of that last sentence, but please ignore it.) Besides, the arrival of 3.0 kind of justifies the whole 2.0 ethos. After all, 2.0 was about escaping the old, slow upgrade cycle and moving into an age of quick, seamless rollouts of new feature sets. If we can speed up software generations, why not speed up entire web generations? It doesn’t matter if 3.0 is still in beta – that makes it all the better, in fact.

But, seriously, Markoff’s piece is a thought-provoking one. As he describes it, Web 3.0 will be about mining “meaning,” rather than just data, from the web by using software to discover associations among far-flung bits of information:

the Holy Grail for developers of the semantic Web is to build a system that can give a reasonable and complete response to a simple question like: “I’m looking for a warm place to vacation and I have a budget of $3,000. Oh, and I have an 11-year-old child.” Under today’s system, such a query can lead to hours of sifting — through lists of flights, hotel, car rentals — and the options are often at odds with one another. Under Web 3.0, the same search would ideally call up a complete vacation package that was planned as meticulously as if it had been assembled by a human travel agent.
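What’s new there isn’t the data but the structure. As a minimal sketch of the difference – the records and schema below are my own invention, not any actual semantic-web standard – a keyword engine matches strings, while a semantic system matches typed constraints against structured facts:

```python
# Constraint-based matching over structured data (invented example).
packages = [
    {"destination": "Cancun", "climate": "warm", "price": 2800, "kid_friendly": True},
    {"destination": "Reykjavik", "climate": "cold", "price": 2400, "kid_friendly": True},
    {"destination": "Maui", "climate": "warm", "price": 3900, "kid_friendly": True},
]

# "A warm place, a $3,000 budget, an 11-year-old child" becomes a set of
# machine-readable constraints rather than a bag of keywords.
query = {"climate": "warm", "max_price": 3000, "kid_friendly": True}

matches = [p for p in packages
           if p["climate"] == query["climate"]
           and p["price"] <= query["max_price"]
           and p["kid_friendly"] == query["kid_friendly"]]

print(matches)  # only the Cancun package survives all three constraints
```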

Web 3.0 thus promises to be much more useful than 2.0 (not to mention 1.0) and to render today’s search engines more or less obsolete. But there’s also a creepy side to 3.0, which Markoff only hints at. While it will be easy for you to mine meaning about vacations and other stuff, it will also be easy for others to mine meaning about you. In fact, Web 3.0 promises to give marketers, among others, an uncanny ability to identify, understand and manipulate us – without our ever being aware of it. If you’d like a preview, watch Dan Frankowski’s presentation You Are What You Say and Oren Etzioni’s presentation All I Really Need to Know I Learned from Google, and then connect the dots. (Thanks to Greg Linden for those links.)

Markoff quotes artificial-intelligence promoter Danny Hillis, who calls Web 3.0 technologies “spooky.” If Danny Hillis thinks they’re spooky, they’re spooky. But I’m looking on the bright side: At least I’ll have more material for the old blog.

One last thing: I’m claiming the trademarks on Web 3.0 Conference, Web 3.0 Summit, Web 3.0 Camp, Web 3.0 Uncamp, and Web 3.0 Olde Tyme Hoedown.

More blogs, less weight

One thing struck me as I read through the latest State of the Blogosphere report from Technorati boss David Sifry. It wasn’t that the total number of blogs in the known world had leapt once again, to something like 837.4 trillion. Rather, it was the rapidly shrinking presence of blogs among the top media sites as ranked by Technorati. To put it in popular terms, blogs are being squeezed out of the short head and pushed ever deeper into the long tail.

Just two years ago, in October 2004, blogs accounted for 16 of Technorati’s 35 most influential and authoritative media sites. They represented, in other words, 46% of the short head, with mainstream media (MSM) sites holding the remaining 54%. By March 2005, the number of blogs in the top 35 had dropped to 13, or 37%. By August 2005, it was down to 11, or 31%. By February of 2006, blogs held only 4 of the top 35 slots – or 11%. Finally, in Sifry’s new October 2006 report, just 2 blogs – Engadget and Boing Boing – were in the top 35. Blogs’ share of the short head has fallen to a piddling 6%. (See the chart below for a look at the trend.)

[Chart: blogs’ shrinking share of Technorati’s top 35 media sites, October 2004 through October 2006]
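For the record, the arithmetic behind those shares is simple – the counts come straight from the Sifry reports cited above:

```python
# Blogs among Technorati's top 35 media sites, per Sifry's reports.
snapshots = {"Oct 2004": 16, "Mar 2005": 13, "Aug 2005": 11,
             "Feb 2006": 4, "Oct 2006": 2}
for date, blogs in snapshots.items():
    print(f"{date}: {blogs}/35 = {blogs / 35:.0%} of the short head")
# Oct 2004: 46% ... Feb 2006: 11% ... Oct 2006: 6%
```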

At the very top of the long-tail curve, there’s been a similar erosion. Back in October 2004, there were three blogs in the Technorati top 10. Last year, there was one. Today, there are zero. Defining the short head more broadly, as the top 100 sites, provides an even starker picture of the rapid downtailing of blogs. Today, only 12 blogs show up in the top 100, down from 18 in February of this year – a drop of 33% in just a few months. There are now fewer blogs in the top 100 than there were in the top 35 in October 2004. It’s worth remembering that the last two years have been a time of remarkable growth and even more remarkable publicity for blogs – almost certainly the peak on both counts. Yet, still, blogs’ share of the top media sites – the sites that set the public agenda – has been shrinking rapidly. Even as the blogosphere has exploded in size, its prominence in online media has been waning.

What this seems to indicate is that the mainstream media is successfully making the leap from the print world to the online world. The old mainstream is the new mainstream. (Whether the MSM’s popular success translates into economic success remains to be seen.) As for blogs, they’re taking their place – and it’s an important place, if a more modest one than some might have hoped – as niche publications, as the new trade journals, newsletters, and zines. The idea of an A List of bloggers, then, is something of a misnomer now. The real A List of online media is made up almost entirely of the sites maintained by mainstream media companies. Bloggers seem fated to be, at best, B Listers.

Welcome back to frugal computing

1. The paradox of abundance

In a Wired article about the huge new data centers being built along the Columbia River by Google and its competitors, George Gilder writes that “in every era, the winning companies are those that waste what is abundant – as signalled by precipitously declining prices – in order to save what is scarce.” What is abundant today, he argues, is information technology – in particular, computing cycles, data storage, network bandwidth. Google, writes Gilder, operates a “massively parallel, prodigally wasteful petascale computer” in order to be “parsimonious with that most precious of resources, users’ patience.”

Wired editor Chris Anderson expands on Gilder’s theme, and his own Long Tail thesis, in a presentation he’s been giving on “the economics of abundance.” Blogger David Hornik describes the core thrust of Anderson’s argument:

The basic idea is that incredible advances in technology have driven the cost of things like transistors, storage, bandwidth, to zero. And when the elements that make up a business are sufficiently abundant as to approach free, companies appropriately should view their businesses differently than when resources were scarce . . . They should use those resources with abandon, without concern for waste.

It’s certainly true that, from the standpoint of the consumers of basic computing resources, those resources often seem “sufficiently abundant as to approach free.” They are abundant, and that does recast a lot of economic tradeoffs, with far-reaching consequences. But if we step back and look at the supply side of computing, we see a very different picture. What Gilder calls “petascale computing” is anything but free. The marginal cost of supplying a dose of processing power or a chunk of storage may be infinitesimal, but the fixed costs of petascale computing are very, very high. Led by web-computing giants like Google, Microsoft, Amazon, and Ask.com, companies are dumping billions of dollars of capital into constructing utility-class computing centers. And keeping those centers running requires, as Gilder himself notes, the “awesome consumption” of electricity:

If it’s necessary to waste memory and bandwidth to dominate the petascale era, gorging on energy is an inescapable cost of doing business. Ask.com operations VP Dayne Sampson estimates that the five leading search companies together have some 2 million servers, each shedding 300 watts of heat annually, a total of 600 megawatts. These are linked to hard drives that dissipate perhaps another gigawatt. Fifty percent again as much power is required to cool this searing heat, for a total of 2.4 gigawatts. With a third of the incoming power already lost to the grid’s inefficiencies, and half of what’s left lost to power supplies, transformers, and converters, the total of electricity consumed by major search engines in 2006 approaches 5 gigawatts.
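One way to reconstruct Gilder’s chain of estimates so that it reproduces his total (the inputs are his; the arrangement is my guess):

```python
# Reconstructing Gilder's back-of-the-envelope energy estimate.
servers_gw = 2_000_000 * 300 / 1e9  # 2M servers x 300 W = 0.6 GW of heat
disks_gw = 1.0                      # his estimate for the hard drives
it_load = servers_gw + disks_gw     # 1.6 GW of computing heat
with_cooling = it_load * 1.5        # fifty percent again for cooling = 2.4 GW
metered = with_cooling / 0.5        # half lost to supplies, transformers, converters
print(f"{metered:.1f} GW")          # 4.8 GW -- "approaches 5 gigawatts"
# Grid inefficiencies upstream of the meter push generation higher still.
```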

In arguing that computing is “almost free,” while at the same time describing how costly it actually is, Gilder overlooks the paradox of abundance: that providing a resource in the quantities required to make it seem “free” can be a very expensive undertaking.

Kevin Kelly, in an interview with Gilder conducted 13 years ago and also published in Wired, got at this paradox. Early in the interview, Gilder asserts, with remarkable prescience, that “you’re going to find that just as the integrated circuit rendered transistors – and hence mips and bits – virtually free, fiber optics is going to render bandwidth and hertz virtually free.” Kelly challenges him: “Every time I hear the phrase ‘virtually free’ I think of the claim about nuclear power: ‘too cheap to meter.’ It’s almost utopian. I find myself not believing it, as much as I want to go along with the idea.” Gilder brushes him off: “When things become free you ignore them. Transistors that used to be seven bucks apiece now cost about a millionth of a cent. That means that you can regard them as insignificant, just as they’re becoming ubiquitous and thus determining the whole atmosphere of enterprise.”

They’re talking past each other because they’re looking at different things – at different ends of the supply chain. Gilder is focused so intently on abundance that he wants to see it everywhere, and, for all his foresight, that leads him to the mistaken, and dangerous, conclusion that computing resources should be “wasted” – or, in Anderson’s words, used “with abandon, without concern for waste.”

In this context, Gilder’s description of Google’s data centers as “prodigally wasteful” is misleading. As Google and its engineers have made clear time and again, the company is not a wastrel but rather a radical conservationist when it comes to computing. It painstakingly engineers every element of its massively parallel system to operate as efficiently as possible – to keep waste, particularly energy waste, to a minimum. And it’s precisely in its thriftiness that Google becomes such a powerful model for – and herald of – a new era in computing.

Brian Hayes, in a 2001 American Scientist essay called “The Computer and the Dynamo,” wrote that “efficiency is more than a matter of economics and industrial policy; it has an aesthetic aspect and even an ethical one … There is satisfaction in accomplishing more with less, in wringing the most results out of the least resources. For a long time this was a prominent strand in the mental habits of computer enthusiasts. To waste a CPU cycle or a byte of memory was an embarrassing lapse. To clobber a small problem with a big computer was considered tasteless and unsporting, like trout fishing with dynamite. Not even rolling blackouts will roll us back to that quaint age of frugal computing, but there is much to admire in its ethos.”

Hayes spoke too soon. Far from being a relic of a bygone era, frugal computing is back – with a vengeance. And the consequences are going to be felt throughout the entire IT business, from the vendors that sell computers, software, and services to the companies that use those products in running their own businesses. After more than two decades of prodigally wasteful computing, the ethos of frugality has returned.

2. The coal-fired computer

“The computers we love so dearly,” wrote Timothy Prickett Morgan in 2004, “are among the most inefficient devices ever invented”; most of the electricity that goes into them is released “as heat, noise, and light”:

The heat of computers comes from chips and mechanical components, the noise comes from fans and disks, and light comes in the form of blinking lights and monitors. Once any kind of computer makes its heat … the energy cycle doesn’t end there. That heat has to be removed so the computers and the people near them can continue functioning properly. This, ironically, takes more fans and air conditioners, and therefore more electricity … And while the electricity bills for running and cooling computers are generally not part of an IT budget, a company with lots of computers has to pay for all that juice.

The energy-inefficiency of the machines themselves is compounded by the way we’ve come to use them. The reigning client-server model of business computing requires that we have far more computers than we actually need. Servers and other hardware are dedicated to running individual applications, and they’re housed in data centers constructed to serve individual companies. The fragmentation of computing has led, by necessity, to woefully low levels of capacity utilization – 10% to 30% seems to be the norm in modern data centers. Compare that to the 90% capacity utilization rates routinely achieved by mainframes, and you get a good sense of how much waste is built into business computing today. The majority of computing capacity – and the electricity required to keep it running – is squandered.

Prickett Morgan calculates that, including secondary air-conditioning costs, the world’s PCs and servers eat up 2.5 trillion kilowatt-hours of energy every year, which, at 10 cents per kilowatt-hour, amounts to “$250 billion in hard, cold cash a year. Assuming that a server or PC is only used to do real work about 15 percent of the time, that means about $213 billion of that was absolutely wasted. If you were fair and added in the cost of coal mining, nuclear power plant maintenance and disposal of nuclear wastes, and pollution caused by electricity generation, these numbers would explode further.”
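His arithmetic, reconstructed – the inputs are his estimates, not measurements:

```python
# Prickett Morgan's waste estimate, step by step.
kwh_per_year = 2.5e12   # PCs and servers worldwide, including cooling
cost_per_kwh = 0.10     # dollars
total_bill = kwh_per_year * cost_per_kwh
utilization = 0.15      # share of the time spent doing "real work"
wasted = total_bill * (1 - utilization)
print(f"${total_bill / 1e9:.0f}B spent, ${wasted / 1e9:.0f}B wasted")
# -> $250B spent, $212B wasted (he rounds up to $213 billion)
```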

As Prickett Morgan notes, his numbers (like Gilder’s) are inexact – they’re just educated guesses. It’s impossible to know precisely how much power is being consumed by computing and communications systems and how much of that is wasted. (That, in fact, is the theme of Brian Hayes’s essay.) But it’s clear that both numbers are very large – large enough that they’re beginning to matter to companies. IT’s electricity costs are no longer just a hidden line item on the corporate budget. They’re a problem. Gartner estimates that in five years electricity will account for 20% to 40% of companies’ entire IT budgets. A new Business Week article called “Coping with Data Centers in Crisis” reports that “market researchers at IDC expect companies to spend more money to power and cool servers by 2009 than they will spend on the servers in the first place.” Beyond the high cost, many companies simply can’t get enough electricity to power their server-packed data centers. They’ve tapped out the grid.

And, of course, that’s just electricity. The fragmentation and poor capacity utilization of client-server computing also means that companies have had to buy a lot more gear and software than they’ve needed. Although IT vendors aren’t to blame for all the excess investment – it’s been a byproduct of the immaturity of information technology and, particularly, data communications – they’ve certainly been its primary beneficiaries. It’s been in their interest to promote and perpetuate the complexity and inefficiency of the current model.

3. Greenpeace in the data center

But the old model can’t be sustained for much longer. The economic costs of all the waste are bad enough. But there will soon be political costs as well. Environmental activists have, in recent years, pressured PC makers to take responsibility for recycling their machines. But they have yet to focus on the way information technology is used. As soon as activists, and the public in general, begin to understand how much electricity is wasted by computing and communication systems – and the consequences of that waste for the environment and in particular global warming – they’ll begin demanding that the makers and users of information technology improve efficiency dramatically. Greenpeace and its rainbow warriors will soon storm the data center – your data center.

As Edward Cone notes in a recent CIO Insight article, few IT managers have made conservation a priority in their decision making. But that will change quickly as public pressure mounts:

“We are right on the cusp of change,” says Adam Braunstein, [an IT analyst]. “If you look at things that are already concerns today, like waste disposal or power consumption—heating and cooling issues—and you consider the impact on the ways companies manage their technology, well, it’s going to be a very different world for CIOs in the near future.” Think of environmental consciousness as the next level of alignment, an enterprise-wide phenomenon that IT must support and sometimes lead. Eco-friendly IT may not be a strategic priority at your company, but it probably will be soon. The financial impact of energy costs, the legal liability surrounding device disposal, and the possible marketing benefits of being seen as a socially-conscious company are all drivers of this new reality. Plus, you know, saving the planet. The era of the Green CIO is almost upon us.

The good news is that we now have the technologies required to move beyond the client-server model and into a new era of frugal computing. Many of the most exciting advances in IT today, from virtualization to grid computing to autonomic computing to fiber-optic networking to software-as-a-service, share one thing in common: They make corporate computing much more efficient. They allow us to move from the inflexible single-purpose and “single-tenant” systems of client-server computing, with their poor capacity utilization, to flexible, shared “multi-tenant” systems, which can achieve capacity utilization rates of 80% or higher – rates reminiscent of the mainframe era.
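The efficiency gain is easy to quantify. A rough consolidation sketch, using the utilization figures cited above (the fleet size is illustrative, not from any vendor’s study):

```python
import math

# How many shared, virtualized hosts replace a fleet of dedicated servers?
dedicated_servers = 1000
dedicated_utilization = 0.15   # the client-server norm cited above
target_utilization = 0.80      # achievable on multi-tenant systems

useful_work = dedicated_servers * dedicated_utilization  # 150 "server-loads"
hosts_needed = math.ceil(useful_work / target_utilization)
print(hosts_needed)  # 188 hosts -- roughly an 80% cut in machines,
                     # and in the electricity to run and cool them
```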

4. Winners and losers

As the economic and political costs of client-server computing grow, the shift to more efficient computing systems will accelerate. That’s not only going to change the nature of IT investment and management, it’s going to change the IT industry. Who will the winners be? We can’t know for sure, but some companies are well positioned to reap benefits from the change. There are, for instance, a handful of traditional suppliers – Sun Microsystems, Hewlett-Packard, and AMD, among them – that have made energy efficiency a priority. That gives them a technical and marketing advantage that they may be able to sustain.

There are also early leaders in creating multi-tenant utility systems, such as Amazon.com’s web services unit and Deutsche Telekom’s T-Systems division, which allow companies to avoid buying, running and powering their own hardware. They, too, are well positioned. The rapidly expanding software-as-a-service sector, which also uses efficient multi-tenant systems, offers an increasingly attractive alternative to traditional enterprise software applications. And, of course, there’s Google, which has been a pioneer in efficient computing at both the component and the systems level. Not all of these companies will be successful over the long run, but they do point the way to the future. And they’ve thrown down the gauntlet for the IT firms that cling to the inefficient model that up to now has been so lucrative for vendors.

The biggest winners, though, will be the users of IT. Although the transition to the post-client-server world will be difficult, companies will end up with cheaper, more efficient, and more flexible information systems. And, as Brian Hayes pointed out, we shouldn’t underestimate the aesthetic and ethical benefits for IT professionals. Doing more with less is more satisfying than doing less with more.

This is the second in a series of occasional Rough Type commentaries on the future of business computing. The first commentary looked at the prospects for “Office 2.0.”

Leaving the commune

It was just a year ago that Tim O’Reilly was talking about Web 2.0 in terms of “collective consciousness” and “the potential of what it is to be human.” “The Internet today is so much an echo of what we were talking about at [New Age HQ] Esalen in the ’70s,” he told Steven Levy, “- except we didn’t know it would be technology-mediated.”

But O’Reilly has changed his tune. Now, on the eve of his latest Web 2.0 Conference, he admits that he views Web 2.0 not in millennialist terms but in purely mercantile ones – as just another way to make a buck. An article in the San Francisco Chronicle notes that many people, “perhaps reacting to the greed that fueled the IPOs of the dot-com years, saw in Web 2.0 a chance to create a new collectivism.” But O’Reilly disagrees. “I don’t see it that way at all,” he says:

Web 2.0, he says, is about business. He says many tech movements start out with similar idealism, only to give way to capitalism. For instance, O’Reilly says, Napster introduced file sharing, but now iTunes has people comfortable with paying for music online. “You do a barn raising at a particular stage of society,” he said, “and then the developers come in … It always happens that way.”

He’s right, of course. But he puts it so coldly – “and then the developers come in” – that even I feel a little twinge of nostalgia for the old idealism, the old hippie dream. I feel like a yin without a yang.

By the way, O’Reilly just announced that he’s written an expanded version of his “What Is Web 2.0?” essay. It’s called “Web 2.0 Principles and Best Practices.” You can buy a copy for $375.

Crowdsourcing surveillance

Good news, folks. In the latest sign of the rapidly expanding scope of Web-based social production, Texas’s Virtual Neighborhood Border Watch Program has gone live, in a beta test open to all. The Del Rio News Herald reports that the Lone Star State began streaming video from nine surveillance cameras along the Mexican border on Friday, allowing regular citizens to tune in over the Internet and watch for people attempting to cross the border illegally. Beneath each video stream is a “Report Suspicious Activity” button, which can be used to alert local authorities should any potential mischief be spotted. There is also a button for ordering donuts.