1. The paradox of abundance
In a Wired article about the huge new data centers being built along the Columbia River by Google and its competitors, George Gilder writes that “in every era, the winning companies are those that waste what is abundant – as signalled by precipitously declining prices – in order to save what is scarce.” What is abundant today, he argues, is information technology – in particular, computing cycles, data storage, network bandwidth. Google, writes Gilder, operates a “massively parallel, prodigally wasteful petascale computer” in order to be “parsimonious with that most precious of resources, users’ patience.”
Wired editor Chris Anderson expands on Gilder’s theme, and his own Long Tail thesis, in a presentation he’s been giving on “the economics of abundance.” Blogger David Hornik describes the core thrust of Anderson’s argument:
The basic idea is that incredible advances in technology have driven the cost of things like transistors, storage, bandwidth, to zero. And when the elements that make up a business are sufficiently abundant as to approach free, companies appropriately should view their businesses differently than when resources were scarce . . . They should use those resources with abandon, without concern for waste.
It’s certainly true that, from the standpoint of the consumers of basic computing resources, those resources often seem “sufficiently abundant as to approach free.” They are abundant, and that does recast a lot of economic tradeoffs, with far-reaching consequences. But if we step back and look at the supply side of computing, we see a very different picture. What Gilder calls “petascale computing” is anything but free. The marginal cost of supplying a dose of processing power or a chunk of storage may be infinitesimal, but the fixed costs of petascale computing are very, very high. Led by web-computing giants like Google, Microsoft, Amazon, and Ask.com, companies are dumping billions of dollars of capital into constructing utility-class computing centers. And keeping those centers running requires, as Gilder himself notes, the “awesome consumption” of electricity:
If it’s necessary to waste memory and bandwidth to dominate the petascale era, gorging on energy is an inescapable cost of doing business. Ask.com operations VP Dayne Sampson estimates that the five leading search companies together have some 2 million servers, each shedding 300 watts of heat annually, a total of 600 megawatts. These are linked to hard drives that dissipate perhaps another gigawatt. Fifty percent again as much power is required to cool this searing heat, for a total of 2.4 gigawatts. With a third of the incoming power already lost to the grid’s inefficiencies, and half of what’s left lost to power supplies, transformers, and converters, the total of electricity consumed by major search engines in 2006 approaches 5 gigawatts.
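Gilder's estimate compounds step by step, and it's easier to follow laid out as explicit arithmetic. Here is one reconstruction, in Python, that reproduces his totals using only the numbers in the quote; the sequencing of the losses is my reading of his accounting, not his actual worksheet:

```python
# Back-of-envelope reconstruction of Gilder's estimate, using only
# the figures quoted above. All values in gigawatts (GW).

servers = 2_000_000 * 300 / 1e9   # 2M servers at 300 W each -> 0.6 GW
drives = 1.0                      # heat dissipated by hard drives (quoted)
it_load = servers + drives        # 1.6 GW of computing heat

cooling = 0.5 * it_load           # "fifty percent again as much" for cooling
useful = it_load + cooling        # 2.4 GW of useful load at the equipment

# If half the power delivered to a facility is lost in power supplies,
# transformers, and converters, the draw at the meter is twice the useful load.
grid_draw = useful * 2            # 4.8 GW

# With a third of generated power lost to the grid's inefficiencies,
# only two-thirds of what the plants produce reaches the facilities.
generated = grid_draw / (2 / 3)   # 7.2 GW at the power plants

print(f"IT load: {it_load:.1f} GW, with cooling: {useful:.1f} GW")
print(f"Drawn from the grid: {grid_draw:.1f} GW")
print(f"Required generation: {generated:.1f} GW")
```

On this reading, the 4.8 gigawatts drawn at the meter is the figure that "approaches 5 gigawatts," and the transmission losses Gilder mentions push the upstream generating requirement past 7 gigawatts.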
In arguing that computing is “almost free,” while at the same time describing how costly it actually is, Gilder overlooks the paradox of abundance: that providing a resource in the quantities required to make it seem “free” can be a very expensive undertaking.
Kevin Kelly, in an interview with Gilder 13 years ago, also published in Wired, got at this paradox. Early in the interview, Gilder asserted, with remarkable prescience, that “you’re going to find that just as the integrated circuit rendered transistors – and hence mips and bits – virtually free, fiber optics is going to render bandwidth and hertz virtually free.” Kelly challenged him: “Every time I hear the phrase ‘virtually free’ I think of the claim about nuclear power: ‘too cheap to meter.’ It’s almost utopian. I find myself not believing it, as much as I want to go along with the idea.” Gilder brushed him off: “When things become free you ignore them. Transistors that used to be seven bucks apiece now cost about a millionth of a cent. That means that you can regard them as insignificant, just as they’re becoming ubiquitous and thus determining the whole atmosphere of enterprise.”
They’re talking past each other because they’re looking at different things – at different ends of the supply chain. Gilder is focused so intently on abundance that he wants to see it everywhere, and, for all his foresight, that leads him to the mistaken, and dangerous, conclusion that computing resources should be “wasted” – or, in Anderson’s words, used “with abandon, without concern for waste.”
In this context, Gilder’s description of Google’s data centers as “prodigally wasteful” is misleading. As Google and its engineers have made clear time and again, the company is not a wastrel but rather a radical conservationist when it comes to computing. It painstakingly engineers every element of its massively parallel system to operate as efficiently as possible – to keep waste, particularly energy waste, to a minimum. And it’s precisely in its thriftiness that Google becomes such a powerful model for – and herald of – a new era in computing.
Brian Hayes, in a 2001 American Scientist essay called “The Computer and the Dynamo,” wrote that “efficiency is more than a matter of economics and industrial policy; it has an aesthetic aspect and even an ethical one … There is satisfaction in accomplishing more with less, in wringing the most results out of the least resources. For a long time this was a prominent strand in the mental habits of computer enthusiasts. To waste a CPU cycle or a byte of memory was an embarrassing lapse. To clobber a small problem with a big computer was considered tasteless and unsporting, like trout fishing with dynamite. Not even rolling blackouts will roll us back to that quaint age of frugal computing, but there is much to admire in its ethos.”
Hayes spoke too soon. Far from being a relic of a bygone era, frugal computing is back – with a vengeance. And the consequences are going to be felt throughout the entire IT business, from the vendors that sell computers, software, and services to the companies that use those products in running their own businesses. After more than two decades of prodigally wasteful computing, the ethos of frugality has returned.
2. The coal-fired computer
“The computers we love so dearly,” wrote Timothy Prickett Morgan in 2004, “are among the most inefficient devices ever invented”; most of the electricity that goes into them is released “as heat, noise, and light”:
The heat of computers comes from chips and mechanical components, the noise comes from fans and disks, and light comes in the form of blinking lights and monitors. Once any kind of computer makes its heat … the energy cycle doesn’t end there. That heat has to be removed so the computers and the people near them can continue functioning properly. This, ironically, takes more fans and air conditioners, and therefore more electricity … And while the electricity bills for running and cooling computers are generally not part of an IT budget, a company with lots of computers has to pay for all that juice.
The energy-inefficiency of the machines themselves is compounded by the way we’ve come to use them. The reigning client-server model of business computing requires that we have far more computers than we actually need. Servers and other hardware are dedicated to running individual applications, and they’re housed in data centers constructed to serve individual companies. The fragmentation of computing has led, by necessity, to woefully low levels of capacity utilization – 10% to 30% seems to be the norm in modern data centers. Compare that to the 90% capacity utilization rates routinely achieved by mainframes, and you get a good sense of how much waste is built into business computing today. The majority of computing capacity – and the electricity required to keep it running – is squandered.
Prickett Morgan calculates that, including secondary air-conditioning costs, the world’s PCs and servers eat up 2.5 trillion kilowatt-hours of energy every year, which, at 10 cents per kilowatt-hour, amounts to “$250 billion in hard, cold cash a year. Assuming that a server or PC is only used to do real work about 15 percent of the time, that means about $213 billion of that was absolutely wasted. If you were fair and added in the cost of coal mining, nuclear power plant maintenance and disposal of nuclear wastes, and pollution caused by electricity generation, these numbers would explode further.”
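His arithmetic is easy to check. A minimal sketch, using only the figures he cites:

```python
# Checking Prickett Morgan's waste estimate from the figures he cites.

annual_kwh = 2.5e12          # 2.5 trillion kilowatt-hours per year
price_per_kwh = 0.10         # 10 cents per kilowatt-hour

total_cost = annual_kwh * price_per_kwh       # $250 billion per year
utilization = 0.15                            # real work ~15% of the time
wasted_cost = total_cost * (1 - utilization)  # ~$212.5 billion per year

print(f"Total electricity bill: ${total_cost / 1e9:.0f} billion")
print(f"Wasted on idle capacity: ${wasted_cost / 1e9:.1f} billion")
```

That $212.5 billion rounds to his $213 billion. The sketch quietly assumes that an idle machine draws as much power as a busy one – a simplification, though the servers and PCs of the day did draw most of their peak power even when doing nothing.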
As Prickett Morgan notes, his numbers (like Gilder’s) are inexact – they’re just educated guesses. It’s impossible to know precisely how much power is being consumed by computing systems and how much of that is wasted. (That, in fact, is the theme of Brian Hayes’s essay.) But it’s clear that both numbers are very large – large enough that they’re beginning to matter to companies. IT’s electricity costs are no longer just a hidden line item on the corporate budget. They’re a problem. Gartner estimates that in five years electricity will account for 20% to 40% of companies’ entire IT budgets. A new Business Week article called “Coping with Data Centers in Crisis” reports that “market researchers at IDC expect companies to spend more money to power and cool servers by 2009 than they will spend on the servers in the first place.” Beyond the high cost, many companies simply can’t get enough electricity to power their server-packed data centers. They’ve tapped out the grid.
And, of course, that’s just electricity. The fragmentation and poor capacity utilization of client-server computing also means that companies have had to buy a lot more gear and software than they’ve needed. Although IT vendors aren’t to blame for all the excess investment – it’s been a byproduct of the immaturity of information technology and, particularly, data communications – they’ve certainly been its primary beneficiaries. It’s been in their interest to promote and perpetuate the complexity and inefficiency of the current model.
3. Greenpeace in the data center
But the old model can’t be sustained for much longer. The economic costs of all the waste are bad enough, and there will soon be political costs as well. Environmental activists have, in recent years, pressured PC makers to take responsibility for recycling their machines. But they have yet to focus on the way information technology is used. As soon as activists, and the public in general, begin to understand how much electricity is wasted by computing and communication systems – and the consequences of that waste for the environment and, in particular, for global warming – they’ll begin demanding that the makers and users of information technology improve efficiency dramatically. Greenpeace and its rainbow warriors will soon storm the data center – your data center.
As Edward Cone notes in a recent CIO Insight article, few IT managers have made conservation a priority in their decision making. But that will change quickly as public pressure mounts:
“We are right on the cusp of change,” says Adam Braunstein, [an IT analyst]. “If you look at things that are already concerns today, like waste disposal or power consumption—heating and cooling issues—and you consider the impact on the ways companies manage their technology, well, it’s going to be a very different world for CIOs in the near future.” Think of environmental consciousness as the next level of alignment, an enterprise-wide phenomenon that IT must support and sometimes lead. Eco-friendly IT may not be a strategic priority at your company, but it probably will be soon. The financial impact of energy costs, the legal liability surrounding device disposal, and the possible marketing benefits of being seen as a socially-conscious company are all drivers of this new reality. Plus, you know, saving the planet. The era of the Green CIO is almost upon us.
The good news is that we now have the technologies required to move beyond the client-server model and into a new era of frugal computing. Many of the most exciting advances in IT today, from virtualization to grid computing to autonomic computing to fiber-optic networking to software-as-a-service, share one thing in common: They make corporate computing much more efficient. They allow us to move from the inflexible single-purpose and “single-tenant” systems of client-server computing, with their poor capacity utilization, to flexible, shared “multi-tenant” systems, which can achieve capacity utilization rates of 80% or higher – rates reminiscent of the mainframe era.
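The size of that efficiency gain is easy to quantify. Here is a hypothetical consolidation sketch – the fleet of 1,000 servers and the per-server wattage (borrowed from Gilder’s figures above) are illustrative, while the 15% and 80% utilization rates come from the discussion above:

```python
# Illustrative consolidation math: how multi-tenant sharing shrinks
# a server fleet. Fleet size and wattage are hypothetical.

dedicated_servers = 1000   # single-tenant boxes, one application each
dedicated_util = 0.15      # typical client-server utilization (10-30%)
watts_per_server = 300     # per-server draw, as in Gilder's figures

# Total useful work, expressed in "fully busy server" equivalents.
useful_work = dedicated_servers * dedicated_util   # 150 servers' worth

# The same work on a shared, multi-tenant platform at 80% utilization.
shared_util = 0.80
shared_servers = useful_work / shared_util         # ~188 servers

power_before = dedicated_servers * watts_per_server / 1000  # 300 kW
power_after = shared_servers * watts_per_server / 1000      # ~56 kW

print(f"Servers: {dedicated_servers} -> {shared_servers:.0f}")
print(f"Power:   {power_before:.0f} kW -> {power_after:.0f} kW")
```

Even before counting cooling, the same work gets done with roughly a fifth of the machines and a fifth of the electricity.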
4. Winners and losers
As the economic and political costs of client-server computing grow, the shift to more efficient computing systems will accelerate. That’s not only going to change the nature of IT investment and management, it’s going to change the IT industry. Who will the winners be? We can’t know for sure, but some companies are well positioned to reap benefits from the change. There are, for instance, a handful of traditional suppliers – Sun Microsystems, Hewlett-Packard, and AMD, among them – that have made energy efficiency a priority. That gives them a technical and marketing advantage that they may be able to sustain.
There are also early leaders in creating multi-tenant utility systems, such as Amazon.com’s web services unit and Deutsche Telekom’s T-Systems division, which allow companies to avoid buying, running, and powering their own hardware. They, too, are well positioned. The rapidly expanding software-as-a-service sector, which also uses efficient multi-tenant systems, offers an increasingly attractive alternative to traditional enterprise software applications. And, of course, there’s Google, which has been a pioneer in efficient computing at both the component and the systems level. Not all of these companies will be successful over the long run, but they do point the way to the future. And they’ve thrown down the gauntlet for the IT firms that cling to the inefficient model that up to now has been so lucrative for vendors.
The biggest winners, though, will be the users of IT. Although the transition to the post-client-server world will be difficult, companies will end up with cheaper, more efficient, and more flexible information systems. And, as Brian Hayes pointed out, we shouldn’t underestimate the aesthetic and ethical benefits for IT professionals. Doing more with less is more satisfying than doing less with more.
This is the second in a series of occasional Rough Type commentaries on the future of business computing. The first commentary looked at the prospects for “Office 2.0.”