Welcome back to frugal computing

1. The paradox of abundance

In a Wired article about the huge new data centers being built along the Columbia River by Google and its competitors, George Gilder writes that “in every era, the winning companies are those that waste what is abundant – as signalled by precipitously declining prices – in order to save what is scarce.” What is abundant today, he argues, is information technology – in particular, computing cycles, data storage, network bandwidth. Google, writes Gilder, operates a “massively parallel, prodigally wasteful petascale computer” in order to be “parsimonious with that most precious of resources, users’ patience.”

Wired editor Chris Anderson expands on Gilder’s theme, and his own Long Tail thesis, in a presentation he’s been giving on “the economics of abundance.” Blogger David Hornik describes the core thrust of Anderson’s argument:

The basic idea is that incredible advances in technology have driven the cost of things like transistors, storage, bandwidth, to zero. And when the elements that make up a business are sufficiently abundant as to approach free, companies appropriately should view their businesses differently than when resources were scarce . . . They should use those resources with abandon, without concern for waste.

It’s certainly true that, from the standpoint of the consumers of basic computing resources, those resources often seem “sufficiently abundant as to approach free.” They are abundant, and that does recast a lot of economic tradeoffs, with far-reaching consequences. But if we step back and look at the supply side of computing, we see a very different picture. What Gilder calls “petascale computing” is anything but free. The marginal cost of supplying a dose of processing power or a chunk of storage may be infinitesimal, but the fixed costs of petascale computing are very, very high. Led by web-computing giants like Google, Microsoft, Amazon, and Ask.com, companies are dumping billions of dollars of capital into constructing utility-class computing centers. And keeping those centers running requires, as Gilder himself notes, the “awesome consumption” of electricity:

If it’s necessary to waste memory and bandwidth to dominate the petascale era, gorging on energy is an inescapable cost of doing business. Ask.com operations VP Dayne Sampson estimates that the five leading search companies together have some 2 million servers, each shedding 300 watts of heat annually, a total of 600 megawatts. These are linked to hard drives that dissipate perhaps another gigawatt. Fifty percent again as much power is required to cool this searing heat, for a total of 2.4 gigawatts. With a third of the incoming power already lost to the grid’s inefficiencies, and half of what’s left lost to power supplies, transformers, and converters, the total of electricity consumed by major search engines in 2006 approaches 5 gigawatts.
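
Gilder’s chain of estimates is easier to follow laid out step by step. Here is a rough restatement in Python of the figures quoted above – not an independent calculation, and the exact loss model behind his final figure of roughly 5 gigawatts isn’t spelled out, so that last step is left as his stated result:

    # Restating Gilder's back-of-the-envelope figures (quoted above).
    servers = 2_000_000          # servers at the five leading search companies
    watts_per_server = 300       # heat shed per server

    server_heat_mw = servers * watts_per_server / 1e6   # 600 MW
    drive_heat_mw = 1_000                                # "perhaps another gigawatt" of disk heat
    it_load_mw = server_heat_mw + drive_heat_mw          # 1,600 MW

    cooling_mw = 0.5 * it_load_mw                        # "fifty percent again as much" for cooling
    facility_mw = it_load_mw + cooling_mw                # 2,400 MW, i.e. 2.4 GW

    print(f"IT load: {it_load_mw:,.0f} MW; facility total: {facility_mw:,.0f} MW")
    # Adding grid losses (about a third of incoming power) and power-supply,
    # transformer, and converter losses (about half of what's left) is how
    # Gilder gets from here to roughly 5 GW consumed in 2006.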

In arguing that computing is “almost free,” while at the same time describing how costly it actually is, Gilder overlooks the paradox of abundance: that providing a resource in the quantities required to make it seem “free” can be a very expensive undertaking.

Kevin Kelly, in an interview with Gilder 13 years ago, also published in Wired, got at this paradox. Early in the interview, Gilder asserted, with remarkable prescience, that “you’re going to find that just as the integrated circuit rendered transistors – and hence mips and bits – virtually free, fiber optics is going to render bandwidth and hertz virtually free.” Kelly challenged Gilder: “Every time I hear the phrase ‘virtually free’ I think of the claim about nuclear power: ‘too cheap to meter.’ It’s almost utopian. I find myself not believing it, as much as I want to go along with the idea.” Gilder brushed him off: “When things become free you ignore them. Transistors that used to be seven bucks apiece now cost about a millionth of a cent. That means that you can regard them as insignificant, just as they’re becoming ubiquitous and thus determining the whole atmosphere of enterprise.”

They’re talking past each other because they’re looking at different things – at different ends of the supply chain. Gilder is focused so intently on abundance that he wants to see it everywhere, and, for all his foresight, that leads him to the mistaken, and dangerous, conclusion that computing resources should be “wasted” – or, in Anderson’s words, used “with abandon, without concern for waste.”

In this context, Gilder’s description of Google’s data centers as “prodigally wasteful” is misleading. As Google and its engineers have made clear time and again, the company is not a wastrel but rather a radical conservationist when it comes to computing. It painstakingly engineers every element of its massively parallel system to operate as efficiently as possible – to keep waste, particularly energy waste, to a minimum. And it’s precisely in its thriftiness that Google becomes such a powerful model for – and herald of – a new era in computing.

Brian Hayes, in a 2001 American Scientist essay called “The Computer and the Dynamo,” wrote that “efficiency is more than a matter of economics and industrial policy; it has an aesthetic aspect and even an ethical one … There is satisfaction in accomplishing more with less, in wringing the most results out of the least resources. For a long time this was a prominent strand in the mental habits of computer enthusiasts. To waste a CPU cycle or a byte of memory was an embarrassing lapse. To clobber a small problem with a big computer was considered tasteless and unsporting, like trout fishing with dynamite. Not even rolling blackouts will roll us back to that quaint age of frugal computing, but there is much to admire in its ethos.”

Hayes spoke too soon. Far from being a relic of a bygone era, frugal computing is back – with a vengeance. And the consequences are going to be felt throughout the entire IT business, from the vendors that sell computers, software, and services to the companies that use those products in running their own businesses. After more than two decades of prodigally wasteful computing, the ethos of frugality has returned.

2. The coal-fired computer

“The computers we love so dearly,” wrote Timothy Prickett Morgan in 2004, “are among the most inefficient devices ever invented”; most of the electricity that goes into them is released “as heat, noise, and light”:

The heat of computers comes from chips and mechanical components, the noise comes from fans and disks, and light comes in the form of blinking lights and monitors. Once any kind of computer makes its heat … the energy cycle doesn’t end there. That heat has to be removed so the computers and the people near them can continue functioning properly. This, ironically, takes more fans and air conditioners, and therefore more electricity … And while the electricity bills for running and cooling computers are generally not part of an IT budget, a company with lots of computers has to pay for all that juice.

The energy-inefficiency of the machines themselves is compounded by the way we’ve come to use them. The reigning client-server model of business computing requires that we have far more computers than we actually need. Servers and other hardware are dedicated to running individual applications, and they’re housed in data centers constructed to serve individual companies. The fragmentation of computing has led, by necessity, to woefully low levels of capacity utilization – 10% to 30% seems to be the norm in modern data centers. Compare that to the 90% capacity utilization rates routinely achieved by mainframes, and you get a good sense of how much waste is built into business computing today. The majority of computing capacity – and the electricity required to keep it running – is squandered.

Prickett Morgan calculates that, including secondary air-conditioning costs, the world’s PCs and servers eat up 2.5 trillion kilowatt-hours of energy every year, which, at 10 cents per kilowatt-hour, amounts to “$250 billion in hard, cold cash a year. Assuming that a server or PC is only used to do real work about 15 percent of the time, that means about $213 billion of that was absolutely wasted. If you were fair and added in the cost of coal mining, nuclear power plant maintenance and disposal of nuclear wastes, and pollution caused by electricity generation, these numbers would explode further.”
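
Prickett Morgan’s arithmetic is easy to verify. A quick sketch, restating his assumptions:

    # Prickett Morgan's assumptions, restated.
    annual_kwh = 2.5e12        # kWh per year for the world's PCs and servers, cooling included
    price_per_kwh = 0.10       # dollars per kWh
    useful_fraction = 0.15     # share of the time a machine does "real work"

    annual_cost = annual_kwh * price_per_kwh           # $250 billion
    wasted_cost = annual_cost * (1 - useful_fraction)  # $212.5 billion, his "about $213 billion"

    print(f"annual bill: ${annual_cost / 1e9:.0f} billion; wasted: ${wasted_cost / 1e9:.1f} billion")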

As Prickett Morgan notes, his numbers (like Gilder’s) are inexact – they’re just educated guesses. It’s impossible to know precisely how much power is being consumed by computing systems and how much of that is wasted. (That, in fact, is the theme of Brian Hayes’s essay.) But it’s clear that both numbers are very large – large enough that they’re beginning to matter to companies. IT’s electricity costs are no longer just a hidden line item on the corporate budget. They’re a problem. Gartner estimates that in five years electricity will account for 20% to 40% of companies’ entire IT budgets. A new Business Week article called “Coping with Data Centers in Crisis” reports that “market researchers at IDC expect companies to spend more money to power and cool servers by 2009 than they will spend on the servers in the first place.” Beyond the high cost, many companies simply can’t get enough electricity to power their server-packed data centers. They’ve tapped out the grid.

And, of course, that’s just electricity. The fragmentation and poor capacity utilization of client-server computing also means that companies have had to buy a lot more gear and software than they’ve needed. Although IT vendors aren’t to blame for all the excess investment – it’s been a byproduct of the immaturity of information technology and, particularly, data communications – they’ve certainly been its primary beneficiaries. It’s been in their interest to promote and perpetuate the complexity and inefficiency of the current model.

3. Greenpeace in the data center

But the old model can’t be sustained for much longer. The economic costs of all the waste are bad enough. But there will soon be political costs as well. Environmental activists have, in recent years, pressured PC makers to take responsibility for recycling their machines. But they have yet to focus on the way information technology is used. As soon as activists, and the public in general, begin to understand how much electricity is wasted by computing and communication systems – and the consequences of that waste for the environment, and for global warming in particular – they’ll begin demanding that the makers and users of information technology improve efficiency dramatically. Greenpeace and its rainbow warriors will soon storm the data center – your data center.

As Edward Cone notes in a recent CIO Insight article, few IT managers have made conservation a priority in their decision making. But that will change quickly as public pressure mounts:

“We are right on the cusp of change,” says Adam Braunstein, [an IT analyst]. “If you look at things that are already concerns today, like waste disposal or power consumption—heating and cooling issues—and you consider the impact on the ways companies manage their technology, well, it’s going to be a very different world for CIOs in the near future.” Think of environmental consciousness as the next level of alignment, an enterprise-wide phenomenon that IT must support and sometimes lead. Eco-friendly IT may not be a strategic priority at your company, but it probably will be soon. The financial impact of energy costs, the legal liability surrounding device disposal, and the possible marketing benefits of being seen as a socially-conscious company are all drivers of this new reality. Plus, you know, saving the planet. The era of the Green CIO is almost upon us.

The good news is that we now have the technologies required to move beyond the client-server model and into a new era of frugal computing. Many of the most exciting advances in IT today, from virtualization to grid computing to autonomic computing to data encryption to fiber-optic networking to software-as-a-service, share one thing in common: They make corporate computing much more efficient. They allow us to move from the inflexible single-purpose and “single-tenant” systems of client-server computing, with their poor capacity utilization, to flexible, shared “multi-tenant” systems, which can achieve capacity utilization rates of 80% or higher – rates reminiscent of the mainframe era.
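
Most of that efficiency gain comes straight out of utilization arithmetic. A minimal sketch, using the utilization rates cited above and otherwise made-up figures (the 300-watt-per-server number is borrowed from Gilder’s estimate earlier in the piece, and per-machine draw is assumed roughly constant whether the machine is busy or idle):

    # Illustrative consolidation arithmetic: the same aggregate workload on
    # dedicated single-tenant servers versus a shared multi-tenant pool.
    import math

    WORKLOAD = 300.0        # arbitrary units of useful work to be delivered
    CAPACITY = 1.0          # units a fully busy server can deliver
    WATTS_PER_SERVER = 300  # assumed roughly constant, busy or idle

    def fleet(utilization):
        n = math.ceil(WORKLOAD / (CAPACITY * utilization))
        return n, n * WATTS_PER_SERVER / 1000.0  # machine count, kilowatts

    single_tenant_n, single_tenant_kw = fleet(0.15)  # typical client-server utilization
    multi_tenant_n, multi_tenant_kw = fleet(0.80)    # shared utility utilization

    print(f"single-tenant: {single_tenant_n} servers, ~{single_tenant_kw:.0f} kW")
    print(f"multi-tenant:  {multi_tenant_n} servers, ~{multi_tenant_kw:.0f} kW")
    print(f"machines (and, roughly, power) cut by {1 - multi_tenant_n / single_tenant_n:.0%}")

The particular numbers don’t matter; the point is that once capacity can be shared, the same work needs a fraction of the machines and the electricity.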

4. Winners and losers

As the economic and political costs of client-server computing grow, the shift to more efficient computing systems will accelerate. That’s not only going to change the nature of IT investment and management, it’s going to change the IT industry. Who will the winners be? We can’t know for sure, but some companies are well positioned to reap benefits from the change. There are, for instance, a handful of traditional suppliers – Sun Microsystems, Hewlett-Packard, and AMD, among them – that have made energy efficiency a priority. That gives them a technical and marketing advantage that they may be able to sustain.

There are also early leaders in creating multi-tenant utility systems, such as Amazon.com’s web services unit and Deutsche Telekom’s T-Systems division, which allow companies to avoid buying, running and powering their own hardware. They, too, are well positioned. The rapidly expanding software-as-a-service sector, which also uses efficient multi-tenant systems, offers an increasingly attractive alternative to traditional enterprise software applications. And, of course, there’s Google, which has been a pioneer in efficient computing at both the component and the systems level. Not all of these companies will be successful over the long run, but they do point the way to the future. And they’ve thrown down the gauntlet for the IT firms that cling to the inefficient model that up to now has been so lucrative for vendors.

The biggest winners, though, will be the users of IT. Although the transition to the post-client-server world will be difficult, companies will end up with cheaper, more efficient, and more flexible information systems. And, as Brian Hayes pointed out, we shouldn’t underestimate the aesthetic and ethical benefits for IT professionals. Doing more with less is more satisfying than doing less with more.

This is the second in a series of occasional Rough Type commentaries on the future of business computing. The first commentary looked at the prospects for “Office 2.0.”

11 thoughts on “Welcome back to frugal computing”

  1. SteveEisner

    As I understand it, there’s at least one layer of “prodigal waste” inherent in the Google search architecture. Each request is farmed out to multiple servers, to gather both OneBox results and main search results. Requests are made to multiple redundant servers, and those that fail to respond in time are ignored… This contributes in part to the overall speedy response.

    I don’t know the economics of wasted CPU cycles, but it may be penny smart and pound foolish to look at generally frugal machine maintenance but ignore that it could take ten machines to service your query?
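
    (A minimal sketch of the fan-out-and-ignore-stragglers pattern described above, in Python; the replica names and timings are hypothetical, not Google’s actual mechanism:)

        # Fan a query out to redundant replicas, keep whatever answers arrive
        # before the deadline, and simply ignore the stragglers.
        import asyncio
        import random

        async def query_replica(name: str, query: str) -> str:
            await asyncio.sleep(random.uniform(0.01, 0.5))  # simulated, variable latency
            return f"{name}: results for {query!r}"

        async def fan_out(query: str, replicas, deadline: float = 0.25):
            tasks = [asyncio.create_task(query_replica(r, query)) for r in replicas]
            done, pending = await asyncio.wait(tasks, timeout=deadline)
            for task in pending:      # slow replicas are wasted work, by design
                task.cancel()
            return [task.result() for task in done]

        answers = asyncio.run(fan_out("frugal computing", ["replica-a", "replica-b", "replica-c"]))
        print(answers)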

  2. finn

    I doubt that capacity utilization at the client is going to improve any time soon – even with the advent of frugal, high-utilization server farms hosting multi-tenant services, we’ll still need desktops and laptops from which to connect. I don’t think those will be getting any less powerful as time goes on, and performance per watt has not been improving very much.

    While it’s a welcome change that new services will be deployed on high-utilization platforms, and backoffice functions will migrate as well, the “last mile” of IT will continue to be a land of 10%-or-less CPU utilization.

  3. Nick Carr

    Finn,

    I agree that with PCs and other client devices capacity utilization will likely remain very low. Efficiency gains will come from advances in the design/engineering of the products themselves, with, for instance, better power supplies and better power management systems. (The move from CRTs to flat panel displays for desktops has already helped.) And maybe companies will finally get rid of screensavers.

    Also, a killer app for grid computing might appear at some point, which could increase PC utilization substantially.

    Nick

  4. AdrianCockcroft

    The traditional IT shop is a very diverse mix of systems, with “stovepipes” preventing sharing, so utilization levels are extremely low. The new generation of systems described by Gilder are large pools of resources that are pattern based, i.e. a relatively small number of patterns are replicated a very large number of times. The utilization of these pooled systems is as high as the operators want it to be. There is no reason for low utilization: you add or remove machines from the pool to control how busy they are on average. The large sites (Google/Yahoo/MSN/eBay/etc.) are running continuous workloads on a world-wide basis, so unlike a typical IT shop they don’t go completely idle at night and at the weekends.
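
    (A toy sketch of that knob, with made-up numbers: pick a target average utilization and size the pool to the load, hour by hour.)

        # Made-up numbers; the point is only that average busyness is something
        # a pool operator chooses, not something that happens to them.
        import math

        TARGET_UTIL = 0.75
        PER_MACHINE_CAPACITY = 1.0  # units of work one machine can handle flat out

        def machines_for(load: float) -> int:
            return math.ceil(load / (PER_MACHINE_CAPACITY * TARGET_UTIL))

        for hour, load in [("02:00", 300), ("10:00", 700), ("20:00", 900)]:
            n = machines_for(load)
            print(f"{hour}: load {load:>4} -> {n:>4} machines, "
                  f"average utilization {load / (n * PER_MACHINE_CAPACITY):.0%}")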

  5. Raphaël Labbé

    Love that part: it’s the whole video/bandwidth problem:

    In arguing that computing is “almost free,” while at the same time describing how costly it actually is, Gilder overlooks the paradox of abundance: that providing a resource in the quantities required to make it seem “free” can be a very expensive undertaking.

    Brilliant post.

  6. pitsch

    Excellent article; it highlights the shift of power in today’s network architecture, especially in light of the latest failures in the electricity grids in the US and Europe. Are centralisation and monopolisation, privatisation and commercialisation really more efficient when you look at the development of large technical networks, i.e. the whole equation? The Stern Report shows the impact of climate change on upcoming economic policies: a rapid shift away from the irresponsible 90s cyber-culture of a utopian virtuality towards the planetary realities of today. If the economy was the problem, then it probably has to come up with a solution now (a Google carbon trust?). The history of large technical networks shows a tendency towards “empire” building, whether in transport, water, gas, electricity, or the internet. Harold C. Innis, teacher of McLuhan, was already approaching media from this angle. What do the large companies understand of this today? “Trust, or the lack thereof, is the number one factor blocking the adoption of software as a service,” says Microsoft about multi-tenant services. Josef Stalin said: “Control is better than trust.” By contrast, the decentralized peer-to-peer architecture of open-source culture, with its pragmatic communal approach to sharing information as an abundant resource, seems fundamentally more efficient than what the property model of third-wave capitalism can offer at the moment. The only possibility of controlling the internet and its resources of wrongly titled “intellectual property” is literally to run it on centralized corporate utility-class mainframes and capture the free labour of the users. This is the industrial backend of Web 2.0, a factory where the consumer becomes the producer. What Gilder calls the principle of abundance is called the fall of the rate of profit elsewhere…

  7. Sam

    Nick-

    I really enjoyed your excellent synopsis here and very much agree that it is an important issue worth lots of attention.

    It was Gilder’s energetic optimism that influenced my career shift into IT in the ’90s; but I am left somewhat limp now by the bits he leaves out, and by the sense of inevitability of the incentives to waste.

    My concern is that the weak financial incentives for efficiency will drag on the trend. As you note, the vendors – with some notable exceptions – like to perpetuate stability, i.e., waste. The “Fat Client” is the prominent case here, and Vista the classic current example. And users are partly to blame for not exerting enough influence.

    I am troubled that it takes a non-profit like Negroponte’s & Bender’s One Laptop Per Child (“olpc”) to drive the flat-panel industry to make super-efficient screens. Otherwise the Samsungs and Hitachis were headed toward the big, the new, the power-hungry product, with fat margins and, as you say, even fatter TCO stats in end use.

    The same risk-aversion on both the buy- and the sell-sides of IT sits right in the middle of your own favorite theme: Does IT Matter?

    I’m pleased and even proud that Sun & Google are leading here; what their competitors may find is that over time their leadership will provide them wider and wider distinguishing advantages.

    Indeed, one of my hopes for Linux is that a system like Ubuntu might soon run fine on a very minimal stateless system (Compact Flash in place of a hard disk, and no fan) that’s innately quiet and cool.

    So, in the end, I am hopeful we will seize these great opportunities with relish.

  8. Mark Ontkush

    As someone who follows green computing extensively on ecoIron, I thought this article summarized the major issues well. I do agree that IT is in for massive change, mostly driven by efficiency. It’s already happening in Asia and Africa, where some companies are reducing their IT costs by up to 80 percent.

    IT isn’t going away, but I do think that the role of IT will change. Maintenance costs will be cut to zero. However, I do see an opportunity for IT to lead the charge into other green technologies such as solar, wind, etc. It’s just a new beginning.

  9. marc moore

    Back-of-the-envelope calculations show that about 1-2% of all electricity consumed in the U.S. goes toward powering computers. Hardly the largest use of energy, but still significant.

    Considering that most of it is wasted – whether you want to judge that in terms of CPU utilization or content value, the word rings true – it bears looking at in a world in which we depend on ne’er-do-wells for much of our energy resources.

  10. Claus Dahl

    It’s hard to consider these numbers seriously without some comparison between information technology and pre-information technology. If Marc Moore’s numbers are correct, wouldn’t that mean that, even though there is an environmental cost to computing, the economic output per unit of energy is still much better for information companies than for ‘matter’ companies?

    How much energy does Wal-Mart consume if you include the entire value chain?

    How much of that is spent on IT-related energy costs?

  11. Joel P

    I agree with your defense of Google, but I don’t think you went nearly far enough. Google has designed their architecture to drive costs out so efficiently that electricity is practically the only major cost still left.

    If electricity takes up 75% of my family’s monthly budget, you can’t tell from that fact alone whether it’s because we never turn off the lights, or because we have turned off our phones, cable, internet, etc. and just use one light bulb to eat our soup kitchen meals and read our library books. Maybe it’s 75% of $1,000 (extremely wasteful), or maybe it’s 75% of $10 (unbelievably efficient).

    Back in 2003 Google put up a fantastic whitepaper explaining how and why energy consumption is one of their top design considerations:

    http://labs.google.com/papers/googlecluster.html

    From page 1: “Here we present the architecture of the Google cluster, and discuss the most important factors that influence its design: energy efficiency and price-performance ratio. Energy efficiency is key at our scale of operation . . .”

    -joel

    p.s. The fact that a transistor is “virtually free” misses the point that the CPU in a modern PC contains hundreds of millions of them. Gold is “virtually free” too if you’re talking about a 65 (soon to be 45) nanometer wide piece of it!
