Bezos on Amazon’s utility

In an interview with Technology Review’s Wade Roush, Amazon.com chief executive Jeff Bezos discusses the company’s rapidly expanding effort to sell utility computing services. One reason it makes sense for Amazon to provide data storage and processing as metered services, Bezos notes, is that the massive computing infrastructure the company has built to run its online store has a great deal of spare capacity. “There are times,” says Bezos, “when we’re using less than 10 percent of capacity. And that is one of the advantages of doing things this way – it promises higher rates of hardware utilization. That’s a system-wide efficiency that should make everybody happy.”

It may seem astounding to think that a big computing system sometimes runs at less than 10 percent of its capacity, but it’s actually the norm in the business world, where millions of private computing plants operate in isolation. And, beyond just Amazon, that’s one of the main reasons utility computing makes such economic sense. The same thing happened with electric power. When manufacturers built private generators to power their plants, they had to build enough capacity to handle their peak potential demand, or load. That meant that the vast majority of capacity went unused. When the alternating current grid arrived, allowing power to be transported over long distances, it suddenly became possible to share capacity among many users, which in turn made it possible for centralized utilities to operate at very high levels of capacity utilization, balancing their load among many users. For computing, the internet is the AC grid. Computing and electric power may be very different technologies, but the supply economics are remarkably similar.
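A toy simulation makes that arithmetic concrete (the numbers are invented): give each of fifty users a spiky, uncorrelated demand curve, then compare the capacity they would have to build in isolation against what a single shared pool needs.

    import random

    random.seed(1)
    HOURS, USERS = 24, 50

    # Hourly demand per user: usually 1 unit, occasionally a 10-unit spike.
    demand = [[random.choice([1, 1, 1, 10]) for _ in range(HOURS)]
              for _ in range(USERS)]

    # In isolation, every user builds enough capacity for its own peak...
    standalone = sum(max(user) for user in demand)

    # ...while a utility builds only for the combined peak of the pool.
    pooled = [sum(user[h] for user in demand) for h in range(HOURS)]
    shared = max(pooled)

    average_load = sum(pooled) / HOURS
    print(f"capacity built in isolation: {standalone}")
    print(f"capacity built when shared:  {shared}")
    print(f"utilization: {average_load / shared:.0%} shared "
          f"vs {average_load / standalone:.0%} isolated")

Because the spikes rarely coincide, the pool's peak is far below the sum of the individual peaks, and the shared plant runs at a much higher rate of utilization.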

Noting that “we may all eventually do most of our computing within the cloud of computing and storage resources,” Roush asks Bezos, “Can you imagine Amazon evolving into a company that’s as famous for being a kind of Web utility as it is for being an e-retailer?” “No,” replies Bezos. “This is a completely separate business that will grow up in its own way … Today, Amazon itself is the biggest user of [our utility services] like EC2 and S3. We’ve been our own beta customers. But one day my hope is (to answer your question about the future) that Amazon will be just one of many big providers of infrastructure services, not the big provider.” In the past, I’ve written about the inevitable tensions that arise between being a retailer and being a supplier of technology services, given the very different economics of those two businesses. If Amazon’s utility services prove successful, it will be fascinating to watch how the company navigates those tensions.

UPDATE: Bezos was interviewed after a speech he gave about Amazon’s technology “guts” and how they’re being used for its web services. An MP3 of his speech is available here and written summaries are available here and here.

9 thoughts on “Bezos on Amazon’s utility”

  1. Andrew

    This is a great first step towards utility computing. The applications that use these services are in their infancy. I wish it were easier to harness the power Amazon is offering. Right now, they provide a series of APIs into their services. I’d love to use S3 and EC2 as rented horsepower with the click of a button. An example of the sort of edge case I mean: need to apply a sharpness filter to this 20-minute video? Click here to transfer it to Amazon, and we’ll notify you when it’s done. My poor G5 wouldn’t suffer the slings and arrows of iMovie’s filters anymore. Currently, I have to find a way to code this into an app, or use one of the few apps that exist to do it.
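    (A minimal sketch of the upload half of that wish, assuming today’s boto3 SDK rather than the raw 2006 APIs; the bucket and file names here are made up. A rented EC2 box, not shown, would watch the bucket, run the filter, and write the result back.)

        import boto3  # AWS SDK for Python; assumes credentials are configured

        s3 = boto3.client("s3")

        # Hand the heavy file to S3; an EC2 instance would pull it from the
        # bucket, apply the filter, and upload the finished video alongside it.
        s3.upload_file("movie.mov", "my-render-bucket", "incoming/movie.mov")
        print("uploaded - rendering can now happen on Amazon's hardware")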

    I don’t think this model will become successful until it’s easy. Providing a foundation and all of the supplies to build a house doesn’t make everyone a builder.

  2. Morgan Goeller

    Nick,

    Good link and great analysis. You are right that the tension between these two competing sides of the business will be the really interesting story in the years to come.

    For example, what happens if they discover that Amazon itself isn’t the most profitable customer for AWS? What happens come December 15th, when the retail side and the utility customers are contending for the same resources?

    IBM seemed to handle it pretty well with its Global Services unit offering very tight integration with internal products, but willing to work with whatever gave them billable hours. If Amazon can do this they will really have the tiger by the tail.

    I would say that AWS needs to be its own independent entity, be that an organization within Amazon with its own P/L responsibilities or a different company. Otherwise, it is going to die a slow death.

    I just got into the beta myself and have started to chronicle my own experiences with EC2, from a more technical point of view. So far, I think they are really onto something, although it might not be the something that non-techies make it out to be.

  3. marc moore

    When I first read about EC2, I thought it was cool as heck. I can rent a virtual machine from Amazon and essentially do whatever I want with it? Rockin’!

    Sadly, their offerings are limited to *nix VMs at the moment and their prices are too high to be cost effective for me. I overpay at $10 per month for my site because I do .NET development – how can I justify $75 per month for a VM? (That’s roughly what EC2’s ten cents an instance-hour comes to when it runs around the clock.)

    But for business purposes, I think it’s a great way to avoid the plague of “physical” machines. No one needs another server to look after, that’s for darn sure. Want to turn it off and stop paying? Poof, it’s gone. Brilliant!

    Economically it makes sense, regardless of the current price point. Squeezing revenue out of underutilized resources is the name of the game, and Amazon’s found an untapped resource.

  4. pwb

    Over time, the pressure has got to be for users to bring their technology in house. There’s no reason that the technology shouldn’t be easy enough for the average user to operate.

  5. marianc

    Nick,

    Even if Amazon succeeds in becoming a computing utility supplier (no sure thing), the fundamental economics of the shared services model have not changed since the last time we visited them around 1979. The moment customers perceive that they can provide in-house the same or better service for no more cost than a utility provider, they will in-source once again as soon as cost-cutting becomes a priority. A second driver for in-sourcing will come from the iron-mongers. They will press for increasing their customer base, not concentrating it within a narrow range of shared services providers, and they will use financial incentives to do it. This will add fuel to the in-sourcing fire as C-level execs realize that they can in-source ever more cheaply and maintain their independence in the bargain.

    I think that Bezos may misunderstand the essence of the utility model in a market economy. Customers buy from utilities not because they are the only or even the best source of supply. They buy because it is convenient and relatively economical to do so. But there are still plenty of individuals and enterprises who find themselves fully capable of supplying for themselves more cheaply and conveniently what others buy from utilities, e.g. water and power.

    It misunderstands the “Current Wars” to assume that utility-based AC power grids arose due to under-utilization of privately held resources. AC power naturally lends itself to long distance transmission over a grid, while DC does not. Prior to Niagara Falls, many businesses generated their own DC power or purchased it from local sources. The problems of DC power were technological—the infrastructure was more complex and therefore more fragile, hence more costly to maintain and more likely to fail. Utility-style computing is no more or less complex than in-house installations and no more or less likely to fail. The supposed parallel to the AC power grid does not really hold up.

    Finally, the question arises as to what a shared computing services utility will do when it reaches its maximum capacity. As we’ve seen with electric utilities, there is general reluctance to add capacity. There may also be financial inability to add capacity. What leverage would the utility customer have to force the supplier to maintain expected levels of service and price? As we saw in California, with electricity, absolutely none.

  6. Norm Potter

    I don’t doubt that Amazon Services will be spun off as a separate company, with Amazon buying time just like everybody else. That’s the best way to unlock value for shareholders and create a more dynamic corporation.

  7. Nick Carr

    marianc, you’re right that it’s incorrect to assume that “utility-based AC power grids arose due to under-utilization of privately held resources” – any more than the internet arose for that reason. What I wrote was that, by making the long-distance transmission of power possible, the AC grid enabled utilities to achieve the economies of scale and load balancing that allowed them to provide energy much more economically than private generators could. As to “Utility-style computing is no more or less complex than in-house installations and no more or less likely to fail,” that’s incorrect. The emerging utility-class infrastructures, whether maintained by utilities or internally by large companies, promise to be far simpler, more flexible and more efficient than traditional client-server infrastructures. Nick

  8. Thomas Lord

    There are a lot of concepts relevant to the future of utility computing and, to an extent, if you aren’t a VC or potential partner I don’t want to get into them with you, but one worth mentioning here is the concept of “reserved, critical peak capacity”.

    By “reserved, critical peak capacity” I mean that a utility consumer contracts for guarantees that they can satisfy their peak demand — for critical operations. Of that reserved amount, they sell surplus into the grid. Usage beyond that amount, they buy from the grid.

    Absent a grid, consumers are more often forced to contract for “reserved, absolute peak capacity” — they buy up enough computing resources to handle their idealized peak demands. This is horribly wasteful since they are almost always operating well below absolute peak and, likely, are most often operating below critical peak. The result, which we see today, is most machines in the world sitting idle, most of the time, usually sucking up power and administration.

    The sweet spot is when consumers are buying from the grid between critical and absolute peak, operating self-sufficiently at critical peak, and selling back to the grid when they are below critical peak.
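    (A toy calculation of that contract, with invented numbers: the consumer holds capacity equal to its critical peak, sells whatever it isn’t using, and buys anything beyond.)

        # Illustrative model of "reserved, critical peak capacity".
        CRITICAL_PEAK = 100  # capacity units the consumer owns outright

        def grid_trade(demand: int) -> int:
            """Units bought from the grid (positive) or sold into it (negative)."""
            return demand - CRITICAL_PEAK

        for demand in (40, 100, 160):  # below, at, and above critical peak
            traded = grid_trade(demand)
            if traded > 0:
                print(f"demand {demand}: buy {traded} units from the grid")
            elif traded < 0:
                print(f"demand {demand}: sell {-traded} units into the grid")
            else:
                print(f"demand {demand}: self-sufficient at critical peak")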

    This nicely spreads out the cost of build-out and creates the incentives to fund it. Transaction costs and lousy small-scale generation technology have kept the power grid from following this pattern, but you can see both beginning to change.

    There’s a lot of infrastructure work to do to get to a situation where the normal thing is to contract for reserved, critical peak. The next 10 years will be very exciting.

    You’re missing a lot of subtleties of the emerging utility computing environment, though, Mr. Carr. ;-)

    -t

  9. Flower Delivery

    The truth is that Amazon doesn’t really know what uses people will find for its services, and therefore it can’t say what its market will be until it sees what people come up with. I can see parallels with a 128k hobbyist microcomputer that a large computer company introduced twenty-five years ago: the IBM Personal Computer.
