Are we missing the point about cloud computing?
That question has been rattling around in my mind for the last few days, as the chatter about the role of the cloud in business IT has intensified. The discussion to date has largely had a retrospective cast, focusing on the costs and benefits of shifting existing IT functions and operations from in-house data centers into the cloud. How can the cloud absorb what we’re already doing? is the question that’s being asked, and answering it means grappling with such fraught issues as security, reliability, interoperability, and so forth. To be sure, this is an important discussion, but I fear it obscures a bigger and ultimately more interesting question: What does the cloud allow us to do that we couldn’t do before?
The history of computing has been a history of falling prices (and consequently expanding uses). But the arrival of cloud computing – which transforms computer processing, data storage, and software applications into utilities served up by central plants – marks a fundamental change in the economics of computing. It pushes down the price and expands the availability of computing in a way that effectively removes, or at least radically diminishes, capacity constraints on users. A PC suddenly becomes a terminal through which you can access and manipulate a mammoth virtual computer that expands to meet your needs. What used to be hard or even impossible suddenly becomes easy.
My favorite example, which is about a year old now, is both simple and revealing. In late 2007, the New York Times faced a challenge. It wanted to make available over the web its entire archive of articles, 11 million in all, dating back to 1851. It had already scanned all the articles, producing a huge, four-terabyte pile of images in TIFF format. But because TIFFs are poorly suited to online distribution, and because a single article often comprised many TIFFs, the Times needed to translate that four-terabyte pile of TIFFs into more web-friendly PDF files. That’s not a particularly complicated computing chore, but it’s a large computing chore, requiring a whole lot of computer processing time.
Fortunately, a software programmer at the Times, Derek Gottfrid, had been playing around with Amazon Web Services for a number of months, and he realized that Amazon’s new computing utility, Elastic Compute Cloud (EC2), might offer a solution. Working alone, he uploaded the four terabytes of TIFF data into Amazon’s Simple Storage Service (S3) utility, and he hacked together some code for EC2 that would, as he later described in a blog post, “pull all the parts that make up an article out of S3, generate a PDF from them and store the PDF back in S3.” He then rented 100 virtual computers through EC2 and ran the data through them. In less than 24 hours, he had his 11 million PDFs, all stored neatly in S3 and ready to be served up to visitors to the Times site.
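Gottfrid's pattern – pull an article's parts out of storage, convert them, write the result back, and fan the work out across many rented machines – is worth seeing in miniature. Here is a minimal, hypothetical Python sketch of that fan-out pattern; the names, the in-memory dictionary standing in for S3, and the thread pool standing in for the 100 EC2 instances are all illustrative assumptions, not the Times's actual code (which ran against the real S3 and EC2 APIs):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical in-memory stand-in for an S3 bucket:
# article ID -> list of TIFF "parts" that make up the article.
store = {f"article-{i}": [f"tiff-{i}-{p}" for p in range(3)] for i in range(10)}
pdfs = {}  # stand-in for the output bucket

def convert(article_id):
    # Pull all the parts that make up an article, "generate" a PDF
    # from them, and store the PDF back. Real code would call an
    # imaging library and the S3 API instead of joining strings.
    parts = store[article_id]
    pdfs[article_id] = "PDF(" + "+".join(parts) + ")"
    return article_id

# Fan the conversions out across workers, much as EC2 let Gottfrid
# fan the job across 100 virtual machines.
with ThreadPoolExecutor(max_workers=4) as pool:
    done = list(pool.map(convert, store))
```

The point of the sketch is the shape of the job, not the details: each article converts independently, so the whole archive is embarrassingly parallel, and the only question is how many machines you can rent at once.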
The total cost for the computing job? Gottfrid told me that the entire EC2 bill came to $240. (That’s 10 cents per computer-hour times 100 computers times 24 hours; there were no bandwidth charges since all the data transfers took place within Amazon’s system – from S3 to EC2 and back.)
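The arithmetic behind that bill is worth making explicit, since it is the whole argument in three numbers. A back-of-the-envelope check, using integer cents to keep the figures exact (the rate and counts are the ones reported above):

```python
# The figures from Gottfrid's EC2 bill:
# 100 virtual machines, 24 hours, 10 cents per instance-hour,
# and no bandwidth charges (the data never left Amazon's system).
rate_cents_per_hour = 10
instances = 100
hours = 24

bill_dollars = instances * hours * rate_cents_per_hour // 100
print(f"${bill_dollars}")  # $240
```

Eleven million articles converted for roughly the cost of a nice dinner – that is the change in economics the rest of this piece is about.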
Had it not been for the cloud, Gottfrid told me, the Times might well have abandoned the effort. Doing the conversion in-house would have taken either a whole lot of time or a whole lot of money, and it would have been a big pain in the ass. With the cloud, though, it was fast, easy, and cheap, and it required only a single employee to pull it off. “The self-service nature of EC2 is incredibly powerful,” says Gottfrid. “It is often taken for granted but it is a real democratizing force in lowering the barriers.” Because the cloud makes hard things easy, using it, Gottfrid told BusinessWeek’s Stephen Baker, “is highly addictive.” The Times has gone on to use S3 and EC2 for other chores, and, says Gottfrid, “I have ideas for countless more.”
The moral of this story, for IT types, is that they need to look at the cloud not just as an alternative means of doing what they’re already doing but as a whole new form of computing that provides, today, a means of doing things that couldn’t be done before or that at least weren’t practicable before. What happens when the capacity constraints on computing are lifted? What happens when employees can bypass corporate systems to perform large-scale computing tasks in the cloud for pennies? What happens when computer systems are built on the assumption that they will be broadly shared rather than used in isolation?
I think we will find that a whole lot happens, and it will go well beyond IT-as-usual. When electricity became a utility – cheap and ubiquitous – it didn’t just reduce the cost of running existing factory machines. As I describe in my book The Big Switch, it allowed a creative fellow like Henry Ford to build an electrified assembly line and change manufacturing forever. It’s natural to see a new technology through the lens of the technology it supplants, but that’s a blinkered view, and it can blind you to the future.