Some people may be antsy about the consolidation of the world’s computing power into the hands of a few companies, but Google CEO Eric Schmidt sure isn’t one of them. He believes that the cost and the sophistication of the kind of enormous network supercomputer that his company is building present huge barriers to entry for would-be competitors. In a new Business Week interview, Schmidt says that becoming a giant of computing “is our goal.” He continues:
We’re doing it because the applications actually need these [supercomputer] services. A typical example is that you’re a Gmail user. Most people’s attachments are megabytes long, because they’re attaching everything plus the kitchen sink, and they’re using Gmail for transporting random bags of bits. That’s the problem of scale. But from a Google perspective, it provides significant barriers to entry against our competitors, except for the very well-funded ones. I like to think of [the data centers] as cyclotrons. There are only a few cyclotrons in physics and every one of them is important …
The interview runs alongside a long feature story about the development of Google and IBM’s education initiative to train students in the operation of massively parallel computing grids like Google’s. The article quotes a Yahoo executive as saying that the consolidation of computing has, in effect, already happened:
“In a sense,” says Yahoo Research Chief Prabhakar Raghavan, “there are only five computers on earth.” He lists Google, Yahoo, Microsoft, IBM, and Amazon. Few others, he says, can turn electricity into computing power with comparable efficiency.
That seems a little premature – and hubristic – to me. At the very least, I seriously doubt that all the world’s supercomputers will end up in the hands of US corporations.
The article also includes an interesting, if ambiguous, passage in which Eric Schmidt implies that Google will rent out its supercomputer to outside developers and businesses the way that Amazon.com does through Amazon Web Services:
In the past year, Amazon has opened up its own networks of computers to paying customers, initiating new players, large and small, to cloud computing. Some users simply park their massive databases with Amazon. Others use Amazon’s computers to mine data or create Web services …
For clouds to reach their potential, they should be nearly as easy to program and navigate as the Web. This, say analysts, should open up growing markets for cloud search and software tools—a natural business for Google and its competitors. Schmidt won’t say how much of its own capacity Google will offer to outsiders, or under what conditions or at what prices. “Typically, we like to start with free,” he says, adding that power users “should probably bear some of the costs.” And how big will these clouds grow? “There’s no limit,” Schmidt says. As this strategy unfolds, more people are starting to see that Google is poised to become a dominant force in the next stage of computing. “Google aspires to be a large portion of the cloud, or a cloud that you would interact with every day,” the CEO says.
It wouldn’t be a surprise to see Google get into the pure utility computing business, and its entry would certainly accelerate the development of the computing-on-demand industry, adding to its legitimacy and drawing in other big players.
UPDATE: Amazon isn’t sitting still. Om Malik reports that it has just added a database service, SimpleDB (currently in beta), to its set of web computing services. Says Amazon:
Traditionally, this type of functionality has been accomplished with a clustered relational database that requires a sizable upfront investment, brings more complexity than is typically needed, and often requires a DBA to maintain and administer. In contrast, Amazon SimpleDB is easy to use and provides the core functionality of a database – real-time lookup and simple querying of structured data – without the operational complexity. Amazon SimpleDB requires no schema, automatically indexes your data and provides a simple API for storage and access. This eliminates the administrative burden of data modeling, index maintenance, and performance tuning. Developers gain access to this functionality within Amazon’s proven computing environment, are able to scale instantly, and pay only for what they use.
Here are more details from Information Week.
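For readers wondering what a schema-free database with automatic indexing and a simple storage-and-query API amounts to in practice, here is a minimal sketch, in Python, of the item-and-attribute model Amazon describes. This is not Amazon’s code or its actual API; the class and method names are invented for illustration, and the point is simply that items are free-form bags of attribute/value pairs that can be looked up and queried without defining a schema first.

    # Toy, in-memory illustration of a SimpleDB-style schema-free store.
    # Items are named bags of attribute/value pairs: no table definitions,
    # no fixed columns. (Names here are hypothetical, not Amazon's API.)
    class SimpleStore:
        def __init__(self):
            self.items = {}  # item name -> dict of attributes

        def put_attributes(self, item_name, attributes):
            # Store or merge attributes for an item; any keys are allowed.
            self.items.setdefault(item_name, {}).update(attributes)

        def get_attributes(self, item_name):
            # Real-time lookup of a single item by name.
            return self.items.get(item_name, {})

        def query(self, **criteria):
            # Simple querying: names of items whose attributes match all criteria.
            return [name for name, attrs in self.items.items()
                    if all(attrs.get(k) == v for k, v in criteria.items())]

    # Two "rows" with different attributes; no schema change required.
    store = SimpleStore()
    store.put_attributes("item-1", {"type": "photo", "owner": "alice"})
    store.put_attributes("item-2", {"type": "song", "owner": "bob", "genre": "jazz"})
    print(store.get_attributes("item-1"))   # {'type': 'photo', 'owner': 'alice'}
    print(store.query(owner="bob"))         # ['item-2']

The real service, of course, runs this sort of storage across Amazon’s data centers, handles the indexing and scaling behind the API, and charges only for the requests and storage actually used.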
Schmidt was a pretty brilliant hire for Google.
Doesn’t this remind anyone of the big fiber-optic build-outs during the 1990s? Please tell me this: how does infinite supply guarantee infinite demand? Isn’t that a problem for what they are proposing? Build it and they will come? Is this supercomputing or a cheap remake of “Field of Dreams”?
How about a new Ron Howard film: “Field of Servers”? Kevin Costner plays a mid-western farmer who gets a vision to build a vast server complex in his corn field. When he does, the ghosts of Alan Turing and Edsger Dijkstra appear and write a program that aces the Turing Test! Can’t wait for the DVD!!! ;)
Great article. I think consolidation is inevitable, and the various players ignore it at their peril. Amazon has indeed fired the latest shot with SimpleDB. There are enormous economic forces at work here:
http://smoothspan.wordpress.com/2007/12/15/to-rule-the-clouds-takes-software-why-amazon-simpledb-is-a-huge-next-step/
I’m betting we either end up with:
competitive utility markets (video of the talk I gave about this subject at OSCON ’07) based upon open-source implementations of open standards
or
eventual government intervention in this industry.
As for barriers to entry – well, of course there are huge barriers to entry into becoming a utility provider. For most companies, however, the concern should be whether such a utility computing environment reduces barriers to entry into their own industry.
Which it will.
The only other thing to note is that there already exists a huge network of idle machines owned by end consumers – an 800,000-machine network, continuously online, doing nothing and growing bigger. Anyone planning an F2F infrastructure service?
Whoops – that was my FOWA talk and not my OSCON talk.
Still, it covers the same issues.
Simon Wardley:
>> eventual government intervention in this
>> industry.
Maybe in the EU, but not here in the US. The big tech companies OWN the Federal government; they dictate policy rather than comply with it.
Ah, Linuxguru1968 – I assume you are referring to the Net Neutrality arguments? I suspect it will take one catastrophic incident, as per Hoff’s predictions, and either the market will wake up to the old lessons of second sourcing or the government will.
Simon Wardley:
>> I assume you are referring to the Net
>> Neutrality arguments
No. I meant that the US government always gives corporate America what it wants. Right now that is Net Neutrality (NN), because corporations are making money off of open traffic without restriction. If that ever changes, meaning that NN begins to decrease profits overall, then corporations will demand government intervention. For example, ISPs have recently been trying to block peer-to-peer file sharing of digital music:
Recording Industry vs. The People. There are no laws that redefine copyright law for peer-to-peer sharing of content. Since government won’t intervene, content-provider corporations are trying to “make” case law by suing poor college kids and young housewives (that’ll show ’em!). My point: it’s a quorum of corporate interests that determines Internet policy: “The corporation with the most toys (or Washington hookers) wins…” ;)
“How does infinite supply guarantee infinite demand?” – that is a fantastic question, and therein lies the billion-dollar opportunity. Much like Microsoft was to the PC, someone will be to this cloud. Build an application capable of using these resources productively and you win!