For anyone still doubting that computing is becoming a utility, check out this New York Times report on the huge new “powerplant” Google is quietly constructing near the Columbia River on the Oregon-Washington border. The massive complex includes “a computing center as big as two football fields, with twin cooling plants protruding four stories into the sky.” Both Microsoft and Yahoo are also building new computing powerplants nearby, drawn by the area’s cheap electricity and good network connections. What we’re seeing is a shift in the pattern of capital investment into information technology – away from individual users toward central utilities. It’s a shift that will play out slowly over the course of many years, as the new utilities expand their capacity and capabilities, and that will reshape the entire IT business.
Yes, the network is the watchamacallit. NEwayz, c u on Myspace LOL !
Utility is closely related to the masses; without the numbers, the cost of the utility will be high, resulting in death by starvation. A utility will succeed if it delivers the same thing (or with slight variations) that is useful to many users. This means a utility can live only if there is standardization. Standardization of IT? It's a long way off, because it is really about the standardization of business processes. Why should businesses standardize their processes and make themselves look exactly like their competitors? Then where is innovation, and where are the likes of Dell? Unlike other utilities, IT is about information, which has three difficult (shall we say unhappy?) faces to satisfy: confidentiality, integrity, and availability. To enforce the C, I, and A, business/domain knowledge is critical, since without the knowledge to organize data, the data itself is useless. This knowledge is diverse and unique to each organization. How can a utility be possible? Remember – every unhappy family is unhappy in its own way. That leaves the software used in day-to-day jobs – word processing, speech processing, translation, file sharing, knowledge-sharing mechanisms like portals, and so on. This is already under the utility model. Maybe one day they'll go to the new power plants – it doesn't really matter. BPO? That is another story.
Radha, IT commodification is not about standardizing process. It’s about standardized interfaces (both programmatic and user) that lower switching costs. I couldn’t care less how Google gets their map images, but I can mash them up with enterprise data in a matter of minutes.
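To make that concrete, here's a minimal sketch of the kind of thing I mean, assuming some hypothetical branch records: publish enterprise data as a KML overlay (a standardized interface) and any map client can render it, without me caring where the imagery comes from. The data and filename are made up.

```python
# Sketch: publish enterprise records as a KML overlay that a map client
# (e.g. Google Maps/Earth) can render. Branch data and filename are made up.
from xml.sax.saxutils import escape

branches = [  # hypothetical enterprise data: (name, latitude, longitude)
    ("Portland warehouse", 45.5231, -122.6765),
    ("The Dalles office", 45.5946, -121.1787),
]

def to_kml(records) -> str:
    marks = "".join(
        f"  <Placemark><name>{escape(name)}</name>"
        f"<Point><coordinates>{lng},{lat}</coordinates></Point></Placemark>\n"
        for name, lat, lng in records  # KML expects lon,lat order
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>\n'
            f'{marks}</Document></kml>\n')

with open("branches.kml", "w") as f:
    f.write(to_kml(branches))  # publish this file's URL; overlay it on a map
```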
Nick, I think the article ignores peer-to-peer computing, which will grow to be an important factor, including in the enterprise space.
Nick,
The most interesting thing to me in that article was this: The plant will have “10,000… processors and disks…” and is “…expected to create 60 to 200 permanent jobs in a town of 12,000…”
Wow. That’s how you scale human capital. We have customers who have that many DBAs!
The costs that will be taken out of the global IT system by computing as a utility (SaaS and its progeny) are almost too staggering to contemplate. Global IT spend will be reduced by billions, and the business can get back to running itself instead of, say, waiting on the next promotion cycle onto the WebSphere app server.
It won’t happen overnight, but it is happening. Disintermediation of duplicative admin costs and the related armies of IT consultants will free up quite a bit of spend for a business renaissance… we’re on the road toward it, and this Googleplex (or power plant, as you put it) is a signpost along the way.
Google in The Dalles, Oregon. Yahoo & Microsoft up near Quincy, Washington. And what’s in between?? The NSA Echelon facility in Yakima, WA. Draw your own conclusions. :)
Kingsley-
I am also bullish on peer-to-peer. I believe Microsoft and Apple have much to gain by ‘bringing it back to the desktop.’ (Users, too.) They’ll need to bring carriers into the deal, however, since carriers can throttle p2p activity.
The following is a hypothesis:
Google is based on a business model which holds that if a corporation does what its customers would like it to do, while still turning a profit, everything else will take care of itself. This business model is demonstrably superior outside an environment of information suppression, and global communication makes information suppression a thing of the past.
Proof of the above hypothesis will be the relative growth of Google in comparison to Microsoft.
“This means utility can live only if there is a standardization”…perhaps more standardization at the infrastructure & middleware level would leave more attention free for meaningful customization at the business application level.
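For instance (a toy sketch – the class and function names are invented, not from any product): the storage plumbing is standardized and identical for everyone, while the differentiation lives in the business rule layered on top.

```python
# Toy sketch: the storage interface is standardized (the utility side);
# only the business rule on top differs per company (the customization side).
from abc import ABC, abstractmethod

class RecordStore(ABC):
    """Standardized middleware interface - the same for every tenant."""
    @abstractmethod
    def put(self, key: str, value: dict) -> None: ...
    @abstractmethod
    def get(self, key: str) -> dict: ...

class InMemoryStore(RecordStore):
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

# Meaningful customization lives above the standard interface.
def quote_price(store: RecordStore, sku: str, margin: float) -> float:
    cost = store.get(sku)["cost"]
    return round(cost * (1 + margin), 2)  # each business picks its own margin

store = InMemoryStore()
store.put("sku-1", {"cost": 10.0})
print(quote_price(store, "sku-1", margin=0.35))  # 13.5
```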
Is this utility computing? How many people are going to lose their jobs or get reassigned once this “powerplant” is up? Which IT expenditures are going to be saved? If a new channel pops up on TV, will companies start saving money from their IT budgets? It looks like Google is building another “advertising” medium like radio and TV, fine-tuning the buyer/seller match with viewer participation. Airwaves = Internet, TV = Computer, TV Network = Google Data Center Network. Google wants to shift people to the “Google channel” – moving search, email, and office work to this channel – and store data to gain context and sell advertising. Google needs to generate content in addition to distribution, so it has split the task into two parts – user-generated (email, internet, office docs) and Google-generated (digitization of books, etc.). Why would Google get into the utility computing business, which reduces IT expenditure? The hardware component is a commodity, and Google doesn’t have the hardware chops (e.g., introducing new chip technologies). The services part – maintaining high SLAs – is a so-so margin business which cannot scale as your customer base becomes diverse. As Microsoft (Nathan Myhrvold) realized during the beginnings of the internet, nothing beats being the sole collector of the toll tax between buyers and sellers. That is a tremendously profitable business – one that justifies Google-like valuations.
Kingsley, the hidden question in Nick’s note is about software delivery in the utility model (can you deliver software like water or electricity?), which is about software as a service (and not software services). For the long-term sustenance of this model there needs to be a critical mass of users for the “same” service. Businesses of various sizes and domains have diverse requirements and data-confidentiality issues, so utility is of no value to them except in specialized common services like service request management (CRM), in which case the business process is fairly standardized (get user inputs, issue a ticket, pass it on to the technical service, get the response back to the CRM, inform the user, etc.). I doubt how many such standardized processes are possible in the business environment that can be delivered in the utility model. That leaves individual users, who need certain services which are already available now (Google Spreadsheets, Microsoft Live, etc.). This will definitely grow.
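That service-request flow really is simple enough to standardize – here's a minimal sketch of it as a state machine (the status names are illustrative, not from any particular CRM product):

```python
# Illustrative sketch of the standardized service-request flow:
# open a ticket, route it to technical service, resolve, inform the user.
from dataclasses import dataclass, field

TRANSITIONS = {
    "new": {"routed"},
    "routed": {"resolved"},
    "resolved": {"closed"},
}

@dataclass
class Ticket:
    user: str
    issue: str
    status: str = "new"
    history: list = field(default_factory=list)

    def advance(self, to_status: str) -> None:
        if to_status not in TRANSITIONS.get(self.status, set()):
            raise ValueError(f"cannot go from {self.status} to {to_status}")
        self.history.append((self.status, to_status))
        self.status = to_status

t = Ticket(user="alice", issue="VPN down")
t.advance("routed")    # pass to the technical service
t.advance("resolved")  # technical service responds
t.advance("closed")    # user is informed and the ticket closes
print(t.history)
```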
Zephram – I’d like Google to give me good search results (so I’m glad they fixed this bug). However, I’d also like them to deliver search as a service in its own right, rather than as a vehicle for advertising, as I believe that the current strategy has major costs for the continuing health of the Web (more here and here). But they’re not likely to do what I’d like them to do, as it would stop them making so much money. Lunch does not want to be free.
Hi Nick – well, they need the space, and some more. I could hardly believe this post at Beet.TV http://www.beet.tv/2006/06/google_wants_al.html – what a drastic change. Or did I miss something there?
Phil, the Motley Fool article that you base your premise upon assumes Google is static. The new “powerplants,” as Nick puts it, should be ample evidence that Google is massively upgrading its services. Google was aware of click fraud and the “user-generated recommendations” solution that Seth espouses long before the thought ever entered his Motley mind. Google insiders are “dropping shares” because they know implementation of a solution is imminent, and that redesign of a flagship utility always shakes up the market for a period of time.
Google insiders are acting on the same information that everyone else has: rumors of the next big push. I sold my shares because I understand the tech industry and I believe the rumors. There will be a dip in Google’s stock price in the near future as the search engine transitions from a universal algorithm to one that is customizable for individual needs, and as the default algorithm expands to include this new layer of natural selection. After the augmented system proves more useful than the original and its competition, Google’s stock value will reach an all-time high. Long-term investors can keep their money in Google with confidence, but day traders like me take advantage of the microtrends.
I won’t venture a guess at what Google has planned for the new facility. However, whether it’s Google’s facility, T-Systems’, Rackspace’s, or IBM’s, the shift to utility computing is very real. When Savvis took the first steps, they were forced to use expensive hardware from Inkra, Egenera, and 3Par. That drawback has been overcome.
To give you an idea of how far the technology has come, a few months ago we demonstrated to some select data center operators the ability to take a fully functioning distributed application running in Eastern Europe, move it to the US and run it on different hardware, and then move it again to Canada and run it on yet another set of hardware. No code changes. All in minutes. All with full SLAs on the resources. All with fully mirrored data volumes. All with only commodity servers and Gigabit Ethernet. And I’m not talking about a single server app, but a full blown multi-server distributed app with databases, NAS, application server, load balancer, etc.
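The mechanics behind a move like that can be sketched roughly as follows. This is a purely illustrative model, not the system used in the demo: the application is described declaratively, and a scheduler binds that description to whatever commodity hardware a site offers, which is why no code changes are needed.

```python
# Hypothetical sketch of why a live move needs no code changes: the app
# is a declarative manifest, and a (made-up) deploy() maps each tier onto
# whichever site's commodity boxes are available.
MANIFEST = {
    "tiers": {
        "lb":  {"image": "haproxy",  "count": 1},
        "app": {"image": "app-srv",  "count": 4, "sla_cpu_ghz": 2.0},
        "db":  {"image": "postgres", "count": 2, "mirrored_volume": True},
        "nas": {"image": "nfs",      "count": 1, "mirrored_volume": True},
    }
}

def deploy(manifest: dict, site: str) -> dict:
    """Pretend scheduler: places each tier on the site's commodity nodes."""
    placement = {}
    for name, tier in manifest["tiers"].items():
        placement[name] = [f"{site}-node-{name}-{i}" for i in range(tier["count"])]
    return placement

# Same manifest, three sites, zero application changes.
for site in ("warsaw", "virginia", "toronto"):
    print(site, deploy(MANIFEST, site))
```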
The capex advantage in operating this way is interesting, of course, but the opex leverage approaches an order of magnitude when measured over the life of an application. Numbers like that are going to be hard to ignore for long.
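To see why, run some back-of-envelope numbers. These figures are entirely made up, for illustration only: capex is paid once, but opex recurs for the life of the app, so that's where the leverage compounds.

```python
# Back-of-envelope TCO with made-up numbers (illustration only).
def tco(capex: int, opex_per_year: int, years: int = 5) -> int:
    return capex + opex_per_year * years  # capex once, opex every year

print("traditional:", tco(200_000, 150_000))  # 950000
print("utility:    ", tco(50_000, 15_000))    # 125000 -> roughly 7.6x less
```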
So, whether you end up in Google’s new powerplant or not, I don’t know. But you should start considering the possibility.
Radha:
“For long term sustenance of this model there needs to be critical mass of users for the “same” service.”
This is not true. If your implementation costs are low enough, it doesn’t matter if you have only ten users. Make 100 apps for 10 users each – it’s easy. This is what we are trying to do with AppExchange, and it’s working out really well.
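The economics follow from multi-tenancy: provisioning an app for another customer is a metadata write, not a new deployment. Here's a hypothetical sketch of that idea (not how AppExchange is actually implemented):

```python
# Hypothetical multi-tenant sketch: installing an app for a tenant is one
# metadata record, so 100 apps x 10 users each is cheap.
from collections import defaultdict

installs: dict[str, set[str]] = defaultdict(set)  # app -> tenants

def install(app: str, tenant: str) -> None:
    installs[app].add(tenant)  # marginal cost: one metadata record

for a in range(100):      # 100 niche apps...
    for t in range(10):   # ...10 customers each
        install(f"app-{a}", f"tenant-{a}-{t}")

print(sum(len(v) for v in installs.values()), "installs, zero new servers")
```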
Kingsley, agreed, but in that case the premium for the services will be quite high; otherwise the service provider will not make a profit. A high premium is a destabilizing force.
Nick, as you and I have bantered about before, you have an optimistic projection of utility computing. Corporate America and the ROW are stuck in multi-year deals with IBM, HP, EDS, Accenture, CSC, and increasingly offshore vendors – and those vendors have shown little interest in the utility model, even though they have way more scale than the average CIO. Chiseling away at that $300 billion of contracts and marketing power/boardroom access is going to take a long, long time… Kind of interesting that you suggest CIOs drag their feet on this but never ask why IBM or EDS doesn’t catch this religion…
Vinnie, I guess it is way too early to talk about utility computing for corporate applications. We aren’t even in a prototype phase of the concept (in comparison to the size of the problem). What we have today is only the hosted application model (the “ASP” model). No sane corporation could be expected to hand over its complex set of business applications and its critical business data to an unknown “power plant”. Even if IBM or EDS set up their power plants in remote locations, it is not going to change the fear factor. The feasibility itself is questionable because of network considerations, which themselves carry prohibitive costs. This is not to say that the utility model has no value, only that it is more useful in certain areas. Innovation and progress in technology will remove many of the current roadblocks, but corporate data and applications will and should always remain within the corporate vaults.
Radha and Vinnie,
Much of the complexity in enterprise applications comes from the fact that hardware and software have been expensive. I’ve run across numerous customers concerned about database scalability. Yet when you dig a little, they really don’t have huge databases. Instead, you find they’ve configured one or two large database servers to handle numerous databases on the same server. This is a recipe for disaster, but they do it to try to save money on licenses and maintenance.
In a well designed utility model, however, creating an instance of a database server is trivial and cheap. Therefore it makes sense to have it serve only one database, which also makes it easy to maintain. And, because the database server is no longer shared, you no longer get the parasitic interference between applications.
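A minimal sketch of the point, with SQLite standing in for “a database instance is trivial and cheap” (the application names are made up):

```python
# Sketch: one cheap, dedicated database instance per application
# (SQLite standing in for a provisioned database server). No sharing,
# so no parasitic interference between applications.
import sqlite3

def provision_db(app_name: str) -> sqlite3.Connection:
    conn = sqlite3.connect(f"{app_name}.db")  # isolated instance per app
    conn.execute("CREATE TABLE IF NOT EXISTS events (ts TEXT, msg TEXT)")
    return conn

# Each app writes to its own instance; none can slow down the others
# the way co-tenants on one big shared server can.
for app in ("billing", "inventory", "crm"):
    db = provision_db(app)
    db.execute("INSERT INTO events VALUES (datetime('now'), ?)", (f"{app} up",))
    db.commit()
    db.close()
```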
Why aren’t IBM, HP and Oracle pushing this? MONEY! They sell the hardware and/or software, and they charge consulting fees to maintain it. True utility computing is disruptive to them, and as such is likely to start outside the enterprises that are so dependent on them. A pattern, by the way, we’ve seen before with PCs, the web, Linux . . .
Barmijo, in my view the complexity lies more in the variety and the proprietary nature of information – information interlocked with the way business is conducted. For example, the information needs of Walmart are completely different from the needs of, say, Target Corporation, though they are in the same type of business. And I see IT increasingly becoming the business’s nervous system. In this way, one can say that IT is a commodity, but a commodity that is required for the very life of the business. Try to separate the nervous system from the body and a lot of trouble will happen – that is the fear of business.