The end is near(er)
September 21, 2005
At the start of this year, I wrote an article about utility computing that came to be published in the spring edition of the MIT Sloan Management Review under the title The End of Corporate Computing. In it, I argued that advances in networking and related technologies like virtualization and web services are going to radically transform the way information technology is supplied to businesses. Companies are going to shift from the traditional, fragmented model of client-server computing, which requires them to buy, assemble and maintain vast quantities of complex and inefficient computing machinery, to a centralized utility model, in which computing assets are rationalized, standardized and consolidated and what we've come to call "IT" is supplied over networks as a service from central utility plants. The economic advantages of the utility model are so great, I argued, that the transformation of IT is inevitable.
When I wrote the piece, I assumed this shift would play out slowly, as the utility model battled against a status quo propped up by the entrenched interests of both suppliers and corporate IT departments. But now I'm not so sure. I may have been thinking too conservatively. In just the last few weeks, we've seen, particularly on the software side, growing signs of a sea change. As consolidation, commoditization and weakening demand turn traditional packaged software into a rust-belt industry, dominated by a couple of big suppliers looking to milk the installed base, innovation and growth seem to be shifting quickly to the software-as-a-service (SaaS) arena. Pushed by diverse utility upstarts like Google (on the consumer side, so far), Salesforce.com, RightNow Technologies, 37signals and Rearden Commerce, traditional software makers are now jockeying to position themselves as players in the SaaS world.
Yesterday, for example, Microsoft put software services at the center of its strategy, or at least very near the center. It announced that its diffuse business units would be collapsed into three groups, including a "platform" organization combining the Windows operating systems with the MSN web site. Ray Ozzie will become the company's services guru, helping to coordinate the shift to a services model across the three groups. "Our goal in making these changes," explained CEO Steve Ballmer in an email to staff, "is to enable Microsoft to achieve greater agility in ... executing our software-based services strategy." It remains to be seen exactly how far and fast Microsoft will go down the services route - and whether it can figure out a services business model that will deliver the kind of rich profits its packaged software has long provided - but the dramatic reorganization shows that it recognizes the old model is dying.
Also yesterday, David Berlind reported on an interesting conversation he had with IBM's Ken Bisconti on the company's aggressive plans to establish IBM Workplace as what I would call a utility interface - a unified access point, or portal, for software services delivered over a network (from either a company's own servers or those of an outside host). Each user's portal can be customized with different service "components" depending on the person's job. "The components can be custom-built components designed to give you access to the parts of a particular business process that your [sic] authorized to have access to, or they can be canned ones such as ... OpenOffice.org-derived productivity components" (ie, open-source substitutes for Microsoft Office). Upgrading and maintenance of the components is all done centrally, by the host, rather than locally, at each client machine. Moreover, "users will be able to save their data to the network and, much the same way their portal with all its components follows them everywhere, so too will the storage and everything the user has saved to it." How well Workplace actually comes to fulfill this vision remains to be seen, but it's another clear sign of the transformation in software delivery.
Software applications are, of course, only one side of the utility model. The other side is the infrastructure - the trillions of dollars worth of hardware and system software that companies today maintain privately, within their proprietary data centers. That, too, is an obsolete model, and the shift to a rationalized, "multi-tenant" utility infrastructure will likely entail even more profound changes than the related shift to software-as-a-service. (Yesterday, to add one more data point, Sun CEO Scott McNealy talked about "selling thin clients as a service, a display grid at $1 per day.") It's on the hardware side, perhaps, that the status quo will prove toughest to displace, though even here I see indications that things may move faster than I originally expected. But more on that later.
How does security play in all this? Because, right now, it stinks. We have identity thievery, spam, spyware, virii, etc., all over the place. Have I mentioned that it stinks? I know quite a few people who want to give up on "computing" because of these very real problems. There is no trust.
Posted by: ordaj at September 21, 2005 10:55 AM
Oracle has been actively providing an OnDemand service for a few years now. Interestingly, it's called "Software as a Service".
At the same time, I agree that the hardware clusters will give way to large, monolithic, single datacentres. We already see that happening at Yahoo and Google, for example. The other advantage is that these huge datacentres give you the leverage of practically zero marginal cost for new setups and deployments. This in turn accelerates IT as a commodity, and the spiral continues and grows.
As for security, yes, it's an area of concern, but vendors are aggressively making it work, with legal and contractual bindings to make sure all the offerings are secure. It's like trusting Yahoo Mail with all your credit card details - which most of us do anyway.
Posted by: Nitin at September 22, 2005 03:17 AM
The examples I read from the MIT article excerpt seem to focus on basic office software, but I haven't seen much talk about how large enterprise software would be handled in a utility model.
The fact is that every company I have worked with that implemented enterprise software, whether big stuff like SAP or smaller like Microsoft Navision, had their own unique configurations and requirements. That's easy when the company has its own software on its own machines.
How would that work in a utility environment? Each company would still have its own version of enterprise software, and pay for the customisation and support which is typically much greater than the actual cost of software, but just not its own hardware?
Posted by: Simon G, South Africa at September 22, 2005 04:20 AM
Indeed, the economics are appealing, as is the fact that internal IT will have fewer responsibilities.
However, if I were a company I would not want my sensitive financial data to be located and managed by some outsider, where it could fall into the hands of competitors, the tax service ;) etc. Never mind the liabilities, contract engagements, etc. I wouldn't want to get in trouble with the courts either.
In my opinion neither software or hardware is a real asset for a company. Data is.
Posted by: pgrontas at September 23, 2005 07:50 AM
Yanking control of I.T. from business mandarins WILL lead to a market advantage. The conflict between bean counters/business project management and oversold internal I.T. directly contributes to the 1/5 billion wasted annually in I.T. initiative overload.
Your underlying hypothesis that IT is redundant is primarily based on analysis of inefficient internal IT depts. Quite aside from "IT Doesn't Matter" drum beating, the outsourcing of I.T. (SOA/utility computing/BI/whatever is this week's buzzword) to dedicated professionals (to gain an advantage) is a more effective strategy than home-grown attempts (with home-grown problems).
Posted by: Ezi at September 23, 2005 04:16 PM
"The fact is that every company I have worked with that implemented enterprise software, whether big stuff like SAP or smaller like Microsoft Navision, had their own unique configurations and requirements. That's easy when the company has its own software on its own machines. How would that work in a utility environment? Each company would still have its own version of enterprise software, and pay for the customisation and support which is typically much greater than the actual cost of software, but just not its own hardware?"
I work for a SaaS implementor that maintains more than 450 customers running various functionalities of an ERP suite, from large-scale installations to cross-continent usage. Most of these customers have customizations to the base ERP and have specifically tailored their implementations to their needs, and we are still able to offer it as a utility, at a fixed, predictable per-user monthly charge.
Nick: If it is of interest to you, I can shed more light on the utility model for complex software implementations like ERP or CRM.
Posted by: Nitin at September 24, 2005 12:42 PM
Nick, most CIOs would actually agree with your vision of utility computing and shrinking IT staff and budgets. They would disagree with the timing. They are waiting for the utility economics and reliability to arrive. See my blog below and various links within - we are far, far from getting there, not because of CIO reluctance but because of skewed industry economics.
Posted by: vinnie mrichandani at October 9, 2005 07:05 PM
"Riveting" -San Francisco Chronicle
"Rewarding" -Financial Times
"Ominously prescient" -Kirkus Reviews
"Riveting stuff" -New York Post