In order to meet a looming deadline for a book I’ve been working on, I’m going to need to cut back my blogging for the next few weeks. Rather than let Rough Type go dormant, though, I thought I might publish, in serial fashion, some earlier pieces I’ve written that haven’t been easily accessible online. I’ll start with “IT Doesn’t Matter,” an article published in the Harvard Business Review in May 2003. The piece caused quite a stir – I documented the controversy as it unfolded – and it continues to be a lightning rod for debate in IT circles.
I went on to greatly expand the argument of the article, and clarify a few of its points, in the 2004 book Does IT Matter?, which you can order here. You can also purchase the full text of “IT Doesn’t Matter,” in its original form, here. The original article includes some additional sidebars and graphics as well as many pages of letters written to HBR in response to the article.
IT Doesn’t Matter
In 1968, a young Intel engineer named Ted Hoff found a way to put the circuits necessary for computer processing onto a tiny piece of silicon. His invention of the microprocessor spurred a series of technological breakthroughs – desktop computers, local and wide area networks, enterprise software, and the Internet – that have transformed the business world. Today, no one would dispute that information technology has become the backbone of commerce. It underpins the operations of individual companies, ties together far-flung supply chains, and, increasingly, links businesses to the customers they serve. Hardly a dollar or a euro changes hands anymore without the aid of computer systems.
As IT’s power and presence have expanded, companies have come to view it as a resource ever more critical to their success, a fact clearly reflected in their spending habits. In 1965, according to a study by the U.S. Department of Commerce’s Bureau of Economic Analysis, less than 5% of the capital expenditures of American companies went to information technology. After the introduction of the personal computer in the early 1980s, that percentage rose to 15%. By the early 1990s, it had reached more than 30%, and by the end of the decade it had hit nearly 50%. Even with the recent sluggishness in technology spending, businesses around the world continue to spend well over $2 trillion a year on IT.
But the veneration of IT goes much deeper than dollars. It is evident as well in the shifting attitudes of top managers. Twenty years ago, most executives looked down on computers as proletarian tools – glorified typewriters and calculators – best relegated to low-level employees like secretaries, analysts, and technicians. It was the rare executive who would let his fingers touch a keyboard, much less incorporate information technology into his strategic thinking. Today, that has changed completely. Chief executives now routinely talk about the strategic value of information technology, about how they can use IT to gain a competitive edge, about the “digitization” of their business models. Most have appointed chief information officers to their senior management teams, and many have hired strategy consulting firms to provide fresh ideas on how to leverage their IT investments for differentiation and advantage.
Behind the change in thinking lies a simple assumption: that as IT’s potency and ubiquity have increased, so too has its strategic value. It’s a reasonable assumption, even an intuitive one. But it’s mistaken. What makes a resource truly strategic – what gives it the capacity to be the basis for a sustained competitive advantage – is not ubiquity but scarcity. You only gain an edge over rivals by having or doing something that they can’t have or do. By now, the core functions of IT – data storage, data processing, and data transport – have become available and affordable to all. Their very power and presence have begun to transform them from potentially strategic resources into commodity factors of production. They are becoming costs of doing business that must be paid by all but provide distinction to none.
IT is best seen as the latest in a series of broadly adopted technologies that have reshaped industry over the past two centuries – from the steam engine and the railroad to the telegraph and the telephone to the electric generator and the internal combustion engine. For a brief period, as they were being built into the infrastructure of commerce, all these technologies opened opportunities for forward-looking companies to gain real advantages. But as their availability increased and their cost decreased – as they became ubiquitous – they became commodity inputs. From a strategic standpoint, they became invisible; they no longer mattered. That is exactly what is happening to information technology today, and the implications for corporate IT management are profound.