Throwing computers at health care

Computerworld reports on an extensive new Harvard Medical School study, appearing in the American Journal of Medicine, that paints a stark and troubling picture of the essential worthlessness of many of the computer systems that hospitals have invested in over the last few years. The researchers, led by Harvard’s David Himmelstein, begin their report by sketching out the hype that now surrounds health care automation:

Enthusiasm for health information technology spans the political spectrum, from Barack Obama to Newt Gingrich. Congress is pouring $19 billion into it. Health reformers of many stripes see computerization as a painless solution to the most vexing health policy problems, allowing simultaneous quality improvement and cost reduction …

In 2005, one team of analysts projected annual savings of $77.8 billion, whereas another foresaw more than $81 billion in savings plus substantial health gains from the nationwide adoption of optimal computerization. Today, the federal government’s health information technology website states (without reference) that “Broad use of health IT will: improve health care quality; prevent medical errors; reduce health care costs; increase administrative efficiencies; decrease paperwork; and expand access to affordable care.”

As was true of business computing systems in general, at least until the early years of this decade, it’s been taken on faith that big IT investments will translate into performance gains: If you buy IT, the rewards will come. Never mind that, as the researchers note, no actual studies “have examined the cost and quality impacts of computerization at a diverse national sample of hospitals.”

Now, at last, we have such a study. The researchers combed through data on IT spending, administrative costs, and quality of care at 4,000 US hospitals for the years 2003 through 2007. Their analysis found no correlation between IT investment and cost savings or efficiency; in fact, it turned up some evidence of a link between aggressive IT spending and higher administrative costs. There appeared to be a slight correlation between IT spending and care quality in some areas, though even here the link was tenuous:

We found no evidence that computerization has lowered costs or streamlined administration. Although bivariate analyses found higher costs at more computerized hospitals, multivariate analyses found no association. For administrative costs, neither bivariate nor multivariate analyses showed a consistent relationship to computerization. Although computerized physician order entry was associated with lower administrative costs in some years on bivariate analysis, no such association remained after adjustment for confounders. Moreover, hospitals that increased their computerization more rapidly had larger increases in administrative costs. More encouragingly, greater use of information technology was associated with a consistent though small increase in quality scores.

We used a variety of analytic strategies to search for evidence that computerization might be cost-saving. In cross-sectional analyses, we examined whether more computerized hospitals had lower costs or more efficient administration in any of the 5 years. We also looked for lagged effects, that is, whether cost-savings might emerge after the implementation of computerized systems. We looked for subgroups of computer applications, as well as individual applications, that might result in savings. None of these hypotheses were borne out. Even the select group of hospitals at the cutting edge of computerization showed neither cost nor efficiency advantages. Our longitudinal analysis suggests that computerization may actually increase administrative costs, at least in the near term.

The modest quality advantages associated with computerization are difficult to interpret. The quality scores reflect processes of care rather than outcomes; more information technology may merely improve scores without actually improving care, for example, by facilitating documentation of allowable exceptions …

[A]s currently implemented, health information technology has a modest impact on process measures of quality, but no impact on administrative efficiency or overall costs. Predictions of cost-savings and efficiency improvements from the widespread adoption of computers are premature at best.
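(For the statistically curious: the “lagged” and “multivariate” analyses the authors describe are straightforward to sketch. Here’s a minimal illustration in Python, using statsmodels, of the two kinds of regression involved – one that adjusts cross-sectionally for confounders, and one that looks for savings emerging a couple of years after computerization. The data file and every column name below are invented for the example; the study’s actual variables and models differ.)

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical hospital-year panel. All column names are invented
    # for illustration; they are not the study's actual variables.
    df = pd.read_csv("hospital_panel.csv")
    df = df.sort_values(["hospital_id", "year"])

    # Cross-sectional ("multivariate") check: is computerization associated
    # with administrative costs once confounders like size are adjusted for?
    cross = smf.ols("admin_cost ~ it_score + beds + teaching + C(year)",
                    data=df).fit()

    # Lagged check: do savings show up only after systems are implemented?
    df["it_lag2"] = df.groupby("hospital_id")["it_score"].shift(2)
    lagged = smf.ols("admin_cost ~ it_lag2 + beds + teaching + C(year)",
                     data=df).fit()

    print(cross.params["it_score"], lagged.params["it_lag2"])

The study ran many variations of this kind of test – different applications, different subgroups, different lags – and, as the excerpt above says, none of them turned up evidence of savings.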

There is a widespread faith, beginning at the very top of our government, that pouring money into computerization will lead to big improvements in both the cost and quality of health care. As this study shows, those assumptions need to be questioned – or a whole lot of taxpayer money may go to waste. Information technology has great promise for health care, but simply dumping cash into traditional commercial systems and applications is unlikely to achieve that promise – and may backfire by increasing costs further.

Cloud computing, circa 1965

A correspondent pointed me to this document, dated March 30, 1965, in which an executive with Western Union, the telegraph company, lays out the company’s ambitious plan to create “a nationwide information utility, which will enable subscribers to obtain, economically, efficiently, immediately, the required information flow to facilitate the conduct of business and other affairs.”

The idea of a “computing utility” was much discussed in the 1960s, but this document nonetheless provides a remarkably prescient outline of what we now call cloud computing. Some excerpts:

Over the past century or more there have evolved in this country a limited number of basic systems serving the general public – a group generally termed “public utilities.” These utilities serve, among others, such fields as transportation; communications (telegraph, telephone, cable, radio, the broadcast services, etc.); and the energy systems, distributing power.

What is now developing, very rapidly, is a critical need – as yet not fully perceived – for a new national information utility which can gather, store, process, program, retrieve and distribute on the broadest possible scale, to industry; to the press; to military and civilian government; to the professions; to department stores, banks, transportation companies and retailers; to educational institutions, hospitals and other organizations in the fields of public health, welfare and safety; and to the general public, virtually all of the collected useful intelligence available, through locally-, regionally- and nationally-linked systems of computers. Just as an electrical energy system distributes power, this new information utility will enable subscribers to obtain, economically, efficiently, and immediately, the required information flow to facilitate the conduct of business, personal and other affairs.

There is no substantial technical bar even now to the establishment of such a nationwide information utility. Computers and associated equipment, the methodology, the storage and retrieval techniques, the knowledge required to provide the very broad bandwidth required for high-speed data transmission – all these exist today. Their harnessing into a national system presents no technical problems essentially more difficult than the strategic placements a half-century ago of steam turbines to create electrical energy, and the related building of power grids … Indeed, the computer and the turbine share a common characteristic in that (within appropriate limits of optimum sizes and capacities) the larger the unit, the more efficient it is in terms of unit-cost production … The cardinal economic principle at issue here is that an information utility serving a large number of users can provide service to each more economically than he can provide it for himself, just as a power system can provide energy to its customers at lower cost than they, individually, can generate it for themselves …

We envision, then, the expansion of the existing plant, offices, personnel, and nationwide operations of Western Union, to transform it into a national information system [that] would furnish a uniform, efficient, integrated information service to meet the needs of all types of users, everywhere …

It might be added, here, that any movement by the Bell System to substitute itself for Western Union as the nation’s information utility, as well as the pervasive, dominant power in the telephone field, would obviously create profound national concern on the score of “giantism” – since any further and large assumption of added power would bring about one entity of even more menacing size than now …

Western Union has the skills and experience that uniquely qualify it for such a role; the public need for such a new utility is growing at a rapid rate; the field is already large and the potential tremendous – probably at least as large as any other national utility that exists today.

When the history of cloud computing is written, it may be that Western Union will play the role that Xerox now plays in the history of the personal computer: the company that saw the future first, but couldn’t capitalize on its vision.

Murdoch’s wink

Could the status quo of the commercial internet be shaken as a result of an old man’s misinterpretation of a question?

Maybe so.

Earlier this month, Rupert Murdoch sat down for an interview with Sky News Australia (a company that Murdoch’s News Corporation partially owns). A little way into the interview, the following exchange took place:

Interviewer: The other argument from Google is that you could choose not to be on their search engine … so that when someone does do a search, your web sites don’t come up. Why haven’t you done that?

Murdoch: Well, I think we will. Uh, but that’s when we start charging. We do it already, with the Wall Street Journal. We have a wall, but it’s not right to the ceiling. You can get usually the first paragraph of any story, but if you’re not a paying subscriber to wsj.com, there’s immediately – you get a paragraph and a subscription form.

Murdoch seemed to be saying – and was widely reported to have said – that News Corp is planning to block Google from indexing its content. But when you listen to his full answer, in which he confuses opting out of Google’s search engine with raising pay walls on sites, it’s hard to know what he was actually intending to say.
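(A side note on mechanics: opting out of Google’s index, as distinct from erecting a pay wall, is trivial. Under the long-standing Robots Exclusion Protocol, a publisher need only place a couple of lines in the robots.txt file at the root of its site. A generic sketch – not anything News Corp has actually published:

    # Ask Google's crawler to stay away from the entire site
    User-agent: Googlebot
    Disallow: /

Other search engines’ crawlers, reading the same file, would remain free to index the pages.)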

But inadvertent or not, Murdoch’s suggestion that he’ll pull News Corp content out of Google’s database could turn out to be a brilliant signaling strategy, one that, as Mike Arrington has written, could ultimately alter the balance of power on the Net.

Last spring, in my post Google in the Middle, I described the dilemma in which newspapers find themselves when it comes to Google’s search engine. On the one hand, Google is an important source of traffic for their sites. On the other hand, Google prevents them from making decent money online – by massively fragmenting traffic, by undermining brand power, and by turning news stories into fungible commodities. Individual newspapers can’t live with Google, but they can’t live without it either:

When it comes to Google and other aggregators, newspapers face a sort of prisoners’ dilemma. If one of them escapes, their competitors will pick up the traffic they lose. But if all of them stay, none of them will ever get enough traffic to make sufficient money. So they all stay in the prison, occasionally yelling insults at their jailer through the bars on the door.

Of course, there has always been a way to break out of the prison: If a critical mass of newspapers were to opt out of Google’s search engine simultaneously, they would suddenly gain substantial market power. Newspapers are struggling, but they remain, by far, the world’s dominant producers of hard news. That gives them, as a group, a great deal of leverage over companies like Google that depend on a steady stream of good, fresh online content. Google needs newspapers at least as much as newspapers need Google – a fact that’s been largely hidden up to now.
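To put hypothetical numbers on the game (the payoffs below are purely illustrative – higher is better):

                          Rivals stay in     Rivals opt out
    Paper stays in            2, 2               4, 1
    Paper opts out            1, 4               3, 3

Whatever its rivals do, any single paper does better staying in Google’s index – which is why they all stay, and all end up at the meager (2, 2) – even though a coordinated, simultaneous opt-out would leave them all better off at (3, 3).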

What Murdoch effectively did in his interview with Sky News was to send a signal to other newspaper companies: We’ll opt out if you’ll opt out. Murdoch positioned himself as the would-be ringleader of a massive jailbreak, without actually risking a jailbreak himself.

There are signs that the signal is working. Bloomberg reports today that the publishers of the Denver Post and the Dallas Morning News are now considering blocking Google in one way or another. More ominously (if you’re Google), Microsoft has apparently responded to the signal by offering to pay News Corp to make Bing the exclusive search engine for its content. Microsoft doesn’t have a lot of weapons to use against Google in the search business, but getting prominent news organizations to block Google would be a very powerful weapon. (Steve Ballmer would be more than happy to reduce the basic profitability of the search business, as that would inflict far more damage on Google than on Microsoft.)

Faced with a large-scale loss of professional news stories from its search engine, Google would likely have little choice but to begin paying sites to index their content. That would be a nightmare scenario for Google – and a dream come true for newspapers and other big content producers.

Of course, for now this is all just speculation. The odds are that none of it will come to pass. The idea that newspapers might come together to pursue a radical and risky strategy seems far-fetched. Then again, maybe the time has finally come for newspapers to take a deep collective breath and apply the leverage they still hold. They don’t have a whole lot left to lose.

Recent writings

Here are links to a few pieces I’ve written that have appeared over the last week or so:

The Price of Free, in the New York Times Magazine, looks at how online video is beginning to reshape the TV business.

The San Francisco Chronicle ran my review of Ken Auletta’s new book, Googled.

As part of its online retrospective about the last decade, Newsweek has a brief piece I wrote about how the Google Guys have altered the way we think.

Germany’s Die Zeit ran an article I wrote about the implications of cloud computing.

Also, today’s edition of Le Figaro, in France, has a piece that draws on my ideas about cloud computing.

The Singularity University fight cheer

Singularity University appears to be in full swing now, which is a great comfort to me. Already I feel much less fearful about being turned into a sex slave for a gang of immeasurably brainy robots.

Ted Greenwald, from Wired’s Epicenter blog, has been hanging out at the Sing U campus – it feels, he says, like “a top-secret installation out of a James Bond movie, crowned with strange domed buildings and adorned by sculptures of airships” – and auditing some classes. You can find a rundown of his reports here.

It bothers me, though, that Sing U doesn’t appear to have a school mascot yet. I certainly understand that the university is unlikely to be a sports powerhouse, but, still, it’s bound to have a few teams – fencing and mental gymnastics, at least – so it really needs a mascot to rally the student body. I’ve been doing some brainstorming and have come up with a few ideas:

The Exponential Curves

The Uploaded Brains

The Odd Ducks

The Supplements

The Transhumans

But these pale beside what I’ve come to consider the obvious choice: The Singularity University Methuselahs.

Of course, you can’t have a school mascot without a school fight cheer. I’ve come up with one of those too:

Sing! U!

Me! thu! se! lahs!

Never say die!

Never say die!

Sing! U!

Me! thu! se! lahs!

Outlast ’em!

Does my tweet look fat?

As the velocity of communication approaches realtime, language compresses.

Think about it. When people started talking about Twitter, the first thing they’d always mention was the 140-character limit that the service imposes on tweets. So short! Who can say anything in 140 lousy characters? Crazy!

And it’s true that when a person who is used to longer forms of writing starts emitting tweets, keeping to just 140 characters can be a challenge. You actually have to think a bit about how to squeeze your thoughts to fit the format. It doesn’t take long, though, for a twitterer to adapt to the new medium, and once you’re fully adapted something funny happens. Not only does the sense that 140 characters is a constraint disappear; 140 characters starts to seem, well, long. Your own tweets shrink, and it becomes kind of annoying when somebody actually uses the full 140 characters. Jeez, I’m going to skip that tweet. It’s too long.

The same thing has happened, of course, with texting. Who sends a 160-character text? A 160-character text would feel downright Homeric. And that’s what a 140-character tweet is starting to feel like, too.

I think our alphabetic system of writing may be doomed. It doesn’t work well with realtime communication. That’s why people are forced to use all sorts of abbreviations and symbols – the alphabet’s just too damn slow. In the end, I bet we move back to a purely hieroglyphic system of writing, with the number of available symbols limited to what can fit onto a smartphone keypad. Honestly, I think that communicating effectively in realtime requires no more than 25 or 30 units of meaning.

Give me 30 glyphs and a URL shortener, and I’m good.

This post is an installment in Rough Type’s ongoing series “The Realtime Chronicles,” which began here.

Be everywhere now

The BBC has recently featured two thoughtful takes on how the Net is altering people’s experience of popular music. What’s particularly interesting (to me, anyway) is how the two articles examine the same phenomenon – the ability to listen to pretty much anything that’s ever been recorded, immediately and for free – but see very different consequences.

In an article that appeared a week or so ago, John Harris proclaimed a new “golden age of infinite music.” And he made a compelling case:

I [recently] had a long chat about music with the 16-year-old son of a friend, and my mind boggled. At virtually no cost, in precious little time and with zero embarrassment, he had become an expert on all kinds of artists, from English singer-songwriters like Nick Drake and John Martyn to such American indie-rock titans as Pavement and Dinosaur Jr. Though only a sixth-former, he seemingly knew as much about most of these people as any music writer.

Like any rock-oriented youth, his appetite for music is endless, and so is the opportunity – whether illegally or not – to indulge it. He is a paid-up fan of bands it took me until I was 30 to even discover – and at this rate, by the time he hits his 20s, he’ll have reached the true musical outer limits …

As the great digital revolution rolls on, bands are no longer having to compete for people’s money. Instead, they’re jockeying for our time. And the field is huge, crossing not just genres, but eras. Who do you want to investigate today: TV On The Radio or Crosby, Stills and Nash? Do you fancy losing yourself in the brilliant first album by Florence And The Machine, or deriving no end of entertainment from how awful The Rolling Stones got in the 1980s? Little Richard or La Roux? White Lies or Black Sabbath?

As one of my music press colleagues used to say, there’s no longer any past – just an endless present … Really: what’s not to like?

Today, the BBC is featuring an equally compelling article by John Taylor – yes, the Duran Duran bass player. “Something the internet has most definitely done,” he writes, echoing Harris, “is bring more music from more places and more eras into the hearts and minds of us all, but young people in particular, which is great … My stepson is at New York University (NYU) and he was telling me how he’s currently into Cole Porter, music from the 1920s and swing music from the 40s. So the availability and accessibility of music on the internet today is truly incredible, and I applaud anything that can inspire interest or curiosity in anyone.”

But rather than simply heralding this as the arrival of an endless and endlessly bountiful “present,” Taylor takes a more nuanced view. He wonders whether such easy abundance doesn’t lead to a flattening of experience: When everything’s present, nothing’s new.

He recalls a formative experience from his own youth:

In September 1972, Roxy Music appeared on prime time TV in the UK. It was their first national TV exposure, a three-minute appearance performing their first single. And the way they looked and sounded stunned me, and a generation of mes.

But we had no video recorders, and of course there was no YouTube. There was no way whatsoever that I could watch that appearance again, however badly I wanted to. And the power of that restriction was enormous. The only way I could get close to that experience was to own the song. I lived in the suburbs, so I had to ride my bike for miles before I could find a store that sold music, let alone one that had the record in stock. It was a small trial of manhood and an adventure.

But once I had that song, I could play it whenever I chose. I had to go on a quest of sorts to get it, but my need was such that I did it.

The power of that single television appearance created such pressure, such magnetism, that I got sucked in and I had to respond, as I now know previous generations had responded to Elvis Presley on the Ed Sullivan show, or The Beatles, or Jimi Hendrix. I believe there’s immense power in restriction and holding back.

That “immense power,” which in an age of abundance can easily be misperceived as mere constraint, is draining from culture.

I have sympathy for both views. Like Harris, I appreciate, and certainly indulge in, the ability to leap easily from song to song, artist to artist, with no temporal or physical limitation on the experience of music. There is a sense of liberation in being able to be everywhere now – to be able to indulge in what Harris terms “completely risk-free listening.” But I have also shared Taylor’s experience of, quite literally, going on a ten-mile bike ride to a record store to purchase a yearned-for record, which would then spin for weeks on my turntable, pulling me ever further into the depths of the music. Taylor’s right: it was “a quest of sorts.” And as with all quests, there were risks involved.

There are those who, in their desire to sell themselves and others an idealized version of progress, are quick to dismiss all fond personal memories as nostalgia. But some of those memories are not sentimental distortions of the past but accurate records of experience. Taylor argues that, when it comes to music or any other form of art, the price of our “endless present” is the loss of a certain “magical power” that the artist was once able to wield over the audience. I suspect he’s right.