Complete control

There’s an exhilarating moment in the middle of the Clash’s “Complete Control” when, during Mick Jones’s brief, skittering, anarchic solo, Joe Strummer screams, “You’re my guitar hero!” The song is, along with the Sex Pistols’ “Anarchy in the UK” and X-Ray Spex’s “Oh Bondage Up Yours!,” one of the perfect expressions of the punk ethos of regeneration through degeneration, of accelerating the machinery of pop until it disintegrates, liberating, at least momentarily, both band and listener, and Strummer’s scream is the song’s climax, perhaps the climax of the entire punk anti-movement. Strummer, by turning himself, through his scream, into both the object and the subject of fandom, frees the fan from his prescribed identity as consumer, subverts fandom by turning fandom into an act of subversion. The band is the fan, and the fan is the band, and both stand as one outside and beyond the producer-consumer dynamic that would contain them.

“You’re my guitar hero!”

It’s a fleeting act, though, as Strummer well understood. When the song ends so does its spell. We’re “in control” – or, more precisely, “out of control” – only for three minutes. Then the prescribed identities reassert themselves. When the single came out, late in 1977, my friends and I would play it over and over again, renewing its promise and its illusion. We wanted to be uncontrollable – or at least to feel uncontrollable. But we were smart enough to realize that listening to “Complete Control” was also, and already, an act of nostalgia. The song was, as Jon Savage writes in England’s Dreaming, “a hymn to Punk autonomy at the moment of its eclipse.”

Strummer died in 2002, which spared him the despair, and the humiliation, of seeing “Complete Control” turned into pure merchandise, a complete parody: the song is now fodder for the video game Guitar Hero.

This is the perfect subversion of the Clash’s subversion, anarchy turned into routine, complete with a score-keeping mechanism. Now when Strummer screams “You’re my guitar hero!,” it’s an act of distancing rather than an embrace. It’s also, bewilderingly, an act of advertising, the cynical come-on of a hawker. Strummer’s scream becomes a moment not of mutual liberation but of deep creepiness. The ironies are piled so high that the only way out is to ignore them, as Johnny Rotten and Steve Jones have learned to do so well.

“I’ve been puzzled by the popularity of the game Guitar Hero,” writes Rob Horning at PopMatters. “If you want a more interactive way to enjoy music, why not dance, or play air guitar? Or better yet, if holding a guitar appeals to you, why not try actually learning how to play? For the cost of an Xbox and the Guitar Hero game, you can get yourself a pretty good guitar.” Horning, apparently, doesn’t quite get the point of prosumerism; its joys are lost on him. He continues: “I can’t help but feel that Guitar Hero (much like Twitter) would have been utterly incomprehensible to earlier generations, that it is a symptom of some larger social refusal to embrace difficulty.”

What kills me about Twitter is how this perfect consumerist tool, this nifty agent for packaging intimacy as a product, for simplifying self-expression out of existence, can’t discover a business model to justify its own existence. Marx must have had something to say about this.

Speaking of which, Horning quotes the Marxist theorist Jon Elster in explaining the way that trivial, if diverting, pursuits like Guitar Hero provide an easy alternative to the hard work of self-realization:

Activities of self-realization are subject to increasing marginal utility: They become more enjoyable the more one has already engaged in them. Exactly the opposite is true of consumption. To derive sustained pleasure from consumption, diversity is essential. Diversity, on the other hand, is an obstacle to successful self-realization, as it prevents one from getting into the later and more rewarding stages.
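Elster’s contrast is, at bottom, a claim about second derivatives. Putting it in rough symbols (my notation, not Elster’s): let m(t) be the marginal enjoyment of the t-th hour devoted to an activity. Then:

```latex
% A rough formalization of Elster's contrast (my notation, not his):
% m(t) = marginal enjoyment of the t-th hour given to an activity
\[
\underbrace{m'(t) > 0}_{\text{self-realization: each hour pays more than the last}}
\qquad
\underbrace{m'(t) < 0}_{\text{consumption: each repetition pays less, so novelty beckons}}
\]
```

Persistence compounds in the first case; in the second, only variety keeps the pleasure flowing.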

“Consumerism and its infrastructure,” observes Horning, “keeps us well supplied with stuff and seems to enrich our identities by allowing us to become familiar with a wide range of phenomena – a process that the internet has accelerated immeasurably. (I encounter a stray idea, digest the relevant Wikipedia entry, and just like that, I’ve broadened my conceptual vocabulary! I get bored with the book I’m reading, Amazon suggests a new one! I am too distracted to read blog posts, I’ll check Twitter instead!) But this comes at the expense of developing any sense of mastery of anything, eroding over time the sense that mastery is possible, or worth pursuing.”

Distraction is the permanent end state of the perfected consumer, not least because distraction is a state that is eminently programmable. To buy a guitar is to open possibilities. To buy Guitar Hero is to close them. A commenter on Horning’s article writes, “To me, the radical move that Guitar Hero makes is to turn music into an objectively measurable activity that is more amenable to our Protestant work ethic. It brings the corporation’s focus on quantitative performance indicators to the domain of music, displacing the usual mode of subjective enjoyment.”
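The commenter’s point about quantification is easy to see in miniature. Here is a toy sketch, in Python, of the kind of scoring loop that turns a song into a performance metric (the rules are invented for illustration, not Guitar Hero’s actual scoring tables):

```python
# A toy version of the quantification the commenter describes: the song
# becomes a stream of notes to hit, a score, and a streak multiplier.
# (Illustrative rules only -- not Guitar Hero's actual scoring tables.)
def score_performance(notes_hit: list[bool],
                      base_points: int = 50,
                      max_multiplier: int = 4) -> int:
    score, streak = 0, 0
    for hit in notes_hit:
        if hit:
            streak += 1
            # longer streaks earn a bigger multiplier, capped here at 4x
            multiplier = min(1 + streak // 10, max_multiplier)
            score += base_points * multiplier
        else:
            streak = 0  # a single miss resets the streak
    return score

# thirty clean notes, one flubbed, ten more: the miss is "objectively" costly
print(score_performance([True] * 30 + [False] + [True] * 10))
```

The subjective experience of playing a song collapses into a single number to be maximized, which is exactly the key-performance-indicator logic the commenter describes.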

Who’s in control?

The cloud’s Chrome lining

Google’s release today of a test version of its new open-source web browser, Chrome, marks an important moment in the ongoing shift of personal computing from the PC hard drive to the Internet “cloud.” I distinctly remember when, back in 1987, Apple Computer added MultiFinder to its Macintosh operating system, allowing my beloved Mac Plus to run more than one application at a time. That was, for us Mac users, anyway, a very big deal. Chrome – if we can trust the comic book – promises a similar leap in the capacity of the cloud to run applications speedily, securely, and simultaneously. Indeed, it is the first browser built from the ground up with the idea of running applications rather than displaying pages. It takes the browser’s file-tab metaphor, a metaphor reflecting the old idea of the web as a collection of pages, and repurposes it for application multitasking. Chrome is the first cloud browser.
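The comic book’s central architectural claim is that each tab gets its own operating-system process, so a crashed or hung web app takes down a single tab rather than the whole browser. Here is a minimal sketch of that isolation idea, in Python rather than Chrome’s C++, with the tabs and the crash invented for illustration:

```python
# A toy illustration of process-per-tab isolation: each "tab" runs a
# web app in its own OS process, so one app crashing leaves the rest alive.
import multiprocessing as mp

def run_app(name: str, should_crash: bool) -> None:
    """Stand-in for a web app running inside a tab's process."""
    if should_crash:
        raise RuntimeError(f"{name} crashed")  # only this process dies
    print(f"{name}: still running")

if __name__ == "__main__":
    tabs = [
        mp.Process(target=run_app, args=("gmail", False)),
        mp.Process(target=run_app, args=("buggy-app", True)),
        mp.Process(target=run_app, args=("docs", False)),
    ]
    for t in tabs:
        t.start()
    for t in tabs:
        t.join()
    # The buggy tab exits with a nonzero code; its siblings are unaffected.
    for t in tabs:
        print(t.name, "exit code:", t.exitcode)
```

Run it and the buggy tab dies with a traceback and a nonzero exit code while the others finish normally, which is the whole point of paying the memory overhead of separate processes.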

Though the initial beta release of Chrome runs only on Microsoft’s Windows operating system, Chrome is being seen as yet another sharp Google stick aimed at the Beast of Redmond’s cyclopean eye – an attempt not only to displace Internet Explorer but to disintermediate Windows itself as the platform of choice for running PC software. There is, no doubt, truth to that view, but in this case I think Google is motivated by something much larger than its congenital hatred of Microsoft. It knows that its future, both as a business and as an idea (and Google’s always been both), hinges on the continued rapid expansion of the usefulness of the Internet, which in turn hinges on the continued rapid expansion of the capabilities of web apps, which in turn hinges on rapid improvements in the workings of web browsers.

To Google, the browser has become a weak link in the cloud system – the needle’s eye through which the outputs of the company’s massive data centers usually have to pass to reach the user – and as a result the browser has to be rethought, revamped, retooled, modernized. Google can’t wait for Microsoft or Apple or the Mozilla Foundation to make the changes (the first has mixed feelings about promoting cloud apps, the second is more interested in hardware than in clouds, and the third, despite regular infusions of Google bucks, lacks resources), so Google is jump-starting the process with Chrome.

Although I’m sure Google would be thrilled if Chrome grabbed a sizable chunk of market share, winning a “browser war” is not its real goal. Its real goal, embedded in Chrome’s open-source code, is to upgrade the capabilities of all browsers so that they can better support (and eventually disappear behind) the applications. The browser may be the medium, but the applications are the message.

Machine head

I see that Amazon’s chief technology officer, Werner Vogels, has picked up on the album-a-year meme over at his blog. It seems to me that a guy who’s running a heavy-metal utility computing operation should be a serious headbanger, and Vogels, I’m relieved to say, does not disappoint. Fresh Cream, Live at Leeds, Back in Black, Raising Hell, Appetite for Destruction, Nevermind, Rage Against the Machine: yeah, you bet your ass I’d entrust my mission-critical data and apps to this guy. I mean, he even ranks Made in Japan, that ridiculous double-record Deep Purple live album with the 20-minute version of Space Truckin’, as the best LP of ’72. I have to admit that I got a little nervous when I saw The Eagles’ Hotel California in the 1976 slot, but seeing as it’s squeezed between two massive testosterone fests – Zeppelin’s Physical Graffiti and the Stranglers’ Rattus Norvegicus – I’m just going to write it off as a momentary lapse. Hell, ’76 was a tough year for a lot of us.

Easy does it

A recent edition of Science featured a worrying paper by University of Chicago sociologist James A. Evans titled Electronic Publication and the Narrowing of Science and Scholarship. Seeking to learn more about how research is conducted online, Evans scoured a database of 34 million articles from science journals. He discovered a paradox: as journals move online, making it easier for researchers to find and search their contents, the research that draws on them tends to become narrower and more superficial.

Evans summarizes his findings in a new post on the Britannica Blog:

[My study] showed that as more journals and articles came online, the actual number of them cited in research decreased, and those that were cited tended to be of more recent vintage. This proved true for virtually all fields of science … Moreover, the easy online availability of sources has channeled researcher attention from the periphery to the core—to the most high-status journals. In short, searching online is more efficient, and hyperlinks quickly put researchers in touch with prevailing opinion, but they may also accelerate consensus and narrow the range of findings and ideas grappled with by scholars.

If part of the Carr thesis [in “Is Google Making Us Stupid?”] is that we are lazier online, and if efficiency is laziness (more results for less energy expended), then in professional science and scholarship, researchers yearn to be lazy…they want to produce more for less.

Ironically, my research suggests that one of the chief values of print library research is its poor indexing. Poor indexing—indexing by titles and authors, primarily within journals—likely had the unintended consequence of actually helping the integration of science and scholarship. By drawing researchers into a wider array of articles, print browsing and perusal may have facilitated broader comparisons and scholarship.
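To make the measurement concrete, here is a toy version, with an invented handful of citation records standing in for Evans’s 34-million-article database, of the two quantities his study tracked: how many distinct articles get cited, and how recent the cited articles are:

```python
# Toy version of Evans's two measurements on an invented citation dataset:
# (1) how many distinct articles are cited, (2) average age of citations.
# Each record: (citing_year, cited_article_id, cited_year).
citations = [
    (1995, "a1", 1970), (1995, "a2", 1981), (1995, "a3", 1960),
    (1995, "a4", 1990), (2005, "a4", 1990), (2005, "a5", 2003),
    (2005, "a5", 2003), (2005, "a6", 2004),
]

for year in (1995, 2005):
    batch = [c for c in citations if c[0] == year]
    distinct = len({article for _, article, _ in batch})
    avg_age = sum(year - cited for _, _, cited in batch) / len(batch)
    print(f"{year}: {distinct} distinct articles cited, "
          f"average citation age {avg_age:.1f} years")

# In Evans's data the online era looked like the 2005 row: fewer distinct
# articles cited, and those of more recent vintage.
```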

Evans’s study is consistent with the study of researcher behavior conducted by University College London that I cited in my article, which found that online researchers tend to go for “quick wins” through rapid “power browsing.”

When the efficiency ethic moves from the realm of goods production to the realm of intellectual exploration, as it is doing with the Net, we shouldn’t be surprised to find a narrowing rather than a broadening of the field of study. Search engines, after all, are popularity engines that concentrate attention rather than expanding it, and, as Evans notes, efficiency amplifies our native laziness.
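The concentration dynamic is easy to simulate. A toy sketch, with invented numbers: rank three papers by past clicks, let most searchers take the top result, and watch a small early lead become a monopoly on attention:

```python
# Toy feedback loop behind "popularity engines": results are ranked by
# past clicks, searchers favor the top result, the top result gains clicks.
import random

random.seed(0)
clicks = {"paper-A": 10, "paper-B": 9, "paper-C": 8}  # near-even start

for _ in range(1000):
    ranked = sorted(clicks, key=clicks.get, reverse=True)
    # 90% of searchers take the first result; the rest browse further down.
    choice = ranked[0] if random.random() < 0.9 else random.choice(ranked[1:])
    clicks[choice] += 1

print(clicks)  # attention piles onto the early leader
```

After a thousand searches the early leader has soaked up roughly nine-tenths of the clicks. Nothing about its quality changed; the ranking loop did all the work.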

“Is Google Making Us Stupid?”: sources and notes

Since the publication of my essay Is Google Making Us Stupid? in The Atlantic, I’ve received several requests for pointers to sources and related readings. I’ve tried to round them up below.

The essay builds on my book The Big Switch: Rewiring the World, from Edison to Google, particularly the final chapter, “iGod.” The essential theme of both the essay and the book – that our technologies change us, often in ways we can neither anticipate nor control – is one that was frequently, and deeply, discussed during the last century, in books and articles by such thinkers as Lewis Mumford, Eric A. Havelock, J. Z. Young, Marshall McLuhan, and Walter J. Ong.

The screenplay for the film 2001: A Space Odyssey was written by Stanley Kubrick and Arthur C. Clarke. Clarke’s book 2001, a lesser work than the film, was based on the screenplay rather than vice versa.

Scott Karp’s blog post about how he’s lost his capacity to read books can be found here, and Bruce Friedman’s post can be found here. Both Karp and Friedman believe that what they’ve gained from the Internet outweighs what they’ve lost. An overview of the University College London study of the behavior of online researchers, “Information Behaviour of the Researcher of the Future,” is here. Maryanne Wolf’s fascinating Proust and the Squid: The Story and Science of the Reading Brain was published last year by HarperCollins.

I found the story of Friedrich Nietzsche’s typewriter in J. C. Nyíri’s essay Thinking with a Word Processor as well as Friedrich A. Kittler’s winningly idiosyncratic Gramophone, Film, Typewriter and Darren Wershler-Henry’s history of the typewriter, The Iron Whim.

Lewis Mumford discusses the impact of the mechanical clock in his 1934 Technics and Civilization. See also Mumford’s later two-volume study The Myth of the Machine. Joseph Weizenbaum’s Computer Power and Human Reason remains one of the most thoughtful books written about the human implications of computing. Weizenbaum died earlier this year, and I wrote a brief appreciation of him here.

Alan Turing’s 1936 paper on the universal computer was titled On Computable Numbers, with an Application to the Entscheidungsproblem. Tom Bodkin’s explanation of the New York Times’s design changes came in this Slate interview with Jack Shafer.

For Frederick Winslow Taylor’s story, I drew on Robert Kanigel’s biography The One Best Way and Taylor’s own The Principles of Scientific Management.

Eric Schmidt made his comments about Google’s Taylorist goals during the company’s 2006 press day. The Harvard Business Review article on Google, “Reverse Engineering Google’s Innovation Machine,” appeared in the April 2008 issue. Google describes its “mission” here and here. A much lengthier recital of Sergey Brin’s and Larry Page’s comments on Google’s search engine as a form of artificial intelligence, along with sources, can be found at the start of the “iGod” chapter in The Big Switch. Schmidt made his comment about “using technology to solve problems that have never been solved before” at the company’s 2006 analyst day.

I used Neil Postman’s translation of the excerpt from Plato’s Phaedrus, which can be found at the start of Technopoly: The Surrender of Culture to Technology. Walter J. Ong quotes Hieronimo Squarciafico in Orality and Literacy. Clay Shirky’s observation about the printing press was made here.

Richard Foreman’s “pancake people” essay was originally distributed to members of the audience for Foreman’s play The Gods Are Pounding My Head. It was reprinted in Edge. I first noted the essay in my 2005 blog post Beyond Google and Evil.

Net brain syndrome

Discussions of my Atlantic article, Is Google Making Us Stupid?, continue. Edge has been hosting a forum with comments from Danny Hillis, Kevin Kelly, Larry Sanger, George Dyson, Jaron Lanier, and Douglas Rushkoff. This past week the Britannica Blog launched a forum with posts from Clay Shirky, Sven Birkerts, Matthew Battles, and Sanger. I also contributed a reply to Shirky’s piece.

UPDATE: In today’s Sunday Times (of London), Bryan Appleyard contemplates the costs of “chronic distraction.”

The cloud’s not-so-silver lining

At Business Week, Sarah Lacy has a good article on the daunting challenges that software-as-a-service companies face as they try to build vibrant, profitable businesses. Some traditional software powerhouses, like SAP, are spending a lot to develop web versions of their applications, but they have little to show for the investments so far. Pursuing two radically different business models simultaneously, they’re running a race with their legs tied together.

Oracle, for its part, is deliberately moving slowly in shifting to the cloud model, preferring to milk the old, lucrative license-and-maintenance-fee model for as long as possible. Writes Lacy:

[Oracle] has offered a “hosted” version of its software for about 10 years, and CEO Larry Ellison clearly foresaw the on-demand wave, personally funding Salesforce.com and NetSuite. But spreading any kind of on-demand religion throughout his own company is another matter. Nowhere was this more clear than on Oracle’s most recent earnings call. Why isn’t Oracle a bigger player in on-demand software? It doesn’t want to be, Ellison told the analysts and investors. “We’ve been in this business 10 years, and we’ve only now turned a profit,” he said. “The last thing we want to do is have a very large business that’s not profitable and drags our margins down.” No, Ellison would rather enjoy the bounty of an acquisition spree that handed Oracle a bevy of software companies, hordes of customers, and associated maintenance fees that trickle straight to the bottom line.

More evidence of the challenges came yesterday with the announcement of Microsoft’s disappointing profits for the last quarter, attributable at least in part to the weak results of its online services business. The company has been spending billions building big utility data centers, but the revenues generated by all that capital investment remain paltry.

Anyone who thinks the software-as-a-service business is a gold mine for vendors is wrong. The economics are fundamentally different from those of the traditional software business – and not in a good way. As Lacy writes, the Web is “just as good at displacing revenue as it is in generating sources of it. Just ask the music industry or, ahem, print media. Think Robin Hood, taking riches from the elite and distributing them to everyone else, including the customers who get to keep more of their money and the upstarts that can more easily build competing alternatives.” Web apps remain a hard sell when it comes to big, conservative enterprises, and the capital and marketing costs are daunting, particularly if you’re running your own data centers. This revolution in business software will play out slowly and, for most suppliers, painfully.
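The difference shows up in back-of-the-envelope arithmetic. A sketch with invented figures: a perpetual license is booked up front, with maintenance fees stacked on top, while a subscription trickles in over years, during which the vendor is also paying to run the data centers:

```python
# Back-of-the-envelope comparison (all figures invented for illustration):
# perpetual license vs. subscription revenue from one customer over 5 years.
license_fee = 100_000           # paid up front
maintenance_rate = 0.20         # 20% of license, per year, from year 2
subscription_per_year = 30_000  # paid as it's used; customer can walk away

for year in range(1, 6):
    lic = license_fee + maintenance_rate * license_fee * (year - 1)
    sub = subscription_per_year * year
    print(f"year {year}: license+maintenance ${lic:>9,.0f}  "
          f"subscription ${sub:>9,.0f}")

# The license vendor books most of the money immediately; the subscription
# vendor needs years of renewals -- and bears hosting costs throughout.
```

Five years on, the subscription vendor still hasn’t matched the license vendor’s take from the same customer, and the customer can cancel at any point. Hence Ellison’s reluctance to trade one model for the other.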

So far the smartest players appear to be Ellison and his former protégé, Marc Benioff of Salesforce.com. The unsentimental Ellison will wait until the profits from traditional software begin to decay, and then will buy his way into the software-as-a-service business, cherry-picking attractive suppliers. Benioff wisely chose the right target for his initial web app – sales force automation, or CRM, which had become an advertisement for the flaws of large-scale enterprise software – and has built his business over the course of a decade through steady technical improvements and relentless marketing (aimed at both customers and investors).

“On-demand software,” Lacy writes, “has turned out to be a brutal slog.” Don’t expect it to get easier anytime soon. Success will come to a few smart, tenacious companies, but it will be hard-won.