Exodus

Has it begun?

James Sturm, the cartoonist, can’t take it anymore, “it” being the Internet:

Over the last several years, the Internet has evolved from being a distraction to something that feels more sinister. Even when I am away from the computer I am aware that I AM AWAY FROM MY COMPUTER and am scheming about how to GET BACK ON THE COMPUTER. I’ve tried various strategies to limit my time online: leaving my laptop at my studio when I go home, leaving it at home when I go to my studio, a Saturday moratorium on usage. But nothing has worked for long. More and more hours of my life evaporate in front of YouTube … Essential online communication has given way to hours of compulsive e-mail checking and Web surfing. The Internet has made me a slave to my vanity: I monitor the Amazon ranking of my books on an hourly basis, and I’m constantly searching for comments and discussions about my work.

He’s not quite ready to divorce the web. But he’s decided on a four-month trial separation. Like Edan Lepucki, he’s having someone visit his online accounts and change all his passwords, just to be safe.

I know there’s no going back to the pre-Internet days, but I just want to move forward a little more slowly.

Disconnection is the new counterculture.

UPDATE: There’s an amusing exchange in the comments on Sturm’s article at Slate.


The iPad Luddites

Is it possible for a Geek God to also be a Luddite? That was the question that popped into my head as I read Cory Doctorow’s impassioned anti-iPad diatribe at Boing Boing. The device that Apple calls “magical” and “revolutionary” is, to Doctorow, a counterrevolutionary contraption conjured up through the black magic of the wizards at One Infinite Loop. The locked-down, self-contained design of the iPad – nary a USB port in sight, and don’t even think about loading an app that hasn’t been blessed by Apple – manifests “a palpable contempt for the owner,” writes Doctorow. You can’t fiddle with the dang thing:

The original Apple ][+ came with schematics for the circuit boards, and birthed a generation of hardware and software hackers who upended the world for the better. If you wanted your kid to grow up to be a confident, entrepreneurial, and firmly in the camp that believes that you should forever be rearranging the world to make it better, you bought her an Apple ][+ …

The way you improve your iPad isn’t to figure out how it works and make it better. The way you improve the iPad is to buy iApps. Buying an iPad for your kids isn’t a means of jump-starting the realization that the world is yours to take apart and reassemble; it’s a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.

Doctorow is not the only Geek God who’s uncomfortable with Apple’s transformation of the good ole hacktastic PC into a sleek, slick, sterile appliance. Many have accused Apple of removing from the personal computer not only its openness and open-endedness but also what Jonathan Zittrain, co-founder of Harvard’s Berkman Center for Internet & Society, calls its “generativity” – its capacity for encouraging and abetting creative work by its users. In criticizing the closed nature of the iPhone, from which the iPad borrows its operating system, Zittrain, like Doctorow, invoked the ancient, beloved Apple II: “a clean slate, a device built – boldly – with no specific tasks in mind.”

Tim Bray, the venerated programmer who recently joined Google, worries that the iPad, which is specifically designed to optimize a few tasks and cripple others, could lead to “a very nasty future scenario”:

At the moment, more or less any personal computer, given enough memory, can be used for ‘creative’ applications like photo editors and IDEs (and, for pedal-to-the-metal money people, big spreadsheets). If memory-starved tablets become ubiquitous, we’re looking at a future in which there are “normal” computers, and then “special” computers for creative people … I dislike this future not just for personal but for ideological reasons; I’m deeply bought-into the notion of a Web populated by devices that almost anyone can afford and on which anyone can be creative, if they want.

What these folks are ranting against, or at least gnashing their teeth over, is progress – or, more precisely, progress that goes down a path they don’t approve of. They want progress to, as Bray admits, follow their own ideological bent, and when it takes a turn they don’t like they start grumbling like granddads, yearning for the days of their idealized Apple IIs, when men were men and computers were computers.

If Ned Ludd had been a blogger, he would have written a post similar to Doctorow’s about those newfangled locked-down mechanical looms that distance the weaver from the machine’s workings, requiring the weaver to follow the programs devised by the looms’ manufacturer. The design of the mechanical loom, Ned would have told us, exhibits a palpable contempt for the user. It takes the generativity out of weaving.

And Ned would have been right.

I have a lot of sympathy for the point of view expressed by Doctorow, Zittrain, Bray, and others of their ilk. The iPad, for all its glitzy technical virtuosity, does feel like a step backwards from the Apple II and its progeny. Hell, I still haven’t gotten over Apple’s removal of analog RCA plugs for audio and video input and output from the back of its Macs. Give me a beige box with easily accessible innards, a big rack of RAM, and a dozen or so ports, and I’m a happy camper.

But I’m not under any illusion that progress gives a damn about what I want. While progress may be spurred by the hobbyist, it does not share the hobbyist’s ethic. One of the keynotes of technological advance is its tendency, as it refines a tool, to remove real human agency from the workings of that tool. In its place, we get an abstraction of human agency that represents the general desires of the masses as deciphered, or imposed, by the manufacturer and the marketer. Indeed, what tends to distinguish the advanced device from the primitive device is the absence of “generativity.” It’s useful to remember that the earliest radios were broadcasting devices as well as listening devices and that the earliest phonographs could be used for recording as well as playback. But as these machines progressed, along with the media systems in which they became embedded, they turned into streamlined, single-purpose entertainment boxes, suitable for living rooms. What Bray fears – the divergence of the creative device from the mass-market device – happened, and happened quickly and without much, if any, resistance.

Progress may, for a time, intersect with one’s own personal ideology, and during that period one will become a gung-ho technological progressivist. But that’s just coincidence. In the end, progress doesn’t care about ideology. Those who think of themselves as great fans of progress, of technology’s inexorable march forward, will change their tune as soon as progress destroys something they care deeply about. “We love the things we love for what they are,” wrote Robert Frost. And when those things change we rage against the changes. Passion turns us all into primitivists.

Digital decay and the archival cloud

Throughout human history, the documentation of events and thoughts usually required a good deal of time and effort. Somebody had to sit down with a stylus or a pen or, later, a typewriter or a tape recorder, and make a deliberate recording. That happened only rarely. Most events and thoughts vanished from memory, individual and collective, soon after they occurred. If they were described or discussed at all, it was usually in conversation, face to face or over a phone line, and the words evaporated as they were spoken.

That’s all changed now. Thanks to digital tools, media, and networks, recording is easy, cheap, and often automatic. Hard drives, flash drives, CDs, DVDs, and other storage devices brim with audio, video, photographic, and textual recordings. Evidence of even the most trivial of events and thoughts, communicated through texts, posts, status updates, and tweets, is retained in the data centers of the companies that operate popular Internet sites and services.

We live, it seems, in a golden age of documentation. But that’s not quite true. The problem with making a task cheap and effortless is that the results of that task come to be taken for granted. You care about work that’s difficult and expensive, and you want to preserve its product; you don’t pay much attention to the things that happen automatically and at little or no cost. In “Avoiding a Digital Dark Age,” an article appearing in the new issue of American Scientist, Kurt Bollacker, of the Long Now Foundation, expertly describes the conundrum of digital recording: everything’s documented, but the documents don’t last. The problem stems from the fact that, with digital recordings, we have to preserve not only the data itself but also the devices and techniques used to read the data and render it in a form we can understand. As Bollacker writes:

With most analog technologies such as photographic prints and paper text documents, one can look directly at the medium to access the information. With all digital media, a machine and software are required to read and translate the data into a human-observable and comprehensible form. If the machine or software is lost, the data are likely to be unavailable or, effectively, lost as well.

The problem is magnified by the speed with which old digital media and recording techniques, including devices and software, are replaced by new ones. It’s further magnified by the fact that even modest damage to a digital recording can render that recording useless (as anyone who has scratched a CD or DVD knows). In contrast, damage to an analog recording – a scratch in a vinyl record, a torn page in a book – may be troublesome and annoying, but it rarely renders the recording useless. You can still listen to a scratched record, and you can still read a book with a missing page. Analog recordings are generally more robust than digital ones. As Bollacker explains, history reveals a clear and continuing trend: “new media types tend to have shorter lifespans than older ones, and digital types have shorter lifespans than analog ones.” The lifespan of a stone tablet was measured in centuries or millennia; the lifespan of a magnetic tape or a hard drive is measured in years or, if you’re very lucky, decades.

After describing the problem, Bollacker goes on to provide a series of suggestions for how digital recordings could be made more robust. The suggestions include applying better error correction algorithms when recording data and being more thoughtful about the digital formats and recording techniques we use. None of the recommendations would be particularly difficult to carry out. What’s required more than anything else is that people come to care about the problem. Apathy remains the biggest challenge in combating digital decay.
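To make the flavor of those suggestions concrete, here is a minimal sketch in Python of the simplest error-correcting scheme there is, a triple-repetition code with majority-vote decoding. (This is my own illustration, not Bollacker’s; real archival systems use far more space-efficient codes, such as Reed-Solomon, but the principle of structured redundancy is the same.)

```python
def encode(bits):
    """Triple-repetition code: store each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each group of three recovers the original
    bit even if one copy in the group has been corrupted."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1]
stored = encode(data)
stored[4] ^= 1  # simulate a flipped bit of media damage
assert decode(stored) == data  # the damage is silently repaired
```

Tripling the storage cost buys tolerance of a flipped bit in every group of three – exactly the sort of deliberate trade-off Bollacker wants us to start making.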

But there’s a new wrinkle to this story, and it’s one that Bollacker doesn’t address in his article: the cloud. Up to now, one characteristic of digital recordings has provided an important counterweight to the fragility of digital media – what Bollacker refers to as “data promiscuity.” Because it’s easy to make copies of digital files, we’ve tended to make a lot of them, and the proliferation of perfect digital copies has served as a safeguard against the loss of data. An MP3 of even a moderately popular song will, for instance, exist on many thousands of computer hard drives as well as on many thousands of iPods, CDs, and other media. The more copies that are made of a recording, and the more widely the copies are dispersed, the more durable that recording becomes.
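The arithmetic behind that safeguard is worth spelling out. On the simplifying assumption that each of n copies is lost independently, with probability q over some period, the chance that a recording vanishes entirely falls off exponentially with the number of copies:

```latex
P(\text{total loss}) = q^{n}
```

With q = 0.5 and n = 20 dispersed copies, total loss is roughly a one-in-a-million event; with a single copy, it’s a coin flip. (In reality losses are correlated – a popular format dies, a company fails – which is why dispersal across different media, owners, and locations matters as much as the raw count.)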

By centralizing the storage of digital information, cloud computing promises to dramatically reduce data promiscuity. When all of us are able to, in effect, share a copy of a digital file, whether a song or a video or a book, then we don’t need to make our own copies of that file. Cloud computing replaces the download with the stream, and that means that, as people come to use the cloud as their default data store, we’ll have fewer copies of files and hence less of the protection that multiple copies provide. Indeed, in the ultimate form of cloud computing, you’d need only a single copy of any digital recording.

Apple’s new iPad, which arrived with much fanfare over the weekend, provides a good example of where computing is heading. The iPad is much more of a player than a recorder. It has a much smaller storage capacity than traditional desktops and laptops, because it’s designed on the assumption that more and more of what we do with computers will involve streaming data over the Net rather than storing it on our devices. The iPad manifests a large and rapidly accelerating trend away from local, redundant storage and toward central storage. In fact, I’d bet that if you charted the average disk size of personal computers, including smartphones, netbooks, and tablets as well as laptops and desktops, you would discover that in recent years it has shrunk, marking a sea change in the history of personal computing. An enormous amount of digital copying and local storage still goes on, of course, but the trend is clear. Streaming will continue to replace downloading, and the number of copies of digital recordings will decline.

The big cloud computing companies take the safeguarding of data very seriously, of course. For them, loss of data means loss of business, and catastrophic data loss means catastrophic business loss. A company like Google stores copies of its files in many locations, and it takes elaborate steps to protect its data centers and systems. Nevertheless, one can envision a future scenario (unlikely but not impossible) involving a catastrophe – natural, technological, political, or even commercial – that obliterates a cloud operator’s store of data. More prosaically, companies go out of business, change hands, and alter their strategies and priorities. They may not always care that much about data that once seemed very important, particularly data that has lost its commercial value. A business exists to make money, not to run an archive in perpetuity. Seen in this light, our embrace of the cloud may have the unintended effect of making digital recordings even more fragile, especially over the long run.

As digital recordings displace physical ones, the risks expand. Think about books. Google’s effort to scan every physical book ever published into its database has been compared to the creation of the great library of Alexandria. Should Google (or another organization) succeed in creating an easy-to-use, universally available store of digital books, we might well become dependent on that store – and take it for granted. We would stream books as we today stream videos. In time, we would find fewer and fewer reasons to maintain our own digital copies of books inside our devices; we would keep our e-books in the cloud. We would also find it increasingly hard to justify the cost of keeping physical copies of books, particularly old ones, on shelves, either in our homes or in libraries.

At that point, if we hadn’t been very, very careful in how we developed and maintained our great cloud library, we would be left with few safeguards in the event that, for whatever foreseeable or unforeseeable reason, that library was compromised or ceased to function. We all know what happened to the library of Alexandria.

The post-book book

The iPad’s iBooks application may or may not become our e-reader of choice – even uber-fanboy David Pogue seems a mite skeptical this morning – but the model of book reading (and hence book writing) the iPad promotes seems fated, in time, to become the dominant one. The book itself, in this model, becomes an app, a multihypermediated experience to click through rather than a simple sequence of pages to read through. To compete with the iPad, Amazon’s Kindle, the current top-selling e-reader, will no doubt add more bells and whistles to its suddenly tired-seeming interface. Already, Amazon has announced it will open an app store for the Kindle later this year. “People don’t read anymore,” Steve Jobs famously said, and the iPad emanates from that assumption.

John Makinson, the CEO of publishing giant Penguin Books, is thrilled about the iPad’s potential to refresh his company’s product line. “The definition of the book itself seems up for grabs,” he said at a recent media industry powwow. Unlike traditional e-book readers, which had a rather old-fashioned attachment to linear text, the iPad opens the doors to incorporating all sorts of “cool stuff,” Makinson continued. “We will be embedding audio, video and streaming into everything we do.” He foresees sprinkling movie clips among Jane Austen’s paragraphs in future editions of “Pride and Prejudice.” No need to conjure up a picture of Lizzie Bennet in your own mind; there’s Keira Knightley stomping through the grounds of Netherfield, cute as a mouse button.

Makinson gave a preview of the post-book book, and it looks unsurprisingly toylike.

A sentence from The Shallows may be pertinent here: “When a printed book is transferred to an electronic device connected to the Internet, it turns into something very like a Web site.” Makinson’s presentation leads Peta Jinnath Andersen, of PopMatters, to ask, “What makes a book a book?” A book, she concludes, is just “a delivery system” for text, and one delivery system is as good as another: “How the words are delivered doesn’t matter.” A stone tablet is a scroll is a wax tablet is a scribal codex is a printed book is a Kindle is an iPad. And yet history shows us that each change in the physical form of the written word was accompanied by a change – often a profound one – in reading and writing habits. If the delivery system mattered so much in the past, are we really to believe that it won’t matter in the future?

Jobs is no dummy. As a text delivery system, the iPad is perfectly suited to readers who don’t read anymore.

Greenpeace raids the cloud

In late 2006, I wrote a post about the energy consumption of modern computing plants, in which I made a prediction:

As soon as activists, and the public in general, begin to understand how much electricity is wasted by computing and communication systems – and the consequences of that waste for the environment and in particular global warming – they’ll begin demanding that the makers and users of information technology improve efficiency dramatically. Greenpeace and its rainbow warriors will soon storm the data center – your data center.

Soon is now. Today, Greenpeace issued a report on “cloud computing and its contribution to climate change,” in which it specifically targets big cloud operators like Google, Amazon, Apple, Facebook, Salesforce.com, and Microsoft. The report is timed to coincide with the launch of Apple’s iPad, an event that underscores just how dramatically personal computing has changed, and expanded, over the last few years. Many of us now own a slew of computers in various forms – desktops, laptops, smartphones, iPods, tablets, e-readers, gaming consoles – that don’t just suck up electricity themselves but are connected to the vast cloud grid that also consumes enormous amounts of energy. Drawing mainly on a 2008 analysis by the Climate Group and the Global e-Sustainability Initiative, Greenpeace predicts that the electricity consumed by the cloud – defined as both Internet data centers and the communications network that connects all of us to those centers – will rise from 623 billion kWh in 2007 to 1,964 billion kWh in 2020.
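It’s worth doing the arithmetic on those two endpoints. A quick sketch (the growth-rate calculation is mine, not the report’s):

```python
# Implied compound annual growth rate of the cloud's electricity use,
# from the Greenpeace report's endpoints: 623 billion kWh (2007)
# to 1,964 billion kWh (2020).
start, end, years = 623, 1964, 2020 - 2007
cagr = (end / start) ** (1 / years) - 1
print(f"implied growth: {cagr:.1%} per year")  # roughly 9.2%
```

In other words, the forecast has the cloud’s electricity draw more than tripling in thirteen years, compounding at around nine percent annually.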

The rise of cloud computing is a two-edged sword when it comes to energy consumption and related carbon emissions. On the one hand, since electricity is a critical component of the cost of running a cloud operation, major cloud computing providers like Google and Microsoft have a big economic incentive to become more energy efficient, and they have been admirably aggressive in pioneering technologies that reduce energy use. The energy-conserving equipment, designs, and processes that the cloud giants invent should in time spread throughout the information technology industry, making computing in general much more energy efficient. On the other hand, the free data and services supplied through the cloud are rapidly expanding the scope and attractiveness of computing: people use computers, particularly Internet-connected computers, much more than in the past. So even as computing becomes more efficient per unit of output, the dramatic expansion in its use means that, in absolute terms, it is sucking up much more electricity than ever before – a trend that promises to accelerate pretty much indefinitely.
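A toy calculation makes the dynamic plain. The numbers below are purely illustrative – they come from neither the Greenpeace report nor any other source – but they show how steady efficiency gains can be swamped by faster growth in use:

```python
# Hypothetical figures: energy per unit of computing falls 15% a year,
# while total computing output grows 40% a year.
efficiency_gain, output_growth = 0.85, 1.40
energy = 1.0
for year in range(1, 6):
    energy *= efficiency_gain * output_growth  # net factor of ~1.19/year
    print(f"year {year}: {energy:.2f}x baseline electricity use")
# After five years, consumption is up nearly 2.4x despite the
# continuous per-unit efficiency improvements.
```

This is the familiar rebound effect: efficiency lowers the effective cost of computing, and demand more than makes up the difference.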

What that means is that, as the Greenpeace report makes clear, both the economic and the political stakes involved in mitigating the environmental impact of the cloud will increase. Greenpeace argues that what’s important is not only the efficiency of data centers but the sources of the power they use. The heavenly cloud, it turns out, runs largely on earthbound coal. In this regard, the report singles out Facebook for criticism:

Facebook’s decision to build its own highly-efficient data centre in Oregon that will be substantially powered by coal-fired electricity clearly underscores the relative priority for many cloud companies. Increasing the energy efficiency of its servers and reducing the energy footprint of the infrastructure of data centres are clearly to be commended, but efficiency by itself is not green if you are simply working to maximise output from the cheapest and dirtiest energy source available.

Greenpeace also links Apple’s decision to locate a huge cloud data center in North Carolina to that state’s cheap electricity supplies, which come mainly from coal-fired plants. Other companies, including Google, also run big data center operations in the Carolinas. Noting that the IT industry “holds many of the keys to reaching our climate goals,” Greenpeace says that it is pursuing a “Cool IT Campaign” that is intended to pressure the industry to “put forward solutions to achieve economy-wide greenhouse gas emissions reductions and to be strong advocates for policies that combat climate change and increase the use of renewable energy.”

The Greenpeace action promises to intensify the public’s focus on the cloud’s environmental shadow. But while Greenpeace’s main target appears to be the big cloud providers, its report also suggests, if only in passing, that the devices that all of us use to connect to the cloud actually consume more energy than the cloud itself. Those of us who spend a large proportion of our waking hours peering into multiple computer screens can’t offload responsibility for the environmental consequences of our habits to companies like Google and Facebook. The cloud, after all, exists for us.

The Shallows at SXSW

I will be reading from my forthcoming book, The Shallows, a week from today at the South by Southwest conference in Austin. The reading is scheduled to take place on March 16 at 11:30 am on the Day Stage. If you are in the neighborhood, and are properly badged, please stop by.