Monthly Archives: April 2010

The iPad Luddites

Is it possible for a Geek God to also be a Luddite? That was the question that popped into my head as I read Cory Doctorow’s impassioned anti-iPad diatribe at Boing Boing. The device that Apple calls “magical” and “revolutionary” is, to Doctorow, a counterrevolutionary contraption conjured up through the black magic of the wizards at One Infinite Loop. The locked-down, self-contained design of the iPad – nary a USB port in sight, and don’t even think about loading an app that hasn’t been blessed by Apple – manifests “a palpable contempt for the owner,” writes Doctorow. You can’t fiddle with the dang thing:

The original Apple ][+ came with schematics for the circuit boards, and birthed a generation of hardware and software hackers who upended the world for the better. If you wanted your kid to grow up to be a confident, entrepreneurial, and firmly in the camp that believes that you should forever be rearranging the world to make it better, you bought her an Apple ][+ …

The way you improve your iPad isn’t to figure out how it works and making it better. The way you improve the iPad is to buy iApps. Buying an iPad for your kids isn’t a means of jump-starting the realization that the world is yours to take apart and reassemble; it’s a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.

Doctorow is not the only Geek God who’s uncomfortable with Apple’s transformation of the good ole hacktastic PC into a sleek, slick, sterile appliance. Many have accused Apple of removing from the personal computer not only its openness and open-endedness but also what Jonathan Zittrain, founder of Harvard’s Berkman Center for Internet & Society, calls its “generativity” – its capacity for encouraging and abetting creative work by its users. In criticizing the closed nature of the iPhone, from which the iPad borrows its operating system, Zittrain, like Doctorow, invoked the ancient, beloved Apple II: “a clean slate, a device built – boldly – with no specific tasks in mind.”

Tim Bray, the venerated programmer who recently joined Google, worries that the iPad, which is specifically designed to optimize a few tasks and cripple others, could lead to “a very nasty future scenario”:

At the moment, more or less any personal computer, given enough memory, can be used for ‘creative’ applications like photo editors and IDEs (and, for pedal-to-the-metal money people, big spreadsheets). If memory-starved tablets become ubiquitous, we’re looking at a future in which there are “normal” computers, and then “special” computers for creative people … I dislike this future not just for personal but for ideological reasons; I’m deeply bought-into the notion of a Web populated by devices that almost anyone can afford and on which anyone can be creative, if they want.

What these folks are ranting against, or at least gnashing their teeth over, is progress – or, more precisely, progress that goes down a path they don’t approve of. They want progress to, as Bray admits, follow their own ideological bent, and when it takes a turn they don’t like they start grumbling like granddads, yearning for the days of their idealized Apple IIs, when men were men and computers were computers.

If Ned Ludd had been a blogger, he would have written a post similar to Doctorow’s about those newfangled locked-down mechanical looms that distance the weaver from the machine’s workings, requiring the weaver to follow the programs devised by the looms’ manufacturer. The design of the mechanical loom, Ned would have told us, exhibits a palpable contempt for the user. It takes the generativity out of weaving.

And Ned would have been right.

I have a lot of sympathy for the point of view expressed by Doctorow, Zittrain, Bray, and others of their ilk. The iPad, for all its glitzy technical virtuosity, does feel like a step backwards from the Apple II and its progeny. Hell, I still haven’t gotten over Apple’s removal of analog RCA plugs for audio and video input and output from the back of its Macs. Give me a beige box with easily accessible innards, a big rack of RAM, and a dozen or so ports, and I’m a happy camper.

But I’m not under any illusion that progress gives a damn about what I want. While progress may be spurred by the hobbyist, it does not share the hobbyist’s ethic. One of the keynotes of technological advance is its tendency, as it refines a tool, to remove real human agency from the workings of that tool. In its place, we get an abstraction of human agency that represents the general desires of the masses as deciphered, or imposed, by the manufacturer and the marketer. Indeed, what tends to distinguish the advanced device from the primitive device is the absence of “generativity.” It’s useful to remember that the earliest radios were broadcasting devices as well as listening devices and that the earliest phonographs could be used for recording as well as playback. But as these machines progressed, along with the media systems in which they became embedded, they turned into streamlined, single-purpose entertainment boxes, suitable for living rooms. What Bray fears – the divergence of the creative device from the mass-market device – happened, and happened quickly and without much, if any, resistance.

Progress may, for a time, intersect with one’s own personal ideology, and during that period one will become a gung-ho technological progressivist. But that’s just coincidence. In the end, progress doesn’t care about ideology. Those who think of themselves as great fans of progress, of technology’s inexorable march forward, will change their tune as soon as progress destroys something they care deeply about. “We love the things we love for what they are,” wrote Robert Frost. And when those things change we rage against the changes. Passion turns us all into primitivists.

Digital decay and the archival cloud

Throughout human history, the documentation of events and thoughts usually required a good deal of time and effort. Somebody had to sit down with a stylus or a pen or, later, a typewriter or a tape recorder, and make a deliberate recording. That happened only rarely. Most events and thoughts vanished from memory, individual and collective, soon after they occurred. If they were described or discussed at all, it was usually in conversation, face to face or over a phone line, and the words evaporated as they were spoken.

That’s all changed now. Thanks to digital tools, media, and networks, recording is easy, cheap, and often automatic. Hard drives, flash drives, CDs, DVDs, and other storage devices brim with audio, video, photographic, and textual recordings. Evidence of even the most trivial of events and thoughts, communicated through texts, posts, status updates, and tweets, is retained in the data centers of the companies that operate popular Internet sites and services.

We live, it seems, in a golden age of documentation. But that’s not quite true. The problem with making a task cheap and effortless is that the results of that task come to be taken for granted. You care about work that’s difficult and expensive, and you want to preserve its product; you don’t pay much attention to the things that happen automatically and at little or no cost. In “Avoiding a Digital Dark Age,” an article appearing in the new edition of American Scientist, Kurt Bollacker, of the Long Now Foundation, expertly describes the conundrum of digital recording: everything’s documented, but the documents don’t last. The problem stems from the fact that, with digital recordings, we have to preserve not only the data itself but also the devices and techniques used to read the data and output it in a form we can understand. As Bollacker writes:

With most analog technologies such as photographic prints and paper text documents, one can look directly at the medium to access the information. With all digital media, a machine and software are required to read and translate the data into a human-observable and comprehensible form. If the machine or software is lost, the data are likely to be unavailable or, effectively, lost as well.

The problem is magnified by the speed with which old digital media and recording techniques, including devices and software, are replaced by new ones. It’s further magnified by the fact that even modest damage to a digital recording can render that recording useless (as anyone who has scratched a CD or DVD knows). In contrast, damage to an analog recording – a scratch in a vinyl record, a torn page in a book – may be troublesome and annoying, but it rarely renders the recording useless. You can still listen to a scratched record, and you can still read a book with a missing page. Analog recordings are generally more robust than digital ones. As Bollacker explains, history reveals a clear and continuing trend: “new media types tend to have shorter lifespans than older ones, and digital types have shorter lifespans than analog ones.” The lifespan of a stone tablet was measured in centuries or millennia; the lifespan of a magnetic tape or a hard drive is measured in years or, if you’re very lucky, decades.
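The fragility Bollacker describes is easy to demonstrate. In the sketch below (my own illustration, not from his article), a single flipped bit in a zlib-compressed file renders the whole file unreadable, while the same one-bit flip in the uncompressed text garbles only a single character — the digital equivalent of a scratched CD versus a scratched LP:

```python
import zlib

text = b"The quick brown fox jumps over the lazy dog. " * 100
compressed = bytearray(zlib.compress(text))

# Flip a single bit in the middle of the compressed stream.
compressed[len(compressed) // 2] ^= 0x01

try:
    zlib.decompress(bytes(compressed))
    print("decompressed despite the damage")
except zlib.error as err:
    # The entire recording is lost to a one-bit "scratch."
    print("unreadable:", err)

# The same one-bit flip in the raw text damages exactly one byte.
damaged = bytearray(text)
damaged[len(damaged) // 2] ^= 0x01
intact = sum(a == b for a, b in zip(text, damaged))
print(f"{intact} of {len(text)} bytes still readable")
```

The compressed stream fails its integrity check and yields nothing; the analog-like raw copy loses one character out of 4,500.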

After describing the problem, Bollacker goes on to provide a series of suggestions for how digital recordings could be made more robust. The suggestions include applying better error correction algorithms when recording data and being more thoughtful about the digital formats and recording techniques we use. None of the recommendations would be particularly difficult to carry out. What’s required more than anything else is that people come to care about the problem. Apathy remains the biggest challenge in combating digital decay.
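Bollacker describes error correction only in general terms, but the underlying idea — adding redundancy so that damage can be detected and repaired — can be sketched in a few lines. The toy example below uses a crude threefold repetition code with bitwise majority voting; real archival systems use far more space-efficient codes (Reed–Solomon, for instance), but the principle is the same:

```python
def encode(data: bytes) -> list[bytes]:
    """Store three full copies of the data."""
    return [data, data, data]

def decode(copies: list[bytes]) -> bytes:
    """Recover the data by bitwise majority vote across the three copies."""
    a, b, c = copies
    return bytes((x & y) | (x & z) | (y & z) for x, y, z in zip(a, b, c))

original = b"Avoiding a Digital Dark Age"
copies = [bytearray(c) for c in encode(original)]

# Badly damage one copy: invert every bit in several of its bytes.
for i in (0, 5, 12):
    copies[0][i] ^= 0xFF

recovered = decode([bytes(c) for c in copies])
assert recovered == original  # the two undamaged copies outvote the bad one
```

As long as any two copies agree at each bit position, the vote reconstructs the original exactly — which is why even this wasteful scheme survives the total corruption of one copy.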

But there’s a new wrinkle to this story, and it’s one that Bollacker doesn’t address in his article: the cloud. Up to now, there has been one characteristic of digital recordings that has provided an important counterweight to the fragility of digital media – it’s what Bollacker refers to as “data promiscuity.” Because it’s easy to make copies of digital files, we’ve tended to make a lot of them. The proliferation of perfect digital copies has provided an important safeguard against the loss of data. An MP3 of even a moderately popular song will, for instance, exist on many thousands of computer hard drives as well as on many thousands of iPods, CDs, and other media. The more copies that are made of a recording, and the more widely the copies are dispersed, the more durable that recording becomes.
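The protective effect of data promiscuity is simple probability: if each copy has some independent chance of being lost, the chance that every copy disappears falls off exponentially with the number of copies. A back-of-the-envelope sketch (the 10 percent loss rate is an arbitrary assumption, chosen only for illustration):

```python
# Assume each independent copy has a 10% chance of being lost
# over some period; the chance that ALL n copies are lost is p**n.
p_loss = 0.10

for n in (1, 2, 5, 10):
    print(f"{n:>2} copies -> chance of total loss: {p_loss ** n:.10f}")
```

With one copy, a recording has a 10 percent chance of vanishing; with ten independent copies, the odds drop to one in ten billion. Centralize those ten copies into one, and the exponent disappears.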

By centralizing the storage of digital information, cloud computing promises to dramatically reduce data promiscuity. When all of us are able to, in effect, share a copy of a digital file, whether a song or a video or a book, then we don’t need to make our own copies of that file. Cloud computing replaces the download with the stream, and that means that, as people come to use the cloud as their default data store, we’ll have fewer copies of files and hence less of the protection that multiple copies provide. Indeed, in the ultimate form of cloud computing, you’d need only a single copy of any digital recording.

Apple’s new iPad, which arrived with much fanfare over the weekend, provides a good example of where computing is heading. The iPad is much more of a player than a recorder. It has a much smaller storage capacity than traditional desktops and laptops, because it’s designed on the assumption that more and more of what we do with computers will involve streaming data over the Net rather than storing it on our devices. The iPad manifests a large and rapidly accelerating trend away from local, redundant storage and toward central storage. In fact, I’d bet that if you charted the average disk size of personal computers, including smartphones, netbooks, and tablets as well as laptops and desktops, you would discover that in recent years it has shrunk, marking a sea change in the history of personal computing. An enormous amount of digital copying and local storage still goes on, of course, but the trend is clear. Streaming will continue to replace downloading, and the number of copies of digital recordings will decline.

The big cloud computing companies take the safeguarding of data very seriously, of course. For them, loss of data means loss of business, and catastrophic data loss means catastrophic business loss. A company like Google stores copies of its files in many locations, and it takes elaborate steps to protect its data centers and systems. Nevertheless, one can envision a future scenario (unlikely but not impossible) involving a catastrophe – natural, technological, political, or even commercial – that obliterates a cloud operator’s store of data. More prosaically, companies go out of business, change hands, and alter their strategies and priorities. They may not always care that much about data that once seemed very important, particularly data that has lost its commercial value. A business exists to make money, not to run an archive in perpetuity. Seen in this light, our embrace of the cloud may have the unintended effect of making digital recordings even more fragile, especially over the long run.

As digital recordings displace physical ones, the risks expand. Think about books. Google’s effort to scan every physical book ever published into its database has been compared to the creation of the great library of Alexandria. Should Google (or another organization) succeed in creating an easy-to-use, universally available store of digital books, we might well become dependent on that store – and take it for granted. We would stream books as we today stream videos. In time, we would find fewer and fewer reasons to maintain our own digital copies of books inside our devices; we would keep our e-books in the cloud. We would also find it increasingly hard to justify the cost of keeping physical copies of books, particularly old ones, on shelves, either in our homes or in libraries.

At that point, if we hadn’t been very, very careful in how we developed and maintained our great cloud library, we would be left with few safeguards in the event that, for whatever foreseeable or unforeseeable reason, that library was compromised or ceased to function. We all know what happened to the library of Alexandria.

The post-book book

The iPad’s iBooks application may or may not become our e-reader of choice – even uber-fanboy David Pogue seems a mite skeptical this morning – but the model of book reading (and hence book writing) the iPad promotes seems fated, in time, to become the dominant one. The book itself, in this model, becomes an app, a multihypermediated experience to click through rather than a simple sequence of pages to read through. To compete with the iPad, the current top-selling e-reader, Amazon’s Kindle, will no doubt be adding more bells and whistles to its suddenly tired-seeming interface. Already, Amazon has announced it will be opening an app store for the Kindle later this year. “People don’t read anymore,” Steve Jobs famously said, and the iPad emanates from that assumption.

John Makinson, the CEO of publishing giant Penguin Books, is thrilled about the iPad’s potential to refresh his company’s product line. “The definition of the book itself seems up for grabs,” he said at a recent media industry powwow. Unlike traditional e-book readers, which had a rather old-fashioned attachment to linear text, the iPad opens the doors to incorporating all sorts of “cool stuff,” Makinson continued. “We will be embedding audio, video and streaming into everything we do.” He foresees sprinkling movie clips among Jane Austen’s paragraphs in future editions of “Pride and Prejudice.” No need to conjure up a picture of Lizzie Bennet in your own mind; there’s Keira Knightley stomping through the grounds of Netherfield, cute as a mouse button.

Makinson gave a preview of the post-book book, which seems unsurprisingly toylike.

A sentence from “The Shallows” may be pertinent here: “When a printed book is transferred to an electronic device connected to the Internet, it turns into something very like a Web site.” Makinson’s presentation leads Peta Jinnath Andersen, of PopMatters, to ask, “What makes a book a book?” A book, she concludes, is just “a delivery system” for text, and one delivery system is as good as another: “How the words are delivered doesn’t matter.” A stone tablet is a scroll is a wax tablet is a scribal codex is a printed book is a Kindle is an iPad. And yet history shows us that each change in the physical form of the written word was accompanied by a change – often a profound one – in reading and writing habits. If the delivery system mattered so much in the past, are we really to believe that it won’t matter in the future?

Jobs is no dummy. As a text delivery system, the iPad is perfectly suited to readers who don’t read anymore.