The iPad Luddites

Is it possible for a Geek God to also be a Luddite? That was the question that popped into my head as I read Cory Doctorow’s impassioned anti-iPad diatribe at Boing Boing. The device that Apple calls “magical” and “revolutionary” is, to Doctorow, a counterrevolutionary contraption conjured up through the black magic of the wizards at One Infinite Loop. The locked-down, self-contained design of the iPad – nary a USB port in sight, and don’t even think about loading an app that hasn’t been blessed by Apple – manifests “a palpable contempt for the owner,” writes Doctorow. You can’t fiddle with the dang thing:

The original Apple ][+ came with schematics for the circuit boards, and birthed a generation of hardware and software hackers who upended the world for the better. If you wanted your kid to grow up to be a confident, entrepreneurial, and firmly in the camp that believes that you should forever be rearranging the world to make it better, you bought her an Apple ][+ …

The way you improve your iPad isn’t to figure out how it works and making it better. The way you improve the iPad is to buy iApps. Buying an iPad for your kids isn’t a means of jump-starting the realization that the world is yours to take apart and reassemble; it’s a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.

Doctorow is not the only Geek God who’s uncomfortable with Apple’s transformation of the good ole hacktastic PC into a sleek, slick, sterile appliance. Many have accused Apple of removing from the personal computer not only its openness and open-endedness but also what Jonathan Zittrain, founder of Harvard’s Berkman Center for Internet & Society, calls its “generativity” – its capacity for encouraging and abetting creative work by its users. In criticizing the closed nature of the iPhone, from which the iPad borrows its operating system, Zittrain, like Doctorow, invoked the ancient, beloved Apple II: “a clean slate, a device built – boldly – with no specific tasks in mind.”

Tim Bray, the venerated programmer who recently joined Google, worries that the iPad, which is specifically designed to optimize a few tasks and cripple others, could lead to “a very nasty future scenario”:

At the moment, more or less any personal computer, given enough memory, can be used for ‘creative’ applications like photo editors and IDEs (and, for pedal-to-the-metal money people, big spreadsheets). If memory-starved tablets become ubiquitous, we’re looking at a future in which there are “normal” computers, and then “special” computers for creative people … I dislike this future not just for personal but for ideological reasons; I’m deeply bought-into the notion of a Web populated by devices that almost anyone can afford and on which anyone can be creative, if they want.

What these folks are ranting against, or at least gnashing their teeth over, is progress – or, more precisely, progress that goes down a path they don’t approve of. They want progress to, as Bray admits, follow their own ideological bent, and when it takes a turn they don’t like they start grumbling like granddads, yearning for the days of their idealized Apple IIs, when men were men and computers were computers.

If Ned Ludd had been a blogger, he would have written a post similar to Doctorow’s about those newfangled locked-down mechanical looms that distance the weaver from the machine’s workings, requiring the weaver to follow the programs devised by the looms’ manufacturer. The design of the mechanical loom, Ned would have told us, exhibits a palpable contempt for the user. It takes the generativity out of weaving.

And Ned would have been right.

I have a lot of sympathy for the point of view expressed by Doctorow, Zittrain, Bray, and others of their ilk. The iPad, for all its glitzy technical virtuosity, does feel like a step backwards from the Apple II and its progeny. Hell, I still haven’t gotten over Apple’s removal of analog RCA plugs for audio and video input and output from the back of its Macs. Give me a beige box with easily accessible innards, a big rack of RAM, and a dozen or so ports, and I’m a happy camper.

But I’m not under any illusion that progress gives a damn about what I want. While progress may be spurred by the hobbyist, it does not share the hobbyist’s ethic. One of the keynotes of technological advance is its tendency, as it refines a tool, to remove real human agency from the workings of that tool. In its place, we get an abstraction of human agency that represents the general desires of the masses as deciphered, or imposed, by the manufacturer and the marketer. Indeed, what tends to distinguish the advanced device from the primitive device is the absence of “generativity.” It’s useful to remember that the earliest radios were broadcasting devices as well as listening devices and that the earliest phonographs could be used for recording as well as playback. But as these machines progressed, along with the media systems in which they became embedded, they turned into streamlined, single-purpose entertainment boxes, suitable for living rooms. What Bray fears – the divergence of the creative device from the mass-market device – happened, and happened quickly and without much, if any, resistance.

Progress may, for a time, intersect with one’s own personal ideology, and during that period one will become a gung-ho technological progressivist. But that’s just coincidence. In the end, progress doesn’t care about ideology. Those who think of themselves as great fans of progress, of technology’s inexorable march forward, will change their tune as soon as progress destroys something they care deeply about. “We love the things we love for what they are,” wrote Robert Frost. And when those things change we rage against the changes. Passion turns us all into primitivists.

55 Comments


  1. KiltBear

    “If memory-starved tablets become ubiquitous, we’re looking at a future in which there are “normal” computers, and then “special” computers for creative people …”

    And this would be different from when? The ubergeeks have always gone to the cutting-edge raw systems. You can still build your own PC; you can still put Darwin on it, or any other flavor of Unix. You can still create the “next big thing”. However, if you want to put it in front of 90% of the people instead of the 10–20% represented by the holy geek-hood, you will have to put it on a device that the average person will want to and be capable of using…

    What the uber-geeks fail to realize is that what they are pining for is actually the exclusivity of power and ability available only to those who gave up their lives to their geek-driven avocations.

    If anything the iPad puts creative possibilities in many more people’s hands. Just the wonderful and fun photo apps on the iPhone should be telling.

    It’s the software, stupid. The entry price for creating something amazing is lower than it has ever been, and the ability to get it in front of as many people as possible is easier than it has ever been.

  2. The iPad is a media consumption device, like a TV. It’s not made for Cory or the rest of the Digitera. It’s for, literally, the unwashed masses.

    It would be great if everyone was as creative and curious as the 140+ IQ crowd but the bell curve of IQ shows us that most people just don’t care to create. They want to consume.

    As ‘crass’ as making a product for the majority may seem to the elite, it’s damn good business sense.

    The good news is it will create a market demand for real tablets, including I’d bet one running Android that’s infinitely configurable and that you’ll be able to run damn near anything on.

    So, Bad on Apple for making a device for ‘just average’ people, but Good on Apple for creating the market where others can compete with products better targeted to the 140+ IQ folks.

  3. Average humans spend six hours a day with a dumb TV clicker in their hands. If the iPad can replace the clicker whilst making these people a whole lot happier, smarter, or more productive, it will be a hit.

  4. I think the anti-iPad hobbyist crowd has it exactly backwards. The iPad is certainly more limited in some ways than a PC, but it is open to far, far more content than, say, a television set in your living room or the radio in your car. Once upon a time, a novelist who couldn’t get his or her work into Barnes & Noble couldn’t sell a book, radio was reserved for a few professional folks, and forget about television.

    Today, my family enjoys reading self-published ebooks and ezines, listening to all manner of niche podcasts and watching boatloads of amateur video on YouTube. The proliferation of slick, OPEN media-consumption platforms not limited to your desk is a great development for supporters of non-traditional, small-scale and niche media of all kinds. A couple of people in their garage can have the same reach as Rupert Murdoch, James Cameron or John Grisham.

    p.s. Although you may need iTunes to put content directly on an iPad, you can put just about whatever kind of content into iTunes yourself without buying anything from Apple, including self-created and indie stuff.

  5. Out In Sun

    Citrix has released an iPad app that lets you access a hosted virtual Windows machine. If Amazon did the same for EC2, you could use your iPad to control an entire cloud-bank of wildly inexpensive Linux machines. Sounds like “generativity” to me. I think the Luddites still think that “the computer is the computer”. Think of the iPad as a screen, not an entire computer.

  6. jesse calderon

    Some of the comments above reflect this, but the iPad isn’t for computer geeks. I am a software developer and feel like our industry has failed to provide the proper solution to a large segment of the market.

    My parents and grandparents would love to be online and use the internet but they can’t figure out their computers. They are constantly struggling with viruses, upgrades, patches, etc. They want their computer to be like a toaster. They don’t care how the toaster works, they just want to eat the toast.

    The iPad will be great for them because they can turn it on and access email and web content without the noise and overhead of the computer. For this audience, the full power of the computer is a barrier to entry, not a benefit.

  7. alsomike

    The existence of computers for different people with different priorities is not elitist. What is elitist is saying that one way of using a computer is better than how someone else uses it. The big contradiction is that Cory Doctorow stands for freedom and autonomy, but only his kind of freedom, the kind of freedom that privileges people like him, and perversely, he wants to impose those freedoms on other people.

    You must be creative. You must hack your devices. You must not be a consumer. You must be an entrepreneur. These aren’t freedoms; they are obligations, and Doctorow wants people to feel ashamed for failing to live up to his standard. This is authoritarianism, pure and simple; there’s absolutely nothing revolutionary about it.

    This is supposed to be an anti-consumer message, but that’s laughable. Has he seen any advertising in the last 20 years? Corporations today want to empower us, they want us to be creative and express our unique personalities, live life on our own terms, be individuals, customize, accessorize and hack everything. If you don’t, you’re letting yourself down. Funny how Doctorow’s “anti-consumerist” ideology is indistinguishable from consumerism.

  8. David Pinto

    computer : truck

    ipad : car

    (iphone : bike)

  9. Pierce Lamb

    I had a similar reaction to all the anti-iPad writing going on in tech-savvy circles. Although put much less eloquently, here is what I posted on Buzz:

    “I haven’t bought an iPad, but from reading a lot of the criticisms forming in the tech circles I run in, I must say this: once computing devices became worthy of B2C business, the history of computing became a story of abstracting away technical details so as to appeal to larger and larger non-tech-savvy crowds and allow more and more people to enjoy some of the benefits of a computing device (and therefore make more money). I think a lot of tech-savvy people falsely believe Apple has some desire to appeal to them… the truth is, in making computing both easy and VERY nice looking, they appeal to a much larger crowd where a lot more money lies. The average computer user doesn’t really care about being able to multi-task etc.; what they do care about is being able to access their favorite websites/applications very quickly and have the user experience of doing that be very easy on the eyes. This is probably why the iPad will enjoy success among the vast majority of people, because it appeals to them.

    But don’t worry my tech savvy friends, we have Google to make us feel better. And the future of Apple/Google battle will be the answer to the question: does appealing to the wealth of 3rd party developers eventually win the war against appealing to the everyday consumer? My sense is that it will. (This is of course assuming that developers will be more inclined to develop for a developer-friendly environment and not for one that boasts mass market appeal, which is not a trivial/obvious/safe assumption)”

  10. Great post, Nick – I’d like to add two points. First, the iPad exists in a world where PCs and Macs already exist, which do allow for content and other creation. Users have the option of creating on those devices to their heart’s content. Second, I’m seeing services being offered that would let “everyday” users easily create apps that could go onto an iPhone or iPad – the creation aspect still exists out there, and you will soon be able to create apps for the iPad and iPhone. There’s still the issue of “Apple approving” the app, which does make it seem more Big Brother-ish (imagine the 1984 ad today?), but then of course there’s always Android :)

  11. “But I’m not under any illusion that progress gives a damn about what I want.”

    The iPad isn’t progress Nick, except in the minds of Apple fanboys. It’s a very closed tablet, nothing more, nothing less. Tablets have been done before and will be done again in the future.

    Failing to fawn over the iPad doesn’t make one a Luddite.

  12. Mike,

    Redefining progress that you don’t like as nonprogress is a nice way to make believe you’re not a situational Luddite, but it doesn’t wash.

    Nick

  13. Nick, the point about technical evolution tending toward a reduction of generativity is an awfully broad brush-stroke, and I can look around my home and see several examples of the opposite tendency. For example, television. Books. And there are lots of examples that would, in a modern world of increasingly segmented division of labor, prove the opposite: furniture, food, and, as you have shown in The Big Switch, electricity.

    Perhaps we could say that as technologies become ‘infrastructural’ and move lower down the Big Stack of everyday digital life, that they ipso facto become less generative. Though not always.

    In any event, my chief disagreement is with the definition of the iPad as the culmination of a historical arc of PCs, that it is a “late” computer. It is not late; it is, if anything, a half-baked public prototype of a new mode of ambient computation that we do not yet understand, can’t properly visualize, and will have to invent as we go along.

    For that reason – that the iPad is an early and not a late technology – its generativity should be high, not low.

  14. len

    It’s a boombox.

    Insofar as those are progress over owning a decent recording deck, I guess so, but this purely comes down to the uses one has for technology. As a content-consumption device, it’s fine. It is a game changer in the same way a boombox was. For break dancers, it was. For songwriters, it could be helpful. For serious recording, it was just another way to preview.

  15. Everything should come from somewhere, should have an integral story for its existence.

    The iPad, more than any other touch-device so far, at a software level re-imagines what the web and media should look like for real. When you think about it, it is also the first “computer” not designed with a typewriter in mind. It’s a device designed for a kind of usage that doesn’t exist yet, and that’s the beauty of it.

  16. So many threads here …

    1) There are different types of Geek Gods. Bill Gates is Hades. Google is Athena. Cory Doctorow is Hephaestus.

    2) “progress that goes down a path they don’t approve of” – that’s a question-begging, or question-assuming, way of putting it. Almost by definition, anything which is opposed can be described that way.

    3) “If Ned Ludd had been a blogger, he would have written a post similar to …”. I think he was much more focused on workers’ rights and a living wage.

    4) “But that’s just coincidence. In the end, progress doesn’t care about ideology.” – this conflates too much in terms of technology versus business models versus legal changes.

  17. Sam Penrose

    A consistent flaw in your always interesting work is the lack of quantification. According to http://oldcomputers.net/appleii.html, Apple did not sell its millionth Apple ][ until 1983. It will sell that many iPads by June. There are ~150K *apps* distributed for iPhone OS so far, a number sure to explode. More to the point, the number of hackers, including young hackers, is much, much bigger than it ever was in the 70s, especially outside of the US. A few hundred are writing Objective C to Apple’s APIs instead of assembler to its registers; many thousands are writing JavaScript and Ruby/Java/Python/etc., not to mention vast amounts of HTML. In short, there has been an explosion in hacking. It may be dwarfed by the explosion in Facebook browsing, but in turn it dwarfs the 70s enthusiasts both in scale and in scope of usefulness and ambition.

  18. It’s funny that this post is being attacked by some for being anti-iPad and by others for being pro-iPad. I guess it depends on where you stop reading.

  19. timjones17

    When people want their real human agency removed, they do it in front of a 52″ LCD TV with 1080p Full HD and kickin’ surround sound. This iPad is a pathetic imitator. Apple has been trying with its Apple TV and now the iPad – pretty sad, really.

  20. Kevin Kelly

    Nick wrote: “What tends to distinguish the advanced device from the primitive device is the absence of ‘generativity.’”

    That’s an admirably broad and sweeping theory, Nick, but it is unclear whether you are talking about all devices, or just modern devices, or just modern media devices, or what? In any case, I don’t see the evidence for your theory. What’s more generative, a primitive piece of charcoal for drawing on a cave wall, or a decked-out Mac with Photoshop, Illustrator, and Maya? What’s more generative, an incandescent light bulb 100 years ago, or a more advanced LED bulb today? I don’t see the correspondence between generativity and primitiveness. Maybe you can explain this more clearly.

  21. Richard Smith

    Keith Shaw’s point is important. I was sympathetic to Cory’s criticism, but we have to remember that the iPad is a media device, and media devices rarely if ever displace the entire media ecosystem – they fit in. People have lots of other tools at their disposal, and you can’t even really use an iPad without an iTunes-running computer to connect it to. In other words, if the entire world of media and technology were to be absorbed by the iPad, that is one thing, but adding iPads to the mix is quite another.

  22. Gee Kevin, that’s funny. Really. Back in the 70s, I got kicked out of art school because I kept telling my professors I was sick of taking drawing classes and doing charcoal drawings. That was the same thing primitive man did by pulling a burnt wooden stump out of a fire and scratching on a cave wall. But this newfangled 8080A microprocessor kit I just built showed me the way and it was computer art. I wanted a new artist’s studio more like a laboratory, with mysterious electronic devices that spit sparks and countertops laden with bubbling beakers. No wonder they kicked me out.

    But unlike some other pretentious twats (yes I mean you, Doctorow), I grew up. I even went back to art school to finish my degree, after working in computer graphics for years. I refused to do any CG in art school, I was sick of it. I wanted to reconnect with the reasons I wanted to make images in the first place. I finally fell in love with drawing.

    Well anyway, what is the point of all that stupid personal reminiscing? When I finished art school, I finally found out what people really want to see in this impersonal world, and in impersonal media. Art historians call it “the hand.” People want to see visible evidence of the personal touch of the artist as he created the work. A smooth mechanical object seems cold because it doesn’t have any personal touch. A rough drawing shows the results of millions of tiny decisions as the artist’s hand moves the charcoal across the page. This connects them with the artist; they relate to it just as if it were their hand that made it.

    And that’s what’s so brilliant about the iPad. It is nothing BUT the touch of the user. Applications to take advantage of this have yet to be made, because people don’t understand the concept yet; they haven’t touched it. Maybe the best example of this is the iPhone application “Brushes.” It’s coming to the iPad, I hear. There are expensive LCD panels with pressure-sensitive stylus inputs, so when you paint in Photoshop, the paint appears right underneath the point of your stylus; it’s like painting with a real brush. But this is so expensive and difficult to set up, the technology just gets in the way. But apps like Brushes disintermediate: they remove all the technology that gets between you and your results, and make it invisible. Your finger is the paint, the screen is the canvas.

  23. paulj

    When GUIs first appeared some of my colleagues in a university computing support department couldn’t stand them. You just couldn’t do as much as you could via the command line. I soon realised that what they really meant was that most of our users could do what they did without learning all those strange commands with mostly meaningless parameters. They stayed with MS-DOS and CP/M for years and managed to distance themselves from our users. This slagging off of iPhone/iPad-like devices reminds me of those days.

    You can’t get the lid off an iPhone/iPad. How many ordinary computer users have the nerve to get inside their computer? Even when they do, these days there’s not a lot that you can do; the geeks can look up what a component does and where it was made… A colleague tweeted recently that he won’t buy a device that he can’t get inside. I asked him what he can do when he gets inside his Android phone. Change the battery, add memory, er, er…

    You can’t make your own apps for an iPhone/iPad without paying Apple. Forget Objective-C; write web apps. You have almost the same functionality as in a native app, and with a small effort it will work on a number of platforms.

  24. Matthew Hutchings

    Just one brief thought, I enjoyed this tweet from a web developer last night: http://twitter.com/garrettc/status/11780949676

  25. Bart Whitebook

    Touch is, itself, the future.

  26. Chris Duffy

    So you think Google and the Internet are making us dumb, but the iPad is a refinement not understood by Luddites?

    “When something exceeds your ability to understand how it works, it sort of becomes magical, and that’s exactly what the iPad is.” -Jonathan Ive in the iPad video on Apple.com

    At least progress is shiny.

  27. CS Clark

    I suppose sooner or later music always sounds like noise.

    But are they right to complain not because it’s happening as it has before but because it’s happening *again*, and this time, this time it was going to be different? It was supposed to be a thing of beauty!

    They might also be thinking of the iPad as a new form, rather than the mass-market, closed-off end version of an existing form which has previously been generative. And – do devices automatically lose their generative properties over time, or do they lose them as new devices that are better for that sort of thing are invented or become widespread? (Did people stop buying radios for transmitting as well as receiving because of the increasing ubiquity of the telephone?) In which case it seems retrograde that the generative device that will provide these abilities already exists as a mass-market product.

    I also wonder whether it’s beside the point to debate how many people will actually create things using a specific tool. Firstly, we don’t know (no, not even with IQ tests) who the people are who will create worthwhile stuff that the rest of us will consume, so there’s a benefit to all of us in exposing everyone to a wide range of tools. Secondly, given that many are excited not about what is available now but what could/will be available, if you start with the wrong philosophy you are less likely to end up with worthwhile results. For example, you might end up with a majority of things being clones of existing games and brain functions.

    @Seth Finkelstein – I’ve always thought of Google more as Nyarlathotep.

  28. I would rather buy a “less magical” but more usable device, and Apple seems to think the same: A rumored smaller iPad could be launched in 2011, and I’m sure it will benefit from the impact of the current, larger and more impressive (but clumsy) iPad.

  29. Nick Carr

    Kevin Kelly (and Bratton),

    I can’t think of a device that incorporates more “generativity” (I keep putting that word in quotes, because the word itself annoys me) than a piece of coal used to draw a picture on the wall of a cave. Here’s why: the piece of coal, as a simple byproduct of a natural process (burning wood), incorporates no manufacturing or design intent. It doesn’t come into existence as a device, i.e., a tool. Therefore, when our ancient cave-dwelling ancestor picked it up to draw a picture, he was not only using a tool to do creative work; he was actually inventing the tool. That’s a profoundly generative act – one of the very highest order.

    (Also, on a related note, read Charles’s eloquent comment above.)

    Having said that, what I particularly had in mind when I wrote the sentence you quoted are manufactured devices (those designed for a purpose) geared toward a general, or mass, market. (As I noted, there will always be specialized devices geared toward a particular creative class.) And (reflecting the comments of the iPad critics) I’m thinking of two sorts of generativity (separate but related and sometimes intertwined). First is the user’s ability to involve himself in the actual workings of the device, to express creativity or at least personal agency in modifying, repairing, or maintaining the device. (Changing the oil in a car is more generative than driving the car into a Jiffy-Lube in response to a message appearing on a dashboard display. Adding more RAM to a computer is more generative than putting a decal on the computer’s case.) This is the sense of generativity that Doctorow was talking about when he mentioned that computers (and other electronic devices) used to come with schematics. It’s pretty clear, particularly in the last few decades, that one of the major commercial thrusts of manufacturers is to remove this type of generativity from their products – to in effect hide their workings from the consumer. (See Matthew Crawford’s book “Shop Class as Soulcraft” for an in-depth discussion of this trend.)

    The second type of generativity is the user’s ability to use the device for creative work. I gave the examples of the phonograph and the radio, both of which incorporated creative functions originally but no longer do (the creative functions moved into specialized devices). The diminishment of this type of generativity, in the iPhone/iPad as compared to earlier general-purpose PCs, is Zittrain’s point. That doesn’t mean that the iPad can’t be used for creative work; it just means that, as we saw in the evolution of the radio and phonograph, the iPad places a strong emphasis on the consumption of creative works rather than their creation and as such probably manifests a broader trend in the design of computing devices.

    So that’s what I was getting at when I wrote that “what tends to distinguish the advanced device from the primitive device is the absence of ‘generativity.’” The consumer eagerly trades off generativity for convenience and ease of use (and a “high tech” image and feel), and the manufacturer responds (quite happily, since the tradeoff makes the consumer a more compliant consumer) by reducing the scope for personal agency in the product’s design.

    Nick

  30. Nick, I think there’s something of an error right at this point in your argument, which has given your post a geek-bashing tinge (note, a tinge – I’m not saying you set out to go bashing) – “The consumer eagerly trades off generativity for convenience and ease of use”. One part of the criticism is not of the consumer, but the manufacturer. It doesn’t change anything for the consumer if the manufacturer publishes schematics. It’s the manufacturer’s decision, whether or not to support that sort of openness. In fact, some of this has been made illegal with the DMCA.

    I don’t know if it would be a good idea for me to get into a long debate on this, but this post has a thread of a type of criticism that’s irritating. To oversimplify to get it into a comment-box, it’s a subtext of “Those elitist pointy-heads, those self-centered geeks, don’t they know that not everyone is a double-dome like them, and that ordinary people like their gadgets that just work?”. Well, yes, that is understood; Zittrain, Doctorow, etc. are not stupid. The argument being made is one about socially and legally promoting certain values – and this is often misunderstood (indeed, per my previous point, you seem to deride this very advocacy itself for being advocacy).

  31. Seth,

    I don’t particularly disagree with anything you say – and I believe I indicated in the piece that I’m as much in agreement with the Doctorow/Zittrain view as not – but I think you’re oversimplifying when you contend that “It’s the manufacturer’s decision, whether or not to support that sort of openness.” On the particular subject of schematics, one of the reasons they’re rarely included anymore is that the devices are so complex that fiddling with them requires professional competence – and being “locked down” becomes essential to proper functioning. And this is not purely the choice of the manufacturer; it also responds to the desires of the consumer – for a sleek, idiot-proof machine that works without requiring any involvement on his part. (Including schematics, I’d also argue, makes a device feel less “modern” and “high tech” by suggesting it’s not as mysterious – and “magical” – as it seems, so it’s also important to branding and image, which consumers also value.) Apple, I’m pretty sure, restricts iPad/iPhone apps not only to protect a new source of revenue but to tightly control the user’s experience (and, as part of that, to protect the user from problems). So it responds to the wishes of the general consumer (who does not take the hobbyist’s view of devices) even as it furthers Apple’s commercial interests.

    Anyway, this is a very complex and fascinating subject, and I wouldn’t want to suggest that it doesn’t have a whole lot of facets.

    Nick

  32. Karim

    My first computer didn’t have a serial port. I had to buy a handful of discrete components and solder them onto a circuit board before I had an RS-232 port. Did this experience, as Doctorow suggests, put me “firmly in the camp of people who believe you should forever be rearranging the world to make it better”? No. It put me firmly in the camp of people who believe computers should come with a damn serial port.

    The sad thing is that we keep having this argument about abstractions and making things easier for people. You lower the barrier to something for millions of people, and there’s always some gearhead screaming that it’s too easy, too simple, the controls aren’t granular enough, you can’t customize it, etc. ad nauseam.

  33. Albo Fossa

    There are geeks who use “real” computers and “non-geeks” who use “appliance computers” (e.g. iPads). The geeks may often forget that the non-geeks are valid users of appliances.

    The non-geeks are people who live lives in the “real” (three-dimensional, not-on-the-screen) world. Imagine that: spending more, maybe even SUBSTANTIALLY more, of your time AWAY from your computer device than you spend WITH it! Imagine life as a landscape gardener, or an oil painter, or a doctor, or a carpenter, or a race car driver, or a solar panel installer, or a gymnast, or a movie star, or a great chef!

    Some of these non-geeks don’t have, and don’t wish to have, the time to spend all day learning and fiddling with a danged computer device. They may spend an hour or two in the evening checking and sending email and ordering something online: for them, an iPad is plenty, and they shouldn’t be looked down upon as unsophisticated for that.

  34. twitter.com/epobirs

    The mistake here is regarding the iPad as a computer rather than an appliance. Recall that Jobs was seeking an appliance model for making computers more accessible to non-technical people from the beginning of the first Mac’s development. The Web is the killer app that ties it all together. The universal application for the universal info appliance.

    The real triumph here is Nintendo’s. This is a direct descendant of the business model they applied to revive the video game console: a closed environment where all publishing passed through the maker of the hardware. The Web makes things a bit more open, but the best native apps rely on developers using hardware and software that are not part of the delivery system most consumers see.

    Go back to 1982. If you just wanted to play video games, you could buy an Atari 5200, among others. If you wanted to create video games, you bought an Atari 800 computer. Not a lot has changed in three decades other than the appliance has gotten a lot more traction. If you bought an Atari 800 back then, it had no software out of the box except for a BASIC interpreter, as was the case with most other computers for a long time. If you weren’t going to take up programming, the appeal was very limited.

    Personally, I’d rather have a Tablet PC slate model than an iPad but I’m one of those Atari 800 guys from way back. My concept of what such a device should offer is very different from those 20-somethings who can barely remember a world without the Web. The ones who want to make things know where to get the gear for that. Those who merely want to consume the latest info no longer need to pretend they’re going to understand what is happening under the hood, any more than they would do an oil change on their car instead of paying someone to deal with it.

  35. Chris Duffy

    I’ve been thinking all morning about what it is that really bothers me (and perhaps other Luddites like me) about this device.

    Then it occurred to me: it’s the ambiguity surrounding its identity in the current mix of technology products.

    If I look at the iPad as a big iPod or media-consumption device, I have no problem with it. It’s fine as a peripheral, but not suitable as a netbook or lightweight laptop. After all, the very first thing you do when taking it out of the box is sync it with iTunes on a computer.

    But then I read reviews – many linked to from Apple.com – that treat it as if it is a netbook, while brushing aside its numerous limitations from the perspective of someone who actually uses one. Moreover, if it’s not intended to compete with laptops, why are educated journalists comparing it to them?

    I think it’s pretty clear that Apple wants people to look at it that way, because they want that market segment’s dollars, but not on that market’s terms.

    The “user experience” of netbooks may well suck while being insanely great on an iPad, but if they’re fundamentally apples and oranges, one isn’t a suitable replacement for the other.

    So when the tech press sings the chorus of how this is the future of computing, we’re either buying into a misleading conception, or aren’t really talking about the same thing.

    Is it the future of human-computer interaction? Sure, that I can buy. Is it the future in terms of appliance-centric consumer electronics? Okay, there again, I can see that in some cases. Is it the future of computing? (i.e. totalitarian OEM control, little adherence to industry standards, complete disregard for interoperability and data portability, etc.) No way. Or rather, ‘Way Frightening,’ if it is. I’m talking about practical user function, not “passion.”

    It’s a question of “new and improved” versus “new and removed.” If it’s a toy, that’s fine. I don’t think Apple wants you to feel that way about it, though – even if that’s what it really is.

  36. www.facebook.com/profile.php?id=691981658

    I don’t think the line between geek and non-geek is absolute. I have a netbook running linux. I install all sorts of things on it. Sometimes it is working, sometimes not. I certainly have a place in my life for something that always works. Doctorow seems to have infinite free time. After working a full day, cooking, cleaning, paying the bills, walking the dog three times, working out and hacking my computer, I damn well want the thing I read or watch TV shows on to just work. The thing I am tinkering with might not be working when I am just too tired to tinker anymore.

  37. “Progress” is too abstract a word. If you do not set the speed and direction, you are not able to measure progress. In any other case it would be not progress, but just process :-)

  38. alan

    Good job Nick, out of the desert and into a new long thread. That’s progress! Alan

  39. Jason Treit

    A terrific post and discussion, well worth bookmarking. JZ’s generativity thesis has taken a while to catch ears. It’s good to see a pendulum swing of thoughtfulness in the other direction now.

    Amused, tho, at the Professor Frink-like extrapolations of fingers directly applied to surfaces being the future. The iPad, magical or not, doesn’t erase the discovery that begat technology: our meaty paws are neither the strongest nor the subtlest thing for every task.

  40. For the status-conscious, it is a must-have, despite the fact that in nine months it will be an obsolete coffee-table book. For the Apple stockholders, it is an ‘everyone must buy’ for obvious reasons.

    For a not-insignificant number, it is a $1K choice between it and a set of tires and a front-end alignment for the second car they have to keep running to make the mortgage. In short, in times like these, despite the “brilliance of execution” or the lack or presence of “generativity,” iPad lust is ludicrous. Apple stepped into the zeitgeist and has yet to clean its shoes.

  41. Andytedd.wordpress.com

    I was sceptical about the iPad at first, but now I realise it will be the first computing device specifically aimed at the ‘Shallows’. And thanks to the magic of Jonathan Ive, a highly covetable one.

    I can see why people might reject this as a backwards step, but on the other hand, if a lot of functionality in a device is redundant why not get rid?

    The only thing I find hard to reconcile is Nick’s apparent contradiction – he seems to be in favour of depth, yet I think this gadget will be at its best as a shallow, hypermedia, use-it-while-you-are-doing-something-else thing. The iPad is a product which reduces complexity for people who don’t want it, but that does not make its critics Luddites.

    Cheers

    A

  42. The idea of “progress” as some sort of impersonal, implacable force is a myth. Progress is whatever we as a society come up with as some sort of collective “consensus” – and I put that in quotes because it’s not real consensus, usually, but mostly involves some kind of struggle ending in compromise between a variety of stakeholders.

    Take the example of DDT. That was a beneficial chemical, that was “progress” – until people discovered how bad it was for wildlife and the rest of the ecosystem. People, at least in the “developed” world, decided as a society to not use it anymore and banned it. That was “progress” too. It’s still being sold and used in the “developing world”, because the people there don’t have the political/economic power to get it banned.

    To brand people Luddites because they have a different idea of what “progress” means seems like a narrow and shallow view.

  43. Karim

    Criticizing a new technology or gadget doesn’t make one a Luddite, but maybe FEARING it does. Geeks usually criticize things on a logical, no-nonsense, quantifiable basis: if the argument was about crappy price/performance (i.e. the argument made about Microsoft’s tablet PCs and UMPCs), that would be par for the course. Instead, we have people AFRAID of what the iPad might represent: more DRM, less “generativity,” less hacking & tinkering. We even have people who phear teh iPad because it doesn’t fall into some tidy, predefined category of computing device.

    These people should take their own advice and hack the iPad. Create an IDE app and interpreter for it. Write an app that uses the iPad Bluetooth to talk to an Arduino board. Publish your music and books without DRM. Or come out with a $500 tablet that is more hackable, as I imagine is planned at several of Apple’s competitors. If you don’t like the future, CHANGE IT. “Doctorow, heal thyself.”

  44. Detritus,

    Then again, maybe in your eagerness to read what you wanted to read you missed the point.

    Nick

  45. Blake Merriam

    Been looking forward to your weighing in on the iPad for the last few weeks now, Nick. :) I’m surprised by your sympathy for Doctorow and others. I’ve found that you have often been eager to critique those who champion “technology for technology’s sake” (going back to your argument in “Does IT Matter?”), as Doctorow seems to do.

    What I find interesting is this debate over the iPad and how it may change the fortunes of the languishing publishing business. Jobs and company know like the back of their hand the dynamics and requirements of establishing a new computing platform/category. The iPad is not supposed to be like a PC. My guess is 99% of iPad owners have a PC at home and can commune with the internet by blogging, videotaping, uploading all they want. It’s the whole holistic nature of the iPad that strikes me as unique. So relax on your couch, at Starbucks, on the train, etc. and consume the rich multimedia content that the New York Times and others will create for you with the touch of your fingers (not something that can be done quite the same with a laptop). Apple and the news industry have gone all out in one of the biggest PR blitzes I have seen in a long time. Is it any wonder that Walter Mossberg was on Charlie Rose last Friday singing the iPad’s praises? It may well save his job, and everyone else’s at the WSJ.

    The extent to which the iPad’s content delivery will be proprietary (to save the news industry’s business model, presumably) is *fascinating* to me. But content is still king, and Apple can PR the iPad all they want; it will now be up to others to create the content that will determine whether this “3rd category” (after PC and phone/music player) establishes a foothold. The iPod had all the music in the world as its content, no problem there. The big question is whether developers will create the kind of content/apps to justify the iPad.

  46. I too wrote a little diatribe about the iPad, http://pairadimes.davidtruss.com/ipads-are-for-iconsumers/ “I’m a huge Mac fan, but I have no interest in a bigger version of my iPhone that isn’t a phone, isn’t a camera, doesn’t like to multitask, requires me to have a laptop on the side and then doesn’t fit in my pocket.”

    The key point being the laptop ‘on the side’. I’m probably going to end up getting an iPad (a later version), but I want to see schools putting content-creator, not content-consumer, tools in the hands of kids! Spend that money on a 2G/160G 12-inch netbook with a camera and a full-sized keyboard, and save the iPad for home entertainment.

    You say, “In the end, progress doesn’t care about ideology.” Perhaps, but it seems progress does care about consumerism and product placement, and so some not-so-altruistic ideologies might just be influencing progress in a way that – at least to me – doesn’t need to be thrown into schools (when other ‘tools’ would be more effective at helping students be active creators of digital content).

  47. Nick, the two facets, and the two definitions, you give for “generativity” suggest to me that the word you are using needs more clarity before you invoke it for your theory.

    Let me suggest a different metaphor, one which may answer your challenge.

    The nature of every invention is to start out vague, incomplete, and open to change. It begins primed for hacking, and for re-definition. It is many things to many people. At this stage, the device is in the hands of tinkerers, nerds, fans, and hacks who will make it do all kinds of things no one had thought of. This skeletal generality enables one kind of generativity. The tool itself is being invented, which is, as you note, the highest kind of genesis. In this mode a device or invention or idea is thrilling, and its naked potential appeals to a set of early adopters (who are often called an elite) who explore this glorious incompleteness in many ways.

    But for many others (who are often called the masses), this very openness, this ill-defined thing, requires too much expertise, or control, or knowledge, or care, or time to use, and to them its skeletal generality is a turn-off. One has only to think of the early days of automobiles, or windmills, or radios, or cell phones, and how unappealing most people found their incompleteness (“they hardly work and are hard to use”).

    But the masses usually don’t have to wait long for the natural history of an invention to kick in. A device becomes more specialized and “complete” as it evolves. As it does, it becomes more specific in what it does, more closed in its identity. Yet at the same time, it becomes more powerful in that evolving identity. It becomes more complete, more approachable, more understandable, more able to do things for more people. For instance, the first cameras gave great latitude to photographers. Since you had to make your own film, you could make it do all kinds of things – favor the infrared spectrum, or embed it in fabric, or make it three feet wide. But as the outlines of popular photography became clearer, the camera homed in on a certain specific design, film was manufactured, and equipment became more certain of what it was for. The result was cameras that anyone, not just photo geeks, could use, and the result was an incredible generativity as millions and eventually billions of people started photographing EVERYTHING. As the craft became more specific, it became more ubiquitous, and the levels of creativity unleashed by this easy-to-use device vastly exceeded the amazing creativity of its founders. In this way a refined device is more “generative” than earlier, when it was vague and incomplete.

    So there is a natural arc in each invention which moves it from the generative openness when it is new-born to the refined generativity as it becomes well defined.

    Nick, you seem to suggest that it is modern manufacturing/consumerism that closes off all inventions to the first kind of generativity, but I suggest this maturation has always happened, long before the industrial age. It is merely being accelerated now.

    But if that were the total story it would be a pretty small world. What happens is greater. Each new unformed, hackable, potential invention is refined by use, and this use makes it more specific, more conditional, more open to use by know-nothings, more powerful in defined ways.

    And then these mature products enable entirely new tools. In turn these new tools are again open to the first kind of generativity. Hackers and nerds and tinkerers flock to the immature zone, where they can help define the new new thing.

    So, the same story is told over and over again. Once upon a time the early adopters made their own electrical parts – capacitors, resistors, etc. – from which they cobbled together radios and other equipment. Once it was clear what a capacitor was, the frontier moved on to making the gadget – but not without the old guard complaining about the loss of the joy of making your own capacitors. Then as radios became more defined, hams did not make radios, but at least repaired them. Later they made their own Altair computers. For a while. What, you don’t make your own computer? No, but I wrote the operating system. What, you don’t write your own OS? No, but I wrote my own programs. What, you don’t write your own programs? No, but I code my own website. What, you don’t code your own website? No, but I write my own apps. What, you don’t write your own apps? No, but I weave my own lifestreams…..

    New-borns with infinite potential but low-productivity become middle-agers generating great productivity and unleashing fantastic creativity; in turn the mature keep the frontiers expanding by generating more new borns. I speak here, of course, of ideas and devices.

  48. I took Cory Doctorow’s remarks a bit differently. Although he expresses his distaste for the iPad in hardware terms, I think what bothers him is that it’s a device that you have to use Apple’s way, which means buying only the software that Apple approves of, in the way Apple allows. To use the appliance analogy, that would be like having the manufacturer of the toaster be the arbiter of what you could put into the toaster and the only place to buy bread. He was also talking of having the power to make things better for yourself and maybe for others. But when everything is locked down – and the hardware aspect is only one, and really an analogy at that – then you don’t get to work “outside the box,” because the box, and what is approved for the box, is all that there is.

    Personally, I think that the way things are moving, you need to be able to create as well as consume media on the fly. The thoughtful comments on this post are an example. And progress might be a media device that encourages such interaction by making it easier to do. From that view, progress has been the movement from top-down communications to bi-directional. Perhaps enforcing the “either consume or produce” model is the truly Luddite stance.

  49. len

    So now Apple also gets to choose the developer languages, thus setting itself up for a developer smackdown and claiming that limiting choice is a means of improving the user experience. Isn’t this what Apple claimed about Microsoft, the sort of thing Microsoft has been repeatedly sued for? Isn’t this Apple becoming the antithesis of the famous Mac commercial? So rather than waging a war on Adobe, as Jobs claims, they’ve gone to war on the developer ecosystems themselves.

    Wow. All of this to push video to a weak wifi tablet through Verizon high speed networks? Wow.