I’ll be at the LA Times Festival of Books on Saturday, participating in a panel with David Shields and Ander Monson called “Rebooting Culture: Narrative & Information in the New Age.” Shields is playing the nihilist, Monson is playing the anarchist, and I’m taking the role of the vigilante. Moderating is David Ulin, the Times’s book editor. Stop by if you’re in Los Angeles. More details and a bonus link.
Echoes
Three strangely echoing visions of the future:
2010: “As humans rely on the Internet for all aspects of our lives, our ability to think increasingly depends on fast, reliable applications. The web is our collective consciousness, which means web operators become the brain surgeons of our distributed nervous system. Each technology we embrace makes us more and more reliant on the web … For much of the Western world, technology, culture, and society are indistinguishable … Today’s web tells you what’s interesting. It learns from your behavior. It shares, connects, and suggests. It’s real-time and contextual. These connected systems augment humanity, and we rely on them more and more while realizing that dependency less and less … Take away our peripheral brains, and we’re helpless. We’ll suddenly be unable to do things we took for granted, much as a stroke victim loses the ability to speak … A slow-down will feel like collective Alzheimers.” -Alistair Croll
2005: “What will most surprise us is how dependent we will be on what the Machine knows – about us and about what we want to know. We already find it easier to Google something a second or third time rather than remember it ourselves. The more we teach this megacomputer, the more it will assume responsibility for our knowing. It will become our memory. Then it will become our identity. In 2015 many people, when divorced from the Machine, won’t feel like themselves – as if they’d had a lobotomy.” -Kevin Kelly
1995: “[As] machines become more and more intelligent, people will let machines make more of their decisions for them, simply because machine-made decisions will bring better results than man-made ones. Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won’t be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.” -Theodore Kaczynski
Mad rant becomes ecstatic rhapsody becomes offhand remark.
TuringCraft
Inspired by Weizenbaum’s ELIZA, Ben Weber has created a chatbot that converses with other players in online games of StarCraft. The bot’s name is EISBot. “Once every thirty seconds,” explains Weber, “EISBot randomly selects a message from a pool of 75 messages and sends it to the console.” As the following transcript reveals, the bot’s randomly chosen messages fit seamlessly into the conversation:
EISBot: What to ally victory?
Player: damn you
Player: im not stupid
EISBot: Wanna join my clan
Player: dunno
Player: im not that great
Player: lol
EISBot: So, this game is pretty much over
Player: yeah
Player: lol
Player: im playing off race
Player: lol
EISBot: Quit wasting time, come kill me
EISBot: What kind of hardware are you using? I’m on a 486 with 16M ram, pure luxory!!!
Player: i dunno
EISBot: PWN PWN PWN!
EISBot: I’m a huge nonyt fanboy
EISBot: Have you played SC 2 yet?
Player: yeah
Player: pretty sick
EISBot: Have you ever played against a C player?
Player: nope
Player: wait
Player: yeah
Player: i lost
Player: badly
Player: haa
EISBot: Did you see the past OSL?
Player: kinda
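The mechanism Weber describes is about as simple as a chatbot gets: a timer and a random draw from a canned pool. Here’s a rough sketch of that loop in Python (the message list is abridged from the transcript above, and send_to_console is a hypothetical stand-in for the bot’s hook into StarCraft’s chat console; Weber’s actual implementation surely differs):

    import random
    import time

    # Illustrative stand-in for EISBot's pool of 75 canned messages;
    # these few are lifted from the transcript above.
    MESSAGES = [
        "What to ally victory?",
        "Wanna join my clan",
        "So, this game is pretty much over",
        "Quit wasting time, come kill me",
        "PWN PWN PWN!",
    ]

    def send_to_console(message):
        # Hypothetical placeholder for the bot's actual hook
        # into the game's chat console.
        print(message)

    while True:
        send_to_console(random.choice(MESSAGES))  # random draw from the pool
        time.sleep(30)  # "once every thirty seconds"

That’s the whole trick. No parsing, no state, no model of the conversation at all.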
Note that the bot’s one major flaw is that its command of the English language, particularly the use of punctuation marks, is much too sophisticated in comparison with that of the human. The sure way to distinguish the computer’s messages from the human’s is to recognize that the computer has a rather sentimental attachment to the apostrophe and the comma.
I take this as another indication that I am correct in my suspicion that when computers finally pass the Turing test it won’t be because computers have become smarter; it will be because humans have become dumber.
lol
Pynchon and the Badass Luddites
To close out Luddite Week here at Rough Type, I would like to direct the Internet’s attention (when the Internet pays attention, servers fail and nodes collapse, and a rictal grin spreads across Ned Ludd’s bony face) to an article on the topic of Ludditism by Thomas Pynchon, which ran in the New York Times Book Review in that fabled year, 1984. Written nearly a decade before the World Wide Web would turn the Internet into a popular medium, the article is nevertheless entirely up to date in its description of humankind’s submergence in a superabundance of accessible data:
… we have come to live among flows of data more vast than anything the world has seen. Demystification is the order of our day, all the cats are jumping out of all the bags and even beginning to mingle. We immediately suspect ego insecurity in people who may still try to hide behind the jargon of a specialty or pretend to some data base forever “beyond” the reach of a layman. Anybody with the time, literacy, and access fee can get together with just about any piece of specialized knowledge s/he may need … the problem has really become how to find the time to read anything outside one’s own specialty.
Pynchon recalls C. P. Snow’s assertion, in his famous 1959 lecture about the growing divide between the “two cultures” of the literary intellectual and the scientific intellectual, that “if we forget the scientific culture, then the rest of intellectuals have never tried, wanted, or been able to understand the Industrial Revolution.” Those intellectuals, the literary types, were, said Snow, “natural Luddites.” Things have changed, notes Pynchon, in the years since Snow’s lecture:
… it’s hard to imagine anybody these days wanting to be called a literary intellectual, though it doesn’t sound so bad if you broaden the labeling to, say, “people who read and think.” Being called a Luddite is another matter. It brings up questions such as, Is there something about reading and thinking that would cause or predispose a person to turn Luddite?
Which leads Pynchon to a consideration of the possibly mythical, and definitely mystical, figure of Ned Ludd, who in 1779, as legend has it,
broke into a house and “in a fit of insane rage” destroyed two machines used for knitting hosiery. Word got around. Soon, whenever a stocking-frame was found sabotaged … folks would respond with the catch phrase “Lud must have been here.” By the time his name was taken up by the frame-breakers of 1812, historical Ned Lud was well absorbed into the more or less sarcastic nickname “King (or Captain) Ludd,” and was now all mystery, resonance and dark fun: a more-than-human presence, out in the night, roaming the hosiery districts of England, possessed by a single comic shtick – every time he spots a stocking-frame he goes crazy and proceeds to trash it.
The twist here is that the mechanical knitting-frame had already been around for nearly two centuries, having been invented in 1589 by a gentleman annoyed that the woman he was courting seemed more interested in fiddling with her knitting needles than heeding his romantic overtures. (Which may mean that the Industrial Revolution originated in sex-craziness.) So it’s an oversimplification, Pynchon continues, to assume that Ned was “a technophobic crazy” lashing out at a new automated device that was endangering a way of work and a way of life:
No doubt what people admired and mythologized him for was the vigor and single-mindedness of his assault … Ned Lud’s anger was not directed at the machines, not exactly. I like to think of it more as the controlled, martial-arts type anger of the dedicated Badass.
Ned Ludd as Bruce Lee! Or as Uma Thurman in Kill Bill! The movie treatment writes itself.
Public feeling about the machines could never have been simple unreasoning horror, but likely something more complex: the love/hate that grows up between humans and machinery – especially when it’s been around for a while – not to mention serious resentment toward at least two multiplications of effect that were seen as unfair and threatening. One was the concentration of capital that each machine represented, and the other was the ability of each machine to put a certain number of humans out of work – to be “worth” that many human souls. What gave King Ludd his special Bad charisma, took him from local hero to nationwide public enemy, was that he went up against these amplified, multiplied, more than human opponents and prevailed.
The Rough Type lawyers tell me that I’ve reached the limits of the fair-use doctrine. Which comes as a relief, since I find that the remainder of Pynchon’s essay, weaving from Frankenstein to Star Wars by way of Hiroshima, defies the blogger’s (never mind the Tweeter’s) urge to tidbitize. You’ll have to read it yourself.
But just remember this one thing if you’re ever tempted to call me a Luddite: I am not a Luddite. I am a Badass.
The law of situational Ludditism
As I’ve thought some more about my iPad Luddites post, and the many fine comments that have affixed themselves to its hull, I’ve formulated the following observation:
We are all Luddites, but to avoid admitting our Ludditism to ourselves we will define any manifestation of progress that we don’t approve of as “regress” and criticize it as such.
Exodus
Has it begun?
James Sturm, the cartoonist, can’t take it anymore, “it” being the Internet:
Over the last several years, the Internet has evolved from being a distraction to something that feels more sinister. Even when I am away from the computer I am aware that I AM AWAY FROM MY COMPUTER and am scheming about how to GET BACK ON THE COMPUTER. I’ve tried various strategies to limit my time online: leaving my laptop at my studio when I go home, leaving it at home when I go to my studio, a Saturday moratorium on usage. But nothing has worked for long. More and more hours of my life evaporate in front of YouTube … Essential online communication has given way to hours of compulsive e-mail checking and Web surfing. The Internet has made me a slave to my vanity: I monitor the Amazon ranking of my books on an hourly basis, and I’m constantly searching for comments and discussions about my work.
He’s not quite ready to divorce the web. But he’s decided on a four-month trial separation. Like Edan Lepucki, he’s having someone visit his online accounts and change all his passwords, just to be safe.
I know there’s no going back to the pre-Internet days, but I just want to move forward a little more slowly.
Disconnection is the new counterculture.
UPDATE: There’s an amusing exchange in the comments to Sturm’s article at Slate:
[Screenshot: comment exchange from Slate]
The iPad Luddites
Is it possible for a Geek God to also be a Luddite? That was the question that popped into my head as I read Cory Doctorow’s impassioned anti-iPad diatribe at Boing Boing. The device that Apple calls “magical” and “revolutionary” is, to Doctorow, a counterrevolutionary contraption conjured up through the black magic of the wizards at One Infinite Loop. The locked-down, self-contained design of the iPad – nary a USB port in sight, and don’t even think about loading an app that hasn’t been blessed by Apple – manifests “a palpable contempt for the owner,” writes Doctorow. You can’t fiddle with the dang thing:
The original Apple ][+ came with schematics for the circuit boards, and birthed a generation of hardware and software hackers who upended the world for the better. If you wanted your kid to grow up to be a confident, entrepreneurial, and firmly in the camp that believes that you should forever be rearranging the world to make it better, you bought her an Apple ][+ …
The way you improve your iPad isn’t to figure out how it works and making it better. The way you improve the iPad is to buy iApps. Buying an iPad for your kids isn’t a means of jump-starting the realization that the world is yours to take apart and reassemble; it’s a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.
Doctorow is not the only Geek God who’s uncomfortable with Apple’s transformation of the good ole hacktastic PC into a sleek, slick, sterile appliance. Many have accused Apple of removing from the personal computer not only its openness and open-endedness but also what Jonathan Zittrain, a cofounder of Harvard’s Berkman Center for Internet & Society, calls its “generativity” – its capacity for encouraging and abetting creative work by its users. In criticizing the closed nature of the iPhone, from which the iPad borrows its operating system, Zittrain, like Doctorow, invoked the ancient, beloved Apple II: “a clean slate, a device built – boldly – with no specific tasks in mind.”
Tim Bray, the venerated programmer who recently joined Google, worries that the iPad, which is specifically designed to optimize a few tasks and cripple others, could lead to “a very nasty future scenario”:
At the moment, more or less any personal computer, given enough memory, can be used for ‘creative’ applications like photo editors and IDEs (and, for pedal-to-the-metal money people, big spreadsheets). If memory-starved tablets become ubiquitous, we’re looking at a future in which there are “normal” computers, and then “special” computers for creative people … I dislike this future not just for personal but for ideological reasons; I’m deeply bought-into the notion of a Web populated by devices that almost anyone can afford and on which anyone can be creative, if they want.
What these folks are ranting against, or at least gnashing their teeth over, is progress – or, more precisely, progress that goes down a path they don’t approve of. They want progress to, as Bray admits, follow their own ideological bent, and when it takes a turn they don’t like they start grumbling like granddads, yearning for the days of their idealized Apple IIs, when men were men and computers were computers.
If Ned Ludd had been a blogger, he would have written a post similar to Doctorow’s about those newfangled locked-down mechanical looms that distance the weaver from the machine’s workings, requiring the weaver to follow the programs devised by the looms’ manufacturer. The design of the mechanical loom, Ned would have told us, exhibits a palpable contempt for the user. It takes the generativity out of weaving.
And Ned would have been right.
I have a lot of sympathy for the point of view expressed by Doctorow, Zittrain, Bray, and others of their ilk. The iPad, for all its glitzy technical virtuosity, does feel like a step backwards from the Apple II and its progeny. Hell, I still haven’t gotten over Apple’s removal of analog RCA plugs for audio and video input and output from the back of its Macs. Give me a beige box with easily accessible innards, a big rack of RAM, and a dozen or so ports, and I’m a happy camper.
But I’m not under any illusion that progress gives a damn about what I want. While progress may be spurred by the hobbyist, it does not share the hobbyist’s ethic. One of the keynotes of technological advance is its tendency, as it refines a tool, to remove real human agency from the workings of that tool. In its place, we get an abstraction of human agency that represents the general desires of the masses as deciphered, or imposed, by the manufacturer and the marketer. Indeed, what tends to distinguish the advanced device from the primitive device is the absence of “generativity.” It’s useful to remember that the earliest radios were broadcasting devices as well as listening devices and that the earliest phonographs could be used for recording as well as playback. But as these machines progressed, along with the media systems in which they became embedded, they turned into streamlined, single-purpose entertainment boxes, suitable for living rooms. What Bray fears – the divergence of the creative device from the mass-market device – happened, and happened quickly and without much, if any, resistance.
Progress may, for a time, intersect with one’s own personal ideology, and during that period one will become a gung-ho technological progressivist. But that’s just coincidence. In the end, progress doesn’t care about ideology. Those who think of themselves as great fans of progress, of technology’s inexorable march forward, will change their tune as soon as progress destroys something they care deeply about. “We love the things we love for what they are,” wrote Robert Frost. And when those things change we rage against the changes. Passion turns us all into primitivists.