Within the New Class [of computer experts] are more than 30,000 people, including hundreds of attractive young ladies who have studied computers at such colleges as Vassar. They are concerned not with the executive problems of how to use computers but with How to Talk to the Machine. They have their own language and like to discuss such things as automorphisms, combinatorial lemmas, (0,1)-matrices of size m by n, Monte Carlo theory, heuristic programing, Boolean trees and don’t care conditions. Fortunately space does not permit these terms to be explained here.
Our computers advance, but our fears about them remain remarkably consistent, cycling through peaks and valleys in some as yet undiagnosed pattern. In the spring of 1961, Life magazine ran a long feature story titled “The Machines Are Taking Over: Computers Outdo Man at His Work — and Soon May Outthink Him.” It made for unsettling reading.
“The American economy,” reported the writer, Warren R. Young, “is approaching the point of no return in its reliance on computers.” He provided a long list of examples to show how computers were quickly taking over not only factory work but also professional jobs requiring analysis and decision-making, in such fields as engineering, finance, and business. Computers “will tend to make middle-management obsolete.” The digital machines, he went on, were even moving into the creative trades, composing “passable pop songs” and “Beatnik poems.” Soon, they’d be able to perform “robotic translation of foreign publications, particularly scientific and political material written in Russian.”
The use of language is, of course, one of the traits that have most notably distinguished human beings from all other creatures. The complete mastery of human language by computers may well be on its way. Some scientists say that digital computers can already “think.” Though these scientists greatly doubt that computers will be able to do creative thinking, the machines are coming close.
Most ominous of all, wrote Young, was the arrival of machine learning:
A new machine called the Perceptron is actually able to learn things by itself, by studying its environment. Built by a Cornell psychologist, Dr. Frank Rosenblatt, it is equipped to look at pictures and in future versions will hear spoken words. It not only recognizes what it has seen before but also teaches itself generalizations about these. It can even identify new shapes similar to those it has seen before.
The Perceptron is so complex that even its inventor can no longer predict how it will react to a new problem. “If devices like the Perceptron,” says one expert, “can really learn effectively by themselves, we will be approaching the making of a true robot, fantastic as that sounds. But remember, all this was begun and devised by human brains, so humans — if they take care — will remain supreme.”
Young didn’t find such tepid reassurances all that convincing:
This is cheering news, no doubt. But there is another view of the future in a story that computer designers now tell only as a macabre joke: A weary programmer who has spent his life tending a computer that always has the right answer for everything finally gets fed up. “All right,” he asks his machine, “if you’re so smart, tell me — is there a God?” The computer whirs gently, its lights flicker, its coils buzz and hum, and at last it clicks out the answer: THERE IS NOW.
Computers hadn’t even mastered lower-case letters, and already we’d infused them with delusions of grandeur.
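For readers who want to see what Rosenblatt's self-teaching actually amounted to, the Perceptron's celebrated learning boils down to a simple corrective loop: guess, and when the guess is wrong, nudge a set of weights toward the right answer. Here's a minimal sketch in Python; the function name and the toy data are mine, purely illustrative, not drawn from Rosenblatt's hardware or the Life article:

```python
# A toy version of Rosenblatt's perceptron rule: start with zeroed
# weights and nudge them whenever the machine classifies an example
# incorrectly.

def train_perceptron(samples, labels, epochs=10, lr=1.0):
    """samples: list of equal-length feature tuples; labels: +1 or -1."""
    w = [0.0] * len(samples[0])   # one weight per input feature
    b = 0.0                       # bias term
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # Predict: the sign of the weighted sum of the inputs.
            guess = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
            if guess != y:
                # Wrong guess: shift the weights toward the right answer.
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# A linearly separable toy task: positive when the first input is large.
samples = [(2.0, 1.0), (1.5, 2.0), (-1.0, 0.5), (-2.0, -1.0)]
labels = [1, 1, -1, -1]
print(train_perceptron(samples, labels))
```

Simple arithmetic, repeated; yet even this toy version arrives at weights its author didn't pick by hand, which is roughly the property that spooked Young's experts.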
I have an op-ed about how we misperceive our computers and ourselves, “Why Robots Will Always Need Us,” in this morning’s New York Times. A snippet:
While our flaws loom large in our thoughts, we view computers as infallible. Their scripted consistency presents an ideal of perfection far removed from our own clumsiness. What we forget is that our machines are built by our own hands. When we transfer work to a machine, we don’t eliminate human agency and its potential for error. We transfer that agency into the machine’s workings, where it lies concealed until something goes awry.
Vocabulary is rarely so rich, so dense with branch and twig, as in the realm of flora and fauna. Plants and animals go by all sorts of strange and evocative names depending on where you are and whom you’re talking with. One local term for the kestrel, reports Robert Macfarlane in an article in Orion, is wind-fucker. Having learned the word, he writes, “it is hard now not to see in the pose of the hovering kestrel a certain lustful quiver.”
I’m reminded of Seamus Heaney’s translation of a Middle English poem, “The Names of the Hare”:
The wimount, the messer,
the skidaddler, the nibbler,
the ill-met, the slabber.
The quick-scut, the dew-flirt,
the grass-biter, the goibert,
the home-late, the do-the-dirt.
It goes on that way for a couple dozen more lines, each of which brings you a little closer to the nature of the beastie.
Macfarlane’s piece, drawn from his forthcoming book Landmarks, was inspired by the discovery that a great dictionary for kids, the Oxford Junior Dictionary, is being pruned of words describing the stuff of the natural world. Being inserted in their place are words describing the abstractions and symbols of the digital and bureaucratic spheres:
Under pressure, Oxford University Press revealed a list of the entries it no longer felt to be relevant to a modern-day childhood. The deletions included acorn, adder, ash, beech, bluebell, buttercup, catkin, conker, cowslip, cygnet, dandelion, fern, hazel, heather, heron, ivy, kingfisher, lark, mistletoe, nectar, newt, otter, pasture, and willow. The words introduced to the new edition included attachment, block-graph, blog, broadband, bullet-point, celebrity, chatroom, committee, cut-and-paste, MP3 player, and voice-mail.
They yanked out bluebell and put in bullet-point? What shit-asses.
The substitutions made in the dictionary — the outdoor and the natural being displaced by the indoor and the virtual — are a small but significant symptom of the simulated life we increasingly live. Children are now (and valuably) adept ecologists of the technoscape, with numerous terms for file types but few for different trees and creatures. A basic literacy of landscape is falling away up and down the ages.
As Macfarlane goes on to say, the changes in the dictionary don’t just testify to our weakening grasp on nature. Something else is being lost: “a kind of word magic, the power that certain terms possess to enchant our relations with nature and place.”
As the writer Henry Porter observed, the OUP deletions removed the “euphonious vocabulary of the natural world — words which do not simply label an object or action but in some mysterious and beautiful way become part of it.”
I’m sure that many will label Macfarlane and Porter “romantics.” I’ve begun to notice that romantic is replacing Luddite and nostalgist as the insult-of-choice deployed by techno-apologists to dismiss anyone with more expansive interests than their own. That, too, is telling. It’s always been a sin against progress to look backward. Now it’s also a sin against progress to look inward. And so, fading from sight and imagination alike, the world becomes ever vaguer to us — not mysterious but peripheral, its things unworthy even of being named. Who now would think of the wind as something that might be fucked?
Photo: Rick Cameron.
Forget Mayweather and Pacquiao. Tonight in New York, Intelligence Squared is hosting a debate on the proposition “Smart Technology Is Making Us Dumb.” Arguing for the proposition will be Andrew Keen and I. Arguing against it will be Genevieve Bell and David Weinberger. The event is sold out, but you can watch the video here.
Today is Rough Type’s tenth birthday. If I’m calculating correctly, that’s eighty in human years.
A transmigration of the blog would seem to be in order.
Image: Strolic Furlan-Davide Gabino.
[first series, 2012]
1. The complexity of the medium is inversely proportional to the eloquence of the message.
2. Hypertext is a more conservative medium than text.
3. The best medium for the nonlinear narrative is the linear page.
4. Twitter is a more ruminative medium than Facebook.
5. The introduction of digital tools has never improved the quality of an art form.
6. The returns on interactivity quickly turn negative.
7. In the material world, doing is knowing; in media, the opposite is often true.
8. Facebook’s profitability is directly tied to the shallowness of its members: hence its strategy.
9. Increasing the intelligence of a network tends to decrease the intelligence of those connected to it.
10. The one new art form spawned by the computer — the videogame — is the computer’s prisoner.
11. Personal correspondence grows less interesting as the speed of its delivery quickens.
12. Programmers are the unacknowledged legislators of the world.
13. The album cover turned out to be indispensable to popular music.
14. The pursuit of followers on Twitter is an occupation of the bourgeoisie.
15. Abundance of information breeds delusions of knowledge among the unwary.
16. No great work of literature could have been written in hypertext.
17. The philistine appears ideally suited to the role of cultural impresario online.
18. Television became more interesting when people started paying for it.
19. Instagram shows us what a world without art looks like.
20. Online conversation is to oral conversation as a mask is to a face.
[second series, 2013]
21. Recommendation engines are the best cure for hubris.
22. Vines would be better if they were one second shorter.
23. Hell is other selfies.
24. Twitter has revealed that brevity and verbosity are not always antonyms.
25. Personalized ads provide a running critique of artificial intelligence.
26. Who you are is what you do between notifications.
27. Online is to offline as a swimming pool is to a pond.
28. People in love leave the sparsest data trails.
29. YouTube fan videos are the living fossils of the original web.
30. Mark Zuckerberg is the Grigory Potemkin of our time.
[third series, 2014]
31. Every point on the internet is a center of the internet.
32. On Twitter, one’s sense of solipsism intensifies as one’s follower count grows.
33. A thing contains infinitely more information than its image.
34. A book has many pages; an ebook has one page.
35. If a hard drive is a soul, the cloud is the oversoul.
36. A self-driving car is a contradiction in terms.
37. The essence of an event is the ghost in the recording.
38. A Snapchat message becomes legible as it vanishes.
39. When we turn on a GPS system, we become cargo.
40. Google searches us.
41. Tools extend us; technology confines us.
42. People take facts as metaphors; computers take metaphors as facts.
43. We need not fear robots until robots fear us.
44. Programmers are ethicists in denial.
45. The dream of frictionlessness is a death wish.
46. A car without a steering wheel is comic; a car without a rearview mirror is tragic.
47. One feels lightest after one clears one’s browser cache.
48. The things of the world manifest themselves as either presence or absence.
49. Memory is the medium of absence; time is the medium of presence.
50. A bird resembles us most when it flies into a window.