Insert human here

I have an op-ed about how we misperceive our computers and ourselves, “Why Robots Will Always Need Us,” in this morning’s New York Times. A snippet:

While our flaws loom large in our thoughts, we view computers as infallible. Their scripted consistency presents an ideal of perfection far removed from our own clumsiness. What we forget is that our machines are built by our own hands. When we transfer work to a machine, we don’t eliminate human agency and its potential for error. We transfer that agency into the machine’s workings, where it lies concealed until something goes awry.

Read it.

Wind-fucking

Vocabulary is rarely so rich, so thick with branch and twig, as in the realm of flora and fauna. Plants and animals go by all sorts of strange and evocative names depending on where you are and whom you’re talking with. One local term for the kestrel, reports Robert Macfarlane in an article in Orion, is wind-fucker. Having learned the word, he writes, “it is hard now not to see in the pose of the hovering kestrel a certain lustful quiver.”

I’m reminded of Seamus Heaney’s translation of a Middle English poem, “The Names of the Hare”:

Beat-the-pad, white-face,
funk-the-ditch, shit-ass.

The wimount, the messer,
the skidaddler, the nibbler,
the ill-met, the slabber.

The quick-scut, the dew-flirt,
the grass-biter, the goibert,
the home-late, the do-the-dirt.

It goes on that way for a couple dozen more lines, each of which brings you a little closer to the nature of the beastie.

Macfarlane’s piece, drawn from his forthcoming book Landmarks, was inspired by the discovery that a great dictionary for kids, the Oxford Junior Dictionary, is being pruned of words describing the stuff of the natural world. Being inserted in their place are words describing the abstractions and symbols of the digital and bureaucratic spheres:

Under pressure, Oxford University Press revealed a list of the entries it no longer felt to be relevant to a modern-day childhood. The deletions included acorn, adder, ash, beech, bluebell, buttercup, catkin, conker, cowslip, cygnet, dandelion, fern, hazel, heather, heron, ivy, kingfisher, lark, mistletoe, nectar, newt, otter, pasture, and willow. The words introduced to the new edition included attachment, block-graph, blog, broadband, bullet-point, celebrity, chatroom, committee, cut-and-paste, MP3 player, and voice-mail.

They yanked out bluebell and put in bullet-point? What shit-asses.

Macfarlane writes:

The substitutions made in the dictionary — the outdoor and the natural being displaced by the indoor and the virtual — are a small but significant symptom of the simulated life we increasingly live. Children are now (and valuably) adept ecologists of the technoscape, with numerous terms for file types but few for different trees and creatures. A basic literacy of landscape is falling away up and down the ages.

As Macfarlane goes on to say, the changes in the dictionary don’t just testify to our weakening grasp on nature. Something else is being lost: “a kind of word magic, the power that certain terms possess to enchant our relations with nature and place.”

As the writer Henry Porter observed, the OUP deletions removed the “euphonious vocabulary of the natural world — words which do not simply label an object or action but in some mysterious and beautiful way become part of it.”

I’m sure that many will label Macfarlane and Porter “romantics.” I’ve begun to notice that romantic is replacing Luddite and nostalgist as the insult-of-choice deployed by techno-apologists to dismiss anyone with more expansive interests than their own. That, too, is telling. It’s always been a sin against progress to look backward. Now it’s also a sin against progress to look inward. And so, fading from sight and imagination alike, the world becomes ever vaguer to us — not mysterious but peripheral, its things unworthy even of being named. Who now would think of the wind as something that might be fucked?

Photo: Rick Cameron.

Smart-tech smackdown

Forget Mayweather and Pacquiao. Tonight in New York, Intelligence Squared is hosting a debate on the proposition “Smart Technology Is Making Us Dumb.” Arguing for the proposition will be Andrew Keen and I. Arguing against it will be Genevieve Bell and David Weinberger. The event is sold out, but you can watch the video here.

Theses in tweetform (fourth series)

[first series, 2012]

1. The complexity of the medium is inversely proportional to the eloquence of the message.

2. Hypertext is a more conservative medium than text.

3. The best medium for the nonlinear narrative is the linear page.

4. Twitter is a more ruminative medium than Facebook.

5. The introduction of digital tools has never improved the quality of an art form.

6. The returns on interactivity quickly turn negative.

7. In the material world, doing is knowing; in media, the opposite is often true.

8. Facebook’s profitability is directly tied to the shallowness of its members: hence its strategy.

9. Increasing the intelligence of a network tends to decrease the intelligence of those connected to it.

10. The one new art form spawned by the computer – the videogame – is the computer’s prisoner.

11. Personal correspondence grows less interesting as the speed of its delivery quickens.

12. Programmers are the unacknowledged legislators of the world.

13. The album cover turned out to be indispensable to popular music.

14. The pursuit of followers on Twitter is an occupation of the bourgeoisie.

15. Abundance of information breeds delusions of knowledge among the unwary.

16. No great work of literature could have been written in hypertext.

17. The philistine appears ideally suited to the role of cultural impresario online.

18. Television became more interesting when people started paying for it.

19. Instagram shows us what a world without art looks like.

20. Online conversation is to oral conversation as a mask is to a face.

[second series, 2013]

21. Recommendation engines are the best cure for hubris.

22. Vines would be better if they were one second shorter.

23. Hell is other selfies.

24. Twitter has revealed that brevity and verbosity are not always antonyms.

25. Personalized ads provide a running critique of artificial intelligence.

26. Who you are is what you do between notifications.

27. Online is to offline as a swimming pool is to a pond.

28. People in love leave the sparsest data trails.

29. YouTube fan videos are the living fossils of the original web.

30. Mark Zuckerberg is the Grigory Potemkin of our time.

[third series, 2014]

31. Every point on the internet is a center of the internet.

32. On Twitter, one’s sense of solipsism intensifies as one’s follower count grows.

33. A thing contains infinitely more information than its image.

34. A book has many pages; an ebook has one page.

35. If a hard drive is a soul, the cloud is the oversoul.

36. A self-driving car is a contradiction in terms.

37. The essence of an event is the ghost in the recording.

38. A Snapchat message becomes legible as it vanishes.

39. When we turn on a GPS system, we become cargo.

40. Google searches us.

[fourth series]

41. Tools extend us; technology confines us.

42. People take facts as metaphors; computers take metaphors as facts.

43. We need not fear robots until robots fear us.

44. Programmers are ethicists in denial.

45. The dream of frictionlessness is a death wish.

46. A car without a steering wheel is comic; a car without a rearview mirror is tragic.

47. One feels lightest after one clears one’s browser cache.

48. The things of the world manifest themselves as either presence or absence.

49. Memory is the medium of absence; time is the medium of presence.

50. A bird resembles us most when it flies into a window.

Image: Sam-Cat.

What do robots do?

Yesterday I posted an excerpt from the start of Paul Goodman’s 1969 NYRB essay “Can Technology Be Humane?” Here’s another bit, equally relevant to our current situation, from later in the piece, when Goodman turns his attention to automation, robots, and what we today call “big data”:

In automating there is an analogous dilemma of how to cope with masses of people and get economies of scale, without losing the individual at great consequent human and economic cost. A question of immense importance for the immediate future is, Which functions should be automated or organized to use business machines, and which should not? This question also is not getting asked, and the present disposition is that the sky is the limit for extraction, refining, manufacturing, processing, packaging, transportation, clerical work, ticketing, transactions, information retrieval, recruitment, middle management, evaluation, diagnosis, instruction, and even research and invention. Whether the machines can do all these kinds of jobs and more is partly an empirical question, but it also partly depends on what is meant by doing a job. Very often, e.g., in college admissions, machines are acquired for putative economies (which do not eventuate); but the true reason is that an overgrown and overcentralized organization cannot be administered without them. The technology conceals the essential trouble, e.g., that there is no community of scholars and students are treated like things. The function is badly performed, and finally the system breaks down anyway. I doubt that enterprises in which interpersonal relations are important are suited to much programming.

But worse, what can happen is that the real function of the enterprise is subtly altered so that it is suitable for the mechanical system. (E.g., “information retrieval” is taken as an adequate replacement for critical scholarship.) Incommensurable factors, individual differences, the local context, the weighting of evidence are quietly overlooked though they may be of the essence. The system, with its subtly transformed purposes, seems to run very smoothly; it is productive, and it is more and more out of line with the nature of things and the real problems. Meantime it is geared in with other enterprises of society, e.g., major public policy may depend on welfare or unemployment statistics which, as they are tabulated, are blind to the actual lives of poor families. In such a case, the particular system may not break down, the whole society may explode.

I need hardly point out that American society is peculiarly liable to the corruption of inauthenticity, busily producing phony products. It lives by public relations, abstract ideals, front politics, show-business communications, mandarin credentials. It is preeminently overtechnologized. And computer technologists especially suffer the euphoria of being in a new and rapidly expanding field. It is so astonishing that the robot can do the job at all or seem to do it, that it is easy to blink at the fact that he is doing it badly or isn’t really doing quite that job.

Goodman here makes a crucial point that still gets overlooked in discussions of automation. Computers and people work in different ways. When any task is shifted from a person to a computer, therefore, the task changes in order to be made suitable for the computer. As the process of automation continues, the context in which the task is performed also changes, in order to be made amenable to automation. The enterprise changes, the school changes, the hospital changes, the household changes, the economy changes, the society changes. The temptation, all along the way, is to look to the computer to provide the measures by which we evaluate those changes, which ends up concealing rather than revealing the true and full nature of the changes. Goodman expresses the danger succinctly: “The system, with its subtly transformed purposes, seems to run very smoothly; it is productive, and it is more and more out of line with the nature of things and the real problems.”

Image: cutetape.

The prudent technologist

Paul Goodman, 1969:

Whether or not it draws on new scientific research, technology is a branch of moral philosophy, not of science. It aims at prudent goods for the commonweal and to provide efficient means for these goods. At present, however, “scientific technology” occupies a bastard position in the universities, in funding, and in the public mind. It is half tied to the theoretical sciences and half treated as mere know-how for political and commercial purposes. It has no principles of its own. To remedy this—so Karl Jaspers in Europe and Robert Hutchins in America have urged—technology must have its proper place on the faculty as a learned profession important in modern society, along with medicine, law, the humanities, and natural philosophy, learning from them and having something to teach them. As a moral philosopher, a technician should be able to criticize the programs given him to implement. As a professional in a community of learned professionals, a technologist must have a different kind of training and develop a different character than we see at present among technicians and engineers. He should know something of the social sciences, law, the fine arts, and medicine, as well as relevant natural sciences.

Prudence is foresight, caution, utility. Thus it is up to the technologists, not to regulatory agencies of the government, to provide for safety and to think about remote effects. This is what Ralph Nader is saying and Rachel Carson used to ask. An important aspect of caution is flexibility, to avoid the pyramiding catastrophe that occurs when something goes wrong in interlocking technologies, as in urban power failures. Naturally, to take responsibility for such things often requires standing up to the front office and urban politicians, and technologists must organize themselves in order to have power to do it.