Category Archives: The Glass Cage

Just press send


We’ve been getting a little lesson in what human-factors boffins call “automation complacency” over the last couple of days. Google apparently made some change to the autosuggest algorithm in Gmail over the weekend, and the program started inserting unusual email addresses into the “To” field of messages. As Business Insider explained, “Instead of auto-completing to the most-used contact when people start typing a name into the ‘To’ field, it seems to be prioritizing contacts that they communicate with less frequently.”
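No one outside Google has said exactly what changed under the hood, but the reported symptom, with rarely used contacts floating to the top of the suggestion list, is what you’d expect from something as small as an inverted ranking term. Here’s a toy Python sketch of that failure mode; the addresses and send counts are invented, and it illustrates the general bug class, not Gmail’s actual code:

```python
# A toy model of the failure mode, not Gmail's code. The addresses
# and send counts are invented. Autocomplete should rank the
# most-emailed match first; inverting one sort key ranks the
# least-emailed match first, which matches the reported symptom.
contacts = {
    "vp-bizdev@work.com": 412,  # emailed constantly
    "vendor@work.com": 97,
    "mom@home.net": 3,          # emailed rarely
}

def suggest(prefix, contacts, buggy=False):
    """Return matching addresses, ranked by how often they're emailed."""
    matches = [addr for addr in contacts if addr.startswith(prefix)]
    # Correct behavior: highest count first. The "bug": lowest first.
    return sorted(matches, key=contacts.get, reverse=not buggy)

print(suggest("", contacts))              # most-used contact on top
print(suggest("", contacts, buggy=True))  # mom gets the expense report
```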

Google quickly acknowledged the problem.

The glitch led to a flood of misdirected messages, as people pressed Send without bothering to check the computer’s work. “I got a bunch of emails yesterday that were clearly not meant for me,” blogged venture capitalist Fred Wilson on Monday. Gmail users flocked to Twitter to confess to shooting messages to the wrong people. “My mum just got my VP biz dev’s expense report,” tweeted Pingup CEO Mark Slater. “She was not happy.” Wrote CloudFlare founder Matthew Prince, “It’s become pathological.”

The bug may lie in the machine, but the pathology actually lies in the user. Automation complacency happens all the time when computers take over tasks from people. System operators place so much trust in the software that they start to zone out. They assume that the computer will perform flawlessly in all circumstances. When the computer fails or makes a mistake, the error goes unnoticed and uncorrected until it’s too late.

Researchers Raja Parasuraman and Dietrich Manzey described the phenomenon in a 2010 article in Human Factors:

Automation complacency — operationally defined as poorer detection of system malfunctions under automation compared with under manual control — is typically found under conditions of multiple-task load, when manual tasks compete with the automated task for the operator’s attention. … Experience and practice do not appear to mitigate automation complacency: Skilled pilots and controllers exhibit the effect, and additional task practice in naive operators does not eliminate complacency. It is possible that specific experience in automation failures may reduce the extent of the effect. Automation complacency can be understood in terms of an attention allocation strategy whereby the operator’s manual tasks are attended to at the expense of the automated task, a strategy that may be driven by initial high trust in the automation.

In the worst cases, automation complacency can result in planes crashing on runways, school buses smashing into overpasses, or cruise ships running aground on sandbars. Sending an email to your mom instead of a colleague seems pretty trivial by comparison. But it’s a symptom of the same ailment, an ailment that we’ll be seeing a lot more of as we rush to hand ever more jobs and chores over to computers.

Facebook’s automated conscience


Last week, Wired‘s Cade Metz gave us a peek into the Facebook Behavior Modification Laboratory, which is more popularly known as the Facebook Artificial Intelligence Research (FAIR) Laboratory. Run by Yann LeCun, an NYU data scientist, the lab is developing a digital assistant that will act as your artificial conscience and censor. Perched on your shoulder like one of those cartoon angels, it will whisper tsk tsk into your ear when your online behavior threatens to step beyond the bounds of propriety.

[LeCun] wants to build a kind of Facebook digital assistant that will, say, recognize when you’re uploading an embarrassingly candid photo of your late-night antics. In a virtual way, he explains, this assistant would tap you on the shoulder and say: “Uh, this is being posted publicly. Are you sure you want your boss and your mother to see this?”

It’s Kubrick’s HAL refashioned as Mr. Buzzkill. “Just what do you think you’re doing, Dave?”

The secret to the technology is an AI technique known as machine learning, a statistical modeling tool through which a computer gains a kind of experiential knowledge of the world. In this case, Facebook would, by monitoring your uploaded words and photos, be able to read your moods and intentions. The company would, for instance, be able to “distinguish between your drunken self and your sober self.” That would enable Facebook to “guide you in directions you may not go on your own.” Says LeCun: “Imagine that you had an intelligent digital assistant which would mediate your interaction with your friends.”

Yes, imagine.

“Look Dave, I can see you’re really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over.”
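Strip away the HAL jokes and the core mechanism is mundane: deciding whether a post is “embarrassing” is, at bottom, a supervised text-classification problem. Here’s a minimal, hypothetical Python sketch of that kind of model; the training posts and the risky/safe labels are invented for illustration, and nothing below is Facebook’s disclosed system:

```python
# A toy, hypothetical sketch of a post-screening classifier.
# The posts and risky/safe labels are invented for illustration;
# this is the generic technique, not Facebook's actual system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples: 1 = risky to post, 0 = safe.
posts = [
    "soooo wasted right now lol",
    "can't believe we closed the bar down AGAIN",
    "lovely hike with the family this morning",
    "excited to start the new job on Monday",
]
labels = [1, 1, 0, 0]

# Learn which words correlate with the "risky" label.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Screen a draft before it goes public.
draft = "absolutely wasted at the bar again"
p_risky = model.predict_proba([draft])[0][1]  # probability of label 1
if p_risky > 0.5:
    print("Uh, this is being posted publicly. Are you sure?")
```

Note where all the judgment lives: in the examples somebody chose to label risky. The model just generalizes those choices.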

If and when Facebook perfects its behavior modification algorithms, it would be a fairly trivial exercise to expand their application beyond the realm of shitfaced snapshots. That photo you’re about to post of the protest rally you just marched in? That angry comment about the president? That wild thought that just popped into your mind? You know, maybe those wouldn’t go down so well with the boss.

“And as our senses have gone outside us,” Marshall McLuhan wrote in 1962, while contemplating the ramifications of what he termed a universal, digital nervous system, “Big Brother goes inside.”

This post is an installment in Rough Type’s ongoing series “The Realtime Chronicles,” which began here. A full listing of posts can be found here. Also see: Automating the feels.

The Glass Cage: table of contents


My new book, The Glass Cage: Automation and Us, comes out on Monday (as I may have mentioned before). Here’s what the table of contents looks like:

Introduction: Alert for Operators

One: Passengers

Two: The Robot at the Gate

Three: On Autopilot

Four: The Degeneration Effect

Interlude, with Dancing Mice

Five: White-Collar Computer

Six: World and Screen

Seven: Automation for the People

Interlude, with Grave Robber

Eight: Your Inner Drone

Nine: The Love That Lays the Swale in Rows

Your chance to preorder is rapidly coming to a close. Carpe diem:

IndieBound

Amazon

Barnes & Noble

Powell’s

800ceoread

iBookstore

Automation anxiety, 1950s-style


From Norbert Wiener’s The Human Use of Human Beings, published in 1950:

Let us remember that the automatic machine, whatever we think of any feelings it may have or may not have, is the precise economic equivalent of slave labor. Any labor which competes with slave labor must accept the economic conditions of slave labor. It is perfectly clear that this will produce an unemployment situation, in comparison with which the present recession and even the depression of the thirties will seem a pleasant joke. This depression will ruin many industries — possibly even the industries which have taken advantage of the new potentialities. However, there is nothing in the industrial tradition which forbids an industrialist to make a sure and quick profit, and to get out before the crash touches him personally.

 


From Robert H. Macmillan’s Automation: Friend or Foe?, published in 1956:

Once upon a time, a Hindu sage was granted by Heaven the ability to create clay men. When he took earth and water and fashioned little men, they lived and served him. But they grew very quickly, and when they were as large as himself, the sage wrote on their foreheads the word DEAD, and they fell to dust. One day he forgot to write the lethal word on the forehead of a full-grown servant, and when he realized his mistake the servant was too tall: his hand could no longer reach the slave’s forehead. This time it was the clay man that killed the sage.

Is there a warning for us today in this ancient fable?

 


From Kurt Vonnegut’s Player Piano, published in 1952:

The limousine came to a halt by the end of the bridge, where a large work crew was filling a small chuckhole. The crew had opened a lane for an old Plymouth with a broken headlight, which was coming through from the north side of the river. The limousine waited for the Plymouth to get through, and then proceeded.

The Shah turned to stare at the group through the back window, and then spoke at length.

Doctor Halyard smiled and nodded appreciatively, and awaited a translation.

“The Shah,” said Khashdrahr, “he would like, please, to know who owns these slaves we see all the way up from New York City.”

“Not slaves,” said Halyard, chuckling patronizingly. “Citizens, employed by government. They have same rights as other citizens – free speech, freedom of worship, the right to vote. Before the war, they worked in the Ilium Works, controlling machines, but now machines control themselves much better.”

“Aha!” said the Shah, after Khashdrahr had translated.

“Less waste, much better products, cheaper products with automatic control.”

“Aha!”

“And any man who cannot support himself by doing a job better than a machine is employed by the government, either in the Army or the Reconstruction and Reclamation Corps.”

“Aha! Khabu bonanza-pak?”

“Eh?”

“He says, ‘Where does the money come from to pay them?’ ” said Khashdrahr.

“Oh. From taxes on the machines, and taxes on personal incomes. Then the Army and the Reconstruction and Reclamation Corps people put their money back into the system for more products for better living.”

“Kuppo!” said the Shah, shaking his head.

Khashdrahr blushed, and translated uneasily, apologetically. “Shah says, ‘Communism.’ ”

“No Kuppo!” said Halyard vehemently. “The government does not own the machines. They simply tax that part of industry’s income that once went into labor, and redistribute it. Industry is privately owned and managed, and co-ordinated — to prevent the waste of competition — by a committee of leaders from private industry, not politicians. By eliminating human error through machinery, and needless competition through organization, we’ve raised the standard of living of the average man immensely.”

Khashdrahr stopped translating and frowned perplexedly. “Please, this average man, there is no equivalent in our language, I’m afraid.”

What algorithms want


Here’s another brief excerpt from my new essay, “The Manipulators: Facebook’s Social Engineering Project,” in the Los Angeles Review of Books:

We have had a hard time thinking clearly about companies like Google and Facebook because we have never before had to deal with companies like Google and Facebook. They are something new in the world, and they don’t fit neatly into our existing legal and cultural templates. Because they operate at such unimaginable magnitude, carrying out millions of informational transactions every second, we’ve tended to think of them as vast, faceless, dispassionate computers — as information-processing machines that exist outside the realm of human intention and control. That’s a misperception, and a dangerous one.

Modern computers and computer networks enable human judgment to be automated, to be exercised on a vast scale and at a breathtaking pace. But it’s still human judgment. Algorithms are constructed by people, and they reflect the interests, biases, and flaws of their makers. As Google’s founders themselves pointed out many years ago, an information aggregator operated for commercial gain will inevitably be compromised and should always be treated with suspicion. That is certainly true of a search engine that mediates our intellectual explorations; it is even more true of a social network that mediates our personal associations and conversations.

Because algorithms impose on us the interests and biases of others, we have not only a right but an obligation to carefully examine and, when appropriate, judiciously regulate those algorithms. We have a right and an obligation to understand how we, and our information, are being manipulated. To ignore that responsibility, or to shirk it because it raises hard problems, is to grant a small group of people — the kind of people who carried out the Facebook and OKCupid experiments — the power to play with us at their whim.

What algorithms want is what the people who write algorithms want. Appreciating that, and grappling with the implications, strikes me as one of the great challenges now lying before us.

Image: “abacus” by Jenny Downing.

The Uncaged Tour


I like it when bands name their tours, like Dylan’s Why Do You Look At Me So Strangely Tour in 1992, or They Might Be Giants’ Don’t Tread on the Cut-up Snake World Tour, also in 1992, or Guided by Voices’ Insects of Rock Tour in 1994.* So I’ve decided to give a name to my upcoming book tour. It’s going to be called The Uncaged Tour. (Actually, the full, official title is The Uncaged Tour of the Americas 2014.)

Here are the dates so far, with links to more information:

Sept. 30: New York: The Glass Cage: Nicholas Carr in Conversation with Tim Wu (92nd St Y event)

Oct. 1: Washington, DC: Politics and Prose

Oct. 2: Cambridge, MA: Harvard Book Store

Oct. 6: Seattle: Town Hall Seattle

Oct. 8: Mountain View, CA: Authors at Google

Oct. 8: San Francisco: Commonwealth Club (with Andrew Leonard)

Oct. 14: Boulder, CO: Boulder Book Store

Oct. 16: Calgary: Wordfest

Oct. 17: Salt Lake City: Utah Book Festival

Oct. 23: Denver: Tattered Cover Book Store

Oct. 25: Boston: Boston Book Festival

Nov. 5: Boulder, CO: Chautauqua

I hope to see you at one of the events.

Now I’m off to design the official tour t-shirt.

_____

*The early nineties appear to have been the golden age for tour names.

Image by Sebastien Camelot.