Category Archives: The Glass Cage

Facebook’s automated conscience


Last week, Wired’s Cade Metz gave us a peek into the Facebook Behavior Modification Laboratory, which is more popularly known as the Facebook Artificial Intelligence Research (FAIR) Laboratory. Run by Yann LeCun, an NYU data scientist, the lab is developing a digital assistant that will act as your artificial conscience and censor. Perched on your shoulder like one of those cartoon angels, it will whisper tsk tsk into your ear when your online behavior threatens to step beyond the bounds of propriety.

[LeCun] wants to build a kind of Facebook digital assistant that will, say, recognize when you’re uploading an embarrassingly candid photo of your late-night antics. In a virtual way, he explains, this assistant would tap you on the shoulder and say: “Uh, this is being posted publicly. Are you sure you want your boss and your mother to see this?”

It’s Kubrick’s HAL refashioned as Mr. Buzzkill. “Just what do you think you’re doing, Dave?”

The secret to the technology is an AI technique known as machine learning, a statistical modeling tool through which a computer gains a kind of experiential knowledge of the world. In this case, Facebook would, by monitoring your uploaded words and photos, be able to read your moods and intentions. The company would, for instance, be able to “distinguish between your drunken self and your sober self.” That would enable Facebook to “guide you in directions you may not go on your own.” Says LeCun: “Imagine that you had an intelligent digital assistant which would mediate your interaction with your friends.”
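To make the idea of machine learning a little more concrete: at bottom it means fitting a statistical model to labeled examples, then scoring new inputs against what the model has seen. Here is a deliberately toy sketch in Python (my own illustration, not Facebook's or FAIR's actual system, which would use neural networks trained on vast datasets) of a bag-of-words classifier that "learns" from a few hand-labeled posts which words signal a risky upload:

```python
from collections import Counter

# Toy illustration only: a bag-of-words classifier "trained" on a handful
# of hand-labeled example posts. A real system would learn weights from
# millions of examples; this sketch just counts how often each word
# appears under each label.

TRAINING = [
    ("so wasted last night lol", "risky"),
    ("cannot believe we closed the bar down", "risky"),
    ("great quarterly results for the team", "safe"),
    ("lovely hike with the family today", "safe"),
]

def train(examples):
    # Count word occurrences separately for each label.
    counts = {"risky": Counter(), "safe": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def classify(text, counts):
    # Score each label by how many times its training data
    # contained the words of the new post; highest score wins.
    scores = {
        label: sum(c[w] for w in text.lower().split())
        for label, c in counts.items()
    }
    return max(scores, key=scores.get)

model = train(TRAINING)
print(classify("wasted again at the bar", model))  # scores "risky" higher
```

The point of the sketch is that the "conscience" is nothing but accumulated statistics over past behavior: whatever the training data treats as embarrassing, the model will flag.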

Yes, imagine.

“Look Dave, I can see you’re really upset about this. I honestly think you ought to sit down calmly, take a stress pill, and think things over.”

If and when Facebook perfects its behavior modification algorithms, it would be a fairly trivial exercise to expand their application beyond the realm of shitfaced snapshots. That photo you’re about to post of the protest rally you just marched in? That angry comment about the president? That wild thought that just popped into your mind? You know, maybe those wouldn’t go down so well with the boss.

“And as our senses have gone outside us,” Marshall McLuhan wrote in 1962, while contemplating the ramifications of what he termed a universal, digital nervous system, “Big Brother goes inside.”

This post is an installment in Rough Type’s ongoing series “The Realtime Chronicles,” which began here. A full listing of posts can be found here. Also see: Automating the feels.


Filed under Realtime, The Glass Cage

The Glass Cage: table of contents


My new book, The Glass Cage: Automation and Us, comes out on Monday (as I may have mentioned before). Here’s what the table of contents looks like:

Introduction: Alert for Operators

One: Passengers

Two: The Robot at the Gate

Three: On Autopilot

Four: The Degeneration Effect

Interlude, with Dancing Mice

Five: White-Collar Computer

Six: World and Screen

Seven: Automation for the People

Interlude, with Grave Robber

Eight: Your Inner Drone

Nine: The Love That Lays the Swale in Rows

Your chance to preorder is rapidly coming to a close. Carpe diem:

IndieBound

Amazon

Barnes & Noble

Powell’s

800ceoread

iBookstore


Filed under The Glass Cage

Automation anxiety, 1950s-style


From Norbert Wiener’s The Human Use of Human Beings, published in 1950:

Let us remember that the automatic machine, whatever we think of any feelings it may have or may not have, is the precise economic equivalent of slave labor. Any labor which competes with slave labor must accept the economic conditions of slave labor. It is perfectly clear that this will produce an unemployment situation, in comparison with which the present recession and even the depression of the thirties will seem a pleasant joke. This depression will ruin many industries — possibly even the industries which have taken advantage of the new potentialities. However, there is nothing in the industrial tradition which forbids an industrialist to make a sure and quick profit, and to get out before the crash touches him personally.

 


From Robert H. Macmillan’s Automation: Friend or Foe?, published in 1956:

Once upon a time, a Hindu sage was granted by Heaven the ability to create clay men. When he took earth and water and fashioned little men, they lived and served him. But they grew very quickly, and when they were as large as himself, the sage wrote on their foreheads the word DEAD, and they fell to dust. One day he forgot to write the lethal word on the forehead of a full-grown servant, and when he realized his mistake the servant was too tall: his hand could no longer reach the slave’s forehead. This time it was the clay man that killed the sage.

Is there a warning for us today in this ancient fable?

 


From Kurt Vonnegut’s Player Piano, published in 1952:

The limousine came to a halt by the end of the bridge, where a large work crew was filling a small chuckhole. The crew had opened a lane for an old Plymouth with a broken headlight, which was coming through from the north side of the river. The limousine waited for the Plymouth to get through, and then proceeded.

The Shah turned to stare at the group through the back window, and then spoke at length.

Doctor Halyard smiled and nodded appreciatively, and awaited a translation.

“The Shah,” said Khashdrahr, “he would like, please, to know who owns these slaves we see all the way up from New York City.”

“Not slaves,” said Halyard, chuckling patronizingly. “Citizens, employed by government. They have same rights as other citizens – free speech, freedom of worship, the right to vote. Before the war, they worked in the Ilium Works, controlling machines, but now machines control themselves much better.”

“Aha!” said the Shah, after Khashdrahr had translated.

“Less waste, much better products, cheaper products with automatic control.”

“Aha!”

“And any man who cannot support himself by doing a job better than a machine is employed by the government, either in the Army or the Reconstruction and Reclamation Corps.”

“Aha! Khabu bonanza-pak?”

“Eh?”

“He says, ‘Where does the money come from to pay them?’ ” said Khashdrahr.

“Oh. From taxes on the machines, and taxes on personal incomes. Then the Army and the Reconstruction and Reclamation Corps people put their money back into the system for more products for better living.”

“Kuppo!” said the Shah, shaking his head.

Khashdrahr blushed, and translated uneasily, apologetically. “Shah says, ‘Communism.’ ”

“No Kuppo!” said Halyard vehemently. “The government does not own the machines. They simply tax that part of industry’s income that once went into labor, and redistribute it. Industry is privately owned and managed, and co-ordinated — to prevent the waste of competition — by a committee of leaders from private industry, not politicians. By eliminating human error through machinery, and needless competition through organization, we’ve raised the standard of living of the average man immensely.”

Khashdrahr stopped translating and frowned perplexedly. “Please, this average man, there is no equivalent in our language, I’m afraid.”


Filed under The Glass Cage

What algorithms want


Here’s another brief excerpt from my new essay, “The Manipulators: Facebook’s Social Engineering Project,” in the Los Angeles Review of Books:

We have had a hard time thinking clearly about companies like Google and Facebook because we have never before had to deal with companies like Google and Facebook. They are something new in the world, and they don’t fit neatly into our existing legal and cultural templates. Because they operate at such unimaginable magnitude, carrying out millions of informational transactions every second, we’ve tended to think of them as vast, faceless, dispassionate computers — as information-processing machines that exist outside the realm of human intention and control. That’s a misperception, and a dangerous one.

Modern computers and computer networks enable human judgment to be automated, to be exercised on a vast scale and at a breathtaking pace. But it’s still human judgment. Algorithms are constructed by people, and they reflect the interests, biases, and flaws of their makers. As Google’s founders themselves pointed out many years ago, an information aggregator operated for commercial gain will inevitably be compromised and should always be treated with suspicion. That is certainly true of a search engine that mediates our intellectual explorations; it is even more true of a social network that mediates our personal associations and conversations.

Because algorithms impose on us the interests and biases of others, we have not only a right but an obligation to carefully examine and, when appropriate, judiciously regulate those algorithms. We have a right and an obligation to understand how we, and our information, are being manipulated. To ignore that responsibility, or to shirk it because it raises hard problems, is to grant a small group of people — the kind of people who carried out the Facebook and OKCupid experiments — the power to play with us at their whim.

What algorithms want is what the people who write algorithms want. Appreciating that, and grappling with the implications, strikes me as one of the great challenges now lying before us.

Image: “abacus” by Jenny Downing.


Filed under The Glass Cage