Big data and the limits of social engineering


The following review of Alex Pentland’s book Social Physics appeared originally, in a slightly different form, in MIT Technology Review.

In 1969, Playboy published a long, freewheeling interview with Marshall McLuhan in which the media theorist and sixties icon sketched a portrait of the future that was at once seductive and repellent. Noting the ability of digital computers to analyze data and communicate messages, McLuhan predicted that the machines eventually would be deployed to fine-tune society’s workings. “The computer can be used to direct a network of global thermostats to pattern life in ways that will optimize human awareness,” he said. “Already, it’s technologically feasible to employ the computer to program societies in beneficial ways.” He acknowledged that such centralized control raised the specter of “brainwashing, or far worse,” but he stressed that “the programming of societies could actually be conducted quite constructively and humanistically.”

The interview appeared when computers were used mainly for arcane scientific and industrial number-crunching. To most readers at the time, McLuhan’s words must have sounded far-fetched, if not nutty. Now, they seem prophetic. With smartphones ubiquitous, Facebook inescapable, and wearable computers proliferating, society is gaining a digital sensing system. People’s location and behavior are being tracked as they go through their days, and the resulting information is being transmitted instantaneously to vast server farms. Once we write the algorithms needed to parse all that “big data,” many sociologists and statisticians believe, we’ll be rewarded with a much deeper understanding of what makes society tick.

One of big data’s keenest advocates is Alex “Sandy” Pentland, a data scientist who, as the director of MIT’s Human Dynamics Laboratory, has long used computers to probe the dynamics of businesses and other organizations. In his brief but ambitious book, Social Physics, Pentland argues that our greatly expanded ability to gather behavioral data will allow scientists to develop “a causal theory of social structure” and ultimately establish “a mathematical explanation for why society reacts as it does” in all manner of circumstances. As the book’s title makes clear, Pentland thinks that the social world, no less than the material world, operates according to rules. There are “statistical regularities within human movement and communication,” he writes, and once we fully understand those regularities, we’ll discover “the basic mechanisms of social interactions.”

What has prevented us from deciphering society’s mathematical underpinnings up to now, Pentland believes, is a lack of empirical rigor in the social sciences. Unlike physicists, who can measure the movements of objects with great precision, sociologists have had to make do with fuzzy observations. They’ve had to work with rough and incomplete data sets drawn from small samples of the population, and they’ve had to rely on people’s notoriously flawed recollections of what they did, when they did it, and whom they did it with. Computer networks promise to remedy those shortcomings. Tapping into the streams of data that flow through gadgets, search engines, social media, and credit-card payment systems, scientists will be able to collect precise, real-time information on the behavior of millions, if not billions, of individuals. And because computers neither forget nor fib, the information will be reliable.

To illustrate what lies in store, Pentland describes a series of experiments that he and his associates have been conducting in the private sector. They go into a business and give each employee an electronic ID card, called a “sociometric badge,” that hangs from the neck and communicates with the badges worn by colleagues. Incorporating microphones, location sensors, and accelerometers, the badges monitor where people go and whom they talk with, taking note of their tone of voice and even their body language. The devices are able to measure not only the chains of communication and influence within an organization but also “personal energy levels” and traits such as “extraversion and empathy.” In one such study of a bank’s call center, the researchers discovered that productivity could be increased simply by tweaking the coffee-break schedule.

Pentland dubs this data-processing technique “reality mining,” and he suggests that similar kinds of information can be collected on a much broader scale by smartphones outfitted with specialized sensors and apps. Fed into statistical modeling programs, the data could reveal “how things such as ideas, decisions, mood, or the seasonal flu are spread in the community.”

The mathematical modeling of society is made possible, according to Pentland, by the innate tractability of human beings. We may think of ourselves as rational actors, in conscious control of our choices, but in reality most of what we do is reflexive. Our behavior is determined by our subliminal reactions to the influence of other people, particularly those in the various peer groups we belong to. “The power of social physics,” he writes, “comes from the fact that almost all of our day-to-day actions are habitual, based mostly on what we have learned from observing the behavior of others.” Once you map and measure all of a person’s social influences, you can develop a statistical model that predicts that person’s behavior, just as you can model the path a billiard ball will take after it strikes other balls.
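
To see what such a model might look like in practice, here is a minimal sketch, assuming a simple logistic form: a person’s predicted probability of adopting a habit rises with the weighted share of peers who already exhibit it. Nothing here comes from Pentland’s book; the function, the weights, and the bias term are illustrative assumptions only.

```python
import math

def adoption_probability(peer_states, influence_weights, bias=-1.0):
    """Toy peer-influence model: the predicted chance that a person adopts a
    habit grows with the weighted number of peers already exhibiting it."""
    exposure = sum(w * s for w, s in zip(influence_weights, peer_states))
    return 1.0 / (1.0 + math.exp(-(bias + exposure)))  # logistic squashing

# Three peers: 1 = peer exhibits the habit, 0 = peer does not.
peers = [1, 1, 0]
# Assumed strength of each peer's influence on the subject.
weights = [0.8, 0.5, 1.2]

print(f"Predicted adoption probability: {adoption_probability(peers, weights):.2f}")
```

The billiard-ball determinism of the metaphor is exactly this mechanical: change the inputs (the peers), and the predicted behavior changes with them.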

Deciphering people’s behavior is only the first step. What really excites Pentland is the prospect of using digital media and related tools to change people’s behavior, to motivate groups and individuals to act in more productive and responsible ways. If people react predictably to social influences, then governments and businesses can use computers to develop and deliver carefully tailored incentives, such as messages of praise or small cash payments, to “tune” the flows of influence in a group and thereby modify the habits of its members. Such group-based incentive programs, Pentland suggests, can do more than improve the efficiency of transit and health-care systems; they can enhance the harmony and creativity of communities. “Our main insight,” he reports, “is that by targeting [an] individual’s peers, peer pressure can amplify the desired effect of a reward on the target individual.” Computers become, as McLuhan foresaw, civic thermostats. They not only register society’s state but bring it into line with some prescribed ideal. Both the tracking and the maintenance of the social order are automated.

Ultimately, Pentland argues, looking at people’s interactions through a mathematical lens will free us of time-worn notions about class and class struggle. Political and economic classes, he contends, are “oversimplified stereotypes of a fluid and overlapping matrix of peer groups.” Peer groups, unlike classes, are defined by “shared norms” rather than just “standard features such as income” or “their relationship to the means of production.” Armed with exhaustive information about individuals’ habits and associations, civic planners will be able to trace the full flow of influences that shape personal behavior. Abandoning general categories like “rich” and “poor” or “haves” and “have-nots,” we’ll be able to understand people as individuals—even if those individuals are no more than the sums of all the peer pressures and other social influences that affect them.

Replacing politics with programming might sound appealing, particularly given Washington’s paralysis. But there are good reasons to be nervous about this sort of social engineering. Most obvious are the privacy concerns raised by the collection of ever more intimate personal information. Pentland anticipates such criticism by arguing that public fears about privacy can be allayed through a “New Deal on Data” that gives people more control over the information collected about them. It’s hard, though, to imagine Internet companies agreeing to give up ownership of the behavioral information they hoard. The data are, after all, crucial to their competitive advantage and their profit.

Even if we assume that the privacy issues can be resolved, the idea of what Pentland calls “a data-driven society” remains problematic. Social physics is a variation on the behaviorism that found favor in McLuhan’s day, and it suffers from the same limitations that doomed its predecessor. Defining social relations as a pattern of stimulus and response makes the math easier, but it ignores the deep, structural sources of social ills. Pentland may be right that people’s behavior is determined in large part by the social norms and influences exerted upon them by their peers, but what he fails to see is that those norms and influences are themselves shaped by history, politics, and economics, not to mention the ever-present forces of power and prejudice. People don’t have complete freedom in choosing their peer groups. Their choices are constrained by where they live, where they come from, how much money they have, and what they look like. A statistical model of society that ignores issues of class, that takes patterns of influence as givens rather than as historical contingencies, will tend to perpetuate existing social structures and dynamics. It will encourage us to optimize the status quo rather than challenge it.

Politics is messy because society is messy, not the other way around. Pentland does a commendable job in describing how better data can enhance social planning. But like other would-be social engineers he overreaches. Letting his enthusiasm get the better of him, he begins to take the metaphor of “social physics” literally, even as he acknowledges that influence-based mathematical models will always be reductive. “Because it does not try to capture internal cognitive processes,” he writes at one point, “social physics is inherently probabilistic, with an irreducible kernel of uncertainty caused by avoiding the generative nature of conscious human thought.” What big data can’t account for is what’s most unpredictable, and most interesting, about us.

The art of Instagram

Jacob Mikanowski has, in The Point, an invigorating essay on Instagram, that most civilized of social networks. It begins:

Of all the social networks, it’s the easiest, the simplest, the least full of harm. Let’s put it a different way. Facebook is Sauron. It’s also your mom’s couch, a yoga-center bulletin board, a school bus, a television tuned to every channel. Twitter is Grub Street, a press scrum, the crowd in front of a bar. Reddit is a tin-foil hat and a sewer. Snapchat is hover boards, Rock ’em Sock ’em Robots and Saturday morning cartoons. Instagram is a garden: curated, pruned, clean and pretty. It lets you be creative, but not too creative; communicate, but without saying too much. No embedding, no links—just photos, captions and hashtags. Elegant. Simple. Twenty-three filters. A crisp square around each frame.

Instagram, the charm of which seems fated to eventual extermination by Facebook, its charmless owner, is the only social network, other than, maybe, Tumblr, that Jane Austen might have liked. It is wrapped in unstated but almost ceremonious codes of conduct. It sometimes seems, as Mikanowski points out, like “an index of mores in the age of self-branding and self-surveillance.” And in its fleeting, multitudinous images we see, as in the bowl of a smoothie-making blender, a history of visual art. “Practically every photograph of nature on Instagram,” writes Mikanowski, by way of example, “stems in one way or another from the impact of the Romantic era.”

But if Instagram is inspired by what might be termed an artistic impulse — “that need to make life itself aesthetic, to ask, over and over ‘What will this look like in a square?’” — it also subverts that impulse, drains it of its upsetting, dislocating energies: filters it, frames it, captions it. In the busy world of Facebook, Instagram is the museum.

The music of mind-fracking


I have seen the future of music, and its name is ThinkEar.

A new audio gadget from, oddly enough, a Finnish oil company named Neste, ThinkEar is a set of “mind-controlled earphones” that will allow your brain to choose the songs you listen to without any input from your thumbs or other body parts. Let’s go to the press release:

The world is poised on the brink of a technological revolution; rapid progress in brain mapping technology means that the ability to control devices with our minds is no longer the stuff of science fiction. Neste’s ThinkEar earphones are a bold entertainment concept that offers thought-controlled personal audio.

If I had listened to Gary Numan instead of Gang of Four when I was growing up, I would have seen all this shit coming. I mean, the guy was already using an Amazon Echo in 1979:


Back to the press release:

Making full use of the latest developments in brain wearables, the earphone’s integrated 5 point EEG sensors are able to read your brainwaves while an integrated microcomputer translates them into interaction commands to navigate your audio content.

You know who had the nicest brain wearables? The Borg.


OK, so here’s where the press release reaches its climax:

Unlike other systems, the earphones are not tethered to any external device. [They] access your favorite cloud services directly.

Which means, of course, that the cloud services will also be able to access your brainwaves directly. (Interaction is not a one-way street.) And that’s where things get really cool — you might even say numanesque. Remember when I last wrote about the future of pop? It was a year ago when Google announced the shift of its Google Play Music service from the old paradigm of listener-selected music to the new paradigm of outsourced “activity-based” music. As Google explained:

At any moment in your day, Google Play Music has whatever you need music for — from working, to working out, to working it on the dance floor — and gives you curated radio stations to make whatever you’re doing better. Our team of music experts … crafts each station song by song so you don’t have to.

ThinkEar is the missing link in mind-free listening. With your ThinkEar EEG sensors in place, Google will be able to read your brainwaves, on a moment by moment basis, and serve up an engineered set of tunes perfectly geared to your mental state as well as your activity mode. Not only will you save enormous amounts of time that you would have wasted figuring out what songs you felt like listening to, but Google will be able to use its expertly crafted soundscapes to help keep your mental state within some optimal parameters.

Far-fetched? I don’t think so. It’s basically just Shazam in reverse. The music susses you.

The applications go well beyond music. Cloud services could, for instance, beam timely notifications or warnings to your ears based on what’s going on in your brain, either at the subconscious or the conscious level. Think of what Facebook could do with that kind of capability. And if Amazon melded ThinkEar with both Echo and Audible, it could automatically intervene in your thought processes by reading you inspiring passages from pertinent books, like, say, The Fountainhead.

Maybe it’s not so odd that an oil company would invent a set of mind-reading earbuds. Once the land is tapped out, the extraction industries are going to need a new target, and what could possibly be more lucrative than fracking the human brain?

Questioning Silicon Valley

Time magazine’s Rana Foroohar says my new book, Utopia Is Creepy, “punches a hole in Silicon Valley cultural hubris.” The book comes out on September 6, the day after Labor Day, but you can read an excerpt from the introduction at Aeon today.

“Computing is not about computers any more,” wrote Nicholas Negroponte of the Massachusetts Institute of Technology in his 1995 bestseller Being Digital. “It is about living.” By the turn of the century, Silicon Valley was selling more than gadgets and software: it was selling an ideology. The creed was set in the tradition of U.S. techno-utopianism, but with a digital twist. The Valley-ites were fierce materialists – what couldn’t be measured had no meaning – yet they loathed materiality. In their view, the problems of the world, from inefficiency and inequality to morbidity and mortality, emanated from the world’s physicality, from its embodiment in torpid, inflexible, decaying stuff. The panacea was virtuality – the reinvention and redemption of society in computer code. They would build us a new Eden not from atoms but from bits. All that is solid would melt into their network. We were expected to be grateful and, for the most part, we were.

Our craving for regeneration through virtuality is the latest expression of what Susan Sontag in On Photography described as “the American impatience with reality, the taste for activities whose instrumentality is a machine.” What we’ve always found hard to abide is that the world follows a script we didn’t write. We look to technology not only to manipulate nature but to possess it, to package it as a product that can be consumed by pressing a light switch or a gas pedal or a shutter button. We yearn to reprogram existence, and with the computer we have the best means yet. We would like to see this project as heroic, as a rebellion against the tyranny of an alien power. But it’s not that at all. It’s a project born of anxiety. Behind it lies a dread that the messy, atomic world will rebel against us. What Silicon Valley sells and we buy is not transcendence but withdrawal. The screen provides a refuge, a mediated world that is more predictable, more tractable, and above all safer than the recalcitrant world of things. We flock to the virtual because the real demands too much of us.

Read on.

Solitaire as symbol and synecdoche


“When a man is reduced to such a pass as playing cards by himself, he had better give up — or take to reading.” –Rawdon Crawley, The Card Player’s Manual, 1876

Big news out of the Googleplex today: the internet giant is offering a free solitaire game through its search engine and its mobile app. “When you search for ‘solitaire’ on Google,” goes the announcement on the company’s always breathless blog, “the familiar patience game may test yours!”

Pokémon Go, Candy Crush, Angry Birds, Farmville, Minesweeper, Space Invaders, Pong: computer games come and go, offering fleeting amusements before they turn stale.

But not solitaire. Solitaire endures.

Invented sometime in the eighteenth century, the single-player card game made a seamless leap to virtuality with the arrival of personal computers in the early 1980s. The gameplay was easy to program, and a deck of cards could be represented on even the most rudimentary of computer displays. Spectrum HoloByte’s Solitaire Royale became a huge hit when it was released in 1987. After Microsoft incorporated its own version of the game into the Windows operating system in 1990, solitaire quickly became the most used PC app of all time.

“Though on its face it might seem trivial, pointless, a terrible way to waste a beautiful afternoon, etc., solitaire has unquestionably transformed the way we live and work,” wrote Slate’s Josh Levin in 2008. “Computer solitaire propelled the revolution of personal computing, augured Microsoft’s monopolistic tendencies, and forever changed office culture.”

Google is late to the party, but it’s a party that will never end.

Microsoft had ulterior motives when it bundled solitaire into Windows — the game helped people learn how to use a mouse, and it kept them sitting in front of their Microsoft-powered computers like, to quote Iggy Pop, hypnotizing chickens — and Google, too, is looking to accomplish something more than just injecting a little fun into our weary lives. “A minor move like putting games in search means that users – especially mobile users – will turn to the Google search app at a time when a lot of the information we need is available elsewhere on our devices,” reports TechCrunch.

It’s a devious game these companies play. We are but deuces in their decks.

Would it be too much of a stretch to suggest that solitaire is a perfect microcosm of personal computing, particularly now, in our social media age? In “The Psychology of Games,” a 2000 article in Psychology Review, Mark Griffiths pointed out that games are a “world-building activity.” They offer a respite from the demands of the real. “Freud was one of the first people to concentrate on the functions of playing games,” Griffiths wrote. “He speculated that game playing provided a temporary leave of absence from reality which reduced individual conflict and brought about a change from the passive to the active.” We love games because they “offer the illusion of control over destiny and circumstance.”

Solitaire, a game mixing skill and chance, also provides what psychologists call “intermittent reinforcement.” Every time a card is revealed, there is, for the player, the possibility of a reward. The suspense, and the yearning, is what makes the game so compelling, even addictive. “Basically,” wrote Griffiths, “people keep playing in the absence of a reward hoping that another reward is just around the corner.” Turning over an ace in solitaire is really no different from getting a like on Facebook or a retweet on Twitter. We crave such symbolic tokens of accomplishment, such sweet nothings.
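
Griffiths’s point about intermittent reinforcement is easy to see in a toy simulation. The sketch below is my own illustration, not anything from his article; the payoff probability is an assumed constant, and the long unrewarded streaks it produces are precisely what keep the player flipping cards.

```python
import random

random.seed(7)
REWARD_PROBABILITY = 0.08  # assumed chance that any given card-flip pays off

dry_streak = 0
for flip in range(1, 101):
    if random.random() < REWARD_PROBABILITY:
        print(f"Flip {flip:3d}: reward, after {dry_streak} empty flips")
        dry_streak = 0
    else:
        dry_streak += 1  # no payoff; maybe the next flip is the one
```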

Shuffle that deck again, Google. This time I’m going to be a winner.

“All that is solid would melt into their network”


It’s my longest, funniest book yet — granted, the competition was not exactly fierce on either count — and it is now printed, bound, and on its way to a bookstore near you. The title is Utopia Is Creepy . . . and Other Provocations, and the book collects my favorite posts published here at Rough Type since the blog launched in 2005, along with a selection of essays, aphorisms, and reviews that appeared over the same period. It also features a couple of new pieces, including one on transhumanism called “The Daedalus Mission.”


As I was pulling the collection together over the last year, I began to see it as an alternative history of recent times, from the founding of Facebook to the rise of @realDonaldTrump. It is, as well, a critique of Silicon Valley and its cultural powers and pretensions. Here’s a peek at the introduction:

[image: the introduction to Utopia Is Creepy]

Utopia Is Creepy is out on September 6. More information, including those all-important preorder links, can be found here.

Thanks to all who have read Rough Type over the years.

Art in an age of augmentation


“Instagram shows us what a world without art looks like.” –Theses in Tweetform, #19

Ricky D’Ambrose, in “Instagram and the Fantasy of Mastery,” a mournful essay in The Nation, examines what he sees as a fundamental shift in aesthetics: “the transition from art, long vaunted as a special, and autonomous, area of sensuous intelligence, to creativity, to which art can only ever be superficially related.” Society’s love for the overlay, the template, the filter, is on the rise, inexorably it seems. In place of a personal style born of a mastery of technique, we have the instant application of a “look,” a set of easily recognizable visual tropes, usually borrowed either from an earlier artist’s style or from the output of an earlier creative technology, executed through a software routine. The McCabe & Mrs. Miller look. The Brownie 127 look. The Ms. Pac-Man look. Looks take the work, and the anxiety, out of art.

With looks, there is no time for squinting, no time for whatever is, or might be, inexplicable. A look—insofar as it has any resemblance to style at all—is a kind of instant style: quickly executed and dispatched, immediately understood, overcharged with incident. To say that a film, a photograph, a painting, or a room’s interior has a look is to assume a consensus about which parts of a nascent image are the most worthy of being parceled out and reproduced on a massive scale. It means making a claim about how familiar an image is, and how valuable it seems.

The shift from style to look is abetted by technology, in particular the infinite malleability of the digital artifact, but it seems to spring from a deeper source: our postmodern cultural exhaustion, with its attendant sense that fabrication is the defining quality of art and that all fabrications are equal in their fabricatedness. As the erstwhile taste-making class becomes ever more uncomfortable with the concept of taste, a concept now weighted with the deadly sins of elitism and privilege, the middlebrow becomes the new highbrow. The egalitarianism of the digital filter makes it a particularly attractive refuge for the antsy flâneur.

An insidious quality of the aesthetic of the look is, as D’Ambrose notes, its insatiable retrospective hunger. It gobbles up the past as well as the present. The very style that gave rise to a look comes to be seen as just another manifestation of the look: “One can now watch John Cassavetes’s A Woman Under the Influence just as one watches Joe Swanberg’s recent Happy Christmas: in quotation marks. (Both have ‘the 16-millimeter look.’) The look and its source become, in the mind of the viewer who knows the corresponding filter, identical.” The exercise of taste, like the exercise of creativity, becomes a matter of choosing the correct filter.

The phenomenon isn’t limited to the visual arts. Popular music also increasingly has a digitally constructed “look.” Writing is trickier, more resistant to programming than image or sound, but it’s not impossible to imagine a new breed of word processor able to apply a literary filter to a person’s words. A Poe filter. A Goethe filter. A Slouching Towards Bethlehem filter. Instagram for prose: surely somebody’s working on it.
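
The crudest version of such a filter is almost trivial to write, which is rather the point. Here is a purely hypothetical sketch; the word list and the “Poe look” are my own inventions, not anyone’s product.

```python
# A deliberately crude "Instagram for prose" filter (hypothetical),
# meant only to show how cheaply a verbal "look" can be programmed
# as surface find-and-replace.

POE_LOOK = {
    "bird": "raven",
    "sad": "melancholy",
    "room": "chamber",
    "old": "ancient",
}

def apply_look(text, look):
    """Swap plain words for look-branded ones: style as substitution."""
    for plain, branded in look.items():
        text = text.replace(plain, branded)
    return text

print(apply_look("A bird flew into the old room.", POE_LOOK))
# -> A raven flew into the ancient chamber.
```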

Should augmented reality take off, we’ll be able to rid ourselves of artists and their demands once and for all. We’ll all be free to exercise our full, transformative creativity as observers and consumers, imposing a desired look on the world around us. Blink once for sepia-tinged. Blink twice for noir. Already there are earbuds in testing that allow you to tweak the sound of a concert you’re attending. They’re controlled by an app that includes, reports Motherboard, “a bunch of custom sound settings like ‘dirty country,’ ‘8-track,’ ‘Carnegie Hall,’ or ‘small studio.’” Sean Yeaton, of the band Parquet Courts, allowed that “it could be cool to match your soundscape to your mood in mundane settings like the grocery store,” but he “balked at the idea of giving the audience control over the live sound at concerts,” pointing out that “it would be pretty fucked up to go see Nine Inch Nails only to make it sound like Jefferson Starship.”

I guess your perspective depends on which side of the filter you happen to be on.