Category Archives: Uncategorized

Smartness is a zero-sum game


In her article “The Internet of Way Too Many Things,” Allison Arieff reviews some of the exciting new products on display at Target’s trendy Open House store in San Francisco. There’s Leeo, a night light “that ‘listens’ for your smoke detector to go off and then calls your smartphone to let you know your house might be on fire.” There’s Whistle, a $100 doggie dongle that “attaches to your pet’s collar and allows you to set a daily activity goal customized to your dog’s age, breed and weight.” And there’s Mimo, a web-enabled onesie that monitors your baby’s “body position” during the night. “When Mimo is connected to other devices in your home and discerns that your baby is stirring,” reports Arieff, “the lights turn on, coffee begins brewing and some Baby Mozart starts playing on the stereo.”

Welcome to Peter Thiel’s “innovation desert.” You’ll die of thirst, but at least the mirages are amusing.

There’s something else going on here, though, something deeper than the production of trinkets for neurotics. Each of these products is an example of a defining trend of our networked age: the outsourcing of common sense to gadgetry. A foundational level of human perception and competence is being mechanized through apps and online services. The more mediated our lives become, the more we rely on media to make sense of the world for us. We can’t even trust ourselves to take Rover for a walk. Continue reading

Dawn of the automatic age


It’s Labor Day. To mark the occasion, here’s a brief excerpt from The Glass Cage that describes the origins of automation after the Second World War:

The word automation entered the language only recently. It was first uttered in 1946, when engineers at the Ford Motor Company needed a new term to describe the latest machinery being installed on the company’s assembly lines. “Give us some more of that automatic business,” a Ford vice president reportedly said in a meeting. “Some more of that — that — ‘automation.’”

Ford’s plants were already famously mechanized, with sophisticated machines streamlining every job on the line. But factory hands still had to carry parts and subassemblies from one machine to the next. The workers still controlled the pace of production. The equipment installed in 1946 changed that. Machines took over the material-handling and conveyance functions, allowing the entire assembly process to proceed automatically. The alteration in work flow may not have seemed momentous to those on the factory floor. But it was. Control over a complex industrial process had shifted from worker to machine.

That the new Ford equipment arrived just after the end of the Second World War was no accident. It was during the war that modern automation technology took shape. When the Nazis began their bombing blitz against Great Britain in 1940, English and American scientists faced a challenge as daunting as it was pressing: How do you knock high-flying, fast-moving bombers out of the sky with heavy missiles fired from unwieldy antiaircraft guns on the ground? The mental calculations and physical adjustments required to aim a gun accurately — not at a plane’s current position but at its probable future position — were far too complicated for a soldier to perform with the speed necessary to get a shot off while a plane was still in range. The missile’s trajectory, the scientists saw, had to be computed by a calculating machine, using tracking data coming in from radar systems along with statistical projections of a plane’s course, and then the calculations had to be fed automatically into the gun’s aiming mechanism to guide the firing. The gun’s aim, moreover, had to be adjusted continually to account for the success or failure of previous shots.

As for the members of the gunnery crews, their work would have to change to accommodate the new generation of automated weapons. And change it did. Artillerymen soon found themselves sitting in front of screens in darkened trucks, selecting targets from radar displays. Their identities shifted along with their jobs. They were no longer seen “as soldiers,” writes one historian, but rather “as technicians reading and manipulating representations of the world.” Continue reading
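The core of the fire-control problem described above — aiming not at a plane’s current position but at its probable future position — can be sketched as a simple deflection calculation. The sketch below is purely illustrative and much simpler than a wartime predictor (it assumes a constant-velocity target in two dimensions and ignores gravity, shell drag, radar noise, and feedback from observed misses; all function names and numbers are my own): the gun sits at the origin, and we solve for the flight time at which a shell meets the moving target.

```python
import math

def intercept_time(px, py, vx, vy, shell_speed):
    """Solve for the flight time t at which a shell fired from the origin
    at shell_speed meets a target moving at constant velocity (vx, vy)
    from position (px, py): |p + v*t| = shell_speed * t.
    Expanding gives a quadratic in t."""
    a = vx * vx + vy * vy - shell_speed ** 2
    b = 2 * (px * vx + py * vy)
    c = px * px + py * py
    if abs(a) < 1e-9:                      # shell exactly as fast as target
        return -c / b if b < 0 else None
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                        # target is simply too fast to hit
    sq = math.sqrt(disc)
    roots = [(-b - sq) / (2 * a), (-b + sq) / (2 * a)]
    valid = [t for t in roots if t > 0]
    return min(valid) if valid else None   # earliest feasible intercept

def aim_point(px, py, vx, vy, shell_speed):
    """Where to point the gun: the target's projected position at intercept."""
    t = intercept_time(px, py, vx, vy, shell_speed)
    if t is None:
        return None
    return (px + vx * t, py + vy * t)      # lead the target, don't chase it
```

Even this toy version makes the wartime scientists’ point: the arithmetic has to be redone continuously as fresh tracking data arrives, far faster than a soldier could manage by hand.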

The Donald and The Swarm

In a tweeted response to my Politico essay on social media’s influence on the 2016 campaign, The Atlantic’s political editor, Yoni Appelbaum, suggests that cable TV, rather than social media, is the real driver of the Trump phenomenon. He offers a chart as backup:

I think Appelbaum may be mistaking effect for cause. Yes, Trump has dominated cable coverage over the last month. But Trump has dominated all coverage over the last month. Pull together the same chart for, say, print news or radio news, and you’ll almost certainly see a similar picture. The news media is a swarm organism. No individual medium operates in isolation.

Raw measures of media coverage, in other words, reveal who’s getting covered, but they don’t say much about why coverage is playing out the way it is. Continue reading

The coming of the Snapchat candidate


In Western politics, we no longer get to experience the fun of a revolution, but at least we get the occasional media revolution. In an essay out today at Politico, I argue that we’re at the start of the third big media makeover of modern political campaigns. First came radio in the twenties, then TV in the sixties. Now it’s social media that’s changing the tone and tenor of elections. The Donald may burn out soon, but the inability of both the press and his adversaries to make sense of the Trump campaign suggests that the rules have changed. The tidy narratives of TV campaigns are yesterday’s news.

Here’s a bit from the piece:

What’s important now is not so much image as personality. But, as the Trump phenomenon reveals, it’s only a particular kind of personality that works — one that’s big enough to grab the attention of the perpetually distracted but small enough to fit neatly into a thousand tiny media containers. It might best be described as a Snapchat personality. It bursts into focus at regular intervals without ever demanding steady concentration.

Social media favors the bitty over the meaty, the cutting over the considered. It also prizes emotionalism over reason. The more visceral the message, the more quickly it circulates and the longer it holds the darting public eye. In something of a return to the pre-radio days, the fiery populist now seems more desirable, more worthy of attention, than the cool wonk. It’s the crusty Bernie and the caustic Donald that get hearted and hash-tagged, friended and followed. Is it any wonder that “Feel the Bern” has become the rallying cry of the Sanders campaign?

Emotional appeals can be good for politics. They can spur civic involvement, even among the disenfranchised and disenchanted. And they can galvanize public attention, focusing it on injustices and abuses of power. An immediate emotional connection can, at best, deepen into a sustained engagement with the political process. But there’s a dark side to social media’s emotionalism. Trump’s popularity took off only after he demonized Mexican immigrants, playing to the public’s frustrations and fears. That’s the demagogue’s oldest tactic, and it worked. The Trump campaign may have qualities of farce, but it also suggests that a Snapchat candidate, passionate yet hollow, could be a perfect vessel for a cult of personality.

Here’s the rest.

Image: Dick Nixon reacts to the arrival of the TV era.

Logical fantasies

From Tim Hwang and Madeleine Clare Elish’s “The Mirage of the Marketplace,” in Slate:

When you open the Uber app as a rider, you see a map of your local pickup area, with little sedans around that appear to be drivers available for a request. While you might assume these reflect an accurate picture of market supply, the way drivers are configured in Uber’s marketplace can be misleading. According to Rosenblat and Stark, the presence of those virtual cars on the passenger’s screen does not necessarily reflect an accurate number of drivers who are physically present or their precise locations. Instead, these phantom cars are part of a “visual effect” that Uber uses to emphasize the proximity of drivers to passengers. Not surprisingly, the visual effect shows cars nearby, even when they might not actually exist. Demand, in this case, sees a simulated picture of supply. Whether you are a driver or a rider, the algorithm operating behind the curtain at Uber shows a through-the-looking-glass version of supply and demand.

From Edward Moore Geist’s “Is Artificial Intelligence Really an Existential Threat to Humanity?,” in the Bulletin of the Atomic Scientists: Continue reading

The new behavioralism


We live mythically, even the most rational among us. In the middle of a bromidic Q&A session on Facebook last month, Mark Zuckerberg fielded a question from the cosmologist Stephen Hawking:

I would like to know a unified theory of gravity and the other forces. Which of the big questions in science would you like to know the answer to and why?

Zuckerberg replied that he was “most interested in questions about people,” and he gave some examples of the questions about people that he found most interesting. “What will enable us to live forever?” was one. “How can we empower humans to learn a million times more?” was another.

He then divulged something interesting, if not unexpected, about his perception of the social world:

I’m also curious about whether there is a fundamental mathematical law underlying human social relationships that governs the balance of who and what we all care about. I bet there is.

Call it the Unified Theory of Love.

Zuckerberg’s answer underscores, yet again, what an odd choice we made when we picked a person to oversee the world’s predominant social network. We’ve placed our social lives in the hands of a maladroit young man who believes that human relations and affiliations can be reduced to equations.

The fault, dear Brutus, is not in our stars,
But in ourselves, that we are underlings.

What Brutus saw in stars, Zuckerberg sees in data. Both believe that human affairs are governed by fate.

It’s not hard to understand the source of Zuckerberg’s misperception. Human beings, like ants or chickens, share a certain bundle of tendencies, a certain nature, and if you analyze our behavior statistically that nature will evidence itself in mathematical regularities. Zuckerberg is hardly the first to confuse the measurement of a phenomenon with the cause of the phenomenon. If some amount of data reveals a pattern, then, surely, more data will reveal “a fundamental mathematical law.”
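The gap between a statistical regularity and a “fundamental mathematical law” is easy to demonstrate. In the sketch below (all names and numbers are my own illustrative assumptions), thousands of simulated people each make purely random, independent choices, with no rule connecting one person to another. The aggregate nonetheless falls into a tidy bell curve: a regularity produced by the arithmetic of aggregation (the central limit theorem), not by any law governing the individuals being measured.

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

def person_activity(num_choices=100):
    """One simulated person making 100 independent yes/no choices.
    There is no 'social law' here -- each choice is a coin flip."""
    return sum(random.random() < 0.5 for _ in range(num_choices))

population = [person_activity() for _ in range(10_000)]

# The aggregate is strikingly regular: mean near 50, spread near 5,
# and roughly 95% of people within two standard deviations of the mean.
# The pattern lives in the measurement, not in the individuals.
mean = statistics.mean(population)
stdev = statistics.stdev(population)
within_2sd = sum(abs(x - mean) <= 2 * stdev for x in population) / len(population)
```

Feed this data to a curve-fitter and it will happily report a precise mathematical form, which is exactly the trap: the formula describes the aggregation, not a causal mechanism of behavior.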

Zuckerberg’s belief that social relations are the output of a cosmic computer running a cosmic algorithm is more than just the self-serving fantasy of a man who has made a fortune by seeing people as nodes in a mathematical graph. It’s an expression, however extreme, of a new form of behavioralism that has recently come into vogue, pulled along in the slipstream of the excitement over “big data.”

From the mid-1950s to the mid-1960s, sociological thinking in the United States was dominated by the behavioralist school. Heirs of the earlier positivists, behavioralists believed that social structures and dynamics could only be understood through the rigorous, scientific analysis of hard data.* David Easton, a prominent University of Chicago political scientist, laid out the tenets of the movement in his 1962 article “The Current Meaning of ‘Behavioralism’ in Political Science”:

There are discoverable uniformities in political behavior. These can be expressed in generalizations or theories with explanatory and predictive value. … The validity of such generalizations must be testable in principle, by reference to relevant behavior. … Precision in recording of data and the statement of findings requires measurement and quantification.

The rise of behavioralism reflected a frustration with the perceived subjectivity of traditional modes of sociological and political inquiry, particularly historical analysis and philosophical speculation. History and philosophy, behavioralists believed, led only to ideological bickering, not to unbiased knowledge or reliable solutions to problems. But behavioralism also had technological origins. It was spurred by the post-war arrival of digital computers, machines that promised to open new horizons in the collection and analysis of data on human behavior. Objectivity would replace subjectivity. Technology would replace ideology.

Today’s neobehavioralism has also been inspired by advances in computer technology, particularly the establishment of vast databases of information on people’s behavior and the development of automated statistical techniques to parse the information. The MIT data scientist Alex Pentland, in his revealingly titled 2014 book Social Physics, offered something of a manifesto for the new behavioralism, using terms that, consciously or not, echoed what was heard in the early 1960s:

We need to move beyond merely describing social structure to building a causal theory of social structure. Progress in developing this represents steps toward what [neuroscientist] David Marr called a computational theory of behavior: a mathematical explanation of why society reacts as it does and how these reactions may (or may not) solve human problems. … Such a theory could tie together mechanisms of social interactions with our newly acquired massive amounts of behavior data in order to engineer better social systems.

As with their predecessors, today’s neobehavioralists also view the scientific analysis of “big data” as a means of escaping subjective modes of sociological inquiry and the ideological baggage those modes often carry. “The importance of a science of social physics,” Pentland suggested, goes beyond “its utility in providing accurate, useful mathematical predictions.” It promises to provide “a language that is better than the old vocabulary of markets and classes, capital and production”:

Words such as “markets,” “political classes,” and “social movements” shape our thinking about the world. They are useful, of course, but they also represent overly simplistic thinking; they therefore limit our ability to think clearly and effectively. [Big data offers] a new set of concepts with which I believe we can more accurately discuss our world and plan the future.

Zuckerberg will lose his bet, and Pentland and the other neobehavioralists will not discover “a causal theory of social structure” that can be expressed in the pristine language of mathematics. Neobehavioralism will, like behavioralism before it, fall short of its lofty goals, even if it does provide valuable insights into social dynamics. Despite, or because of, their subjective messiness, history and philosophy will continue to play central roles in the exploration of what makes all of us tick. The end of ideology is not nigh.

But there is something that sets neobehavioralism apart from behavioralism. The collection of behavioral data today generates great commercial value along with its value in social research, and there’s an inevitable tension between the data’s scientific and commercial exploitation. That tension will shadow any attempt to, as Pentland put it, “engineer better social systems.” Better for whom, and by what measure? Even if no fundamental mathematical law of social relationships is in the offing, the ability to closely monitor and influence those relationships will continue to provide rich profit potential. One suspects that Zuckerberg’s dream of a Unified Theory of Love is inspired less by cupid than by cupidity.



*Though the two are related, behavioralism shouldn’t be confused with behaviorism, the psychological movement popular earlier in the twentieth century.

Image: Detail of Sodoma’s “Cupid in the Landscape.”

The end of corporate computing (10th anniversary edition)


Last week, in its quarterly earnings report, Amazon revealed for the first time how much money its cloud computing operation, Amazon Web Services, takes in. The numbers were impressive. AWS has become an $8 billion business, and its revenues continue to grow swiftly, nearly doubling in the most recent quarter from the same period last year. The unit’s profit margin — a surprisingly robust 21 percent — is vastly wider than that of the company’s retailing operation. Indeed, without AWS, Amazon would have lost a lot of money in the quarter instead of posting a narrow profit.

AWS’s results show how well established “the cloud” has become. Most personal computing these days relies on cloud services — lose your connection, and your computing device becomes pretty much useless — and businesses, too, are looking more and more to the cloud, rather than their own data centers, to fill their information technology needs. It’s easy to forget how quickly this epochal shift in the nature of computing has occurred. Just ten years ago, the term “cloud computing” was unknown, and the idea that computing would become a centrally managed utility service was considered laughable by many big IT companies and their customers. Back then, in 2005, I wrote an article for MIT’s Sloan Management Review titled “The End of Corporate Computing” in which I argued that computing was fated to become a utility, with big, central data centers feeding services to customers over the internet’s grid. (The article inspired my 2008 book The Big Switch.) I got plenty of things wrong in the article, but I think the ensuing ten years have shown that the piece was fundamentally on target in predicting the rise of what we now call the cloud. So here, to mark the tenth birthday of the article, is the full text of “The End of Corporate Computing.”

Something happened in the first years of the 20th century that would have seemed unthinkable just a few decades earlier: Manufacturers began to shut down and dismantle their waterwheels, steam engines and electric generators. Since the beginning of the Industrial Age, mills and factories had had no choice but to maintain private power plants to run their machinery — power generation was a seemingly intrinsic part of doing business — but as the new century dawned, an alternative was emerging. Dozens of fledgling electricity producers were erecting central generating stations and using a network of wires to distribute their power to distant customers. Manufacturers no longer had to run their own dynamos; they could simply buy the electricity they needed, as they required it, from the new suppliers. Power generation was being transformed from a corporate function into a utility.

Now, almost exactly a century later, history is repeating itself. The most important commercial development of the last 50 years — information technology — is undergoing a similar transformation. It, too, is beginning an inexorable shift from being an asset that companies own — in the form of computers, software and myriad related components — to being a service that they purchase from utility providers. Few in the business world have contemplated the full magnitude of this change or its far-reaching consequences. To date, popular discussions of utility computing have rarely progressed beyond a recitation of IT vendors’ marketing slogans, laden with opaque terms like “autonomic systems,” “server virtualization” and “service-oriented architecture” [1]. Rather than illuminate the future, such gobbledygook has only obscured it.

The prevailing rhetoric is, moreover, too conservative. It assumes that the existing model of IT supply and use — and the corporate data center that lies at its core — will endure. But that view is perilously short-sighted. The traditional model’s economic foundation is already crumbling, and it is unlikely to survive in the long run. As the earlier transformation of electricity supply suggests, IT’s shift from a fragmented capital asset to a centralized utility service will be a momentous one. It will overturn strategic and operating assumptions, alter industrial economics, upset markets, and pose daunting challenges to every user and vendor. The history of the commercial application of information technology has been characterized by astounding leaps, but nothing that has come before — not even the introduction of the personal computer or the opening of the Internet — will match the upheaval that lies just over the horizon. Continue reading