Meet the new gatekeeper, worse than the old gatekeeper


We want to be freed of constraints, until our freedom from constraints reminds us of why we created the constraints in the first place.

Once upon a time — not so long ago, really — there was something called the mainstream media, and it employed lots of journalists and editors and fact-checkers to filter the news. We came to resent these “gatekeepers,” as we took to calling them, because they restricted what we read and saw. They were self-interested elites who, granted their hegemony through the accidents of markets, imposed their own values on the flow of information. They were anti-democratic. They turned us into a passive audience of media consumers.

And then the internet arrived, and the flood of information poured over the gates and swept away the gatekeepers. We celebrated our emancipation from filters, and we praised the democratization brought about by “new media.” The “people formerly known as the audience” had taken charge, proclaimed one herald of the new order, as he wagged his finger at the disempowered journalistic elites. “You were once (exclusively) the editors of the news, choosing what ran on the front page. Now we can edit the news, and our choices send items to our own front pages.”

“The means of media are now in the hands of the people,” declared another triumphalist:

So now anyone can control, create, market, distribute, find, and interact with anything they want. The barrier to entry to media is demolished. Media, always a one-way pipe, now becomes an open pool. . . . Whenever citizens can exercise control, they will. Today they are challenging and changing media — where bloggers now fact-check Dan Rather’s ass — but tomorrow they will challenge and change politics, government, marketing, and education as well. This isn’t just a media revolution, though that’s where we are seeing the impact first. This is a chain-reaction of revolutions. It has just begun.

And the pundits were right — the old media filters dissolved, and “we” took control — though the great disruption has not played out in quite the way they anticipated. The “open pool” of citizen-controlled media looks more and more like a cesspool, “our own front pages” are often filled with fake news, and the “chain-reaction of revolutions” has been guided in its chaotic course by ignorance, insults, and misinformation. Now, in a last turn of the wheel, we are demanding that the hackers who took down the old gatekeepers — the Facebooks, Googles, and Twitters of the world — become our new gatekeepers, even though it’s a role they abhor and are entirely unsuited to.

In place of the thoughtful if flawed judgments of human editors, we seem fated to have as our new filters the robotic routines of secret algorithms written by software programmers, supplemented by squads of contract reviewers following procedure manuals written by corporate lawyers, along with ever more powerful tools for muting offensive speech and the voices of people we disagree with. Regress is more palatable when it goes by the name of progress.

One of the less-remarked-upon effects of our digital age is that it has provided us with an opportunity to relearn the humbling lessons of the past, to relive all the hopes, disappointments, and compromises of our forebears, albeit in a much speeded-up manner. History is a GIF loop.

Inside Amazon Books


Is real the new virtual? With that question in mind, I went undercover last month to review Amazon’s Seattle bookshop, the first in what’s shaping up to be a national chain, for MIT Technology Review.

It begins:

As I pull my phone from my pocket and start snapping pictures, I feel like a private eye, or even a secret agent. I’ve just walked into Amazon Books, the web giant’s flagship bricks-and-mortar bookstore in Seattle, but my intentions have little to do with shopping. I’m on a reconnaissance mission.

Like many authors, I have a love-hate relationship with Amazon. The love is transactional. Amazon sells about a third of all printed books purchased in the country, and some two-thirds of all ebooks. The hate is a form of mistrust. The company’s size gives it immense power, and it has at times acted like a predator, trying to dictate the terms of bookselling while showing contempt for the traditions of publishing. I’m not entirely sure whether Amazon wants to be my benefactor or my undertaker.

So here I am, behind frenemy lines, taking photographs of shelving. . . .

Read on.

Silicon Valley has our backs

The new road to serfdom — actually, it’s more like a hyperloop — runs right through Silicon Valley. From Tad Friend’s funny-scary profile of Y Combinator president Sam Altman:

The immediate challenge is that computers could put most of us out of work. Altman’s fix is YC Research’s Basic Income project, a five-year study, scheduled to begin in 2017, of an old idea that’s suddenly in vogue: giving everyone enough money to live on. …

The problems with the idea seem as basic as the promise: Why should people who don’t need a stipend get one, too? Won’t free money encourage indolence? And the math is staggering: if you gave each American twenty-four thousand dollars, the annual tab would run to nearly eight trillion dollars — more than double the federal tax revenue. However, Altman told me, “The thing most people get wrong is that if labor costs go to zero” — because smart robots have eaten all the jobs — “the cost of a great life comes way down. If we get fusion to work and electricity is free, then transportation is substantially cheaper, and the cost of electricity flows through to water and food. People pay a lot for a great education now, but you can become expert level on most things by looking at your phone. So, if an American family of four now requires seventy thousand dollars to be happy, which is the number you most often hear, then in ten to twenty years it could be an order of magnitude cheaper, with an error factor of 2x. Excluding the cost of housing, thirty-five hundred to fourteen thousand dollars could be all a family needs to enjoy a really good life.”
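The figures in Friend's account are easy to sanity-check. Here is a minimal sketch, assuming a 2016 U.S. population of roughly 320 million and annual federal tax revenue of roughly $3.3 trillion (both assumptions of mine, not numbers from the original passage):

```python
# Sanity-checking the arithmetic in the quoted passage.
# Assumptions (not in the original): U.S. population ~320 million (2016),
# federal tax revenue ~$3.3 trillion (2016).

population = 320_000_000
stipend = 24_000
annual_tab = population * stipend            # ≈ $7.68 trillion
federal_revenue = 3.3e12

print(f"Annual tab: ${annual_tab / 1e12:.2f} trillion")
print(f"Multiple of federal revenue: {annual_tab / federal_revenue:.1f}x")

# Altman's "order of magnitude cheaper, with an error factor of 2x":
baseline = 70_000                            # cost of "a great life" today
cheaper = baseline / 10                      # an order of magnitude less: 7,000
low, high = cheaper / 2, cheaper * 2         # 2x error band: 3,500 to 14,000
print(f"Range: ${low:,.0f} to ${high:,.0f}")
```

The quoted claims hold up: $24,000 per American comes to about $7.7 trillion, a bit more than double federal revenue, and the 2x error band around $7,000 yields the $3,500-to-$14,000 range Altman cites.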

By “a really good life,” Altman means a virtual reality headset and an opioid prescription.

But if the idea of living off a small monthly PayPal stipend leaves you cold, you’ll be pleased to know that the Valley guys are also working on a much sunnier scenario for our future:

Many people in Silicon Valley have become obsessed with the simulation hypothesis, the argument that what we experience as reality is in fact fabricated in a computer; two tech billionaires have gone so far as to secretly engage scientists to work on breaking us out of the simulation.

The only downside I see with that plan is that, if we break out of the simulation, we’re not going to get the Mars colonies.

Terms of endearment, computer-generated

“As for living,” wrote the French symbolist Auguste Villiers de l’Isle-Adam in his 1890 play Axël, “our servants will do that for us.” Silicon Valley seems intent on giving the infamous remark a new, digital spin: “As for living, our computers will do that for us.”

The latest evidence is Allo, the new Google messaging app that uses artificial intelligence algorithms to generate replies on a user’s behalf. “If your friend sends you a photo of their pet,” Google explained when it launched the software two weeks ago, Allo’s “smart reply” feature will suggest a suitable response, such as “aww cute!” Tap it, and you’re done.

As Evan Selinger and Brett Frischmann pointed out, it’s like an autopilot for friendship.

The smart-reply system, which is built into the Pixel phones Google introduced yesterday, has been in the works for a while. Back in 2012, the company filed for a patent on the “automated generation of suggestions for personalized reactions in a social network.” In the application, Google pointed to birthdays and anniversaries as occasions when a person might want a machine to compose a congratulatory message to send to a friend. What with juggling Snapchat, Instagram, Facebook, and Twitter, who has time to pen a personal note anymore?

Some might point to Allo as yet another example of the trivialization of innovation. Now that the smartphone has become our all-purpose mediator of existence, Google is in a competitive war with rivals like Facebook, Apple, and Amazon to corner the market on human attention and agency. No feature is too trifling to exploit as a potential advantage.

But there’s something deeper going on here. Allo’s message-generation algorithm reveals, in its own small way, the strange view of personal relations that seems to hold sway in Silicon Valley. To the entrepreneurs and coders who run today’s massive social networks, our conversations are data streams. They can be tracked, parsed, and ultimately automated to enhance efficiency and remove kinks from the system.

We already use computers to converse, so the next logical step, in this view, is to use software to conduct the conversations themselves. By relying on an AI to compose our messages, we can optimize our productivity in managing our relationships. Call it the industrialization of affiliation.

Last year, in an online question-and-answer session, Facebook founder and CEO Mark Zuckerberg said that he thinks “there is a fundamental mathematical law underlying human social relationships that governs the balance of who and what we all care about.” Stripped to our essence, we humans are just aggregations of data, and it’s only a matter of time before information scientists discern the statistical pattern that defines our beings. At that point, we’ll all be perfectly programmable.

I expect most people would find such a pinched view of the human condition off-putting, if not repulsive. But as we continue to adapt to the digital processing of our thoughts and words, we may find ourselves embracing, without really thinking about it, the Silicon Valley ethos. We already consider it normal to respond to a friend’s message or photo with a quick click on a like button. Is it really such a leap to let a computer dash off a reply?

The German sociologist Theodor Adorno, in his prescient 1951 book Minima Moralia, warned of the dangers of allowing the values of the business world to creep into our personal lives. Behind the push to make communication more streamlined and efficient, he wrote, lies “an ideology for treating people as things.” Allo and its myriad kin would seem to bear out Adorno’s fears.

In its patent application, Google wrote that an “unstated protocol for behavior” often governs conversations between friends. What to a programmer might look like a formal protocol is actually something fuzzier yet much more meaningful: an expression of kindness, affection, care. It will be interesting to see whether we’ll come to draw a line between artificial intelligence and artificial emotion, or just take them as a package deal.

Those without substance suffer no wounds


An excerpt from “The Snapchat Candidate,” in Utopia Is Creepy:

Twice before in the last hundred years a new medium has transformed elections. In the 1920s, radio disembodied candidates, reducing them to voices. It also made national campaigns much more intimate. Politicians, used to bellowing at fairgrounds and train depots, found themselves talking to families in their homes. The blustery rhetoric that stirred big, partisan crowds came off as shrill and off-putting when piped into a living room or a kitchen. Gathered around their wireless sets, the public wanted an avuncular statesman, not a rabble-rouser. With Franklin Roosevelt, master of the soothing fireside chat, the new medium found its ideal messenger.

In the 1960s, television gave candidates their bodies back, at least in two dimensions. With its jumpy cuts and pitiless close-ups, TV placed a stress on sound bites, good teeth, and an easy manner. Image became everything, as the line between politician and celebrity blurred. John Kennedy was the first successful candidate of the TV era, but it was Ronald Reagan and Bill Clinton who perfected the form. Born actors, they managed to project a down-home demeanor while also seeming bigger than life. They were made for television.

Today, with the public looking to their smartphones for news and entertainment, we’re at the start of the third technological transformation of modern electioneering. The presidential campaign is becoming just another social-media stream, its swift and shallow current intertwining with all the other streams that flow through people’s devices. This shift is changing the way politicians communicate with voters, altering the tone and content of political speech. But it’s doing more than that. It’s changing what the country wants and expects from its would-be leaders. If radio and TV required candidates to be nouns — to present themselves as stable, coherent figures — social media pushes them to be verbs, engines of activity. Authority and esteem don’t accumulate on social media; they have to be earned anew at each moment.

What’s important now is not so much image as personality. But, as the Trump phenomenon suggests, it’s a particular kind of personality that works best — one that’s big enough to grab the attention of the perpetually distracted but small enough to fit neatly into a thousand tiny media containers. It might best be described as a Snapchat personality. It bursts into focus at regular intervals without ever demanding steady concentration.

Facebook is not a media company


“We are a tech company, not a media company,” said Mark Zuckerberg in Rome on August 29, shortly after presenting the Pope with a toy drone. And Zuckerberg — I never thought I’d write this sentence — was right.

Media companies saw it differently. They responded to the Facebook CEO’s remark with a collective, peeved guffaw. At best Zuckerberg was being disingenuous; at worst he was lying. “Yes, Facebook is a media company,” wrote Recode. “Sorry, Mark Zuckerberg, but Facebook is definitely a media company,” wrote Fortune. “Facebook is a media company even though it says it’s not,” wrote Business Insider. “Facebook is totally a media company,” wrote Mashable. “Dude,” tweeted Slate chief Jacob Weisberg, “Facebook is a media company.”

The message could not have been clearer: Dammit, Zuck, you’ve got your hands all over our precious goods — our words, our pictures, our thoughts, our ads — so you better come clean and admit that you’re a media company now. You’re one of us.

That’s like telling the fox that, now that he’s entered the henhouse, he’s a farmer. The fox may be part of the agriculture business — he may at times deal in chickens — but the fox’s business is not agriculture.

And so it is with Facebook. Facebook is an automated data processing company that manages — brilliantly, by any technical standard — an extraordinarily complex network graph, one with well over a billion nodes. To an outsider, the nodes may look like persons or readers or consumers, and the data may look like news stories or photographs or advertisements. But to Facebook they’re just numbers, just the mathematical abstractions of graph theory. Facebook uses software algorithms to optimize data flows among the nodes on its graph in a way that produces a pattern of network activity that maximizes the flow of a certain kind of data (dollars) to one particular node (the one labeled “Facebook”). That’s its business. Everything else — the lobbying, the PR, the meetings with Popes — is window-dressing.

“The fox may at times deal in chickens, 
but the fox’s business is not agriculture.”

Facebook’s goal, and its ideal, is a thoroughly technical one: total automation. It wants to operate its social network entirely with computers. (If you want to know where Zuckerberg is coming from, remember that he has said he believes “there is a fundamental mathematical law underlying human social relationships.”) But the technology is not quite there yet. The abstract network has a real-life manifestation, and in real life there are still some subtle qualities of human common sense and judgment that lie beyond the ability of programmers to replicate in code. And so Facebook still has to rely on people to perform a small number of network-management functions, such as negotiating the terms of its relationships with certain important nodes (a prominent newspaper, say, or a big advertiser) or interpreting the real-world meaning of ambiguous data objects (is the headline on that news story serious or a joke? is the nudity in that photograph intended to titillate or to inform?).

The handoffs between humans and computers inevitably cause confusion, both within the company and outside it, as they introduce the haltingness and messiness of personal judgment into an unimaginably fast, standardized data-processing routine. But it’s important to recognize that, to Facebook, these occasional reversions to the human eye and mind, which at times entail the making of editorial judgments, are matters of exception management — and necessarily, due to the size of the network, peripheral and even contrary to the company’s real business.

Facebook’s software will get better at making distinctions — and we humans will, for better or worse, continue to adapt ourselves to the limitations of the software — but it’s naive to think that the company will, or even could, take on the editorial responsibilities of a media company. When there’s an outcry over some filtering or labeling miscue, whether it stems from a software error or a human bias, Facebook will make a show of fixing the problem and tweaking “the process” (as we’ve just seen with the imbroglio over the deletion of a harrowing Vietnam War photograph). But that’s still just exception management. Facebook’s scale precludes the kind of day-to-day editorial decision-making that characterizes media companies.

Does that mean that Facebook bears no responsibility for the workings of its software, or that its operations lie beyond public scrutiny? Absolutely not, on both counts. It means that both corporate responsibility and public scrutiny are going to take different forms for Facebook than they do for media businesses. The Court of Justice of the European Union, in its trenchant 2014 ruling on what’s come to be known, misleadingly, as the Right to Be Forgotten case, observed that companies like Facebook and Google — companies that “control” online information flows on a grand scale — are new kinds of businesses and need to be treated as such by the public and its institutions. The data-processing giants play a different role from that of traditional media companies like newspapers, but it’s a role that extends well beyond mere information distribution. They are not, as they like to pretend, just data pipelines. In filtering, sorting, and arranging information produced by others, whether media companies or individuals, a business like Facebook transforms that information into a new product. It manipulates the information to serve its own interests, and it does so on a scale far greater than anything we’ve seen before. Because the company doesn’t fit old molds, and because it keeps its data-processing protocols secret, it deserves particularly close and thoughtful scrutiny by the public. Labeling Facebook a media company does not illuminate what Facebook does; it obscures it.

“The pressing challenge for journalism companies
is to define what they are, not what Facebook is.”

As for news outlets, their demand that Facebook assume the identity and responsibility of a media company may feel good, but it’s going to accomplish nothing. As we’ve already seen, pointing out that Facebook still occasionally relies on people to make editorial judgments is not going to inspire Facebook to hire more editors; it’s going to inspire Facebook to redouble its efforts to automate those judgments, even if the price in the immediate term is more foul-ups. (The company has a lot of experience dealing with self-inflicted embarrassments; it has learned that the press and the public lose interest quickly.) Facebook, in short, will continue to be true to its calling as a technology company, a company in the lucrative business of large-scale data processing.

The pressing challenge for journalism companies is to define what they are, not what Facebook is. Together and individually, they’re going to have to decide precisely what kinds of nodes they want to be — or whether they want to be nodes at all. That’s not going to be easy. But if you hand the fox your chickens and tell him he must take proper care of them, you have only yourself to blame if you come round the next day and find a pile of feathers.

White Ocean Riot


Everything had been going swimmingly at this year’s Burning Man, the annual desert festival devoted to “decommodification” and “radical self-reliance,” reports social media specialist Becky Wicks in a GQ post:

I turned my head up to the giant shrimp rotating on the ceiling, and realised the end of it had been cleverly moulded into the shaft of a penis. Before I could voice this fact aloud however, I was being thrust a sippy cup full of champagne, the shrimp-penis was forgotten and I found myself bouncing with my new friend MacGyver on a trampoline, in my shimmering fairy costume and wings. “Life is soooooo fun!” we screamed into the dust clouds, as my champagne flew everywhere. “This is so good!” And it was.

Then the hooligans arrived. In a wink Burning Man turned into Occupy Burning Man. The target of the insurgents’ wrath was the White Ocean luxury camp, an air-conditioned “plug-and-play” enclave of the rich and beautiful bankrolled by the son of a Russian oil billionaire. (Radical self-reliance doesn’t come cheap these days.) In the middle of the night, the hooligans snuck past White Ocean’s security detail, raided the posh outpost, flooded it with water, glued the doors of its RVs shut, and cut its electrical lines. With no power, the refrigeration system shut down and the champagne lost its chill.

It was a class war in a classless commune and, as The Telegraph reported, symbolized a larger rift: “The big tensions that have been rubbing up against each other in the tech scene for decades erupted to the surface.”

On Facebook, White Ocean issued a plaintive message about the unpleasantness:

A very unfortunate and saddening event happened last night at White Ocean, something we thought would never be possible in OUR Burning Man utopia. A band of hooligans raided our camp, stole from us, pulled and sliced all of our electrical lines leaving us with no refrigeration and wasting our food and, glued our trailer doors shut, vandalized most of our camping infrastructure, dumped 200 gallons of potable water flooding our camp.

We immediately contacted authorities. Sheriffs came to our camp along with rangers to take our report.

Sad, yes, but it’s comforting to know that, even in utopia, authorities are on call.