You are an intimate data bundle

In case you thought I was kidding about Facebook’s forthcoming Oculus Networked Mood Ring, the BBC reports on the demonstration, at a big advertising fest in France, of a “smart bracelet that can read your emotions,” designed by Studio XO in London.

The telltale bangle, the style of which might best be described as hospital-patient chic, is part of the design shop’s “emotional technology platform,” called XOX, which enables companies “to track users’ emotional states, collect data and tailor services and experiences for both individuals and large audiences.” The system sounds nifty:

At the heart of the XOX Emotional Technology Platform are the XOX servers, [which provide] access to the audience’s intimate data. This is processed locally and available through an industry standard API. … The basic system includes specially designed ergonomic wristbands that are worn on the upper wrist. Intimate data is read via a number of wearable biometric sensors. This raw data is processed in real-time on the wristband before being transmitted to the XOX server via one of a number of XOX base transceiver units. … Intimate data bundles can be packaged up for our clients to enable them to better understand human emotion and an audience’s engagement with experiences, products and services.
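To make the abstraction concrete, here is a purely hypothetical sketch of what one of those "intimate data bundles" might look like by the time the wristband's processed readings reach a vendor's server. Every field name and value below is invented for illustration; Studio XO has published no actual schema, and the sensors and scores shown are merely the sort of thing such a platform would plausibly traffic in.

```python
import json
import time

# Hypothetical "intimate data bundle" from a biometric wristband.
# All fields are invented for illustration, not Studio XO's real format.
bundle = {
    "wearer_id": "anon-4721",          # pseudonymous wristband ID (assumed)
    "timestamp": int(time.time()),
    "sensors": {
        "heart_rate_bpm": 88,
        "skin_conductance_us": 4.2,    # microsiemens, a common arousal proxy
        "skin_temp_c": 33.1,
    },
    "derived": {                       # scores computed on the wristband
        "arousal": 0.71,               # 0-1 scale (assumed)
        "valence": 0.35,
    },
}
print(json.dumps(bundle, indent=2))
```

The point of the sketch is how little the wearer would see of any of this: the raw signals stay on the wrist, and only the packaged emotional summary travels to the client.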

This may be off-topic, but have you noticed that “iWatch” has a double meaning?

Promotional image from Studio XO.

Filed under Uncategorized

The platform is the conversation

“Prior to the Internet, the last technology that had any real effect on the way people sat down and talked together was the table,” observed Clay Shirky in a 2003 speech. One thing you can say about a table, as a social-networking technology, is that it doesn’t have an agenda. It is agnostic about the conversation that goes on around it. The table is a mute and neutral host; all the action occurs at its edges, where the people are. And so, when Shirky discussed the essence of online communication in his speech, he focused not on the role of the table but on the roles of the people around the table and in particular on the dynamic tension between the interests of the group and the interests of the group’s individual members.

Shirky’s reference to the table as a precursor to the Internet was a joking one, but, as the thrust of his speech made clear, it also reflected the prevailing sense of the Net’s role as a neutral host, or “platform,” for online social interaction. Here is Yochai Benkler discussing the rise of “social software” in his 2006 book The Wealth of Networks: “The design of the Internet itself is agnostic as among the social structures and relations it enables. At its technical core is a commitment to push all the detailed instantiations of human communications to the edges of the network — to the applications that run on the computers of users.” The Internet, in other words, is just a very, very big table. It convenes, but it doesn’t intervene. All the action occurs at its edges.

Benkler’s mistake, we can now see, lay in underestimating the Net’s capabilities and its commercial incentives. He missed the fact that, with cloud computing, the essential functionality of applications, along with the data they process, would move from “the computers of users” to the data centers of big Internet companies — from the edges to the center.

Here’s something else that Shirky said in that 2003 speech: “The normal experience of social software is failure. If you go into Yahoo groups and you map out the subscriptions, it is, unsurprisingly, a power law. There’s a small number of highly populated groups, a moderate number of moderately populated groups, and this long, flat tail of failure. And the failure is inevitably more than 50% of the total mailing lists in any category. So it’s not like a cake recipe. There’s nothing you can do to make it come out right every time.” What’s most interesting here, in retrospect, is the trivial role that Shirky attributes to Yahoo. People gather through Yahoo, but the company otherwise stays out of the picture. Yahoo is just another table carved out of the larger table of the Internet. It’s a neutral platform that doesn’t involve itself in the social dynamics playing out along its edges. It, too, convenes but doesn’t intervene. It certainly doesn’t fiddle with the recipe. In Benkler’s work, as well, the corporate conveners — the Yahoos, Googles, MySpaces, Facebooks, etc. — are notable largely by their absence. For his thesis to hold, they need to be agnostic and relatively uninteresting players. They need to be tables.
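The distribution Shirky describes can be sketched with a toy model. Everything here is invented for illustration (1,000 groups, sizes proportional to 1/rank, an arbitrary cutoff of 20 members for "failure"); it only reproduces the shape he points to: a few huge groups and a long, flat tail of tiny ones.

```python
# Toy power law: group size proportional to 1/rank (assumed exponent of 1).
N = 1000                # number of groups (assumed)
TOP = 10000             # size of the biggest group (assumed)
sizes = [round(TOP / r) for r in range(1, N + 1)]

# The "long, flat tail": groups below an arbitrary 20-member threshold.
small = sum(1 for s in sizes if s < 20)

print(f"largest group: {sizes[0]} members")
print(f"smallest group: {sizes[-1]} members")
print(f"share of groups under 20 members: {small / N:.0%}")
```

With these made-up parameters, roughly half the groups land in the sub-20 tail, which is the point: under a power law, "failure" is the normal case, not an anomaly to be engineered away.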

The revelation that Facebook is something of a World Wide Skinner Box, routinely conducting behavioral-modification experiments on its unknowing members and then incorporating the results of those experiments into the algorithms that determine the shape of its members’ conversations, tells us how naive we were to look at social-networking platforms as high-tech versions of tables and to believe that the Net had a “commitment” to push social interaction to its edges. Facebook, and every other large social-networking and information-aggregation company, both convenes and intervenes. Indeed, it convenes in order to intervene. The platform is the conversation. To fully analyze online social dynamics, one has to attend not only to the tension between the group and the individual but also to the tensions between the platform and both the group and the individual. The problem is that whereas the group-individual tension is visible, the manipulations of the platform are invisible. With the publication of the Facebook study, the veil trembled. We all knew the veil was there — we all knew we were inside a Skinner Box — but suddenly we had to admit the fact.

“The platform is the conversation.” I intend that to be taken not as a literal fact — all of Facebook’s experiments and algorithmic tweaks may in the end have a trivial influence on people’s conversations and thoughts — but as a provocation. The old romantic “Internet,” to borrow Evgeny Morozov’s quotation marks, is dead and gone. The center held. The table has an agenda.

Image taken from the Facebook advertisement “Dinner.”

Filed under Uncategorized

Impure thoughts

Alan Jacobs points to a wonderful passage in Claude Lévi-Strauss’s Tristes Tropiques:

In Martinique, I had visited rustic and neglected rum-distilleries where the equipment and the methods used had not changed since the eighteenth century. In Puerto Rico, on the other hand, in the factories of the company which enjoys a virtual monopoly over the whole of the sugar production, I was faced by a display of white enamel tanks and chromium piping. Yet the various kinds of Martinique rum, as I tasted them in front of ancient wooden vats thickly encrusted with waste matter, were mellow and scented, whereas those of Puerto Rico are coarse and harsh. We may suppose, then, that the subtlety of the Martinique rums is dependent on impurities the continuance of which is encouraged by the archaic method of production. To me, this contrast illustrates the paradox of civilization: its charms are due essentially to the various residues it carries along with it, although this does not absolve us of the obligation to purify the stream. By being doubly in the right, we are admitting our mistake. We are right to be rational and to try to increase our production and so keep manufacturing costs down. But we are also right to cherish those very imperfections we are endeavouring to eliminate. Social life consists in destroying that which gives it its savour.

Comments Alan: “The underlying philosophy of liberalism, and the consumer culture it generates, condensed into nine sentences.” I love the fact that he gave in to the urge to count the sentences. That seems so . . . Bacardian.

Image: Christina Hsu.

Filed under Uncategorized

Promoting human error

From a report on a prototype of a self-driving tractor-trailer developed by Daimler as part of its Mercedes-Benz Future Truck 2025 project:

For Daimler, the truck driver of the future looks something like this: He is seated in the cab of a semi, eyes on a tablet and hands resting in his lap …

The Daimler truck retains a steering wheel as a safety measure. This allows a driver to intervene for critical maneuvers …

The experience of guiding a self-driving truck is far less stressful than the vigilance required from a human to respond to traffic conditions. This means that drivers could have enough free time to speak with their families or employers, take care of paperwork or make travel plans …

“It’s strange at first,” said Hans Luft, who sat in the truck’s cab during the demonstration on Thursday. He waved his hands to show observers that he did not need them on the wheel, tapping at his tablet while taking advantage of the 45-degree swivel of his driver’s seat. “But you quickly learn to trust it and then it’s great.”

So you create an automated system that actively undermines the vigilance and situational awareness of the operator while at the same time relying on the operator to take control of the system for “critical maneuvers” in emergencies. This is a textbook case of automation design that borders on the criminally insane. And when an accident occurs — as it will — the crash will be blamed not on “stupid design” but on “human error.”

Image: Randy von Liski.

Filed under The Glass Cage

The soma cloud

“The computer could program the media to determine the given messages a people should hear in terms of their overall needs, creating a total media experience absorbed and patterned by all the senses. … By such orchestrated interplay of all media, whole cultures could now be programmed in order to improve and stabilize their emotional climate.” —Marshall McLuhan, 1969

“The experiment manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed. This tested whether exposure to emotions led people to change their own posting behaviors, in particular whether exposure to emotional content led people to post content that was consistent with the exposure — thereby testing whether exposure to verbal affective expressions leads to similar verbal expressions, a form of emotional contagion.” —Kramer et al., 2014

“I’m excited to announce that we’ve agreed to acquire Oculus VR, the leader in virtual reality technology. … This is really a new communication platform. By feeling truly present, you can share unbounded spaces and experiences with the people in your life. Imagine sharing not just moments with your friends online, but entire experiences and adventures.” —Mark Zuckerberg, 2014

The strategy behind the Oculus acquisition has become much clearer to me over the last week. Haters gonna hate, worrywarts gonna worry, but I for one am looking forward to Facebook’s Oculus Rift experiments. Once the company is able to manipulate “entire experiences and adventures,” rather than just bits and pieces of text, the realtime engineering of a more harmonious and stabilized emotional climate may well become possible. I predict that the next great opportunity in wearables lies in finger-mountables — in particular, the Oculus Networked Mood Ring. We’ll all wear them, as essential Rift peripherals, and they’ll all change color simultaneously, depending on the setting that Zuck dials into the Facebook Soma Cloud.

I know, I know: this is all just blue-sky dreaming for now. But as the poet said, in dreams begin realities.

At least I think that’s what he said.

This post is an installment in Rough Type’s ongoing series “The Realtime Chronicles,” which began here. A full listing of posts can be found here.

Image: detail of cover of paperback edition of Brave New World.

Filed under Realtime

I feel measurably less emotional now

Sheryl Sandberg, Facebook’s COO, responds to the uproar about the company’s clandestine psychological experiment on its members:

“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication we apologize. We never meant to upset you.”

So an experiment designed to explore how the delivery of information can be programmed to manipulate people’s emotional states was just part of routine product-development testing? No worries. I apologize for getting upset.

Image: Detail of André Brouillet’s “Une Leçon Clinique à la Salpêtrière.”

Filed under Uncategorized

An android dreams of automation

Google’s Android guru, Sundar Pichai, provides a peek into the company’s conception of our automated future:

“Today, computing mainly automates things for you, but when we connect all these things, you can truly start assisting people in a more meaningful way,” Mr. Pichai said. He suggested a way for Android on people’s smartphones to interact with Android in their cars. “If I go and pick up my kids, it would be good for my car to be aware that my kids have entered the car and change the music to something that’s appropriate for them,” Mr. Pichai said.

What’s illuminating is not the triviality of Pichai’s scenario — that billions of dollars might be invested in developing a system that senses when your kids get in your car and then seamlessly cues up “Baby Beluga” — but what the urge to automate small, human interactions reveals about Pichai and his colleagues. With this offhand example, Pichai gives voice to Silicon Valley’s reigning assumption, which can be boiled down to this: Anything that can be automated should be automated. If it’s possible to program a computer to do something a person can do, then the computer should do it. That way, the person will be “freed up” to do something “more valuable.” Completely absent from this view is any sense of what it actually means to be a human being. Pichai doesn’t seem able to comprehend that the essence, and the joy, of parenting may actually lie in all the small, trivial gestures that parents make on behalf of or in concert with their kids — like picking out a song to play in the car. Intimacy is redefined as inefficiency.

I guess it’s no surprise that what Pichai expresses is a robot’s view of technology in general and automation in particular — mindless, witless, joyless; obsessed with productivity, oblivious to life’s everyday textures and pleasures. But it is telling. What should be automated is not what can be automated but what should be automated.

Image: “Communicating with the Beluga” by Bob.

Filed under The Glass Cage