Monthly Archives: November 2010

Absorbing self-communion

Virginia Heffernan is not the first New York Times Magazine writer to tackle the topic of attention. A correspondent pointed me to another piece (published precisely 100 years ago today), which was titled “The Secret of Success – Intellectual Concentration.” It looks at “notable cases where men won fame and fortune through absorbing self-communion.” These fellows – they include “Edison, Keene, Pupin, Hewitt, Westinghouse and Gould” – seem to have been gifted with big, fat, wonky attention spans.

I particularly enjoyed the description of Edison’s ability to focus his attention:

When Edison, still a telegraph operator in Boston, was receiving or sending messages with a rapidity which at that time had never been surpassed [one can only imagine the vigor of the tweet stream Edison would have emitted! –editor], he began to wonder whether it might not be possible to send two messages each way at the same time through a telegraph wire. As he thought about this matter he became convinced that this seemingly impossible feat could be shown to be not impossible at all, but entirely practicable.

Finally Edison concentrated his mind upon the problem. He ate mechanically; he was almost unconscious of what he put into his mouth. He gave himself no sleep. He sat in his little laboratory as silent as a graven image. What was in his mind no man could tell. What he saw with his intensely concentrated mental vision he alone knew. But as a result of that concentration of mind he gave the world the quadruplex telegraph.

Some years later, one hot Summer day in the year 1878, Edison took a train at New York for the manufacturing village of Derby in the Housatonic Valley in Connecticut. With him was his friend George H. Barker, Professor of Chemistry at the University of Pennsylvania. Both were in high spirits, for the excursion seemed to them to be a sort of brief vacation – a pleasure trip.

But when Edison, on arriving at his destination, stood in front of a new electrical apparatus his whole manner changed instantly. He stood before that great piece of machinery which was creating, or at least capturing, electricity of great volume, utterly unconscious of his surroundings. There was no other intelligence in his eye than that which seemed to be reflected from the machine. If any one spoke to him he did not hear.

He was, in fact, under the complete domination of absolutely concentrated thought. He was another man – almost superman; and when later in the day his friend called to him, saying that it was time to take the return train, then only did Edison seem to awaken from what appeared to be almost a hypnotic or somnambulistic state. And then he said simply: “I think I have solved the secret of the divisibility of the electric current so that we may have the incandescent electric light.”

I love that “simply.”

It has recently become fashionable (as we swing to the sway of our new technologies) to denigrate solitary, deeply attentive thinking, the kind celebrated and symbolized by Rodin’s The Thinker. Ideas and inventions, we’re urged to believe, leap not from the head of the self-communing genius but from the whirl of “the network.” In fact, you need both – the lonely wizard and the teeming bazaar – as Edison’s life so clearly demonstrates. Edison certainly drew on the work and ideas of his predecessors and contemporaries, and his Menlo Park laboratory was by all accounts a noisy orgy room of intellectual cross-fertilization. But, like other deep thinkers, Edison had the ability to screen out the noise and focus his mind – and that capacity, half innate and half hard-won, was also essential to his creativity.

Newton stood on the shoulders of giants, but that doesn’t make Newton any less of a giant.

Yes, Virginia, there is attentiveness

Virginia Heffernan has a funny little column in this Sunday’s New York Times Magazine. She opens by pointing to Jonah Lehrer and me as examples of people who allegedly believe that, as she puts it, “everyone has an attention span” and “an attention span is a freestanding entity like a boxer’s reach, existing independently of any newspaper or chess game that might engage or repel it, and which might be measured by the psychologist’s equivalent of a tailor’s tape.” This is complete horseshit. Lehrer and I have different views of how the internet and other media influence attentiveness, but I certainly don’t believe that individual human beings have fixed and precisely measurable attention spans, and I’m pretty sure that Lehrer doesn’t believe that either. In fact, I can’t say I’ve come across anyone of any sentience who subscribes to such a naive notion. Of course attentiveness is situational, and of course it’s influenced by the activities one pursues – indeed, it’s the nature of that influence that concerns Lehrer, me, and the many other people who are interested in the cognitive effects of media and other technologies.

In trotting out the strawman of a fixed attention span, Heffernan obfuscates a whole array of interesting, complicated, and important questions. Central to those questions is the fact that “attentiveness” takes many forms. One can, for instance, be attentive to rapid-paced changes in the environment, a form of attentiveness characterized by quick, deliberate shifts in focus. As Lehrer and others have described, there is evidence that video gaming can enhance this kind of attentiveness. There is a very different form of attentiveness that involves filtering out, or ignoring, environmental stimuli in order to focus steadily on one thing – reading a long book, say, or repairing a watch. Our capacity for this kind of sustained attention is being eroded, I argue, by the streams of enticing info-bits pouring out of our networked gadgets. There are also differing degrees of control that we wield over our attention. Research by Clifford Nass and his associates at Stanford suggests that people who are heavy media multitaskers may be sacrificing their ability to distinguish important information from trivia – it all blurs together. And there are, as well, different sorts of distractions – those that can refresh our thinking and those that can short-circuit it.

We’re still a long way from understanding exactly how attention works in our minds, but we do know that the way we pay attention has a crucial effect on many of our most important mental processes, from the formation of memories to conceptual and critical thinking. As the psychology professor David Strayer puts it, “Attention is the holy grail. Everything that you’re conscious of, everything you let in, everything you remember and you forget, depends on it.” Heffernan is right to remind us that there is no one ideal form of attentiveness – that focusing too tightly can be as bad as focusing too loosely – but if she truly believes that “the dominant model” of discourse about attentiveness “ignores the content of activities in favor of a wonky span thought vaguely to be in the brain,” she hasn’t been paying attention.

UPDATE: Another view of attention spans,* also from today’s Sunday Times. And an older take.

*In the vernacular sense, meaning, roughly, “ability to sustain one’s concentration.”

FURTHER UPDATE: Rob Horning chimes in, smartly:

… unlike Heffernan, I see concentration rather than distraction as an act of cultural resistance.

The problem with reckoning with attention problems is not that it is ineffable but that it doesn’t correspond with an economic model that has us spending and replenishing some quantifiable supply of it. But the metaphors built into an “attention span” or “paying attention” or the “attention economy” imagine a scarce resource rather than a quality of consciousness, a mindfulness. It may be that the notion of an attention economy is a sort of self-fulfilling prophecy, bringing into being the problems it posits through the way it frames experience. It may not be constructive to regard attention as scarce or something that can be wasted and let those conceptions govern our relation to our consciousness. The metaphor of how we exert control over our focus may be more applicable, more politically useful in imagining an alternative to the utility-maximizing neoliberal self. The goal would then be not to maximize the amount of stuff we can pay attention to but instead an awareness that much of what nips at us is beneath our attention.

Privacy is relative

January 17, 2010: “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” -Eric Schmidt

November 10, 2010: “Google CEO Eric Schmidt announced the salary hike in a memo late Tuesday, a copy of which was obtained by Fortune. The memo was also leaked to Business Insider, which broke the news. Within hours, Google notified its staff that it had terminated the leaker, several sources told CNNMoney. A Google spokesman declined to comment on the issue, or on the memo.”

The unrevolution

“I am not a Communist,” declared the author-entrepreneur Steven Johnson in a recent column in the business section of the New York Times. Johnson made his disclaimer in the course of celebrating the creativity of “open networks,” the groups of volunteers who gather on the net to share ideas and produce digital goods of one stripe or another. Because they exist outside the marketplace and don’t operate in response to the profit motive, one might think that such collaboratives would represent a threat to traditional markets. After all, what could be more subversive to consumer capitalism than a mass movement of people working without pay to create free stuff for other people? But capitalists shouldn’t worry, says Johnson; they should rejoice. The innovations of the unpaid web-enabled masses may be “conceived in nonmarket environments,” but they ultimately create “new platforms” that “support commercial ventures.” What appears to excite Johnson is not the intrinsic value of volunteerism as an alternative to consumerism, but the way the net allows the efforts of volunteers to be turned into the raw material for profit-making ventures.

[image: communist.jpg]

Johnson’s view is typical of many of the web’s most enthusiastic promoters, the Corporate Communalists who feel compelled to distance themselves from, if not ignore entirely, the more radical implications of the trends they describe with starry-eyed avidity. In a new book with a Marx-tinged title, What’s Mine Is Yours, the business consultants Rachel Botsman and Roo Rogers begin by describing the onset of what sounds like an anti-market revolution. “The convergence of social networks, a renewed belief in the importance of community, pressing environmental concerns, and cost consciousness,” they write, “are moving us away from the old, top-heavy, centralized, and controlled forms of consumerism toward one of sharing, aggregation, openness, and cooperation.” Indeed, we are at a moment of transition from “the twentieth century of hyper-consumption,” during which “we were defined by credit, advertising, and what we owned,” to “the twenty-first century of Collaborative Consumption,” in which “we will be defined by reputation, by community, and by what we can access and how we share and what we give away.”

But, having raised the specter of an anti-consumerist explosion, Botsman and Rogers immediately defuse the revolution they herald. Like Johnson, they turn out to be more interested in the way online sharing feeds into profit-making ventures. “Perhaps what is most exciting about Collaborative Consumption,” they write, with charming naiveté, “is that it fulfills the hardened expectations on both sides of the socialist and capitalist ideological spectrum without being an ideology in itself.” In fact, “For the most part, the people participating in Collaborative Consumption are not Pollyannaish do-gooders and still very much believe in the principles of capitalist markets and self-interest … Collaborative Consumption is by no means antibusiness, antiproduct or anticonsumer.” Whew!

As Rob Horning notes in his review of the book, Botsman and Rogers are more interested in co-opting anti-consumerist energies than unleashing them. Economically speaking, they’re radical conservatives:

Were the emphasis of What’s Mine Is Yours strictly on giving things away, as opposed to reselling them or mediating the exchanges, it might have been a different sort of book, a far more utopian investigation into practical ways to shrink the consumer economy. It would have had to wrestle with the ramifications of advocating a steady-state economy in a society geared to rely on endless growth. But instead, the authors are more interested in the new crop of businesses that have sprung up to reorient some of the anti-capitalistic practices that have emerged online — file sharing, intellectual property theft, amateur samizdat distribution, gift economies, fluid activist groups that are easy to form and fund, and so on — and make them benign compliments [sic] to mainstream retail markets. Indeed, conspicuously absent from the book is any indication that any business entities would suffer if we all embraced the new consumerism, a gap that seems dictated by the book’s intended audience: the usual management-level types who consume business books.

A similar tension, between revolutionary rhetoric and counterrevolutionary message, runs through the popular “wikinomics” writings of Don Tapscott and Anthony D. Williams. In their new book, Macrowikinomics, they once again promote the net as, to quote from Tom Slee’s review, “a revolutionary force for change, carrying us to a radically different future.” And yet the blurbs on the back of the book come from a who’s who of big company CEOs. The revolution that Tapscott and Williams describe is one that bears, explicitly, the imprimatur of Davos billionaires. For them, too, the ultimate promise of open networks, of wikis, lies in providing new opportunities, or “platforms,” for profiteers. Slee notes some of the contradictions inherent in their argument:

On one side, Macrowikinomics exaggerates the political and economic possibilities of digital collaboration as well as the discontinuity between today’s digital culture and the activities of previous generations. On the other side, it ignores the unsavoury possibilities that seem to accompany each and every inspiring initiative on the Internet (every technology has its spam) and inspirational initiatives for change that take place away from the digital world. Most importantly, it does not register the corrosive effect of money (and particularly large amounts of money) on the social production and voluntary networked activity that they are so taken with.

What most characterizes today’s web revolutionaries is their rigorously apolitical and ahistorical perspective – their fear of actually being revolutionary. To them, the technological upheaval of the web ends in a reinforcement of the status quo. There’s nothing wrong with that view, I suppose – these are all writers who court business audiences – but their writings do testify to just how far we’ve come from the idealism of the early days of cyberspace, when online communities were proudly uncommercial and the free exchanges of the web stood in opposition to what John Perry Barlow dismissively termed “the Industrial World.” By encouraging us to think of sharing as “collaborative consumption” and of our intellectual capacities as “cognitive surplus,” the technologies of the web now look like they will have, as their ultimate legacy, the spread of market forces into the most intimate spheres of human activity.

[image: revolution.jpg]

PS These are the first lolcats I’ve created. Pretty good, huh?