Two aphorisms and a few notes

Aphorism #1: To a man with a blog, everything looks like fodder.

Geert Lovink ends his 2006 essay Blogging, the nihilist impulse with this remarkable paragraph:

Can we talk of a “fear of media freedom”? It is too easy to say that there is freedom of speech and that blogs materialize this right. The aim of radical freedom, one could argue, is to create autonomy and overcome the dominance of media corporations and state control and to no longer be bothered by “their” channels. Most blogs show an opposite tendency. The obsession with news factoids borders [on] the extreme. Instead of selective appropriation, there is over-identification and straight out addiction, in particular to the speed of real-time reporting. Like Erich Fromm (author of Fear of Freedom), we could read this as “a psychological problem” because existing information is simply reproduced and in a public act of internalization. Lists of books that still have to be read, a common feature on blogs, lead in the same direction. According to Fromm, freedom has put us in an unbearable isolation. We thus feel anxious and powerless. Either we escape into new dependencies or realize a positive freedom that is based upon “the uniqueness and individuality of man”. “The right to express our thoughts means something only if we are able to have thoughts of our own.” The freedom from traditional media monopolies leads to new bondages, in this case to the blog paradigm, where there is little emphasis on positive freedom, on what to [do] with the overwhelming functionality and the void of the empty, white entry window. We do not hear enough about the tension between the individual self and the “community”, “swarms”, and “mobs” that are supposed to be part of the online environment. What we instead see happening on the software side are daily improvements of ever more sophisticated (quantitive) measuring and manipulation tools (in terms of inbound linking, traffic, climbing higher on the Google ladder, etc.). Isn’t the document that stands out the one that is not embedded in existing contexts? Doesn’t the truthness lie in the unlinkable?

From this perspective, the blogosphere, and indeed the entire link-denominated Web, is not a machine for exposing the truth but rather one for hiding it. For Google, and for its users, the unlinkable does not just lack value; it doesn’t exist. The overriding goal, for bloggers and other purveyors of online content, is the creation of the linkable, the link-worthy: that which will immediately attract approval or disapproval, that which is easily assimilated. Bloggers break the mass media bauble, then spend all day in the nursery playing with the shards. Lovink quotes Baudrillard: “If there was in the past an upward transcendence, there is today a downward one. This is, in a sense, the second Fall of Man Heidegger speaks of: the fall into banality, but this time without any possible redemption.”

A rephrasing: Does truth begin where the long tail ends?

Twitter is often referred to as a “micro-blogging platform,” but twittering seems more like antiblogging, or at least an escape – retreat? – from blogging. Blogging is the soapbox in the park, the shout in the street; Twitter is the whispering of a clique. You can easily see why it’s compelling, but you can just as easily see its essential creepiness. (At least it’s up-front about its creepiness, using the term “follower” in place of the popular euphemism “friend.”)

Aphorism #2: To a man with a Twitter account, every action is a pretext.

“What are you doing?” is the question Twitter asks you to answer. But in the world of Twitter, there can be only one honest answer: “I am twittering.” Any other answer is a fib, a fabrication – a production.

As with other media of the self, Twitter makes the act subservient to its expression. It turns us into observers of our own lives, and not in the traditional sense of self-consciousness (watching with the inner eye) but in the mass media sense (watching with the eye of the producer). As the Observer Effect tells us, the act of observing the act changes the act. So how does Twitter warp the lives of twitterers? If truth lies in the unlinkable, does life lie in the untweetable?

Yet if Nietzsche’s typewriter pushed him further into the aphoristic mode and set the stage for some of his greatest works, might not Twitter be an empty cage awaiting its resident genius? It’s worth remembering, in any case, one of Nietzsche’s aphorisms: “Talking about oneself can also be a means to conceal oneself.” That’s a tweet worth twittering.

More food for thought

Reactions to my Atlantic essay continue to roll in. In today’s Globe and Mail, columnist Margaret Wente becomes the latest writer to fess up to an evaporating ability to read long works of prose:

Google has done wondrous things for my stock of general knowledge. It also seems to have destroyed my attention span. Like a flea with ADD, I jump back and forth from the Drudge Report to gardening sites that list the growing time of Green Zebras …

Thanks to Google, we’re all turning into mental fast-food junkies. Google has taught us to be skimmers, grabbing for news and insights on the fly. I skim books now too, even good ones. Once I think I’ve got the gist, I’ll skip to the next chapter or the next book. Forget the background, the history, the logical progression of an argument. Just give me the takeaway.

Meanwhile, on the BBC News site, Bill Thompson takes the discussion in an interesting new direction:

The Swiss developmental psychologist Jean Piaget described two processes that he believed lay behind the development of knowledge in children. The first is assimilation, where new knowledge fits into existing conceptual frameworks. More challenging is accommodation, where the framework itself is modified to include the new information.

The current generation of ‘search engines’ seem to encourage a model of exploration that is disposed towards assimilative learning, finding sources, references and documents which can be slotted into existing frameworks, rather than providing material for deeper contemplation of the sort that could provoke accommodation and the extension, revision or even abandonment of views, opinions or even whole belief systems.

Perhaps the real danger posed by screen-based technologies is not that they are rewiring our brains but that the collection of search engines, news feeds and social tools encourages us to link to, follow and read only that which we can easily assimilate.

Another interesting (and possibly related) psychological theory that I came across in researching the Atlantic article (but didn’t pursue) is that there are two very different modes of thought: exploration (finding new information) and exploitation (reflecting on or synthesizing information in order to come up with fresh ideas). It may be that the Net is increasing our incentives for exploration while decreasing our incentives for exploitation.

UPDATE: Also see Christine Rosen’s The Myth of Multitasking in the new issue of The New Atlantis.

Gains and losses

In a column about my Atlantic article in the Sunday Times (London), Andrew Sullivan draws on his personal experience as a prolific blogger to describe what the Web has given and what it has taken away:

In researching a topic [online], or just browsing through the blogosphere, the mind leaps and jumps and vaults from one source to another. The mental multitasking – a factoid here, a YouTube there, a link over there, an e-mail, an instant message, a new PDF – is both mind-boggling when you look at it from a distance and yet perfectly natural when you’re in mid-blog.

When it comes to sitting down and actually reading a multiple-page print-out, or even, God help us, a book, however, my mind seizes for a moment. After a paragraph, I’m ready for a new link. But the prose in front of my nose stretches on.

I get antsy. I skim the footnotes for the quick info high that I’m used to. No good. I scan the acknowledgments, hoping for a name I recognise. I start again.

A few paragraphs later, I reach for the laptop. It’s not that I cannot find the time for real reading, for a leisurely absorption of argument or narrative. It’s more that my mind has been conditioned to resist it.

Is this a new way of thinking? And will it affect the way we read and write? If blogging is corrosive, the same could be said for Grand Theft Auto, texting and Facebook messaging, on which a younger generation is currently being reared. But the answer is surely yes – and in ways we do not yet fully understand. What we may be losing is quietness and depth in our literary and intellectual and spiritual lives.

Hofstadter on AI

Speaking of the Singularity – and how can you avoid it, really, these days? – Douglas Hofstadter, author of the classic Gödel, Escher, Bach as well as, more recently, I Am a Strange Loop, spots the misanthropy that lies beneath the sunny surfaces of the AI millennialists and many other techno-utopians:

Am I disappointed by the amount of progress in cognitive science and AI in the past 30 years or so? Not at all. To the contrary, I would have been extremely upset if we had come anywhere close to reaching human intelligence — it would have made me fear that our minds and souls were not deep. Reaching the goal of AI in just a few decades would have made me dramatically lose respect for humanity, and I certainly don’t want (and never wanted) that to happen …

Do I still believe it will happen someday? I can’t say for sure, but I suppose it will eventually, yes. I wouldn’t want to be around then, though. Such a world would be too alien for me. I prefer living in a world where computers are still very very stupid. And I get a huge kick out of laughing at the hilariously unpredictable inflexibility of the computer models of mental processes that my doctoral students and I co-design. It helps remind me of the immense subtlety and elusiveness of the human mind.

Indeed, I am very glad that we still have a very very long ways to go in our quest for AI. I think of this seemingly “pessimistic” view of mine as being in fact a profound kind of optimism, whereas the seemingly “optimistic” visions of Ray Kurzweil and others strike me as actually being a deeply pessimistic view of the nature of the human mind.

Another voice

Leonard Pitts Jr., the Pulitzer Prize-winning columnist for the Miami Herald, admits that he, too, has “forgotten how to read”:

I do not mean that I have lost the ability to decode letters into words. I mean, rather, that I am finding it increasingly difficult to read deeply, to muster the focus and concentration necessary to wrestle any text longer than a paragraph or more intellectually demanding than a TV listing.

You’re talking to a fellow whose idea of fun has always been to retire to a quiet corner with a thick newspaper or a thicker book and disappear inside. But that has become progressively harder. More and more, I have to do my reading in short bursts; anything longer and I start drowsing over the page even though I’m not sleepy, or fidgeting about checking e-mail, visiting that favorite Web site, even though I did both just minutes ago.

He wonders:

In an era in which everyone has a truth and the means to fling it around the world, an era in which knowledge is increasingly broad but seldom deep, maybe that’s the ultimate act of sedition: to pick up a single book and read it.

I’m not sure it’s the ultimate act of sedition (it’s hard to compete with standing in front of a tank), but it does at this point seem a good deal more seditious than, say, writing a blog or dishing a tweet. Web 2.0, we may come to discover, is just the latest opiate of the masses. If Abbie Hoffman were alive and writing his book today, he’d probably title it, simply, Read This Book.

UPDATE: On the other side of the fence, Scott Rosenberg says that, despite years of web surfing, he hasn’t noticed any erosion in his ability to submerge himself in long-form writing. “When I do get the chance to sit back with a good book,” he writes, “I don’t feel any less absorbed than when I was a teenager plowing my way through a shelf of Tolstoy and Dostoyevsky.” Which goes to show (at the least) that we can expect the same kind of variations in brain function among individuals that we’d find with any other part of our anatomy. It also convinces me that, when the Singularity arrives, I want Scott’s brain uploaded into my noggin.

UPDATE: Meanwhile, Michael Agger offers a tutorial on how to write for the web (drawing on Jakob Nielsen’s research into the habits of the “selfish, lazy, and ruthless” online reader).

The multi-tasking virus

In an essay written for Tim Ferriss’s blog, Josh Waitzkin, the former chess champion who was the subject of the book and subsequent film Searching for Bobby Fischer, writes of his recent experience in returning to his alma mater, Columbia, and sitting in on a class taught by Dennis Dalton, “the most important college professor of my life.” Dalton, writes Waitzkin,

was describing the satyagraha of Mahatma Gandhi, building the discussion around the Amritsar massacre in 1919, when British colonial soldiers opened fire on 10,000 unarmed Indian men, women and children trapped in Jallianwala Bagh Garden. For 39 years, Professor Dalton has been inspiring Columbia and Barnard students with his two semester political theory series that introduces undergrads to the ideas of Gandhi, Thoreau, Mill, Malcolm X, King, Plato, Lao Tzu. His lectures are about themes, connections between disparate minds, the powerful role of the individual in shaping our world. Dalton is a life changer, and this was one of his last lectures before retirement.

But it was the audience’s reaction that left an even greater impression on Waitzkin:

Over the course of a riveting 75-minute discussion of the birth of Gandhian non-violent activism, I found myself becoming increasingly distressed as I watched students cruising Facebook, checking out the NY Times, editing photo collections, texting, reading People Magazine, shopping for jeans, dresses, sweaters, and shoes on Ebay, Urban Outfitters and J. Crew, reorganizing their social calendars, emailing on Gmail and AOL, playing solitaire, doing homework for other classes, chatting on AIM, and buying tickets on Expedia (I made a list because of my disbelief). From my perspective in the back of the room, while Dalton vividly described desperate Indian mothers throwing their children into a deep well to escape the barrage of bullets, I noticed that a girl in front of me was putting her credit card information into Urban Outfitters.com. She had finally found her shoes!

When the class was over I rode the train home heartbroken, composing a letter to the students, which Dalton distributed the next day. Then I started investigating. Unfortunately, what I observed was not an isolated incident. Classrooms across America have been overrun by the multi-tasking virus. Teachers are bereft. This is the year that Facebook has taken residence in the national classroom. Students defend this trend by citing their generation’s enhanced ability to multi-task. Unfortunately, the human mind cannot, in fact, multi-task without drastically reducing the quality of our processing.

That minds wander is not news – “wandering” may well be the default setting for our brains – but the scale and intensity of it today do seem to be something new and remarkable.

Pages and “pages”

In reading some of the comments posted online about my Atlantic piece, I kept coming across references to the article being “four pages long.” At first I wondered, “Can’t these people count? The article is six pages long!” (OK, five pages if you exclude the illustration and titling.) Then I realized – duh! – that people were referring to the online version of the article, which indeed is divided into four “pages.” (Of course, a page of text on the web is an arbitrary construct, so knowing the number of web pages doesn’t actually tell you much about the length of a piece, but that’s another story.)

Anyway, it would be interesting to do a study of how the experience of reading a particular piece of writing varies depending on whether a person reads it in print or online. A couple of people have pointed to the inclusion of hyperlinks in the web version of my article as showing the superiority of the web as a medium for writing. I don’t buy that (even though I’m well aware of the value of links). These days, I share Jon Udell’s sense of relief in reading text without links. Jon writes: “Nick Carr’s essay in the current Atlantic Monthly crystallizes a lot of what I’ve been feeling for a couple of years about how our use of the Net is changing us. Not co-incidentally I read the essay in the printed magazine whose non-hypertextuality I experienced as a feature, not a bug.” Hyperlinks have a lot of utility, but they’re distractions as well, scattering concentration and, often, getting in the way of deep reading.

Now go click on that link and read the rest of Udell’s post.