32 thoughts on “Read it, if you can”

  1. Seth Finkelstein

    Good stuff Shelley (“In other words, we become less like the computers we use, as Nick presumes, and more like the rats in the lab box, pushing the lever that gives a sexual stimulus over the lever that gives food, because the short term gratification outweighs the longer term need.”)

    See also Dave Rogers:

    “I’m going to agree with Nick; and have this brief post perhaps serve as an example, though likely a bad one, for it will be brief mostly because I intend to go back to bed. But for now, I scratch where it itches, however briefly. I’m also going to agree with Shelley, and also with Seth, though not exactly the way Seth would see it.”

  2. Harald Felgner

    Excellent article, Nick, as always. Unfortunately I was not able to follow you beyond the 3rd paragraph – felt the urge to switch back to my Google Reader to not miss another important post ;)

    “I now have almost totally lost the ability to read and absorb a longish article on the web” – AGREE

    “or in print.” – DO NOT AGREE. I switch off my computer/cell phone and devour a book/magazine offline. In a street cafe, on the train, or at home staying awake until 4 AM. This is a USP for classical publishing!

  3. michael webster

    I would agree that it is easy to spend too much time chasing down links, thereby forgoing the pleasure of scholarly article or book reading.

    As with any new and radical technology, we have to adapt to having so many tempting choices.

    For some reason that is not apparent to me, it is difficult to read and follow serious work – unless it is in PDF form – on the web. I hope that changes, or perhaps it already has for more serious scholars?

  4. alan

    Brilliant take on the state of technology and the human being, Nick! Your thoughtful yet friendly writing style certainly lets the reader drink in the intended pointers. I suspect that they will lie fallow as seeds for future considerations.

    “We still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition.”

    There has been lots of research done that unequivocally demonstrates the effects of modern technology upon cognition, perception and the social abilities, or inabilities, of the growing child. The excessive viewing of TV, the use of video games and now the pervasive connectedness to cell phones and the Internet have dramatically changed the way in which young people both learn and live. The imaginative forces have been greatly atrophied, and that reality has forced the re-shaping of the educational process, among many other things.

    “In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.”

    As you so rightly state, the medium of modern life has encouraged a monumental divide between the world of nature and the human being. The loss of that connectedness to the natural world is no small thing!

    “Still, their easy assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling.”

    That we might even think that that is a possibility is evidence that we are already losing our minds. Artificial intelligence is just that, artificial!

    “So, yes, you should be skeptical of my skepticism.”

    So, no, I am not skeptical of your skepticism, in fact three cheers for an insightful article.

    Regards, Alan

  5. Shelley

    Seth, your note about “technology-positive social criticism” was also spot on, particularly since I see that it is the usual lineup of felons responding to Nick’s writing.

    I’m not sure that there is any technology-positive social criticism that will be acceptable, though. We’ve become rather primitive in our communications with each other. We have adopted filters that allow only the broadest communications through. We can’t just be neutral about a tech; we have to be either filled with loathing and hatred, or damn near asking the tech to go to bed with us. Subtle forms of criticism don’t seem to have a chance of filtering through.

    Perhaps this is also a by-product of our growing inability to focus on anything more than a few paragraphs at a time–to want everything simplified into sound and thought bites.

  6. Oran Kelley

    Have you or the other attention-challenged folks in your article considered that you may be burned out?

    Or that it’s just time to move on to some other way of making a living?

    Not that you are burned out on writing–perhaps you are just burned out on the work & thought that ought to go into it, and since you are pretty well situated, you can get away with not doing it.

    There’s something pretty persistently shallow in the Atlantic article. For instance, the way it keeps skipping from topic to topic–Google, how we read, Nietzsche’s typewriter, Taylorism, clocks–without giving any real serious consideration to them.

    For one thing, you can’t even demonstrate that the phenomenon you are talking about actually exists, only that it could, in principle, exist. Your study of Internet reading habits doesn’t do much. We have to ask questions like “What would these people have been doing with their time in 1984? Reading something in the New York Review, or watching TV?” It could be that the skimming they’re doing now is a far more challenging intellectual activity than what they’d be doing absent the Internet.

    We read differently on the web? Sure. No surprise there.

    What we ought to try to do, though, is keep writing and researching well for print, rather than writing glorified blog posts which aren’t really a whole lot more than response bait.

  7. Edward Vielmetti

    Do you know the condition “search engine dependency syndrome”?

    http://vielmetti.typepad.com/vacuum/2005/06/search_engine_d.html

    from RISKS Digest

    Date: Mon, 9 May 2005 15:54:20 +0100

    From: “Peter G. Neumann”

    Subject: Search Engine Dependence Syndrome

    “We have allowed concepts from information technology to enter the cognitive consciousness of physicians without critical analysis of their impact.”

    Steven Merahn, MD, identifies Search Engine Dependence Syndrome as a neuropsychological disorder:

    1. The assumption/perception that computers are “smart”

    2. The task interference associated with competing problem-solving paradigms

    3. The loss or lack of development of critical thinking skills that comes with prolonged reliance on IT infrastructure

    http://www.cliniscience.com/objects/Cliniscience%20TEPR.pdf

  8. Kevin Arthur

    Excellent article Nick.

    Seth, to nitpick a little on terms: “technology-positive criticism” is too loaded a term (and may be putting the cart before the horse). I like “technology-realist” or “technorealist” better (to borrow a term from the short-lived project under that banner from about 10 years ago — see http://www.technorealism.org).

  9. Mas Nicolas

    Great article! It’s amazing how you describe it. I was raised without TV, just books, and I could read a lot. Now, as an entrepreneur and engineer, I am nearly drowning in the information flow of the Internet. And I am ashamed I had to make an effort to concentrate and read all your four pages.

  10. Carpe Medium

    I’m a graduate student at the Missouri School of Journalism, but I think I can even speak on behalf of the professors here when I say that our curriculum — as well as that of countless other J-schools — is in shambles. Any academic administrator who says it isn’t is kidding himself.

    The reason, quite simply, is that no one can predict what the future of journalism will look like, for exactly the reasons you describe in your excellent, why-didn’t-I-write-that-first piece in the Atlantic. At 28, I am a first-hand witness to the fact that hardly anyone under the age of 30 has the attention span (or, dare I say, the intellectual curiosity about the world) to eat breakfast in the company of a print newspaper, let alone approach meaty periodicals like the Atlantic, Harper’s, or the New Yorker. Why, then, are my peers and I in training to write for an audience that might not exist in 20 or 30 years? I’m a good example: I began my master’s program here with a concentration in magazine writing, but after one of the more progressive professors began affectionately referring to me as an anachronism, I decided that for the sake of my career, I had better branch out into the unknown: online media. (Whatever that means.)

    The result is my fledgling website (carpemedium.net), where I’m hoping to introduce the 22-32 demographic to meaty stories by way of zippy text and an age-appropriate sensibility. Hopefully, they’ll go on to explore the topic further, by means of the hyperlinks sprinkled throughout.

    Unfortunately, an unintended consequence has emerged: the ego of the writer — i.e., his or her sparkling byline perched above impressive blocks of text — comes second in the name of giving an important story the exposure it deserves.

    The site is less than a month old, so I haven’t yet figured out whether I’m wasting my time. Even if it succeeds, though, there’s still an underlying question that should disturb us all: with so few dedicated young readers out there, who will be our next great journalists?

  11. Guillaume Roques

    I’m wondering if it’s not finally just some kind of mind’s Darwinism, a sort of natural selection (of brains) where advancement in technology supplies the means for a few of us to think differently, to invent new paths towards the next singularity… Indeed, what is invention if not the ability to differentiate yourself from the herd? Big inventions reshaped the way we were thinking and our understanding of the world. It’s probably also because writing was made broadly available that a few people started to think beyond the “pensée unique” of that time. It’s probably because printing was made “broadly” available that some people started to build on top of others and surpass the average thinking to bring us (the herd) to the next level of knowledge. “As we come to rely on computers to mediate our understanding of the world, it is our own (average) intelligence that flattens into artificial intelligence.” Yes, that’s absolutely true, and that’s why some new intelligence will emerge from the crowd to think out of the box and help us move to the next stage of humanity/intelligence. Progressively, as usual, we will then decline (again) as we reach the critical mass of knowledge requiring new thinkers to break the wall of thought!

  12. grizzly marmot

    For the reader (like yourself), when new technology is added to the mix it will disturb the way you read. For the writer (like Nietzsche), it disturbs the writing. For the person whose job is to interact with other people (most of the service industries – lawyers, doctors, etc.), it introduces depersonalization.

  13. michael webster

    The negative reaction to the Atlantic article by various commentators is positive evidence for the thesis.

    A very young person, who had not seen the 7th grade, might understandably focus only on the title and then, on that limited basis, deride the conclusion.

    Doing so would provide evidence of that person’s inability to read.

    Is Mr. Carr’s overall thesis correct? We don’t know, and won’t for some time based on the examples he has cited for us of similar technological change.

    But, has there ever been a technological change that has been completely benign?

    It is certainly right to worry about how this marvelous new social connection may nonetheless rob us of something equally or more precious – a value not yet realized.

    Who would want to stop that enquiry and for what reason?

  14. Jon Garfunkel

    re:

    “A very young person, who had not seen the 7th grade, might understandably focus only on the title and then, on that limited basis, deride the conclusion.”

    Well, it’s a provocative title, but it’s important to have it concisely capture the theme of the essay. E.g., if you call your work “On the Origin of Species” you better deliver the goods (and Mr. Darwin did).

    Nick– how confident were you with the title? Perhaps you paused to think about titling it “Is blogging making us stupid” — and then fretted about the much-larger knee-jerk backlash you’d face?

    Jon

  15. Oran Kelley

    “A very young person, who had not seen the 7th grade, might understandably focus only on the title and then, on that limited basis, deride the conclusion.”

    “Blogging is making us stupid.”

    Personally, I derided the article because it doesn’t provide a whole lot of evidence for anything aside from the fact that technology changes things. And if you give it a moment’s thought, that’s pretty much why we spend time and effort on it. That technology has unintended and unpredictable consequences should come as no surprise to anyone, either.

    So we are left with the actual thesis of the article, for which I find very little evidence in support. The fact that the financial model of journalism is changing means little vis-a-vis how we read.

    I’ve studied all sorts of news and essayistic journalism going back to the 17th century, and the vast majority of it is NOT meaty. People have never read very much meaty journalism. Full Stop. For a great many people over the history of journalistic publication “meaty journalism” would have been an oxymoron.

    This article and many of the comments seem to me to reflect a mindless apocalypticism, with people crying about how “everything has changed.” Like when Internet stocks went through the roof. And like after 9/11.

    Some things have changed because of the new medium. It takes a disciplined mind to tease out what they are.

    A lot of people don’t have the patience to research or think about this sort of topic thoroughly. They really ought to find something else to write about, but there are plenty more where they came from to read at least some of their empty speculations, I suppose.

    But they, like the poor, have always been with us.

  16. Jake Kaldenbaugh

    I tried to read it by scanning the first few sentences of each paragraph, stopping only to read the paragraph about the research indicating…what did it indicate again? I forget.

    Anyway, I abandoned the reading when I figured out that I’d have to click through several more pages…

  17. Nick Carr

    “Nick– how confident were you with the title?”

    The title came from The Atlantic’s editors (as is usually the case with magazine articles). My original working title was simply “Think Fast.” But I can see that that wouldn’t have been much of a grabber.

  18. michael webster

    @Oran Kelley, who wrote:

    “Personally, I derided the article because it doesn’t provide a whole lot of evidence for anything aside from the fact that technology changes things.

    And if you give it a moment’s thought, that’s pretty much why we spend time and effort on it.

    That technology has unintended and unpredictable consequences should come as no surprise to anyone, either.”

    Oran, my take on Nick’s article -apart from the silly title- is three-fold.

    1. There is a substantial number of people claiming that they find it more difficult to concentrate on books. I find this phenomenon not surprising. Are they right, or merely loud? Don’t know, but worth finding out.

    2. The speed at which we can find “new” things interferes with concentration. For some reason, many people have recommended writing in short sentences for blogs. There appears to be something to this. Why? We don’t know yet. Worthwhile finding out.

    3. Finally, do we have to learn a different method of concentration when “following the thread in the web,” and might this skill be at odds with our previous reading abilities – which are not natural in the way that language acquisition is? Don’t know, but worthwhile figuring out.

    Nick has done a nice job of flagging these issues without being a wanker about it – unlike some of the 7-year-olds who have complained that “Google hasn’t made them stupid.” And to them, I can only attribute some fundamental reason for their stupidity – they are dumb, neither reading nor comprehending what they do read.

  19. Don B

    Congrats on the article.

    My first thought was to the reaction I observed in so many executives when they saw your HBR article, IT Doesn’t Matter — a rush to react without a deep read of the content. I now chuckle that the people at The Atlantic titled this gem.

    Maryanne Wolf’s ‘Proust and the Squid’ is wonderful, and is but one of many contemporary neuroscience works shedding light on our brain functions. I believe there to be tremendous upside in designing our businesses, IT, web, whatever, to work better with the way our heads work. Yet it is OK to mourn the loss of deep reading as a skill in our society.

    I am half way through The Big Switch. Keep ’em coming.

  20. Kevin Kelly

    >Maybe I’m just a worrywart.

    Nick, I found that was the best explanation. You did such a fine job of rounding up all the other worrywarts in history that you convinced me you belong in their august company. (Isn’t that what you were trying to do?)

    Also, you introduced me to the term “pancake people,” for which I will be forever grateful.

    What happens if Google is making us dumber without Google but absolutely smarter with it? Do you go or stay?

  21. Oran Kelley

    Jake Kaldenbaugh:

    I mentioned the study in my first post. I did in fact read that paragraph. And I actually thought for a minute about what the study showed:

    “It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”

    Does this mean that the Internet degrades our ability to read in other fashions? No. Does this study give us any reason to think that the average subject might read better now than a similar subject might have in, say, 1984? No. All it shows is that people read differently in the electronic medium: that they use those hyperlinks to go forward and rarely go back. As someone who surfed the Gopher web using Lynx, this all seems unsurprising and, if I am not mistaken, marketers long ago made observations along these same lines.

    So this is pretty much scientific confirmation of a commonplace: not useless, but it doesn’t really support the hypothesis of the article. And it’s not like the push for short sentences, short paragraphs, and short articles came out of nowhere: go to the library and check out the New York Post from 20 years ago or so and try to imagine what the city editor used to shout at his writers.

    As to the testimonials: perhaps Mr. Carr should see if he can get an article on alien abduction or vaccines causing autism or AIDS printed, and we’ll see how many 100% sincere testimonials he gets. In the absence of other evidence, they aren’t worth much.

    Like the rest of us, Messrs Carr, Sullivan et al. are getting older, and for them, maybe getting older means short attention spans. Since this is their first and only time through this race, I don’t think they necessarily know what causes their symptoms.

    Though the Internet is a convenient thing to blame. I wonder if there was mercury in their vaccines?

Comments are closed.