Steven Pinker and the Internet

As someone who has enjoyed and learned a lot from Steven Pinker’s books about language and cognition, I was disappointed to see the Harvard psychologist write, in Friday’s New York Times, a cursory op-ed column about people’s very real concerns over the Internet’s influence on their minds and their intellectual lives. Pinker seems to dismiss out of hand the evidence indicating that our intensifying use of the Net and related digital media may be reducing the depth and rigor of our thoughts. He goes so far as to assert that such media “are the only things that will keep us smart.” And yet the evidence he offers to support his sweeping claim consists largely of opinions and anecdotes, along with one very good Woody Allen joke.

One thing that didn’t surprise me was Pinker’s attempt to downplay the importance of neuroplasticity. While he acknowledges that our brains adapt to shifts in the environment, including (one infers) our use of media and other tools, he implies that we need not concern ourselves with the effects of those adaptations. Because all sorts of things influence the brain, he oddly argues, we don’t have to care about how any one thing influences the brain. Pinker, it’s important to point out, has an axe to grind here. The growing body of research on the adult brain’s remarkable ability to adapt, even at the cellular level, to changing circumstances and new experiences poses a challenge to Pinker’s faith in evolutionary psychology and behavioral genetics. The more adaptable the brain is, the less we’re merely playing out ancient patterns of behavior imposed on us by our genetic heritage.

In Adapting Minds, his epic critique of the popular brand of evolutionary psychology espoused by Pinker and others, David J. Buller argues that evolution “has not designed a brain that consists of numerous prefabricated adaptations,” as Pinker has suggested, but rather one that is able “to adapt to local environmental demands throughout the lifetime of an individual, and sometimes within a period of days, by forming specialized structures to deal with those demands.” To understand the development of human thought, and the influence of outside influences on that thought, we need to take into account both the fundamental genetic wiring of the brain – what Pinker calls its “basic information-processing capacities” – and the way our genetic makeup allows for ongoing changes in that wiring.

On the topic of neuroplasticity, Pinker claims to speak for all brain scientists. When confronted with suggestions that “experience can change the brain,” he writes, “cognitive neuroscientists roll their eyes.” I’m wary when any scientist suggests that his view of a controversial matter is shared by all his colleagues. I also wonder if Pinker read the reports on the Net’s cognitive effects published in the Times last week, in which several leading brain researchers offer views that conflict with his own. A few examples:

“The technology is rewiring our brains,” said Nora Volkow, director of the National Institute on Drug Abuse and one of the world’s leading brain scientists …

The nonstop interactivity is one of the most significant shifts ever in the human environment, said Adam Gazzaley, a neuroscientist at the University of California, San Francisco. “We are exposing our brains to an environment and asking them to do things we weren’t necessarily evolved to do,” he said. “We know already there are consequences” …

[Stanford professor Clifford] Nass says the Stanford studies [of media multitasking] are important because they show multitasking’s lingering [cognitive] effects: “The scary part for guys like Kord is, they can’t shut off their multitasking tendencies when they’re not multitasking.”

In a brief essay published last week on the Times website, Russell A. Poldrack, the director of the Imaging Research Center and professor of psychology and neurobiology at the University of Texas at Austin, wrote: “Our research has shown that multitasking can have an insidious effect on learning, changing the brain systems that are involved so that even if one can learn while multitasking, the nature of that learning is altered to be less flexible. This effect is of particular concern given the increasing use of devices by children during studying.”

Other scholars of the mind also believe, or at least worry, that our use of digital media is having a deep, and not necessarily beneficial, influence on our ways of thinking. The distinguished neuroscientist Michael Merzenich, who has been studying the adaptability of primate brains since the late 1960s, believes that human brains are being significantly “remodeled” by our use of the Net and other modern media. Maryanne Wolf, a developmental psychologist at Tufts, fears that the shift from immersive page-based reading to distracted screen-based reading may impede the development of the specialized neural circuits that make deep, richly interpretive reading possible. We may turn into mere “decoders” of text.

Pinker may well disagree with all these views, but to pretend they don’t exist is misleading.

Pinker also pokes at straw men. Instead of grappling with the arguments of others, he reduces them to caricatures in order to dismiss them. He writes, for example, that “the existence of neural plasticity does not mean the brain is a blob of clay pounded into shape by experience.” Who exactly does Pinker believe is proposing such an idea – John Locke? I haven’t seen anyone suggest that the brain is a shapeless blob of clay. What they are saying is that the brain, while obviously as much a product of evolution as any other part of the body, is not genetically locked into rigid modes of thought and behavior. Changes in our habits of thought echo through our neural pathways, for better and for worse.

In other cases, Pinker uses overstatement to gloss over subtleties. He writes at one point, “If electronic media were hazardous to intelligence, the quality of science would be plummeting.” Human intelligence takes many forms. Electronic media may enhance some aspects of our intelligence (the ability to spot patterns in arrays of visual data, for example, or to discover pertinent facts quickly or to collaborate at a distance) while at the same time eroding others (the ability to reflect on our experiences, say, or to express ourselves in subtle language or to read complex narratives critically). To claim that “intelligence” can be gauged by a single measure is to obfuscate rather than illuminate.

Pinker notes that “the decades of television, transistor radios and rock videos were also decades in which I.Q. scores rose continuously.” Actually, as the political scientist James Flynn first documented, general IQ scores have been rising at a steady clip since the beginning of the 1900s, so we should be wary about linking this long-term trend to the recent popularity of any particular technology or medium. Moreover, as Flynn himself has been careful to point out, the improvements in IQ scores are largely attributable to increases in measures of visual acuity and abstract problem-solving, such as the mental rotation of geometric forms, the identification of similarities between disparate objects, and the arrangement of shapes into logical sequences. These skills are certainly very important, but measures of other components of intelligence, including verbal skill, vocabulary, basic arithmetic, memorization, critical reading, and general knowledge, have been stagnant or declining. In warning against drawing overly broad conclusions about our intelligence from the rise in IQ scores, Flynn wrote, in his book What Is Intelligence?, “How can people get more intelligent and have no larger vocabularies, no larger stores of general information, no greater ability to solve arithmetical problems?”

Drifting briefly from science to the humanities, Pinker implies that our cultural life is richer than ever, a consequence, apparently, of the bounties of digital media. As evidence, he points to the number of stories appearing on the website Arts & Letters Daily. Suffice it to say that other indicators of the depth and richness of cultural life point in different directions.

Pinker also makes several observations that, while accurate, undercut the main thrust of his argument. He writes, for example, that “the effects of experience are highly specific to the experiences themselves. If you train people to do one thing (recognize shapes, solve math puzzles, find hidden words), they get better at doing that thing, but almost nothing else.” Well, yes, and that’s why some of us are deeply concerned about society’s ever-increasing devotion to the Net and other screen-based media. (The average American now spends more than eight hours a day peering into screens, while devoting only about 20 minutes to reading books and other printed works.) It’s hard not to conclude, or at least suspect, that we are narrowing the scope of our intellectual experiences. We’re training ourselves, through repetition, to be facile skimmers, scanners, and message-processors – important skills, to be sure – but, perpetually distracted and interrupted, we’re not training ourselves in the quieter, more attentive modes of thought: contemplation, reflection, introspection, deep reading, and so forth.

And there’s this: “Genuine multitasking, too, has been exposed as a myth, not just by laboratory studies but by the familiar sight of an S.U.V. undulating between lanes as the driver cuts deals on his cellphone.” Precisely so. Which is one of the reasons that many experts on multitasking are concerned about its increasing prevalence. People may think, as they juggle emails, texts, tweets, updates, Google searches, glances at web pages, and various other media tasks, that they’re adeptly doing a lot of stuff all at once, but what they’re really doing is switching constantly between different tasks, and suffering the cognitive costs that accompany such switching. As Steven Yantis, a professor of psychological and brain sciences at Johns Hopkins, told the Times:

In addition to the switch cost, each time you switch away from a task and back again, you have to recall where you were in that task, what you were thinking about. If the tasks are complex, you may well forget some aspect of what you were thinking about before you switched away, which may require you to revisit some aspect of the task you had already solved (for example, you may have to re-read the last paragraph you’d been reading). Deep thinking about a complex topic can become nearly impossible.

The fact that people who fiddle with cell phones drive poorly shouldn’t make us less concerned about the cognitive effects of media distractions; it should make us more concerned.

And then there’s this: “It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people.” Exactly. And that’s another cause for concern. Our most valuable mental habits – the habits of deep and focused thought – must be learned, and the way we learn them is by practicing them, regularly and attentively. And that’s what our continuously connected, constantly distracted lives are stealing from us: the encouragement and the opportunity to practice reflection, introspection, and other contemplative modes of thought. Even formal research is increasingly taking the form of “power browsing,” according to a 2008 University College London study, rather than attentive and thorough study. Patricia Greenfield, a professor of developmental psychology at UCLA, warned in a Science article last year that our growing use of screen-based media appears to be weakening our “higher-order cognitive processes,” including “abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination.”

We should all celebrate, along with Pinker, the many benefits that the Net and related media have brought us. I have certainly enjoyed those benefits myself over the last two decades. And we should heed his advice to look for “strategies of self-control” to ameliorate the distracting and addictive qualities of those media. But we should not share Pinker’s complacency when it comes to the Net’s ill effects, and we should certainly not ignore the mounting evidence of those effects.

21 thoughts on “Steven Pinker and the Internet”

  1. dougiedd

    The very fact that Pinker is the go-to guru for these sorts of questions suggests that the Internet has already worked its “dumb-down” charms – not that further proof was needed

  2. Peter B. Reiner

    Precisely so.

    Pinker’s comments about Science thriving, offered as an indicator that there is no problem, are just plain silly. A somewhat tortured analogy that I thought of was this. Imagine WWI. Thousands of soldiers are dying daily, but the line still advances 200 feet each day. From the General’s point of view, things are just hunky-dory, and victory is at hand. But the cannon fodder have a different perspective. It is easy to be sanguine when one is an endowed Professor at Harvard; out in the real world, matters seem a bit different.

  3. Todd I. Stark

    Nicely done, and I largely agree. Pinker’s nativism is well expressed but his implicit reliance on it for the purpose of this argument is misplaced I think. It misses the point, which is whether we should be looking for specific effects to help guide the use of technology and support the educational values we share. Calling all critiques “moral panics” is less than I expect from as good a scholar as Pinker.

    My own thoughts were very similar.

  4. Seth Finkelstein

    Regarding “Pinker, it’s important to point out, has an axe to grind here.”

    Ummm …. [not worth it :-(]

    Nick, the only people who are going to have both the platform and the inclination to produce a detailed rebuttal are those who have a punditry power-base in some area. That means you can always sneer at the excesses of that base. This is what I mean by two wrongs don’t make a right.

  5. Krishnan

    Don’t you think Pinker could be sticking to his argument because he is a Libertarian? If he concedes the environment’s effects on the brain here, his whole Libertarian ideology breaks down real fast. That could be a reason why Pinker doesn’t want to yield his position.

  6. Eric Schultz

    This was WAY too long to read. Only kidding. :) Great post, Nick. I believe this stuff is real and happening in real time, based on both the evidence presented in “The Shallows” and by the fact that I sometimes look up from my desk, paper, screen, phone and knock on the door and feel like I am in an MTV video. People are quick to dismiss it because there’s a lot of good that comes with it. A different take just appeared on TED: http://blog.ted.com/2010/06/phillip_zimbard.php?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed:+TEDBlog+(TEDBlog). Keep writing!

  7. Pbouzide

    I think the truth is likely to be somewhere between the genetically determined brain point of view and the one that says that the technologically-mediated brain is a (mainly?) harmful trend.

    I accept qualitatively that there’s some validity to the plasticity argument as someone who grew up in the television age. I also accept (as a software development professional) the costs of task switching in human as well as electronic information processing. The distracted driver example is apt indeed and I don’t doubt that hyperlinked media is a novel distracting factor here.

    I don’t, however, quite buy that “screen-based” media consumption means an across-the-board reduction in verbal skill (including vocabulary and nuance). After all, televised and cinematic “content” – even with the jump cuts and breaks for ads – isn’t much without fairly sophisticated (albeit lowbrow) verbal banter. And the deeper sort of immersive verbal/textual experience is indeed available on the web for those few who take the time to read deeply. Your posts on this matter, Nick, as well as Clay Shirky’s, are an existence proof.

    But hasn’t it always been a few who choose to think “non-shallowly”, book or screen, past or present?

  8. Sprague Dawley

    Although Pinker’s comments weren’t aimed directly at you, I suspected you would respond and am glad that you did. As you say, he has a bias, which makes his critique somewhat foreordained. He also has a style that is consistently dismissive of ideological opponents, which is rude and, frankly, anti-intellectual. His broad view from an evolutionary perspective is necessary, but it is simply arrogant to wave away questions of short-term effects.

    Though I know it’s wasted digibits to share a blog link, I once wrote a piece reviewing an article on intelligence that he wrote for The New Republic, along with his thoughts on gender, that called out exactly the stylistic methods he relies upon (arguing with himself, setting up straw men, etc.) that you write about here. Good on you.

    http://www.ratdiary.com/2006/06/29/the-search-for-a-genetic-grail/

  9. Kelly Roberts

    I have to agree with Peter here. Oh, to view the wide world from behind the protective lenses of the ivory tower! The key to that op-ed, I think, is here:

    “And to encourage intellectual depth, don’t rail at PowerPoint or Google. It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people. They must be acquired in special institutions, which we call universities, and maintained with constant upkeep, which we call analysis, criticism and debate.”

    I agree with him that we have to blame the users instead of the tools, but only an academic would deny at this point that these particular tools present a danger to literary culture. I don’t think he really believes that, actually. It’s not that he doubts our general fragmentation of attention; it’s that he doesn’t care. As long as the wealthier among us keep attending universities (the only means of acquiring an education, apparently) and learning how to “develop strategies of self-control”, he’s happy.

  10. Todd I. Stark

    Seth: well said, I take your point to heart.

    Eric: very interesting to refer to Zimbardo’s talk in this context, thanks

    Gil: that is a very graceful and engaging essay! I’ve been reading both sides of the EP issue with interest since The Adapted Mind first came out and your position on this is one of the more interesting and unique I’ve come across. I think you catch the core issue very well, that an engine that imbues computation with symbols that have meaning renders independent computational modules largely irrelevant for some things. That also speaks to the split between Pinker and Deacon.

    Pbouzide: well said! I agree with you on roughly where the truth most likely lies, and on our capacity to use the web technology in an “immersive” or reflective way under proper conditions with the right motivation and skills. Deep reading is not easy, deep web reading may be even harder, but both may well be possible under the right conditions.

    Sprague: I agree with you on Pinker. I admire Steve Pinker for his passionate support and development of the scientific study of mind, but to me he does have a rhetorical style that is somewhat at odds with the values of good scholarship.

    Thanks to all for their thoughts here, I’m learning more from this than I initially expected because of all the lively participation.

  11. dr gribbins

    @Todd Stark

    “Thanks to all for their thoughts here, I’m learning more from this than I initially expected because of all the lively participation.”

    But you cannot be learning!! Haven’t u heard?? The Internet makes us dumb!!

  12. Justin McGregor

    Although I deeply admire and thoroughly enjoy reading the works of Steven Pinker, I think I have to agree with you here that he has stepped out of his realm of expertise. Technological advances in media have come at a cost and, contrary to what Pinker believes, we should acknowledge, rather than flatly deny, the potentially adverse effects of various forms of media. Having said that, I don’t think the level of concern you express in the above essay is warranted.

    Like all other technological inventions introduced to and readily consumed by society, the Internet has come with benefits and costs. Personally, I think that the benefits brought about by the Internet – e.g., the ease with which information can be disseminated – outweigh the potential costs, such as decreased attention span. Moreover, I’m not quite sure I believe some of the so-called science that you refer to in support of your arguments. There is ample ‘scientific’ evidence suggesting that violent video games increase aggression in players, but, like Pinker, I’m skeptical of the implications of this finding for actual behaviour.

  13. Kruunch

    I can see both sides of the debate here but I have to lean towards Pinker’s view point (if not for exactly the same reasons).

    As a technologically progressive society (meaning those of us who are reading this), the sheer amount of data needed to stay abreast of today’s level of technology, much less to expand upon it, demands more efficient ways of processing that data.

    Which leads me to my next point: the way in which we are learning to process more data is an entirely naturally evolving phenomenon. How could it be anything else?

    Finally, Carr says that we shouldn’t stop the technological progress of the Internet and mass media but rather stand “guard” against the apparent dangers of it changing (for the worse) our ways of processing information.

    How does one do that exactly?

  14. Kruunch

    One more point I would like to make: I do agree with Carr that we will lose (and are losing) some of the things that he has mentioned (i.e., verbal/written nuance, forms of self-expression, etc.).

    Or, more appropriately, these are changing and have been changing since we’ve been cognitive as a species.

    Just as there were those who noticed the loss of penmanship with the invention of the ballpoint pen and the typewriter, Carr is noticing the loss of more protracted forms of discourse and learning.

    I would contend that this is the natural evolution of the higher learning we as a species have developed.

    I would also contend that deep thinking and higher reasoning are natural to us, as we haven’t always had institutions to introduce them (in this case, the egg definitely came before the chicken).

  15. Les Posen

    “And to encourage intellectual depth, don’t rail at PowerPoint or Google. It’s not as if habits of deep reflection, thorough research and rigorous reasoning ever came naturally to people.”

    Have you seen a Pinker PowerPoint? Search for one in Google, no less, and be prepared to be dismayed at what passes for knowledge transfer.

  16. Drkeyneswasright.blogspot.com

    The key observation: multi-tasking is a myth. What’s truly weird is the plain fact that computer folks have long known it is a myth even for “multi-tasking” computers. A uni-processor machine never executes more than one instruction at once (recent multi-core machines are a different issue, with different and more difficult problems); the illusion of doing more than one task in a given time comes from the wait time in a computation – while a program waits for disk drives and such, the processor can run something else.

    We humans don’t have disk drives; we have only CPU and memory. For machines like that, human or silicon, multi-tasking has no “down time” to exploit.

  17. Neuroconscience

    Hi Nick,

    As you know from our discussion at my blog, I’m not really a fan of the extreme views given by either you or Pinker. However, I applaud the thorough rebuttal you’ve given here to Steven’s poorly researched response. As someone doing my PhD in neuroplasticity and cognitive technology, it absolutely infuriated me to see Steven completely handwave away a decade of solid research showing generalizable cognitive gains from various forms of media practice. To simply ignore findings from, for example, the Bavelier lab that demonstrate reliable and highly generalizable cognitive and visual gains and plasticity is to border on the unethically dogmatic.

    Pinker isn’t well known for being flexible within cognitive science, however; he’s probably the only person even more dogmatic about nativist modularism than Fodor. Unfortunately, Steven enjoys a large public following, and his work has really been embraced by the anti-religion ‘brights’ movement. While on some levels I appreciate this movement’s desire to promote rationality, I cringe at how great scholars like Dennett and Pinker seem totally unwilling to engage with the expanding body of research that casts a great deal of doubt on the 1980s-era cogsci they built their careers on.

    So I give you kudos there. I close, as usual, by saying that you’re presenting a ‘sexy’ and somewhat sensationalistic account that, while sure to sell books and generate controversy, is probably based more in moral panic than sound theory. I have no doubt that the evidence you’ve marshaled demonstrates the cognitive potency of new media. Further, I’m sure you are aware of the heavy-media-multitasking paper demonstrating a drop in executive functioning in HMMs.

    However, you neglect in the posts I’ve seen to emphasize what those authors clearly did: that these findings are not likely to represent a true loss of function but rather are indicators of a shift in cognitive style. Your unwillingness to declare the normative element in your thesis regarding ‘deep thought’ is almost as chilling as Pinker’s total refusal to acknowledge the growing body of plasticity research. Simply put, I think you are aware that you’ve conflated executive processing with ‘deep thinking’, and are not really making the case that we know to be true.

    Media is a tool like any other. Its outcome measures are completely dependent on how we use it and on our individual differences. You could make this case quite well with your evidence, but you seem to embrace the moral panic surrounding your work. It’s obvious that certain patterns, including the ones probably driving your collected research, will play on our plasticity to create cognitive differences. Plasticity is limited, however, and you really don’t engage with the most common finding across the mental-training literature: balance and trade-off.

  18. Gosia Stergios

    A few reflections occasioned by reading “The Shallows” and two other books: “The Other Brain” by R. Douglas Fields and “Out of Our Heads” by Alva Noe.

    1/ The main argument of your book presupposes that brain and mind/consciousness are the same. How have you decided to settle one of the most difficult and unresolved debates in the history of western thought about the relationship between the brain and the mind?

    2/ Brain plasticity is not just neuron-based but also glia-based. The latter especially is far from understood by neuroscience, according to Douglas Fields, yet it may have enormous implications for our understanding of memory and learning…

  19. Juan Jaramillo

    It is a mistake to measure (or to value) the intelligence of humanity by measuring the intelligence of individual humans. One may accept the deconstructive effects of the Net on our individual cognition, but it is not straightforward to see this as a negative process. Highly specialized societies have many advantages over their counterparts. We should focus not so much on the nodes as on the way they interact with each other.

    PS. Much of the time we have spent in “deep thoughts” has been spent reinventing the wheel.

  20. PandaPanda

    Word count.

    Steven Pinker’s NYT Column: 801

    Carr’s Blog post: 2119

    Irony anyone?
