Grand Theft Attention: video games and the brain

Having recently come off a Red Dead Redemption jag, I decided, as an act of penance, to review the latest studies on the cognitive effects of video games. Because videogaming has become such a popular pastime so quickly, it has, like television before it, become a focus of psychological and neuroscientific experiments. The research has, on balance, tempered fears that video games would turn players into bug-eyed, bloody-minded droogs intent on ultraviolence. The evidence suggests that spending a lot of time playing action games – the ones in which you run around killing things before they kill you (there are lots of variations on that theme) – actually improves certain cognitive functions, such as hand-eye coordination and visual acuity, and can speed up reaction times. In retrospect, these findings shouldn’t have come as a surprise. As anyone who has ever played an action game knows, the more you play it, the better you get at it, and getting better at it requires improvements in hand-eye coordination and visual acuity. If scientists had done the same sort of studies on pinball players 50 years ago, they would have probably seen fairly similar results.

But these studies have also come to be interpreted in broader terms. Some popular-science writers draw on them as evidence that the heavy use of digital media – not just video games, but web-surfing, texting, online multitasking, and so forth – actually makes us “smarter.” The ur-text here is Steven Johnson’s 2005 book Everything Bad Is Good for You. Johnson draws on an important 2003 study, published as a letter in Nature, by University of Rochester researchers Shawn Green and Daphne Bavelier, which demonstrated that “10 days of training on an action game is sufficient to increase the capacity of visual attention, its spatial distribution and its temporal resolution.” In other words, playing an action game can help you keep track of more visual stimuli more quickly and across a broader field, and these gains may persist even after you walk away from the gaming console. Other studies, carried out both before and after the Green and Bavelier research, generally back up these findings. In his book, Johnson concluded, sweepingly, that video games “were literally making [players] perceive the world more clearly,” and he suggested that gaming research “showed no evidence of reduced attention spans compared to non-gamers.”

More recently, the New York Times blogger Nick Bilton, in his 2010 book I Live in the Future, also suggested that videogaming improves attentiveness as well as visual acuity and concluded that “the findings argue for more game playing.” The science writer Jonah Lehrer last year argued that videogaming leads to “significant improvements in performance on various cognitive tasks,” including not only “visual perception” but also “sustained attention” and even “memory.” In her forthcoming book Now You See It, Cathy N. Davidson, an English professor at Duke, devotes a chapter to video game research, celebrating a wide array of apparent cognitive benefits, particularly in the area of attentiveness. Quoting Green and Bavelier, Davidson notes, for example, that “game playing greatly increases ‘the efficiency with which attention is divided.’”

The message is clear and, for those of us with a fondness for games, reassuring: Fire up the Xbox, grab the controller, and give the old gray matter a workout. The more you play, the smarter you’ll get.

If only it were so. The fact is, such broad claims about the cognitive benefits of video games, and by extension other digital media, have always been dubious. They stretch the truth. The mental faculties of attention and memory have many different facets – neuroscientists are still a long way from sorting them all out – and to the extent that past gaming studies demonstrate improvements in these areas, they relate to gains in the kinds of attention and memory used in the fast-paced processing of a welter of visual stimuli. If you improve your ability to keep track of lots of images flying across a screen, for instance, that improvement can be described as an improvement in a type of attentiveness. And if you get better at remembering where you are in a complex fantasy world, that improvement can be described as an improvement in a sort of memory. The improvements may well be real – and that’s good news – but they’re narrow, and they come with costs. The fact that video games seem to make us more efficient at dividing our attention is great, as long as you’re doing a task that requires divided attention (like playing a video game). But if you’re actually trying to do something that demands undivided attention, you may find yourself impaired. As UCLA developmental psychologist Patricia Greenfield, one of the earliest researchers on video games, has pointed out, using media that train your brain to be good at dividing your attention appears to make you less able to carry out the kinds of deep thinking that require a calm, focused mind. Optimizing for divided attention means suboptimizing for concentrated attention.

Recent studies back up this point. They paint a darker picture of the consequences of heavy video-gaming, particularly when it comes to attentiveness. Far from making us smarter, heavy gaming seems to be associated with attention disorders in the young and, more generally, with a greater tendency toward distractedness and a reduced aptitude for maintaining one’s focus and concentration. Playing lots of video games, these studies suggest, does not improve a player’s capacity for “sustained attention,” as Lehrer and others argue. It weakens it.

In a 2010 paper published in the journal Pediatrics, Edward L. Swing and a team of Iowa State University psychologists reported on a 13-month study of the media habits of some 1,500 kids and young adults. It found that “[the] amount of time spent playing video games is associated with greater attention problems in childhood and on into adulthood.” The findings indicate that the correlation between videogaming and attention disorders is at least equal to and probably greater than the correlation between TV-viewing and those disorders. Importantly, the design of the study “rules out the possibility that the association between screen media use and attention problems is merely the result of children with attention problems being especially attracted to screen media.”

A 2009 study by a different group of Iowa State researchers, published in Psychophysiology, investigated the effects of videogaming on cognitive control, through experiments with 51 young men, both heavy gamers and light gamers. The study indicated that videogaming has little effect on “reactive” cognitive control – the ability to respond to some event after it happens. But when it comes to “proactive” cognitive control – the ability to plan and adjust one’s behavior in advance of an event or stimulus – videogaming has a significant negative effect. “The negative association between video game experience and proactive cognitive control,” the researchers write, “is interesting in the context of recent evidence demonstrating a similar correlation between video game experience and self-reported measures of attention deficits and hyperactivity. Together, these data may indicate that the video game experience is associated with a decrease in the efficiency of proactive cognitive control that supports one’s ability to maintain goal-directed action when the environment is not intrinsically engaging.” Videogamers, in other words, seem to have a difficult time staying focused on a task that doesn’t involve constant incoming stimuli. Their attention wavers.

These findings are consistent with more general studies of media multitasking. In a much-cited 2009 paper in Proceedings of the National Academy of Sciences, for example, Stanford’s Eyal Ophir, Clifford Nass, and Anthony D. Wagner show that heavy media multitaskers demonstrate significantly less cognitive control than light multitaskers. The heavy multitaskers “have greater difficulty filtering out irrelevant stimuli from their environment” and are also less able to suppress irrelevant memories from intruding on their work. The heavy multitaskers were actually less efficient at switching between tasks – in other words, they were worse at multitasking.

So should people be prevented from playing video games? Not at all (though parents should monitor and restrict young kids’ use of the games). Moderate game-playing probably isn’t going to have any significant long-term cognitive consequences, either good or bad. Video-gaming is fun and relaxing, and those are good things. Besides, people engage in all sorts of pleasant, diverting pursuits that carry risks, from rock-climbing to beer-drinking (don’t mix those two), and if we banned all of them, we’d die of boredom.

What the evidence does show is that while videogaming might make you a little better at certain jobs that demand visual acuity under stress, like piloting a jet fighter or being a surgeon, it’s not going to make you generally smarter. And if you do a whole lot of it, it may well make you more distracted and less able to sustain your attention on a single task, particularly a difficult one. More broadly, we should be highly skeptical of anyone who draws on video game studies to argue that spending a lot of time in front of a computer screen strengthens our attentiveness or our memory or even our ability to multitask. Taken as a whole, the evidence, including the videogaming evidence, suggests it has the opposite effect.

7 thoughts on “Grand Theft Attention: video games and the brain”

  1. Kevin Kelly

    In the end, isn’t Red Dead Redemption quite something? I can’t play (no twitch skills), but I spend a lot of time watching the screen as my 14yo son plays. I think it heralds something important — more than just more shooter gaming — but I have not been able to pinpoint what yet. Seeing it really rocked me.

  2. Adam

    I think a critical component that is missing from your discussion is really this –

    What kind of games are we talking about?

    The majority of videogames are heavily scripted linear things. They provide a regulated drip of stimulation that the user comes to expect.

    There are other types of videogames that we call openworld or sandbox games.

    Games like Red Dead Redemption, say. Perhaps that’s what Kevin Kelly was seeing in his “something important”.

    In openworld or sandbox games (and they wildly vary in their degree and approach) the player has a great deal of agency in what the story will even be about. Hopefully there is little or no central plot.

    Have you tried Minecraft? Very frustrating for many videogamers (“What do I do?”) and tremendously stimulating for others.

    If we need a dichotomy, think of Donkey Kong on one end of the spectrum and Minecraft on the other perhaps.

  3. Will Aft

    Nick, this post was a pleasure to read. I loved how LONG it was. I am happy to have come across this take on the “video gaming makes us smarter” conversation.

    For the record, I grew up in the era when video games transitioned from arcade machines to desktop computers. I spent hours, at times entire weekends, as an adolescent sitting in front of my Commodore 64 banging away at the keys trying to reach new levels. I was driven partly by the joys of the games and partly by the idea that with each game I played I was “saving” a quarter. I must have saved tens of thousands of dollars a year! My reflexes were awesome! My brain could process optimally!

    And then in university I read Neil Postman, who posed the question: what does it mean to be smart? His writing pushed me to reflect on intelligence, wisdom and mortality in non-computational ways.

    A line of Postman’s that is still stuck in my head two decades later is that we are under the mistaken notion that all our problems will be solved by more information. I didn’t really know what he meant back then but I sure know now.

    Not all video games are created equal. I love my Nintendo Wii but I only play physical fitness games on it. I’m saving thousands of dollars not paying a gym membership!

    What does it mean to be smart? In a world organized around faster and faster rendering of graphics and quicker and quicker responses to human input it isn’t a surprise that being smart is defined in terms of speed of response. And I suppose in a way it is. Let’s just hope that’s not the ONLY way it’s defined. I’m still lighting a torch for slow, considered reflection.

    That’s my way of cheering you on to keep writing.

  4. Brutus.wordpress.com

    Thanks for reviewing the studies again. I would love to say that the conclusion is either obvious or intuitive and all the research isn’t really necessary, but arguments are cheap these days. Dispelling bogus claims is nearly impossible, even with good evidence. There are always arguments and evidence pointing the opposite direction.

    Perhaps I’ve thrown the baby out with the bathwater, but I stopped gaming just as I stopped TV watching more than a decade ago. While both are enjoyable after a fashion, I found myself too easily forking over large chunks of time when I would be little more than a drooling idiot. (Your own gaming jag is instructive.) That’s not how I want to live.

  5. CJSmeds

    Great post!

    For me the issue is not that we are spending (too much) time playing video games or watching TV or surfing the ‘net; it is what we are giving up in order to do so. Two hours a night playing video games means I do not have that two hours a night to do other things, like reading, exercising, getting outside, socializing. I fear we are sacrificing real life for a virtual (or worse, vicarious) life.

    From a cognitive development standpoint, a kid sitting on his behind in front of a screen gets an entirely different set of stimuli than a kid playing outside or building with Legos or drawing and painting or reading or hanging out with friends. The danger is not in the glowing screens we plop our children in front of for hours at a time; it is in their missing out on real life when we do so.

    Here’s a thought: instead of plugging our kids into the TV or the Internet or a video game so we can make dinner, why don’t we engage our kids in the real world activity of having them help us make the dinner? Kids would learn real world skills: cooking basics, following directions, counting and measuring, hand-eye coordination, and the fact that food doesn’t just magically appear on your plate but that it takes time and effort. Kids and parents would get to spend time working together and in the inevitable conversations that develop. And parents would get the benefit of an extra set of hands in the kitchen to help with the dishes!

    I can’t help but think we would all be better off if we spent less time in our digital lives and more time living real lives. (As he ironically posts to his digital life…)
