Having recently come off a Red Dead Redemption jag, I decided, as an act of penance, to review the latest studies on the cognitive effects of video games. Because videogaming has become such a popular pastime so quickly, it has, like television before it, become a focus of psychological and neuroscientific experiments. The research has, on balance, tempered fears that video games would turn players into bug-eyed, bloody-minded droogs intent on ultraviolence. The evidence suggests that spending a lot of time playing action games – the ones in which you run around killing things before they kill you (there are lots of variations on that theme) – actually improves certain cognitive functions, such as hand-eye coordination and visual acuity, and can speed up reaction times. In retrospect, these findings shouldn’t have come as a surprise. As anyone who has ever played an action game knows, the more you play it, the better you get at it, and getting better at it requires improvements in hand-eye coordination and visual acuity. If scientists had done the same sort of studies on pinball players 50 years ago, they would have probably seen fairly similar results.
But these studies have also come to be interpreted in broader terms. Some popular-science writers draw on them as evidence that the heavy use of digital media – not just video games, but web-surfing, texting, online multitasking, and so forth – actually makes us “smarter.” The ur-text here is Steven Johnson’s 2005 book Everything Bad Is Good for You. Johnson draws on an important 2003 study, published as a letter to the journal Nature, by University of Rochester researchers Shawn Green and Daphne Bavelier, which demonstrated that “10 days of training on an action game is sufficient to increase the capacity of visual attention, its spatial distribution and its temporal resolution.” In other words, playing an action game can help you keep track of more visual stimuli more quickly and across a broader field, and these gains may persist even after you walk away from the gaming console. Other studies, carried out both before and after the Green and Bavelier research, generally back up these findings. In his book, Johnson concluded, sweepingly, that video games “were literally making [players] perceive the world more clearly,” and he suggested that gaming research “showed no evidence of reduced attention spans compared to non-gamers.”
More recently, the New York Times blogger Nick Bilton, in his 2010 book I Live in the Future, also suggested that videogaming improves attentiveness as well as visual acuity and concluded that “the findings argue for more game playing.” The science writer Jonah Lehrer last year argued that videogaming leads to “significant improvements in performance on various cognitive tasks,” including not only “visual perception” but also “sustained attention” and even “memory.” In her forthcoming book Now You See It, Cathy N. Davidson, an English professor at Duke, devotes a chapter to video game research, celebrating a wide array of apparent cognitive benefits, particularly in the area of attentiveness. Quoting Green and Bavelier, Davidson notes, for example, that “game playing greatly increases ‘the efficiency with which attention is divided.’”
The message is clear and, for those of us with a fondness for games, reassuring: Fire up the Xbox, grab the controller, and give the old gray matter a workout. The more you play, the smarter you’ll get.
If only it were so. The fact is, such broad claims about the cognitive benefits of video games, and by extension other digital media, have always been dubious. They stretch the truth. The mental faculties of attention and memory have many different facets – neuroscientists are still a long way from fully mapping them – and to the extent that past gaming studies demonstrate improvements in these areas, they relate to gains in the kinds of attention and memory used in the fast-paced processing of a welter of visual stimuli. If you improve your ability to keep track of lots of images flying across a screen, for instance, that improvement can be described as an improvement in a type of attentiveness. And if you get better at remembering where you are in a complex fantasy world, that improvement can be described as an improvement in a sort of memory. The improvements may well be real – and that’s good news – but they’re narrow, and they come with costs. The fact that video games seem to make us more efficient at dividing our attention is great, as long as you’re doing a task that requires divided attention (like playing a video game). But if you’re actually trying to do something that demands undivided attention, you may find yourself impaired. As UCLA developmental psychologist Patricia Greenfield, one of the earliest researchers on video games, has pointed out, using media that train your brain to be good at dividing your attention appears to make you less able to carry out the kinds of deep thinking that require a calm, focused mind. Optimizing for divided attention means suboptimizing for concentrated attention.
Recent studies back up this point. They paint a darker picture of the consequences of heavy video-gaming, particularly when it comes to attentiveness. Far from making us smarter, heavy gaming seems to be associated with attention disorders in the young and, more generally, with a greater tendency toward distractedness and a reduced aptitude for maintaining one’s focus and concentration. Playing lots of video games, these studies suggest, does not improve a player’s capacity for “sustained attention,” as Lehrer and others argue. It weakens it.
In a 2010 paper published in the journal Pediatrics, Edward L. Swing and a team of Iowa State University psychologists reported on a 13-month study of the media habits of some 1,500 kids and young adults. It found that “[the] amount of time spent playing video games is associated with greater attention problems in childhood and on into adulthood.” The findings indicate that the correlation between videogaming and attention disorders is at least equal to and probably greater than the correlation between TV-viewing and those disorders. Importantly, the design of the study “rules out the possibility that the association between screen media use and attention problems is merely the result of children with attention problems being especially attracted to screen media.”
A 2009 study by a different group of Iowa State researchers, published in Psychophysiology, investigated the effects of videogaming on cognitive control, through experiments with 51 young men, both heavy gamers and light gamers. The study indicated that videogaming has little effect on “reactive” cognitive control – the ability to respond to some event after it happens. But when it comes to “proactive” cognitive control – the ability to plan and adjust one’s behavior in advance of an event or stimulus – videogaming has a significant negative effect. “The negative association between video game experience and proactive cognitive control,” the researchers write, “is interesting in the context of recent evidence demonstrating a similar correlation between video game experience and self-reported measures of attention deficits and hyperactivity. Together, these data may indicate that the video game experience is associated with a decrease in the efficiency of proactive cognitive control that supports one’s ability to maintain goal-directed action when the environment is not intrinsically engaging.” Videogamers, in other words, seem to have a difficult time staying focused on a task that doesn’t involve constant incoming stimuli. Their attention wavers.
These findings are consistent with more general studies of media multitasking. In a much-cited 2009 paper in Proceedings of the National Academy of Sciences, for example, Stanford’s Eyal Ophir, Clifford Nass, and Anthony D. Wagner show that heavy media multitaskers demonstrate significantly less cognitive control than light multitaskers. The heavy multitaskers “have greater difficulty filtering out irrelevant stimuli from their environment” and are also less able to suppress irrelevant memories from intruding on their work. The heavy multitaskers were actually less efficient at switching between tasks – in other words, they were worse at multitasking.
So should people be prevented from playing video games? Not at all (though parents should monitor and restrict young kids’ use of the games). Moderate game-playing probably isn’t going to have any significant long-term cognitive consequences, either good or bad. Video-gaming is fun and relaxing, and those are good things. Besides, people engage in all sorts of pleasant, diverting pursuits that carry risks, from rock-climbing to beer-drinking (don’t mix those two), and if we banned all of them, we’d die of boredom.
What the evidence does show is that while videogaming might make you a little better at certain jobs that demand visual acuity under stress, like piloting a jet fighter or performing surgery, it’s not going to make you generally smarter. And if you do a whole lot of it, it may well make you more distracted and less able to sustain your attention on a single task, particularly a difficult one. More broadly, we should be highly skeptical of anyone who draws on video game studies to argue that spending a lot of time in front of a computer screen strengthens our attentiveness or our memory or even our ability to multitask. Taken as a whole, the evidence, including the videogaming evidence, suggests it has the opposite effect.