“The practical effects of my decision to allow technology use in class grew worse over time,” writes Clay Shirky in explaining why he’s decided to ban laptops, smartphones, and tablets from the classes he teaches at NYU. “The level of distraction in my classes seemed to grow, even though it was the same professor and largely the same set of topics, taught to a group of students selected using roughly the same criteria every year. The change seemed to correlate more with the rising ubiquity and utility of the devices themselves, rather than any change in me, the students, or the rest of the classroom encounter.”
When students put away their devices, Shirky continues, “it’s as if someone has let fresh air into the room. The conversation brightens, [and] there is a sense of relief from many of the students. Multi-tasking is cognitively exhausting — when we do it by choice, being asked to stop can come as a welcome change.”
It’s been more than ten years now since Cornell’s Helene Hembrooke and Geri Gay published their famous “The Laptop and the Lecture” study, which documented how laptop use reduces students’ retention of material presented in class.* Since then, the evidence of the cognitive toll that distractions, interruptions, and multitasking inflict on memory and learning has only grown. I surveyed a lot of the evidence in my 2010 book The Shallows, and Shirky details several of the more recent studies. The evidence fits with what educational psychologists have long known: when a person’s cognitive load — the amount of information streaming into working memory — rises beyond a certain, quite low threshold, learning suffers. There’s nothing counterintuitive about this. We’ve all experienced cognitive overload and its debilitating effects.
Earlier this year, Dan Rockmore, a computer scientist at Dartmouth, wrote of his decision to ban laptops and other personal computing devices from his classes:
I banned laptops in the classroom after it became common practice to carry them to school. When I created my “electronic etiquette policy” (as I call it in my syllabus), I was acting on a gut feeling based on personal experience. I’d always figured that, for the kinds of computer-science and math classes that I generally teach, which can have a significant theoretical component, any advantage that might be gained by having a machine at the ready, or available for the primary goal of taking notes, was negligible at best. We still haven’t made it easy to type notation-laden sentences, so the potential benefits were low. Meanwhile, the temptation for distraction was high. I know that I have a hard time staying on task when the option to check out at any momentary lull is available; I assumed that this must be true for my students, as well.
As Rockmore followed the research on classroom technology use, he found that the empirical evidence backed up his instincts.
No one would call Shirky or Rockmore a Luddite or a nostalgist or a technophobe. They are thoughtful, analytical scholars and teachers who have great enthusiasm and respect for computers and the internet. So their critiques of classroom computer use are especially important. Shirky, in particular, has always had a strong inclination to leave decisions about computer and phone use up to his students. He wouldn’t have changed his mind without good reason.
Still, even as the evidence grows, there are many teachers who, for a variety of reasons, continue to oppose any restrictions on classroom computer use — and who sometimes criticize colleagues who do ban gadgets as blinkered or backward-looking. At this point, some of the pro-gadget arguments are starting to sound strained. Alexander Reid, an English professor at the University at Buffalo, draws a fairly silly parallel between computers and books:
Can we imagine a liberal arts degree where one of the goals is to graduate students who can work collaboratively with information/media technologies and networks? Of course we can. It’s called English. It’s just that the information/media technologies and networks take the form of books and other print media. Is a book a distraction? Of course. Ever try to talk to someone who is reading a book? What would you think of a student sitting in a classroom reading a magazine, doodling in a notebook or doing a crossword puzzle? However, we insist that students bring their books to class and strongly encourage them to write.
Others worry that putting limits on gadget use, even if justified pedagogically, should be rejected as paternalistic. Rebecca Schuman, who teaches at Pierre Laclede Honors College, makes this case:
My colleagues and I joke sometimes that we teach “13th-graders,” but really, if I confiscate laptops at the door, am I not creating a 13th-grade classroom? Despite their bottle-rocket butt pranks and their 10-foot beer bongs, college students are old enough to vote and go to war. They should be old enough to decide for themselves whether they want to pay attention in class — and to face the consequences if they do not.
A related point, also made by Schuman, is that teachers, not computers, are ultimately to blame if students get distracted in class:
You want students to close their machines and pay attention? Put them in a smaller seminar where their presence actually registers and matters, and be engaging enough — or, in my case, ask enough questions cold — that students aren’t tempted to stick their faces in their machines in the first place.
The problem with blaming the teacher, or the student, or the class format — the problem with treating the technology as a neutral object — is that it ignores the way software and social media are painstakingly designed to exploit the mind’s natural inclination toward distractedness. Shirky makes this point well, and I’ll quote him here at some length:
Laptops, tablets and phones — the devices on which the struggle between focus and distraction is played out daily — are making the problem progressively worse. Any designer of software as a service has an incentive to be as ingratiating as they can be, in order to compete with other such services. “Look what a good job I’m doing! Look how much value I’m delivering!”
This problem is especially acute with social media, because . . . social information is immediately and emotionally engaging. Both the form and the content of a Facebook update are almost irresistibly distracting, especially compared with the hard slog of coursework. (“Your former lover tagged a photo you are in” vs. “The Crimean War was the first conflict significantly affected by use of the telegraph.” Spot the difference?)
Worse, the designers of operating systems have every incentive to be arms dealers to the social media firms. Beeps and pings and pop-ups and icons, contemporary interfaces provide an extraordinary array of attention-getting devices, emphasis on “getting.” Humans are incapable of ignoring surprising new information in our visual field, an effect that is strongest when the visual cue is slightly above and beside the area we’re focusing on. (Does that sound like the upper-right corner of a screen near you?)
The form and content of a Facebook update may be almost irresistible, but when combined with a visual alert in your immediate peripheral vision, it is—really, actually, biologically—impossible to resist. Our visual and emotional systems are faster and more powerful than our intellect; we are given to automatic responses when either system receives stimulus, much less both. Asking a student to stay focused while she has alerts on is like asking a chess player to concentrate while rapping their knuckles with a ruler at unpredictable intervals.
A teacher has an obligation not only to teach but to create, or at least try to create, a classroom atmosphere that is conducive to the work of learning. Ignoring technology’s influence on that atmosphere doesn’t do students any favors. Here’s some of what Anne Curzan, a University of Michigan English professor, tells her students when she explains why she doesn’t want them to use computers in class:
Now I know that one could argue that it is your choice about whether you want to use this hour and 20 minutes to engage actively with the material at hand, or whether you would like to multitask. You’re not bothering anyone (one could argue) as you quietly do your email or check Facebook. Here’s the problem with that theory: From what we can tell, you are actually damaging the learning environment for others, even if you’re being quiet about it. A study published in 2013 found that not only did the multitasking student in a classroom do worse on a postclass test on the material, so did the peers who could see the computer. In other words, the off-task laptop use distracted not just the laptop user but also the group of students behind the laptop user. (And I get it, believe me. I was once in a lecture where the woman in front of me was shoe shopping, and I found myself thinking at one point, “No, not the pink ones!” I don’t remember all that much else about the lecture.)
Our attention is governed not just by our will but by our environment. That’s how we’re built.
I suspect the debate over classroom computer use has become a perennial one, and that it will blossom anew every September. That’s good, as it’s an issue that deserves ongoing debate. But there is a point on which perhaps everyone can agree, and from that point of agreement might emerge constructive action. It’s a point about design, and Shirky gets at it in his article:
The fact that hardware and software is being professionally designed to distract was the first thing that made me willing to require rather than merely suggest that students not use devices in class. There are some counter-moves in the industry right now — software that takes over your screen to hide distractions, software that prevents you from logging into certain sites or using the internet at all, phones with Do Not Disturb options — but at the moment these are rear-guard actions. The industry has committed itself to an arms race for my students’ attention, and if it’s me against Facebook and Apple, I lose.
Computers and software can be designed in many different ways, and the design decisions will always reflect the interests of the designers (or their employers). Beyond the laptops-or-no-laptops debate lies a broader and more important discussion about how computer technology has come to be designed — and why.
*This post, and the other posts cited within it, concerns the use of personal computing devices in classes in which those devices have not been formally incorporated as teaching aids. There are, of course, plenty of classes in which computers are built into the teaching plan. It’s worth noting, though, that in the “Laptop and the Lecture” study, students who used their laptops to look at sites relevant to the class actually did even worse on tests of retention than did students who used their computers to look at irrelevant sites.
Image: “Viewmaster” by Geof Wilson.
I’m not sure why you want to characterize my post as “pro-gadget.” In some respects we are making the same argument: there is little or no role for student computer or mobile device use in the typical college classroom. These devices do affect the way that we think. Books affect the way we think, too; that’s why we value them, right? As you point out, and I agree, technologies are not neutral objects. We can’t just dump them into our existing curricula, classrooms, and pedagogies and expect them to just work. On the other hand, how long can we just act as if such technologies do not exist? Don’t we have a responsibility to help students learn how to learn and work in the context of the media that pervade their (and our) lives? After all, we have worked very hard over the last century to help students like you and me learn how to learn and work in the context of print technologies. I’m not sure, given the challenges we face with digital media, that it makes sense to be either pro- or anti-gadget.
Point taken that we can’t opt out of or ignore the technology, but that doesn’t mean we need to stick our heads into the maw of the beast either.
What it does require is for us to be intensely thoughtful and deliberate about how we engage with it. And given the wealth of anecdotal and study data in this particular instance, let’s not needlessly impair students’ uptake during instructional periods.
“The problem with blaming the teacher, or the student, or the class format — the problem with treating the technology as a neutral object — is that it ignores the way software and social media are painstakingly designed to exploit the mind’s natural inclination toward distractedness.”
Amen, but would you say this dilemma is unique to computer technology? It seems to me it’s one facet of a broader trend in which we are becoming the automated as often as we are the automators. If the issue is that our bodies’ built-in triggers are being pulled by a technology designed to circumvent or beguile our reason, then by my lights it’s the same debate we’re having over junk food, cigarettes, drugs, advertising, debt, pornography, television, etc. We’re stuck on the Cartesian notion of a transcendent mind, so all demands on the body are fair game. Wasn’t it inevitable that eventually techno-science + market competition would begin producing perfect (irresistible) products?
As Mr. Reid pointed out above, having no opinion—identifying with the system—is becoming an attractive strategy. It’s picked up too much speed for personal maneuvers, so maybe we should just make like the plankton, or Kevin Kelly’s anole in the mirrored box. What’s it like to be Kevin Kelly’s anole?
In my own classroom I’ve experimented with the creation of digital Walden Zones (a term I borrow from William Powers; see: http://lfernandez.org/bio/walden.html ). Lately, though, I just haven’t been raising the issue much in class. Maybe it’s because my students actually need to use computers in the classroom (I teach Web development), or because students have become more practiced at keeping their distractions to themselves (I rarely if ever hear a phone go off in class anymore).
Still, I do wonder whether sometimes I’m merely capitulating to Silicon Valley and its monopolization of the attention economy. We aren’t, after all, the first generation to worry about a so-called “tyranny of bells and whistles” (to quote a worry-wart from an earlier era), but one thing that may make our own era different is that we are more willing to capitulate to this tyranny. Read Katherine Hayles (“Hyper and Deep Attention: The Generational Divide in Cognitive Modes”) or Cathy Davidson (Now You See It) and one sees evidence of this. Both suggest that hyper attention may be more adaptive to 21st-century modes of production and that, therefore, we should make a place for it (and the technologies that foster it) in the modern university. It’s a compelling argument, but ultimately I resist it — we need to revisit the virtues of deep attention (regardless of what age of production we live in), and maybe we’re jeopardizing that project when we let computers into our classrooms without at least asking the class whether an alternative arrangement might be better.
In 2011 Adam Gopnik wrote a New Yorker article titled “The Information – How the Internet Got Inside Us,” in which he divides the pundits of the internet into three categories: the Never-Betters, the Better-Nevers, and the Ever-Wasers. I have always read Shirky through this framework, where he’s pegged as “the Wired version of Whig history: ever better, onward and upward, progress unstopped.” But this Medium piece suggests that a new, less sanguine Shirky is emerging, one who sees the internet in somewhat less rosy terms. Is it a serious evolution? Is he really migrating from the camp of the Never-Betters to the Better-Nevers? Or were these sentiments always there — just eclipsed by Shirky’s more familiar Whiggish demeanor?
I think that if the subject matter requires a laptop, and Wi-Fi (or other Internet hookups) are disabled, then it’s fine. Otherwise there should be no need for one in high school, college, or any other classroom situation.