Students and their devices

“The practical effects of my decision to allow technology use in class grew worse over time,” writes Clay Shirky in explaining why he’s decided to ban laptops, smartphones, and tablets from the classes he teaches at NYU. “The level of distraction in my classes seemed to grow, even though it was the same professor and largely the same set of topics, taught to a group of students selected using roughly the same criteria every year. The change seemed to correlate more with the rising ubiquity and utility of the devices themselves, rather than any change in me, the students, or the rest of the classroom encounter.”

When students put away their devices, Shirky continues, “it’s as if someone has let fresh air into the room. The conversation brightens, [and] there is a sense of relief from many of the students. Multi-tasking is cognitively exhausting — when we do it by choice, being asked to stop can come as a welcome change.”

It’s been more than ten years now since Cornell’s Helene Hembrooke and Geri Gay published their famous “The Laptop and the Lecture” study, which documented how laptop use reduces students’ retention of material presented in class.* Since then, the evidence of the cognitive toll that distractions, interruptions, and multitasking inflict on memory and learning has only grown. I surveyed a lot of the evidence in my 2010 book The Shallows, and Shirky details several of the more recent studies. The evidence fits with what educational psychologists have long known: when a person’s cognitive load — the amount of information streaming into working memory — rises beyond a certain, quite low threshold, learning suffers. There’s nothing counterintuitive about this. We’ve all experienced cognitive overload and its debilitating effects.

Earlier this year, Dan Rockmore, a computer scientist at Dartmouth, wrote of his decision to ban laptops and other personal computing devices from his classes:

I banned laptops in the classroom after it became common practice to carry them to school. When I created my “electronic etiquette policy” (as I call it in my syllabus), I was acting on a gut feeling based on personal experience. I’d always figured that, for the kinds of computer-science and math classes that I generally teach, which can have a significant theoretical component, any advantage that might be gained by having a machine at the ready, or available for the primary goal of taking notes, was negligible at best. We still haven’t made it easy to type notation-laden sentences, so the potential benefits were low. Meanwhile, the temptation for distraction was high. I know that I have a hard time staying on task when the option to check out at any momentary lull is available; I assumed that this must be true for my students, as well.

As Rockmore followed the research on classroom technology use, he found that the empirical evidence backed up his instincts.

No one would call Shirky or Rockmore a Luddite or a nostalgist or a technophobe. They are thoughtful, analytical scholars and teachers who have great enthusiasm and respect for computers and the internet. So their critiques of classroom computer use are especially important. Shirky, in particular, has always had a strong inclination to leave decisions about computer and phone use up to his students. He wouldn’t have changed his mind without good reason.

Still, even as the evidence grows, there are many teachers who, for a variety of reasons, continue to oppose any restrictions on classroom computer use — and who sometimes criticize colleagues who do ban gadgets as blinkered or backward-looking. At this point, some of the pro-gadget arguments are starting to sound strained. Alexander Reid, an English professor at the University at Buffalo, draws a fairly silly parallel between computers and books:

Can we imagine a liberal arts degree where one of the goals is to graduate students who can work collaboratively with information/media technologies and networks? Of course we can. It’s called English. It’s just that the information/media technologies and networks take the form of books and other print media. Is a book a distraction? Of course. Ever try to talk to someone who is reading a book? What would you think of a student sitting in a classroom reading a magazine, doodling in a notebook or doing a crossword puzzle? However, we insist that students bring their books to class and strongly encourage them to write.

Others worry that putting limits on gadget use, even if justified pedagogically, should be rejected as paternalistic. Rebecca Schuman, who teaches at Pierre Laclede Honors College, makes this case:

My colleagues and I joke sometimes that we teach “13th-graders,” but really, if I confiscate laptops at the door, am I not creating a 13th-grade classroom? Despite their bottle-rocket butt pranks and their 10-foot beer bongs, college students are old enough to vote and go to war. They should be old enough to decide for themselves whether they want to pay attention in class — and to face the consequences if they do not.

A related point, also made by Schuman, is that teachers, not computers, are ultimately to blame if students get distracted in class:

You want students to close their machines and pay attention? Put them in a smaller seminar where their presence actually registers and matters, and be engaging enough — or, in my case, ask enough questions cold — that students aren’t tempted to stick their faces in their machines in the first place.

The problem with blaming the teacher, or the student, or the class format — the problem with treating the technology as a neutral object — is that it ignores the way software and social media are painstakingly designed to exploit the mind’s natural inclination toward distractedness. Shirky makes this point well, and I’ll quote him here at some length:

Laptops, tablets and phones — the devices on which the struggle between focus and distraction is played out daily — are making the problem progressively worse. Any designer of software as a service has an incentive to be as ingratiating as they can be, in order to compete with other such services. “Look what a good job I’m doing! Look how much value I’m delivering!”

This problem is especially acute with social media, because . . . social information is immediately and emotionally engaging. Both the form and the content of a Facebook update are almost irresistibly distracting, especially compared with the hard slog of coursework. (“Your former lover tagged a photo you are in” vs. “The Crimean War was the first conflict significantly affected by use of the telegraph.” Spot the difference?)

Worse, the designers of operating systems have every incentive to be arms dealers to the social media firms. Beeps and pings and pop-ups and icons, contemporary interfaces provide an extraordinary array of attention-getting devices, emphasis on “getting.” Humans are incapable of ignoring surprising new information in our visual field, an effect that is strongest when the visual cue is slightly above and beside the area we’re focusing on. (Does that sound like the upper-right corner of a screen near you?)

The form and content of a Facebook update may be almost irresistible, but when combined with a visual alert in your immediate peripheral vision, it is—really, actually, biologically—impossible to resist. Our visual and emotional systems are faster and more powerful than our intellect; we are given to automatic responses when either system receives stimulus, much less both. Asking a student to stay focused while she has alerts on is like asking a chess player to concentrate while rapping their knuckles with a ruler at unpredictable intervals.

A teacher has an obligation not only to teach but to create, or at least try to create, a classroom atmosphere that is conducive to the work of learning. Ignoring technology’s influence on that atmosphere doesn’t do students any favors. Here’s some of what Anne Curzan, a University of Michigan English professor, tells her students when she explains why she doesn’t want them to use computers in class:

Now I know that one could argue that it is your choice about whether you want to use this hour and 20 minutes to engage actively with the material at hand, or whether you would like to multitask. You’re not bothering anyone (one could argue) as you quietly do your email or check Facebook. Here’s the problem with that theory: From what we can tell, you are actually damaging the learning environment for others, even if you’re being quiet about it. A study published in 2013 found that not only did the multitasking student in a classroom do worse on a postclass test on the material, so did the peers who could see the computer. In other words, the off-task laptop use distracted not just the laptop user but also the group of students behind the laptop user. (And I get it, believe me. I was once in a lecture where the woman in front of me was shoe shopping, and I found myself thinking at one point, “No, not the pink ones!” I don’t remember all that much else about the lecture.)

Our attention is governed not just by our will but by our environment. That’s how we’re built.

I suspect the debate over classroom computer use has become a perennial one, and that it will blossom anew every September. That’s good, as it’s an issue that deserves ongoing debate. But there is a point on which perhaps everyone can agree, and from that point of agreement might emerge constructive action. It’s a point about design, and Shirky gets at it in his article:

The fact that hardware and software is being professionally designed to distract was the first thing that made me willing to require rather than merely suggest that students not use devices in class. There are some counter-moves in the industry right now — software that takes over your screen to hide distractions, software that prevents you from logging into certain sites or using the internet at all, phones with Do Not Disturb options — but at the moment these are rear-guard actions. The industry has committed itself to an arms race for my students’ attention, and if it’s me against Facebook and Apple, I lose.

Computers and software can be designed in many different ways, and the design decisions will always reflect the interests of the designers (or their employers). Beyond the laptops-or-no-laptops debate lies a broader and more important discussion about how computer technology has come to be designed — and why.

*This post, and the other posts cited within it, concerns the use of personal computing devices in classes in which those devices have not been formally incorporated as teaching aids. There are, of course, plenty of classes in which computers are built into the teaching plan. It's worth noting, though, that in "The Laptop and the Lecture" study, students who used their laptops to look at sites relevant to the class actually did even worse on tests of retention than did students who used their computers to look at irrelevant sites.

Image: “Viewmaster” by Geof Wilson.

Speak, algorithm

Lost in yesterday’s coverage of the Apple Watch was a small software feature that, when demonstrated on the stage of the Flint Center, earned brief but vigorous applause from the audience. It was the watch’s ability to scan incoming messages and suggest possible responses. The Verge’s live-blogging crew were wowed.

The example Apple presented was pretty rudimentary. The incoming message included the question “Are you going with Love Shack or Wild Thing?” To which the watch suggested three possible answers: Love Shack, Wild Thing, Not Sure. Big whoop. In terms of natural language processing, that’s like Watson with a lobotomy.

But it was just a taste of a much more sophisticated “predictive text” capability, called QuickType, that Apple has built into the latest version of its smartphone operating system. “iOS 8 predicts what you’ll say next,” explains the company. “No matter whom you’re saying it to.”

Now you can write entire sentences with a few taps. Because as you type, you’ll see choices of words or phrases you’d probably type next, based on your past conversations and writing style. iOS 8 takes into account the casual style you might use in messages and the more formal language you probably use in Mail. It also adjusts based on the person you’re communicating with, because your choice of words is likely more laid back with your spouse than with your boss.
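For the technically curious, the basic trick behind this kind of recipient-aware prediction needn't be exotic. Below is a minimal, hypothetical sketch in Python: a toy bigram model that counts which words have followed which words in your past messages to a given contact and serves back the most frequent continuations. Apple hasn't disclosed how QuickType actually works, and its real system is surely far more sophisticated; every name and message in the sketch is invented for illustration.

# A toy, recipient-conditioned next-word predictor. This is NOT Apple's
# QuickType implementation, just an illustration of the general idea:
# learn word-to-word transitions from past messages to each contact,
# then suggest the most frequent continuations of whatever you've typed.

from collections import Counter, defaultdict

class ToyPredictor:
    def __init__(self):
        # counts[recipient][previous_word] -> Counter of following words
        self.counts = defaultdict(lambda: defaultdict(Counter))

    def learn(self, recipient, message):
        """Record the word transitions in one past message to this recipient."""
        words = message.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.counts[recipient][prev][nxt] += 1

    def suggest(self, recipient, typed_so_far, k=3):
        """Return up to k likely next words, given the text typed so far."""
        words = typed_so_far.lower().split()
        if not words:
            return []
        followers = self.counts[recipient][words[-1]]
        return [word for word, _ in followers.most_common(k)]

p = ToyPredictor()
# "Train" on a few invented past messages to two different contacts.
p.learn("spouse", "running late be home soon")
p.learn("spouse", "will be home around seven")
p.learn("boss", "i will be in the office by nine")
# The same prefix draws different suggestions for different recipients.
print(p.suggest("spouse", "I'll be"))   # ['home']
print(p.suggest("boss", "I'll be"))     # ['in']

Even a toy like this shows why context matters: the same half-typed sentence yields different suggestions depending on who is at the other end of the conversation.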

Now, this may all turn out to be a clumsy parlor trick. If the system isn’t adept at mimicking a user’s writing style and matching it to the intended recipient — if it doesn’t nail both text and context — the predictive-text feature will rarely be used, except for purposes of making “stupid robot” jokes. But if the feature actually turns out to be “good enough” — or if our conversational expectations devolve to a point where the automated messages feel acceptable — then it will mark a breakthrough in the automation of communication and even thought. We’ll begin allowing our computers to speak for us.

Is that a development to be welcomed? It seems more than a little weird that Apple’s developers would get excited about an algorithm that will converse with your spouse on your behalf, channeling the “laid back” tone you deploy for conjugal chitchat. The programmers seem to assume that romantic partners are desperate to trade intimacy for efficiency. I suppose the next step is to get Frederick Winslow Taylor to stand beside the marriage bed with a stopwatch and a clipboard. “Three caresses would have been sufficient, ma’am.”

In The Glass Cage, I argue that we’ve embraced a wrong-headed and ultimately destructive approach to automating human activities, and in Apple’s let-the-software-do-the-talking feature we see a particularly disquieting manifestation of the reigning design ethic. Technical qualities are given precedence over human qualities, and human qualities come to be seen as dispensable.

When we allow ourselves to be guided by predictive algorithms in acting, speaking, or thinking, we inevitably become more predictable ourselves, as Rochester Institute of Technology philosopher Evan Selinger pointed out in discussing the Apple system:

Predicting you is predicting a predictable you. Which is itself subtracting from your autonomy. And it’s encouraging you to be predictable, to be a facsimile of yourself. So it’s a prediction and a nudge at the same moment.

It’s a slippery slope, and it becomes more slippery with each nudge. Predicted responses begin to replace responses, simply because it’s a little more efficient to simulate a response — a thought, a sentence, a gesture — than to undertake the small amount of work necessary to have a response. And then that small amount of work begins to seem like a lot of work — like correcting your own typos rather than allowing the spellchecker to do it. And then, as original responses become rarer, the predictions become predictions based on earlier predictions. Where does the algorithm end and the self begin?

And if we assume that the people we’re exchanging messages with are also using the predictive-text program to formulate their responses . . . well, then things get really strange. Everything becomes a parlor trick.

Image: Thomas Edison’s talking doll.

Apple’s small big thing

Over at the Time site, I have a short commentary on the Apple Watch. It begins:

Many of us already feel as if we’re handcuffed to our computers. With its new smart watch, unveiled today in California, Apple is hoping to turn that figure of speech into a literal truth.

Apple has a lot riding on the diminutive gadget. It’s the first major piece of hardware the company has rolled out since the iPad made its debut four years ago. It’s the first new product to be designed under the purview of fledgling CEO Tim Cook. And, when it goes on sale early next year, it will be Apple’s first entry in a much-hyped product category — wearable computers — that has so far fallen short of expectations. Jocks and geeks seem eager to strap computers onto their bodies. The rest of us have yet to be convinced. …

Read on.

(Apple’s live stream of its event today was, by the way, a true comedy of errors. It seemed like the company was methodically going down a checklist of all the possible ways you can screw up a stream, from running audio feeds in different languages simultaneously to bouncing around in time in a way that would have made Billy Pilgrim dizzy.)

Image: Darren Birgenheier.

Big Internet

We talk about Big Oil and Big Pharma and Big Ag. Maybe it’s time we started talking about Big Internet.

That thought crossed my mind after reading a couple of recent posts. One was Scott Rosenberg’s piece about a renaissance in the ancient art of blogging. I hadn’t even realized that blogs were a thing again, but Rosenberg delivers the evidence. Jason Kottke, too, says that blogging is once again the geist in our zeit. Welcome back, world.

The other piece was Alan Jacobs’s goodbye to Twitter. Jacobs writes of a growing sense of disillusionment and disappointment with the ubiquitous microblogging platform:

As long as I’ve been on Twitter (I started in March 2007) people have been complaining about Twitter. But recently things have changed. The complaints have increased in frequency and intensity, and now are coming more often from especially thoughtful and constructive users of the platform. There is an air of defeat about these complaints now, an almost palpable giving-up. For many of the really smart people on Twitter, it’s over. Not in the sense that they’ll quit using it altogether; but some of what was best about Twitter — primarily the experience of discovery — is now pretty clearly a thing of the past.

“Big Twitter was great — for a while,” says Jacobs. “But now it’s over, and it’s time to move on.”

These trends, if they are actually trends, seem related. I suspect they both stem from a sense of exhaustion with what I’m calling Big Internet. By Big Internet, I mean the platform- and plantation-based internet, the one centered around giants like Google and Facebook and Twitter and Amazon and Apple. Maybe these companies were insurgents at one point, but now they’re fat and bland and obsessed with expanding or defending their empires. They’ve become the Henry VIIIs of the web. And it’s starting to feel a little gross to be in their presence.

So, yeah, I’m down with this retro movement. Bring back personal blogs. Bring back RSS. Bring back the fun. Screw Big Internet.

But, please, don’t bring back the term “blogosphere.”

Image: still from Lost.

Playtators and their fans

“Man’s failure is yet more intense in the face of the triumph of ineffable things than in the face of heavy things.” —Roland Barthes, What Is Sport?

The videogamer has always been at once player and spectator, in the action and yet removed from it. Watcher and watched, entertainer and entertainee, warrior and couch potato, the videogamer was fated to become the broadcaster of his own amusements, and that makes Twitch and its success — Amazon is buying the game-streaming juggernaut for a billion dollars — something of an inevitability.

As Roland Barthes long ago noted, modern spectator sports usually involve an object that acts as a mediator of the competition: a puck or a ball of some sort. The mediator is the main focus of the violence, which helps keep the bloodshed within civilization’s tolerances and hence suitable for the metamedium of the screen. The videogame, which has as its very field of play a screen, adds further layers of mediation to the already unreal world of the spectator sport. What exactly are we watching when we watch Twitch? We’re watching a screen through a screen, virtual reality twice removed. It would seem to be media all the way down: sport as pure symbol, or, in Platonic terms, pure shadow.

It’s not blood, said Godard; it’s red.

Image: still from the 1961 film Of Sport & Men.

Worlds of wordcraft

I enjoyed James Gleick’s review of Vikram Chandra’s Geek Sublime today, particularly the ending:

Poetry and logic live in different places, after all. Poetry has patience. It reaches into a dark vastness. But computer code has powers too. “It acts and interacts with itself, with the world,” Chandra says. And it changes us along the way. “We already filter experience through software — Facebook and Google offer us views of the world that we can manipulate, but which also, in turn, manipulate us. The embodied language of websites, apps and networks writes itself into us.”

Must one learn computer programming, then, to qualify as literate? Of course not. It doesn’t hurt to be aware of code, though. One of these days code will be aware of us.

If Gleick means conscious awareness, then I can’t say I share his confidence. There’s still a hell of a lot of undiscovered country between here and there. (If he means unconscious awareness, that’s a done deal.) Anyway, Chandra’s book sounds excellent.

Image: William Blake.

Shattered

The unboxing ceremony has begun.

The photo doesn’t do justice to the remarkable texture of the book jacket. Even if you’re not planning to buy The Glass Cage, you’re going to want to make a stop at your local bookstore just to touch the cover. Bring some band-aids.
