Category Archives: Uncategorized

Automation and the decay of talent


The new wave of computer automation has provoked much concern and debate about job losses and the future of employment. Less discussed has been how the computer is shaping the way people work and act, both on the job and in their personal lives. As the computer becomes a universal tool for getting things done, what happens to the diverse talents that people used to develop by engaging directly with the world in all its intricacy and complexity? In “The Great Forgetting,” an essay in the new issue of The Atlantic (the online version of the article bears the title “All Can Be Lost”), I look at some of the unexpected consequences of computer automation, particularly the way that software, as currently designed, tends to steal from us the opportunity to develop rich, distinctive, and hard-earned skills. Psychologists, human-factors experts, and other researchers are discovering that the price we pay for the ease and convenience of automation is a narrowing of human possibility.

Here’s an excerpt:

Psychologists have found that when we work with computers, we often fall victim to two cognitive ailments — complacency and bias — that can undercut our performance and lead to mistakes. Automation complacency occurs when a computer lulls us into a false sense of security. Confident that the machine will work flawlessly and handle any problem that crops up, we allow our attention to drift. We become disengaged from our work, and our awareness of what’s going on around us fades. Automation bias occurs when we place too much faith in the accuracy of the information coming through our monitors. Our trust in the software becomes so strong that we ignore or discount other information sources, including our own eyes and ears. When a computer provides incorrect or insufficient data, we remain oblivious to the error.

Examples of complacency and bias have been well documented in high-risk situations — on flight decks and battlefields, in factory control rooms — but recent studies suggest that the problems can bedevil anyone working with a computer. Many radiologists today use analytical software to highlight suspicious areas on mammograms. Usually, the highlights aid in the discovery of disease. But they can also have the opposite effect. Biased by the software’s suggestions, radiologists may give cursory attention to the areas of an image that haven’t been highlighted, sometimes overlooking an early-stage tumor. Most of us have experienced complacency when at a computer. In using e-mail or word-processing software, we become less proficient proofreaders when we know that a spell-checker is at work.

The way computers can weaken awareness and attentiveness points to a deeper problem. Automation turns us from actors into observers. That shift may make our lives easier, but it can also inhibit the development of expertise. Since the late 1970s, psychologists have been documenting a phenomenon called the “generation effect.” It was first observed in studies of vocabulary, which revealed that people remember words much better when they actively call them to mind — when they generate them — than when they simply read them. The effect, it has since become clear, influences learning in many different circumstances. When you engage actively in a task, you set off intricate mental processes that allow you to retain more knowledge. You learn more and remember more. When you repeat the same task over a long period, your brain constructs specialized neural circuits dedicated to the activity. It assembles a rich store of information and organizes that knowledge in a way that allows you to tap into it instantaneously.

Whether it’s Serena Williams on a tennis court or Magnus Carlsen at a chessboard, an expert can spot patterns, evaluate signals, and react to changing circumstances with speed and precision that can seem uncanny. What looks like instinct is hard-won skill, skill that requires exactly the kind of struggle that modern software seeks to alleviate.

This is one of the themes that I’ll be exploring in my next book, The Glass Cage: Automation and Us.

Photo: NASA.

The pleasures of merely circulating


Rob Horning, one of the most thoughtful writers on the online experience, considers how his writing and thinking have changed as he has shifted his time from blogging to tweeting:

Now, when I hit upon an article that starts me thinking, I excerpt a sentence of it on Twitter and start firing off aphoristic tweets. I don’t worry about ordering my thoughts into a sequential argument, or revising my first impressions much. I don’t try to build toward a conclusion; rather I try to draw conclusions that seem to require no build-up, no particular justification to be superficially plausible. And then, more often than not, I will monitor what sort of reaction these statements get to assess their accuracy, their resonance. At best, my process of deliberation and further reading on the subject gets replaced by immediate Twitter conversations with other people. At worst, tweeting pre-empts my doing any further thinking, since I am satisfied with merely charting the response.

One of his recent tweets reads: “making things circulate seems far more important than letting things ‘settle’ within me.” Frisson and dolor, a Catherine wheel of vanity, servitude to vanishing ink: the Twitter intellectual is a strange new species.

The future’s so bright I gotta wear Glass


“It’s coming,” said Google Xer Mary Lou Jepsen last week. “I don’t think it’s stoppable.” She’s referring, of course, to Glass, Google’s much-anticipated head-mountable. “I’ve thought for many years that a laptop is an extension of my mind,” she continued. “Why not have it closer to my mind?” Hmm. Next time I see Spock, I’m going to have to ask him if that’s logical. In the meantime, I will sleep with my Air under my pillow, just in case.

“You become addicted to the speed of it,” Jepsen confessed. Like all junkies, she craves more. Glass is just the “Model T” of wearables. In the churning bowels of the company’s secret lab, she let on, new and even zippier generations of mind-melding computers are already taking shape. “I’m now running a super-secret, stealth part of Google X that I can’t tell you anything about today. I’m really sorry. Maybe next year. Probably next year.” Jepsen said that she and her team are only sleeping three hours a night. That’s how important their work is.

Michael Sacasas sees Jepsen’s words as yet another manifestation of what he terms the Borg Complex — the quasi-religious belief that computer technology is an inexorable force carrying us to a better world. Only losers would be so foolish as to resist. Earlier this year, Eric Schmidt gave the starkest expression of this view. Also speaking of Glass, he said: “Our goal is to make the world better. We’ll take the criticism along the way, but criticisms are inevitably from people who are afraid of change or who have not figured out that there will be an adaptation of society to it.” Inevitably. Schmidt, in his benighted fashion, wants to imbue adaptation, a fundamentally amoral process, with a moral glow. To adapt is to improve, history and biology be damned.

There is no greater arrogance than the arrogance of those who assume their intentions justify their actions.

I believe in yesterday


The following review of Retromania by Simon Reynolds originally appeared in The New Republic in 2011.

“Who wants yesterday’s papers?” sang Mick Jagger in 1967. “Who wants yesterday’s girl?” The answer, in the Swinging 60s, was obvious: “Nobody in the world.” That was then. Now we seem to want nothing more than to read yesterday’s papers and carry on with yesterday’s girl. Popular culture has become obsessed with the past — with recycling it, rehashing it, replaying it. Though we live in a fast-forward age, we can’t take our finger off the rewind button.

Nowhere is the past’s grip so tight as in the world of music, as the rock critic Simon Reynolds meticulously documents in Retromania. Over the last two decades, he argues, the “exploratory impulse” that once powered pop music forward has shifted its focus from Now to Then. Fans and musicians alike have turned into archeologists. The evidence is everywhere. There are the reunion tours and the reissues, the box sets and the tribute albums. There are the R&B museums, the rock halls of fame, the punk libraries. There are the collectors of vinyl and cassettes and — God help us — eight-tracks. There are the remixes, the mash-ups, the samples. There are the “curated” playlists. When pop shakes its moneymaker today, what rises is the dust of the archive.

Nostalgia is nothing new. It has been a refrain of art and literature at least since Homer set Odysseus on Calypso’s island and had him yearn to turn back time. And popular music has always had a strong revivalist streak, particularly in Reynolds’s native Britain. But retromania is not just about nostalgia. It goes deeper than the tie-dyed dreams of Baby Boomers or the gray-flecked mohawks of Gen X punks. Whereas nostalgia is rooted in a sense of the past as past, retromania stems from a sense of the past as present. Yesterday’s music, in all its forms, has become the atmosphere of contemporary culture. We live, Reynolds remarks, in “a simultaneity of pop time that abolishes history while nibbling away at the present’s own sense of itself as an era with a distinct identity and feel.”

One reason is the sheer quantity of pop music that has accumulated over the past half century. Whether it is rock, funk, country, or electronica, we have heard it all before. Even the edgiest musicians have little choice but to produce pastiche. Greatly amplifying the effect is the recent shift to producing and distributing songs as digital files. When kids had to fork out cash for records or CDs, they had to make hard choices about what they listened to and what they let pass by. Usually, they would choose the new over the old, which served to keep the past at bay. Now, thanks to freely traded MP3s and all-you-can-eat music services such as Spotify, there is no need to make choices. Pretty much any song ever recorded is just a click away. With the economic barrier removed, the old floods in, swamping the new.

Reynolds argues that the glut of tunes has not just changed what we listen to; it has also changed how we listen. The rapt fan who knew every hook, lyric, and lead by heart has been replaced by the fickle dabbler who cannot stop hitting Next. Reynolds presents himself as a case in point, and his experience will sound familiar to anyone with a hard drive packed with music files. He was initially “captivated” by the ability to use a computer to navigate an ocean of tunes. But in short order he found himself more interested in “the mechanism” than the music: “Soon I was listening to just the first fifteen seconds of every track; then, not listening at all.” The logical culmination, he writes, “would have been for me to remove the headphones and just look at the track display.”

Given a choice between more and less, we all choose more, even if it means a loss of sensory and emotional engagement. Though we don’t like to admit it, the digital music revolution has merely confirmed what we have always known: we cherish what is scarce, and what is abundant we view as disposable. Reynolds quotes another music writer, Karla Starr: “I find myself getting bored even in the middle of songs simply because I can.”

As all time is compressed into the present moment, our recycling becomes ever more compulsive. We begin to plunder not just bygone eras but also the immediate past. Over the course of the last decade, writes Reynolds, “the interval between something happening and its being revisited seemed to shrink insidiously.” Not only did we have 1960s revivals and 70s revivals and 80s revivals, but we even began to see revivals of musical fashions from the 90s, such as shoegaze and Britpop. It sometimes seems that the reason things go out of fashion so quickly these days is that we cannot wait for them to come back into fashion. Displaying enthusiasm for something new is socially risky, particularly in an ironical time. It is safer to wait for it to come around again, preferably bearing the “vintage” label.

For musicians themselves, the danger is that their art becomes disconnected from the present — “timeless” in a bad sense. The eras of greatest ferment and creativity in popular music, such as the mid-60s and the late 70s, were times of social discontent, when the young rejected the past and its stifling traditions. Providing the soundtrack for rebellion, rock musicians felt compelled to slay their fathers rather than pay tribute to them. Even if their lyrics were about getting laid or getting high — as they frequently were — their songs were filled with political force. Those not busy being born, as Dylan put it shortly after taking an axe to his folkie roots, are busy dying.

Now, youth culture is largely apolitical, and pop’s soundtrack is just a soundtrack. Those not busy being born are busy listening to their iPods. Whether it’s Fleet Foxes or Friendly Fires, Black Keys or Beach House, today’s bands are less likely to battle the past than to luxuriate in it. This is not to say they aren’t good bands. As Reynolds is careful to note, there is plenty of fine pop music being made today, in an ear-boggling array of styles. But drained of its subversive energies, none of it matters much. It just streams by.

Retromania is an important and often compelling work, but it is also a sprawling one. Its aesthetic is more Sandinista! than “Hey Ya!” But Reynolds is sharp, and he knows his stuff. Even when his narrative gets lost in the details, the details remain interesting. (I didn’t know, for instance, that the rave scene of the early 90s had its origins in the trad-jazz fad that preceded Beatlemania in England.) Reynolds might also be accused of being something of a retromaniac himself. After all, in worrying about the enervating influence of the past, he echoes the complaints of earlier cultural critics. “Our age is retrospective,” grumbled Emerson in 1836. “Why should we grope among the dry bones of the past, or put the living generation into masquerade out of its faded wardrobe?” Longing for a less nostalgic time is itself a form of nostalgia.

But Reynolds makes a convincing case that today’s retromania is different in degree and in kind from anything we’ve experienced before. And it is not just an affliction of the mainstream. It has also warped the perspective of the avant-garde, dulling culture’s cutting edge. It’s one thing for old folks to look backwards. It’s another thing — and a far more lamentable one — for young people to feed on the past. Somebody needs to figure out a new way to smash a guitar.

Photo from Eight Track Museum.

The Shallows in Silicon Valley

Silicon Valley Reads, one of the country’s premier community reading programs, has announced that its theme for 2014 will be “Books & Technology: Friends or Foes?,” and I’m thrilled to report that The Shallows is one of the two books that have been selected for the program. The other is Robin Sloan’s novel Mr. Penumbra’s 24-Hour Bookstore. Silicon Valley Reads includes dozens of free events at libraries, schools, and other venues throughout Santa Clara County. I’ll be attending as many of those events as possible, including the kick-off program on January 22. If you’re in the area, I hope to have a chance to meet you. I expect the topic will spur some thought-provoking discussions among the Valley’s residents.

SVR will also have a kids’ program related to the general theme, focusing on three books: The Fantastic Flying Books of Mr. Morris Lessmore by William Joyce; Escape from Mr. Lemoncello’s Library by Chris Grabenstein; and Reading Makes You Feel Good by Todd Parr.

A full schedule of events will be posted soon at the SVR website.

Head Wake Up

The best thing about Google Glass, so far, is the instructions. I’m particularly fond of the line drawings that Google is using to explain how to use the device. Here’s how one performs the “Head Wake Up” gesture:

[Google line drawing: the “Head Wake Up” gesture]

I’m not convinced yet that I need Glass, but I would like to have a Head Wake Up command. Head Sleep would be good, too.

Thinking is knowing is thinking


With lots of kids heading to school this week, an old question comes back to the fore: Can thinking be separated from knowing?

Many people, and not a few educators, believe that the answer is yes. Schools, they suggest, should focus on developing students’ “critical thinking skills” rather than on helping them beef up their memories with facts and other knowledge about the world. With the Internet, they point out, facts are always within easy reach. Why bother to make the effort to cram stuff into your own long-term memory when there’s such a capacious store of external, or “transactive,” memory to draw on? A kid can google the facts she needs, plug them into those well-honed “critical thinking skills,” and – voilà! – brilliance ensues.

That sounds good, but it’s wrong. The idea that thinking and knowing can be separated is a fallacy, as the University of Virginia psychologist Daniel Willingham explains in his book Why Don’t Students Like School? This excerpt from Willingham’s book seems timely:

I defined thinking as combining information in new ways. The information can come from long-term memory — facts you’ve memorized — or from the environment. In today’s world, is there a reason to memorize anything? You can find any factual information you need in seconds via the Internet. Then too, things change so quickly that half of the information you commit to memory will be out of date in five years — or so the argument goes. Perhaps instead of learning facts, it’s better to practice critical thinking, to have students work at evaluating all that information available on the Internet, rather than trying to commit some small part of it to memory.

This argument is false. Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not simply because you need something to think about. The very processes that teachers care about most — critical thinking processes such as reasoning and problem solving — are intimately intertwined with factual knowledge that is in long-term memory (not just found in the environment).

It’s hard for many people to conceive of thinking processes as intertwined with knowledge. Most people believe that thinking processes are akin to those of a calculator. A calculator has available a set of procedures (addition, multiplication, and so on) that can manipulate numbers, and those procedures can be applied to any set of numbers. The data (the numbers) and the operations that manipulate the data are separate. Thus, if you learn a new thinking operation (for example, how to critically analyze historical documents), it seems like that operation should be applicable to all historical documents, just as a fancier calculator that computes sines can do so for all numbers.

But the human mind does not work that way. When we learn to think critically about, say, the start of the Second World War, it does not mean that we can think critically about a chess game or about the current situation in the Middle East or even about the start of the American Revolutionary War. Critical thinking processes are tied to the background knowledge. The conclusion from this work in cognitive science is straightforward: we must ensure that students acquire background knowledge along with practicing critical thinking skills.

Willingham goes on to explain that once a student has mastered a subject — once she’s become an expert — her mind will become fine-tuned to her field of expertise and she’ll be able to fluently combine transactive memory with biological memory. But that takes years of study and practice. During the K–12 years, developing a solid store of knowledge is essential to learning how to think. There’s still no substitute for a well-furnished mind.