Tech in schools: less is more

Although many educators and school administrators, including those working in the U.S. Department of Education, continue to push schools to invest heavily in computer technology, the evidence of any benefit from such investments remains elusive. The biggest beneficiaries of heavy spending on school technology are technology firms. Students, meanwhile, may actually be harmed by having too much tech in the classroom, particularly when spending on hardware and software leaves less money for hiring and training teachers and improving school facilities.

The latest evidence on the effect of computer use on learning, and some of the strongest to date, comes in a large, international study released today by the Organization for Economic Cooperation and Development. Called “Students, Computers and Learning: Making the Connection,” the study “shows that the reality in our schools lags considerably behind the promise of technology,” writes the OECD’s director of education and skills, Andreas Schleicher. Computers’ “impact on student performance is mixed at best.”

In the kingdom of the bored, the one-armed bandit is king

It still feels a little shameful to admit to the fact, but what engages us more and more is not the content but the mechanism. Kenneth Goldsmith, in a Los Angeles Review of Books essay, writes of a recent day when he felt an urge to listen to some music by the American composer Morton Feldman:

I dug into my MP3 drive, found my Feldman folder and opened it up. Amongst the various folders in the directory was one labeled “The Complete Works of Morton Feldman.” I was surprised to see it there; I didn’t remember downloading it. Curious, I looked at its date — 2009 — and realized that I must’ve grabbed it during the heyday of MP3 sharity blogs. I opened it to find 79 albums as zipped files. I unzipped three of them, listened to part of one, and closed the folder. I haven’t opened it since.

Listening to the music, it turned out, gave him less pleasure than he had anticipated; the real satisfaction lay in handling the files.

Our role as librarians and archivists has outpaced our role as cultural consumers. Engaging with media in a traditional sense is often the last thing we do. … In the digital ecosystem, the apparatuses surrounding the artifact are more engaging than the artifact itself. Management (acquisition, distribution, archiving, filing, redundancy) is the cultural artifact’s new content. … In an unanticipated twist to John Perry Barlow’s 1994 prediction that in the digital age we’d be able to enjoy wine without the bottles, we’ve now come to prefer the bottles to the wine.

It’s as though we find ourselves, suddenly, in a vast library, an infinite library, a library of Borgesian proportions, and we discover that what’s of most interest to us is not the books on the shelves but the intricacies of the Dewey Decimal System.

I left my <3 in San Francisco

In his revealing Q&A session in June, Mark Zuckerberg offered a peek into the future of interpersonal communication:

One day, I believe we’ll be able to send full, rich thoughts to each other directly using technology. You’ll just be able to think of something and your friends will immediately be able to experience it too if you’d like. This would be the ultimate communication technology.

Wow. That’s really going to require some incredible impulse control. Your inner filter is going to have to kick in not between thought and expression, as it does now, but before the formation of the thought itself. I mean, would you really want to share your raw thought-stream with another person, even a friend? Or maybe the technology will somehow allow you to send out a new thought to retrieve and erase a prior thought before it hits the other person’s brain? Zuck may want instantaneous thought-sharing, but I’m thinking there’s going to have to be some kind of time delay built into the system. Otherwise, the interbrain highway is going to resemble something out of a Mad Max movie.

Helpfully, William Davies puts Zuckerberg’s words into context.

Smartness is a zero-sum game

In her article “The Internet of Way Too Many Things,” Allison Arieff reviews some of the exciting new products on display at Target’s trendy Open House store in San Francisco. There’s Leeo, a night light “that ‘listens’ for your smoke detector to go off and then calls your smartphone to let you know your house might be on fire.” There’s Whistle, a $100 doggie dongle that “attaches to your pet’s collar and allows you to set a daily activity goal customized to your dog’s age, breed and weight.” And there’s Mimo, a web-enabled onesie that monitors your baby’s “body position” during the night. “When Mimo is connected to other devices in your home and discerns that your baby is stirring,” reports Arieff, “the lights turn on, coffee begins brewing and some Baby Mozart starts playing on the stereo.”
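
Strip away the branding and each of these gadgets runs on the same trigger-action logic: a sensor reports an event, and a canned response goes out over the network. Here is a minimal, purely hypothetical sketch of that pattern in Python; the event names, device actions, and the handle_event dispatcher are invented for illustration and don’t correspond to any vendor’s actual API.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Rule:
        trigger: str                       # event name, e.g. "baby.stirring"
        actions: list[Callable[[], None]]  # what to do when it fires

    def turn_on_lights():
        print("lights: on")

    def start_coffee():
        print("coffee maker: brewing")

    def play_baby_mozart():
        print("stereo: playing Baby Mozart")

    RULES = [
        Rule("baby.stirring", [turn_on_lights, start_coffee, play_baby_mozart]),
        Rule("smoke_detector.alarm",
             [lambda: print("phone: your house might be on fire")]),
    ]

    def handle_event(event: str) -> None:
        # Dispatch an incoming device event to every matching rule.
        for rule in RULES:
            if rule.trigger == event:
                for action in rule.actions:
                    action()

    # The onesie reports that the baby is stirring; the house springs into action.
    handle_event("baby.stirring")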

Welcome to Peter Thiel’s “innovation desert.” You’ll die of thirst, but at least the mirages are amusing.

There’s something else going on here, though, something deeper than the production of trinkets for neurotics. Each of these products is an example of a defining trend of our networked age: the outsourcing of common sense to gadgetry. A foundational level of human perception and competence is being mechanized through apps and online services. The more mediated our lives become, the more we rely on media to make sense of the world for us. We can’t even trust ourselves to take Rover for a walk.

Dawn of the automatic age

It’s Labor Day. To mark the occasion, here’s a brief excerpt from The Glass Cage that describes the origins of automation after the Second World War:

The word automation entered the language only recently. It was first uttered in 1946, when engineers at the Ford Motor Company needed a new term to describe the latest machinery being installed on the company’s assembly lines. “Give us some more of that automatic business,” a Ford vice president reportedly said in a meeting. “Some more of that — that — ‘automation.’”

Ford’s plants were already famously mechanized, with sophisticated machines streamlining every job on the line. But factory hands still had to carry parts and subassemblies from one machine to the next. The workers still controlled the pace of production. The equipment installed in 1946 changed that. Machines took over the material-handling and conveyance functions, allowing the entire assembly process to proceed automatically. The alteration in work flow may not have seemed momentous to those on the factory floor. But it was. Control over a complex industrial process had shifted from worker to machine.

That the new Ford equipment arrived just after the end of the Second World War was no accident. It was during the war that modern automation technology took shape. When the Nazis began their bombing blitz against Great Britain in 1940, English and American scientists faced a challenge as daunting as it was pressing: How do you knock high-flying, fast-moving bombers out of the sky with heavy missiles fired from unwieldy antiaircraft guns on the ground? The mental calculations and physical adjustments required to aim a gun accurately — not at a plane’s current position but at its probable future position — were far too complicated for a soldier to perform with the speed necessary to get a shot off while a plane was still in range. The missile’s trajectory, the scientists saw, had to be computed by a calculating machine, using tracking data coming in from radar systems along with statistical projections of a plane’s course, and then the calculations had to be fed automatically into the gun’s aiming mechanism to guide the firing. The gun’s aim, moreover, had to be adjusted continually to account for the success or failure of previous shots.

As for the members of the gunnery crews, their work would have to change to accommodate the new generation of automated weapons. And change it did. Artillerymen soon found themselves sitting in front of screens in darkened trucks, selecting targets from radar displays. Their identities shifted along with their jobs. They were no longer seen “as soldiers,” writes one historian, but rather “as technicians reading and manipulating representations of the world.”
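
To make the prediction problem concrete, here is a toy version of the calculation the wartime gun directors automated: given a radar fix on a plane’s position and velocity, find the point to aim at so that projectile and plane arrive there at the same moment. It is a deliberate simplification, assuming a constant-velocity target and a constant shell speed, with none of the statistical smoothing or shot-by-shot feedback correction described above; the function and the numbers in the example are illustrative only.

    import math

    def intercept_point(gun, plane_pos, plane_vel, shell_speed):
        # Return (aim_point, flight_time) for a straight-line intercept,
        # assuming the plane holds a constant course and speed.
        # Position of the plane relative to the gun.
        dx, dy = plane_pos[0] - gun[0], plane_pos[1] - gun[1]
        vx, vy = plane_vel

        # Solve |d + v*t| = shell_speed * t for the flight time t (a quadratic).
        a = vx * vx + vy * vy - shell_speed ** 2
        b = 2 * (dx * vx + dy * vy)
        c = dx * dx + dy * dy
        disc = b * b - 4 * a * c
        if abs(a) < 1e-9 or disc < 0:
            return None  # no feasible intercept with these parameters

        roots = [(-b - math.sqrt(disc)) / (2 * a),
                 (-b + math.sqrt(disc)) / (2 * a)]
        times = [t for t in roots if t > 0]
        if not times:
            return None
        t = min(times)
        aim = (plane_pos[0] + vx * t, plane_pos[1] + vy * t)
        return aim, t

    # Example: a bomber 8 km downrange at 3,000 m altitude, flying toward the
    # gun at 120 m/s, engaged with shells traveling 800 m/s.
    print(intercept_point(gun=(0, 0), plane_pos=(8000, 3000),
                          plane_vel=(-120, 0), shell_speed=800))

Even this stripped-down version makes the engineers’ point: the arithmetic has to be redone continuously, from fresh tracking data, faster than any gunner could manage by hand.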