When Roombas kill


Jenny Shank interviews me about The Glass Cage over at MediaShift. The conversation gets into some topics that haven’t been covered much elsewhere, including my suggestion that Roomba, the automated vacuum cleaner, provides an early and ever so slightly ominous example of robot morality (or lack thereof). “Roomba makes no distinction between a dust bunny and an insect,” I write in the book. “It gobbles both, indiscriminately. If a cricket crosses its path, the cricket gets sucked to its death. A lot of people, when vacuuming, will also run over the cricket. They place no value on a bug’s life, at least not when the bug is an intruder in their home. But other people will stop what they’re doing, pick up the cricket, carry it to the door, and set it loose. … When we set Roomba loose on a carpet, we cede to it the power to make moral choices on our behalf.”

Here’s the relevant bit from the interview:

Shank: “The Glass Cage” made explicit for me a number of problems with automation that I had been vaguely worried about. But one thing that I had never worried about until reading “The Glass Cage” was the morality of the Roomba. You write, “Roomba makes no distinction between a dust bunny and an insect.” Why is it so easy to overlook the fact, as I did, that when a Roomba vacuums indiscriminately, it’s following a moral code?

Carr: It’s easier not to think about it, frankly. The workings of automated machines often raise tricky moral questions. We tend to ignore those gray areas in order to enjoy the conveniences the machines provide without suffering any guilt. But I don’t think we’re going to be able to remain blind to the moral complexities raised by robots and other autonomous machines much longer. As soon as you allow robots, or software programs, to act freely in the world, they’re going to run up against ethically fraught situations and face hard choices that can’t be resolved through statistical models. That will be true of self-driving cars, self-flying drones, and battlefield robots, just as it’s already true, on a lesser scale, with automated vacuum cleaners and lawnmowers. We’re going to have to figure out how to give machines moral codes even if it’s not something we want to think about.


Image: Juliette Culver.

13 thoughts on “When Roombas kill”

  1. Linux Guru

    Moral codes for Roombas – don’t you think Asimov’s Robot Rules of Order would apply? Is that a moral code? Robots like Roomba are not moral agents; they are surrogates for moral agents – the people who program them. Computers, no matter how complicated the tasks they perform may seem, are not making decisions; they are implementing tasks that were programmed by humans. If Roomba kills crickets, it’s the fault of the designers, not the robot itself. It’s the difference between murder and malfunction. One is a call to the cops, the other to the warranty department.

  2. Linux Guru

    Nick’s posting has just inspired me again. I just realized how simple it would be to modify the basic design of the Roomba to make it a human-killing machine. All that would be required would be two cheap video cameras for crude stereo vision, a cradle to hold a gun, and a relay trigger to fire it, connected to a standard serial or USB port. Attached to an embedded microprocessor board running an open OS like Linux or FreeBSD, the only custom software needed would be some kind of image-recognition program that scans the visual field, pattern-matches it to a stored image file of the face of the intended victim, rotates the Roomba to orient the gun, and then fires it. My guess is this could be done for about a hundred dollars – not counting the cost of the weapon. I wonder what the Feds would do if I published a design in Popular Mechanics? History may note that the PC revolution started out with build-it-yourself kits. Can you imagine the kind of technological revolution from kits to make personal killer Roombas? LOL ;)

  3. dandre

    Dehumanization can be scary, as with killing robots.
    But dehumanization can also happen with real people, such as in the military, where people just follow orders – perhaps disguised, perhaps prohibited from opening their mouths.

  4. XSA

    Sturgeon’s Law

    90% of everything is crud.

    “There is a related principle actually observed in economics, the Pareto Principle or “80-20 rule”: 80% of the work is done by 20% of the group. This makes sense if you think about it: in a given group there will be, for whatever reason, variation in the capability of its constituent individuals, and by and large, variation tends to take the form of a bell-curve distribution: the vast majority are average or near-average, with occurrence correlating to rarity. So, if you take that curve (representing the number of individuals at each level of performance) and multiply by said level of performance, you get a plot showing how the total amount of work done is distributed among the various levels of performance, which will obviously be skewing towards the higher-performance end. The rule is an approximation and the exact ratio will vary with the situation, but the general principle is very widespread in situations involving normal and power law distributions. The principle is also used in Statistical Process Control, a mathematical approach to quality control, stating that generally, 80% of total defects are caused by 20% of known failure modes.”

    Higher performance at lower costs. Humans are still needed to make a mess that needs to be cleaned up – and we’re going to clean up.
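    The quoted claim – that a bell curve of individual performance yields a total output skewed toward the top performers, with the exact ratio varying by situation – can be checked with a quick simulation. The numbers below are illustrative assumptions, not figures from the comment:

    ```python
    import random

    # Sketch: draw "performance" scores from a bell curve and measure what
    # share of the total output the top 20% of individuals produce.
    random.seed(42)
    n = 10_000
    # Mean 100, std dev 30 are arbitrary; clamp at zero so nobody does
    # negative work.
    scores = [max(0.0, random.gauss(100, 30)) for _ in range(n)]
    scores.sort(reverse=True)

    top20_share = sum(scores[: n // 5]) / sum(scores)
    print(f"Top 20% produce {top20_share:.0%} of total output")
    ```

    With a normal distribution the skew is real but modest (well short of 80%); getting close to the full 80/20 split requires a heavier-tailed, power-law-like distribution, as the quote itself notes.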

  5. XSA

    An important corollary to SturgeonsLaw:

    … but ninety percent of everybody thinks they are part of the ten percent that’s not crap.

    People are far better at recognizing incompetence in others than themselves – see UnskilledAndUnawareOfIt. http://c2.com/cgi/wiki?SturgeonsLaw

    Gift ideas: “KurtVonnegut was involved in one great Internet moment. In 1997 an e-mail began circulating that purported to be a MIT commencement speech given by Mr. Vonnegut. The first line of the speech was “WearSunscreen?”. The chainmail spread like wildfire. However, it was a hoax. The ‘speech’ was really written by Chicago Tribune Newspaper Columnist MarySchmich?. In a sense, Mr. Vonnegut became involved in a real-life story just as bizarre as his writings.

    Meanwhile, the WearSunscreen? column became a popular graduation gift and even a record (read by director BazLuhrman?!!). Its popularity (and royalties) probably make up for Ms. Schmich’s destiny as the Internet’s first inadvertant ghost writer.

    So nobody but me thought that WearSunscreen? speech was just too stupid – even as a parody of itself – to have been Kurt’s? –PhlIp” http://c2.com/cgi/wiki?KurtVonnegut

  6. Linux Guru

    The basic tool for the manipulation of reality is the manipulation of words. If you can control the meaning of words, you can control the people who must use the words.

    –Philip K. Dick

  7. sort_of_knowledgeable

    That’s delegation, and not specific to technology. I hire a housekeeper and they vacuum up the cricket. I’m riding in a cab and the driver hits a cat because trying to stop or swerve to avoid the cat might cause an accident.

Comments are closed.