5 Machines That Rose Up Against Humans
If you were online at the wrong time last week, you heard a story about an Air Force simulation gone horribly awry. A drone was programmed to complete a mission, and when its operator tried to stop it, the drone targeted him. Then, it turned out this story wasn't true. No one had ever programmed such a simulation — someone had just speculated about how A.I. can go rogue, as people have for decades, and outlets incorrectly reported the imaginary situation as fact.
Many people walked away from that news cycle concluding that killer robots pose little threat after all. Not us. Us, we continue to mistrust every machine. We have a gun pointed at our toaster at all times (uneasily noting that the gun, too, is a machine). We know machines have a long history of evil, based on such incidents as the time that...
A chess robot is usually a simple piece of software. It comes up with chess moves, and the human competitor sees these using some device not unlike the one you’re using to read this article. A chess robot can also be a physical machine, which moves actual pieces on an actual board. We have a long history of such robots, dating back to before they were technically possible, when the "robot" was just a guy in a box pretending.
One such chess robot appeared at the Moscow Open last July. Below is footage of it competing with several children simultaneously — which is no real accomplishment. A human chess player could manage the same thing, and there's no reason a robot should find alternating between three players any harder than playing just one. No, the real accomplishment is what else it does to one of the children:
via CNN
Yes, it reaches out and fractures the child's finger after he goes for a piece out of turn. Some accounts say the robot responded in anger, punishing the child. In reality, it was reaching for a piece and accidentally pressed on the errant finger, but that distinction provided little comfort to the boy. In the next screenshots, we see officials taking him away to receive euthanasia, as he is no longer of use to the nation as a competitor.
via CNN
Seriously, though: The kid came back the next day in a cast and finished the tournament, since a broken finger is no great handicap. You don't need all your fingers. Robots know this well. In fact, they’re counting on it.
The first recorded time a robot killed someone was 1979. As with the Moscow incident, the problem was we gave robots arms. Not guns — arms, metal limbs they can use to manipulate objects. One robot slammed its arm into a man's head, killing him.
Siaopan/Wiki Commons
As with all good robot stories, this one happened in Michigan. Robert Williams, 25, was working at a Ford plant that made auto parts. Retrieving parts from a shelf was a robot's job, but for some reason, Williams was assigned to climb up and do it himself. That was because the robot was too slow, his family would later claim in court. With Williams perched on the shelf, the robot kept operating, and he never noticed it. It swung its arm into the space now occupied by his head, killing him instantly. Then the robot went on working, and no one realized what had happened till half an hour later, when someone looked at the shelf and saw the corpse.
This case involved a certain amount of human error, and a jury took that into account when the family sued afterward. Still, surely the manufacturer could have included some sort of auditory cue to signal when the robot was moving, much like the warning beeps most warehouse vehicles give off. For that reason, the jury found the robot's manufacturer (not Ford) liable for $10 million, later upped to $15 million. And we all learned an important lesson about robot safety. That lesson lasted two years, till we let the next robot kill another factory worker by pushing him into a grinding machine.
Since those robots lack the intent to kill, these stories might not sound so different from any conventional industrial accident involving a machine. To which we say: Fine. The machines behind such accidents deserve to be shamed just as hard, whether anyone refers to them as a robot or not. For example, consider the story of Therac-25.
Therac-25, which had the perfect name for a robot, was a radiation therapy machine. Radiation treats cancer when it's dosed and targeted correctly, but excess radiation obviously hurts you. You might imagine, then, that a medical device should have safeguards in place to keep it from ever delivering too high a dose. If, say, a dose of 200 rads is on the high side, and a dose of 1,000 rads is lethal when applied to the whole body, perhaps a machine should never be allowed to deliver 15,000 or 25,000 rads in a single second. And yet, that's what Therac-25 did half a dozen times in separate 1980s cases at multiple hospitals.
Stefan Kögl
In the earliest case, one of the more minor ones, the patient immediately said she'd been burned, and soon after, she needed a mastectomy. Another patient felt the overdose as an electric shock. He was paralyzed from the waist down and dead within six months. A third patient, in for minor skin cancer, died from radiation exposure, as did a fourth, who was supposed to receive just 4 rads.
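That missing safeguard, a machine that simply refuses any dose beyond a fixed ceiling, can be sketched in a few lines. To be clear, this is a hypothetical illustration, not Therac-25's actual code (its real failures famously involved race conditions and the removal of hardware interlocks), and the function name and limit here are invented:

```python
# Hypothetical sketch of a software dose interlock. Not Therac-25's real
# software; the ceiling below is an illustrative number from the article.

MAX_DOSE_RADS = 200.0  # invented therapeutic ceiling for this example

def request_dose(requested_rads: float) -> float:
    """Return the approved dose, refusing anything outside the safe range."""
    if requested_rads <= 0:
        raise ValueError("dose must be positive")
    if requested_rads > MAX_DOSE_RADS:
        # Final check: no single request may exceed the ceiling, no matter
        # what the rest of the control program computed upstream.
        raise ValueError(
            f"requested {requested_rads} rads exceeds the "
            f"{MAX_DOSE_RADS} rad limit"
        )
    return requested_rads
```

The point of such an interlock is that it runs independently of whatever computes the dose, so a bug upstream can't quietly push the machine past the limit.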
Sure, we’re all leery of giving robots missiles, but let's keep in mind the special danger posed by radiation weapons. Robots love radiation weapons. Machines are immune to radiation.
Along with all this talk of malfunctioning machines, we should say something about actual killer robots. We mean lethal autonomous weapons (LAWs), machines that are designed to kill and that do so using only their own programming.
Jollyroger/Wiki Commons
One milestone came in the late 1970s, when the military tried out the M247 Sergeant York, a vehicle with its own targeting system. "York" was the cool-robot-buddy name; its targeting system had the more fittingly sinister name of "DIVAD." This antiaircraft tank did not quite qualify as a LAW because a human operator still had to choose to fire on the target that York picked. This was fortunate. At one exhibition, York was supposed to target a series of drones. Instead, it turned its sights on the generals assembled in the stands.
When the operator did later choose to fire, York still hadn't picked the correct target. It had zeroed in on the exhaust fan of a latrine — and while propellers are smart targets during combat, this did not serve as a very reassuring demonstration of its powers. The military ultimately killed the DIVAD program, for reasons too complicated to fully lay out here, but as the Pentagon put it, "The Sergeant York was not operationally effective in adequately protecting friendly forces."
Protection is important, as is ventilation.
Kitchen appliances kill more people than any military weapon, even when they don't malfunction. Deep fryers alone kill hundreds of thousands of Americans every year, not by scalding them with boiling oil but by clogging their arteries. But these are slow deaths. If we’re talking about wiping people out in one blow, few food-preparation devices have higher body counts than one popcorn machine at an Indiana fair in 1963.
It was Halloween night. Thousands of people showed up to the state fairgrounds, watching ice skaters put on a medley called "Mardi Gras." Kind of an unfitting theme for an October celebration, but the really scary stuff was brewing over in concessions, in an enclosed space no one was monitoring, where a propane tank sprang a leak.
Then, with just three minutes left in the show, the heating element from an electric popcorn machine lit this gas. A fireball exploded up through the stands, sending flames 40 feet high. Body parts rained down. Some were immediately covered by chunks of falling concrete.
William H. Bass Photo Company
The explosion killed 74 people (some accounts say more) and injured hundreds. Do you know why, even though your microwave has a button labeled "popcorn," every bag of popcorn says not to use it? It's because that button cedes control to popcorn machinery. And popcorn machinery seeks to kill all humans.
Follow Ryan Menezes on Twitter for more stuff no one should see.