
Forget Drones, Beware of Killer Robots

[Photo: SGR-1 sentry robot. "When I grow up, I want to be a terminator."]

In case you didn't have enough to worry about in terms of death raining down from the sky from remotely guided drones, now comes word that completely autonomous killer robots aren't just minions of Skynet, they're a likely presence in our near future. A new report released by Human Rights Watch cautions that "killer robots" (their words, not mine) could reach battlefields within the next few decades, and it strongly suggests that humans try to legislatively head off that long-feared development.

In Losing Humanity: The Case Against Killer Robots, co-published by HRW and Harvard Law School's International Human Rights Clinic, the authors write:

With the rapid development and proliferation of robotic weapons, machines are starting to take the place of humans on the battlefield. Some military and robotics experts have predicted that "killer robots"—fully autonomous weapons that could select and engage targets without human intervention—could be developed within 20 to 30 years. At present, military officials generally say that humans will retain some level of supervision over decisions to use lethal force, but their statements often leave open the possibility that robots could one day have the ability to make such choices on their own power.

As that paragraph makes clear, the technology of "fully autonomous weapons" is highly speculative. To some extent, it seems to require that ever-elusive "artificial intelligence" that is either going to enslave us or turn us all into scut-work-free philosopher kings, depending on your favorite science fiction writer. But it's obvious that military researchers are trying to move beyond the human-in-the-loop weapons that characterize remotely controlled drones in an effort to develop weapons that can select their own targets.

Military policy documents, especially from the United States, reflect clear plans to increase the autonomy of weapons systems. In its Unmanned Systems Integrated Roadmap FY2011-2036, the US Department of Defense wrote that it "envisions unmanned systems seamlessly operating with manned systems while gradually reducing the degree of human control and decision making required for the unmanned portion of the force structure."

As an example of a deployed system that implements some autonomy, the report cites the U.S. Navy's MK 15 Phalanx Close-In Weapons System, which can sense and shoot down incoming missiles and aircraft without human oversight. A land-based Counter Rocket, Artillery, and Mortar System (C-RAM), also American, features similar hands-off abilities. Israel's Iron Dome, which has played a major role in headlines in recent days, is also largely automatic in its ability to identify and intercept missiles. South Korea has deployed a robotic sentry system along the demilitarized zone, though it requires human permission to fire weapons.

Of course, Phalanx and Iron Dome are a hell of a long way from terminators wandering the landscape hunting for John Connor, but if Losing Humanity can't point to a lot of Ahnold-ready technology, it cites plenty of official intent to find out whether Skynet is a doable, short-term goal.

The report concludes, convincingly to me, that "[t]o comply with international humanitarian law, fully autonomous weapons would need human qualities that they inherently lack." Basically, we might give weapons systems the ability to hunt and kill people, but we don't know of any way to give them human moral judgment.

Fair enough, but implementing the next step is … trickier: "The development of autonomous technology should be halted before it reaches the point where humans fall completely out of the loop." Stopping the development of weapons hasn't had an impressive track record so far. The best that's been managed is unleashing the likes of Stuxnet to slow down progress on forbidden weapons by nations that have yet to develop them. That doesn't prevent research; it just puts off proliferation a little longer into the future. Frankly, sticking a gun on a robot is going to be a lot harder to head off than the construction of nuclear warheads.

Robot overlords may or may not be part of our future, but killer robots look increasingly likely.