
Something New to Worry About: Murderous, Autonomous Drones


Yep. We don't just need to worry over drones killing Pakistani folks, or spying 24/7, making the U.S. into a dystopia where the shards of the Fourth Amendment are ground into dust. No, let's remember that we should also worry about the near-future where drones achieve sentience and, presumably, kill us all. Or, maybe not quite so bad as that, but this piece by J. Michael Cole at The Diplomat Magazine makes it clear that it's not just screaming Luddites who might have a problem with the eventual evolution of our favorite assassination machines. Yes, this future is almost here, to the delight of the Pentagon:

The U.S. military (and presumably others) have been making steady progress developing drones that operate with little, if any, human oversight. For the time being, developers in the U.S. military insist that when it comes to lethal operations, the new generation of drones will remain under human supervision. Nevertheless, unmanned vehicles will no longer be the "dumb" drones in use today; instead, they will have the ability to "reason" and will be far more autonomous, with humans acting more as supervisors than controllers.

Scientists and military officers are already envisaging scenarios in which a manned combat platform is accompanied by a number of "sentient" drones conducting tasks ranging from radar jamming to target acquisition and damage assessment, with humans retaining the prerogative of launching bombs and missiles.

It's only a matter of time, however, before the defense industry starts arguing that autonomous drones should be given the "right" to use deadly force without human intervention. In fact, Ronald Arkin of Georgia Tech contends that such an evolution is inevitable. In his view, sentient drones could act more ethically and humanely, without their judgment being clouded by human emotion (though he concedes that unmanned systems will never be perfectly ethical). Arkin is not alone in thinking that "automated killing" has a future, if the guidelines established in the U.S. Air Force's Unmanned Aircraft Systems Flight Plan 2009-2047 are any indication.

In an age where printers and copy machines continue to jam, the idea that drones could start making life-and-death decisions should be cause for concern. Once that door is opened, the risk that we are on a slippery ethical slope with potentially devastating results seems all too real. One need not envision the nightmare scenario of an out-of-control Skynet from Terminator movie fame to see where things could go wrong.

Read the rest here.

Later in the piece, Cole mentions Gulf War One's "video game" look on television, which was criticized at the time as a cold, sterile way to do the very serious task of killing. Drones, where the decision-makers don't even need to be in the same country, are the next several steps beyond that. But decision-making drones, which decide that a strike is necessary no matter what the cost-benefit, or the new information, or, whoops, that was a hospital, well, that may be the Rubicon to try not to cross.

But of course these ethical and technological freak-outs are too late. Back in January, the LA Times explored these fears and questions, describing the workings of the already-built X-47B. The 60-foot drone has no pilot (though its flight path can be overridden and it has a remote controller of sorts, if needed) and strongly resembles a UFO for maximum creep-factor. As the article notes, another, less panicky problem with autonomous drones is just who is accountable for any mistakes that could occur when there isn't even a human pilot. Is it the programmer's fault? The military commander's? Maybe more to the point, since accountability for war is a much rarer thing, who will be to blame if there are domestic drone accidents, crimes, spying, etc.? Notes the Times:

The drone is slated to first land on a carrier by 2013, relying on pinpoint GPS coordinates and advanced avionics. The carrier's computers digitally transmit the carrier's speed, cross-winds and other data to the drone as it approaches from miles away.

The X-47B will not only land itself, but will also know what kind of weapons it is carrying, when and where it needs to refuel with an aerial tanker, and whether there's a nearby threat, said Carl Johnson, Northrop's X-47B program manager. "It will do its own math and decide what it should do next."

In July, Wired's Spencer Ackerman wrote that the drone doesn't yet carry anything:

And since the X-47B isn't strapped with any weapons or cameras — yet — the Navy hasn't figured out exactly how the human being who will decide when to use its deadly force fits into the drone's autonomous operations. "We're in the crawl-walk-run stage of autonomous systems," Engdahl says. But underneath the bulbous, beak-like nose cone — what you might call the jowls of the drone — there's a second set of doors behind the landing gear. That's the payload bay, where the drone can carry two 2,000-pound bombs.

I guess it's time to mention the occasional really cool perk to this new technology, say, being able to get shots of beautiful Pakistani mountain ranges that would be much more dangerous and difficult to film by helicopter.