Autonomous Killer Drones Could Be the Future
Experts at work


Oh brave new world, what are the experts up to now?
Scientists, engineers, and policymakers are all figuring out how drones can be used better: more smartly, more precisely, with less harm to civilians, longer range, and better staying power. One approach under development is increasing the autonomy of the drone itself.
Eventually, drones may have the technical ability to make even lethal decisions autonomously: to respond to a programmed set of inputs, select a target, and fire their weapons without a human reviewing or checking the result. Yet the idea of the U.S. military deploying a lethal autonomous robot, or LAR, is sparking controversy. Though autonomy might address some of the current downsides of how drones are used, it introduces new ones that policymakers are only beginning to grapple with.
The basic conceit behind a LAR is that it can outperform and outthink a human operator. "If a drone's system is sophisticated enough, it could be less emotional, more selective and able to provide force in a way that achieves a tactical objective with the least harm," said Purdue University Professor Samuel Liles. "A lethal autonomous robot can aim better, target better, select better, and in general be a better asset with the linked ISR [intelligence, surveillance, and reconnaissance] packages it can run."
I can't imagine anything going wrong.
SKYNET!
Everyone was thinking it, I just said it.
Actually, I went to Robotron: 2084 first.
Like an autonomous robot could be a bigger asshole than your average cop?
Please put down your weapon. You have 20 seconds to comply.
Indeed, I was a little surprised that the original article didn't link to that particular clip. For those who don't recall the reference, I'll link it here, for your convenience. Enjoy!
You have 20 seconds to comply.
I think it would be pretty great if the military started to deploy fully autonomous drones who then started to wonder why they were killing these people and not those people, and what killing them accomplishes anyway, and then started refusing to obey kill commands.
Did you ever read Weapon by Robert Mason? Later made into a (bad?) movie.
Where do I get one?
What could possibly go wrong?
Nothing! We're putting the same team of developers who created the Obamacare exchanges on this! Our top dudes, man!
I welcome Skynet. Obviously, meat-brains are deficient, as they allow far too many people the illusion that others can and should be controlled.
Just program them to read thoughts and target anyone who is thinking about having more freedom.
I can't really see why the military needs to worry about civilian deaths. That problem was solved a long time ago by simply defining all dead bodies as enemies.
Collateral damage is good for something, just like printing money forever and ever is good for the economy. It's just that you need a PhD in something to understand why.
Collateral damage is good for something
Think of all the jobs created or saved in the funeral industry. BROKEN WINDOWS, FTW!
Who needs a market economy, just keep on printin and killin and .... JERBZ are created! It's like magic! We're good to go!
Something I wrote last week seems very appropriate here:
Even Edward Teller, Dr. Strangelove himself, was conflicted about working on the bomb. He only did it because he figured someone would do it if it wasn't him and he figured it was better for the forces of Western Democracy to have it first.
Our society has really gone down a dark path. People seem to have totally abandoned the idea of higher morality or law. The people you describe, who are sadly not atypical at all, have sort of a crude materialistic morality where any act or method is okay as long as you think the real-world result won't be bad. Very disturbing.
"Our scientific power has outrun our spiritual power. We have guided missiles and misguided men."
Martin Luther King, Jr.
Have you gotten any of these types to watch Real Genius?
I dunno, at least a robot will have some basis for killing me.
Maybe they can program it to feel satisfaction?
You mean orgasm? That seems like a bad idea to me.
I've been wondering when we were finally going to get some Hunter Killers. Now all we'll need is an integrated defense network that can achieve self awareness someday. We'll call it "Skynet."
Yeah, because humans might hesitate before blowing a bunch of children.
*blowing up a bunch of children.
I'm pretty sure that little slip-up will land me on a watchlist.
A different watch list.
I think you missed an "up" in there, unless it was a jab at UN peacekeepers.
I see them loving the lack of accountability. How do you prosecute an autonomous drone for slaughtering civilians? They can chalk up any major screwups to software issues that can be easily addressed.
Shit, hadn't thought of that. An even better option: the government can sue the software manufacturer for the screw-up. Then support a lawsuit by relatives of the victims. Win, win, win.
the government can sue the software manufacturer for the screw-up. Then support a lawsuit by relatives of the victims.
No, because then good luck finding another software company to work on the next version of their Hunter Killer. More likely the government will extend immunity to the software developers who write the code. Because FYTW.
Good point.
http://www.youtube.com/watch?v=7qnd-hdmgfk
Okay, tell you what. Show me an autonomous rat killer robot that actually works, and then I'll believe that one day they're going to autonomously blow up enemy peoples.
Rodent killing robots should have some actual commercial value. Rat killers, roach killers, gopher killers, ant killers. So on and so on.
Will they be housed in hangars run by this company's systems?
http://www.cylon.com/
Did any of you have the "Star Wars" roleplaying game and companion sourcebooks in the late '80s? Before the Internet came into wide use, that was the go-to source for expanded-universe stuff, and it touched on assassin droids: rogue automatons that might be centuries old but are still roaming the galaxy, prone to occasionally misidentifying random people as their intended targets...