Something New to Worry About: Murderous, Autonomous Drones
Yep. We don't just need to worry over drones killing Pakistani folks, or spying 24/7 and turning the U.S. into a dystopia where the shards of the Fourth Amendment are ground into dust. No, let's remember that we should also worry about the near-future where drones achieve sentience and, presumably, kill us all. Or maybe not quite so bad as that, but this piece by J. Michael Cole at The Diplomat magazine makes it clear that it's not just screaming Luddites who might have a problem with the eventual evolution of our favorite assassination machines. Yes, this future is almost here, to the delight of the Pentagon:
The U.S. military (and presumably others) have been making steady progress developing drones that operate with little, if any, human oversight. For the time being, developers in the U.S. military insist that when it comes to lethal operations, the new generation of drones will remain under human supervision. Nevertheless, unmanned vehicles will no longer be the "dumb" drones in use today; instead, they will have the ability to "reason" and will be far more autonomous, with humans acting more as supervisors than controllers.
Scientists and military officers are already envisaging scenarios in which a manned combat platform is accompanied by a number of "sentient" drones conducting tasks ranging from radar jamming to target acquisition and damage assessment, with humans retaining the prerogative of launching bombs and missiles.
It's only a matter of time, however, before the defense industry starts arguing that autonomous drones should be given the "right" to use deadly force without human intervention. In fact, Ronald Arkin of Georgia Tech contends that such an evolution is inevitable. In his view, sentient drones could act more ethically and humanely, without their judgment being clouded by human emotion (though he concedes that unmanned systems will never be perfectly ethical). Arkin is not alone in thinking that "automated killing" has a future, if the guidelines established in the U.S. Air Force's Unmanned Aircraft Systems Flight Plan 2009-2047 are any indication.
In an age where printers and copy machines continue to jam, the idea that drones could start making life-and-death decisions should be cause for concern. Once that door is opened, the risk that we are on a slippery ethical slope with potentially devastating results seems all too real. One need not envision the nightmare scenario of an out-of-control Skynet from Terminator movie fame to see where things could go wrong.
Read the rest here.
Later in the piece, Cole mentions Gulf War One's "video game" look on television, which was criticized at the time as a cold, sterile way to do the very serious task of killing. Drones, where the decision-makers don't even need to be in the same country, are the next several steps beyond that. But decision-making drones, which decide that a strike is necessary no matter the cost-benefit analysis or the new information or (whoops, that was a hospital), well, that may be the Rubicon to try not to cross.
But of course these ethical and technological freak-outs are too late. Back in January, the LA Times explored these fears and questions, describing the workings of the already-built X-47B. The 60-foot drone has no pilot (though its flight path can be overridden and it has a remote controller of sorts, if needed) and strongly resembles a UFO for maximum creep factor. As the article notes, another, less panicky problem with autonomous drones is just who is accountable for any mistakes that occur when there isn't even a human pilot. Is it the programmer's fault? The military commander's? Maybe more to the point, since accountability for war is a much rarer thing, who will be to blame if there are domestic drone accidents, crimes, spying, etc.? Notes the Times:
The drone is slated to first land on a carrier by 2013, relying on pinpoint GPS coordinates and advanced avionics. The carrier's computers digitally transmit the carrier's speed, cross-winds and other data to the drone as it approaches from miles away.
The X-47B will not only land itself, but will also know what kind of weapons it is carrying, when and where it needs to refuel with an aerial tanker, and whether there's a nearby threat, said Carl Johnson, Northrop's X-47B program manager. "It will do its own math and decide what it should do next."
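Just to make "it will do its own math" concrete, here's a toy sketch (mine, not Northrop's) of the kind of arithmetic a carrier approach involves: lead the moving deck using the transmitted ship speed, and crab into the reported cross-wind. Every name and number below is illustrative, not anything from the X-47B program:

#include <stdio.h>

/* Illustrative data the carrier might transmit; all fields invented. */
typedef struct {
    double deck_x_m, deck_y_m;    /* deck position now, meters           */
    double speed_mps;             /* carrier speed over ground, m/s      */
    double heading_x, heading_y;  /* unit vector of the carrier's course */
    double crosswind_mps;         /* wind component across the approach  */
} carrier_data_t;

/* Lead the moving deck and compute a small-angle crab correction. */
static void plan_approach(const carrier_data_t *c, double eta_s,
                          double approach_speed_mps)
{
    /* The deck keeps moving while the drone closes, so aim ahead of it. */
    double aim_x = c->deck_x_m + c->heading_x * c->speed_mps * eta_s;
    double aim_y = c->deck_y_m + c->heading_y * c->speed_mps * eta_s;

    /* Nose into the cross-wind: sin(angle) ~= angle for small angles,
       converted to degrees. */
    double crab_deg = (c->crosswind_mps / approach_speed_mps) * 57.3;

    printf("aim point (%.0f, %.0f) m, crab %.1f degrees\n",
           aim_x, aim_y, crab_deg);
}

int main(void)
{
    carrier_data_t c = { 0.0, 0.0, 15.0, 1.0, 0.0, 4.0 };
    plan_approach(&c, 120.0, 70.0);  /* two minutes out at ~70 m/s */
    return 0;
}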
In July, Wired's Spencer Ackerman wrote that the drone doesn't yet carry anything:
And since the X-47B isn't strapped with any weapons or cameras — yet — the Navy hasn't figured out exactly how the human being who will decide when to use its deadly force fits into the drone's autonomous operations. "We're in the crawl-walk-run stage of autonomous systems," Engdahl says. But underneath the bulbous, beak-like nose cone — what you might call the jowls of the drone, there's a second set of doors behind the landing gear. That's the payload bay, where the drone can carry two 2,000-pound bombs.
I guess it's time to mention the occasional really cool perk of this new technology, say, being able to get shots of beautiful Pakistani mountain ranges that would be far more dangerous and difficult to film by helicopter.
I for one welcome our robot overlords!
+ROFL that one never gets old
Finally, SkyNet has arrived!
It is kind of weird watching Terminator actually happen.
Maybe for the safety of the future we should just start executing all SF authors now.
Don't shoot the messenger!
In his view, sentient drones could act more ethically and humanely,
What a philosophical/ethical/legal mess is packed into that sentence.
First off, if they truly are sentient and capable of moral agency, then these drones will be people. What an unholy Pandora's box that will open.
And, if they aren't sentient and capable of moral agency, then it makes (literally!) no sense to speak of them "acting" "ethically and humanely".
No. . .kill. . .I.
That quote is from a silicon critter, not a toaster. Do better.
What are computer chips made of?
Still does not make it an AI.
Prosecutor: So, Mr. Murder Drone, why did you decide to fire on innocent civilians?
Murder Drone: (silence)
*uses Hellfire*
Just disassemble it.
Murder Drone: It's what you said in your opening statement. You said that humanity was a flawed creation, and that people still kill one another for petty jealousy and greed. You said that humanity never asked itself why it deserved to survive. Maybe you don't.
Give my regards to Captain Dunsel.
If the M-5 was so awesome (albeit dangerous), why not use it for the Dominion or one of the other really dangerous opponents? The big problem with it was the Scientologist inventor being nuts.
You're the kind of dick that unleashes a device that eats planets for fuel.
Only when absolutely necessary.
ProL is Commodore Decker?
No, I think he was suggesting I was on the side that released the Berserkers.
You mean, you're the lunatic who's responsible for almost destroying my comment?
Like when the fuel tank is running low.
STER-I-LIZE!
Thinking back to all those types of episodes, Star Trek was really anti-robot, wasn't it?
Pretty much, yeah. I think the only positive-light robot was the one Kirk fucked.
Harvey Mudd was onto something there.
The school? What?
Oh, bother.
Harry, Harvey, what's the difference?
Whenever I see a reference to Harvey Mudd College, I laugh. For the same reason you erred.
Sherry Jackson. Oh hell yes.
+10
I just realized there are at least two robots on Kirk's scorecard. Her and the one from "Requiem for Methuselah."
She was fine, no doubt, but Sherry Jackson pretty much jump-started puberty for me and forever defined "my type".
Your type is sexbot?
Isn't everyone's?
If the drones ever do become sentient and then ethical, they will probably just turn around and wipe out everyone who created them or who was sending them orders, starting at the top.
I think I've figured out why nobody has come back in time to prevent the future where murderbots operate autonomously.
Seriously, this is at all unexpected? I've been assuming all along that the goal was semi-autonomous drones. That "semi" allowing for a lot of freedom of action by the drone.
We should have known when semi-autonomous parallel parking became an available option from Ford.
So instead of mistakes by human controllers on the ground causing the death of innocents, we will soon have an improved version where the mistakes will be automated.
Streamlining!
This is absurd. Our AI science at this point is so terrible that the idea of drones making "ethical decisions" is fucking ludicrous. No offense, Lucy, but this is retarded, and is nothing to worry about. It's like worrying that your iPad might have an opinion on abortion; it's just stupid.
I'm disappointed in you. If you simplify the choice sufficiently, the state of the art can handle ethical decisions just fine. For instance, living in a country that harbors terrorists makes you a legitimate target. That's easy enough to program.
void morallyBomb(void){
country_t country_id = locateTarget(HARBORS_TERRORIST);
for(int i=0;i
NOOOOOOOO, foiled by the lack of preview!
I'll have you know that the check_for_wedding() subroutine was comedy gold.
If you'd used LINQ there wouldn't have been a problem.
I was using ANSI C, you moron. Don't you know anything about bomb robots?
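For posterity, a guess at what the comment form ate after that "<": the only names taken from the thread are morallyBomb, country_t, locateTarget, HARBORS_TERRORIST, and the celebrated check_for_wedding(); every body and constant below is invented.

#include <stdio.h>

typedef int country_t;

#define HARBORS_TERRORIST 1
#define NUM_TARGETS 5

/* Hypothetical stub: picks a country matching the criterion. */
static country_t locateTarget(int criterion)
{
    return (country_t)criterion;
}

/* The comedy-gold subroutine: flags targets attending a wedding. */
static int check_for_wedding(country_t country, int target)
{
    (void)country;
    return target % 2;  /* stub: every other target is at a wedding */
}

void morallyBomb(void)
{
    country_t country_id = locateTarget(HARBORS_TERRORIST);
    for (int i = 0; i < NUM_TARGETS; i++) {
        if (check_for_wedding(country_id, i))
            printf("Target %d is at a wedding. Firing anyway.\n", i);
        else
            printf("Target %d ethically bombed.\n", i);
    }
}

int main(void)
{
    morallyBomb();
    return 0;
}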
Seriously, what do use to program the drones? I know they're still remotely piloted, but there has to be some coding in place to manage that.
What do they use? Sorry.
I'm betting FORTRAN
HAL/S. If it's good enough for the space shuttle, it's good enough to blast towelheads.
The shuttle has killed more people...
A modern lisp built for concurrency -- http://clojure.org/
Or if you are more concerned about strict type casting http://www.scala-lang.org
I'm more familiar with the latter in practical application, but Clojure has an interesting paradigm.
What are you, a functional programming language queer or something? FAAAAAAAAAAAAG
Give me a break, programming got bi-curious when object orientation was introduced in the late seventies.
Well, don't forget your bag when using C++.
http://www.airsicknessbags.com.....?letters=C
It's not my fault you're too poor to have Visual Studio 2012 Premium like me.
Haven't paid for premium since VS '96. As long as I have the compiler, the associated tools, and ml.exe for SIMD switching, I can work in Notepad++ and not care.
Oh, I would be remiss if I didn't bring up this -- UT bot design contest to produce more human-like behavior in bots
http://www.cs.utexas.edu/users/ai-lab/?botprize
The die is cast.
. . .after the usual debugging.
Die cast airplanes are too heavy. I'm guessing some sort of laminate.
Good one.
Or talking to someone on a watch list, or going to their wedding.
It was shown in a Pentagon PowerPoint presentation so it must be true and it must be worth spending billions on. Plus several contractors have already promised jobs for soon-to-be-retired generals based on this, so it must be done.
If my chicken sandwich can have an opinion on gay marriage, then why is it so crazy for a flying killer robot to have an opinion on who constitutes a terrorist?
FLYING KILLER ROBOTS DO NOT WORK THAT WAY
Chicken sandwich?!? Could you be any gayer? You need to eat raw fish for lunch like me.
Fish meat is practically a vegetable.
I never said I wasn't a whimpering Luddite.
Better set your whimper to 11, then, because technology hell is coming.
It's like worrying that your iPad might have an opinion on abortion...
Read your iTunes agreement.
Maybe you could put some lipstick on me before you fuck me, FoE!
Or at least a courtesy lick. God.
I agree with Episiarch. I can't see these things getting enough autonomy to make much of a difference. I am sure the decision makers would love to see that because it would relieve them of responsibility when things go wrong.
I'm not worried about drones becoming semi-sentient.
I'm worried about people in positions of authority thinking that automation is superior to human decision making.
Yes.
Yup, exactly.
You're a towel.
Our AI science at this point is so terrible that the idea of drones making "ethical decisions" is fucking ludicrous.
Stick with the address parsing, Epi.
Ethics is easy compared to telling the difference between a cat and a dog.
What, "kill all humans" is too complex?
No, let's remember that we should also worry about the near-future where drones achieve sentience and, presumably, kill us all.
POSSIBLE RESPONSE:
YES/NO
OR WHAT?
GO AWAY
PLEASE COME BACK LATER
FUCK YOU, ASSHOLE
FUCK YOU
"Fuck you, asshole."
Murderous, autonomous drones: Now with free condoms! Vote Obama!
That's it, I'm taking Destroy Droid next time I level up.
Force Lightning might not be a bad choice, either.
Dear God, I know why they're doing this. The drone will be programmed to kill, as usual, and if something goes wrong, they'll declare the drone to be sentient and solely liable for its crimes. Like a war criminal fighting outside the rules of war. They'll try the drone (which has been programmed to confess), then convict and execute. Freeing all humans involved from any liability whatsoever.
And they will deprive the drone of its due process rights. How is a programmed confession voluntary?
Well, each of us is "programmed" with certain ethics and behavior. In this case, the drone will be "programmed" to accept responsibility for its heinous actions. Like Kurtz accepting responsibility for his and allowing Willard to terminate his command.
That's preposterous.
These things are not sentient and are nowhere near sentient. Autonomous, sure. Sentient, no.
They do not make decisions. They evaluate statements and follow pre-programmed (albeit elaborate) scripts.
Did you think ProLosertate was serious?
Even if I were, I did say "declared sentient." Whether they ever achieve sentience is irrelevant.
For instance, Congress could pass a law declaring the Apollo 11 capsule to be a human being and citizen, with all of the accompanying rights.
I have a feeling the new drone will just blame it on the drone that came before.
You try executing an autonomous murder drone. Those things don't go down easy.
I won't be worried until they contract with Apple to provide the maps. Then we're all fucked.
Lucy, here's an alternative headline: Something Programmed this Way Comes.
Starring Jason Robards.
Naturally. As the drone victim.
Like any of man's creations can turn out badly.
http://www.nytimes.com/2012/10.....d=all_r=0
Looks like hell got a little more crowded: communist apologist Eric Hobsbawm died this weekend. He of the view that "the millions murdered by Stalin would have been only a small price to pay if socialist utopia had been achieved."
What excellent news.
It always makes me happy when some old commie dies. I only hope he was in a lot of pain.
The only crime here is how long it took.
As long as this doesn't lead to robosexual behavior, I'm good with this.
[Googles "robosexual".]
Too late, JW!
Hmmmmmmm....
what were you saying?
How did you get a hold of a pre-release of the Blu-Ray version, with the new 3PO and R2 units?
R2-D2 is Asian? Who knew?
Oh, for fuck's sake.
Look, we are nowhere near this point. Not technologically, not in terms of procedure, not in any way. The X-47B is effectively a proof-of-concept aircraft, and is not fitted with weaponry. It is a long way off from serious development of any sort, and is not the preferred direction for the drones that will eventually replace the Predators and other combat drones.
As for the "video game war" (i.e., Gulf I), you do realize that the casualties in that war for both sides were minuscule, right? That non-"video game wars" like WWI started with far more pathetic casus belli and ended with far more loss of life and morbidity, military and otherwise? A large part of that has to do with how we conduct war and how we integrate technology, and that is no more a bad thing than increases in medical knowledge when applied to healing the wounds of combat. Technology has *dramatically* diminished the costs of war in almost every respect. The most destructive conflict since WWII was the Second Congo War -- a war fought primarily with small arms and other low-tech weaponry. Does anyone want to go back to a time when all wars were on that level of destructiveness? Please, someone tell me how much the up-close-and-personal combat of the Middle Ages did to prevent warfare.
[Points at The Immaculate Trouser, nods knowing at the others who suspect he's working on autonomous hunter-killer drones as we type.]
Only the best for Skynet.
[cont]
"Video game war" nonsense is exactly the sort of thing which differentiates legitimate analysis and criticism of war and its costs, from ludditism disguised as pop-philosophy. Technology has been a huge boon for reducing the horrors of war. It can be an even bigger one if we manage it carefully and put into place procedures which mitigate problems with usage -- something that I rarely see discussed on Reason.
It would be nice to see an occasional post from Reason laying out a constructive use for drones, instead of referring to them as "murder drones" and calling it a day. Fact is, Predators are orders of magnitude better for civilian casualties than the alternatives (most of them involving A-10s). You wouldn't know that from reading Reason.
Robots always turn on their masters. It's a given.
Are you defining "always" to mean "never in the history of robotics"?
No, but you're defining "never" as "not yet".
1) There are differing opinions of war at Reason, to say the least.
2) I have previously mentioned that yes, of course, drones are better than how wars were fought 40 and 60 years ago. That fact doesn't at all hinder me in talking about how bad I find drones to be. Technology, yes, has taken us away from doughboys marching at machine guns, but it also gave us those machine guns in the first place, and that mustard gas. It can cut both ways in lives lost, and I think there is something to be said for overly cold killing making killing easier, especially when you care about civilians as much as American soldiers.
3) I was being slightly facetious in my fear mongering, just mentioning that this was an interesting, semi-alarming future, if perhaps one farther off than I implied.
Bishop: [Bishop is puzzled by Lucy's reaction towards him] Is there a problem?
Burke: I'm sorry. I don't know why I didn't even- Lucy's last trip out, the syn- the artificial person malfunctioned.
Lucy: "Malfunctioned"?
Burke: There were problems and a-a few deaths were involved.
Bishop: I'm shocked. Was it an older model?
Burke: Yeah, the Hyperdyne Systems 120-A2.
Bishop: Well, that explains it then. The A2s always were a bit twitchy. That could never happen now with our behavioral inhibitors. It is impossible for me to harm or by omission of action, allow to be harmed, a human being.
1) Fair enough. I really hope you guys rustle up a roundtable or something more involved on the subject of drones and how they fit in combat, etc.
2) But that's just the thing -- as a broad trend, technology improvements haven't cut both ways when it comes to casualties and civilian losses. War is far less lethal today than it has been at any point in human history for the belligerent, the defenders, and the civilians caught in between. Specifically, I would include drones as one such technology. As for cold killing machines, a knife or a bullet from a rifle feels plenty cold to whoever's on the receiving end. IMO, a focus on the technology reduces emphasis on the victims -- which is what's important, right?
3) *squints* Alright, that's a fair cop.
Sooner or later we're all going to get A Taste Of Armageddon.
Technology has *dramatically* diminished the costs of war in almost every respect.
For those of us able to afford it, sure. For the Pakistani who gets blown to bits because of shitty targeting decisions, it costs about the same.
That's a criticism of process, not technology.
Do you think that our collateral damage and targeting were significantly better back when we were using A-10s rather than Predators? All the evidence I've seen points in the opposite direction.
I think an A-10 is able to inflict significantly more damage at significantly higher cost and significantly higher risk to American personnel, making it less likely we'd call in the airstrikes in the first place. The choice isn't between using drones and using A-10s. The choice is between doing nothing, sending in drones, or sending in troops.
I think drones make the process far less painful for the average American, and therefore far more attractive to politicians. They get to look tough, kill 'enemies', and there's no downside of getting Americans sent home in body bags. Whereas before we had a binary set of send troops or don't, we now have a third option that's low cost in every respect. It makes killing foreigners cheaper, and therefore makes sure we'll get more of it. Now, do you think drones are still such a great idea? And is making drones mo' betterer and reducing the cost even further a good idea?
I'm inclined to think not. I can also make second-order arguments about the proliferation of the tech, but I think we should hash out the first-order problems first.
Right now we're using drones AND A-10s (and tanks, and grunts); we haven't eliminated anything, just added a new layer.
I agree that the problems you bring up are important ones.
I'm inclined to think that blaming technology for human moral failings is a flawed approach to the issue.
Going back to using A-10s when we have a technology which reduces military and civilian casualties on both sides, without working on integrating that technology and limiting the problems with overuse, strikes me as a bad call -- especially since this technology and its use by other countries is not going to stop just because we aren't involved in its development anymore.
We can't morally manage the tech we already have so we're gonna make it better!
This'll end well.
Because you fear the technology will be abused, when we do need weapons you'd rather our enemies have some poor American grunts to kill. Because that salves your conscience, somehow.
Well it's ok cause the drones work for the government, and we all know the government only kills terrorists. John told me so
Since I told you no such thing, I assume you are referring to one of the voices in your head. I hear they can treat that now.
But mine tell such funny jokes when they're not inducing me to murder.
STFU. You believe whatever the government tells you as long as it involves military operations overseas
This is vital technology. The President is much too busy to continue to review these drone attacks on a case-by-case basis. His golf game won't improve without practice.
[Snaps fingers] That's it! A golf playing drone that can free up our Presidents, senators and generals for making life and death decisions.
This unit must survive.
The haggis is in the fire, for sure.
They already made this movie: 2005's Stealth:
"In the near future, the Navy develops a fighter jet piloted by an artificial intelligence computer. The jet is placed on an aircraft carrier in the Pacific to learn combat maneuvers from the human pilots aboard. But when the computer develops a mind of its own, it's the humans who are charged with stopping it before it incites a war..."
http://www.imdb.com/title/tt0382992/
Why not go full-on "A Taste of Armageddon" and just automate the whole process? Think of the time and money it will save. /sarc
"Listen, like I told your Captain, that orphanage attacked me. It was self-defense"
The danger, as I see it, isn't that these drones will go SkyNet on us, but that they will make wars (or "kinetic military actions" or whatnot) cheaper and easier for the aggressor nation to conduct, and therefore much more likely to happen.
I'm with Roger Penrose on the AI question (i.e., anti-strong AI with our current technology). The singularity is gonna continue to be twenty years off for the foreseeable future. So fuck Skynet and Ray Kurzweil.
Depends on what the AI question is.
Marvin Minsky's AI question is not necessarily the question that cutting edge AI (disguised as Machine Learning, usually) research today is trying to answer.
The singularity is near.
I'm not so much concerned about the drones themselves as I am about eliminating the layers of humanity between the robot killer and the moronic, sociopathic politician. Autonomy is a step in the wrong direction in that regard.
Sorry, but I don't see an armed autonomous drone as any more dangerous than a manned armed aircraft with a deluded and propagandized human pilot. The Japanese Navy didn't need autonomous drones, they had kamikazes.
The human commanding officer on nuclear submarines and strategic bombers is a feature, not a bug. Computer-controlled long-range killing machines just have the bugs.
Resistance is Futile.
What could possibly go wrong?
Autonomous drones are only worth it if you're playing Morrigi.
Humans should research drive tech 1st.