RoboCop Reboot Tackles the Ethics of Mechanized War
The original RoboCop movie was more than a crazy, over-the-top violent romp. It had a not-exactly-subtle message that an oppressive, militarized law enforcement culture, fed by corporate cronyism, can be just as dangerous as the crime it was meant to fight.
The movie is 25 years old now, and we can track the militarization of our police forces over that time, sans cyborgs. A remake is on the way, scheduled for 2014. The Los Angeles Times, through its Hero Complex blog, sat down with Brazilian director José Padilha to talk about how the original's message has been updated for our current situation:
"[W]e're in the future, and drones have been replaced by robots and are being used all over the world for foreign policy and war. Kind of like instead of sending soldiers to Iraq, you send robots to Tehran. This is how we open the movie, with American robots in Tehran, because Iran has been invaded. The idea is, now soldiers don't die in wars, so there's no political pressure at home to end wars. Because the reason why the Vietnam War ended is because soldiers were dying. When you take the soldiers away and you have robots, that opens a can of worms. The premise of the movie is everywhere in the world, robots are allowed, except in America, because Americans won't accept that a robot can pull the trigger, that the robot can decide to take or not to take a person's life in law enforcement. So this company is losing lots of money because it can't sell robots in North America, so the solution is, "Let's put a man in the machine and sell that." That was the premise of the movie that I said to them in the very first day, and because they wanted to do it, that's why I'm here."
Padilha, who was a documentary maker before moving into features, explained how the movie will explore the ethics of accountability:
"When you mechanize and when you create automatic law enforcement, accountability goes down the drain. I'll give you an example, and it has to do with drones. Say you look at a controversial war thing, like the atomic bomb in Hiroshima. At the end of the day, somebody ordered the bomb, in that case Eisenhower. So you can judge Eisenhower. You can say he was right, or he was wrong. You can have an opinion about it, and pass moral judgment on what happened. Now, when a robot pulls the trigger, because a robot doesn't have a conscience, who is accountable? If the robot makes a mistake and let's say shoots a kid by mistake, whose fault is it? It's clearly not the robot's fault because it doesn't make sense to attribute fault to an entity that doesn't have a conscience (and this is a very contemporary philosophical issue, by the way, that's being debated in the academic world). Once you have automatic robots making decisions on the spot that can decide whether to take or not take somebody's life, then accountability becomes very fluid. "
Even though drones have remote human pilots, documents show that our government frequently has no idea who it is killing when it launches strikes in Pakistan.
Joel Kinnaman (the absolute best thing about The Killing this past season) will be playing the titular character this time around. Watch the trailer below: