Bad Cop, No Robot
The case for legally constraining what police departments can do with robots.

Like millions of people, I watched the viral video of dancing Boston Dynamics robots that made its way around Twitter this week. But unlike many of those millions, I did not think, "Wow, the future is so cool." I thought, "We gotta keep these away from the cops."
I admit that some of my aversion is a gut reaction to the uncanny valley. The dog-shaped ones creep me out the most. A predator, often headless, unfazed by rain or heat, without need for food or water or rest—that's the stuff of science fiction nightmares. I know, objectively, these robots are an incredible technological achievement, yet I can't erase that instinctive unease.
Still, my worry about misuse of these and similar robots by law enforcement is not merely an emotional reaction. Nor do I think there's zero place for robots in policing. The problem I foresee is the introduction of robotics without a strong and specific legal framework dictating how they may and may not be used.
The risk here is escalation, and the history of SWAT teams provides an excellent case study. These units were introduced to American policing a little over half a century ago, designed for a limited set of very dangerous circumstances, chiefly hostage and barricade situations or violent rioting.
As time went on, however, police departments realized they could use SWAT teams in more routine contexts, too. Now, fewer than one in 10 SWAT raids involve those high-danger situations. "Today in America SWAT teams violently smash into private homes more than 100 times per day," writes Reason alum Radley Balko in Rise of the Warrior Cop. "The vast majority of these raids are to enforce laws against consensual crimes," he adds, particularly drug use. American SWAT teams have raided homes and businesses for alleged offenses including unlicensed barbering, copyright violation, and parodying a local politician on Twitter. Some police departments use SWAT teams to execute every search warrant.
There's a difference, however, between how SWAT use started and how robotics is being introduced to policing. When SWAT teams were created, state lawmakers passed legislation giving police new leeway in their work and deadlier tools with which to do it. By contrast, the residents of New York City learned their police department had obtained a Boston Dynamics dog when it was photographed in action at a crime scene. The acquisition does not appear to have been directed by any elected officials, though it's possible a law enforcement transparency measure passed by the city council this past summer will compel the NYPD to report on the robot's current use and devise policies for it going forward. So far, the NYPD has characterized the dog exactly as SWAT teams were described in their early days: a tool to keep officers safe in unusual emergencies, especially hostage and barricade crises.
Likewise, when the Massachusetts State Police borrowed a Boston Dynamics robot for "mobile remote observation," a records request by the state branch of the American Civil Liberties Union (ACLU) turned up no departmental use policy. "We just really don't know enough about how the state police are using this," said the organization's director of the Technology for Liberty Project, Kade Crockford. "And the technology that can be used in concert with a robotic system like this is almost limitless in terms of what kinds of surveillance and potentially even weaponization operations," Crockford continued. "We really need some law…to establish a floor of protection to ensure that these systems can't be misused or abused in the government's hands."
We do. Robots in policing are not inherently dystopian. It is a good thing that a robot can be sent to defuse a bomb instead of putting a human officer at risk. There are some appropriate uses here. But we need laws delineating those uses and, I believe, prohibiting machine use of force against human beings (just as we must legislate police use of drones).
The move from defensive (e.g., bomb disposal) to offensive (e.g., restraining or even killing a suspect) use will happen if it isn't prohibited. In fact, it has already happened: In Dallas in 2016, police jury-rigged a bomb disposal robot with explosives and used it to kill a sniper who had shot 12 officers, murdering five. Other departments have similarly used robots built to protect people to instead deliver nonlethal attacks.
Legislators—not unelected police department administrators—should be pre-emptively determining what kinds of robots police departments can acquire and how they may be used. Police robot acquisition, like any major new weapon or equipment procurement, should never catch the public by surprise. Law enforcement officers are supposed to be public servants, not masters.
In the bigger picture, there are three questions our lawmakers should be answering with legislation. First, do robots make police use of force more likely and/or more severe? This technology won't be a neutral influence on officers' decision making, just as SWAT teams haven't been neutral. If it is substantially easier (by virtue of being safer for officers) to use force with a robot than without, that will change police behavior. It may sometimes change it for the better, of course, since removing the question of officer safety could make escalation to violence less likely in some circumstances. It may also change the behavior of the person being policed, making them more or less fearful and therefore more or less likely to fight or flee.
Second, what happens when robots progress away from significant human control? Bomb disposal robots don't have artificial intelligence (A.I.) to make their own decisions. They are remote controlled by human operators. The Boston Dynamics machines are more sophisticated—they have "an element of autonomy"—but these too are mostly human-controlled. As A.I. progresses, however, we'll run into moral dilemmas like those posed by self-driving cars: What ethics do you give the robot? This will be a matter of sincere disagreement, and it should be subject to public debate, not secretive departmental decision making.
Finally, when we do reach that point with A.I., the case will be made that robots should be entrusted with use of force (perhaps even more than humans) because they cannot operate outside their programmed rules and wouldn't react out of fear for their own lives as a human officer might. A robot cop approaching Philando Castile's car, for example, would not have shot him as Jeronimo Yanez did unless programmed to do so.
The problem, as with autonomous weapons of war, is threefold: Programming can be bad, and the entire "robots would be police without the burden of human frailty" argument rests on an unjustified assumption of perfect ethics and execution. Complex ethical decisions require more than rules—human compassion and conscience can't be programmed, and subjecting human life to the decision of a robot is an affront to human dignity. Moreover, every problem we have with holding law enforcement accountable for misconduct today will be exacerbated ad infinitum if "the robot did it." You can't try or punish a robot, and it is inevitable that the manufacturers, programmers, and operators of police robots will be indemnified against liability in brutality cases if we permit the escalation I foresee to proceed unchecked.
We should take the lesson of SWAT teams to heart and build a strong fence of law around police robots before they get loose.
SleepyJoe likes him some ROBOCOP. Got to do something about the predators on the streets.
Of course he likes the robots...they can do all the beat downs while he sleeps in the basement.
"We really need some law…to establish a floor of protection to ensure that these systems can't be misused or abused in the government's hands."
We tried that with the Constitution. How well did that work out?
It will be different. This time we will pass a law saying the police cannot violate the Constitution.
They will just be disabled and stolen, stripped for parts, or repurposed into jewelry store robbers.
Not worried in the slightest, global warming will kill every last human in 12 years (or is it 10 years now), and it would take 25 years just to upload all the criminal statutes on the books.
That's not true! (now that Sleepy Joe is in charge)
yeah, we have 10 years left before Earth is an uninhabitable hell, I've heard. why worry about side issues?
Yep.
Unfortunately, the last thing any rational government would want is to make itself accountable for its own actions. That would impede the freedom of the state in exchange for the subjective and problematic freedoms of its citizens.
Defund now!
This is a great idea!
It's just that we need to mandate that the robot cops be controlled by popular vote of the social media crowd.
They can only be sent to houses of people the twits hate, or that facebook claims say nasty things about the elites.
Vox populi, vox dei.
The robutts are inevitable; even if we punt (bad idea), the ChiComs will do it, like they're doing nuclear and everything else we should be doing.
Good article, but the author failed to mention that it's Boston Dynamics that puts a limit on the usage of their robots; they contractually prohibit the weaponization of the units they sell and lease. So it's not the people or the legislature taking up the mantle of protecting us from AI and robots; right now it's the technology companies themselves that are serving as the gatekeepers.
And of course, after the cops weaponize the robots anyway, BD can just make them stop, right?
I rather doubt they can contractually prohibit the weaponization of units they actually sell.
BD sells to A with contract.
A sells to B without contract.
Contract invalid.
Honey, we've got way bigger things to worry about than this.
Start writing about why a government of collectivist sociopaths run by a senile old man named Biden is a bad thing.
What? You don't think this is something a government of collectivist sociopaths run by a senile old man would do?
"Robots in policing are not inherently dystopian"
Sure thing....
Time to go back and reread Thoreau and think about it a bit. "That government is best which governs least".
"Real-ID will only be used for...." etc.
Happy New Year. 2020 sucked. Let's not allow 2021 to be even worse.
Way way too many of our problems with government stem from thinking of bandaid fixes for individual problems. The problem here is not police abuse of robots; the problem is police abuse, period. The fix is not new regulations about police robots, new watchdog (ha!) panels, etc; the fix is to hold police accountable, period.
Politicians love to divide and conquer. Concentrating on just police robots is one of those divisions.
Whenever I see that robot dog I just think it would be fun to just beat the shit out of it. That would be fun.
unless it's programmed to retaliate.
there is a video of a humanoid style robot where the human keeper knocks it down with a hockey stick, and the robot leaps back to its feet in half a second.
Then again, something like 97% voted for the initiation of force, and odds are they are the ones who will be disproportionately murdered by these terminators, compared to libertarians. Anything that improves our spoiler vote clout can't be all bad, right?
nice thought, but libertarians are probably more likely to be raided by robo-K9 units, trying to disrupt anyone collecting firearms or gold or drugs.
Or you could just comply, citizen.
Then you wouldn't have to worry about Robo-Dog hunting you down, relentlessly.
Robotics and AI are both in their infancy in terms of development. There will likely be frequent abuses of human rights at the hands of machines operated by law enforcement. That is the unfortunate nature of the LE discipline. In less than a human lifetime, our machines will dominate virtually every human endeavor. It is inevitable. How the human race will deal with the competition remains to be seen.
Robots are very important for the future, especially in situations where an officer's life can be saved by sending a robot up front.
Has no one seen "Terminator"?
Weaponizing robots is a horrible idea.
Seems like robo dogs would be vulnerable to EMP. Someone needs to work out how to do that. Problem solved.