Robots

Massachusetts Police Test Out Robot Dogs. Is Dystopia on Its Way?

Don’t be afraid of the robopups, but make sure we leash law enforcement to keep officers from misusing them.


The Massachusetts State Police has temporarily added a robot dog to its bomb squad, leading, naturally, to concerns that Skynet and the dystopia of Black Mirror are upon us. We don't have to let it come to that, though.

Boston Dynamics, inventor of the adorable/creepy robot dog it calls Spot, let the Massachusetts State Police borrow one of its pups from August to November. The state police signed a lease agreement and quietly used the robopup for "remote mobile observation" in two incidents, according to a state police spokesman. This is the first use of a Boston Dynamics robot dog by a law enforcement agency.

The Massachusetts chapter of the American Civil Liberties Union obtained a copy of the lease agreement and shared it with Boston's NPR affiliate WBUR. The agreement doesn't say much about how the bomb squad is permitted to use Spot, but it does forbid the bomb squad from taking and posting photos of Spot in use; it also forbids both the state agency and Boston Dynamics from advertising Spot's use by the police.

The agreement's vagueness worried the Massachusetts ACLU because it places no restrictions on how police might use Spot, though it does note that the police are testing the robot's ability to navigate and inspect "potentially dangerous environments."

The stated intent is for Spot to serve as an upgrade to the current bomb robots used by law enforcement agencies across the country, not as a weapon. 

The Netflix series Black Mirror stoked such fears with its 2017 episode "Metalhead," which revolved around killer robot dogs and was inspired by videos of Boston Dynamics' robots.

https://www.youtube.com/watch?v=GM3GM299orc

The Massachusetts ACLU wants to know just what the police are doing with these robots. In WBUR's reporting, Michael Perry, the vice president of business development at Boston Dynamics, said the lease agreement requires that the robot not be used to "physically harm or intimidate people." But that clause does not appear in the version of the lease agreement that the Massachusetts ACLU helped to make public. While it's certainly not the norm, we do have one case of a bomb robot being used to kill: Dallas police deployed one against a mass shooting suspect in 2016. Black Mirror may stretch to extremes with its fully autonomous murderhound, but it's not outrageous to worry about future police plans.

What the Massachusetts ACLU wants is for police to be transparent about policies for Spot's use and for state lawmakers to enact legislation to control what law enforcement may do with these robots:

"We just really don't know enough about how the state police are using this," [Massachusetts ACLU Director of the Technology for Liberty Program Kade] Crockford said. "And the technology that can be used in concert with a robotic system like this is almost limitless in terms of what kinds of surveillance and potentially even weaponization operations may be allowed." …

"We really need some law and some regulation to establish a floor of protection to ensure that these systems can't be misused or abused in the government's hands," Crockford said. "And no, a terms of service agreement is just insufficient."

These concerns are similar to fears about police departments incorporating unmanned drones into their operations. Drones can be very useful in helping police and first responders scope out dangerous locations and assist in rescue operations, but without transparent policies and strict requirements, such technologies can easily violate people's privacy and Fourth Amendment rights through unwarranted secret surveillance.

This doesn't mean police should refrain from using new technology like drones and robopups. Rather, it means they should have strict policies for their use and face consequences if—or when—they fail to follow them.