
Amazon Facial Recognition Confuses Members of Congress with Criminals

The ACLU stunt is intended to warn against using tech to identify suspects.


(Photo: Monsit Jangariyawong / Dreamstime.com)

An American Civil Liberties Union experiment with Amazon's facial recognition software found that the technology falsely matched 28 lawmakers to the mugshots of criminals.

Since this is Congress we're talking about, we should be clear: The lawmakers aren't criminals. The pictures that Amazon's "Rekognition" software matched to them were of different people.

This stunt is intended to serve as a wake-up call for lawmakers and law enforcement leaders about the potential problems of using facial recognition software to identify suspects. Privacy and technology experts and activists have been warning for years that the technology remains far too flawed to be trusted to identify criminal suspects, and that it struggles particularly with differentiating between the faces of racial minorities.

The experiment here bore out that concern. Only 20 percent of the members of Congress are racial minorities. But 39 percent of the members of Congress who were mistaken for criminals were racial minorities.

The ACLU warns of the potential consequences of widespread implementation of such flawed recognition software:

If law enforcement is using Amazon Rekognition, it's not hard to imagine a police officer getting a "match" indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.

An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it's easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle.

Amazon is marketing Rekognition to police to be used as an identification tool, and the ACLU notes that a sheriff's department in Oregon is doing exactly what the ACLU did here, matching people's faces to mugshot databases.

A representative for Amazon told The New York Times that the ACLU used the tool differently than Amazon expects law enforcement to use it. The ACLU left the software at its default setting of 80 percent confidence in a match; Amazon recommends that police use a 95 percent threshold. "It is worth noting that in real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment," Amazon's representative said in a statement.
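For readers wondering where that number actually lives, it is a single parameter on the API call. Here is a minimal sketch in Python using Amazon's boto3 SDK, assuming a mugshot database has already been indexed into a Rekognition collection; the collection name and photo file are hypothetical:

    import boto3

    # Hypothetical sketch of a Rekognition face search against a mugshot collection.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    # Photo of the person to look up (hypothetical file).
    with open("lawmaker_photo.jpg", "rb") as f:
        photo_bytes = f.read()

    response = rekognition.search_faces_by_image(
        CollectionId="mugshot-collection",  # hypothetical, pre-indexed collection
        Image={"Bytes": photo_bytes},
        MaxFaces=5,
        # The service's default similarity cutoff is 80 percent; Amazon says
        # police should raise it to 95. Anything scoring below this number is
        # silently dropped, so this one value decides what counts as a "match."
        FaceMatchThreshold=80,
    )

    for match in response["FaceMatches"]:
        print(match["Face"]["ExternalImageId"], match["Similarity"])

Raising FaceMatchThreshold to 95 merely filters out weaker matches after the fact; it does not make the underlying model any better at telling faces apart.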

I can't imagine how anybody would find that reassuring. Not only do law enforcement officers have a lengthy history of stubbornly arresting and imprisoning people over cases of mistaken identity; as Zuri Davis recently noted, one police chief was just flat-out arresting random innocent men in order to clear burglaries. Imagine being able to blame it on technology.

And keep in mind that, as the federal government carries out a remarkably harsh crackdown on immigrants (both legal and illegal), the Department of Homeland Security is implementing a facial recognition system in airports that will scan everybody boarding international flights, including American citizens, all for the purpose of trying to catch illegal immigrants with expired visas. We already see cases of immigration officials detaining American citizens—some of them for months or even years—over mistaken identities.

But proponents note that facial recognition software has its place and serves a purpose when it's accurate. Officials used facial recognition tools to identify the alleged shooter at the Capital Gazette in Maryland. Given the man's long-running obsession with that newspaper, though, it was only a matter of time before they figured out who he was.

The ACLU and dozens of other civil rights groups want Amazon to stop offering this surveillance tool to police, and they're calling on Congress to enact a moratorium on the use of facial recognition tools until the software is improved to the point that these mistakes are much less likely.