Amazon Facial Recognition Confuses Members of Congress with Criminals
The ACLU stunt is intended to warn against using tech to identify suspects.

An American Civil Liberties Union experiment with Amazon's facial recognition software showed the technology confused 28 lawmakers with the mugshots of criminals.
Since this is Congress we're talking about, we should be clear: The lawmakers aren't criminals. The pictures that Amazon's "Rekognition" software matched to them were of different people.
This stunt is intended to serve as a wake-up call for lawmakers and law enforcement leaders about the potential problems of using facial recognition software to identify suspects. Privacy and technology experts and activists have warned for years that the technology remains far too flawed to be used to identify criminal suspects, and that it struggles particularly with differentiating among the faces of minorities.
The experiment here bore out that concern. Only 20 percent of the members of Congress are racial minorities. But 39 percent of the members of Congress who were mistaken for criminals were racial minorities.
The ACLU warns of the potential consequences of widespread implementation of such flawed recognition software:
If law enforcement is using Amazon Rekognition, it's not hard to imagine a police officer getting a "match" indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins. Or an individual getting a knock on the door from law enforcement, and being questioned or having their home searched, based on a false identification.
An identification — whether accurate or not — could cost people their freedom or even their lives. People of color are already disproportionately harmed by police practices, and it's easy to see how Rekognition could exacerbate that. A recent incident in San Francisco provides a disturbing illustration of that risk. Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle.
Amazon is marketing Rekognition to police to be used as an identification tool, and the ACLU notes that a sheriff's department in Oregon is doing exactly what the ACLU did here, matching people's faces to mugshot databases.
A representative for Amazon told The New York Times that the ACLU used the tool differently from how the company expects law enforcement to use it. The ACLU used the default setting of 80 percent confidence in a match; Amazon recommends that police use a 95 percent threshold. "It is worth noting that in real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment," Amazon's rep said in a statement.
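The threshold dispute is easier to follow with a sketch of how such a setting works: the system scores how similar two face signatures are and reports a "match" only above a cutoff. This is an illustrative toy (the embeddings, names, and scoring function here are invented; Rekognition's actual model is proprietary), but it shows why lowering the threshold produces more matches:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def find_matches(probe, gallery, threshold):
    """Return names of every gallery face whose similarity clears the threshold."""
    return [name for name, emb in gallery.items()
            if cosine_similarity(probe, emb) >= threshold]

# Toy embeddings; a real system uses high-dimensional vectors from a neural net.
mugshots = {
    "suspect_a": [0.9, 0.1, 0.2],
    "suspect_b": [0.1, 0.9, 0.3],
}
probe = [0.8, 0.3, 0.3]  # the face being searched against the database

loose = find_matches(probe, mugshots, threshold=0.80)   # default-style setting
strict = find_matches(probe, mugshots, threshold=0.99)  # stricter setting

print(loose)   # the looser threshold reports a "match"
print(strict)  # the stricter threshold reports nothing
```

At the looser cutoff, a near-miss becomes a "match"; at the stricter one, it doesn't — which is exactly the gap between the default the ACLU tested and the setting Amazon says police should use.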
I can't imagine how anybody would find that reassuring. Not only do law enforcement officers have a lengthy history of stubbornly arresting and imprisoning people in cases of mistaken identity; Zuri Davis recently noted that one police chief was just flat-out arresting random innocent men in order to clear burglary cases. Imagine being able to blame it on technology.
And keep in mind, in the midst of a federal government implementing a remarkably harsh crackdown on immigrants (both legal and illegal), the Department of Homeland Security is rolling out a facial recognition system in airports that will scan everybody boarding international flights, including American citizens, all for the purpose of trying to catch illegal immigrants with expired visas. We already see cases of immigration officials detaining American citizens—some of them for months or even years—over mistaken identities.
But proponents note that facial recognition software has a place and does serve a purpose when it's accurate. Officials used facial recognition tools to identify the alleged shooter at the Capitol Gazette in Maryland. Given the man's historical obsession with that newspaper, though, it was only a matter of time before they figured out who he was.
The ACLU and dozens of other civil rights groups want Amazon to stop offering this surveillance tool to police, and they're calling on Congress to enact a moratorium on the use of facial recognition tools until the software is improved to the point that these mistakes are much less likely.
Confuses? I don't think so.
Now do autonomous vehicles...
Yeah, I don't think he knows what that means.
"There is no distinctly American criminal class - except Congress." Mark Twain
" The lawmakers aren't criminals. "
Not everyone shares your opinion.
"Since this is Congress we're talking about, we should be clear: The lawmakers aren't criminals."
Come on Shackford...how can I take the rest of this piece seriously when this is the second sentence of your article?
What other permutations of the "but members of Congress ARE criminals" joke are there?
"The people on the Wanted list complained that the facial recognition software confuses them with members of Congress."
Only humans are capable of drawing a zoom box around a portion of a digital image and unpixelating the zoomed in image.
I've tried that on certain videos of Japanese origin, and it doesn't work.
Currently, yeah.
Humans are still much better at recognizing faces from bad photos than computers are. We might eventually get to the point where computers can "zoom and enhance" a photo beyond its original resolution, but we're not there yet.
Sub-pixel enhancement has probably existed for decades.
When you're talking about video, you have a stream of images to work your magic.
So where's the lie?
An American Civil Liberties Union experiment with Amazon's facial recognition software showed the technology confused 28 lawmakers with the mugshots of criminals.
Feature, not bug.
And so I was like "No joke I could include will top whatever people come up with in the comments so don't even bother with the joke."
I'm offended to see your crass attempt to placate us. No one here in the comments deserves anything good.
Also, I'm offended that your alt-text seems to imply that looking like the cover of Omikron is in anyway a bad thing.
The pictures that Amazon's "Rekognition" software matched to them were of different people.
Future primary challengers.
Since this is Congress we're talking about, we should be clear: The lawmakers aren't criminals.
Listen, Shackford. If you're going to just straight-up lie to us, why should we take the rest of the article seriously?
Yeah, cite your source!
Define "confuses".
Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment," Amazon's rep said in a statement.
As much as I distrust facial recognition technology, Amazon is right here. At least for now, I don't think that the Department of Public Works is automatically issuing warrants for everyone with a hat size of 7 1/2 based on facial recognition hits. Even your bullshit 'red light' camera has a human do the final review.
I understand what the ACLU is warning about, but I think they would do better to scrutinize how the tech is used more than how the tech actually works.
I think I'm legitimately okay with limiting the technology the government has access to. I do not trust them with it, and I do not trust many people to understand the technology well enough to meaningfully stand against its abuses.
I think the government should be hamstrung.
I believe that's an impossible standard. It's the genie-out-of-the-bottle problem. I think the better approach is to make sure we know how, when and where those tools are being used. In some cases, that will probably rely on whistle blowers and leaked press reports because government will lie to us about their use of the technology anyway-- there's precedent for that.
Except we know from prior experience that cops are stupid and lazy as fuck. They will absolutely leave it on the default setting of 80%.
They could easily just turn it down further if they wanted to. This is a detail that is easy to gloss over too, or play off as user error.
All it probably does is compute a distance between two faces, some real number between 0 and 1, and then apply a threshold above which they call it a match. Drop that number down to, say, 50 percent and you'd probably match just about every human.
That'll make for some shitty CSI episodes.
If it gives too many results, they'll turn it down to make less work. The ideal is one result each time, so the brain can continue to remain off.
Right, then we should know they're using it on the default setting and rake them over the coals for it. Because if we say "don't use it", they'll use it anyway and then just lie about it.
Where is Abby when you need her?
Well... not convicted anyway.
Well most are unindicted at least.
Amazon Facial Recognition Confuses Members of Congress with Criminals
So you're saying that it works.
Technically, it would be "other criminals"....
Problem was that Amazon had 'Predictive' enabled on the software.
Facts not in evidence.
The software has a hard time distinguishing between minorities - so Alexa is a bigot?
To Alexa, they all look alike.
She's not looking at their face.
That isn't necessarily the right conclusion, given the higher arrest rates among some minorities. The rate of mistakes could be exactly the same across groups, but with very different population pools (and base rates) to start with.
You'd expect the "false alarms" to be roughly proportional to how many *criminals who got caught* looked like you, not how many people in the general population looked like you.
Those numbers differ by race.
Facial recognition results are one bit of evidence. They're a screening tool. Similar to any other fact people might match on.
If you want to really have a hissy fit, realize that you can upload your own raw DNA data to ancestry sites to find relatives. Naturally, the police can do this too.
Getting within the range of 2nd and 3rd cousins drastically focuses the suspect pool.
"It is worth noting that in real world scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment"
Screening.
Profiling
So Amazon recommends using a 95% threshold but ships the product with a default threshold of 80%.
Or is it more likely that the Amazon rep is making up new standards on the fly in an attempt to deflect a PR (and civil liberties) disaster?
Bug or feature. You decide.
Is Face/Off the antidote?
https://www.youtube.com/watch?v=95VvTW1FvS8
I come for the alt text, I stay for the article, I get waylaid by the comments.
I wish other Reason writers did alt text. It's a real treat when Scotty "Love Shack" Shackford has an article up. Sometimes I wish Love Shack and 2chili would rebel and chuck the proverbial Reason tea into the proverbial harbor. They seem like the most likely candidates to at least request that KMW bring comrade Gillespie in for a spanking.
This can be used in the meantime until they can get everybody to implant a chip in their body.