Police Body Cameras May Get Facial Recognition
One of America's largest body camera suppliers has expressed interest in the technology.

Axon, America's largest manufacturer of police body cameras, may add facial recognition to its body cameras, a move that has sparked concerns among civil libertarians.
The company, formerly known as Taser, is an industry leader in electroshock weapons and body cameras, with more than 18,000 law enforcement customers in more than 100 countries. Within the United States, 38 out of the 68 major law enforcement agencies have purchased Axon body cameras.
On Thursday, Axon announced that it has created an artificial intelligence ethics board dedicated to developing responsible AI technologies. The board will meet twice a year to "guide the development of Axon's AI-powered devices and services," with a focus on how the products will impact communities. While the announcement did not directly mention facial recognition, company founder Rick Smith tells The Washington Post that such technologies are "under active consideration."
Body cameras with facial recognition could identify the faces of individuals using biometric data—facial features, retina scans, and other identifiers—in real time. Every person the officer comes across could be flagged and identified. Artificial intelligence would make sense of the footage. It could track suspects, scan for wanted individuals, and identify people onscreen.
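Conceptually, the real-time matching described above works by converting each face the camera detects into a numeric "embedding" and comparing it against stored embeddings of people on a watchlist. The following is a minimal, hypothetical Python sketch of that matching step; the embedding function, watchlist contents, and threshold are illustrative stand-ins, not a description of Axon's actual system.

```python
# Hypothetical sketch of real-time watchlist matching; not Axon's system.
import numpy as np

def embed_face(face_image: np.ndarray) -> np.ndarray:
    """Stand-in for a face-embedding model mapping a cropped face to a feature vector.
    A real system would run a trained neural network; this just derives a repeatable
    pseudo-random vector from the pixels so the example runs."""
    seed = abs(hash(face_image.tobytes())) % (2**32)
    return np.random.default_rng(seed).standard_normal(128)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(face_image, watchlist, threshold=0.8):
    """Return (name, score) for the best watchlist match above the threshold,
    or (None, threshold) if nothing clears it."""
    query = embed_face(face_image)
    best_name, best_score = None, threshold
    for name, stored_embedding in watchlist.items():
        score = cosine_similarity(query, stored_embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Each frame from the camera would be checked this way against the stored embeddings.
watchlist = {"wanted_person_1": np.random.default_rng(0).standard_normal(128)}
frame_face = np.zeros((64, 64, 3), dtype=np.uint8)
print(match_against_watchlist(frame_face, watchlist))
```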
While acknowledging the technology's potential for "bias and misuse," Smith argues that the tech's benefits cannot be ignored. "I don't think it's an optimal solution, the world we're in today, that catching dangerous people should just be left up to random chance, or expecting police officers to remember who they're looking for," he says in the Post. "It would be both naive and counterproductive to say law enforcement shouldn't have these new technologies. They're going to, and I think they're going to need them. We can't have police in the 2020s policing with technologies from the 1990s."
Cameras with real-time facial recognition are already being used in China and the U.K.
Currently, photographs of 117 million Americans—nearly half the country's adult population—are stored in a facial recognition database that can be accessed by the FBI.
According to a report from the Government Accountability Office, the FBI's use of face recognition technology has scant oversight and the bureau does little to test for false positives and racial bias when looking for suspects. Yet facial recognition technologies often struggle with identifying members of some ethnic groups.
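To see why testing for false positives matters at this scale, a rough back-of-the-envelope calculation is instructive. The per-comparison error rate below is an assumed figure chosen purely for illustration, not a measured one.

```python
# Back-of-the-envelope illustration with assumed numbers, not measured FBI figures:
# even a tiny per-comparison false-positive rate yields many bad candidate matches
# when a single probe photo is searched against a very large database.
database_size = 117_000_000        # photos reportedly accessible to the FBI
false_positive_rate = 1e-5         # assumed rate per comparison, for illustration only

expected_false_matches = database_size * false_positive_rate
print(f"Expected false matches per search: {expected_false_matches:,.0f}")
# ~1,170 innocent people surfaced as candidates for one search under these assumptions,
# which is why auditing error rates (and how they vary across demographic groups) matters.
```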
Forty-two different groups, including the American Civil Liberties Union, the National Association for the Advancement of Colored People, and the National Urban League, raised these concerns in an open letter to Axon's ethics board. In their statement, the groups called body cameras with real-time facial recognition "categorically unethical to deploy":
Axon has a responsibility to ensure that its present and future products, including AI-based products, don't drive unfair or unethical outcomes or amplify racial inequities in policing. Axon acknowledges this responsibility—the company states that it "fully recognize[s] the complexities and sensitivities around technology in law enforcement, and [is] committed to getting it right."
Certain products are categorically unethical to deploy. Chief among these is real-time face recognition analysis of live video captured by body-worn cameras. Axon must not offer or enable this feature. Real-time face recognition would chill the constitutional freedoms of speech and association, especially at political protests….Real-time face recognition could also prime officers to perceive individuals as more dangerous than they really are and to use more force than the situation requires. No policy or safeguard can mitigate these risks sufficiently well for real-time face recognition ever to be marketable.
Someone explain something about cops to me. It's not just that they're obese, it's that their heads seem even more obese than their bodies. Is it that their special diet of frosted pastries leads to this sort of physiological response, or are they actually even fatter than they appear when not in what must be remarkably slimming uniforms?
The camera adds 10 pounds, Tony. You of all people should know this.
That's funny.
Cops are the people who enforce those taxes and government programs you love. Everything government does is predicated on violence.
They are the poster children for a bloated bureaucracy.
How can one be anti-cop but also pro-government intervention in all other ways, one might ask?
It's pretty easy if you don't think about things very deeply.
I'm just pro-civilization. Only some kind of fucking moron would think people could get along in the modern world without a modern government.
So, not being one, I am free to critique aspects of our own civilization that lag behind the ideal, such as American criminal so-called justice and the walking heart attacks who enforce it.
Translation: Tony likes cops when they're forcing HIS agenda on everyone. He doesn't like cops when they're arresting people who support his party.
You want to do without law enforcement entirely? No, you like feeling secure in your person and your shit? So what are you talking about?
I accept the premise that cops are necessary, I just want them to be less fat and more accountable.
So, not monopolistic?
So... the reason for body cameras is to protect citizens from police misconduct which is a real thing.
The government potentially wants to use this technology to hurt citizens by tracking them and cataloging their faces without a required warrant based upon probable cause?
Steroid abuse.
Crap, meant for Tony.
They should try to get their hands on the ones that build muscle.
*sigh*
If a person doesn't break the law, that person does not have to worry about facial recognition software.
This would work better coming from OBL.
Sure. Because there are no examples in history of a government ever misusing technology to crack down on its political opponents or a disfavored minority.
According to a report from the Government Accountability Office, the FBI's use of face recognition technology has scant oversight and the bureau does little to test for false positives and racial bias when looking for suspects.
Our nation's preeminent law enforcement agency does not require oversight.
They're already police. They will police themselves.
Because making sure the cameras aren't racist is totes as important as, uh, not arresting the wrong people.
Body cameras with facial recognition could identify the faces of individuals using biometric data—facial features, retina scans, and other identifiers—in real time.
Not bloody likely any time soon. Whatever facial recognition they're adding is going to be after-the-fact, when they upload the video to a server and the server performs facial recognition on the previously recorded, uploaded video files. I have very strong doubts that those cameras will be able to process video, communicate via wireless, and cross-reference a database of a hundred gajillion faces in real time.
You would have to have Weissman scores only achieved in HBO comedies to successfully perform such a function.
Not to mention that, once again: Facial recognition software to be added to police body cameras; women and minorities hardest hit.
Researchers tested features of Microsoft and IBM's face-analysis services that are supposed to identify the gender of people in photos.
Uhhh.... uhhh... *looks around nervously*
Holy shh...
So, we can't tell the difference between a black person and a gorilla, so we'll just not allow it to classify anything as a "gorilla".
One of my coworkers was testing Photos on her phone. She had taken a picture of one of her friends in an awkward position at a party, and Google decided to label her friend "dog."
Basically, Photos is flawed, but hilarious.
If Google finds the doggy position awkward, there's something wrong with Google. James Damore was right.
Right, an "awkward position" at a "party" that Google labeled as "dog." Just say it: your coworker was at a gangbang and her friend was having a train ran on her in doggystyle.
It wasn't that so much. The awkward position made her face the camera directly. IOW, she's really ugly.
This reminds me of my five-year-old, who was looking over my shoulder as I was preparing a presentation on polio (I'm in medical school). When I inserted a picture of some children with polio into the slide, the first words out of his mouth were: "Why are those kids pretending to be camels?"
I still practically fall out of my seat every time I see that picture.
America is tearing itself apart over "race" so thoroughly that it's amazing we aren't sinking into complete irrelevance yet.
In 2015, a black software developer embarrassed Google by tweeting that the company's Photos service had labeled photos of him with a black friend as "gorillas."
Just remember, Google fires all of its uppity employees who raise thorny social issues or embarrass the company unnecessarily.
Yep, and of course don't let this color your judgment of self-driving cars' ability to weigh life and death autonomously.
Vehicle AI: Road obstacle detected.
Vehicle AI: Obstacle identified, Animal: Gorilla
Vehicle AI: Termination allowed.
-1 Harambe
+1 dick out?
Having my dick out is the major reason I dream for the future of self-driving cars.
Imagine what a boner-killer running over a pedestrian stumbling across the road will be.
It only takes one hand as it is.
I'm looking forward to the old in-car cocktail but without the expense of hiring a driver.
An indictment of the obstacle recognition AI for finding a gorilla on a road or of the navigation AI for finding a path with a gorilla on it for the obstacle recognition AI to be challenged by? Yes!
The disparity is the latest example in a growing collection of bloopers from AI systems that seem to have picked up societal biases around certain groups. Google's photo-organizing service still censors the search terms "gorilla" and "monkey" after an incident nearly three years ago in which algorithms tagged black people as gorillas, for example.
Whoopsie...
Turns out machines are racist as fuck too. Or is it GIGO? You decide!
I blame Fred Sanford.
+1 buffalo brownie
And retinal scans? What are they going to do, ask suspects to hold their eyes up to the camera lens? There is no way the cameras will have enough resolution to compare a retinal scan against a video image of a face taken from a distance.
Check out this guy who never saw Minority Report.
I saw Minority Report. You know it was fiction right?
In regard to #MeToo.
What Do We Do With These Men?
It's the NY Times. Worse, it's a bad article and I would feel bad for linking it if I weren't solidly certain the commentariat can handle anything. (We have Crusty. NY Times knows nothing.)
I'm also not sure a good article could be printed. We live in interesting times. It's bringing up a seriously good question, and perhaps we can start there. Worth a read.
The writer used to work at Jezebel. That's as far as I got.
But yeah, I dunno. Maybe the endgame is to turn men into women. I don't see how they'll be satisfied any other way.
Endgame.
I don't see how they'll be satisfied any other way.
Satisfied? You clearly haven't known many women.
Will police get special goggles so they can see the world in Terminator Vision, with subjects identified by name and potential threat characteristics?
Those don't even need to be special goggles. Just write "GOOD SHOOT" on the inside of each lens.
Or "DOG".
Or "DOG".
Less goggle, more helmet. Terminator Vision runs the risk of drawing their focus to actual targets, exculpating them, and improving their aim.
Just a matter of time. License plate readers are everywhere. On every cop car, and every street light. Every movement of your vehicle is in a database.
Between that and that tracking device otherwise known as a cell phone, you can't go anywhere without there being a record.
They can't yet pull you over without probable cause though. Until they can recognize who the driver is, I suppose.
Can the public have body cams with facial recognition that identifies the LEO in front of them as a habitual offender and alerts the user? I guess that would hinge on an accessible database.
"Can the public have body cams with facial recognition that identifies the LEO in front of them as a habitual offender and alerts the user?"
I thought that was generally what the badge meant.
Maybe they could combine those two things and have the camera automatically tase anyone it recognizes who has a warrant. We wouldn't want our Heroes in Blue to be using 1990s technology in the 2020s. What could go wrong? /sarc
Honestly, while that's a pretty fucked up suggestion it's also probably less terrible than what we have now.
True. Most people will survive a tasing, and it takes the decision to use force out of the hands of the 'roided up barely trained ape.
I guess some ethnicities really do all look alike.
The thing I find odd is that they say it's because of underrepresentation in the photo sample groups, but if that's true it seems like in the past three or four years they'd have fixed it...
You'd think there'd be enough mugshots for the database.
The thing I find odd is that they say it's because of underrepresentation in the photo sample groups, but if that's true it seems like in the past three or four years they'd have fixed it...
Without any first hand knowledge, this seems very much like the first answer the engineers gave that the lawyers and management would accept.
The funny thing is that a similar thing happened in photography decades ago (and arguably in other fields even further back), and that, despite all denials of reality, the issue is fundamental to the nature of light and photography. Excessively dark skin washes out nuance and depth in all but the most well-adjusted or staged lighting. The same is true for excessively white skin, except that it's biologically simple for most of us to avoid excessively light skin one way or another. Arguably, they still haven't solved the problem in photography, and it should be noted that nobody really cares about photographic discrimination against albinos and gingers.
Wondering if the facial recognition is pre- or post- beating?
Axon announced that it has created an artificial intelligence ethics board dedicated to developing responsible AI technologies.
Dunphy got hisself a new job?
And no one saw this coming - even when they put cameras on top of every stoplight and a minor software tweak can track all our movements....
On the bright side, such tech might motivate the pigs to turn the bodycams on.
Facial recognition presumably only works when they are turned on, so no problems there. LOL