Police Body Cameras May Get Facial Recognition

One of America's largest body camera suppliers has expressed interest in the technology.

Ryan Johnson | Wikimedia Commons

Axon, America's largest manufacturer of police body cameras, may add facial recognition to its body cameras, a move that has sparked concerns among civil libertarians.

The company, formerly known as Taser, is an industry leader in electroshock weapons and body cameras, with more than 18,000 law enforcement customers in more than 100 countries. Within the United States, 38 out of the 68 major law enforcement agencies have purchased Axon body cameras.

On Thursday, Axon announced that it has created an artificial intelligence ethics board dedicated to developing responsible AI technologies. The board will meet twice a year to "guide the development of Axon's AI-powered devices and services," with a focus on how the products will impact communities. While the announcement did not directly mention facial recognition, company founder Rick Smith tells The Washington Post that such technologies are "under active consideration."

Body cameras with facial recognition could identify the faces of individuals using biometric data—facial features, retina scans, and other identifiers—in real time. Every person the officer comes across could be flagged and identified. Artificial intelligence would make sense of the footage. It could track suspects, scan for wanted individuals, and identify people onscreen.

While acknowledging the technology's potential for "bias and misuse," Smith argues that the tech's benefits cannot be ignored. "I don't think it's an optimal solution, the world we're in today, that catching dangerous people should just be left up to random chance, or expecting police officers to remember who they're looking for," he says in the Post. "It would be both naive and counterproductive to say law enforcement shouldn't have these new technologies. They're going to, and I think they're going to need them. We can't have police in the 2020s policing with technologies from the 1990s."

Cameras with real-time facial recognition are already being used in China and the U.K.

Currently, photographs of 117 million Americans—nearly half the country's adult population—are stored in a facial recognition database that can be accessed by the FBI.

According to a report from the Government Accountability Office, the FBI's use of face recognition technology has scant oversight and the bureau does little to test for false positives and racial bias when looking for suspects. Yet facial recognition technologies often struggle with identifying members of some ethnic groups.
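A back-of-envelope calculation shows why untested false-positive rates matter at this scale. The per-comparison error rate here is an illustrative assumption, not a figure from the GAO report:

```python
# Why false-positive testing matters for one-to-many searches at database scale.
# Both numbers are illustrative assumptions, not figures from the GAO report.
database_size = 117_000_000   # photos in the FBI-accessible database
false_match_rate = 1e-5       # hypothetical false-match probability per comparison

# A single probe face compared against every enrolled photo:
expected_false_matches = database_size * false_match_rate
print(expected_false_matches)
```

Even a seemingly tiny per-comparison error rate yields on the order of a thousand false hits per search, which is why base rates dominate one-to-many identification.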

Forty-two different groups, including the American Civil Liberties Union, the National Association for the Advancement of Colored People, and the National Urban League, raised these concerns in an open letter to Axon's ethics board. In their statement, the groups called body cameras with real-time facial recognition "categorically unethical to deploy":

Axon has a responsibility to ensure that its present and future products, including AI-based products, don't drive unfair or unethical outcomes or amplify racial inequities in policing. Axon acknowledges this responsibility—the company states that it "fully recognize[s] the complexities and sensitivities around technology in law enforcement, and [is] committed to getting it right."

Certain products are categorically unethical to deploy. Chief among these is real-time face recognition analysis of live video captured by body-worn cameras. Axon must not offer or enable this feature. Real-time face recognition would chill the constitutional freedoms of speech and association, especially at political protests…. Real-time face recognition could also prime officers to perceive individuals as more dangerous than they really are and to use more force than the situation requires. No policy or safeguard can mitigate these risks sufficiently well for real-time face recognition ever to be marketable.



  1. Someone explain something about cops to me. It’s not just that they’re obese, it’s that their heads seem even more obese than their bodies. Is it that their special diet of frosted pastries leads to this sort of physiological response, or are they actually even fatter than they appear when not in what must be remarkably slimming uniforms?

    1. The camera adds 10 pounds, Tony. You of all people should know this.

      1. That’s funny.

    2. Cops are the people who enforce those taxes and government programs you love. Everything government does is predicated on violence.

    3. They are the poster children for a bloated bureaucracy.

    4. How can one be anti-cop but also pro-government intervention in all other ways, one might ask?

      1. It’s pretty easy if you don’t think about things very deeply.

      2. I’m just pro-civilization. Only some kind of fucking moron would think people could get along in the modern world without a modern government.

So not being this I am free to critique aspects of our own civilization that lag behind the ideal, such as American criminal so-called justice and the walking heart attacks who enforce it.

        1. Translation: Tony likes cops when they’re forcing HIS agenda on everyone. He doesn’t like cops when they’re arresting people who support his party.

          1. You want to do without law enforcement entirely? No, you like feeling secure in your person and your shit? So what are you talking about?

            I accept the premise that cops are necessary, I just want them to be less fat and more accountable.

            1. So, not monopolistic?

  2. So… the reason for body cameras is to protect citizens from police misconduct which is a real thing.

    The government potentially wants to use this technology to hurt citizens by tracking them and cataloging their faces without a required warrant based upon probable cause?

    1. Steroid abuse.

      1. Crap, meant for Tony.

      2. They should try to get their hands on the ones that build muscle.

  3. *sigh*

    If a person doesn’t break the law that person does not have to worry about facial recognition software.

    1. This would work better coming from OBL.

    2. Sure. Because there are no examples in history of a government ever misusing technology to crack down on its political opponents or a disfavored minority.

  4. According to a report from the Government Accountability Office, the FBI’s use of face recognition technology has scant oversight and the bureau does little to test for false positives and racial bias when looking for suspects.

    Our nation’s preeminent law enforcement agency does not require oversight.

    1. They’re already police. They will police themselves.

    2. does little to test for false positives and racial bias

      Because making sure the cameras aren’t racist is totes as important as, uh, not arresting the wrong people.

  5. Body cameras with facial recognition could identify the faces of individuals using biometric data—facial features, retina scans, and other identifiers—in real time.

    Not bloody likely any time soon. Whatever facial recognition they’re adding is going to be after-the-fact: when they upload the video to a server, the server will perform facial recognition on the previously recorded, uploaded video files. I have very strong doubts that those cameras will be able to process video, communicate via wireless, and cross-reference a database of a hundred gajillion faces in real time.

    You would have to have Weissman scores only achieved in HBO comedies to successfully perform such a function.

    1. Not to mention that, once again; Facial recognition software to be added to police body cameras, women and minorities hardest hit.

      1. Researchers tested features of Microsoft and IBM’s face-analysis services that are supposed to identify the gender of people in photos.

        Uhhh…. uhhh… *looks around nervously*

      2. Holy shh…

        In 2015, a black software developer embarrassed Google by tweeting that the company’s Photos service had labeled photos of him with a black friend as “gorillas.” Google declared itself “appalled and genuinely sorry.” An engineer who became the public face of the clean-up operation said the label gorilla would no longer be applied to groups of images, and that Google was “working on longer-term fixes.”

        More than two years later, one of those fixes is erasing gorillas, and some other primates, from the service’s lexicon. The awkward workaround illustrates the difficulties Google and other tech companies face in advancing image-recognition technology, which the companies hope to use in self-driving cars, personal assistants, and other products.

        So, we can’t tell the difference between a black person and a gorilla, so we’ll just not allow it to classify anything as a “gorilla”.

        1. One of my coworkers was testing Photos on her phone. She had taken a picture of one of her friends in an awkward position at a party, and Google decided to label her friend “dog.”

          Basically, Photos is flawed, but hilarious.

          1. If Google finds the doggy position awkward, there’s something wrong with Google. James Damore was right.

          2. She had taken a picture of one of her friends in an awkward position at a party, and Google decided to label her friend “dog.”

            Right, an “awkward position” at a “party” that Google labeled as “dog.” Just say it: your coworker was at a gangbang and her friend was having a train run on her in doggystyle.

            1. It wasn’t that so much. The awkward position made her face the camera directly. IOW, she’s really ugly.

          3. This reminds me of my five-year-old, who was looking over my shoulder as I was preparing a presentation on polio (I’m in medical school). When I inserted a picture of some children with polio into the slide, the first words out of his mouth were: “Why are those kids pretending to be camels?”

            I still practically fall out of my seat every time I see that picture.

        2. America is tearing itself apart over “race” so thoroughly that it’s amazing we aren’t sinking into complete irrelevance yet.

        3. In 2015, a black software developer embarrassed Google by tweeting that the company’s Photos service had labeled photos of him with a black friend as “gorillas.”

          Just remember, Google fires all of its uppity employees who raise thorny social issues or embarrass the company unnecessarily.

        4. Yep, and of course don’t let this color your judgment of self-driving cars’ ability to weigh life and death autonomously.

          Vehicle AI: Road obstacle detected.

          Vehicle AI: Obstacle identified, Animal: Gorilla

          Vehicle AI: Termination allowed.

            1. +1 dick out?

              1. Having my dick out is the major reason I dream for the future of self-driving cars.

                1. Imagine what a boner-killer running over a pedestrian stumbling across the road will be.

                2. It only takes one hand as it is.

                  I’m looking forward to the old in-car cocktail but without the expense of hiring a driver.

          1. An indictment of the obstacle recognition AI for finding a gorilla on a road or of the navigation AI for finding a path with a gorilla on it for the obstacle recognition AI to be challenged by? Yes!


      3. The disparity is the latest example in a growing collection of bloopers from AI systems that seem to have picked up societal biases around certain groups. Google’s photo-organizing service still censors the search terms “gorilla” and “monkey” after an incident nearly three years ago in which algorithms tagged black people as gorillas, for example.

        Whoopsie…

        Turns out machines are racist as fuck too. Or is it GIGO? You decide!

        1. I blame Fred Sanford.

          1. +1 buffalo brownie

    2. And retinal scans? What are they going to do, ask suspects to hold their eyes up to the camera lens? There is no way the cameras will have enough resolution to compare a retinal scan against a video image of a face taken from a distance.

      1. Check out this guy who never saw Minority Report.

        1. I saw Minority Report. You know it was fiction right?

  6. In regards to #MeToo.

    What Do We Do With These Men?

    “At first I thought they didn’t want me to participate in campus activities,” one told me. “Then I thought they didn’t want me to graduate. Now they don’t want me to have a job or be part of society. Do they want me to commit suicide? Is that what they want me to do? What is the endgame?”

    It’s the NY Times. Worse, it’s a bad article and I would feel bad for linking it if I weren’t solidly certain the commentariat can handle anything. (We have Crusty. NY Times knows nothing.)

    I’m also not sure a good article could be printed. We live in interesting times. It’s bringing up a seriously good question, and perhaps we can start there. Worth a read.

    1. The writer used to work at Jezebel. That’s as far as I got.

      But yeah, I dunno. Maybe the endgame is to turn men into women. I don’t see how they’ll be satisfied any other way.

      1. I don’t see how they’ll be satisfied any other way.

        Satisfied? You clearly haven’t known many women.

  7. Will police get special goggles so they can see the world in Terminator Vision, with subjects identified by name and potential threat characteristics?

    1. Those don’t even need to be special googles. Just write “GOOD SHOOT” on the inside of each lens.

      1. Or “DOG”.

    2. Less goggle, more helmet. Terminator Vision runs the risk of drawing their focus to actual targets, exculpating them, and improving their aim.

  8. Just a matter of time. License plate readers are everywhere. On every cop car, and every street light. Every movement of your vehicle is in a database.

    1. Between that and that tracking device otherwise known as a cell phone, you can’t go anywhere without there being a record.

    2. They can’t yet pull you over without probable cause though. Until they can recognize who the driver is, I suppose.

  9. Can the public have body cams with facial recognition that identifies the LEO in front of them as a habitual offender and alerts the user? I guess that would hinge on an accessible database.

    1. “Can the public have body cams with facial recognition that identifies the LEO in front of them as a habitual offender and alerts the user?”

      I thought that was generally what the badge meant.

  10. Axon, America’s largest manufacturer of police body cameras, may add facial recognition to its body cameras, a move that has sparked concerns among civil libertarians.

    The company, formerly known as Taser, is an industry leader in electroshock weapons and body cameras…

    Maybe they could combine those two things and have the camera automatically tase anyone it recognizes who has a warrant. We wouldn’t want our Heroes in Blue to be using 1990s technology in the 2020s. What could go wrong? /sarc

    1. Honestly, while that’s a pretty fucked up suggestion it’s also probably less terrible than what we have now.

      1. True. Most people will survive a tasing, and it takes the decision to use force out of the hands of the ‘roided up barely trained ape.

  11. Yet facial recognition technologies often struggle with identifying members of some ethnic groups.

    I guess some ethnicities really do all look alike.

    1. The thing I find odd is that they say it’s because of underrepresentation in the photo sample groups, but if that’s true it seems like in the past three or four years they’d have fixed it…

      1. You’d think there’d be enough mugshots for the database.

      2. The thing I find odd is that they say it’s because of underrepresentation in the photo sample groups, but if that’s true it seems like in the past three or four years they’d have fixed it…

        Without any first hand knowledge, this seems very much like the first answer the engineers gave that the lawyers and management would accept.

        The funny thing is that something similar happened (or is still happening) in photography decades ago, and arguably in other fields even further back, and that, despite all denials of reality, the issue is fundamental to the nature of light and photography. Excessively dark skin washes out nuance and depth in all but the most well-adjusted or staged lighting. The same is true for excessively white skin, except that it’s biologically simple for most of us to avoid excessively light skin one way or another. Arguably, they still haven’t solved the problem in photography, and it should be of note that nobody really cares about the photographic discrimination against albinos and gingers.

  12. Wondering if the facial recognition is pre- or post- beating?

  13. Axon announced that it has created an artificial intelligence ethics board dedicated to developing responsible AI technologies.

    Dunphy got hisself a new job?

  14. And no one saw this coming – even when they put cameras on top of every stoplight and a minor software tweak can track all our movements….

  15. On the bright side, such tech might motivate the pigs to turn the bodycams on.

  16. Facial recognition presumably only works when they are turned on, so no problems there. LOL
