
Facial Recognition and the Danger of Automated Authoritarianism

Hundreds of police departments are using facial recognition technology without oversight.

Ronald Bailey | 1.21.2020 3:10 PM

(Photo: Design Cells/Science Photo Library/Newscom)

Clearview AI, a tech startup, has created an app that enables law enforcement agencies to match photographs to its database of over 3 billion photos scraped from millions of public websites including Facebook, YouTube, Twitter, Instagram, and Venmo. For comparison, the FBI's photo database contains only 640 million images. According to The New York Times, some 600 law-enforcement departments, including federal, state, and local agencies, have already used Clearview AI's technology. "You take a picture of a person, upload it and get to see public photos of that person, along with links to where those photos appeared," explained the Times.
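
The workflow the Times describes (upload a photo, get back matching public photos with links) follows the standard pattern for modern face search: reduce every photo to a fixed-length embedding vector and rank the stored photos by similarity to the probe image. The sketch below is a generic, hypothetical illustration of that pattern, with random vectors standing in for the embeddings a real face-recognition network would produce; it is not based on Clearview AI's actual system.

```python
# Generic sketch of embedding-based face search: rank a gallery of stored
# photo embeddings by cosine similarity to a probe embedding. Hypothetical
# example only; nothing here reflects Clearview AI's code or data.
import numpy as np

def cosine_similarity(probe: np.ndarray, gallery: np.ndarray) -> np.ndarray:
    """Cosine similarity between one probe vector and each row of a gallery matrix."""
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return gallery @ probe

def search_gallery(probe_vec, gallery_vecs, gallery_urls, threshold=0.6, top_k=5):
    """Return (url, score) pairs for the most similar stored photos above a cutoff."""
    scores = cosine_similarity(probe_vec, gallery_vecs)
    best = np.argsort(scores)[::-1][:top_k]
    return [(gallery_urls[i], float(scores[i])) for i in best if scores[i] >= threshold]

# Toy usage: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(10_000, 128))                # 10,000 stored "faces"
urls = [f"https://example.com/photo/{i}" for i in range(10_000)]
probe = gallery[42] + rng.normal(scale=0.05, size=128)  # noisy shot of person #42
print(search_gallery(probe, gallery, urls))             # photo 42 should rank first
```

A production system would replace the brute-force scan with an approximate nearest-neighbor index, but the matching logic is the same.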

Moreover, Clearview AI's technology doesn't need straight-ahead mugshots to work effectively. "With Clearview, you can use photos that aren't perfect," Detective Sgt. Nick Ferrara of the Gainesville, Florida, police department told the Times. "A person can be wearing a hat or glasses, or it can be a profile shot or partial view of their face." No outside tests for the app's accuracy have been publicly reported, but the company claims in a 2019 FAQ to users that it:

…has the most accurate facial identification software in the world, with a 98.6% accuracy rate. This does not mean that you will get matches for 98.6% of your searches, but you will almost never get a false positive. You will either get a correct match or no results. We have a 30-60% hit rate, but we are adding hundreds of millions of new faces every month and expect to get to 80% by the end of 2019.
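
To put those figures in perspective, here is a back-of-the-envelope illustration of what a hit rate and a false-positive rate would mean across a large volume of searches. The search volume and false-positive rate below are assumptions, since the FAQ states neither.

```python
# Back-of-the-envelope arithmetic, not Clearview data. The FAQ quotes a
# 98.6% "accuracy rate" and a 30-60% hit rate but no false-positive rate,
# so the volume and false-positive rate below are illustrative assumptions.
searches_per_month = 100_000     # assumed volume across ~600 agencies
hit_rate = 0.45                  # midpoint of the claimed 30-60% range
false_positive_rate = 0.001      # assumed: 1 in 1,000 searches flags the wrong person

correct_ids = searches_per_month * hit_rate
wrong_ids = searches_per_month * false_positive_rate

print(f"Expected correct identifications per month: {correct_ids:,.0f}")
print(f"Expected misidentifications per month:      {wrong_ids:,.0f}")
# Even a per-search error rate that sounds negligible implies on the order
# of 100 people wrongly flagged each month at this volume, which is why
# independent accuracy testing matters.
```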

The current Clearview AI app works essentially as an investigative tool, helping police identify perpetrators or victims after a crime has occurred. The company is, however, developing facial recognition software that would make it possible for wearers of augmented-reality glasses to ID folks walking down a street in real time. Such a technology could, of course, easily be hooked into networked surveillance cameras so that government agents could track where a citizen is and with whom that citizen is interacting.

"Facial recognition is the perfect tool for oppression," write Woodrow Hartzog, a professor of law and computer science at Northeastern University, and Evan Selinger, a philosopher at the Rochester Institute of Technology. It is, they persuasively argue in Medium, "the most uniquely dangerous surveillance mechanism ever invented." Real-time deployment of facial recognition technologies would essentially turn our faces into ID cards on permanent display to the police.

"I've come to the conclusion that because information constantly increases, there's never going to be privacy," Clearview AI investor David Scalzo told The New York Times. "Laws have to determine what's legal, but you can't ban technology. Sure, that might lead to a dystopian future or something, but you can't ban it."

In fact, several cities have already banned police use of facial recognition technologies. And members of Congress are also now waking up to how the widespread use of this technology could impair our civil liberties. "In November [2019], Sens. Mike Lee (R–Utah) and Chris Coons (D–Del.) introduced the Facial Recognition Technology Warrant Act," reports Reason's Scott Shackford. "The bill would require federal officials to seek a warrant in order to use facial recognition technology to track a specific person's public movements for more than 72 hours."

Permitting police to track a person using facial recognition for less than three days without a warrant seems constitutionally questionable to me. After all, in Carpenter v. United States (2018), the Supreme Court required that the police obtain a warrant in order to search a person's cell phone location data. "A cell phone—almost a 'feature of human anatomy'—tracks nearly exactly the movements of its owner," Chief Justice John Roberts wrote in his majority opinion. "When the Government tracks the location of a cell phone it achieves near perfect surveillance, as if it had attached an ankle monitor to the phone's user." Since faces are actual features of human anatomy (no ankle monitors needed), surely a warrant should be required for police to track citizens by their faces.

Harvard cybersecurity expert Bruce Schneier argued in a recent New York Times column that merely banning facial recognition is not enough to protect ourselves from government snooping. Ubiquitous mass surveillance is increasingly the norm. "In countries like China, a surveillance infrastructure is being built by the government for social control," he wrote. "In countries like the United States, it's being built by corporations in order to influence our buying behavior, and is incidentally used by the government."

The data exhaust constantly emitted by our cell phones, license plates, digital payments, and credit cards is an intimate and comprehensive record of our lives. Right now, easy government access is hampered by the fact that the myriad private databases tracking us are disparate and unconnected. However, it is not hard to imagine how automated authoritarianism could arise quickly in the wake of a massive terrorist attack, as frightened citizens set aside concern for civil liberties and give in to government demands for access to all of the data that has been privately collected on each of us.

Given the growing prevalence of both government and private surveillance, Schneier argued, we Americans "need to have a serious conversation about … how much we as a society want to be spied on by governments and corporations—and what sorts of influence we want them to have over our lives." To forestall a dystopian future, setting strict limits on government use of facial recognition technology is a good place to start.


Ronald Bailey is science correspondent at Reason.

Facial Recognition | Surveillance | Police | Police Abuse | Criminal Justice

Comments (36)


  1. TrickyVic (old school)   5 years ago

    ""Automated Authoritarianism""

    Feature, not a bug.

  2. Diane Reynolds (Paul.)   5 years ago

    Do what they do in Asia, just wear a medical mask everywhere you go.

    1. icannotread   5 years ago

      Journalist tried that. Still recognized him.

    2. some guy   5 years ago

      Wearing medical masks outside approved medical facilities is now illegal on penalty of suicide.

  3. Juice   5 years ago

    One reason I do my best to keep my pictures off the internet.

    1. Diane Reynolds (Paul.)   5 years ago

      If they ever get Penis Recognition, I'm in big trouble though.

      1. Jerryskids   5 years ago

        Penis Recognition cameras are just Facial Recognition cameras fitted with wide-angle lenses.

        Is what I would have worked into a comment on how big the trouble was going to be.

  4. Diane Reynolds (Paul.)   5 years ago

    3 billion photos scraped from millions of public websites including Facebook, YouTube, Twitter, Instagram, and Venmo.

    Person: Hey Paul, whycome you no use social media? I can't reach you on Facesnap, Instabook or LynchedIn?

    Me: *Sends them this article.*

    1. Juice   5 years ago

      I've recently been using meetup just to meet new people and try some new shit. It's kind of awkward because I don't use my real name or picture, so when I show up I have to tell them I'm the douche with the goofy name and picture. Also tough when I tell people I have no social media presence whatsoever. That's the first thing people want now, to add you on FB or Snapchat or whatever. Ugh.

      1. GroundTruth   5 years ago

        Gawd I love being a luddite! It may be lonely, but it does preserve a bit of privacy.

  5. Diane Reynolds (Paul.)   5 years ago

    This does not mean that you will get matches for 98.6% of your searches, but you will almost never get a false positive.

    Believe this I do not.

    1. Á àß äẞç ãþÇđ âÞ¢Đæ ǎB€Ðëf ảhf   5 years ago

      Not even close. Billions of photos of billions of people, or maybe only hundreds of millions of people if you get clever about height and skin color, all with different lighting, angle, hair styles, hats, scarves, and other clothes .... the only way to limit false positives is to have very few matches at all.

      How fast do they claim this fool thing is, anyway? Bunch of liars.

  6. Diane Reynolds (Paul.)   5 years ago

    In fact, several cities have already banned police use of facial recognition technologies.

    Right, which means they're just doing it illegally.

  7. Overt   5 years ago

    "After all, in Carpenter v. United States (2018), the Supreme Court required that the police obtain a warrant in order to search a person's cell phone location data...Since faces are actual features of human anatomy (no ankle monitors needed), surely a warrant should be required for police to track citizens by their faces."

    While I don't think facial recognition should be used to track a person's movements, Carpenter doesn't seem applicable. Carpenter was about whether the FBI can force a wireless provider to give its privately held location data to be used in an investigation. In this case we are talking, I assume, about the government having cameras posted all over the place, and then using a third party to take those feeds and derive more information from them.

    This isn't about the government taking data resulting from a private transaction, but about the government making "better" use of the data they already get, by automating the corroboration of that data with public records.

    If you believe this facial recognition without a warrant is unconstitutional, you should believe that having public cameras recording all our daily lives is unconstitutional.

    1. Diane Reynolds (Paul.)   5 years ago

      If you believe this facial recognition without a warrant is unconstitutional, you should believe that having public cameras recording all our daily lives is unconstitutional.

      It makes one wonder if we should make voluntarily putting all of your personal details and photos on the web unconstitutional.

  8. EscherEnigma   5 years ago

    "need to have a serious conversation about … how much we as a society want to be spied on by governments and corporations—and what sorts of influence we want them to have over our lives."

    Yeah, if you think the conversation needs to be whether and how much we "want to be spied on", you've already lost, and your opinion, however well reasoned, isn't going to matter.

    The only questions that matter are "what is government allowed to do with it" and "how do we watch the watchers?"

    Mass surveillance is coming, and it's going to be ubiquitous in all major cities, and eventually in most small towns and rural areas too. That can't be stopped.

    All we have control over is how that'll be used. And that will need radical transparency, such that when the government tries to over-step what is allowed (and it will) we see it and know. If we don't find out about how the government over-stepped until a decade later because of Chelsea Manning 2.0, then it'll be too-late.

  9. sarcasmic   5 years ago

    I actually don't mind being tracked by teh evul corepourrayshuns. The worst they can do is try to sell me something. Can't say the same about an organization of men who claim the monopoly on the initiation of violence.

    That of course is the opposite of what the general consensus appears to be. Watch entertainment and it's always teh corepourrayshuns who rule Dystopia. The message being that we need government to protect us from those slavers who dare to provide goods, services, and jobs to willing takers.

    1. EscherEnigma   5 years ago

      The message being that we need government to protect us from those slavers who dare to provide goods, services, and jobs to willing takers.

      ...?

      I don't think I've ever read a piece of Cyberpunk fiction that was set-up as Government v. Corporation. It's always Individual v. Corporation, or sometimes Individual v. Individual (with the corps as a back-drop). The message was that we needed to be more self-reliant and individualistic, even when falling in line and becoming a corporate drone would be easier.

      It's a very pro-individuality genre.

      Also, a key part of such dystopian fiction is that the government doesn't have a monopoly on violence anymore, that consequence-free violence is a power shared by corporations and government.

      1. EscherEnigma   5 years ago

        Unless you're thinking of Hunger Games? I never read those books/watched those movies, so I can't say how well those fall in-line with what I'm talking about.

    2. Jerryskids   5 years ago

      The problem is that the evil corporations tracking you aren't just trying to sell you something, they're also selling all that tracking information to the government. There's no need for the government to spy on you when you're voluntarily making your life an open book to the bank and the credit card companies and Paypal, the cellphone service and the ISP and the cable company, the grocery store and GM and Amazon, Facebook and Twitter and Google. It's not an either/or choice with the globalist state and the globalist corporations working together to create the new globalist society.

    3. mtrueman   5 years ago

      "Watch entertainment and it’s always teh corepourrayshuns who rule Dystopia. The message being that we need government to protect us from those slavers who dare to provide goods, services, and jobs to willing takers."

      You don't need to watch entertainment for that. I thought that during the Bush administration there was legislation enabling phone subscribers to add their names to a 'no call list' to protect themselves from corporate sales propositions. Evidently Americans do indeed look to government to protect their privacy.

    4. GroundTruth   5 years ago

      This!

  10. icannotread   5 years ago

    Clearview was smart and didn't just use social media. If anyone at anytime posted a photo of you online they could have you.

    Reason's a little late on this story because it may be DOA.

    Thanks to a cert denial today in the Facebook v. Patel case, you can have Article III standing if you so much as sense there could be a privacy violation, so under Illinois' facial recognition law Clearview could wind up paying $5K for each violation. That could be each photo they scanned if it was used to further train the service. They could have to pay that to each person in Illinois. Picture owning that company and awakening to that realization.

    I was all ready to further the downfall of society by making my Abortion Watch website where I film women walking into abortion clinics, use the app to identify them, and list them on a website. (Never claimed they were getting an abortion. Just watching.) I'm 100% pro-choice too. I just like proving points as to why things are bad ideas. TrannyFinder and IsHeAHomo were other ideas. Guess I'm back to making my bread corrector where it detoasts toasted slices of bread. (Bread Mulligan patent pending)

  11. John Blutarsky   5 years ago

    LMAO aren't you the retard prick who put your fucking genome into a public database because you Fucking Love Science? And you're worried about facial recognition? HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

    God you're a fucking dildo Ron.

  12. Ben_   5 years ago

    Technology doesn't go backwards. If you don't trust your police, you have to vote to remove them from your community.

    If they are untrustworthy, prohibiting them from using a commonly available technology will do nothing. They will use it regardless.

  13. Echospinner   5 years ago

    I worry about the scammers more than the government.

    1. GroundTruth   5 years ago

      You're going to be lonely here!

      1. icannotread   5 years ago

        He's not wrong. The police are expected to abide by some form of rules. True, it may not be a locked door on your house, but it's a door instead of an open archway.

        1. GroundTruth   5 years ago

          Yes, the police are indeed expected to abide by some form of rules, but so are individuals (and their larger conglomerations), assuming the rules are clearly set out.

          But I can't tell the police to go to hell without it becoming an issue, whereas I can do that for Microsoft, Home Depot or Sears.

          1. GroundTruth   5 years ago

            Er, Sears.... change that to Amazon. Sears already has.

  14. mtrueman   5 years ago

    "To forestall a dystopian future, setting strict limits on government use of facial recognition technology is a good place to start."

    For libertarians and entrepreneurs a better place to start: How can I make money out of this? Unrecognizability masks. Spartacus masks. Rosa Luxembourg masks.

  15. NOYB2   5 years ago

    Simple:

    (1) prohibit government from using face recognition technology, or purchasing goods or services that use it, except by court warrant

    (2) make it legal again to conceal your face in public spaces, everywhere

    1. EscherEnigma   5 years ago

      You'd outlaw signs that require customers to remove ski masks before entering?

      Interesting.

    2. icannotread   5 years ago

      Preventing government use doesn't do a damn thing. You have to fully prevent the use of the technology.

      A police officer can't drink on the job, but when he's off the clock he can get smashed. If a police officer can't use facial recognition technology when he's on the job he can use it when he's off the clock.

  16. GroundTruth   5 years ago

    "I've come to the conclusion that because information constantly increases, there's never going to be privacy," Clearview AI investor David Scalzo told The New York Times. "Laws have to determine what's legal, but you can't ban technology. Sure, that might lead to a dystopian future or something, but you can't ban it."

    Sure Mr. Scalzo, you're right. The battle has already been lost on privacy, so why fight the inevitable?

    Paging Lao Tzu....

    1. icannotread   5 years ago

      Of course you can ban technology. Fully-auto rifles? High explosives? Cell phone jamming devices? That's all technology.
