New York City Mayor Eric Adams wants to expand law enforcement's use of facial recognition software in an effort to combat his city's growing gun violence. But experts say the software can violate privacy and civil liberties through secret surveillance and a propensity for inaccuracy.
"We will…move forward on using the latest technology to identify problems, follow up on leads, and collect evidence. From facial recognition technology to new tools that can spot those carrying weapons, we will use every available method to keep our people safe," said Adams in a press release addressing his plan last month. He also calls for it in his Blueprint to End Gun Violence.
The New York Police Department (NYPD) uses facial recognition software to match photos against its mugshot database, as well as against images scraped from social media. It can then track a match across the roughly 15,000 cameras the department has placed throughout the city, numbers documented in an Amnesty International report warning about the invasiveness of the program.
Michael Sisitzky, senior policy counsel at the New York Civil Liberties Union (NYCLU), tells Reason that New York's current surveillance operations can already track who goes where and when. He says more facial recognition software would only make the spying worse, with the technology likely to target minorities. He cites a government report from the National Institute of Standards and Technology, which found that most facial recognition software misidentified Asian and African American faces 10 to 100 times more often than white faces. Sisitzky says the software may therefore misidentify minority suspects, leading police to target innocent people.
"The NYPD already disproportionately targets African Americans, and coupling that with facial recognition software would be bad," says Sisitzky. In Detroit, Robert Williams was arrested after facial recognition software misidentified him as a different man sought for shoplifting watches. The software Detroit police used to "catch" Williams was provided by DataWorks Plus, the same company that supplies facial recognition software to the NYPD.
Surveillance experts also say facial recognition software can be inaccurate or manipulated. Jameson Spivack, a policy associate at Georgetown Law's Center on Privacy & Technology, a legal research group studying surveillance, says the technology does not positively identify a face. Instead, it assigns a probability that one face matches another. Errors occur in both directions: the software can mistake two different faces for a match, or fail to match the same face in two separate photos.
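Spivack's point about probabilistic matching can be illustrated with a minimal sketch. This is not the NYPD's or DataWorks Plus's actual system; it assumes a generic embedding-based matcher in which each face is reduced to a vector of numbers, and all names, dimensions, and the 0.8 threshold are hypothetical. The key behavior it demonstrates is that the software never answers "yes" or "no," only "how similar," and an analyst-chosen cutoff decides who counts as a match:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_candidates(probe, gallery, threshold=0.8):
    """Score a probe face against every gallery face and return those above
    the threshold, best first. No result is a positive identification;
    each is only a probability-like score."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in gallery.items()]
    hits = [(name, score) for name, score in scored if score >= threshold]
    return sorted(hits, key=lambda t: t[1], reverse=True)

# Toy 4-dimensional "embeddings"; real systems use hundreds of dimensions.
gallery = {
    "person_a": [0.90, 0.10, 0.30, 0.20],
    "person_b": [0.10, 0.80, 0.40, 0.10],
    "person_c": [0.85, 0.15, 0.35, 0.25],
}
probe = [0.88, 0.12, 0.30, 0.22]

for name, score in rank_candidates(probe, gallery):
    print(f"{name}: {score:.3f}")
```

Note that the probe scores highly against two different gallery entries: a candidate list, not an identification. Both failure modes Spivack describes live in the threshold, since setting it too low sweeps in lookalikes (false matches) while setting it too high misses the true person (false non-matches).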
Spivack says the NYPD's current methods are unscientific. He cites a Center on Privacy & Technology public records analysis that found the NYPD would sometimes photoshop images or upload celebrity photos to generate matches. In one such instance, police thought a suspect in a security camera photo resembled actor Woody Harrelson. When the suspect's security footage photo returned no match, they uploaded a photo of Harrelson instead. When the software turned up matches for the actor, police identified and arrested one of the matches they thought resembled the suspect from the footage.
"What's concerning is that the face recognition algorithm returned him as a possible match when the photo that they submitted was not him. It was Woody Harrelson," said Spivack.
Daniel Schwarz, a privacy and technology strategist at the NYCLU, says the organization has endorsed a new bill in New York State that would prohibit law enforcement from using biometric surveillance, including facial recognition software, to track people, citing civil liberty and privacy concerns. Sisitzky notes that local governments in San Francisco, Oakland, and Seattle have banned facial recognition software, so there is reason to believe New York State can as well.
This post has been updated to correct the name of the organization Sisitzky and Schwarz belong to.