New Orleans Police Secretly Used Prohibited Facial Recognition Surveillance for Years
Although the automatic alerts to New Orleans police have been paused, the system continues to send them to the Louisiana State Police and federal authorities.
The New Orleans Police Department (NOPD) secretly received real-time, AI-generated alerts from 200 facial recognition cameras throughout the city for two years, despite a city ordinance barring generalized surveillance of the public.
"Police increasingly use facial recognition software to identify unknown culprits from still images, usually taken by surveillance cameras at or near the scene of a crime," an exposé by The Washington Post explains. However, "New Orleans police took this technology a step further," automatically alerting officers with real-time updates of names and locations of possible matches of wanted suspects from a private network of cameras through a mobile app.
"This is the facial recognition technology nightmare scenario that we have been worried about," Nathan Freed Wessler, a deputy director for the American Civil Liberties Union's Speech, Privacy, and Technology project, told the Post. "This is the government giving itself the power to track anyone—for that matter, everyone—as we go about our lives walking around in public." According to Wessler, New Orleans is the first known instance in which a major American city has used artificial intelligence to identify people through live footage for the purpose of making arrests.
The use of these automatic alerts may have violated a city ordinance meant to protect the public's privacy from a generalized surveillance tool and prevent wrongful arrests due to software errors.
Passed in 2022 in response to New Orleans' post-pandemic crime wave, the Surveillance Technology and Data Protection Ordinance removed a previous prohibition on surveillance technology in criminal investigations to increase public safety. Mayor LaToya Cantrell said at the time that the NOPD needed "every tool available at their disposal" to keep the city's "residents, businesses and visitors safe." However, the ordinance stopped short of allowing the NOPD to utilize a "face surveillance system"—defined as "any computer software or application that performs face surveillance"—while limiting data collection to "only the minimum amount of personal information needed to fulfill a narrow well-defined purpose."
Violent crime in New Orleans has declined since 2022, but crime rates have also fallen in most major American cities that do not use real-time facial recognition surveillance systems.
Anne Kirkpatrick, superintendent of the NOPD since September 2023, paused the automatic alerts in April after learning about potential legal problems with using the system. Records obtained by the Post reveal that Kirkpatrick sent an email to Project NOLA, the nonprofit that provides the NOPD with facial recognition services, on April 8 stating "that the automated alerts must be turned off until she is 'sure that the use of the app meets all the requirements of the law and policies.'" The network of cameras remains in place.
While automatic pings of potential suspect matches to NOPD officers are paused, Kirkpatrick maintains that facial recognition technology is essential to law enforcement. On May 16, 10 inmates escaped from the New Orleans jail, prompting a manhunt (five inmates remain at large). Facial recognition is credited with the capture of two of the escaped inmates. Kirkpatrick told WVUE, the local Fox affiliate, that such a situation is "the exact reason facial recognition technology is so critical and well within our boundaries of the ordinance here." Bryan Lagarde, Project NOLA's executive director, confirmed that the NOPD is not currently using real-time, AI-generated alerts but is still utilizing facial recognition technology and footage from 5,000 cameras across New Orleans to track and apprehend the escapees. Lagarde described to WVUE an instance in which officers narrowly missed an inmate by a matter of minutes, suggesting that automated alerts might be necessary to protect public safety, despite the cost to privacy.
Currently, Project NOLA staff continue to receive the AI-generated alerts provided by the camera network. The location of wanted suspects is then conveyed to police via phone calls, texts, and emails. According to Lagarde, the system is still sending alerts to the Louisiana State Police and federal authorities.
The government's growing use of AI tools raises new challenges for Americans' civil liberties and privacy. Questions about accountability, due process, and auditing of the information these tools collect remain unanswered as the technology is deployed in the name of public safety.