The Volokh Conspiracy

Mostly law professors | Sometimes contrarian | Often libertarian | Always independent

OK to Use Evidence Indirectly Obtained Using Facial Recognition Software

The results of facial recognition software might not be admissible evidence—but the police are allowed to use them to generate admissible evidence.

From People v. Reyes, decided last week by Justice Mark Dwyer (N.Y. trial ct.); seems quite right to me:

Defendant Luis Reyes is charged with Burglary in the Second Degree and related offenses. Defendant moves to preclude trial testimony stemming from the use of facial identification software to identify the perpetrator of the crime in question. Not in issue here is that the burglar could be seen in crime scene videos. The case detective was able to recognize defendant as that individual after he examined a police file containing photographs of defendant.

Defendant's complaint is that the detective retrieved that file because of an earlier "hit" on defendant obtained by analyzing the crime scene videos with facial recognition software. Defendant asks the court to "preclude the People's use of the results of any use of NYPD's facial recognition software." …

For current purposes the court assumes these facts. On September 29, 2019, defendant committed a burglary at 507 West 113th Street in Manhattan, entering a mail room there to steal packages. Defendant's actions were recorded by security cameras. The detective assigned to lead the investigation obtained the videos. He made stills from them and sent those stills to the NYPD Facial Identification Section ("FIS") for analysis.

Using facial recognition software, the FIS located a single "possible match": one of the burglar's pictures possibly matched defendant's mug shot. The FIS returned a report to the detective bearing a prominent statement that the "match" was only a lead. The report specified that the match did not create probable cause for an arrest, which could be produced only by further investigation.

The case detective therefore obtained defendant's police file. In it he found a copy of the same mug shot along with photos depicting defendant's distinctive forearm tattoos. After studying those photos, the detective again viewed the crime scene videos and recognized that the burglar indeed was defendant.

He next prepared a probable cause I-card bearing three photos of the burglar taken from the crime scene videos—photos which displayed, among other features, his tattoos. The I-card did not include any other picture or defendant's name. On October 14, 2019, officers recognized defendant from that I-card and arrested him. {[T]he three photos of the burglar that were the key component of the I-card were in no way products of the software analysis of the crime scene videos.} …

"[F]acial recognition" involves the use of software to analyze the front and perhaps side portions of the head of an unknown person, usually as depicted in a photo or a video still. The software measures the location and contours of facial features, including hair. It next compares the results with those for photos of known individuals—photos that are digitally maintained for these comparison purposes—to select any possible "matches."

The authorities can then investigate whether the individual or individuals in the selected photos could be the unknown person. The results can show, for example, that an applicant for a driver's license has had licenses under different names.
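The comparison step the court describes — reducing a face to numeric measurements and searching a gallery of known photos for close matches — can be sketched in highly simplified form. This is a toy nearest-neighbor illustration under assumed data; the function names, vectors, and threshold are invented for illustration and bear no relation to the NYPD's actual software:

```python
import math

# Toy sketch of the matching step the court describes: each face is
# reduced to a vector of feature measurements, and an unknown face is
# compared against a gallery of known photos. All names and numbers
# here are illustrative assumptions, not any real system's data.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def possible_matches(unknown, gallery, threshold):
    """Return gallery entries whose vectors fall within the threshold.

    As the FIS report stressed, a hit like this is only an
    investigative lead, not identification evidence.
    """
    return [name for name, vec in gallery.items()
            if distance(unknown, vec) <= threshold]

gallery = {
    "mugshot_A": [0.12, 0.87, 0.45],
    "mugshot_B": [0.90, 0.10, 0.33],
}
unknown_face = [0.11, 0.85, 0.47]

print(possible_matches(unknown_face, gallery, threshold=0.1))
```

Real systems use learned embeddings and far higher-dimensional vectors, but the structure is the same: the software proposes candidates, and human investigation does the rest.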

To the best of this judge's knowledge, a facial recognition "match" has never been admitted at a New York criminal trial as evidence that an unknown person in one photo is the known person in another. There is no agreement in a relevant community of technological experts that matches are sufficiently reliable to be used in court as identification evidence. Facial recognition analysis thus joins a growing number of scientific and near-scientific techniques that may be used as tools for identifying or eliminating suspects, but that do not produce results admissible at a trial….

Some uses of facial recognition software are controversial. For example, many people fear that employment of such software will erode First Amendment rights by permitting unfriendly officials to identify and take action against those who demonstrate against government policies. That concern is especially pronounced given the ubiquity of security cameras in many big-city areas.

It may well be that legislative action should be taken to curtail the use of facial recognition techniques for such purposes. And this court understands what a "slippery slope" argument is.

The court notes, however, that these and other common concerns about facial recognition techniques seem dramatically divorced from the use of those techniques to develop leads from photographs of people taken as they commit crimes. And such photos are often obtained from private entities, not governmental ones, which employ cameras precisely to secure their safety from criminals.

That is in fact what occurred in this burglary case. No reason appears for the judicial invention of a suppression doctrine in these circumstances. Nor is there any reason for discovery about facial recognition software that was used as a simple trigger for investigation and will presumably not be the basis for testimony at a trial ….