Policy

Fake IDs

No face time



Facial recognition technology is often billed as a tradeoff between privacy and security. A recent American Civil Liberties Union (ACLU) report suggests that it's closer to a no-win deal, resulting in less privacy and precious little added security.

The ACLU report focuses on Ybor City, Florida, where police began installing surveillance cameras with facial recognition technology last July. Faces caught on camera were compared by a computer to a database of 30,000 wanted criminals, a scheme that resulted in a loud outcry from privacy advocates. One dismayed resident told the local alternative paper The Weekly Planet that "citizens of [Ybor] are now subjected to a police lineup for the crime of walking down the street."

In mid-August, the police department stopped using the technology, saying that because of redistricting, too many new officers would have to be trained to use the system. But Jay Stanley and Barry Steinhardt, authors of the ACLU report, suggest a more likely cause: The technology was a complete failure. It not only produced no arrests but also generated many false matches. In several cases it misidentified a potential suspect's sex, and it was easily fooled by less-than-perfect lighting.

"Right now," Stanley explains, "discussion of a reliable face-recognition package is science fiction, which is not our line of work."