Welcome to the Panopticon
Not quite--at least it's just profiling now.
In 1785, the utilitarian philosopher Jeremy Bentham conceived the "Panopticon," a model prison in which all inmates would be observable at all times by unseen guards. Though never built, the Panopticon has lingered in the Western imagination and continues to be invoked as the worst-case scenario whenever methods of social control or surveillance are discussed. In the wake of last week's terrorist attacks and calls for heightened, technologically sophisticated surveillance measures, a question should be asked: Are we, in our search for safety, slipping toward a virtual Panopticon?
"We're in a new world where we have to rebalance freedom and security," House Democratic Leader Richard A. Gephardt (D-Mo.) declared after last week's terrorist attacks. "When you're at war, civil liberties are treated differently," concurred Senate Minority Leader Trent Lott (R-Miss.). Americans are debating whether they must give up freedom for security, but modern technologies may help us avoid any such trade-off.
One such technology is face-recognition software. Several companies have developed face-recognition systems that scan faces in a crowd, digitize them, and match them against faces stored in a database. Earlier this year, Tampa, Florida installed the FaceIt face-recognition technology in its 36-camera closed-circuit TV (CCTV) monitoring system in the Ybor City entertainment district. The system has proven highly controversial and has engendered some rather inventive protests.
Face-recognition software creates a matrix, a rectangular array of numbers, similar to a numerical version of a word-search puzzle. The matrix identifies 80 distinctive points on each face, such as the distance between the eyes or the length of the lips. To achieve a match, 14 of those points must align with a database picture, such as a mug shot. (Fingerprints, by comparison, require 11 points to match the whorls and ridges that make up each person's unique print.) In a recent federal government test sponsored by the Department of Defense Counterdrug Technology Development Program, face-recognition software achieved a 99.3 percent accuracy rate under ideal lighting. It is unlikely, however, that such accuracy can be achieved on the street, where lighting and weather vary.
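The matching rule described above, in which a candidate face matches a database entry when enough measured points agree, can be sketched in a few lines of Python. This is purely illustrative: the 80-point and 14-match figures come from the article, but the feature representation and tolerance value are invented assumptions, not FaceIt's actual (proprietary) algorithm.

```python
# Illustrative sketch of threshold-based face matching.
# Assumption: each face is reduced to a dict mapping named measurements
# (e.g. "eye_distance") to normalized numbers. The tolerance is made up.

def count_matching_points(probe, candidate, tolerance=0.02):
    """Count the facial measurements that agree within a tolerance."""
    return sum(
        1
        for name, value in probe.items()
        if name in candidate and abs(value - candidate[name]) <= tolerance
    )

def is_match(probe, candidate, required=14):
    """A face matches a database entry when at least `required` of its
    measured points align -- the threshold the article describes."""
    return count_matching_points(probe, candidate) >= required
```

Note that the threshold (14 of 80 points) trades false negatives against false positives: raising it makes the system miss more genuine matches, while lowering it flags more innocent passers-by, a tension the article returns to below.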
After the Tampa project was announced, the American Civil Liberties Union and House Majority Leader Dick Armey (R-Texas) issued a joint statement condemning it as a threat to privacy. Others have likened it to subjecting citizens to a "virtual lineup" or "facial frisking."
"There is something creepy about the technology," says civil libertarian Michael Godwin, a policy fellow at the Center for Democracy and Technology in Washington, D.C. "But I don't think that a federal court would necessarily find using it unconstitutional."
It's even more likely to pass muster if it is implemented with safeguards such as "no match, no memory": operators ensure that the system retains no images unless they match a face in the criminal database. Strict rules about whose faces go into government databases would also have to be established. Tampa's face-recognition database, for example, contains 30,000 images of people with outstanding felony warrants. Signs in Tampa also warn visitors that they are being monitored by "Smart CCTV."
Face-recognition software forces us to consider the difference between freedom and privacy. We find the technology creepy because we don't want to be monitored. A huge part of the attraction of cities is the anonymity they offer: you can be whoever you wish to be, in any number of different contexts, without your neighbors being any the wiser.
But is face-recognition technology really the equivalent of "facial frisking"? In a traditional frisk, a citizen is stopped by police, questioned, and patted down, often in public. This clearly interferes with citizens' day-to-day activities and subjects them to public embarrassment. Facial-recognition systems, on the other hand, do none of that.
Assuming that police adhere to rules about erasing images, and that the systems do not produce many false positives identifying law-abiding citizens as criminal suspects (both big assumptions), facial-recognition technology would not seem to limit the freedom of Americans to go about their business unmolested.
In the past we could rely on the limitations of government for a sense of privacy, but new technologies are erasing those limitations. If Americans are to avoid living in a Panopticon, we will have to be vigilant in imposing limits on monitoring technologies. But if we do, we will not only maintain our freedom, but enjoy the additional security that the new technologies can provide.