Wrongful Arrest in Detroit Demonstrates Why Police Use of Facial Surveillance Technology Must Be Banned
The good news is that Boston has just barred law enforcement from using facial recognition technology.

Detroit police wrongfully arrested Robert Williams and jailed him for 30 hours, largely on the basis of a facial recognition match to a grainy video of a jewelry store robber. When the police got around to questioning Williams, 18 hours after his arrest, they showed him two photos of the robber and asked if they were pictures of him. At that point, The New York Times reports, Williams held one of the images beside his face and said, "You think all black men look alike?" The officers recognized immediately that there was no resemblance to the suspect in the video. All the same, Williams was not released until hours later and only after paying a $1,000 personal bond.
After Williams held up the photo, one of the officers made a crucial mistake: He said "the computer must have gotten it wrong." Police and prosecutors generally do not want to reveal that they have been using facial recognition technology to identify and arrest suspects. They especially don't want defendants and their lawyers to question the accuracy of the technology in court later. They know that many studies have shown serious problems with the technology's accuracy, particularly with respect to identifying individuals who belong to racial or ethnic minorities.
In Williams' case, the Detroit police sent an image from the jewelry shop's surveillance camera to the Michigan State Police. That agency, using DataWorks Plus facial recognition technology, ran the image through the state's driver's license database, which spit out Williams' photo as a suspect. Williams' photo was sent to the Detroit police as an "investigative lead report," which stated it was "not a positive identification" and "is not probable cause for arrest." Then, amazingly, the Detroit police showed a six-pack photo lineup containing Williams' driver's license photo to the shop's security consultant. That consultant wasn't there for the robbery; she'd only seen the same blurry surveillance video as the police. She nevertheless identified Williams as the culprit, and Williams was arrested.
Two weeks after his arrest, the prosecutor moved to dismiss the case—but "without prejudice," which means that Williams could later be charged with the crime.
Williams eventually got in contact with the Michigan branch of the American Civil Liberties Union (ACLU), which is now lodging a complaint on his behalf against the police department. The ACLU obtained a court order requiring the prosecutor's office to turn over Williams' complete file, including the warrant request and any body camera or dashboard camera footage of his arrest. As of this week, the prosecutors are still stonewalling, defying the court order by refusing to provide the required documents or claiming that they could not be located. The ACLU is asking for an absolute dismissal of the case, an apology, and the removal of Williams' information from Detroit's criminal databases.
After The New York Times' story about Williams appeared, the prosecutor's office issued a statement noting that "this case should not have been issued based on the [Detroit Police Department] investigation, and for that we apologize. Thankfully, it was dismissed on our office's own motion. This does not in any way make up for the hours that Mr. Williams spent in jail." While Williams' case has been expunged from the record, the prosecutor says that a legal technicality prevents it from being dismissed with prejudice.
The Williams case is unlikely to be the only false arrest based on faulty facial recognition technology. "The sheer scope of police face recognition use in this country means that others have almost certainly been—and will continue to be—misidentified, if not arrested and charged for crimes they didn't commit," writes Clare Garvie, a senior associate with Georgetown Law's Center on Privacy & Technology.
The good news is that more and more cities around the country are stepping in to stop the police use of facial recognition technology. Yesterday, Boston's city council adopted an ordinance making it unlawful for any Boston official to "obtain, retain, possess, access, or use any face surveillance system or information derived from a face surveillance system." It also bars Boston from entering into contracts with any third parties for such facial recognition services. And Boston police may not request facial surveillance information from outside agencies like the FBI, though they may use such info if such an agency provides it.
Earlier this month, tech giants Amazon and Microsoft declared temporary moratoria on selling their facial recognition services to law enforcement. That's not nearly enough.
Given the growing prevalence of surveillance, Harvard cybersecurity expert Bruce Schneier argued in a recent New York Times column that we "need to have a serious conversation about…how much we as a society want to be spied on by governments and corporations—and what sorts of influence we want them to have over our lives." If we want to prevent a dystopian future of automated authoritarianism, strict limits on the government's use of facial recognition technology are a good place to start.
Obviously the solution is mandated facemasks with state-issued license numbers clearly visible.
Skynet is racist.
https://www.huffpost.com/entry/google-black-people-goril_n_7717008
Burn that bitch down
Wrongful Arrest in Detroit Demonstrates Why Police Must Be Banned
Yeah, the characterization is weak.
Wrongful Arrest in Detroit Demonstrates Why Police Use of Facial Surveillance Technology Must Be Banned
The system picked out someone as a *lead* and explicitly stated that the lead did not constitute *probable cause*.
At that point, the cops don't bother to look at the photo *that they later admit* looks nothing like the guy?
And how do they do a photo lineup with no one noticing that the photo lineup looks nothing like what's on the video?
Stinks of bureaucracy. I'm guessing the photo lineup guy is absolved of responsibility for noticing obvious non-matches - he's just supposed to record the witness's assertion of a match.
Then the next set of cops don't see the video, and go looking for the guy based on the "witness identified" picture from the mug shot database.
This is all process failure.
The *real* problem with the technology is that unlike what Mommy told you, you're not really that unique of a snowflake. People look alike, and with a big enough database, they're going to find *someone* who looks close to even good pictures. Maybe *lots* of someones.
All blacks don't look alike, but for anyone, there's gonna be lots of somebodies who look pretty close, particularly to people who don't know either.
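The "lots of somebodies" point above is easy to make concrete with back-of-envelope arithmetic: even a tiny per-comparison false match rate, multiplied across a statewide photo database, yields multiple innocent lookalikes per search. The numbers below are illustrative assumptions, not measured figures for any real system or the Michigan database.

```python
def expected_false_matches(database_size: int, false_match_rate: float) -> float:
    """Expected number of innocent people flagged as matches for one probe image.

    Assumes each of the database_size comparisons independently produces a
    false match with probability false_match_rate (a simplifying assumption).
    """
    return database_size * false_match_rate

# Hypothetical: a 10-million-photo driver's license database and a
# 1-in-1,000,000 per-comparison false match rate.
print(expected_false_matches(10_000_000, 1e-6))  # -> 10.0 innocent lookalikes
```

Even under these charitable assumptions, the search is expected to return several false candidates per probe, which is exactly why the state police labeled the result an investigative lead rather than probable cause.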
Incompetents like to blame the tech, as they do statistics, for their own incompetence in using it.
I expect Democrats to start resolving these problems in January as part of broader reform of the criminal justice system, largely without help from or regard for the opinions of authoritarian Republicans.
Detroit’s one of those cities run entirely by Democrats for decades, retard.
And black Democrats at that. Well over 2/3 of the D.P.D. is black, they haven't had a white chief of police in 48 years, and the current black county prosecutor has been in office since 2004.
So maybe not a result of systemic racism, then?
These problems were around back during the first half of Obama's first term when the Democrats held a majority in both the House and the Senate. If they didn't deal with it then, why should anyone think they will deal with it now?
" If they didn’t deal with it then, why should anyone think they will deal with it now?"
Because they have bills in the House (or just passed) that will at least make a solid effort to help the issue. That is dealing with it.
And Tim Scott had a Bill in the Senate that’s a non-starter with Democrats because he’s an Uncle Tom and Nancy Pelosi said it’s evidence that Republicans are trying to get away with murdering George Floyd. You’re a special kind of useful idiot aren’t you?
The Senate bill is a non-starter because it is very weak and will not lead to any change in police behavior. Why support a bill that will not achieve the goals you want? Also, the way politics works is that once a "reform" is passed, even a weak reform, it could be another 20 years until the politics aligns to pass another. It is better to pass no bill now than to pass a bad bill now.
What part of the bill, that you clearly read, would you change?
User error is bad enough on a personal level.
Face recognition technology might have helped in the case where the legacy technology (cop eyes and "brains") failed.
https://www.instagram.com/p/CBzQXbAJqD7/
Any technology will make mistakes, including the fingerprint from the 2004 Madrid bombings that was matched to a Portland lawyer who had never traveled there.
But we use fingerprints because they are pretty good. Like DNA, they might be wrong, and definitely don't say when the individual might have been at that location, or what the individual might have done at that location. So there is always the need to interpret data that at best localizes a person to a place, but not a time or activity.
Same with facial recognition. Certainly has the potential to improve on other ways that the judicial system can mistakenly identify people by using complementary data. Such stories of mistaken identity using merely names abound with a simple google search on "man with same name held in jail".
I suppose facial recognition technology is pretty flawed or someone would have used it to determine if Virginia Gov. Northam was the guy in blackface in the famous photo. If he wasn't, then he was the guy in the KKK regalia. Had he been a Republican, of course, I would be referring to him as former governor Northam.
OK, but are we also going to outlaw eye-witness testimony?
I will bet any of you any amount that a comparison of people vs. technology in facial recognition will show a significant greater error rate for people. If we are concerned with mistaken identity then surely we want to get rid of the most error-prone methods, right?
Hell, get rid of any method with any error or room for manipulation. Seems to be the premise here.
Geez without eyewitness testimony there would be a lot fewer people in prison. So many are there because of nothing more.
This. I wonder why "he-said she-said" is the gold fucking standard of evidence. People lie, make shit up all the time, especially under pressure.
Ideally all law would be handled by robots. They would read the law, read the case files, and execute the law, with zero emotional input. If this produces bad results, change the law.
Heh heh, I see what you did.
Execute the law.
So ol' Billy was right?
Mr. Bailey, like the New York Times article authors who preceded him, is incorrect. Despite numerous articles decrying inaccuracy and sex and race bias in machine face matching, there seem to be no comparable reports that compare its accuracy and biases with those that have been in use for a great many years. The main ones are witness reports, line ups, mug books, and police analyst comparison of surveillance video with public records. All of these have been evaluated to some degree and found unreliable, inaccurate, and biased. The reasonable question, mentioned in an earlier comment here, is how machine matching compares to people in these matters.
The problem with Mr. Williams' arrest in Detroit is not with the search method that flagged him as a lead, but the fact that the police appear to have accepted it uncritically, despite the warning that accompanied it. The police did little or no followup work to test the possibility, they and the prosecutor thought it justified an arrest warrant, and a judge issued it without probable cause. To compound their errors the police reportedly were rude in serving it, and they all persisted beyond reason as it became apparent that no evidence justified the arrest, but were sluggish and grudging about admitting their errors (although it is not clear they actually have done more than the bare minimum in that respect).
The correction is not to abandon machine face matching but to use it within its limits, as should be done with all identification methods but all too often is not. If I were the victim I would seek a modest cash settlement to be withheld from the direct compensation of the police officers, their supervisors, the prosecutors, and the judge - everyone involved except the state police analyst, who, unlike the others, seems to have done her part of it correctly.
The technology spits out candidates for a match; it's up to the officers to verify the match by effing looking at the pictures. Obviously, that didn't happen here. That's not a problem with the technology, it's a problem with the humans.
Ain't nothing in the union contract about looking at no stinking pictures.
If the computer say "this is him", then it is.
Same thing with busting down doors; if the computer says this is the address, in they go.
Well, then put it effing in there. Forcing police to do their jobs properly is the right policy. Taking away police tools because bad police officers use them incorrectly is generally the wrong approach.
Why don't they just use the same programs as the TV NCIS & FBI?
Those guys wrap up a case an hour with no lab schedule issues..