Predictive policing—a concept seemingly pulled straight from the 2002 popcorn flick Minority Report—has become increasingly hot with law enforcement agencies over the past decade. The field tempts budget-minded officeholders and cops alike with its science-y promise to forecast where crimes will occur in the future and who will commit them, targeting risk while minimizing wasted resources. But it also holds the potential to justify hassling people based on what a computer program and biases entered as data say they might someday do. That's the basis of a recent lawsuit charging that a Florida sheriff's department has used predictive policing to harass the innocent.
"Predictive policing is the use of analytical techniques to identify promising targets for police intervention with the goal of preventing crime, solving past crimes, and identifying potential offenders and victims," according to a 2013 RAND Corporation report. Even in those early days of the field, though, the report acknowledged that "[t]he very act of labeling areas and people as worthy of further law enforcement attention inherently raises concerns about civil liberties and privacy rights."
In 2012, Reason's Ron Bailey observed that the developing field had some promise. But he warned that "[t]he accuracy of predictive policing programs depends on the accuracy of the information they are fed" and that "we should always keep in mind that any new technology that helps the police to better protect citizens can also be used to better oppress them."
Fast-forward a few years, and those concerns have been fulfilled in spades.
"Pasco County Sheriff Chris Nocco took office in 2011 with a bold plan: to create a cutting-edge intelligence program that could stop crime before it happened," the Tampa Bay Times reported last September. "What he actually built was a system to continuously monitor and harass Pasco County residents."
How does the Pasco County program live up to everybody's worst fears about acting on predictions of what people might someday do rather than what they have actually done?
"First the Sheriff's Office generates lists of people it considers likely to break the law, based on arrest histories, unspecified intelligence and arbitrary decisions by police analysts," the newspaper's report noted. "Then it sends deputies to find and interrogate anyone whose name appears, often without probable cause, a search warrant or evidence of a specific crime."
"Make their lives miserable until they move or sue," is how a former deputy described the department's tactics to reporters.
Now a group of county residents, represented by the Institute for Justice, is suing the county over the department's conduct. They describe years of harassing visits by cops who dropped by because the system identified them as potential offenders. When the residents lost patience with the continued police presence in their lives, officers deployed the red tape of the endless regulatory state against them to encourage compliance or simply to cause pain.
"Code enforcement is a common tactic to compel cooperation," reports the Institute for Justice. "One deputy said they would 'literally go out there and take a tape measure and measure the grass if somebody didn't want to cooperate with us.' In Robert's case, deputies cited him for tall grass, but failed to notify him of the citation. Then, when he failed to appear for a hearing that he was never told was happening, they arrested him for failure to appear."
The citations didn't have to stick to work as intended. The slow torture of tickets, arrests, and disrupted lives drove some to pick up and move out of Pasco County to avoid harassment—by that particular department, at least.
The Pasco County lawsuit isn't the first filed over predictive policing, though it appears to be the only one to date that addresses how the approach is used in practice. Given the field's relative youth, other lawsuits have sought transparency on just what sort of information and calculations are used to make predictions. Many civil liberties advocates are concerned that shiny tech talk is being used to add a scientific gloss to what police already want to do.
"[W]hile big data companies claim that their technologies can help remove bias from police decision-making, algorithms relying on historical data risk reproducing those very biases," worries Tim Lau of the Brennan Center for Justice, which sued the New York City Police Department for information about its predictive policing program.
"[I]n numerous jurisdictions, these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies," argues a 2019 NYU Law Review article by researchers with the university's AI Now Institute. "If predictive policing systems are informed by such data, they cannot escape the legacies of the unlawful or biased policing practices that they are built on."
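The feedback loop those researchers describe can be illustrated with a short, purely hypothetical simulation (every number here is invented for illustration): two areas with identical true crime rates, where skewed historical records steer all patrols to one area, and crime is only recorded where officers are present to observe it.

```python
# Hypothetical sketch of a predictive-policing feedback loop.
# Both areas have the SAME underlying crime rate; only the records differ.
TRUE_RATE = {"A": 0.3, "B": 0.3}   # assumed identical true crime rates
recorded = {"A": 20, "B": 10}       # skewed historical records: A was over-patrolled
CRIME_OPPORTUNITIES_PER_DAY = 10    # encounters a patrol could observe per day

for day in range(365):
    # "Predictive" allocation: send the patrol to the area with the
    # most recorded crime -- i.e., the model's "promising target."
    target = max(recorded, key=recorded.get)
    # Crimes only enter the data where officers are present to record them.
    recorded[target] += round(CRIME_OPPORTUNITIES_PER_DAY * TRUE_RATE[target])

print(recorded)  # {'A': 1115, 'B': 10} -- A's record balloons, B's is frozen
```

Because area A starts with a larger record, it receives every patrol, generates every new data point, and ends the year looking dramatically more "criminal" than area B, even though the underlying rates were identical by construction. Richer models soften but do not necessarily eliminate this dynamic.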
Those policing systems can affect a lot of lives. Over five years, the Pasco County predictive program targeted almost 1,000 people, according to the Tampa Bay Times. Roughly 10 percent of those targeted were younger than 18, and some had only one or two arrests to bring them to the attention of the authorities. Once on the radar, those in the system receive repeated visits at all hours of the day and find themselves on the receiving end of tickets or handcuffs at the slightest excuse. Given the spider's web of laws and regulations in which Americans now live, police have no difficulty finding a reason to fine or arrest those whom their algorithms say are worthy of special attention.
Has the torment of some county residents at least made others safer? That doesn't seem to be the case.
"Pasco's drop in property crimes was similar to the decline in the seven-largest nearby police jurisdictions," the newspaper's September report noted. "Over the same time period, violent crime increased only in Pasco."
Whatever the field's potential for channeling law enforcement effort to where it's most needed, and away from places it's unnecessary, predictive policing has become just another excuse for the authorities to hammer those who rub them the wrong way.