As Bail Reforms Gain Steam, Civil Rights Groups Warn About Overreliance on Risk Assessment Tools
Data and algorithms can help end biases, but they can also help perpetuate them.

More than 100 civil rights organizations have endorsed a letter warning that using risk assessment tools and algorithms as a replacement for money bail could perpetuate rather than reduce inequities in pretrial systems.
The letter, titled "A Shared Statement of Civil Rights Concerns," outlines six principles they want to see applied to any tool used to calculate whether a defendant will be freed prior to trial. Their stated goal is to make sure that pretrial detention is the last resort, used only when absolutely necessary for public safety, and used only after a rigorous, adversarial process where the defendant can challenge his or her detention.
The groups that have signed the letter include the American Civil Liberties Union, the National Association for the Advancement of Colored People (NAACP), the Drug Policy Alliance, and the National Council of Churches.
In a press call Monday, the NAACP Legal Defense Fund's Monique Dixon argued: "Pretrial detention reform that addresses the injustice of people being jailed because of their poverty is urgently needed, but substituting risk assessment instruments for money bail is not the answer."
This comes amid a growing national push to eliminate the use of money bail—a push that all these groups also support. On any given day, America has around half a million people in jail who have not yet been convicted of a crime. Many of these inmates are behind bars not because they're clear-cut flight risks or threats to their communities but because they cannot afford to pay bail.
To change this system, courts are looking for other tools to assess whether a defendant is likely to skip town or to commit other crimes while freed. Pretrial assessments are intended to objectively calculate the risk factors connected to any specific defendant.
One of the chief factors considered is a defendant's past criminal record. But that record can be tainted by a history of unequal enforcement, particularly in poorer communities and minority neighborhoods. As the report warns, "Automated predictions based on such data—although they may seem objective or neutral—threaten to further intensify unwarranted discrepancies in the justice system and to provide a misleading and undeserved imprimatur of impartiality for an institution that desperately needs fundamental change."
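To make that concern concrete, here is a minimal, hypothetical sketch (not the statement's example and not any vendor's actual instrument) of how a score built on recorded arrests can inherit enforcement bias: two people with identical underlying conduct can receive different scores simply because one lives in a more heavily patrolled neighborhood.

```python
# Hypothetical illustration only -- not the Arnold Foundation's PSA or any real tool.
# The point: a score driven by recorded arrests reflects enforcement intensity,
# not just underlying behavior.

def risk_score(prior_arrests: int, prior_failures_to_appear: int) -> int:
    """Toy additive score: more recorded arrests and FTAs -> higher score."""
    return 2 * prior_arrests + 3 * prior_failures_to_appear

# Two defendants with the same underlying conduct (say, occasional marijuana use)
# but different neighborhoods. The heavily patrolled neighborhood produces more
# recorded arrests for the same behavior.
lightly_patrolled = risk_score(prior_arrests=0, prior_failures_to_appear=0)
heavily_patrolled = risk_score(prior_arrests=2, prior_failures_to_appear=0)

print(lightly_patrolled)  # 0
print(heavily_patrolled)  # 4 -- same conduct, higher "objective" score
```

The weights and inputs here are invented for illustration; the point is only that seemingly objective arithmetic passes through whatever skew exists in the records it consumes.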
In short, these groups want to avoid replacing one thoughtlessly applied system—bail schedules—with an opaque automated system that can end up in the exact same place: with people needlessly stuck in jail even though they have not yet been convicted.
Rejecting risk assessment tools entirely is probably not a realistic goal, given that there's still a huge fight ahead to reduce the dependence on money bail. So the letter puts forth six principles that the authors think should shape how assessment tools are implemented. They want the data to be used in a way that reduces the racial disparities within the justice system; they don't want assessment tools ever to recommend preventative detention, calling instead for a release hearing with procedural safeguards; they want pretrial detention or supervision to be imposed only after an adversarial hearing where a prosecutor must make a case that there's an identifiable risk if the defendant is released; they want pretrial assessment tools to be transparently operated, independently validated, and open to challenge; they want pretrial assessments to calculate and communicate the likelihood of success (that is, that the defendant will return to court and not commit further crimes) rather than failure; and they want pretrial assessment tools to be developed with community input and subjected to regular review and oversight.
That seems like a lot. But consider New Jersey, which has almost completely abandoned money bail and leans heavily on an algorithmic pretrial risk assessment tool, developed by the Laura and John Arnold Foundation, to score risk factors. While the tool does put a score on those risk factors and recommends release (often with monitoring) or detention, the way the state has implemented the assessment tracks many of these principles. The assessment score never determines on its own whether a defendant is released. The prosecutor has to request a hearing if he or she wants to detain a defendant prior to trial; the judge cannot simply decide to detain the defendant on his or her own. Then the prosecutor has to make a case for holding the defendant, and the defendant is represented by an attorney throughout that process. New Jersey operates on an assumption of release: To keep a defendant in jail, the prosecutor must prove that there is no way to make certain he or she will show up for court and refrain from committing crimes while released. The information that feeds the assessment score is simple to understand, transparent, and based entirely on the defendant's own record of behavior.
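As a rough sketch of that sequence, and assuming only what is described above (the names and structure below are invented, not New Jersey's actual rules or the Arnold Foundation's code), the decision flow might be modeled like this: the score produces a recommendation, while detention requires a prosecutor-initiated hearing and a judicial finding.

```python
# Hypothetical sketch of the decision flow described above -- not New Jersey's
# actual system. The score informs a recommendation; it never detains on its own.

from dataclasses import dataclass

@dataclass
class Assessment:
    score: int           # output of the risk instrument
    recommendation: str  # e.g. "release", "release with monitoring"

def pretrial_outcome(assessment: Assessment,
                     prosecutor_requests_hearing: bool,
                     judge_finds_no_conditions_suffice: bool) -> str:
    """Default is release; detention requires a hearing the prosecutor requests
    and a judicial finding that no release conditions can manage the risk."""
    if not prosecutor_requests_hearing:
        return assessment.recommendation          # score alone cannot detain
    if judge_finds_no_conditions_suffice:
        return "detain pending trial"
    return assessment.recommendation

print(pretrial_outcome(Assessment(4, "release with monitoring"),
                       prosecutor_requests_hearing=False,
                       judge_finds_no_conditions_suffice=False))
# -> "release with monitoring"
```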
Not all assessment tools are that transparent or so focused on a defendant's own behavior and record. Others pull in demographic and employment data that the defendant cannot control and that are influenced by policing choices.
Note that New Jersey's bail reforms did not happen in a vacuum. They were part of a significant overhaul of New Jersey's criminal justice system intended to discourage police from arresting people for low-level crimes in the first place and to speed up trials significantly. New Jersey's incarceration rate was already dropping before the bail reforms were implemented, but it has continued to do so under the new system.
In response to these concerns, the Arnold Foundation released a statement saying it agrees with the goals of limiting pretrial detention and reducing racial bias. It adds that it doesn't intend its pretrial assessment tool to supplant or replace smart decision-making on the part of judges; it's supposed to aid the process. "We believe—and early research shows—that this type of data-informed approach can help reduce pretrial detention, address racial disparities and increase public safety," the foundation says.
Read the full letter of concern here.
This blog post has been updated to correct Monique Dixon's organizational association.
How are any of these people planning on paying for this revenue loss?
What loss? The interest on bail goes to the bondsmen, not the government.
Prison ain't free.
No, and neither is jail, so the government saves money by not holding people before trial. Where is the "revenue loss"?
The challenge is to accurately collectivize people while still preserving individual rights.
My thoughts and prayers are with them.
Algorithms are racist because they are too objective.
That's one of the current challenges of AI research: Finding a reliable way to keep artificial intelligences and algorithms from engaging in WrongThink.
What makes it especially hard is that the programmers aren't allowed to engage in WrongThink while doing it.
Reality is supposed to have a progressive bias. Any data that does not have a progressive bias must be untrue. Didn't you know that, Brett?
It is a one, or it is a zero. How can that be biased?
Because it affects more black people than we want it to.
That's how it's rayciss.
Racist.
It's awfully insensitive to refer to a digit as a "zero".
AI algorithms aren't the problem - humans gathering the data that the AI uses is the problem.
A vaguely-related example: More black men are arrested for marijuana-related crimes, but whites and blacks use marijuana at a similar rate. Why? The likely cause is black neighborhoods being patrolled more aggressively, leading to more arrests. If you put that data into a system that predicts where crime will occur, it will lead to under-patrolling white neighborhoods and over-patrolling black neighborhoods.
Prejudiced data will lead to prejudiced AI, with a false veneer of "it's ones and zeros, so can't be racist".
No, the likely cause is Blacks getting high in public, whereas white pot smokers do so inside their homes.
They are arrested more for burning marijuana in public, perfectly reasonable given demographics.
If you have other data, you need to share it.
(Note Politifact erroneously scores Gillibrand's claim "true".)
Yep. I live in a middle-class, low-crime neighborhood, about 70% white. A police patrol car passes my house about every twenty minutes, around the clock. If someone made a habit of standing in their driveway here smoking weed, they would certainly get busted. Nobody does that, even though I'm sure there are plenty of pot smokers in the neighborhood.
Computer "predictive" modeling is already being used in Santa Cruz. The problem is... once the pros have something to hack, they will do that and case areas based on where the computer doesn't tell cops to go. If it's really sophisticated, they can target an area to create stats away from high value targets.
In response to these concerns, the Arnold Foundation released a statement saying it agrees with the goals of limiting pretrial detention and reducing racial bias.
What about public safety and justice? Those apparently have no play in things at all. If one developed the perfect system whereby only those who are not a threat to commit crimes are released, and that system disproportionately locked up minorities, would such a system be invalid? I wouldn't think so. But I think the people who are so concerned about "reducing racial bias" would. Unless there is something in the rules that says "hold brown people longer," any concern about racial bias is just being angry over reality. Either the criteria are effective at protecting the public or they aren't. Race has nothing to do with that question. So racial bias is not even a valid concern for facially neutral criteria.
Well, since everyone has a natural right to self defense, and the US Constitution prohibits governments from infringing on that right, how can there be a problem?
risk assessment tools and algorithms as a replacement for money bail
"We'd like to release you, but the algorithm says there's a 0.83 probability that you may be a potential terrorist."
You know not everyone who is arrested is innocent. Criminals really do exist.
"We'd like to give you a kidney transplant right now, but the algorithm says there's a 0.83 probability that you will die if we don't wait another six weeks."
So what's your point?
How does NJ's non-bail system compare in terms of getting people to show up to trials and preventing people from committing crimes while waiting for trial?
If bail has little effect on people's behavior and they'll mostly either show up or not based on their own characters and the severity of the crime, then there's not much point in using bail.
If you get significantly better results with bail than without it, it would be foolish to abandon it.
I think the use of bail is a practical question not a matter of principle.
(Assuming for the sake of argument that the state is pursuing real crimes that are worth imprisoning people for. I'm sure the readers of this site can come up with plenty of examples where either the prosecutors shouldn't have prosecuted or the law should have been changed, but that's not the point of the bail/no bail question)
Well, just to be sure, make skipping out a crime with a really stiff penalty. Like when they catch up with you again, you are automatically denied pre-trial release, and the sentence for skipping out is fifty years in prison?
Two things that need to be done. First, as you say skipping out on a court date should be treated as a very serious crime and punished accordingly. Second, being on bail at the time should be a massive sentence enhancer. If you are allowed out on bail and commit a crime, you should get several more years in prison than you normally would as a result. I am fine with letting people out before their trials, but you have to hammer anyone who commits a crime or skips out to make such a system work.
The problem there is that you're thinking like a normal person. If people are capable of making the logical connection between their behaviors and the consequences, they are unlikely to commit violent or serious crimes in the first place. Deterrence doesn't work on the kind of criminals that really need to be held before trial.
That's a great solution to bail skipping, John. The 8th Amendment demands that all arrested persons get non-excessive bail. If someone skips bail, then give them a sentence enhancement, charge them with a crime for skipping bail (already a crime), but you still must give them bail that is not excessive.
Bail is racist, algorithms are racist, no bail is racist.
Everything is racist because there are too many black people getting caught for the crimes they commit. If not enough white people are arrested in black neighborhoods THAT'S racist and unequal treatment and racism and...and...yeah.
Racism.
It's not about making a more reliable system that doesn't negatively impact poor arrestees. It's not about fairness for everyone in the system. Not at all.
It's just another fever dream that racism is the motivator for stopping young black men from beating, robbing, raping and killing other black people.
Protecting minorities from crime is racist too!
I've heard people say that in seriousness. "We'll handle it in-house", they say.
This is an area where I have a somewhat practical education and experience. I work as a constable. Bail was developed as a way to release people from pre-trial detainment and to ensure that they would show up for their hearing. (Post 1/3. Says my comment is too long, so posting by paragraph.)
The problem is that if you release all people prior to trial and at the same time remove all incentives for them to appear at trial, you create a much bigger problem of people not showing up. Already about a third of people don't show up for hearings. (Disclosure: This is where I earn a living. When people don't show up for a hearing a bench warrant is issued. I serve these warrants, and where authorized, arrest and bring people in to the jail for detention.) You will always have some people, regardless of how high the bail is, who will not show up for a hearing. You will always have some people, even with no bail, who will show up for a hearing. The problem is, you want to reduce the number of people ~not~ showing up for hearings while making bail available/affordable to the poor. Being the layman that I am, I would advocate for the tracking bracelets which we now have. It is a far cheaper alternative to pre-trial detention, both for the government and for the accused. However, it is ~not~ an answer without problems. Many crimes have been committed by people wearing the ankle bracelet. Victims have been intimidated by attackers walking "free". The wheels of justice are not known for the speed at which they turn, and it can be and has been years before the accused comes to trial. (Continuations are granted to both the prosecution and to the defense.) Tinkering with the ankle bracelet, while illegal, has been done. (Post 2/3.)
BTW, New Jersey (and also the city of Philadelphia), when it did away with the bail bond system, did enjoy an initial drop in pre-trial detainment. Predictably, however, there was a huge surge in no-shows for trial. And then there came an enormous surge in pre-trial detainment, because when the fugitives were picked up for another offense it showed that they were absconders. They would then be held until trial. It was almost like criminals didn't commit just one crime. (Post 3/3.)
Reason's editors have really lost touch with the English language as of late:
(Feel free to delete this comment when corrections have been made.)
"...any tool used to calculate the whether a defendant..."
"...is a defendant's their past criminal record.."
One of the chief factors considered is a defendant's their past criminal record. But that can be tainted by a history of unequal enforcement, particularly in poorer communities and minority neighborhoods.
data…are influenced by policing choices.
Urban police officers spend virtually all of their time responding to calls for service. The level of enforcement in a neighborhood is determined almost entirely by the number of 911 calls from that neighborhood.
People who don't have money can't post bail; it's inequality, but I fail to see the "inequity".
According to "civil rights organizations", any disparity of outcomes between Blacks and non-Blacks is an "inequity", regardless of the cause.
Data and algorithms can help end biases, but they can also help perpetuate them.
"Data so racist"
The 8th Amendment prohibits excessive bail. Since that is an incorporated right under the 14A, set first-time bail for everyone at $5 for misdemeanors and $10 for felonies. Even suspected murderers are entitled to bail.
Whatever happened to "Wanted: Dead or Alive"?