Prop. 25 would use algorithms to advise judges on bail decisions. So why are some early boosters having second thoughts?
California is either about to right decades of inequality between rich and poor defendants by eliminating cash bail, or it’s about to turn over its justice system to robots.
The question of what to do about the system that decides whether people should be free while awaiting trial will be determined by Proposition 25. The stakes, as explained by each side, are either ending an unjust system or relinquishing judicial authority to a pretrial assessment tool run on an algorithm.
If passed, Prop. 25 would allow each of California’s 58 counties to choose its own algorithm to assess a person’s flight risk or likelihood of reoffending while awaiting trial. The algorithm makes a recommendation, but the decision falls to the judge.
Yet the algorithms meant to correct for human bias have come under scrutiny in recent years, with some early boosters pulling back support. Those new dissenters worry the computer programs currently available will be overly broad in interpreting risk and unnecessarily keep throngs of defendants, many of them poor and minorities, behind bars.
A cash bail industry dominated by commercial bail bondsmen exists only in the U.S. and the Philippines. Some states have begun to turn away from cash bail, either relying on national algorithms or, like Virginia and Florida, creating their own.
In 2018, former Gov. Jerry Brown signed SB 10, a law to eliminate cash bail and replace it with a pretrial risk assessment similar to the one used in federal courts. But the years since have been difficult for supporters of bail determination algorithms. First, a group of 27 academics from institutions like MIT and Harvard pulled their support, citing the danger of using inexact and overly broad definitions in predicting violence.
Their principal objection was the way the algorithms defined risk. “When tools conflate the likelihood of arrest for any reason with risk of violence, a large number of people will be labeled a threat to public safety without sufficient justification,” the group wrote.
Then this year, an even bigger setback for algorithm advocates: The Pretrial Justice Institute, long the standard-bearer for a risk-based algorithmic approach, announced it no longer supported using algorithms in determining someone’s eligibility for pretrial release.
“We were too focused on fighting the damaging status quo to really listen,” PJI wrote in a mea culpa in February. “We made a mistake.”
Supporters of Prop. 25 argue that inequities created or exacerbated by the algorithm can be worked out during the periodic reassessments of the program — Prop. 25, if passed, would get its own review by Jan. 1, 2024 — and that other such algorithms are in use in other states, with no grave consequences yet reported.
There are five popular algorithms in use today, in states from Kentucky to New Jersey, along with several California counties that have already eliminated cash bail.
- One of the most popular, the Public Safety Assessment, comes from Arnold Ventures* and is the statewide algorithm for Arizona, Kentucky, New Jersey and Utah. That assessment uses a nine-factor test to determine bail eligibility, accounting for factors like age, prior failures to appear and past convictions. San Francisco County began using the PSA in 2016. Sonoma County joined in 2020.
- Another option is the algorithm used in the federal court system, the Pretrial Risk Assessment, developed by the U.S. Office of Probation and Pretrial Services. Since its implementation in 2009, prosecutors and defense attorneys have focused on arguing whether a defendant is a flight risk or a danger to the community.
- A third is the algorithm used by New York, the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), developed by Northpointe, Inc. That program has been criticized for its lack of transparency.
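Tools like the ones above typically work by assigning points to a handful of factors and mapping the total to a recommendation. The sketch below is purely illustrative: the factor names, weights, and thresholds are invented for this example and do not reproduce the PSA's nine-factor scoring or any other real tool.

```python
# Hypothetical sketch of a point-based pretrial risk score.
# Weights and cutoffs are invented for illustration only; they do NOT
# reproduce the PSA or any tool actually used by a court.

def risk_score(age_at_arrest, prior_failures_to_appear, prior_convictions):
    """Sum weighted points for each factor; a higher total means higher assessed risk."""
    score = 0
    if age_at_arrest < 23:  # many tools weight youth toward higher risk
        score += 2
    score += min(prior_failures_to_appear, 2) * 2  # cap the failure-to-appear contribution
    score += min(prior_convictions, 3)             # cap the prior-conviction contribution
    return score

def recommendation(score):
    """Map the raw score to a coarse recommendation a judge can accept or override."""
    if score <= 2:
        return "release"
    if score <= 5:
        return "release with supervision"
    return "detain pending hearing"
```

As the article notes, the output is only advisory: the judge sees the recommendation and decides whether to follow it.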
The bail industry, which forced the issue onto the ballot, is hoping voters will simply reject Prop. 25.
Sheriffs and prosecutors opposed to Prop. 25 say putting people back on the street after an arrest will allow them to commit more crimes. Meanwhile, some civil liberties organizations worry an algorithm will deepen racial and socioeconomic inequities in the justice system — or will at least fail to correct them.
“Algorithms might work for recommending songs, movies and other consumer interests but are biased and flawed when it comes to justice, bank loans and other sensitive and personal matters,” the No on 25 campaign warns on its website.
Forecasting bail releases is tricky. The decision ultimately lies with each judge, so it’s only really possible to forecast what the algorithms will tell them, not how each individual judge will act.
With that caveat, a study released this week by the California Policy Lab at the University of California, Berkeley found that in Sonoma and San Francisco counties, the implementation of an algorithm to assist with pretrial release decisions would have led to more releases and less time spent in jail for people arrested in 2017-2018, the period of the study.
Ultimately, proponents of an algorithm for bail decisions say the computer assist will act as a kind of scorecard. The public will be able to see which judges adhere to the algorithm’s suggestions and which judges go their own way.
“People are missing the fact that judges are already using their discretion,” said John Bauters of the Yes on 25 campaign, “they’re just hiding behind a cash bail schedule to do it.”
*Arnold Ventures is a supporter of CalMatters.
This article is part of the California Divide, a collaboration among newsrooms examining income inequity and economic survival in California.