California Bill Would Mandate ‘Crime Prediction’ Algorithms Instead of Cash Bail


By Isaac Scher
September 8, 2020

If California’s current effort to abolish cash bail succeeds, it would establish a system that keeps people in jail based on the recommendations of “crime prediction” algorithms—technology that experts criticize for enabling discrimination.

Under Proposition 25, a ballot measure up for a vote in the state this November, judges would be required to use an algorithm, known as a “risk-assessment tool,” that predicts whether a defendant will be re-arrested or skip their trial if released. With a high enough “risk” score, a defendant would be jailed, and they wouldn’t have the chance to purchase their freedom, since the legislation abolishes cash bail.

Between ending bail and implementing big-data analytics, the legislation has won plaudits from many reformers, who say it’s a scientific solution to the racism and classism of money bail that redresses worries about re-arrests and missed hearings. But predictive algorithms reproduce systematic discrimination, scholars and advocates say.

Pretrial prediction has quietly gained purchase, sometimes literally, over the last decade, the culmination of a 100-year effort in the criminal legal system to foresee so-called criminality with statistics. One in three counties in America employ algorithms in the pretrial space, according to one tally, and many of the tools are privately owned. “Often there’s no announcement or resolution or press release” when they’re adopted, Hannah Sassaman, policy director of Movement Alliance Project, told Motherboard. “These tools are implemented in the dark.”

California is an outlier in this respect, having made a high-profile push to ban cash bail and mandate prediction algorithms in one sweep via Senate Bill 10, first introduced in 2016. The legislation, also known as SB10, briefly became law in 2018, but the bail industry mounted a rapid campaign that stalled it pending a 2020 ballot initiative: Proposition 25.

Courts in at least 49 of 58 California counties have already been equipped with risk-assessment algorithms. But since the tools aren’t yet required, they might lie dormant or find use only in narrow circumstances. Some jurisdictions disregard them completely, Pilar Weiss, executive director of the Community Justice Exchange, told Motherboard. That will change under SB10. “History,” one assemblyman declared in 2018, “will show that using a risk-assessment system will be the way to go.”

SB10 would not standardize the algorithms, meaning that each county’s predictive tool may process slightly different data about a defendant, as they do now.

Two counties already use the Public Safety Assessment, or PSA, a tool that uses seven factors to determine whether to lock someone up. Those include age at the time of arrest, history of missed court hearings, and prior felony record.
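
To make the mechanics concrete, here is a minimal sketch of how a point-based pretrial tool might fold factors like these into a single score. The factor names, weights, and detention threshold are hypothetical, invented purely for illustration; they are not the PSA’s actual scoring rules.

```python
# Hypothetical point-based risk score, loosely in the style of pretrial tools.
# Every weight and the threshold below are invented for illustration only.

def risk_score(age_at_arrest: int, prior_felony_convictions: int, prior_missed_hearings: int) -> int:
    score = 0
    if age_at_arrest < 23:                          # youth is scored as added "risk"
        score += 2
    score += 2 * min(prior_felony_convictions, 2)   # prior record dominates the score
    score += 1 * min(prior_missed_hearings, 4)      # missed court dates add points
    return score

def recommend_detention(score: int, threshold: int = 5) -> bool:
    # The tool only produces a recommendation; under Prop 25, a judge acts on it.
    return score >= threshold

score = risk_score(age_at_arrest=21, prior_felony_convictions=1, prior_missed_hearings=2)
print(score, recommend_detention(score))  # 6 True
```

Even in this toy version, the score is driven almost entirely by a person’s prior contact with police and courts, the pattern critics of these tools return to again and again.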

Los Angeles County, home to the biggest jail population in the world, uses a modified version of the Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS, tool. COMPAS includes a 137-part questionnaire to appraise a person’s “risk,” recording personal details like whether the defendant has only held low-wage jobs, and whether any relatives or friends have been arrested or incarcerated. A screener will ask the defendant whether they believe “a hungry person has the right to steal,” and if it’s “easy to get drugs in your neighborhood.”

COMPAS’s precise use of these and other factors is considered “a trade secret,” the Community Justice Exchange wrote in a 2019 report, because the algorithm is for-profit. The PSA, for its part, has no price tag. It was developed by former Enron trader John Arnold’s philanthropic effort, Arnold Ventures, an erstwhile non-profit foundation that is now a limited liability company.

Though predictive algorithms do vary in the particulars, their substance is universal: They centrally rely on a defendant’s prior criminal history. It is here that the algorithms reveal themselves as fundamentally flawed, says Bernard Harcourt, professor of law and political science at Columbia Law School.

Predictive algorithms are touted as an efficient, cost-effective method of preventing crime. But “you’re not predicting future behavior,” Harcourt told Motherboard. “You’re predicting policing.”

In other words, the algorithms track policing, and American police regularly target poor people, Black people, children of color, trans people, immigrants, and sex workers. It doesn’t matter whether a defendant was wrongfully arrested: the algorithms count the arrest all the same, because they take no account of the policing patterns behind it.
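
The point can be illustrated with a toy simulation. Assume, purely for the sake of argument, two neighborhoods with identical rates of underlying behavior but very different levels of police attention; a tool trained on the resulting arrest records will “learn” that one neighborhood is several times riskier than the other.

```python
# Toy illustration (not any real tool): when arrests are the training signal,
# the "risk" a model learns tracks police deployment, not behavior.
import random

random.seed(0)
OFFENSE_RATE = 0.10                   # identical underlying behavior in both places
ARREST_PROB = {"A": 0.8, "B": 0.2}    # neighborhood A is policed far more heavily

def recorded_risk(neighborhood: str, people: int = 10_000) -> float:
    arrests = 0
    for _ in range(people):
        offended = random.random() < OFFENSE_RATE
        if offended and random.random() < ARREST_PROB[neighborhood]:
            arrests += 1
    return arrests / people           # the "prior record" rate a model would learn from

for hood in ("A", "B"):
    print(hood, round(recorded_risk(hood), 3))
# Prints roughly: A 0.08, B 0.02. Same behavior, four times the recorded "risk".
```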

As with other predictive algorithmic systems, risk-assessment software embeds the discrimination that already exists within our social structures. When an algorithm appraises someone’s highest level of schooling or job history, for example, it is tracking outcomes that social structures largely determine, which makes education and income proxies for race, class, and gender identity. Predictive algorithms, which use the past to presage the future, simply reinforce historical forces of oppression, Harcourt explains.

This may prove especially dangerous in California, where judges set an average bail of $50,000, often pressuring defendants to plead guilty. When defendants are jailed because they can’t afford bail, they “plead quicker and with less of a fight,” John Raphling, a former trial lawyer in California, said. “Judges, as a whole, rely on pretrial incarceration to pressure guilty pleas.”

Predictive algorithms typically account for prior convictions, too. That means they will draw on data from California’s bail system, where, in 2018, just 0.95% of criminal cases went to a jury trial. If SB10 becomes law, an algorithm will be more likely to recommend jailing defendants who pleaded guilty under the very bail system SB10 seeks to destroy.

“The system will continue to be completely unfair and corrupt,” Raphling added.

Since the 1920s, when so-called criminal prediction first came into use, scholars and carceral workers have found that a prior record is the best predictor of future behavior, relative to other factors. But even when a prediction is accurate, it does not dictate that anyone be locked up. That remains a moral and political choice, according to Harcourt. “The likelihood of someone committing a second crime should maybe make us think that we should not punish them,” he said, “but instead invest in them.”


* This article was automatically syndicated and expanded from VICE: Motherboard.

