Thursday, July 28, 2016

Crime Prediction Algorithms Are Used In Sentencing

Secret Algorithms That Predict Future Criminals Get a Thumbs Up from Wisconsin Supreme Court

Ethan Chiel | July 27, 2016



There’s software used across the country that predicts whether people are likely to commit a crime. It’s not quite Minority Report, but the same basic idea is behind it: the software assesses various data points about a person and assigns a risk score; the higher the score, the more likely that person is predicted to commit a crime in the future. Judges in a number of jurisdictions use these scores when sentencing people convicted of crimes.
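As a rough illustration of the kind of tool described above: the commercial models are proprietary, so every feature name and weight below is invented, but a risk score of this sort can be sketched as a weighted sum of defendant attributes clamped to a 1–10 scale.

```python
# Hypothetical sketch of a risk-scoring tool. The real products are
# proprietary; these features and weights are invented for illustration.

def risk_score(features, weights):
    """Weighted sum of defendant attributes, clamped to a 1-10 scale."""
    raw = sum(weights[name] * value for name, value in features.items())
    return max(1, min(10, round(raw)))

# Invented example defendant and invented weights.
defendant = {"prior_arrests": 3, "age_under_25": 1, "employment_gap": 1}
weights = {"prior_arrests": 1.5, "age_under_25": 2.0, "employment_gap": 1.0}

print(risk_score(defendant, weights))  # higher score = higher predicted risk
```

The opacity at issue in the Loomis case is precisely that defendants cannot inspect the real model's features or weights the way this toy version exposes them.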

Last week, the Supreme Court of Wisconsin issued an opinion in the case of Eric Loomis, who had challenged the use of such a risk score at his sentencing: it rejected Loomis’s request to be resentenced, holding that the lower court did not violate his due process rights by using risk-assessment software, because it did not rely on the risk score alone.


We also turned up significant racial disparities, just as Holder feared. In forecasting who would re-offend, the algorithm made mistakes with black and white defendants at roughly the same rate but in very different ways.
The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.
White defendants were mislabeled as low risk more often than black defendants.
Could this disparity be explained by defendants’ prior crimes or the type of crimes they were arrested for? No. We ran a statistical test that isolated the effect of race from criminal history and recidivism, as well as from defendants’ age and gender. Black defendants were still 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind. (Read our analysis.)
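The finding quoted above — mistakes at roughly the same overall rate but "in very different ways" — is easier to see with toy numbers. The sketch below uses invented confusion-matrix counts (not ProPublica's actual figures) to show how two groups can be misclassified equally often overall while one group absorbs far more false positives.

```python
# Hypothetical numbers (not ProPublica's data) showing how two groups can
# share the same overall accuracy while the *kind* of error differs sharply.

def rates(tp, fp, tn, fn):
    """Return (false positive rate, false negative rate).

    FPR: share of people who did NOT reoffend but were flagged high risk.
    FNR: share of people who DID reoffend but were labeled low risk.
    """
    return fp / (fp + tn), fn / (fn + tp)

# Each invented group has 1,000 defendants and 500 total errors (50% accuracy),
# but the errors fall on opposite sides of the prediction.
fpr_a, fnr_a = rates(tp=200, fp=400, tn=300, fn=100)  # FPR 400/700, FNR 100/300
fpr_b, fnr_b = rates(tp=200, fp=100, tn=300, fn=400)  # FPR 100/400, FNR 400/600

print(f"group A: FPR={fpr_a:.2f}, FNR={fnr_a:.2f}")
print(f"group B: FPR={fpr_b:.2f}, FNR={fnr_b:.2f}")
```

Here group A is far more likely to be wrongly flagged as high risk, while group B is far more likely to be wrongly labeled low risk — the same asymmetry ProPublica reported between black and white defendants, despite equal overall error counts.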

"U.S. Courts Are Using Algorithms Riddled With Racism to Hand Out Sentences." Source: https://mic.com/articles/144084/propublica-courts-use-a-racially-biased-formula-to-predict-future-criminals

<more at http://fusion.net/story/330672/algorithms-recidivism-loomis-wisconsin-court/; related articles and links: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing (Machine Bias. There’s software used across the country to predict future criminals. And it’s biased against blacks. May 23, 2016) and https://mic.com/articles/144084/propublica-courts-use-a-racially-biased-formula-to-predict-future-criminals (U.S. Courts Are Using Algorithms Riddled With Racism to Hand Out Sentences. May 28, 2016)>
