Report: Software Used to Predict Future Crime Shows Racial Bias


Risk-assessment scores, which are becoming ever more common in courtrooms across the nation, are often used to make important decisions regarding a defendant's freedom or the terms of that freedom.

However, according to ProPublica, the scores, which are generated by an algorithm that its creators claim can predict a defendant's likelihood of committing another crime, show a clear racial bias against black defendants. The algorithm often wrongly rated black defendants as future criminals, while white defendants were mislabeled as low risk more often than their black counterparts.

According to the report, then-U.S. Attorney General Eric Holder expressed caution about the risk scores in 2014 because of such possible bias.


“Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice,” he said. “They may exacerbate unwarranted and unjust disparities that are already far too common in our criminal-justice system and in our society.”


At the time, he called on the U.S. Sentencing Commission to study the use of these scores, which, according to ProPublica, the commission did not do. So the nonprofit investigative-journalism organization took it upon itself to study risk scores obtained for more than 7,000 people arrested in Broward County, Fla., in 2013 and 2014, checking how many were charged with new crimes over the next two years, the same benchmark used by the algorithm's creators.


Among its findings, ProPublica noted that only 23.5 percent of white defendants were labeled higher risk yet did not commit any further crime, while 44.9 percent of African Americans were labeled higher risk yet did not reoffend. On the other side of the coin, some 47.7 percent of white offenders were labeled lower risk but did commit another crime, whereas only 28 percent of African Americans were labeled lower risk but went on to commit another offense.
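The two error rates described above are, in effect, false positive and false negative rates. A minimal sketch of how such rates are computed from raw counts (the function and the example counts below are hypothetical illustrations, not ProPublica's actual analysis or data):

```python
# Illustrative only: computing the two error rates cited above from
# hypothetical counts of defendants.

def error_rates(high_risk_no_reoffense, total_no_reoffense,
                low_risk_reoffended, total_reoffended):
    """Return (false positive rate, false negative rate) as percentages.

    False positive rate: share of defendants who did NOT reoffend
    but were labeled higher risk.
    False negative rate: share of defendants who DID reoffend
    but were labeled lower risk.
    """
    fpr = 100 * high_risk_no_reoffense / total_no_reoffense
    fnr = 100 * low_risk_reoffended / total_reoffended
    return fpr, fnr

# Hypothetical example: of 1,000 defendants who did not reoffend,
# 235 were flagged as higher risk; of 1,000 who did reoffend,
# 477 had been rated lower risk.
fpr, fnr = error_rates(235, 1000, 477, 1000)
print(f"False positive rate: {fpr:.1f}%")  # 23.5%
print(f"False negative rate: {fnr:.1f}%")  # 47.7%
```

The racial disparity ProPublica reports is precisely that these two rates diverge sharply between black and white defendants, even though the tool's overall accuracy is similar for both groups.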

Overall, ProPublica found the score highly unreliable at forecasting violent crime: only 20 percent of the people predicted to commit such crimes actually went on to do so.


When all crimes, including simple misdemeanors such as driving with an expired license, were taken into consideration, the algorithm fared "somewhat more accurate than a coin flip," ProPublica states, with 61 percent of those labeled likely to reoffend doing so within two years.

When ProPublica controlled for defendants' prior crimes, the types of crimes they were arrested for, and the age and gender of each defendant, the racial disparity persisted: black defendants were 77 percent more likely to be tagged as at higher risk of committing a future violent crime, and 45 percent more likely to be predicted to commit a future crime of any kind.


Read the full report at ProPublica.