Compas is used to weigh up whether defendants awaiting trial or sentencing are at too much risk of reoffending to be released on bail. Photograph: Matias Nieto/Cover/Getty Images

Software 'no more accurate than untrained humans' at judging reoffending risk


Program used to assess more than a million US defendants may not be accurate enough for potentially life-changing decisions, say experts

The credibility of a computer program used for bail and sentencing decisions has been called into question after it was found to be no more accurate at predicting the risk of reoffending than people with no criminal justice experience provided with only the defendant’s age, sex and criminal history.

The algorithm, called Compas (Correctional Offender Management Profiling for Alternative Sanctions), is used throughout the US to weigh up whether defendants awaiting trial or sentencing are at too much risk of reoffending to be released on bail.

Since being developed in 1998, the tool is reported to have been used to assess more than one million defendants. But a new paper has cast doubt on whether the software’s predictions are sufficiently accurate to justify its use in potentially life-changing decisions.

Hany Farid, a co-author of the paper and professor of computer science at Dartmouth College in New Hampshire, said: “The cost of being wrong is very high and at this point there’s a serious question over whether it should have any part in these decisions.”

The analysis comes as courts and police forces internationally are increasingly relying on computerised approaches to predict the likelihood of people reoffending and to identify potential crime hotspots where police resources should be concentrated. In the UK, East Midlands police force are trialling software called Valcri, aimed at generating plausible ideas about how, when and why a crime was committed as well as who did it, and Kent Police have been using predictive crime mapping software called PredPol since 2013.

The trend has raised concerns about whether such tools could introduce new forms of bias into the criminal justice system, as well as questions about the regulation of algorithms to ensure the decisions they reach are fair and transparent.

The latest analysis focuses on the more basic question of accuracy.

Farid, with colleague Julia Dressel, compared the ability of the software – which combines 137 measures for each individual – against that of untrained workers, contracted through Amazon’s Mechanical Turk online crowd-sourcing marketplace.

The academics used a database of more than 7,000 pretrial defendants from Broward County, Florida, which included each individual's demographic information, age, sex, criminal history and arrest record in the two-year period following the Compas scoring.

The online workers were given short descriptions that included a defendant’s sex, age and previous criminal history, and were asked whether they thought the defendant would reoffend. Despite using far less information than Compas (seven variables versus 137), the pooled human judgments were accurate in 67% of cases, compared with Compas’s 65% accuracy.
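As a rough illustration of that pooling step (this is not the study’s own code, and the data below are made up), majority-vote aggregation of worker judgments can be scored like this:

```python
# Minimal sketch (illustrative only): pool crowd-worker judgements by majority
# vote and measure how often the pooled prediction matches the observed outcome.
# All names and the toy data here are assumptions, not the study's dataset.

from collections import Counter

# Each defendant has several independent worker judgements (True = "will reoffend")
# plus the observed outcome over the two-year follow-up window.
worker_votes = {
    "defendant_1": [True, True, False],
    "defendant_2": [False, False, False],
    "defendant_3": [True, False, True],
}
actual_reoffended = {
    "defendant_1": True,
    "defendant_2": False,
    "defendant_3": False,
}

def pooled_prediction(votes):
    """Majority vote across workers for one defendant (ties count as 'will reoffend')."""
    counts = Counter(votes)
    return counts[True] >= counts[False]

correct = sum(
    pooled_prediction(votes) == actual_reoffended[d]
    for d, votes in worker_votes.items()
)
accuracy = correct / len(worker_votes)
print(f"Pooled accuracy on the toy data: {accuracy:.0%}")
```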

In a second analysis, the paper found that Compas’s accuracy at predicting recidivism could also be matched using a simple calculation involving only an offender’s age and the number of prior convictions.

“When you boil down what the software is actually doing, it comes down to two things: your age and number of prior convictions,” said Farid. “If you are young and have a lot of prior convictions you are high risk.”

“As we peel the curtain away on these proprietary algorithms, the details of which are closely guarded, it doesn’t look that impressive,” he added. “It doesn’t mean we shouldn’t use it, but judges and courts and prosecutors should understand what is behind this.”
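A minimal sketch of the kind of two-feature rule Farid describes is shown below; the weights and threshold are illustrative assumptions, not the published model or Compas itself.

```python
# Minimal sketch (an assumption, not the paper's classifier): a linear rule
# over just two inputs -- age and number of prior convictions -- of the kind
# the study reports can match Compas's accuracy. Weights are illustrative.

def predict_high_risk(age, prior_convictions,
                      w_age=-0.05, w_priors=0.25, threshold=0.0):
    """Return True if the linear score crosses the decision threshold.

    A negative weight on age and a positive weight on prior convictions encode
    the pattern Farid describes: young defendants with many priors score high.
    """
    score = w_age * age + w_priors * prior_convictions
    return score > threshold

# Example: a 22-year-old with 5 prior convictions vs a 50-year-old with 1.
print(predict_high_risk(22, 5))   # True  (young, many priors -> high risk)
print(predict_high_risk(50, 1))   # False (older, few priors  -> low risk)
```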

Seena Fazel, a professor of forensic psychiatry at the University of Oxford, agreed that the inner workings of such risk assessment tools ought to be made public so that they can be scrutinised.

However, he said that in practice, such algorithms were not used to provide a “yes or no” answer, but were useful in giving gradations of risk and highlighting areas of vulnerability – for instance, recommending that a person be assigned a drug support worker on release from prison.

“I don’t think you can say these algorithms have no value,” he said. “There’s lots of other evidence suggesting they are useful.”

The paper also highlights the potential for racial asymmetries in the outputs of such software that can be difficult to avoid – even if the software itself is unbiased.

The analysis showed that while the accuracy of the software was the same for black and white defendants, the so-called false positive rate (when someone who does not go on to offend is classified as high risk) was higher for black than for white defendants. This kind of asymmetry is mathematically inevitable in the case where two populations have a different underlying rate of reoffending – in the Florida data set the black defendants were more likely to reoffend – but such disparities nonetheless raise thorny questions about how the fairness of an algorithm should be defined.
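A toy worked example makes the point; the figures below are illustrative, not taken from the Florida data.

```python
# Toy worked example (numbers are illustrative, not from the study): two groups
# with the same overall accuracy can still have different false positive rates
# when their underlying reoffending rates differ.

def rates(tp, fn, fp, tn):
    """Return (base rate, accuracy, false positive rate) for one group."""
    total = tp + fn + fp + tn
    base_rate = (tp + fn) / total          # share who actually reoffend
    accuracy = (tp + tn) / total           # share of correct predictions
    fpr = fp / (fp + tn)                   # non-reoffenders flagged as high risk
    return base_rate, accuracy, fpr

# Group A: higher underlying reoffending rate.
print("Group A:", rates(tp=50, fn=10, fp=20, tn=20))  # base 0.60, acc 0.70, FPR 0.50
# Group B: lower underlying reoffending rate.
print("Group B:", rates(tp=20, fn=10, fp=20, tn=50))  # base 0.30, acc 0.70, FPR ~0.29

# Both groups see 70% accuracy, yet non-reoffenders in Group A are flagged as
# high risk far more often -- the kind of asymmetry the paper highlights.
```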

Farid said the results also highlight the potential for software to magnify existing biases within the criminal justice system. For instance, if black suspects are more likely to be convicted when arrested for a crime, and if criminal history is a predictor of reoffending, then software could act to reinforce existing racial biases.

Racial inequalities in the criminal justice system in England and Wales were highlighted in a recent report written by the Labour MP David Lammy at the request of the prime minister.

People from ethnic minorities “still face bias, including overt discrimination, in parts of the justice system”, Lammy concluded.

The findings were published in the journal Science Advances.
