Posted on May 24, 2016

Is Software Used by Police to Identify Suspects Racist?

Ryan O'Hare, Daily Mail, May 24, 2016

Being taken to the police station for processing is part and parcel of the legal process: suspects who committed the crime should be prepared to face the consequences.

For many suspects in states across the US, this involves undergoing a computerised risk assessment which works out their likelihood of re-offending.

The premise is that judges can use the scores at sentencing to help work out who poses a greater risk to the public.

But an investigation has raised serious questions about the methods used to generate the risk scores, claiming race may play a part in how they are assigned.

The investigation by ProPublica found that software widely used by police forces across the US is skewed against African Americans, with predicted rates of recidivism far more likely to be high for black suspects than for those of other backgrounds.

It turned up racial disparities in the formula used to work out risk, reporting that it was more likely to flag black defendants as future criminals and to wrongly identify white defendants as lower risk.

Reporters obtained scores for more than 7,000 people arrested in Florida in 2013 and 2014 in order to analyse the rate of re-offending over the following two years.

According to ProPublica, of those offenders forecast to commit violent crimes by the risk score, analysis revealed that just one in five did so.

The software’s hit rate improved when the remit was widened to include less serious crimes and misdemeanours, such as driving without a licence: 61 per cent of defendants flagged as likely to re-offend did so within two years.
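ProPublica published its data and analysis alongside the report; purely as an illustration of the kind of hit-rate check described above, a calculation along these lines could be run over score data. The records, field layout and high-risk cut-off below are invented for the example and are not ProPublica's actual methodology:

```python
# A minimal sketch of the hit-rate check described above, on made-up data.
# The record layout and the 'high risk' cut-off are assumptions for
# illustration, not ProPublica's actual methodology.

records = [
    # (risk score on a 1-10 scale, re-offended within two years?)
    (9, False), (7, True), (3, False), (8, True), (2, True), (10, False),
]

HIGH_RISK_CUTOFF = 8  # assumed threshold for a 'high risk' flag

flagged = [(score, reoffended) for score, reoffended in records
           if score >= HIGH_RISK_CUTOFF]
hits = sum(1 for _, reoffended in flagged if reoffended)

print(f"{len(flagged)} defendants flagged high risk, "
      f"{hits / len(flagged):.0%} of whom re-offended")
```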

Looking across the board, risk scores for black defendants were seen to be spread across the full range from low to high (scored from 1 to 10, with 1 being the lowest risk).

But for white defendants, the analysis showed scores skewed towards the low-risk end of the scale.
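As a further illustration, and again not ProPublica's own code, one quick way to eyeball such a skew is to tabulate score counts per group. The scores below are made up for the example; the real analysis used the records of more than 7,000 people arrested in Florida:

```python
from collections import Counter

# Made-up scores per group, purely to demonstrate the tabulation.
scores_by_group = {
    "black": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],
    "white": [1, 1, 1, 2, 2, 3, 3, 4, 5, 7],
}

# Print a count for each score from 1 (lowest risk) to 10 (highest),
# so a skew towards either end of the scale is visible at a glance.
for group, scores in scores_by_group.items():
    counts = Counter(scores)
    row = "  ".join(f"{s}:{counts.get(s, 0)}" for s in range(1, 11))
    print(f"{group:>5}  {row}")
```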

In one example outlined in the report, risk scores are provided for two suspects, one black and one white, both arrested on drug possession charges.

Of their prior offences, the white suspect's record showed attempted burglary, while the black suspect's showed resisting arrest.

But the risk scores rated the black suspect as being a far higher risk for re-offending.

Despite the scores, the black suspect had no offences in the following two years, while the ‘low risk’ white offender racked up a further three drug possession charges.

The software used to generate the risk scores in the investigation–Correctional Offender Management Profiling for Alternative Sanctions (Compas)–works out scores based on 137 questions, with information supplied by the defendant directly or pulled from their criminal records.
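Northpointe's actual formula is proprietary and has not been published; the sketch below only illustrates, in generic terms, how a questionnaire-based tool might map weighted answers onto a 1-to-10 score. The questions, weights and scaling are all invented for the example:

```python
# Northpointe's model is proprietary; this generic sketch only shows how
# a questionnaire-based tool might combine weighted answers into a score.
# Questions, weights and scaling are all invented for illustration.

WEIGHTS = {
    "prior_arrests": 0.8,
    "age_at_first_arrest": -0.1,  # a later first arrest lowers the score
    "unstable_housing": 0.3,
}

def risk_score(answers):
    """Combine weighted answers and clamp to a 1-10 score (1 = lowest risk)."""
    raw = sum(weight * answers.get(question, 0.0)
              for question, weight in WEIGHTS.items())
    return max(1, min(10, round(1 + raw)))

print(risk_score({"prior_arrests": 5, "age_at_first_arrest": 18,
                  "unstable_housing": 1}))  # -> 4 on this invented scale
```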

Miami-based Northpointe Inc, which supplies the software used in Florida and many other states, has disputed the analysis from ProPublica.

A spokesperson for the firm told MailOnline: ‘Northpointe does not agree with or accept the ProPublica findings.

‘The analysis performed by ProPublica included misapplication of various statistical methods, the outcome of the analysis produced erroneous results, and the conclusions drawn from the results are misleading.’

The spokesperson added: ‘Northpointe employs a team of research scientists that continually engage in validation of the Northpointe risk and needs assessment tools to ensure the outcomes produced are scientifically valid.’

Data published by the US Department of Justice highlights that almost 3 per cent of black males were imprisoned in 2013, compared to just 0.5 per cent of white males.

Black males also make up the largest proportion of the prison population, at 37 per cent, with white prisoners at 32 per cent and Hispanic prisoners at 22 per cent.

Opponents of risk profiling algorithms argue that they are in effect racial profiling, and fail to take into account factors including access to services and probation conditions for offenders.