Posted on July 8, 2009

Program Helps Identify Likely Violent Parolees

Faye Flam, Philadelphia Inquirer, July 8, 2009

As part of an attempt to fight crime, Philadelphia is now the subject of an experiment never tried in another city: A computer is forecasting who among the city’s 49,000 parolees is likeliest to rob, assault, or kill someone.

Since March, the city’s Adult Probation and Parole Department has been using the system to reshuffle the way it assigns cases. Each time someone new comes through intake, a clerk enters his or her name and the computer takes just seconds to fish through a database for relevant information and deliver a verdict of high, medium, or low risk.

“It’s a complete paradigm shift for the department,” said chief probation and parole officer Robert Malvestuto. “Science has made this available to us. We’d be foolish not to use it.”

{snip}

The controversy over the new system cuts to the heart of a long-standing debate: whether parole agencies should control dangerous people or help them reclaim their lives.

The computer isn’t merely crunching data; it is creating its own rules in what is known as “machine learning,” a fast-growing technology that enables computers to encroach on the human realms of judgment and decision-making.

{snip}

The computer doesn’t use a formula, nor does it develop one that anyone could write down. Instead, it learns by itself after being fed reams of “training data,” in this case on past parolees and their subsequent crimes. The system looks for patterns connecting those parolees’ characteristics with the crimes they later committed.

{snip}

To “train” the system, Berk [University of California statistician Richard Berk, an expert in machine “learning”] fed in data on 30,000 past cases; about 1 percent had committed homicide or attempted homicide within two years of beginning probation or parole.

The data included the number and types of past crimes, sex, race, income, and other factors.

To test its power, he fed in a different set of data on 30,000 other parolees. This time he didn’t tell the computer who would go on to kill.

Applying what it had previously learned, the system identified a group of several hundred who were considered especially dangerous. Of those, 45 in 100 committed or attempted homicide within two years, much higher than the 1 in 100 among the general population of probationers and parolees.

The predictors that mattered most were age, age at first contact with adult courts, prior crimes involving guns, being male, and past violent crimes.
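The excerpt doesn’t say which learning algorithm Berk used, so the following is only a minimal sketch of the train-then-test procedure the article describes: a toy one-split learner (a “decision stump”) is fitted on 30,000 synthetic records and then scored on 30,000 held-out ones it never saw. The features, the reoffense rates, and the `make_case` generator are all invented for illustration; none of them come from the Philadelphia data.

```python
import operator
import random

random.seed(0)

# Synthetic stand-in for the article's records: the real Philadelphia
# data is not public, so each case is (age, prior gun crimes, prior
# violent crimes) drawn at random, with a hidden rule making gun
# priors and youth predictive of reoffending. All rates are invented.
def make_case():
    age = random.randint(18, 60)
    guns = random.randint(0, 3)
    violent = random.randint(0, 2)
    p = 0.005 + 0.03 * guns + (0.02 if age < 25 else 0.0)
    return (age, guns, violent), random.random() < p

train = [make_case() for _ in range(30000)]  # cases the learner sees
test = [make_case() for _ in range(30000)]   # held-out cases it does not

def learn_stump(data):
    """Learn a one-split rule ("flag cases where feature f >= t", or
    "< t"): pick the split whose flagged group has the highest
    reoffense rate while still covering at least a few hundred cases."""
    best, best_rate = None, -1.0
    for f in range(3):
        for t in sorted({x[f] for x, _ in data}):
            for op in (operator.ge, operator.lt):
                flagged = [y for x, y in data if op(x[f], t)]
                if len(flagged) < 300:
                    continue
                rate = sum(flagged) / len(flagged)
                if rate > best_rate:
                    best, best_rate = (f, t, op), rate
    return best

f, t, op = learn_stump(train)

# Evaluate on the held-out cases, as the article describes: compare
# the reoffense rate among flagged parolees with the overall base rate.
flagged = [y for x, y in test if op(x[f], t)]
base_rate = sum(y for _, y in test) / len(test)
flag_rate = sum(flagged) / len(flagged)
print(f"base rate {base_rate:.1%}, rate among flagged cases {flag_rate:.1%}")
```

Real systems grow many such splits into full trees and ensembles, but even this stump shows the shape of the result the article reports: the flagged group reoffends at a higher rate than the base rate among all parolees.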

{snip}

Race mattered only a little, so Philadelphia decided to leave it out of the equation. Berk said he thinks the model should work fine without it, and that ignoring race minimizes concerns about racial profiling.

{snip}