Law | NPR

Flaws plague a tool meant to help low-risk federal prisoners win early release

The Justice Department created an algorithm to measure a person's risk of committing a new crime after leaving prison. But even after multiple tweaks, the tool is leading to racial disparities.


A prisoner looks out of his jail window as protesters gather outside the federal detention center in Miami on June 12, 2020, during a demonstration over the death of George Floyd. Chandan Khanna | AFP via Getty Images

Thousands of people are leaving federal prison this month thanks to a law called the First Step Act, which allowed them to win early release by participating in programs aimed at easing their return to society.

But thousands of others may still remain behind bars because of fundamental flaws in the Justice Department's method for deciding who can take the early-release track. The biggest flaw: persistent racial disparities that put Black and brown people at a disadvantage.

In a report issued days before Christmas in 2021, the department said its algorithmic tool for assessing the risk that a person in prison would return to crime produced uneven results. The algorithm, known as Pattern, overpredicted the risk that many Black, Hispanic and Asian people would commit new crimes or violate rules after leaving prison. At the same time, it also underpredicted the risk for some inmates of color when it came to possible return to violent crime.

"From the beginning, civil rights groups cautioned Congress and the Justice Department that use of a risk assessment tool to make these determinations would lead to racial disparities," said Aamra Ahmad, senior policy counsel at the American Civil Liberties Union.

"The Justice Department found that only 7% of Black people in the sample were classified as minimum level risk compared to 21% of white people," she added. "This indicator alone should give the Department of Justice great pause in moving forward."

The rule of unintended consequences

An American flag flies outside the Department of Justice in Washington in March 2019. Andrew Harnik | AP

Risk assessment tools are common in many states. But critics say Pattern marks the first time the federal justice system has used an algorithm with stakes this high.

Congress passed the First Step Act in 2018 with huge bipartisan majorities. It's designed to prepare people in prison for life afterward by offering credits toward early release for working or taking life skills and other classes while behind bars.

Lawmakers like Sens. Sheldon Whitehouse of Rhode Island and John Cornyn of Texas took inspiration from similar criminal justice reforms in states, which they said led to drops in both prison populations and crime. The senators pointed out that some 9 in 10 people in prison eventually return home, and they contended that preparing them for release made good sense for formerly incarcerated people and for public safety.

Only inmates who pose a low or minimal risk of returning to crime can qualify for the programs, with that risk level determined using the Pattern algorithm.

"The significance of this risk assessment tool is that it divides all federal prisoners essentially into two groups: people who can get credit for doing this programming and get out early, and people who can't," said Jim Felman, an attorney in Tampa, Fla., who has been following the First Step Act for years.

The implementation has been rocky. The Justice Department finished the first version of Pattern in a rush because of a tight deadline from Congress.

It then had to make tweaks after finding that Pattern suffered from math and human errors. Even so, about 14,000 men and women in federal prison wound up in the wrong risk categories, and there were big disparities for people of color.

"The legislation, I think, came from a good place," said Melissa Hamilton, a professor of law and criminal justice at the University of Surrey who studies risk assessments. "It's just the rule of unintended consequences is not really realizing the impediments it was going to have."

Risk assessment tool "sounds highly technical, but it's not"

"You use a term like 'risk assessment tool,' it has this patina of science, it sounds highly technical, but it's not," said Patricia Richman, who works on national policy issues for the Federal Public and Community Defenders. "A risk assessment tool is just a series of policy decisions."

Those policy decisions are made by determining what counts as a risk factor and by how much.

Criminal history can be a problem, for example, because law enforcement has a history of overpolicing some communities of color. Other factors such as education level and whether someone paid restitution to their victims can intersect with race and ethnicity, too.
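Richman's point that a risk tool is "just a series of policy decisions" can be made concrete with a toy sketch. The factors, weights, and cutoffs below are invented for illustration; they are not the actual Pattern instrument, whose items and point values are set by the Justice Department.

```python
# Toy illustration of how a risk assessment tool reduces to policy choices.
# Every factor, weight, and cutoff below is hypothetical -- NOT the real
# Pattern instrument.

# Policy decision 1: which facts about a person count as risk factors.
# Policy decision 2: how many points each factor is worth.
WEIGHTS = {
    "prior_arrests": 3,       # criminal history reflects policing patterns
    "no_high_school": 2,      # education level can intersect with race/class
    "restitution_unpaid": 1,  # ability to pay intersects with income
}

# Policy decision 3: where the lines between categories sit.
CUTOFFS = [(4, "minimum"), (8, "low"), (12, "medium")]  # above 12: "high"

def risk_category(person: dict) -> str:
    """Sum weighted factors, then map the score onto a category."""
    score = sum(weight * person.get(factor, 0)
                for factor, weight in WEIGHTS.items())
    for cutoff, label in CUTOFFS:
        if score <= cutoff:
            return label
    return "high"
```

Nothing in the arithmetic is scientific: change any weight or cutoff and the same person lands in a different category, which is exactly the sense in which the tool is a bundle of policy decisions.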

In its December report, the Justice Department concluded that some of the disparities could be reduced, "but not without tradeoffs" such as less accurate risk predictions. The department also said using race as a factor in the algorithm could trigger other legal concerns.

Still, the department is consulting with experts about making the algorithm fairer, and another overhaul of Pattern is already underway.

Attorney General Merrick Garland has directed the department to look for ways to assess racial bias and make the tool more transparent, a spokeswoman said.

One option is to adjust the cutoff points between the risk categories, allowing more prisoners to earn credits for release, which would "maximize access to First Step Act relief while ensuring public safety," she said.
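The effect of moving those cutoff points can be sketched in a few lines. The scores and thresholds here are invented for the example; the point is only that relaxing a cutoff changes who qualifies without changing anyone's underlying score.

```python
# Hypothetical illustration: adjusting the cutoff between risk categories
# changes how many people qualify for early-release credits, even though
# no one's score moves. Scores and thresholds are invented.

scores = [3, 5, 7, 9, 11, 14, 20]  # hypothetical Pattern-style scores

def count_eligible(scores: list, low_cutoff: int) -> int:
    """People at or below the 'low risk' cutoff qualify for credits."""
    return sum(1 for s in scores if s <= low_cutoff)

print(count_eligible(scores, 7))   # stricter cutoff: 3 people qualify
print(count_eligible(scores, 11))  # looser cutoff: 5 people qualify
```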

Ultimately, Garland will have to sign off on a new version. Then, Justice has to reevaluate the 14,000 people in prison who got lumped into the wrong category.

"This is just one example of the ways that harmful artificial intelligence systems are being rolled out in everything from the criminal legal system to employment decisions to who gets access to housing and social benefits," said Sasha Costanza-Chock, director of research and design for the Algorithmic Justice League, which studies the social implications of artificial intelligence.

Costanza-Chock said the burden is on the Justice Department to prove the Pattern tool doesn't have racist and sexist outcomes.

"Especially when systems are high risk and affect people's liberty, we need much clearer and stronger oversight," said Costanza-Chock.

The Metropolitan Detention Center prison in Los Angeles David McNew | Getty Images

Looking for resolution

Felman, the Florida lawyer working with the American Bar Association, worried that the tool will continue to put many prisoners of color at a disadvantage.

"We will start to see more prisoners get out early," he said. "My concern is that the color of their skin will not be reflective of fairness."

The ACLU's Ahmad said she's seen enough.

"There are no technical fixes to these problems that could make Pattern and similar tools safe and fair to use," Ahmad said. "We would urge the Justice Department to suspend the use of Pattern until it can adequately address these concerns."

Hamilton, who studies risk assessments, thinks the Pattern tool may be worth saving. Consider the alternative, she said: decisions made by people who have all kinds of biases.

"So that's the unfortunate thing: it's better than gut instinct of the very flawed humans that we all are. And can we improve it more than marginally? That's what we're all working on," Hamilton said.

Copyright 2022 NPR. To see more, visit https://www.npr.org.

Transcript:

STEVE INSKEEP, HOST:

Thousands of people are leaving federal prison this month because of a law called the First Step Act. President Trump signed this bipartisan measure back in 2018, which is designed in part to reduce the federal prison population. Leaders of both parties agreed that too many Americans have been in prison. The Justice Department is using computers to determine who gets a shot at early release. But it turns out the algorithm appears to give biased results, treating people of different races differently.

NPR's Carrie Johnson joins us now.

Carrie, good morning.

CARRIE JOHNSON, BYLINE: Good morning, Steve.

INSKEEP: What are you finding here?

JOHNSON: Well, there are persistent racial disparities and other problems with how the First Step Act is working. Remember, this was supposed to create a way for people to leave prison early if they take life skills classes to help...

INSKEEP: Yeah.

JOHNSON: ...Prepare for their release. The key is that they have to be considered a low or a minimum risk of a return to crime to be eligible for those programs. And the law says the prison system should decide that central question based on a new algorithm called Pattern.

Here's how David Patton, the top federal public defender in New York, described the issue to Congress.

(SOUNDBITE OF ARCHIVED RECORDING)

DAVID PATTON: That score that people receive will directly impact how much time they spend in prison. It is vital.

JOHNSON: These kinds of risk tools are common in the criminal justice system in many states. But Pattern is the first time the federal government has been using an algorithm with such high stakes.

Jim Felman is a lawyer in Tampa, Fla. He's been following the First Step Act for years now.

JIM FELMAN: The significance of this risk assessment tool - it divides all federal prisoners, essentially, into two groups - people who can get credit for doing this programming and get out early and people who can't.

JOHNSON: The implementation has been rocky. The Justice Department finished the first version of Pattern in a rush because of a tight deadline from Congress. The DOJ said that tool suffered from math errors and human errors, so it made some tweaks. About 14,000 men and women in federal prison still wound up in the wrong risk categories. And there were big disparities for people of color.

Aamra Ahmad is senior policy counsel at the ACLU.

AAMRA AHMAD: From the beginning, civil rights groups cautioned Congress and the Justice Department that use of a risk assessment tool to make these determinations would lead to racial disparities.

JOHNSON: Authorities have corrected some of the sloppy mistakes, but those racial disparities persist. Ahmad says they're clear in the Justice Department's own data, released before Christmas.

AHMAD: The Justice Department found that only 7% of Black people in the sample were classified as minimum-level risk, compared to 21% of white people. This indicator alone should give the Department of Justice great pause in moving forward.

JOHNSON: Pattern overpredicted the risk that Black, Hispanic and Asian people in prison would commit new crimes or violate rules, but it underpredicted the risk for some inmates of color when it came to possible return to violent crime.

Patricia Richman works on national policy issues for the Federal Public and Community Defenders.

PATRICIA RICHMAN: When you use a term like risk assessment tool, it has this patina of science. It sounds highly technical. But it's not. A risk assessment tool is just a series of policy decisions.

JOHNSON: Policy decisions like what should count and how much - take criminal history. That can be a problem because law enforcement has a history of overpolicing some communities of color. Then there is education level and whether someone paid restitution to their victims. Those factors can intersect with race and ethnicity, too.

Melissa Hamilton is a professor of law and criminal justice at the University of Surrey. Hamilton studies risk assessments. She says she's glad to see the Justice Department has tweaked the algorithm, but she says there's still a ways to go to make it work.

MELISSA HAMILTON: The legislation, I think, came from a good place for most of the congresspersons who voted for it. It's just - the rule of unintended consequences is not really realizing the impediments it was going to have.

JOHNSON: The Justice Department says it's aware of the problems with Pattern, and it's working with experts to make the risk assessment tool more fair and more accurate. Another overhaul of the tool is underway. Then Justice has to reevaluate the 14,000 people in prison it says got lumped into the wrong category.

Sasha Costanza-Chock is director of research and design for the Algorithmic Justice League, which studies the social implications of artificial intelligence.

SASHA COSTANZA-CHOCK: This is just one example of the ways that harmful AI systems are being rolled out in everything from, you know, the criminal legal system to employment decisions to who gets access to housing and social benefits.

JOHNSON: Costanza-Chock says the burden is on the Justice Department to prove the Pattern tool doesn't have racist and sexist outcomes.

COSTANZA-CHOCK: Especially when the systems are high risk and affect people's liberty - we need much clearer and stronger oversight.

JOHNSON: Jim Felman is the Florida lawyer working with the American Bar Association to monitor the First Step Act. He worries the tool is already putting many prisoners of color at a disadvantage.

FELMAN: We will start to see more prisoners get out early. My concern is that the color of their skin will not be reflective of fairness.

JOHNSON: Aamra Ahmad from the ACLU says she's seen enough.

AHMAD: There are no technical fixes to these problems that could make Pattern and similar tools safe and fair to use. We would urge the Justice Department to suspend the use of Pattern until it can adequately address these concerns.

JOHNSON: Melissa Hamilton, who studies risk assessment, says Pattern may be worth saving. Consider the alternative, Hamilton says - decisions made by people who have all kinds of biases.

HAMILTON: So that's the unfortunate thing - is it's better than gut instinct of very flawed humans that we all are. And can we improve it more than marginally? And that's what we're working on.

INSKEEP: NPR's justice correspondent Carrie Johnson is still with us. And, Carrie, what is the Justice Department saying about what you found?

JOHNSON: DOJ didn't want to talk on tape about this, but they sent a written statement. Attorney General Merrick Garland has directed people in the department to try to address some racial bias in this tool and also to make the process more transparent. One option on the table, Steve, is to adjust the cut-off points between these risk categories. It's technical but important. It would allow more prisoners to take programs, earn credits for release and eventually get released early. Justice says it also wants to keep public safety in mind here, too. It's highly technical stuff. It could take a while to finish. And remember, there are still 14,000 men and women in prison who need to be reevaluated 'cause they got lumped into the wrong category in the first place. Even if DOJ moves ahead, it's not clear they could find a way to eliminate all the racial bias here. That's why some advocates want to see Justice and Congress just drop this algorithm altogether.

INSKEEP: NPR national justice correspondent Carrie Johnson - Carrie, thanks as always for your reporting.

JOHNSON: Thank you.

(SOUNDBITE OF MAKAYA MCCRAVEN'S “INNER FIGHT”)

Transcript provided by NPR, Copyright NPR.
