
Lifting the Veil on the Design of Predictive Tools in the Criminal Legal System

The voices of impacted communities must be integrated into the design process of predictive tools used in the criminal legal system.
Tobi Jegede,
Data Scientist,
ACLU
Marissa Gerchick,
she/her/hers,
Data Scientist and Algorithmic Justice Specialist,
ACLU
Amreeta Mathai,
Former Staff Attorney,
ACLU's Racial Justice Program
Aaron Horowitz,
Former Head of Analytics,
ACLU
November 22, 2023

Recently, the National Institute of Justice (NIJ), the research arm of the Department of Justice, put out a call for researchers to participate in what it called the "Recidivism Forecasting Challenge." The challenge was designed to use information about people on parole in Georgia to "improve the ability to forecast recidivism using person- and place-based variables," encourage "non-criminal justice forecasting researchers to compete against more 'traditional' criminal justice researchers," and provide "critical information to community corrections departments." Challenge contestants were awarded a collective total of $723,000 for their submitted models.

While described by the NIJ as a successful effort that "demonstrate[d] the value of open data and open competition," in reality the challenge was marked by serious and fundamental flaws. One of the winning papers encapsulated the problems best when its authors said, "We are hesitant to accept any insights gained from submitted models and question the reliability of their performance. We would also discourage the use of any submitted models in live environments." Six of the other 25 winning papers also expressed concerns about the use of models created for the challenge in real-world environments.

So, what contributed to the challenge鈥檚 failures?

In our paper critiquing the challenge, we argue that a failure to engage impacted communities (those whose data was used for the challenge), as well as public defenders and other advocates for those communities, contributed to some of the failures of this project. The standard going forward should draw on recent resources to inform decision-making about whether to develop predictive tools at all. These efforts should center on developing strong protections for the people whose data is used to build automated systems and for the people who may ultimately be evaluated by those systems if they are deployed.

So, why does this matter?

The NIJ has a lot of power, given its position within the Department of Justice, to shape the way that local community corrections departments think about recidivism. We submitted a Freedom of Information Act request to the DOJ to try to better understand how the results of the challenge have been or will be used, but have not yet received a response. While it is not fully clear how the DOJ will use the results of the challenge, the NIJ has already signaled that these types of tools are important to it by spending close to $1 million creating and executing the challenge. Furthermore, the DOJ, through the Bureau of Prisons, already uses a risk assessment tool, PATTERN, to make critical decisions about incarcerated populations. The use of this tool has been roundly criticized by several civil rights organizations.

Beyond influencing decisions about imprisonment and government surveillance, the data produced by law enforcement agencies and the predictions generated from risk assessment tools are often used in making decisions that can have a profound impact on people's lives, including loss of parental rights, homelessness, prolonged job insecurity, immigration consequences (including deportation), and inability to access credit. The voices of those impacted by these tools should be embedded in their design and implementation, as these are the individuals who will have to suffer the consequences of poorly designed systems. By involving impacted communities in the development of predictive tools, the design of these systems may look dramatically different, or the tools may be determined not to be useful at all.

For more information about the NIJ's Recidivism Forecasting Challenge and its shortcomings, check out our paper below. It was presented at the Association for Computing Machinery's conference on Equity and Access in Algorithms, Mechanisms, and Optimization at the end of October, where it won an Honorable Mention for the New Horizons Award.
