The Problem Is Not the Criminal Justice Risk Assessment Tool: It’s Systemic Racism
Kelly Roberts Freeman, Mari McGilton

The justice system process routinely involves people making predictions. Judges, for instance, consider future risk to public safety to make important legal decisions, such as pretrial release or supervision requirements. Criminal justice risk assessment was developed to make these predictions of misconduct more systematic and uniform. But many experts say risk assessment tools are developed with inaccurate crime data that reflect systemic biases. This is especially problematic for high-stakes applications, such as when risk information is used to make pretrial release or detention decisions.

Despite the criticisms, research shows risk assessment tools can determine someone’s risk better than subjective judgment alone. And risk assessment tools, unlike individual judgment calls, can be examined for racial biases, which may help explain why they are receiving so much attention. It is important to recognize that these tools do not operate apart from the justice system or the systemic issues that plague it. Rather, they should be understood within this context and with an eye toward promoting racial justice.

The problem with the data sources and tool development

Risk assessment tools are only as good as the data used and the decisions made by the people who develop and use them. Development requires administrative data containing detailed information on people involved in the justice system. Because the information reported in the data largely relies on legal interpretation and decisionmaking (such as criminal history records) and because people of color are disproportionately affected by the justice system, racial biases are likely built into the data and the risk assessment tool itself. Most notably, these tools often use arrest data to measure both past offending and future criminal behavior, but these measures also reflect the behaviors of law enforcement, prosecutors, and courts. So risk assessment tools that use official data would more accurately be described as predicting future justice involvement than future criminal behavior.
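
To make this concrete, here is a minimal sketch in Python (using pandas and scikit-learn) of how such a tool might be fit. Every column name and value is an illustrative assumption, not any real instrument; the point is that the outcome column is a rearrest flag drawn from official records, so whatever the model learns to predict is justice involvement, not offending itself.

```python
# Minimal sketch of fitting a risk model on administrative data.
# All column names and values are hypothetical; real tools differ.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Toy administrative records (illustrative only).
df = pd.DataFrame({
    "prior_arrests":       [0, 2, 5, 1, 3, 0, 4, 2],
    "age_at_first_arrest": [25, 19, 17, 22, 18, 30, 16, 21],
    # The label is *rearrest*, an official-records outcome that
    # reflects policing and charging decisions, not offending per se.
    "rearrested_2yr":      [0, 1, 1, 0, 1, 0, 1, 0],
})

X = df[["prior_arrests", "age_at_first_arrest"]]
y = df["rearrested_2yr"]

model = LogisticRegression().fit(X, y)

# The model's output is best read as the probability of future
# *justice involvement* (rearrest), not of future criminal behavior.
probabilities = model.predict_proba(X)[:, 1]
```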

The problem with risk assessment use and interpretation

Scores and associated risk levels (low, medium, high) calculated from these tools are used to make risk-based or risk-informed decisions about programming, treatment, detention, or sentencing conditions. Because the information going into risk scores can reflect racist judgments, policies, and practices (PDF) that disproportionately affect Black communities, agencies should consider how this information is embedded in their risk tool and within their system. For instance, criminal history information is relevant to decisionmaking, but it can be recorded and interpreted unsystematically and subjectively. It may be possible to mitigate biases in the information used to score risk assessment tools by including only specific case outcomes and offense types. A history of convictions, for instance, may be less affected by racial biases than a history of arrests.
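
As a minimal sketch of that mitigation, assuming hypothetical record data and column names, the snippet below counts prior convictions only rather than all recorded arrests; real record systems are far messier, and this does not remove bias so much as narrow one channel for it.

```python
# Sketch: scoring criminal history from convictions rather than arrests.
# Column names and records are hypothetical, for illustration only.
import pandas as pd

history = pd.DataFrame({
    "person_id":   [1, 1, 1, 2, 2],
    "record_type": ["arrest", "conviction", "arrest", "arrest", "arrest"],
})

# Counting all arrests bakes enforcement patterns into the score;
# counting only convictions may be less affected by those biases.
prior_convictions = (
    history[history["record_type"] == "conviction"]
    .groupby("person_id")
    .size()
)
print(prior_convictions)  # person 1 has 1 conviction; person 2 has none
```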

Additionally, risk assessment tools do not produce perfect predictions about whether a specific person will engage in a specific behavior, such as reoffending. Rather, they produce likelihoods based on the group characteristics and recidivism outcomes of the people in the data used to develop the tool. And because these tools categorize risk scores, or likelihoods, into discrete risk categories (PDF), even a one-point difference in score can mean the difference between a medium- and a high-risk label. This can then lead to different case and treatment outcomes, such as a greater likelihood of detention or an increased community-supervision level. Risk scores for people of color have been shown to be higher than those for white defendants, which reflects structural disadvantages in arrest, housing, and employment. Moreover, risk is not a direct measure of wrongdoing or harm.
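
A short sketch illustrates the cliff effect of categorization. The cutpoints below are assumptions chosen for illustration; actual tools set their own.

```python
# Sketch: mapping raw risk scores to discrete categories.
# The cutpoints (3 and 6) are hypothetical, chosen for illustration.
def risk_category(score: int) -> str:
    """Bin a raw score into a risk label using assumed cutpoints."""
    if score <= 3:
        return "low"
    elif score <= 6:
        return "medium"
    return "high"

# A one-point difference can flip the label and, downstream, the
# detention or supervision decision tied to it.
print(risk_category(6))  # "medium"
print(risk_category(7))  # "high"
```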

The criminal justice system can be the solution

Risk assessment tools offer structured, objective, and accountable (PDF) alternatives to subjective decisionmaking in the justice system. But to address systemic biases, developers should be transparent about each tool’s data sources, decision points, models, and limitations. Policymakers and legal officials must understand the strengths and weaknesses of the data used to create the risk odds and categories. They should also scrutinize the information used to score each item and understand its limitations and potential biases. Knowing what each risk assessment tool does and does not tell them, and how to use risk information, will support more just decisionmaking.

More broadly, researchers and practitioners should recognize that racial biases exist throughout the justice system, and there is a need to identify, limit, and correct injustices (PDF) to address this problem in using justice data. With this goal in mind, the criminal justice field could consider adopting standards for assessing bias and differential effects across populations. Criminal justice agencies could also welcome opportunities to challenge the kinds of information used in legal decisionmaking, such as the role of prior arrests or justice involvement. Rethinking how the criminal justice field uses risk assessment tools could help break down racist practices that permeate the system and further racial equity and justice.
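
One concrete form such a standard could take is routinely reporting error rates by group. The sketch below, run on made-up data, computes one common differential-effect check, the false positive rate (the share of people who were not rearrested but were labeled high risk) for each group; which metrics and thresholds should matter is exactly what a field standard would need to settle.

```python
# Sketch: one possible differential-effect check, computed on
# made-up evaluation data -- the false positive rate by group.
import pandas as pd

# Hypothetical data: group, predicted label, observed outcome.
df = pd.DataFrame({
    "group":      ["A", "A", "A", "B", "B", "B", "B", "A"],
    "high_risk":  [1,   0,   1,   1,   1,   0,   1,   0],
    "rearrested": [0,   0,   1,   0,   0,   0,   1,   1],
})

# False positive rate: P(labeled high risk | not rearrested), per group.
fpr = (
    df[df["rearrested"] == 0]
    .groupby("group")["high_risk"]
    .mean()
)
print(fpr)  # large gaps between groups would flag differential effects
```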

