An Algorithm Cannot Protect Constitutional Rights

In an effort to bring more standardized decision-making to our criminal justice system, many courts have turned to "risk assessment tools" to determine whether and under what conditions an accused person should be released before trial. But as a new, in-depth ProPublica report shows (and as many, including former Attorney General Eric Holder, have long suspected), risk assessment tools may perpetuate racial bias in our justice system and, perhaps even more disconcertingly, cloak that discrimination in a veil of scientific certainty. The latest news on risk assessments is another example of the vital role the right to counsel plays in any criminal justice reform: the best way, perhaps the only way, to ensure that judges make fully informed decisions about pretrial release is to be certain that any person facing a loss of liberty has a lawyer.

Risk assessments have been touted as an effective tool to predict whether a defendant will pose a risk to public safety or fail to appear before trial. They consist of a series of questions about a defendant whose answers are then plugged into a proprietary (and therefore secret) algorithm, which produces a risk "score" for the accused. These tools have become ubiquitous in our national conversation about criminal justice reform, and many courts across the country now rely on them to make decisions about the pretrial freedom of the accused, as well as sentencing decisions for those found guilty. Without an objective instrument to determine whether a person is a safety or flight risk, proponents argue, too many individuals are unnecessarily jailed - a great waste of individual lives and taxpayer dollars.
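The basic mechanics can be sketched in a few lines. The weights, questions, and cutoffs below are invented for illustration; the actual formulas used by commercial tools are proprietary and unpublished.

```python
# Hypothetical sketch of a questionnaire-style pretrial risk score.
# All weights and cutoffs here are invented for illustration only;
# real tools keep their scoring formulas secret.

def risk_score(answers):
    """Sum invented weights for each yes/no questionnaire answer."""
    weights = {
        "prior_arrests": 3,          # invented weight
        "relative_incarcerated": 2,  # invented weight
        "unemployed": 2,             # invented weight
        "unstable_housing": 1,       # invented weight
    }
    return sum(weights[q] for q, yes in answers.items() if yes)

def risk_label(score):
    """Bucket the numeric score into the labels a judge would see."""
    if score <= 2:
        return "low"
    if score <= 5:
        return "medium"
    return "high"

answers = {
    "prior_arrests": False,
    "relative_incarcerated": True,
    "unemployed": True,
    "unstable_housing": True,
}
print(risk_label(risk_score(answers)))  # prints "medium"
```

Even this toy version shows the concern raised later in the column: several inputs (employment, housing, an incarcerated relative) track economic status rather than conduct, yet they move the score all the same.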

But, just how accurate are risk assessments?

In its 2015 report, Don't I Need a Lawyer?: Pretrial Justice and the Right to Counsel at First Judicial Bail Hearing, The Constitution Project's National Right to Counsel Committee--comprising judges, prosecutors, defenders, and other criminal justice stakeholders--addressed the use of risk assessment tools at the pretrial stage. The committee emphasized that without defense counsel to provide the context missing from a risk assessment score, that number alone is not meaningful. The committee also warned that relying solely on a risk assessment tool could lead to disparate outcomes.

ProPublica is the latest entity to confirm the problems with some risk assessment tools. After examining the risk scores for over 7,000 defendants in Broward County, Florida, ProPublica researchers checked to see how many were charged with new crimes over the next two years. Their findings were alarming.

First, ProPublica found that the tool was highly unreliable in predicting future crime: only 20% of the people predicted to commit violent crimes actually went on to do so. When misdemeanors and traffic violations were included, the algorithm's accuracy rose to just over 60%--in the words of the report, "somewhat more accurate than a coin flip." But the researchers' most startling conclusion was that black defendants who did not go on to re-offend were labeled higher risk at nearly twice the rate of white defendants who did not re-offend. The reverse was true for white defendants: they were more likely to be labeled low risk, yet more frequently went on to commit new crimes.
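The disparity ProPublica describes is a difference in false-positive rates between groups. The arithmetic can be illustrated with made-up cohort counts (these are not ProPublica's actual figures, which appear in their published analysis):

```python
# Illustrative arithmetic (invented counts, not ProPublica's data)
# showing the kind of group-level error-rate gap the report found:
# among people who did NOT re-offend, what share were still
# labeled high risk?

def false_positive_rate(flagged_high_risk, total_non_reoffenders):
    """Share of non-re-offenders who were wrongly labeled high risk."""
    return flagged_high_risk / total_non_reoffenders

# Hypothetical cohorts of defendants who did not re-offend:
fpr_group_a = false_positive_rate(45, 100)  # 45% wrongly flagged
fpr_group_b = false_positive_rate(23, 100)  # 23% wrongly flagged

print(round(fpr_group_a / fpr_group_b, 1))  # prints 2.0
```

The point of the exercise: a tool can post the same overall accuracy for every group while distributing its mistakes very unevenly, flagging one group's non-re-offenders far more often than another's.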

Paradoxically, risk assessment tools were created in an effort to remove much of the subjectivity--including racial bias--that permeates our criminal justice system. But, as ProPublica's findings illustrate, the business of predicting "future dangerousness" is a fraught one. Questions included in risk assessments often resemble those that could easily serve as surrogates for race, ethnicity, or economic status. The accused is often asked whether he or she has a friend or relative incarcerated, or whether the person is employed or has housing. Living in the wrong part of town--or living without stable housing--can create a higher risk score and consequently more onerous and expensive conditions to obtain pretrial freedom.

Our criminal justice system is operated by fallible individuals who bring their experiences, along with overt and implicit biases, to their decision-making. However, ours is an adversarial system in which defense counsel's zealous representation serves as a safeguard. Unfortunately--as The Constitution Project's report explains--in the vast majority of courtrooms nationwide, no attorney is present to represent the accused when a judicial officer makes the (often life-altering) decision regarding pretrial freedom.

The right to a lawyer is the right through which all other constitutional guarantees are protected. Without one, judges are more likely to attach a financial condition to pretrial release, often resulting in an indigent defendant's pretrial incarceration. Defendants jailed from the point of arrest also face substantial prejudice in their ability to conduct an immediate investigation, prepare for trial, and build a defense. Collateral consequences flow from pretrial incarceration as well: the accused may lose a job, his or her home, and the ability to support loved ones. As a result, our committee has called for states to provide counsel at bail hearings for every defendant unable to afford one on his or her own.

An algorithm cannot protect constitutional rights. The recent findings on risk assessments, which are so often trotted out as a panacea for systemic bias, provide yet another reminder that no criminal justice reform is effective without states' adherence to the single constitutional imperative in all criminal cases: that the accused be represented by effective counsel regardless of his or her ability to pay for one.

This column was co-authored with Sarah Turberville, Director of Justice Programs, and Madhu Grewal, Senior Counsel, at The Constitution Project.