
When someone is charged with a crime and arrested, a judge must decide whether to detain them pending trial. If they are forced to remain in jail, it becomes increasingly difficult to avoid taking a plea deal – and facing its consequences – even if the charges against them are wrongful. People who are jailed before trial also receive far harsher prison sentences if convicted, while those allowed to fight their charges from home will likely get a much fairer outcome. This makes the decision about pretrial freedom one of the most consequential in a person’s entire case – and in New Jersey, the criminal legal system relies on algorithms to inform that decision.

About the Public Safety Assessment

In 2017, as part of abolishing cash bail, New Jersey adopted a pretrial risk assessment algorithm called the Public Safety Assessment, or PSA, to inform whether someone should be detained pending trial.

"People who are jailed before trial also receive far-harsher prison sentences if convicted. If allowed to fight their charges from home, they will likely get a much fairer outcome."

The PSA scores two aspects of a defendant’s “risk” on a scale of one to six – how likely they are to fail to appear at future court dates, and how likely they are to be rearrested – and those scores inform the judge’s decision about whether the defendant should be released. The PSA also provides a “flag” if it calculates an elevated risk that the person might be rearrested for a crime of violence. New Jersey’s current system presumes that people have a right to remain free while they fight their charges; they can be detained only if a judge – informed by the PSA – deems that they cannot be released safely.
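To make these mechanics concrete, here is a minimal Python sketch of a points-based assessment of this general shape. The input fields mirror the nine factors the PSA is publicly described as drawing on, but every weight, cut point, and flag rule below is a hypothetical illustration, not the PSA’s actual formula.

```python
# Illustrative points-based pretrial risk assessment.
# Factor names follow the nine publicly described PSA inputs; the weights,
# cut points, and flag logic are hypothetical, NOT the PSA's real formula.
from dataclasses import dataclass


@dataclass
class Defendant:
    age_at_arrest: int
    current_violent_offense: bool
    pending_charge_at_offense: bool
    prior_misdemeanor_conviction: bool
    prior_felony_conviction: bool
    prior_violent_convictions: int       # count of prior violent convictions
    prior_fta_past_two_years: int        # failures to appear, last two years
    prior_fta_older_than_two_years: bool
    prior_incarceration_sentence: bool


def to_six_point_scale(points: int, cut_points: list[int]) -> int:
    """Map a raw point total onto the 1-6 scale via hypothetical cut points."""
    return 1 + sum(points >= c for c in cut_points)


def assess(d: Defendant) -> dict:
    # Hypothetical point totals for the two risk dimensions.
    fta_points = (
        d.pending_charge_at_offense
        + min(d.prior_fta_past_two_years, 2) * 2
        + d.prior_fta_older_than_two_years
    )
    nca_points = (
        (d.age_at_arrest <= 22)
        + d.pending_charge_at_offense
        + d.prior_misdemeanor_conviction
        + d.prior_felony_conviction
        + min(d.prior_fta_past_two_years, 2)
        + d.prior_incarceration_sentence
    )
    # Hypothetical flag for elevated risk of rearrest for a violent offense.
    violence_flag = d.current_violent_offense and d.prior_violent_convictions >= 1
    return {
        "failure_to_appear": to_six_point_scale(fta_points, [1, 2, 3, 4, 5]),
        "new_criminal_activity": to_six_point_scale(nca_points, [1, 2, 3, 4, 5]),
        "violence_flag": violence_flag,
    }


# Example: a hypothetical 21-year-old with a pending charge and one recent
# failure to appear.
print(assess(Defendant(
    age_at_arrest=21, current_violent_offense=False,
    pending_charge_at_offense=True, prior_misdemeanor_conviction=True,
    prior_felony_conviction=False, prior_violent_convictions=0,
    prior_fta_past_two_years=1, prior_fta_older_than_two_years=False,
    prior_incarceration_sentence=False,
)))  # {'failure_to_appear': 4, 'new_criminal_activity': 5, 'violence_flag': False}
```

Notice that nothing in a structure like this considers context – why a court date was missed, or why one neighborhood sees more arrests than another – which is exactly the criticism raised below.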

The Questions Raised by Pretrial Risk Assessment Algorithms

These kinds of pretrial risk assessment algorithms have come under heavy scrutiny in recent years, and for good reason: they can perpetuate racial bias in how they evaluate risk. People of color, who are more likely than white people to be arrested and to be sentenced more harshly for the same kinds of crimes, will likely receive higher risk scores – a result of decades of systemic racism and targeted over-policing of Black and brown communities, context that the assessment itself ignores.
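To see how that dynamic plays out numerically, the toy simulation below gives two groups identical underlying behavior but different levels of police contact; because a risk score counts only recorded priors, the more heavily policed group ends up with systematically higher point totals. All rates here are assumptions chosen for illustration, not real-world estimates.

```python
import random

random.seed(0)


def average_risk_points(arrest_rate: float, n: int = 10_000) -> float:
    """Average hypothetical 'prior record' points for a group whose members
    behave identically but face a given chance that any past incident
    resulted in an arrest record."""
    total = 0
    for _ in range(n):
        past_incidents = random.choice([0, 1, 2])  # same for both groups
        # An incident adds risk points only if it produced an arrest record.
        total += sum(random.random() < arrest_rate for _ in range(past_incidents))
    return total / n


# Identical behavior, different (assumed) policing intensity.
print(f"lightly policed group: {average_risk_points(arrest_rate=0.2):.2f} avg points")
print(f"heavily policed group: {average_risk_points(arrest_rate=0.6):.2f} avg points")
```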

This speaks directly to some of the Automated Injustice Project’s most pressing questions: can an algorithm that looks at just nine factors about a defendant honor the due process owed to every person under the Constitution? And further, should someone’s life be reduced to a number on a page – a non-comprehensive assessment of reality – when their liberty is on the line?

Our approach to these essential questions is informed by what we’ve seen in New Jersey, which first implemented the PSA in 2017 as part of its effort to eliminate cash bail in nearly all criminal cases. What has been the impact of this massive shift away from cash bail? And how has the PSA affected that landscape?

New Jersey’s Experience with Pretrial Risk Assessment 

Since the removal of cash bail and the adoption of the PSA in 2017, New Jersey has drastically reduced its jail population at no cost to public safety. Defendants have continued to appear in court at similar rates, crime has not increased, and people now spend less time in jail. This points to a major success of New Jersey’s abolition of cash bail – incarcerating people before they have been convicted is cruel and unnecessary.

"Proponents of the pretrial risk assessment algorithm had hoped it would make decisions free from bias, unlike human judges, but that just isn’t true."

But the PSA has not addressed the systemic racial bias in who is detained pending trial. Before bail reform and the adoption of the PSA, Black defendants represented 54% of New Jersey’s jail population. Five years later, that figure remains unchanged, even though the overall jail population has shrunk dramatically. Proponents of the pretrial risk assessment algorithm had hoped it would make decisions free from bias, unlike human judges, but that just isn’t true.

New Jersey’s experience with the PSA underscores the need for comprehensive regulation and oversight of government algorithms and automated decision systems. The government should be obligated to continually assess the algorithms it relies on to better understand their impact on marginalized communities – and these systems shouldn’t be shielded from public scrutiny and accountability. Some studies have already analyzed the performance of the PSA, but understanding its impact, and the effect of systems like it, is a crucial undertaking that needs support.

Only when accompanied by a robust infrastructure of oversight regulating their use should we even begin to consider incorporating algorithms into decisions as important and life-altering as those affecting a person’s civil rights and civil liberties.

We have the right to know that the government is treating us fairly and with dignity. New technologies like algorithms and automated decision systems are no exception.
