The ACLU of New Jersey started the Automated Injustice Project with a central question in mind: how has the New Jersey government’s use of artificial intelligence, algorithms, or automated decision systems affected the rights and well-being of people living in the Garden State?

Every day, governments make thousands of decisions that impact our lives. Some of these decisions affect us on a large scale: How much state funding will your local school district receive this year? Many more affect us in much smaller, but no less powerful, ways: Who should the police investigate as a suspect in a crime?

Each of these decisions, big and small, is often determined by a government algorithm. Many forms of algorithms and artificial intelligence have been shown to perpetuate and worsen racial inequity, deprive people of the ability to contest unfair outcomes, and fundamentally change how people interact with the government. So how are algorithms used to make decisions like these, and why?

Through our Automated Injustice Project, we’ve begun investigating a few instances when the New Jersey government has deployed algorithms and automated decision systems: 

  • Facial recognition: What happens when New Jersey law enforcement relies on artificial intelligence to attempt to identify possible suspects in a photograph? 
  • Medicaid budgeting: How can a formula and automated assessment determine the healthcare New Jerseyans are eligible to receive? 
  • Pretrial risk assessments: Why does a judge refer to a computer-generated report to decide whether New Jerseyans present a danger to their communities? 
  • Domestic violence risk assessments: Should an automated system decide why and how survivors of domestic violence receive public benefits?

Whenever the New Jersey government uses an algorithm to make a decision, we want to know about it. Algorithms are notoriously opaque, and even the “what,” “how,” and “why” of their use is often shrouded in mystery. The Automated Injustice Project seeks to expose government algorithms, explore how New Jerseyans are impacted by them, advocate for comprehensive oversight of their use, and, at times, question whether they should be used at all.

The systems we’re investigating encompass different technologies and represent the wide variety of algorithms used by the New Jersey government. Some involve complex digital systems often considered to be artificial intelligence, while others require humans to work in tandem to arrive at a determination. But all these systems have one thing in common: They replace human judgment with automated decision-making.

For some of the systems we’re investigating, it’s already clear what dangers they pose to the New Jersey community, and, as a result, we’ve called for bans on their use. For others, their secrecy makes it that much harder to figure out just how far their harm goes – but our effort to bring transparency and accountability to government algorithms will endure for as long as New Jerseyans’ civil rights and civil liberties are at stake.

Over the next four weeks, join us as we release a multimedia series explaining our findings so far on how government algorithms affect all New Jerseyans. Subscribe now.

To share how government algorithms or automated systems may have impacted your life, tell us your story.

Animation Playlist

A secretive algorithm, health care, and your rights.

Algorithms ration care by limiting the quantity of services that insurance providers are willing to fund for each patient. But when the formula gets it wrong and misrepresents someone’s needs, the results can be catastrophic, denying people vital services they need to live safely and independently in their communities. Read the blog.

More in this series

Further Reading

Automating Inequality

By Virginia Eubanks. Learn more. Read an excerpt.


Report: "Family Surveillance by Algorithm"

By Anjana Samant, Aaron Horowitz, Kath Xu, and Sophie Beiers, ACLU. Read more.


Report: "Poverty Lawgorithms"

By Michele Gilman, Data & Society. Read more.
