The ACLU of New Jersey started the Automated Injustice Project with a central question in mind: How has the New Jersey government’s use of artificial intelligence, algorithms, or automated decision systems affected the rights and well-being of people living in the Garden State?
Every day, governments make thousands of decisions that impact our lives. Some of these decisions affect us on a large scale: How much state funding will your local school district receive this year? Many more affect us in much smaller, but no less powerful, ways: Who should the police investigate as a suspect in a crime?
Each of these decisions, big and small, is often determined by a government algorithm. Many forms of algorithms and artificial intelligence have been shown to perpetuate and worsen racial inequity, deprive people of the ability to contest unfair outcomes, and fundamentally change how people interact with the government. So how are algorithms used to make decisions like these, and why?
Through our Automated Injustice Project, we’ve begun investigating instances in which the New Jersey government has deployed algorithms and automated decision systems.
Whenever the New Jersey government uses an algorithm to make a decision, we want to know about it. Algorithms are notoriously opaque, and even the “what,” “how,” and “why” of their use is often shrouded in mystery. The Automated Injustice Project seeks to expose government algorithms, explore how New Jerseyans are impacted by them, advocate for comprehensive oversight of their use, and, at times, question whether they should be used at all.
The systems we’re investigating encompass different technologies and represent the wide variety of algorithms used by the New Jersey government. Some involve complex digital systems often considered to be artificial intelligence, while others require humans to work in tandem with the technology to arrive at a determination. But all these systems have one thing in common: They replace human judgment with automated decision-making.
For some of the systems we’re investigating, it’s already clear what dangers they pose to the New Jersey community, and, as a result, we’ve called for bans on their use. For others, their secrecy makes it that much harder to figure out just how far their harm goes – but our effort to bring transparency and accountability to government algorithms will endure for as long as New Jerseyans’ civil rights and civil liberties are at stake.
Over the next four weeks, join us as we release a multimedia series explaining our findings so far on how government algorithms affect all New Jerseyans. Subscribe now.
To share how government algorithms or automated systems may have impacted your life, tell us your story.