Shreya Sampath and Marisa Syed are with Encode Justice, a student group partnering with ACLU-NJ to explore the impact of school surveillance on New Jersey students.   

Once while taking an exam, one of us asked our teacher how she could see everything we did on our computer screens. She said that the school’s IT department had access to everything we did on our computers – not just during the test, but even when we were not in school. That did not sit well with us. The ominous feeling that we were being watched has lingered, a feeling that resonates with students around New Jersey and across the country. It made us realize that as public school students, we are subject to mass surveillance enabled by unvetted advanced technologies every day.

As two of the student leaders of the New Jersey chapter of Encode Justice, a global youth-led coalition advocating for human-centered artificial intelligence, our mission is to fill the vacuum of knowledge surrounding just how far this kind of surveillance goes. In partnership with the ACLU of New Jersey, we have been researching the alarming amount of surveillance that students are subjected to by schools and police. Our research has revealed that schools in New Jersey:

  • Spend tens of thousands of dollars on tools that scan everything a student writes, communicates, or searches to flag “concerning” content using artificial intelligence. In some school districts, that information could be shared directly with the police. 
  • Spend millions of dollars on surveillance cameras with facial recognition and other AI-enabled capabilities, with no proof that these expensive tools make us safer. 
  • Spend thousands of dollars on tools to monitor everything students do on social media and might punish students for what they post, even when those posts have nothing to do with school. 

It is clear from our research that school monitoring software does not serve the interests of all students. Instead, the practice of student digital monitoring desensitizes youth to the surveillance state, disproportionately impacting the lives of lower-income, minority, and LGBTQ+ students.  

Growing up as members of Gen Z in the age of social media companies, facial recognition cameras, and monitoring tools has made us – and many others our age – desensitized to the idea of mass surveillance. Though school surveillance has existed in some form for decades, the explosion of AI-powered surveillance followed an increase in the distribution of school-sponsored laptops and the move to remote learning. Administrators began monitoring the actions and speech of students using algorithms and technology. The stated purpose of this technology is to protect students from unsuitable content, ensure that students do not misuse resources, and flag activity that denotes issues related to mental health, substance use, and harassment that need immediate attention.

But studies have shown that six in ten students report they do not feel comfortable expressing their true thoughts and feelings online when their activity is being monitored – a clear display of the harmful chilling effect of school surveillance that deters students from exercising their rights and freedoms. In a school context, this means students are scared to research contentious issues for assignments, express their opinions, and communicate their feelings using school-sponsored computers.  

And students who are unaware of the extent of the surveillance they are being subjected to might share sensitive information on their computers without knowing who may see it. Students commonly explore sensitive topics like mental health or sexuality online, but these surveillance technologies put that kind of personal information at risk without their consent. In a country turning more hostile to the LGBTQ+ community every day, these tools risk “outing” students and exposing them to discrimination.

Low-income and minority students are particularly vulnerable to this heightened surveillance. At one of our high schools, low-income students make up 60% of the population. For many, a school-sponsored computer is their only avenue to participate in the many parts of our lives that happen online. In certain cases, when an algorithm flags that online activity, the software might contact the police directly before notifying parents or school administrators, subjecting those students and their families to unnecessary policing. These systems subject low-income students to over-policing or push them away from online pursuits for fear of being criminalized.

The companies that make surveillance tools often market their technologies as a response to the student mental health crisis, encouraging schools to use them as a substitute for mental health infrastructure. But by relying on algorithms and content moderators detached from student experiences, these technologies dehumanize students. We found that the North Brunswick school district spent over $58,000 for just one year of a software tool that monitors student communications with AI. That money could instead go to real mental health resources.

Ultimately, students care about privacy, too. We care that our data is collected and commodified by anonymous third parties. We care that many of our peers censor themselves in learning environments whose purpose is to encourage exploration and free thought. We care that the implications of surveillance are wide-reaching and more severe for lower-income minority communities. As AI-powered surveillance continues infiltrating new environments, we must regulate its uses. Unchecked AI perpetuates – and worsens – already existing discrimination. Together, we must encourage school districts to make evidence-based decisions that center the needs of students. Heightened surveillance is not the solution.