An iMessage thread of students discussing their favorite music artists. Selfies sent on a student’s personal phone while plugged into their computer. A discussion board post about the novel “To Kill A Mockingbird.”
These are all examples of content flagged by surveillance software across the nation, each resulting in disciplinary action after being searched, sent as a message or simply typed on a school-issued device. SMSD is no different in its use of surveillance software.
The growth of surveillance software in schools sets the stage for privacy violations, ultimately posing more harm than benefit to the mental and emotional well-being of students.
The term digital surveillance describes a variety of AI-based software tools used to monitor device activity remotely. Sold by private, third-party companies to school districts under the guise of preventing school shootings and suicide, this technology monitors students’ internet searches, messages and other computer data 24/7, all of which is then uploaded and stored by these companies.
Though advertised as the be-all-end-all solution to preventing the next Parkland, digital surveillance isn’t the solution to threats of violence and self-harm; it’s the problem. Instead of invading students’ privacy, schools should focus more attention and funding on mental health support.
And despite claims that these safety measures will put students at ease by catching potential safety threats, the knowledge that their every keystroke is being recorded, sent to and stored by private big tech companies is anything but comforting. This new culture of internet policing created by digital monitoring also has an unintended consequence: students feel uncomfortable reaching out for mental health help out of fear of being reported to and disciplined by administration, according to the Center for Democracy & Technology.
A high schooler in a depressive episode, spiraling into suicidal thoughts, should not have to doubt whether they can search for resources online without being reported to their school’s administration.
Plus, digital surveillance doesn’t affect all students equally.
LGBTQ+ students are especially targeted by these programs. Gaggle, one surveillance technology product, openly admits to flagging terms like “gay” and “lesbian,” supposedly to prevent cyberbullying. But blacklisting these terms altogether strips away a layer of support from vulnerable queer youth seeking answers about their identity. Worse, The Roosevelt High School Southerner accuses school districts of using these programs to “out” queer students to their parents and teachers, setting an especially dangerous precedent for just how much schools can spy on students and what they can do with that information.
Similar to LGBTQ+ students, students of color suffer disproportionately from surveillance software because it over-reports certain words or phrases.
The AI used in school device monitoring isn’t capable of accurately interpreting ethnic vernaculars or other non-standard English speech patterns, African American Vernacular English being a prime example. In fact, leading AI models are 2.2 times more likely to flag content written in AAVE, according to digital privacy researcher Nir Kshetri. That’s especially troubling for a population already disproportionately disciplined in school, at more than three times the rate of their white peers, according to the New York Times.
Since schools spend so much money on digital monitoring programs, averaging upwards of $10,000 annually, they should at least be effective at their purpose. However, this isn’t the case. Given the prevalence of “false flags,” or AI-misinterpreted language taken as a threat, this software is unreliable at best and dangerous at worst.
Investing in digital policing programs isn’t the answer to preventing self-harm and shooting threats. A potential school shooter being linked to guns through their school computer is a highly unlikely scenario, and this software already takes a severe toll on students’ mental health. Instead, the $30,000 SMSD spends annually on digital security monitoring should be invested in mental health resources and counselors.
Safety is the most important consideration in this argument, and spending tens of thousands of dollars on dysfunctional, minority-policing, glorified spyware is not the solution.
The 2024-25 editorial board consists of Addie Moore, Avery Anderson, Larkin Brundige, Connor Vogel, Ada Lillie Worthington, Emmerson Winfrey, Sophia Brockmeier, Libby Marsh, Kai McPhail and Francesca Lorusso. The Harbinger is a student-run publication. Published editorials express the views of the Harbinger staff. Signed columns published in the Harbinger express the writer’s personal opinion. The content and opinions of the Harbinger do not represent the student body, faculty, administration or Shawnee Mission School District. The Harbinger will not share any unpublished content, but quoted material may be confirmed with the sources. The Harbinger encourages letters to the editors, but reserves the right to reject them for reasons including but not limited to lack of space, multiple letters on the same topic and personal attacks contained in the letter. The Harbinger will not edit content, though letters may be edited for clarity, length or mechanics. Letters should be sent to Room 400 or emailed to smeharbinger@gmail.com.