How One Company Surveils Everything Kids Do and Say In School

Caroline Haskins

For the 1,300 students of Santa Fe High School, participating in school life means producing a digital trail — homework assignments, essays, emails, pictures, creative writing, songs they’ve written, and chats with friends and classmates.

All of it is monitored by student surveillance service Gaggle, which promises to keep Santa Fe High School kids free from harm.

Santa Fe High, located in Santa Fe, Texas, is one of more than 1,400 schools that have taken Gaggle up on its promise to “stop tragedies with real-time content analysis.” It’s understandable why Santa Fe’s leaders might want such a service. In 2018, a shooter killed eight students and two teachers at the school. Its students are now among the 4.8 million US students that the for-profit “safety management” service monitors.

A college student whose middle school used Gaggle told BuzzFeed News that the tool taught them that they would always be watched. “I feel like now I’m very desensitized to the threat of my information being looked at by people,” they said.

Using a combination of in-house artificial intelligence and human content moderators paid about $10 an hour, Gaggle polices schools for suspicious or harmful content and images, which it says can help prevent gun violence and student suicides. It plugs into two of the biggest software suites around, Google’s G Suite and Microsoft 365, and tracks everything, including notifications that may float in from Twitter, Facebook, and Instagram accounts linked to a school email address.

Gaggle touts itself as a tantalizingly simple solution to a diverse set of horrors. It claims to have saved hundreds of lives from suicide during the 2018–19 school year. The company, which is based in Bloomington, Illinois, also markets itself as a tool that can detect threats of violence.

“… Maybe it teaches students some lessons, but not the ones we want them to learn.”

But hundreds of pages of newly revealed Gaggle documentation and content moderation policies, as well as invoices and student incident reports from 17 school districts around the country obtained via public records requests, show that Gaggle is subjecting young lives to relentless inspection, and charging the schools that use it upward of $60,000. And it’s not at all clear whether Gaggle is as effective in saving lives as it claims, or that its brand of relentless surveillance is without long-term consequences for the students it promises to protect.

Parents and caregivers have always monitored their kids. Sarah Igo, a professor of history at Vanderbilt University who studies surveillance and privacy, told BuzzFeed News that modern schools are often a de facto “training house for personhood.” But student surveillance services like Gaggle raise questions about how much monitoring is too much, and what rights minors have to control the ways they’re watched by adults.

“It just seems like maybe it teaches students some lessons, but not the ones we want them to learn,” Igo said.

“Questionable content”

Gaggle is one of the biggest players in today’s new economy of student surveillance, in which student work and behavior are scrutinized for indicators of violence or a mental health crisis, and profanity and sexuality are policed.

“[There’s] a very consistent and long-standing belief that children have fewer rights to their own communications, and to their own inner thoughts and to their own practices,” Igo told BuzzFeed News.

Gaggle uses an in-house, AI-powered filtering system to monitor everything that a student produces on their school account associated with Google’s G Suite or Microsoft’s 365 suite of tools. It scans student emails, documents, chats, and calendars, and compares what students write against a “blocked word list,” which contains profanity as well as references to self-harm, violence, bullying, or drugs. A Gaggle spokesperson told BuzzFeed News the list is regularly updated and “based on the language commonly used by children and adolescents for almost a decade.”

The service also runs images uploaded by students through an “Anti-Pornography Scanner” (also proprietary and powered by AI). Gaggle, citing the sensitivity of its proprietary information, declined to tell BuzzFeed News how these tools were trained, answer questions about the original training sets, or say whether Gaggle’s AI tools learn based on what students put into G Suite and Microsoft 365.

Among the many banned words and phrases on Gaggle’s list are “suicide,” “kill myself,” “want to die,” “hurt me,” “drunk,” and “heroin.” Gaggle also commonly catches profanity — in 17 US school districts, about 80% of posts flagged by Gaggle within a particular school year were flagged for such words, according to documents obtained by BuzzFeed News.

According to Gaggle’s company wiki, student emails are scanned in real time, and those that violate the rules are blocked from their intended recipients. Meanwhile, documents and files are moderated “after the fact.”

When Gaggle’s software flags student content for any reason at all, it passes the material to one of the company’s 125 Level 1 safety representatives. If the first reviewer thinks there’s an imminent risk to student safety, Gaggle said, the material is forwarded to one of a core group of 25 “trained safety professionals.” These employees are responsible for flagging content to school administrators.

Gaggle wrote on its company wiki in 2017 that content moderators individually reviewed “over a million blocked student communications each month.”

Gaggle analyzes student data and funnels its conclusions to a “Safety Management Dashboard.” School administrators with access can see rule violations alongside numeric IDs of the students who committed them. The dashboard includes a “Top Concerns” graph, which displays a scoreboard of students who have violated Gaggle’s rules over a particular time period. Administrators can expand the graph into a list that ranks the students within their district based on how often they violate Gaggle’s rules.

Gaggle ranks rule-breaking incidents according to three tiers of severity. A “violation” may include any use of language that breaches Gaggle rules, including false positives, like sending song lyrics or quoting from a book. “Questionable Content” includes material that’s concerning but not an imminent threat to student safety. This category could include cyberbullying or sending “professional pornographic images or files,” according to Gaggle. School officials are alerted when these occur.

A “Possible Student Situation” is the most severe type of content violation. According to Gaggle, it represents an “immediate threat” to the safety of students, including “student-produced pornography, violence, suicide, rape, or harmful family situations.” If a Gaggle safety representative determines that a piece of pornography is student-produced, Gaggle automatically sends the file to the National Center for Missing and Exploited Children, which maintains a database of child pornography.

Gaggle operates by a “three strike rule,” meaning that mild rule violations can be flagged to school administrators if a student commits them repeatedly. For instance, if a student wrote “fuck” three times, school officials would be alerted. According to Gaggle, students who accumulate three strikes have their account privileges limited until a school official restores them. It’s unclear whether a student would lose email privileges in these situations, since email can be necessary for communicating with teachers and completing assignments.

Analytics designed to keep track of massive troves of student data might seem like useful tools for school administrators. However, critics worry that schools are teaching students to accept sweeping forms of surveillance.

“My sense about this particular suite of products and services is that it’s a solution in search of a problem,” Roberts said, “which is to say that the only way that the logic of it works is if we first accept that our children ought to be captured within a digital system, basically, from the time they’re sentient until further notice.”

“If a student opts out of Gaggle, then they would not be able to use the school-provided technology and would have to use their personal email addresses for their school work — and that personal email would not be scanned by Gaggle,” a Gaggle spokesperson told BuzzFeed News.  

With suicide one of the leading causes of death for people under 18 and school shootings an ever-present concern, Gaggle’s student surveillance has a clear appeal to school administrators. When a robust team of student counselors is financially impossible, and when school administrators believe something is better than nothing, Gaggle becomes a feasible choice.

However, experts doubt whether Gaggle, or any kind of surveillance, is an adequate or appropriate response to adolescent suicide and school shootings.

“We need to think about the consequences of using these kinds of surveillance technologies on students.”

“If it’s done right, I don’t actually think there’s anything inherently wrong with [Gaggle],” Fasulo said. “It just has to be done well, and it can’t be used as the end-all, be-all — which I think people are often looking for.”

Published by Intentional Faith

