Some Missouri schools use AI, digital tools to prevent suicides. What does that mean for student privacy?
Advocates for the digital monitoring tools say they help save lives. Experts warn that the programs disproportionately flag content from certain groups of students, and that schools need more than software to address youth mental health concerns.

New Franklin School District uses a free program called Bark to monitor student activity for signs of mental health crises, bullying or harassment. (Meg Cunningham/The Beacon)
As a school counselor in Neosho, Missouri, Tracy Clements was skeptical that an AI tool could help her students’ mental health, or make any dent in the school’s suicide prevention efforts.
She joined the Neosho School District in the fall of 2012. Over the years, she started to notice a trend.
“I was an elementary building-level counselor for probably six years, and we just kept having all these suicides,” Clements said. “I was like, ‘Why is nobody doing anything about this?’ And nobody seemed to know what to do.”
She became the director of counseling for the district, and started developing a communitywide mental health task force to try to chip away at the problem.
After a few years, she was approached with a new tool: a beta version of AI software that would monitor student activity on school computers and flag signs that a student could be in crisis.
That software and others like it were booming as students were confined to their homes during the COVID pandemic, with the objective of keeping students safe while they weren’t physically in school.
“I was like, you know, ‘I don’t have time to play with your toys,’” she said. “‘Have fun with it, but I have real work to do.’”
Within weeks, Clements said, she got an alert that a student had searched how much of a medicine it took to kill someone. She tried calling the student’s parents and couldn’t reach them. So she drove to the student’s home, where she saw pills and a glass of water on the kitchen table.
“We got all the services in place, and then I just cried the whole way home,” she said.
The close call changed her mind about how effective the program could be. And at the time, Neosho needed help.
In 2020, 31% of students in Newton County — home to Neosho — reported to the state Department of Mental Health that they were always or often very sad in the past 30 days, compared to 25% of other surveyed Missouri students. Newton County students also reported slightly higher rates of hopelessness and of changes in appetite and sleep, which can all be linked to mental health struggles.
That year, the district adopted GoGuardian Beacon, a version of student activity monitoring software that focuses specifically on mental health and student safety. It sends different types of alerts to school administrators based on how urgent situations seem. In the case of after-hours alerts, districts often use relationships with local law enforcement to check on students and ensure their safety.
GoGuardian claims that since 2020 its Beacon software has protected nearly 20,000 students nationwide from physical harm, and that counties using the software have 26% lower youth suicide rates.
Neosho averaged two student suicides a year before the program was adopted. The company credits the effort with student suicides dropping to zero over the past four years.
There are no studies showing exactly how effective GoGuardian’s program and others like it — such as Bark, Gaggle or Securly — are at keeping students safe. While GoGuardian focuses on mental health flags, other programs flag things like hate speech, gender expression, bullying or harassment.
Despite indications that such software can help stem suicides, it also raises questions. Privacy experts warn of the potential downsides of the software. And student mental health advocates say schools must find ways outside of these programs to ensure students experiencing crises have other tools to address their needs.
Does suicide prevention outweigh privacy concerns?
Clements said she was most surprised by the program’s ability to catch students who may be at risk, and found it was flagging students much earlier than other metrics the district was using.
GoGuardian Beacon “was catching things sooner, before their behavior started to change on the outside, because kids are more honest with Google,” Clements said. “Those high-flyer kids who have good social skills and can mask when they are struggling — those kids we were able to catch further upstream and get services in place.”
When Neosho was rolling out the program, the school district held a sparsely attended informational meeting for parents to learn more about the software. The district received little public feedback, but consent to the program became part of its technology release forms: by allowing their child to use a school-supplied laptop, parents were agreeing to have the student’s activity monitored.
The AI software flags certain search terms, and GoGuardian also has trained staff who screen the alerts behind the scenes before they are sent to school officials. The school can give feedback on how accurate each alert was, to help the system continue learning.
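The broad shape of that pipeline — keyword matching, severity tiers, human screening and a feedback loop — can be pictured in a few lines of code. The sketch below is purely illustrative: the phrases, tiers, and function names are assumptions for demonstration, not GoGuardian’s actual rules, code or API.

```python
# Illustrative sketch only: a simplified keyword-flagging pipeline with
# severity tiers, human review and a feedback log. All phrases, tiers and
# names here are hypothetical, not any vendor's actual system.
from __future__ import annotations
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical example phrases mapped to severity tiers.
SEVERITY_RULES = {
    "high": ["how much medicine to overdose", "how to kill myself"],
    "medium": ["i feel hopeless", "nobody would miss me"],
    "low": ["i can't sleep", "i'm so stressed"],
}

@dataclass
class Alert:
    student_id: str
    matched_phrase: str
    severity: str
    timestamp: datetime = field(default_factory=datetime.now)
    reviewer_verdict: str | None = None  # filled in by a human screener

def flag_activity(student_id: str, text: str) -> Alert | None:
    """Return an Alert if the text matches a monitored phrase, else None."""
    lowered = text.lower()
    for severity, phrases in SEVERITY_RULES.items():
        for phrase in phrases:
            if phrase in lowered:
                return Alert(student_id, phrase, severity)
    return None

def record_feedback(alert: Alert, verdict: str, log: list[Alert]) -> None:
    """Store the school's accuracy feedback so the rules can be tuned later."""
    alert.reviewer_verdict = verdict  # e.g. "accurate" or "false positive"
    log.append(alert)

if __name__ == "__main__":
    feedback_log: list[Alert] = []
    alert = flag_activity("student-042", "How much medicine to overdose on?")
    if alert:
        print(f"{alert.severity.upper()} alert: {alert.matched_phrase}")
        record_feedback(alert, "accurate", feedback_log)
```

In a real product the matching would be far more sophisticated than substring checks, and the human review and feedback steps are what the districts describe relying on most.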
“I think we have kids that are able to stay at school and successfully complete school because they’re getting the services they need,” said Tim Lewis, Neosho School District’s chief of police. “I’ve been a part of it for the last four or five years, and it has grown, and it is getting better and better every day.”
The New Franklin School District in Howard County, Missouri, uses a free version of a program called Bark to scan student activity in a similar way. The district has about 400 students, but has generated hundreds of thousands of alerts since 2022.
“It scans their emails, documents, slides, presentations, sheets, all of those kinds of things,” said Jackie Starke, a former teacher who is now a consulting technology director for New Franklin.
“Sometimes our kids are dealing with issues that we don’t realize,” Starke said. “It has helped us identify some students who may have considered self-harm and get them routed.”
From 2020 to 2022, Missouri was ranked 28th nationwide for its youth suicide rate.
In 2024, 21.5% of Missouri students surveyed about their mental health said they had attempted to harm themselves in a deliberate but not suicidal way. Suicidal ideation has also declined slightly since 2021, when 22% of high school students said they had seriously considered suicide, compared to 11.7% of students in 2024.
Still, about 56% of students surveyed in 2024 said they were sometimes, often or always “very sad.” And the numbers are consistently higher for female students compared to their male counterparts.
How are suicide prevention digital monitoring tools applied across student groups?
New Franklin and Neosho both have access to every alert that has been generated across their districts since they adopted these programs. Schools say the programs help keep students safe, but privacy experts have concerns about how they’re being used and what could be done with that data.
“Ultimately, the tool is available to the school, and there are many ways to customize your use of these tools,” said Kristin Woelfel, policy counsel for the Center for Democracy and Technology’s civic technology team. “All it takes is for somebody to decide, ‘We want to find all students who fit into this category, or we want to flag this particular content.’ That in itself is a little scary.”
The Center for Democracy and Technology (CDT) runs an annual survey of students, parents and teachers whose schools use student surveillance programs. In their 2024 survey, 88% of teachers said their school uses some form of technology to track students’ online activity.
It’s a booming industry. An ACLU research report found that by 2023, lobbyists had secured more than $300 million in federal funding to improve school safety through these software products. In 2021, K-12 schools and colleges across the U.S. spent about $3.1 billion on this type of security software, up from $2.7 billion in 2017.
But the way those programs are applied, Woelfel said, changes across demographics.
For example, CDT found that students with an individualized education program (IEP) or a 504 plan, accommodations for students with disabilities, were more likely to get in trouble for how they responded to school staff when their online activity was flagged. Of that group, 33% of students got in trouble for how they responded, versus 27% of students without an IEP or 504 plan.
They also found that LGBTQ+ students were more likely to experience adverse effects. CDT found that 55% of LGBTQ+ students said they got in trouble or know someone who got in trouble as a result of their activity that was flagged, versus 41% of non-LGBTQ+ students.
Another 18% of LGBTQ+ students said they were outed by the software, or know of someone who was.
That concern has been addressed by some companies. In early 2023, Gaggle announced it would no longer flag students who use words like “gay” and “lesbian” in school assignments and chat messages.
The ACLU report also found that algorithms have been shown to be worse at recognizing and categorizing Black dialects, and tools used to screen online comments for hate speech and cyberbullying disproportionately flag posts from Black students.
The CDT survey found that Black and Hispanic students were more likely than white students to report content being filtered or blocked: 42% of Black students and 33% of Hispanic students said content associated with students of color was more likely to be blocked or filtered, compared to 28% of white students.
“There are particular students who find themselves a little bit more vulnerable when these technologies are being used,” Woelfel said.
When surveyed, 42% of parents said they’d like to opt their child out of AI use for student safety purposes, and 81% of parents said it was important that they be given the chance to weigh in on how the software is used.
How school devices are used for different demographics of students is another element Woelfel worries about. If a student has access to a personal computer at home, that could provide them with more privacy. But if they rely on their school computer, their risk could be higher, she said.
“Should privacy be reserved to people who can pay for it?” Woelfel said. “I think that’s a valid question.”
The districts told The Beacon that they hadn’t talked with the state about these software products and how they should be used. There’s no estimate of how many Missouri school districts are using these digital tools for student safety purposes.
When contacted, the Missouri Department of Elementary and Secondary Education said that schools that accept federal e-rate money for internet connectivity and equipment are required to follow the Children’s Internet Protection Act, which calls for safety policies and the use of technology to filter obscene and harmful content.
Beyond technology, how can schools keep students safe in the digital age?
In New Franklin, administrators use the data and their knowledge of the student to assess how much of a threat an alert may be, Starke said.
“I can see if there is a pattern, or maybe somebody who hits an alert a little more often,” she said. “There are some kids, if they hit an alert, they go up the priority list … because there are potential issues that could have been serious.”
“We can’t go without it,” she added. “There are just too many things that students have access to compared to before they had phones and computers in their hands all of the time.”
Starke and the district have discussed upgrading to the paid version of Bark, which would allow parents to see alerts. But she questions if total access to the alerts would be helpful for every student.
“I would want to investigate that more,” Starke said. “That can be a tricky situation, because sometimes parents are the biggest part of the issue.”
Clements, the former Neosho counselor, doesn’t give GoGuardian Beacon full credit for helping catch students in crisis. She pointed to all of the previous communitywide work that taught students coping skills and what resources they had access to, if they needed them.
And mental health experts urge districts to ensure students are getting the support they need outside of digital monitoring.
But Missouri’s lack of mental health resources is pervasive, both in rural and urban areas, said Keith Herman, a curators’ professor in educational psychology at the University of Missouri.
He and his colleagues developed an early identification system for schools to assist students who may be at risk of experiencing mental health crises. The system included a student survey and a path for support for students that doesn’t rely on technology, which can’t pick up on all of the subtleties of mental health.
“They lack some of the more nuanced focus on identifying kids early with emerging risks, and then getting them the support and services in school or the community to prevent some of the things that threat detection software systems are detecting much farther downstream,” Herman said.
But especially in rural schools, monitoring software can help pick up some of the slack when districts may be understaffed or have less access to nearby resources.
“In those communities, often there’s not enough mental health providers to refer kids to when they do have needs that are outside the scope of the school,” Herman said.
And connecting students with resources, or knowing how to follow up to make sure they’re safe, is often the hardest part.
“It’s not enough to give tools like detection systems,” Herman said. He said schools need robust training on how to handle the systems, how to tailor them to their needs, and how to properly respond to students. And they need the tools to help students out, even if their mental health doesn’t seem to be in an urgent crisis.
But many schools don’t have sufficient resources to seek out that training for their teachers, Herman said, which can be a barrier to using these tools as safely and effectively as possible.
Ultimately, he said, a school and communitywide lens, not one focused on individual students, is the best way to try to address student mental health.
“Getting adults in a school or community to think about … ‘What can we do differently in our environment to have fewer kids having these types of concerns?’” Herman said. “That type of thinking centers mental health as not something to be ashamed of … but more of something that we have a collective responsibility for thinking through.”
This article first appeared on Beacon: Missouri and is republished here under a Creative Commons Attribution-NoDerivatives 4.0 International License.