Is School Surveillance Going Too Far? Privacy Leaders Urge a Slowdown
To surveil or not to surveil? That is the question U.S. schools are grappling with as they face mounting pressure to better protect students against a growing gamut of safety concerns, from school shootings to bullying and self-harm.
Increasingly, many schools and states are deciding that the answer is to surveil—through social media monitoring, facial recognition cameras, threat assessment and other emerging technologies. But that decision does not come without its share of dissenters. One of the highest-profile examples involves New York’s Lockport City School District, which is using public funds from a Smart Schools bond to help pay for a reported $3.8 million security system that uses facial recognition technology to identify individuals who don’t belong on campus.
The Lockport case has drawn the attention of national media, the ire of many parents and criticism from the New York Civil Liberties Union, among other privacy groups. (The state legislature is also considering a measure that would delay the technology’s implementation.) And yet Lockport is just one of many districts that have turned to new technology systems to address safety concerns.
“We’re seeing this rush to solutions, without the proper thoughtfulness, assessment or consideration of whether or not this is truly [effective] and how to implement this properly,” says Linnette Attai, the privacy project director at the Consortium for School Networking (CoSN), an association of K-12 technology leaders.
A growing chorus of education and privacy leaders is speaking out about the role of surveillance technology, questioning whether it belongs in America’s schools or raises more issues than it solves.
Last week, the Future of Privacy Forum (FPF), a nonprofit think tank based in Washington, D.C., published an animated video that illustrates the possible harm that surveillance technology can cause to children and the steps schools should take before making any decisions, such as identifying specific goals for the technology and establishing who will have access to the data and for how long.
One of the clearest takeaways across these resources: Schools need to slow down.
“There is absolutely a growing market for this [technology],” says Sara Collins, a policy counsel for FPF’s Education Privacy Project. But schools typically don’t know how to communicate or justify it to parents, she adds, “and they don’t necessarily know why they’re doing it.”
The four-minute video FPF created, which is intended to be an introduction to school safety technology, cautions schools, districts and states against turning places of learning into a “prison-like environment where students feel like Big Brother is always watching,” as the narrator puts it.
That last point may be a reference to new technology that lets schools monitor students in and outside the classroom, tracking their social media posts and flagging key phrases that may indicate a willingness to harm themselves or others.
That particular type of surveillance is not just looming on the horizon, says Collins. It’s already here.
“This is probably the most pervasive student [safety] technology now,” she says, naming Gaggle, Bark, GoGuardian and Securly as companies that provide these services. According to Collins, each of these companies claims between 1,000 and 5,000 schools as customers, “so a significant portion of schools right now.”
When it comes to monitoring students’ internet searches and social media activity, troubles abound. For one, it can create a lot of false positives. As the video explains, schools can wind up receiving countless alerts for innocuous posts that say “You’re the bomb” or “I’d kill for an ice cream cone right now,” because the phrases in them register as potentially dangerous. Someone still has to sift through those alerts and mark them as safe.
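To see why such systems generate so many false positives, consider a minimal sketch of naive keyword flagging. The keyword list and matching logic here are hypothetical illustrations, not any vendor’s actual system; real products are more sophisticated, but context-blind matching of this kind is the root of the problem the video describes.

```python
# A minimal sketch of naive keyword flagging, illustrating why
# innocuous posts trigger false positives. The keyword list and
# matching logic are hypothetical -- not any vendor's actual system.
FLAGGED_KEYWORDS = {"bomb", "kill", "shoot"}

def flag_post(text: str) -> list[str]:
    """Return the flagged keywords found in a post, ignoring all context."""
    words = {w.strip(".,!?'\"").lower() for w in text.split()}
    return sorted(words & FLAGGED_KEYWORDS)

posts = [
    "You're the bomb",                           # compliment, still flagged
    "I'd kill for an ice cream cone right now",  # idiom, still flagged
    "See you at practice tonight",               # genuinely innocuous
]

for post in posts:
    hits = flag_post(post)
    if hits:
        # A human reviewer must now sift through this alert by hand.
        print(f"ALERT: {hits} in {post!r}")
```

Because the matcher has no notion of idiom or intent, two of the three posts above raise alerts, and each one lands on a human reviewer’s desk.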
Another issue, says Amelia Vance, FPF’s director of education privacy, is determining who has access to data generated from these technologies and how much time should pass before that data can be deleted. For instance, if schools want to give law enforcement officials access to their video surveillance, they should only grant that access in a crisis. In many cases, schools and states haven’t even contemplated those questions, Vance says.
A two-page brief from the Center for Democracy & Technology (CDT) and the Brennan Center for Justice notes that both federal and state policymakers are pushing for greater technology use in schools for safety purposes, and highlights several considerations for education institutions before adopting safety technologies. Those considerations include cost and governance, but the authors also advise schools to pay attention to how accurate or invasive a technology is, and whether it discriminates against certain populations.
“States, districts, and schools are under immense pressure to adopt new security measures, but rushing this process could hurt students without measurably contributing to school safety,” the brief says. “Many technological initiatives that have been proposed or implemented are unproven and come with significant risks to students’ privacy, free expression, and safety. Before ramping up data collection and digital surveillance in schools, decision-makers should consider the real effects and unintended consequences of these measures on students and families, and take steps to mitigate them.”
Among CoSN members, which include district IT directors, chief information officers and other school technology leaders, these issues have been top of mind as well, says Keith Krueger, CoSN’s CEO.
In February, CoSN put out a members-only brief addressing this topic, specifically as it pertains to facial recognition. Attai, the group’s privacy lead, led that effort. In the brief, she lays out the possible benefits and risks of adopting facial recognition technology, including its privacy implications and compliance with state and federal laws.
“I think every education association is figuring out how to make schools safer right now,” Krueger says. “Some of these new solutions are about technology, and suddenly our [CoSN] audience is involved in those decisions about what technologies to deploy and in a way that is responsible.”
He adds: “Sometimes there’s a rush to do things simply because we can.”