Privacy professionals are watching concerns compound during the COVID-19 outbreak as the effects of the virus bring new problems and magnify prior issues. Student privacy is one area where this collision of problems has been most visible.
The decision by schools across the world to move to online learning has only exacerbated both preexisting and new privacy issues with the technology facilitating virtual learning experiences. In adopting these technologies, many schools and teachers have prioritized ease of use over choosing a product or platform that best preserves student privacy.
"What’s been left out of the conversation is teachers at the K-12 levels and college professors randomly adopting non-education software or (educational technology) without privacy vetting," Future of Privacy Forum Director of Youth and Education Privacy Amelia Vance said. "They’re adopting whatever they can to provide online learning and continue working with their students, but there are significant privacy issues there."
Among the privacy problems that come with online learning tech are the collection and potential use of students' personal information, as well as the use of products or platforms that are not designed for children. Such issues can bring violations of the U.S. Family Educational Rights and Privacy Act, Children's Online Privacy Protection Act and state-specific children's privacy laws, along with potential breaches of laws not directed at children, like the California Consumer Privacy Act and the Illinois Biometric Information Privacy Act.
Concerns may indeed stem from the mad rush to get online. The pandemic has made on-site learning an impossibility, forcing schools to shift classrooms online within an expedited timeline to keep schedules and students on track. Such a quick turnaround has pushed schools that are unfamiliar with edtech or online learning to choose less privacy-friendly technologies and platforms.
"There needs to be some kind of foresight to dictate the professional development or processes that need to be set up before launching things," Vance said. "In the absence of being told what to do and how to do it, people are going to do what they know how to do and follow random advice."
There are privacy consultancy options for schools lacking the resources to dedicate internal staff to properly vetting technologies. In addition to states offering vetting services through regional education centers, Vance said schools have opted into consortium contracts to oversee proper vendor privacy practices. Vance added that individual teachers can refer to federal guidance or advocacy perspectives on proper student privacy, including those from FERPA|Sherpa and Common Sense Media.
In addition to the adoption of privacy-invasive edtech products, schools and teachers have been solicited by non-educational companies and platforms that lack children's privacy policies and may not comply with children's privacy laws. Vance and UCLA School of Law PULSE Fellow in Artificial Intelligence, Law & Policy Elana Zeide said they've seen platforms for webinars, social media, gaming and other non-educational activities advertise themselves as the "complete" resolution to online learning needs.
However, not only do those platforms potentially lack children's privacy standards, but they may also collect data on students as they would a normal adult consumer, according to Zeide.
"Students or children may use these products, but then their data is being sucked into the general stream of data and data uses that happen with commercial, for-profit platforms," Zeide said. "That means a profile could be created about a student where information is being exchanged over the course of what is supposed to be an educational experience and then used for targeted or behavioral advertising."
Teleconference platform Zoom, which has received considerable scrutiny since the start of the pandemic over general privacy concerns, is undoubtedly one non-educational platform to which schools and teachers have turned. Zoom has been an easy target for frustrations given its mass adoption, but Consumer Reports Privacy Researcher Bill Fitzgerald is focused beyond Zoom.
"Every video conferencing company marketing services right now should be sending Zoom an enormous thank-you card and a box of chocolates," Fitzgerald said. "Zoom is getting all the attention that should be otherwise spread evenly to all these companies."
Fitzgerald pointed to Google Meet as another hazardous platform promoting itself to schools. In his own research, Fitzgerald found Google Meet has flaws related to scheduling meetings and meeting access that are comparable to those for which Zoom has come under fire.
Beyond teleconference platforms, Zeide has seen and heard instances of teachers attempting to hold lessons via social media platforms or YouTube.
"If that’s a one-way lecture then sure, but any student interaction there is personally identifiable student information," Zeide said. "Teachers communicating or holding educational experiences via social media is not appropriate. Sounds engaging, up to date and easy to adopt, but in no way, shape or form are those platforms designed for use by children or students."
Fitzgerald and Zeide were quick to point out that beyond the issues with non-educational platforms, many of the children's privacy concerns with online learning are not new. Privacy pros have previously flooded edtech companies with compliance questions and queries on data sharing and storage practices.
"Privacy has been getting shortchanged in the education space for years before COVID-19," Fitzgerald said. "What’s happening now with everyone having first-hand experiences with software at a level they never have is that areas that have been ignored are getting the attention quickly."
According to Vance, the effects of privacy issues, both old and new, from this online shift "won't be known for a while." She added that companies seeking to mitigate potential privacy violations and protect kids properly should refer to best practices, like general transparency, clear and concise privacy policies, identifying the risks of a product or platform and more.
Fitzgerald sees a way forward with schools and teachers "having clear communication about what is and isn’t working," while edtech and consumer companies "show some humility." On the other hand, Zeide isn't so sure the damage hasn't already been done.
"I’m afraid schools adopting out of expedience will continue to use these technologies and platforms in less-than-ideal ways or with existing problems," Zeide said. "It would be a kind of path dependency, which wouldn’t allow schools to go back and revisit concerns."