Fall has arrived in the northern hemisphere, summer has (allegedly) come to an end, and school is back in session. The school year brings anxiety for children and parents alike. We worry about assignments, friendships, grades, and, particularly in the U.S., we worry about safety.
Technology can be an attractive option to school districts that are struggling to react to changing threat models, an increasingly tech-savvy student base, and busy parents seeking convenient access to information. Students use technology in their classrooms, to complete assignments at home, and to access general information, like a syllabus, as well as highly personalized information, such as test scores.
Education technology certainly has its benefits.
Students are spared the panic of the lost syllabus and the strained backs and shoulders that many of us suffered lugging our bulky textbooks (in the snow, uphill, both ways). But ed-tech procurement and the administration of school websites are tasks often assigned to school staffers who are under-resourced and lack expertise in the privacy and security implications of ed tech. A recent study by EdTech Strategies revealed a “widespread lack of attention to issues of online security and privacy” on education agency websites, including the use of insecure connections and of tracking and surveillance software in contravention of the schools’ own privacy notices.
In August, The New York Times highlighted school website privacy in its Learning section, noting the difficulty schools have navigating the murky data-handling practices of third-party providers: “knowingly or otherwise, many school sites are hosting software from third-party companies whose primary business is buying and selling data. … Those third parties may invite still other trackers onto the site, without the school’s knowledge or control.”
Though many schools may be unwittingly too lax with controls on data security, the same schools may be drawn to over-consume technology for physical security. The recently enacted Marjory Stoneman Douglas High School Public Safety Act requires the Florida Department of Education to coordinate with the Department of Law Enforcement to create a centralized data repository and analytics integrating vast amounts of student data. The repository will include student social-media data, as well as data from the Department of Children and Families, the Department of Law Enforcement, the Department of Juvenile Justice, and other local law enforcement. The repository will be live by December 2018, and though the act requires compliance with data privacy laws, it lacks guidance on how to achieve such compliance.
Similarly, many schools are deploying body cameras and facial-recognition software with the stated intent of improving physical safety, but it is doubtful whether these tools actually improve safety, and their privacy implications are largely unaddressed.
A recent pilot of the facial-recognition software SAFR was limited to parents (expressly excluding students) who opted in to the pilot. As Wired’s Issie Lapowsky noted, “[i]f all schools were to use SAFR the way it's being used in [the pilot]—to allow parents who have explicitly opted into the system to enter campus—it seems less likely to do much harm. The question is whether it will do any good.”
Lapowsky’s conclusion underscores a key concern about the use of surveillance technologies: Vast amounts of data are collected, but the potential benefits of such data collection are often outweighed by the risks. As Clare Garvie of Georgetown Law School’s Center on Privacy & Technology warned, "My primary concerns have been, and remain, that the tech will be used in a widespread manner with very little oversight and essentially no rules."
While acknowledging the potential benefits of limited data sharing to support school safety, Elizabeth Laird of the Center for Democracy & Technology cautioned that such practices “should not come at the expense of the students it is intended to protect.”
Bill Gates and Steve Jobs famously restricted their own children’s access to, and use of, the technology their own companies developed, and low-tech school options have become increasingly popular in Silicon Valley.
Joe Clement and Matt Miles’s 2017 book, "Screen Schooled: Two Veteran Teachers Expose How Technology Overuse Is Making Our Kids Dumber," offers a deep dive into the costs and benefits of using technology in education and makes a compelling case for less technology in the classroom and more face-to-face instruction. Clement and Miles dismiss the argument that schools must teach technology for children to thrive in an increasingly tech-fueled world, noting that “[s]tudents need no help from schools developing their tablet, smartphone, or Twitter skills. They are doing this on their own. What they need help with is critical thinking, problem solving, and community building.”
Rob Glaser, who founded the company that produces the SAFR tool, concedes that tools like facial-recognition technology are not buttoned-up solutions for schools, and welcomes debate on the topic. “This is becoming something we, as a society, have to talk about," he writes. "That means the people who care about these issues need to get involved, not just as hand-wringers but as people trying to provide solutions. If the only people who are providing facial recognition are people who don’t give a [expletive] about privacy, that’s bad.”
Though Glaser was speaking specifically about facial-recognition technology, as privacy pros we should consider Glaser’s call for discussion more broadly. We may not all be experts on ed tech and student-data privacy laws, but we are all here because we do care about privacy, and we should consider what our roles can or should be in regard to technology in schools.