
The Privacy Advisor | Privacy in the Classroom: What You Need To Know About Educational Software


Remember that “permanent record” the children of yesteryear feared? It might finally be a reality. As educational software platforms become more ubiquitous in learning, educational privacy issues are gaining prominence. The topic will likely take its place as a top-level priority this year as parents, educators and administrators take greater notice of the potential issues coming down the road.

Technology tools have stormed into the learning sector. The Software and Information Industry Association (SIIA), the principal trade association for the software and digital content industry, estimates the value of the market for educational software and applications at $7.9 billion. Unfortunately, some of that tremendous volume stems from the unfettered use of technology tools in schools. In some cases, privacy issues aren’t addressed by either the software providers or the school districts that use their programs. Across many platforms, students and parents aren’t asked to give consent for the creation, storage or sharing of the multitudes of records that now exist.

Another factor contributing to the burgeoning educational software market is the increased personal use of applications by parents and children. Research released in January shows that educational applications are the second most popular category in the Apple App Store, comprising just over 10 percent of all app downloads. This indicates a tremendous interest in learning applications across a wide, technically savvy—and growing—demographic.

All of this growth in learning technology is giving rise to a number of different types of tracking and privacy concerns. At the base level, a mixture of for-profit and non-profit suppliers in the industry means there are few standards and varying privacy controls across the multitude of software platforms. This lack of standards and regulations results in data sharing guidelines that are murky at best. In addition to concerns about how, when and why student information may be shared, the manner in which all this data could be used is also a cause for anxiety, and for good reason.

Classroom management software, such as ClassDojo and others, provides for behavioral and disciplinary tracking of students. While much of this functionality is a positive for teachers overseeing a classroom with limited resources, there are potentially negative impacts educators must consider. There is a worry that these types of software may label a student as a “problem child.” That tag could stick with the student in the system long after any behavioral issues have been corrected, possibly affecting how future educators approach, evaluate and work with them. This concern has already surfaced in practice. Recent friction over ClassDojo’s data storage practices, for example, resulted in that platform choosing to shorten its data retention schedule to only one year.

Predictive algorithms—widely used in these types of platforms—have a number of similar benefits, from helping to steer kids toward more aptitude-focused studies, to giving educators the information they need to work on specific weak points. However, there are also significant downsides, exacerbated by the vast amount of data that is harvested, analyzed and stored, and that is central to many of these technology tools. Students may be pigeonholed and typecast long after specific data points are gathered and evaluated. The availability of large data sets may also lead to children being put onto one track or another earlier and earlier in their development, with little or no room left for changing course as they grow and mature.

Aside from potential impacts on students’ learning opportunities, ongoing marketing efforts are increasingly invading the educational space, in part ushered into the environment through these same software providers. Chegg’s business model is one example. The firm started out renting college textbooks, later branching into digital-focused ventures such as online tutoring and a site that matches students and colleges. But even after recently outsourcing its textbook rental arm, the company still aims to use that business as an entry point to push students through a widening sales funnel. During an earnings call in 2014, Dan Rosensweig, Chegg CEO, said the outsourcing partnership would enable the company to continue gathering student data and payment information, then use that data as a launch pad to market the firm’s other products to them.

And the amount of data flowing through these systems is staggering. Adaptive learning engines, Knewton being a popular example, often track how long each user reviews a page of a digital textbook and how long they spend on various other tasks within the system. This approach provides a wealth of intelligence on students. The CEO of Knewton, Jose Ferreira, estimated in a recent video that the company has “five orders of magnitude more data about you than Google has.” He boasts that other companies haven’t amassed even close to the amount of information his organization has. Other players in the educational space, from tutorial platforms to differentiated learning systems, may be storing similar troves of highly sensitive—and clearly highly valuable—student data.
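To make the scale of this collection concrete, the page-level tracking described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of dwell-time telemetry—the class and method names are invented for this example and do not reflect Knewton’s or any vendor’s actual implementation:

```python
import time


class DwellTracker:
    """Hypothetical sketch of dwell-time telemetry: recording how long
    a student spends on each page of a digital textbook."""

    def __init__(self):
        self.current_page = None
        self.opened_at = None
        self.dwell_seconds = {}  # page id -> total seconds viewed

    def open_page(self, page_id, now=None):
        """Record navigation to a new page (or None at session end),
        crediting elapsed time to the page being left."""
        now = time.monotonic() if now is None else now
        if self.current_page is not None:
            elapsed = now - self.opened_at
            self.dwell_seconds[self.current_page] = (
                self.dwell_seconds.get(self.current_page, 0) + elapsed
            )
        self.current_page = page_id
        self.opened_at = now


# Simulated reading session with explicit timestamps:
tracker = DwellTracker()
tracker.open_page("ch1-p1", now=0.0)
tracker.open_page("ch1-p2", now=42.5)   # read page 1 for 42.5 s
tracker.open_page("ch1-p1", now=50.0)   # back to page 1 after 7.5 s
tracker.open_page(None, now=60.0)       # session ends
print(tracker.dwell_seconds)  # {'ch1-p1': 52.5, 'ch1-p2': 7.5}
```

Multiply this handful of numbers by every page, every task and every student, logged continuously across a school year, and the scale of the resulting profiles becomes clear.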

Privacy worries don’t end when students leave campus. Most school buses today sport video cameras, recording kids’ activities from the time they take their seat until they get off at their stop. Questions abound. Do the kids get a disclosure notice? Do they even know they’re being recorded? Who retains that footage? What happens to it? Some of these video clips have ended up on YouTube, so obviously there are districts out there that haven’t done a good job keeping a lid on the issue. Given the still-evolving state of educational privacy, it’s likely administrators haven’t finished grappling with the dilemma of where public disclosure fits into the scheme of things.

Educational privacy today isn’t exactly the Wild West, but meaningful regulations are lacking. The Family Educational Rights and Privacy Act (FERPA) does not apply to private companies, which dominate the learning technology marketplace. Privacy protections that are assumed to exist often don’t, but parents, students and even educators may not be aware of the facts. Amendments made to FERPA in 2012 tightened some aspects of data privacy but further enabled sharing between school systems and third parties assisting with select activities, such as conducting evaluations of education programs. The framework around sharing student data needs to be brought up to date.

Emerging regulations could either help or hinder progress in terms of privacy. The effectiveness of self-regulation isn’t a given as the educational software and application industry continues to grow and companies invariably seek new opportunities to generate revenue. The federal regulations currently in place do not address the privacy gaps that persist in FERPA. The recently introduced Student Digital Privacy and Parental Rights Act is a step in the right direction. It may curb some forms of targeted advertising based on student data, though the scope of what’s prohibited is narrow. Information couldn’t be collected and sold in the context of selling services or applications to the children using the software, but that doesn’t mean students’ data would be fully protected.

Criticism around this proposed legislation has focused on loopholes that could potentially enable software providers and others in education technology to continue many of their current practices. The proposed regulations provide some outside boundaries for providers and school districts to work within, but language in the bill—such as that stating that the legislation is not to be construed as prohibiting marketing to parents, for example—leaves many privacy questions unanswered. Even one of the Student Digital Privacy and Parental Rights Act’s primary sponsors, Rep. Jared Polis (D-CO), indicated the bill was only a “first step.” In drafting meaningful legislation, states may be better suited to move more quickly, as many already have demonstrated with their data breach regulations. State-level student privacy laws are beginning to proliferate, with the current count at 119 bills in 38 states. There’s still a long way to go, but it’s a promising start.

The educational software space offers big benefits, which are already proving extremely positive for students, parents, educators, administrators and school systems in general. However, burdens and concerns also abound. As the sector continues to evolve, many more thorny issues are likely to be revealed.

photo credit: Back to School via photopin (license)

