As we enter the season of holiday shopping, many of the most popular children’s toys on the market are designed to connect to the internet. Toys have come a long way from the familiar teddy bears and dolls of past generations. Although toys that integrate technology are not new (we’re all familiar with pull-string dolls that speak, remote-controlled racecars, and other classics), modern toys are increasingly equipped with an array of sensors and other sophisticated features. Toys like Mattel’s Hello Barbie and CogniToys’ Dino rely on cloud-based services or artificial intelligence to provide interactivity.

Connected toys are creating new opportunities for interactive play and education, but they are also creating new challenges.

Toys that can become a child’s closest friend, collect intimate information, and provide advice are raising questions about how to ensure that families can make informed choices about the data being collected and used. The Future of Privacy Forum and the Family Online Safety Institute have examined these issues in Kids & The Connected Home: Privacy in the Age of Connected Dolls, Talking Dinosaurs, and Battling Robots, a white paper released today at FOSI’s Annual Conference.

Today’s world of tech-enabled toys presents unique challenges. First, advances in computer processing are enabling “smart toys” to interact in ever more sophisticated ways, even going so far as to simulate intelligence. Smart toys may incorporate microphones for speech recognition, cameras for detecting patterns and visual cues, accelerometers, proximity sensors, gyroscopes, compasses, or radio transmitters. In addition, many of today’s toys are designed to connect to the internet, and therefore to remote servers that collect data and power the toy’s intelligence.

The distinction between connected toys and other toys is important: whether a toy connects to the internet will be a key factor in determining whether the Children’s Online Privacy Protection Act (COPPA) and its implementing Rule apply.

More fundamentally, the privacy issues are different for connected toys. Smart toys that do not connect to the internet may raise important questions about child development and learning, but because data is not sent remotely, their privacy and security implications are more limited. Connected toys, in contrast, can connect to internet-based platforms or to other devices to enable data collection, processing, or sharing through a computer server. This connectivity is what primarily generates today’s concerns over privacy and data security, and it is what makes these toys subject to COPPA.

Although COPPA provides a baseline of strong legal protections, FPF and FOSI have identified many ways that companies can go beyond their legal requirements and build privacy and security into the design of their toys. For example, many connected toys today are purchased in brick-and-mortar retail stores, where COPPA, a law designed for websites, does not require privacy notices. But why should a parent have to find out only after purchasing a toy, bringing it home, and setting it up that it will request permission to collect and use personal information from their children? Instead, manufacturers of connected toys would be smart to place helpful notices on the packaging: even a simple cue will help parents decide, before purchasing, whether they are comfortable with the toy or whether they would like to do more research.

For example, a toy’s packaging could say: “Parents, this toy will require that you create a personal online account in order to access all features,” or, “Parents, this toy will require your permission to use your child’s information to bring the toy to life!” This would be in line with other well-recognized disclosures on packaging, such as “Batteries sold separately.”

Companies should also invest in developing creative and intuitive ways to alert children and parents when data is being collected or transmitted, including glyphs and other visual, audio, and haptic cues. The Hello Barbie Dreamhouse, for example, has a visible Wi-Fi indicator at the top, and responds to a spoken “wake phrase” by lighting up and making a beeping sound. The appropriate form of notice may depend on the unique features of the toy: if a connected toy is primarily voice-enabled, it should be able to answer spoken privacy-related questions; if a connected toy is primarily controlled through an app, notice should be prominent and readable within the app. Steps like these will help ensure that the physical toy itself is intuitive and will not surprise parents, or the parents of a child’s friends, with unexpected data collection.
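To make this concrete, here is a minimal sketch in Python of a “cue before capture” pattern, assuming purely hypothetical firmware stubs (led_on, play_tone, record_audio) rather than any vendor’s actual API: the toy gives an audible and visible cue before the microphone opens, and the cue ends as soon as collection stops.

# Illustrative sketch only (hypothetical firmware stubs, not any vendor's API):
# a "cue before capture" pattern in which the toy plays a tone and lights an
# indicator before the microphone opens, and turns the cue off when it closes.

def led_on() -> None:
    print("[LED] listening indicator on")       # stand-in for a real LED driver

def led_off() -> None:
    print("[LED] listening indicator off")

def play_tone() -> None:
    print("[SPEAKER] beep")                     # audible cue before recording

def record_audio(seconds: int) -> bytes:
    return b"\x00" * seconds                    # placeholder for real audio capture

def handle_wake_phrase() -> bytes:
    """Give visible and audible notice, then capture audio only while the cue is active."""
    play_tone()
    led_on()
    try:
        return record_audio(seconds=5)
    finally:
        led_off()                               # the cue ends when collection stops

if __name__ == "__main__":
    handle_wake_phrase()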

Equally important to remember is that good privacy cannot exist without strong security, and concerns about cybersecurity threats across the Internet of Things apply even more strongly to connected toys. Companies can take a variety of important steps to safeguard their data, including (but by no means limited to):

  • Identifying vulnerabilities by conducting independent security audits;
  • Facilitating ongoing identification of vulnerabilities by communicating with the security community and participating in a public bug bounty program;
  • Implementing strong encryption standards (HTTPS/TLS) so that the toy does not send personal information over insecure channels, store personal information in an insecure format on the toy itself, or communicate with unauthorized devices or servers (a minimal sketch follows this list);
  • Avoiding passwords that cannot be changed by users, and default passwords that are the same for all toys;
  • Ensuring that access to any remote server is secured with physical and administrative restrictions as well as technical ones, such as rejecting unauthenticated firmware updates or limiting connections to an approved list of IP addresses (whitelisting).
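As an illustration of the encryption point above, here is a minimal sketch, not taken from the white paper, of how a toy’s client software might refuse to transmit data except over a verified TLS connection; the endpoint URL and event payload are hypothetical placeholders.

# Minimal sketch (not drawn from the white paper) of the encryption guidance:
# the client refuses non-HTTPS URLs and requires TLS 1.2 or newer with
# certificate and hostname verification. The endpoint and payload are hypothetical.
import json
import ssl
import urllib.request

TOY_API = "https://api.example-toy.invalid/v1/events"  # hypothetical endpoint

def build_tls_context() -> ssl.SSLContext:
    """Require TLS 1.2 or newer with certificate and hostname checks."""
    ctx = ssl.create_default_context()            # verifies certificates by default
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse older protocol versions
    return ctx

def send_event(event: dict) -> int:
    """Send an event to the vendor server, refusing any insecure channel."""
    if not TOY_API.startswith("https://"):
        raise ValueError("refusing to send personal data over an insecure channel")
    request = urllib.request.Request(
        TOY_API,
        data=json.dumps(event).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, context=build_tls_context()) as response:
        return response.status

if __name__ == "__main__":
    print(send_event({"type": "wake_phrase_detected"}))

A production toy would likely pair this with additional controls such as certificate pinning, but the core idea is that personal information never leaves the device over an unencrypted or unverified channel.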

Connected devices have suffered security breaches in a range of contexts in recent years. Home security cameras have been compromised, allowing unauthorized parties to view video from inside unsuspecting consumers’ homes. Reports have surfaced of unauthorized people accessing baby monitors and speaking, sometimes offensively, to children. If similar vulnerabilities are exposed in toys, a breach could allow bad actors to access video or audio of children, or to use the toy to communicate with them.

Thus, in the context of connected toys, the sensitivity of the audience makes concerns over security even more critical.

The future for connected toys is promising, but much more can be done to build trust with parents and children. Industry leaders that follow privacy and security best practices will be well positioned to succeed in the growing market for connected toys.
