
The Privacy Advisor | The ‘big shift’ around children’s privacy


Momentum around children's privacy is reaching a fever pitch in the U.S. and, with a lack of concrete congressional action, states are stepping in — leading to what could be sweeping change and a potential patchwork of regulation seeking to protect children online.

While Congress has explored updates to the Children's Online Privacy Protection Act and separate legislative proposals that would implement targeted privacy protections for teens and children, those efforts have stalled. Meanwhile, the digital harms facing children continue to be detailed in legislative hearings. In his second State of the Union address in early February, President Joe Biden again made it clear that protections for children online are a priority.

"Everyone is hearing the message that we need to do something about kids online, but it's like a game of telephone," said Future of Privacy Forum Policy Counsel, Youth and Education Privacy Bailey Sanchez. "Some people are hearing we need to ban TikTok, some people are hearing we need a comprehensive consumer privacy law, some people are hearing there's harmful content we need to restrict online and then some are hearing maybe we need to do COPPA 2.0 that would raise the age of COPPA. We're all hearing the same message and putting our own spin on what that means."

Several states, including Minnesota and Nevada, are proposing legislation modeled after the California Age-Appropriate Design Code Act, which enters into force 1 July 2024. States including Maryland, New Mexico and Oregon proposed similar models, but those appear to have stalled for this year.

Other states are focusing on parental supervision of children's online access. Utah passed a law requiring social media companies to provide a parent or guardian "access to the content and interactions of an account held by a Utah resident under the age of 18," while Arkansas passed legislation barring minors under 18 from social media platforms without parental consent.

The California Age-Appropriate Design Code Act is facing a lawsuit from technology trade association NetChoice, which alleges the legislation "presses companies to serve as roving censors of speech on the internet," while advocates have voiced concerns, including around surveillance, about the Arkansas and Utah legislation.

'The unsolved question'

The age-appropriate design codes and social media laws take two different approaches to protecting children online, Sanchez said, but age verification is a common thread.

"All of a sudden, age assurance, age verification, age estimation is such a big topic … There is some uncertainty as to what that means, but the general consensus is it's going to require something more than just an age gate. It's going to require more than what we are used to,” she said. "How do you figure out how old someone is online? I think that has been the unsolved question since the internet has been around."

The Arkansas law requires social media companies to verify users' ages through a third-party vendor before allowing access to a platform, while the California law, and proposals modeled after it, calls for age estimation proportionate to risk and potential harms to children, defined as users under 18.

Maynard Nexsen Shareholder Starr Drum, CIPP/E, CIPM, FIP, said there is no "one-size-fits-all" approach that will work for every organization, and identifying whether children are using a platform or website, through age-assurance or age-verification measures, requires a delicate balance. Often, the technologies used to determine a user's age can require the collection of additional personal information, she said, like biometrics through a facial scan.

Sanchez said she's concerned a focus on age verification may take away from "really focusing on the privacy protections that we are doing."

"How have we advanced the bar for privacy? Some privacy provisions would benefit both kids and adults, if states had a comprehensive bill that applies to both. But the focus right now is on kids and in some states the focus is on how do we just put up barriers to kids being able to access services rather than addressing what the underlying issues are with the service," she said.

"I think the age assurance question, knowing who your audience is, that's just one tool in the toolbox. It just gets to one piece of the puzzle. It doesn't necessarily speak to the underlying questions of what are your data practices, what are your default settings. So, I just worry that we will get too far down the road of this age verification question and not necessarily the privacy question states are seeking to answer."

'A sweeping change'

The movement toward age-appropriate design codes involves standards that are "a big shift" from established regulations under COPPA, Sanchez said. The California code defines covered businesses as those providing "an online service, product, or feature likely to be accessed by children" and includes requirements for privacy-by-default settings and data protection impact assessments.

Not only does the age of coverage extend from 13 under COPPA to 18 under the California code, but "likely to be accessed" provisions could cover most online services, Sanchez said. And companies not previously regulated by COPPA could face requirements under an age-appropriate design code.

"It's kind of like a sweeping change for the industry for multiple reasons," she said.

Drum said "every single website platform, with very limited exceptions," should anticipate children will access their sites and design with that in mind.

"What child isn't online these days? What child isn't trying to circumvent age controls so they can access a platform their parent doesn't want them to? Or what parents don't want kids to be babysat by online activities that occupy children's time and make parenting somewhat easier," she said. "It's really a conundrum and there's not a clear answer."

While the California Age-Appropriate Design Code Act creates a task force to bring forth guidance and allows for rulemaking by the attorney general, Minnesota and Nevada have not included such provisions in their proposals.

"What you see is what you get," Sanchez said. "You're not going to get further guidance or information. What's in the bill is what you will be working with for compliance."

Organizations can turn to draft guidance from the U.K. Information Commissioner's Office, which implemented the Children's Code in 2021 — the model for California's code. The draft guidance includes information on how to assess whether children are likely to access a service, with case study examples.

'Substantial uncertainty'

While age-appropriate design code proposals generally follow the same model, there could be differences between states – like requirements around data protection impact assessments and cure periods. Combined with the social media access laws, like those in Arkansas and Utah, Drum said "we're already starting to see" a patchwork of regulation around children's privacy "and it's going to impact how businesses treat consumers in different places."

"It's definitely possible," Sanchez said. "We have not just this age-appropriate model, but now we have the Utah model that has already passed and a copycat passed within a matter of days with the Arkansas bill. So I think we will see a patchwork. The age-appropriate design code is very different from Utah's social media restriction law and those will be, in fact, hard to reconcile."

Hogan Lovells Associate Sophie Baum, CIPP/US, said a lot will depend on what happens at the federal level, and the activity at the state level could certainly prompt action there.

"There was a lot of movement in terms of children's privacy in Congress last summer and I think we expect to see that again. So it will depend on whether that sets a floor or a ceiling," she said. "The goal from all sides – from private industry, from advocacy organizations, from state and federal government – is to protect kids. I don't think there's any disagreement about that. The challenge is how do we do it in a way that is manageable, that actually has real impact. At what point do compliance standards get in the way of an opportunity for real action to protect kids or to safeguard their data."

While some companies are waiting to see how the litigation around the California code plays out, Baum said those who are paying attention to all the activity in the children's privacy space "are already a step ahead."

"Track legislation. Keep an eye out, in particular, for differences between these bills that might have a meaningful or material change in how you would develop a compliance program," she said. "We expect to see more of these bills coming down the pike, or passing this year. Whether the Utah-style bills will proliferate may be more of a question given the impact it has on children's autonomy and freedom and access to information, but I think we expect to see more of these age-appropriate design code bills coming, and I think we expect to see more action at the federal level."

While there is "substantial uncertainty in this area," Sanchez said the "big-picture questions" are not going away.

"I don't think the question of what additional protections teens need online is going away, I don't think the age-assurance question is going away. I think there's just certain things that once you ring the bell you can't unring them," she said. "Even President Biden called for additional protections for kids and teens online. He didn't say pass an age-appropriate design code, he didn't say pass Utah's social media restriction bill. But he did say we need to do more for kids and teens online, and I think people are hearing that and taking different interpretations."

Enhanced privacy protections, for all ages, are something consumers "in general want," Common Sense Media Policy Counsel Irene Ly said, as they gain a better understanding of what data is collected from them, how much it is shared, and how it is used. Companies have a choice to be forward-thinking about ways to protect children, and consumers, online.

"You can be thinking proactively about that now, or you can be waiting until there is legislation passed and then you will be required or forced to," she said. "We hope some companies will be realizing these are just some baseline protections that would be helpful for everyone, something we could be implementing now."

2 Comments

  • Maximilian Regner-Bleyleben • Apr 27, 2023
    Yet again, a privacy professional who should know better conflates - in writing! - the privacy invasiveness of facial recognition with the privacy-preserving age estimation method of facial scan.  
    
    Facial age estimation is not facial recognition, and writers on this topic ought to be doing more to rebut that oft-repeated and incorrect assumption.
    
    When done correctly, facial age estimation does not require any identity information at all and never needs to know who the person is.  It simply analyses the pixels of the image & compares them to those of anonymous known-age subjects.  This takes seconds, and the facial image is then deleted.
    
    All other commonly used methods of age assurance - scanning an ID card, using a payment card, providing a tax ID number - are vastly more privacy-invasive.
    
    We (privacy professionals) have a duty to help policymakers understand the material difference between facial recognition (privacy-invasive), facial authentication (what you do on your iPhone every day) and facial age estimation (privacy-protective).
  • Peter Gallinari • May 1, 2023
    First off, you have to educate the parents. Code alone will solve nothing. Accountability, and someone held responsible for children's actions, needs to be enforced. You walk around today and an 8-year-old has full access to a cell phone, with the adult's permission and no limitations on what the child is viewing and inputting. No code fixes that. You have a much bigger cultural problem of adult entitlement about how they will raise their child. Education and awareness for adults has to happen. I love technology, but we always miss the human element and social side of that technology. I am all for things like facial recognition, but will that fly in today's world? Great topic, and let's keep plugging at it.