CNIL, ICO announce actions on children’s privacy; US regulators and policymakers keep their sights on children’s protections; Social media platforms roll out more protections for teens

CNIL, ICO announce actions on children’s privacy 

France’s data protection authority, the Commission Nationale de l’Informatique et des Libertés (CNIL), announced an €800,000 fine against Discord for failing to comply with the General Data Protection Regulation (GDPR). The CNIL took issue with Discord’s data security measures and data retention policy, as well as its failure to complete a data protection impact assessment (DPIA). The CNIL disagreed with Discord’s view that no DPIA was required; Discord had argued that it processes very little data, all of it relating to users over 15. The CNIL made clear that it considers anyone under 18 to be a ‘minor’ deserving of special protection under Recital 38 of the GDPR. By way of context, in 2020 the Children’s Advertising Review Unit (CARU) issued a report confirming that it did not believe Discord was directed to children under 13, as defined under COPPA.

In the UK, the Information Commissioner’s Office (ICO) announced that TikTok could face a £27 million fine after an investigation determined that the company may have breached UK data protection law by processing the data of children under 13 without appropriate parental consent and by failing to present material information clearly. The ICO issued a notice of intent setting out its provisional view, to which TikTok has the opportunity to respond.

Meanwhile, a new report published by Ofcom (the UK’s communications regulator, which is preparing to regulate online safety under the Government’s proposed Online Safety Bill) found that one third of children use accounts with adult user ages to access social media: “This means they could be placed at greater risk of encountering age-inappropriate or harmful content online.”

US regulators and policymakers keep their sights on children’s protections

Although protecting children online is a bipartisan issue, there has been no legislative progress in Congress on child safety bills. Bills including the American Data Privacy and Protection Act (ADPPA), which would be the first comprehensive US federal data privacy law, the Children and Teens’ Online Privacy Protection Act (CTOPPA), and the Kids Online Safety Act (KOSA) have not advanced. Congress is still considering adding KOSA to federal omnibus legislation, though many LGBTQ and human rights organizations have opposed it.

In the meantime, the Family Online Safety Institute (FOSI) suggested that Congress capitalize on the interest in kids’ online safety and pass the Children and Media Research Advancement Act (CAMRA), which calls for research on the effects of digital media on the emotional and physical health, and on the development, of infants, children, and teens. Senators, who are themselves under pressure to act, are also urging the FTC to update the rules issued under the Children’s Online Privacy Protection Act (COPPA).

At the state level, New York introduced its own age-appropriate design code. The New York Child Data Privacy and Protection Act, which is similar to the California Age-Appropriate Design Code Act, would ban targeted advertising (and certain data collection) and require data controllers to assess the impact of their products on children. 

In Colorado, the Attorney General issued draft rules for the Colorado Privacy Act (ColoPA) addressing how the law will be implemented when it takes effect in July 2023. ColoPA requires opt-in consent for the processing of sensitive personal information (such as racial or ethnic origin, religious beliefs, and biometric data). It also requires consent for processing children’s data, with ‘children’ defined as those under the age of 13.

In the self-regulatory realm, the Children’s Advertising Review Unit (CARU) issued another decision against a mobile app publisher, holding that it violated COPPA and CARU’s self-regulatory guidelines (the Guidelines). While the case against Gameloft’s Disney Getaway Blast app addresses some run-of-the-mill COPPA issues (failure to implement an effective age screen, an unclear and inconsistent privacy policy and parental notice, and the blurring of advertising and editorial content), it also offers insight into what CARU considers manipulative design in the digital advertising context. CARU cited examples where players were presented with:

  • a disappointed character when the player ran out of moves;
  • a green button labeled “Continue” and a red button labeled “Give up”; and
  • a second opportunity to pay for more moves, with a screen asking: “Are you sure? If you give up now, you will lose your progress!”

Gameloft agreed to make changes in line with CARU’s recommendations.

In October, the Federal Trade Commission (FTC) hosted a webinar, “Protecting Kids from Stealth Advertising”, which examined how advertising blurred with content can affect children. FTC Chair Lina Khan said that the agency is considering new rules to protect American children from “stealth” digital advertising that is “designed to exploit [their] insecurities for commercial gain”.

The newest FTC commissioner, Alvaro Bedoya, gave the keynote address at FOSI’s annual conference, offering insight into how the FTC views its role in protecting young audiences. Bedoya spoke about the need for updated privacy legislation to better protect kids online and said that an in-house FTC child psychologist would be useful when crafting rules affecting kids. He also touched on the “best interest of the child” standard under the UK and California age-appropriate design codes, and suggested a broad duty for companies to examine how their products affect their audiences in order to better protect them.

Social media platforms roll out more protections for teens

As social media platforms continue to grapple with how to keep young audiences safe, Meta announced updates to Facebook and Instagram that it hopes will further protect teens from online harm. Meta is implementing measures to limit unwanted interactions, defaulting the accounts of users under 16 (or under 18 in certain countries) to more private settings when they join, and releasing new tools to limit the spread of intimate images.

TikTok is also working to make its community safer by raising the minimum age for access to its GO LIVE real-time interaction feature. The video-sharing platform already takes a graduated approach to the features users can access based on their age; for example, users need to be over 16 to use direct messaging, send virtual gifts, or access monetization features. Previously, users had to be aged 16 or over to host a livestream; now, the minimum age has been raised to 18.

In other news:

  • A coalition of child advocates, including the Center for Digital Democracy and Fairplay, filed a petition asking the FTC to establish rules prohibiting social media platforms and app developers from using deceptive tactics against children, and to clarify when such practices rise to the level of unlawful unfairness.
  • The Centre for Information Policy Leadership (CIPL) published a white paper on protecting children’s data privacy. The paper explores the issues that organizations and data protection authorities face amid globally divergent legal standards and policy approaches, and examines data privacy regulation in the context of empowering children to access and participate in online services, including the concept of the “best interest of the child”.

Warm wishes,
Katie Goldstein
Global Head of KidAware