California passes far-reaching children’s privacy legislation; Federal efforts on children’s privacy continue; UK’s Online Safety Bill faces challenges, as Instagram is fined €405m by Ireland; Roblox ages up, announces advertising tools; CARU announces its intention to monitor the metaverse
California passes far-reaching children’s privacy legislation
In August, California’s legislature passed the California Age-Appropriate Design Code Act (CAADCA), a new children’s privacy law that is the first statute in the US to require apps and websites to install guardrails for users under 18. Now, Gov. Gavin Newsom has signed the bill into law. The CAADCA will limit the data that services are able to collect from u18s, restrict profiling of younger users for targeted advertising and mandate “age-appropriate” content policies.
This article provides a useful analysis of what the bill means for businesses. The CAADCA is modeled on the UK’s Age Appropriate Design Code (the Children’s Code), which in just one year has had a huge impact on children and teen privacy. California’s law is likely to serve as a model for other states: New York has already introduced a similar bill to protect children’s online data and guard against targeted advertising.
Federal efforts on children’s privacy continue
In the US, both chambers of Congress have weighed in on children’s privacy over the summer. As legislators continue to debate the American Data Privacy and Protection Act (ADPPA), the country’s first serious effort at legislation to establish nationwide standards for consumer privacy, Senator Cantwell (D-Wash.), chair of the Commerce Committee, stated that she would not consider the bill, effectively blocking it for now. Although the bill has been supported by consumer advocates, it has also been criticized (by a group of attorneys general and the California Privacy Protection Agency) over fears that federal preemption of state privacy laws would roll back state-level privacy rights.
In light of the bottleneck, Senate Democrats pivoted to focus on expanding protections for children. In a surprising turn of events, updated versions of the Children and Teens’ Online Privacy Protection Act (“CTOPPA”) and the Kids Online Safety Act (KOSA) were voted out of the Commerce Committee and will next head to the Senate for a full vote. While some senators are already celebrating, it remains to be seen whether the bills can advance given the limited legislative calendar remaining this year.
The Federal Trade Commission (FTC), meanwhile, is looking at how marketers reach kids via digital media, like video-sharing platforms, influencer content, games and virtual worlds. The FTC is seeking public comment (through 18 November) on how kids are affected by digital advertising and marketing messages that blur the line between ads and entertainment. In conjunction with the public consultation, the FTC is hosting a virtual event in October, featuring researchers, child development and legal experts, consumer advocates, and industry professionals to debate the topic.
Meanwhile, the FTC’s review of COPPA is in its third year, leading Commissioner Wilson to urge her colleagues to prioritize this work.
UK continues efforts to protect children despite challenges
Across the pond, the UK’s Online Safety Bill is facing a new obstacle under the leadership of new Prime Minister Liz Truss. According to the Financial Times, Truss is set to water down the draft legislation due to concerns over restrictions on free speech and regulatory overreach. Big Tech companies that will likely fall within the law’s scope will be watching closely to see the fate of this bill, which they have fervently opposed.
The Information Commissioner’s Office (ICO) is currently reviewing how more than 50 online services (including at least two large social media platforms) are complying with its Children’s Code. It is unclear whether this fact-finding will lead to formal investigations.
The Irish Data Protection Commission (DPC) fined Instagram €405m for violating the General Data Protection Regulation (GDPR). This is one of the largest fines ever issued under the GDPR, which has been criticized for being weakly enforced. The DPC claims Instagram allowed teenage users (13-17) to operate business accounts, which made personal information, including phone numbers and email addresses, publicly visible. Meta, Instagram’s parent company, has already shared its plans to appeal, claiming the case is based on out-of-date settings. The DPC also announced a draft decision in its inquiry into TikTok, which examined the processing of u18s’ personal data in relation to platform settings that defaulted to public rather than private, and the platform’s approach to verifying the ages of younger users. The DPC is consulting with other EU data protection authorities on its opinion.
Roblox ages up, announces advertising tools
At Roblox’s annual developers conference, CEO David Baszucki announced that the majority of its audience is now over the age of 13. At the same time, the age-ratings body ESRB changed the platform’s rating from E10+ to T (Teen). These moves reflect a concerted effort to ‘age up’ Roblox, and to persuade more brands and advertisers to invest in the platform. Roblox further announced that it will add age guidelines to individual ‘experiences’ in order to give more control to parents of younger players. All experiences will be categorized as appropriate for all ages, for ages 9+ or for ages 13+.
Roblox also previewed an upcoming ad platform that will allow advertisers to reach players in its games across interactive billboards, posters, and other surfaces (e.g., an ad atop a taxi). The new advertising features are set to fully launch next year.
CARU announces intentions to monitor the metaverse (and brings three new cases)
The Children’s Advertising Review Unit (CARU) issued a compliance warning to put brands on notice that CARU’s Guidelines apply to advertising in the metaverse and shared its intention to strictly enforce its guidelines in the space. CARU flagged several practices that companies need to be “particularly cautious” about, such as the blurring of advertising/entertainment content, influencer practices, manipulative tactics, and clear and conspicuous disclosures. In a follow-up blog, CARU provided context about how it believes companies can be age appropriate in the metaverse.
CARU also announced the results of three investigations. First, it found that an app featuring SpongeBob violated COPPA and also called it out for problematic ad practices that “unduly interfered with gameplay, encouraged excessive ad viewing by kids through deceptive door openers, and other manipulative techniques, which include requiring kids to download apps before allowing the child to exit the ad.”
Second, it found Firefly Games in violation of COPPA and also took issue with its ad disclosures and the method for kids to exit ads. Although CARU does not require that ads be exitable, where an exit method does exist, CARU requires that it be clear.
Finally, CARU released its second case based on a new guideline on negative gender and racial stereotypes. CARU found that a doll from Moose Toys, marketed as fixing her beauty “fails”, violates the guidelines by perpetuating stereotypes that girls must look perfect to feel good about themselves. According to CARU, these advertising messages place undue pressure on girls to conform to unrealistic standards of beauty and perfection in order to see themselves as valued. In addition, CARU looked at the racial and cultural makeup of the dolls and determined that the characteristics and personalities attributed to each doll were likely to perpetuate racial and cultural stereotypes, rather than promote inclusion.
In other news:
- Social media platforms continue to announce measures to further protect teens. Snapchat rolled out a new family center that will allow parents to monitor who their teens are talking to without being privy to the conversation itself. The change will allow parents to report accounts they are concerned about. TikTok announced a new rating system to restrict mature content from reaching teenagers and removed posts promoting weight loss to under 18s.
- 5rights, a UK advocacy organization, accused edtech companies of leaving children’s data susceptible to exploitation. 5rights’ research revealed that Google and other third parties tracked children while they used Google’s educational products. The group shared the report with the ICO and the Department for Education for further investigation.
- A study by Pixalate found that more than two-thirds of the 1,000 most popular iPhone apps and 79% of Android apps that are likely to be used by children collect and share their personal information with the advertising industry, potentially in violation of privacy laws.
- In Australia, Children and Media Australia commissioned a campaign called Apps Can Track, which found that six in 10 kids’ apps engage in problematic data collection.
Warm wishes,
Katie Goldstein
Global Head of KidAware