A New Era of Enforcement

The digital landscape for children is constantly evolving, and so is regulatory enforcement.

While the Children’s Online Privacy Protection Act (COPPA) has been a cornerstone of kids’ privacy in the U.S., recent actions from the Federal Trade Commission (FTC, with filings brought through the Department of Justice), the UK’s Advertising Standards Authority (ASA), and the Michigan Attorney General signal a shift: brands, advertisers, and content owners, not just platforms, are now directly in scope.

If your actions lead to data collection from children, you’re responsible, regardless of intent or where the content is hosted.

The FTC’s Case with Disney: A Turning Point

Disney has agreed to pay $10 million to settle FTC allegations that it violated COPPA. The case centered on how Disney labeled its content on YouTube.

According to the complaint, Disney marked several of its YouTube channels as “Not Made for Kids” (NMFK), but then failed to properly mark individual child-directed videos as such. Many of the videos were singalongs, animated shorts, and clips from Frozen, Toy Story, and other popular films that were clearly child-directed. Notably, in some cases similar content was labeled inconsistently across different videos and channels. Because certain child-directed videos weren’t marked as “Made for Kids” (MFK):

  • YouTube treated the videos as general-audience
  • Persistent identifiers were collected
  • Targeted ads were served
  • Autoplay and comments remained active
  • Disney earned revenue from YouTube’s ad program

The complaint alleges that Disney’s mislabeling of child-directed videos as “Not Made for Kids” enabled YouTube to treat them as general-audience content, allowing the collection of children’s personal data through monetization.

The complaint is not about Disney’s marketing and advertising of its titles in other content; it’s specifically about how child-directed videos were mislabeled on YouTube, enabling monetization through targeted advertising on those videos in violation of COPPA.

This case builds on the FTC’s 2019 settlement with YouTube, in which the agency fined the platform $170 million for COPPA violations. As part of that action, the FTC explicitly warned that it would pursue individual content owners—not just platforms—if mislabeling enabled data collection from children. The DOJ’s complaint, filed at the FTC’s direction, also noted that in June 2020 YouTube itself reclassified more than 300 Disney videos from NMFK to MFK across multiple Disney channels, yet Disney did not revise its policy to ensure correct per-video designations.

Under the settlement, Disney must now conduct video-level audience reviews for all YouTube uploads for at least 10 years—unless YouTube implements platform-wide age assurance.

🧠 Lesson: Content owners are now being held responsible for data collection that results from how their content is labeled—even on platforms they don’t control.

Context is a Signal that Matters—And So Does Who’s Watching

Two more cases reinforce that it’s not just the label that matters; it’s whether the content is child-directed.

In the UK, Domino’s Pizza breached the CAP Code by running an ad for its cookie products during a Minecraft YouTube video. Domino’s had followed YouTube’s ad rules for HFSS (high fat, salt, sugar) products: the campaign was age-restricted to 18+, excluded “Made for Kids” channels, and was flagged appropriately.

But the ASA ruled that the ad placement was inappropriate because of the content itself: the video featured cartoon-style avatars and a high-pitched narrator and was based on Minecraft, all creative elements that strongly appeal to children. Despite Domino’s use of YouTube’s age-targeting tools, the ASA found the likely audience included under-16s, making the placement non-compliant.

🧠 Lesson: Platform tools don’t override legal obligations. If the content has strong child appeal or is likely to attract a significant underage audience, protections apply regardless of your targeting settings.

In the US, the Michigan Attorney General sued Roku for allegedly violating COPPA by collecting children’s personal information (including voice recordings and geolocation) without verifiable parental consent. The AG argued that Roku relied on co-viewing assumptions instead of assessing whether the programming was child-directed. You can read our blog about this case here.

The Broader Lesson

Across these cases, the pattern is clear: the legal standard follows the content.

Whether the issue is mislabeling content, misplacing ads, or misjudging co-viewing, the failure is the same: not applying the right protections to child-directed content.

Regulators are aligned: accountability sits with the advertiser, content owner, or brand—not just the platform. If your decisions lead to children’s data being collected, you’re on the hook.

How SuperAwesome Could Have Helped

Platforms don’t, and often can’t, guarantee that child-directed content is labeled, filtered, or handled correctly.

SuperAwesome helps brands bring their own compliance, embedding kids’ privacy into campaign design from day one.

  • YouTube: Our Awesome Ads for Social product helps brands engage kids without relying on YouTube’s data, which may include personal information if content is mislabeled. We use human-reviewed contextual signals, not just platform metadata, to avoid risky placements and ensure compliance
  • Connected TV: AwesomeAds delivers contextual ads on youth-appropriate inventory without using personal information. And while CTV platforms ultimately control the final ad delivery, AwesomeAds ensures compliance by preventing personal data collection
  • In-Game & Mobile Apps: Immersive, interactive ad units enable engagement without data collection, creating fun experiences for Gen Alpha, funding developers to build inclusively for youth, and supporting engagement with brands

Creators: SuperAwesome helps brands work with creators who are the right fit for their audiences and ensures that content is age-appropriate and labeled correctly as Made for Kids (or not) in line with platform requirements. Through SuperAwesome’s SafeFam pledge, our creators commit to high standards for age-appropriate, responsible content.

Why This Matters

Responsibility for kids’ privacy now sits squarely with brands and content owners, not just platforms.

SuperAwesome helps meet that responsibility—without relying on platform tools, creator labels, or risky assumptions. From video to CTV to apps, we build compliance in from the start.

Because protecting kids online isn’t just a legal requirement—it’s part of being a trusted brand.