For more than two decades, U.S. privacy compliance has revolved around a single number: 13. The Children’s Online Privacy Protection Act’s (COPPA’s) bright-line cutoff created the illusion that everyone older could be treated like adults. While that may once have been the case, it is no longer true.

Even without a dedicated federal privacy law, and with little likelihood of one passing this year, regulators, platforms, and lawmakers already treat teenagers as a distinct audience requiring heightened protection. GDPR’s higher age thresholds set the direction, and U.S. enforcement has followed. State privacy statutes now impose teen-focused obligations, attorneys general are scrutinizing teen-heavy environments, the FTC is already enforcing teen protections under its existing authority, and self-regulatory bodies have adopted the same approach.

The practical takeaway is clear: companies reaching teens must already operate with elevated safeguards. The regulatory baseline has moved, and businesses that wait for federal clarity will be reacting from behind.

States Are Filling the Federal Void

In the absence of a federal teen-privacy framework, states have already created de facto national standards. Several states now extend explicit protections to teens under 16, and some reach all the way to 18.

Laws in California, Connecticut, and Oregon require opt-in consent before a teen’s data can be sold or used for targeted advertising. Maryland goes further by banning targeted advertising to anyone under 18 and prohibiting the sale of minors’ data. These statutes vary in scope and definitions, leaving companies to navigate an increasingly fragmented landscape.

Because this patchwork is difficult to operationalize, many brands are adopting a highest-common-standard approach and avoiding targeted advertising to users under 18 altogether. Major platforms have already begun moving in this direction. For example, Google restricts ad targeting for users under 18, and Meta limits targeting for teens to age and location.

States are also advancing laws that amplify operators’ age awareness. App-store accountability laws recently enacted in Utah and Texas, along with California’s Digital Age Assurance Act, require operating systems to generate an age signal that developers must honor unless they have clear evidence to the contrary. Louisiana has advanced similar measures. While ongoing legal challenges may affect implementation timelines, the likely implication is clear: companies will soon hold more concrete knowledge of user age, and their privacy obligations will adjust accordingly.

Enforcement is reinforcing these expectations. Regulators are no longer relying solely on self-declared age or a child-directed label. They are evaluating what a service actually delivers and whether minors are predictably in the audience. The California DOJ’s recent settlement with Sling TV required youth-oriented channels to be labeled and personalized ads to be disabled by default after the state alleged that targeted ads reached minors. Similar actions involving Roku in Michigan and Florida follow the same rationale. When content or ad placement signals indicate that minors, including teens, are present, platforms are expected to comply with relevant laws.

Together, these developments mean more companies will have actual knowledge that they are dealing with teens, and they will be required to treat that data accordingly.

Platform Design Obligations for Teens Are Now Enforceable Law

Age-Appropriate Design Codes (AADCs) helped establish a comprehensive blueprint for treating minors under 18 as a protected group online. The UK’s AADC, in effect since 2021, set the early benchmark by extending safeguards to all users under 18 and demonstrating how design obligations can exceed GDPR’s age threshold of 16.

The EU’s Digital Services Act (DSA) has since transformed those expectations into binding requirements for the largest platforms. The DSA prohibits profiling minors for advertising, mandates heightened protections for all users under 18, and requires systemic risk assessments, safer default settings, and transparency around recommender systems. In practice, it has made age-appropriate design a regulatory obligation rather than an aspirational model.

In the United States, AADC-style laws in Maryland and Vermont, along with California’s version (currently enjoined), reflect the same shift. These laws require services likely to be accessed by minors to assess risks, apply privacy-protective defaults, limit data collection to what is strictly necessary, and design features that mitigate reasonably foreseeable harms. Even where enforcement has been delayed by legal challenges, companies increasingly treat these requirements as baseline expectations for any environment that attracts teen users.

These standards now shape the experiences teens use most. In gaming, brands developing content on platforms such as Roblox or Fortnite, integrating social or interactive features, or experimenting with immersive ad formats are expected to meet design-forward safeguards. SuperAwesome already operationalizes these principles by reviewing each campaign for finite gameplay, clear disclosures, and the absence of design mechanics that promote extended engagement.

Major platforms are moving in the same direction. Instagram has introduced more restrictive defaults and additional protections for users under 18. Roblox has partnered with the Attorney General Alliance’s Partnership for Youth Online Safety to identify and implement design-based safeguards across youth experiences. OpenAI has added comprehensive parental controls to ChatGPT and published its Teen Safety Blueprint, committing to default teen-safe settings and age-appropriate AI design. Taken together, these developments confirm that age-appropriate design is not an emerging concept. For teens, it is the operational baseline.

The FTC Is Already Enforcing Teen Protections

Even without an age-specific federal teen-privacy statute or an AADC in force, federal enforcement is already addressing teen risks under existing law. In the U.S., Section 5 of the FTC Act is the primary tool.

Recent settlements involving Epic Games and Genshin Impact publisher Cognosphere mark a meaningful shift in how the Commission approaches teen harms. In both matters, the FTC used its unfairness authority to challenge design choices and monetization systems that created predictable risks for teen users. Together, these actions show the FTC applying Section 5 to teen-related harms at a scale not seen previously.

These cases establish a clear trajectory. The FTC has demonstrated it will use Section 5 to address teen-related risks tied to design, defaults, and monetization, and its recent actions signal that this enforcement approach will continue. The agency has also stated repeatedly under Chair Andrew Ferguson’s leadership that protecting kids (meaning those under 18) is a top priority. We cover these conversations and more on our blog.

Self-Regulation Is Tightening Around Teen Marketing

Self-regulatory bodies are now examining categories with heavy teen exposure. The National Advertising Division (NAD) has begun scrutinizing brands that are popular with teen audiences, even when the campaigns are not explicitly directed at teens. Recent reviews of influencer content from Skims Body, NuOrganic, and Drunk Elephant focused on whether claims and disclosures met the standards expected for teen viewers. The signal is clear: if teens are likely to see the content, advertisers are responsible for making it appropriate.

BBB National Programs’ new Institute for Responsible Influence reinforces this shift. The Institute is dedicated to improving influencer marketing practices, reflecting the reality that influencer content is one of the primary ways teens discover and engage with brands. Its launch confirms that teen audiences are now central to how self-regulators evaluate advertising.

Consumer litigation is echoing the same concerns. Class actions against brands such as Shein and Celsius demonstrate that when marketing blurs disclosure lines or appears to take advantage of young consumers’ trust, the consequences can be swift and significant – both legally and reputationally.

The Takeaway: Treat Teens Responsibly – Because Regulators Already Require You To

Taken together, these developments show that accountability for teen-facing marketing now comes from every direction: federal and state regulators, self-regulatory bodies, and increasingly consumers. Teens are no longer a regulatory gray zone, and companies cannot rely on the absence of a federal statute to justify outdated practices.

The question for brands is no longer whether teen protections will be required, but whether their current practices can withstand scrutiny. Partnering with a youth privacy and safety expert such as SuperAwesome enables companies to implement defensible safeguards now, rather than reacting later to regulatory or reputational harm.