KidAware Bulletin – April 2021
To subscribe to the KidAware Bulletin, which provides useful insights on the kids’ digital privacy regulatory landscape, please register here
Disney, Viacom settle kids’ privacy lawsuits; US lawmakers focus on child safety & privacy; YouTube Kids in Congress’ crosshairs; Consumer groups urge investigation of Google Play apps; Facebook to launch Instagram for kids
Class-action settlements with Disney, Viacom and adtech firms could prompt industry-wide changes
A federal court in California granted final approval of three related class-action settlements against Disney, Viacom and Kiloo/Sybo. Originally filed in 2017, the lawsuits alleged that the companies violated the privacy of millions of child users by using software development kits (SDKs) in popular children’s mobile games that inappropriately tracked users and targeted them with ads without obtaining parental consent. The suits also named ten adtech companies as co-defendants, including Twitter’s MoPub, Unity and Comscore.
In a novel legal strategy, the lawsuits were initiated by parents claiming “intrusion upon seclusion” under state privacy law and citing the Children’s Online Privacy Protection Act (COPPA) as the standard for kids’ privacy. This is a way to sidestep the fact that COPPA does not allow a ‘private right of action’, i.e. civil lawsuits by individuals. We analyzed the importance of these cases for both advertisers and publishers in our blog when they were filed.
Although the settlements carry no financial penalties (beyond legal fees), the companies, which admit no wrongdoing, must remove or disable certain software in order to limit personal data collection and block the behavioral targeting of children with ads. The adtech defendants must modify their SDKs so that personal data collection can be switched off for kids’ apps. Given that these requirements are nothing more than a basic part of complying with COPPA, one would hope they had been implemented years ago.
There are some useful lessons in the settlement for app developers. They highlight the importance of thoroughly vetting partners to ensure any SDKs are configured for compliance. It remains clear from these cases that signaling mechanisms like the so-called COPPA flag continue to be ineffective in ensuring that third parties act on information that an app or site is child-directed. As always, it’s better to use kidtech that suppresses personal information at source, protecting both the developer and their partners. Furthermore, operators should ensure privacy policies are precise and clearly explain what information SDK partners are collecting.
US lawmakers ramp up concern about kids’ safety and privacy
The pandemic has significantly increased kids’ screen-time. The New York Times put it best: “Coronavirus ended the screen-time debate. Screens won.” With their educational and social lives having moved almost entirely online under lockdown, kids’ use of technology has drawn increased bipartisan scrutiny.
At a House hearing last month, lawmakers interrogated the CEOs of Facebook, Twitter, and Google. Rep. Lori Trahan (D-MA) asked them about “manipulative design features intended to keep [kids] hooked,” such as auto-play functionality, endless scrolling and filter effects on photos. “This committee is ready to legislate to protect our children from your ambition,” Trahan stated. “What we’re having a hard time reconciling is while you’re publicly calling for regulation—which comes off as incredibly decent and noble—you’re plotting your next frontier of growth which deviously targets our young children.”
The House Energy and Commerce Consumer Protection Subcommittee hosted a hearing called “Kids Online During COVID: Child Safety in an Increasingly Digital Age”, during which representatives raised concerns about so-called dark patterns, predatory practices and techniques that encourage prolonged engagement, comparing some of them to gambling or physical stalking.
Rep. Castor (D-FL) announced her intention to reintroduce the Kids Internet Design and Safety Act, which would prohibit manipulative design. Members of both parties seemed to agree that existing COPPA protections are insufficient, and Castor and Rep. Walberg (R-MI) announced plans to reintroduce earlier proposals to update COPPA: the PRIVCY Act and the PROTECT KIDS Act, both of which would extend specific legal protections to children older than 13.
This comes at a time when the Federal Trade Commission (FTC) is gaining new momentum under the Biden Administration. Acting FTC chair Rebecca Slaughter created a new rulemaking group within the FTC’s Office of General Counsel to streamline drafting new rules to prohibit unfair or deceptive practices and unfair methods of competition. The FTC is also expected to take more aggressive enforcement actions. Slaughter has been vocal that previous settlements have not gone far enough, even adding that executives should be held personally liable for violations.
While Congress has had a difficult time finding bipartisan agreement when it comes to federal privacy legislation, children’s online safety seems to be the bridge on which politicians can find common ground.
YouTube Kids under investigation
A House panel announced an investigation into YouTube Kids, the platform intended to serve as a safe space to view “enriching content for kids.”
In a letter to YouTube CEO Susan Wojcicki, the Subcommittee on Economic and Consumer Policy said the tech giant is not going far enough to protect children from inappropriate marketing. It has requested information about the platform’s content quality and the impact that increased screen time has on children’s development. The letter accused the company of offering a “non-stop stream of low-quality commercial content” and even went so far as to describe the platform as a “wasteland of vapid, consumerist content.”
This investigation comes after the $170m settlement that YouTube reached in September 2019 with the FTC and New York State Attorney General over allegations it knowingly collected and used the personal data of kids without the consent of their parents as required under COPPA. As part of the settlement, YouTube was required to restrict data collection and remove targeted ads on content for minors. Recently, it made an effort to increase parental controls for teens and tweens and has stated that it “made significant investments in the YouTube Kids app to make it safer, and to serve more educational and enriching content for kids…”
According to the letter, however, ads continue to reach children on YouTube Kids through “smuggling in hidden marketing and advertising with product placements by children’s influencers”. The letter requests a range of information from the video giant that the Committee hopes will shed light on how the platform operates. This investigation may prompt changes that further restrict advertising practices and content offered on the platform.
Consumer groups urge FTC to investigate whether Google promotes apps that breach COPPA
The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) filed a complaint with the FTC to investigate whether apps in the Google Play Store labeled as “Teacher Approved” are unlawfully collecting personal data, without parental consent, to target ads to children.
These consumer groups had filed a similar complaint in 2018 and while they acknowledge that Google has made some changes, they say the company has yet to cure COPPA violations. Google defended its commitment to children and families, stating that it “will continue to make the protection of children on our platform a priority.”
The apps mentioned in the letter are labeled “Teacher Approved,” which means they have been rated on factors like “age appropriateness, quality of experience, enrichment, and delight.” Even though it may not have been Google’s intent, the nature of the “Teacher Approved” label may mislead consumers to believe that the apps have also been vetted for COPPA compliance and other safety measures.
Angela Campbell, chair of CCFC’s board, stated that the consumer groups’ goal is to have the FTC and Google reach a settlement whereby Google Play enforces its own requirements for developers, which ensures that children’s apps comply with COPPA.
Facebook’s plan to launch Instagram for kids comes under fire
Instagram, which is owned by Facebook, is planning to launch a version of the platform for children under 13. The news broke two days after Instagram published a blog post committing to do more to protect its youngest users on the platform, which is restricted to users aged 13 and over.
The post highlighted new safety features, which are intended to prevent adults from interacting with minors. They include the restriction of direct messages between teens and adults they don’t follow, and just-in-time prompts to urge teens to use caution in their messages with adults. Notably, the post did not allude to any plans to build a new platform for children.
Although Instagram has not announced a detailed plan for the new app’s development, it has shared that the app will not carry ads and that it will provide parents with “transparency and control”.
Facebook recently shared that it is working to find a practical solution to the pervasive problem of children lying about their age. Pavni Diwanji, a former Google exec who led policy for YouTube Kids, has been tapped by Facebook to lead its new Instagram initiative. The historic FTC fine against Google, of which Diwanji is undoubtedly acutely aware, may have influenced the decision not to offer advertising.
SuperAwesome’s CEO, Dylan Collins was quoted in Digiday stating that “the legal requirements for advertising [to kids under 13] is the precise opposite of Facebook’s business model, and so I think that that [would have been] an interesting cultural dynamic to manage if they [had planned] to sell advertising.” It remains to be seen whether Facebook will seek to generate revenue from the new app.
Unsurprisingly, consumer groups worldwide want Facebook to scrap its plans. Lawmakers have expressed serious concern about it in a letter to Zuckerberg, pleading that Facebook should consider child users’ welfare first. Lawmakers are skeptical of the company’s commitment to children’s well-being given its previous failure to protect the privacy of children using its services.
What else mattered?
- A 2019 class action lawsuit against Amazon over kids’ privacy can go ahead, according to a federal appeals court. The suit alleges that Amazon’s voice assistant, Alexa, breaches privacy laws in eight states by collecting and storing children’s voiceprints. The court ruled against Amazon’s position that kids are bound under Alexa’s terms to submit to arbitration rather than trial. The lawsuit came a few weeks after several consumer organisations filed a complaint with the FTC accusing Amazon’s Echo Dot Kids Edition smart speaker of breaching COPPA.
- Former children’s commissioner for England Anne Longfield has launched legal proceedings against TikTok on behalf of 3.5 million children. The suit alleges TikTok illegally collected personal data from kids without sufficient notice or parental consent under the General Data Protection Regulation (GDPR). The suit alleges “excessive” data collection practices, which include precise geolocation, biometric data, and behavioral information about its users. According to TikTok’s privacy policy, such data may be shared with advertising and marketing partners. Longfield says this will be a “powerful test case”, and a “wake up call” for other social media platforms.
- Germany passed a new law, the Jugendschutzgesetz (Youth Protection Act), that requires operators of digital experiences to adopt uniform age ratings, and to disclose (via symbols alongside the rating) “cost traps, open social features and gambling-like elements.” A new agency has been appointed to enforce the Act, with the power to fine companies.
- Thinktank DataEthics.eu published a report about data privacy risks faced by children online. The report has prompted criticism that the Danish Data Protection Agency (DPA) must go further to protect children. The DPA has shared its intention to increase its efforts, including proposing a common standard for game producers to the European Data Protection Board.
- Brazil announced an inquiry aimed at banning loot boxes on the basis that the randomized monetization mechanics are a form of gambling, which is illegal in Brazil. Germany recently proposed a reform that may result in new standards for loot boxes as well. These proposals come on the heels of a report from the UK, which held that loot boxes are “structurally and psychologically akin to gambling”.
- As of November 2020, Ofcom has had the power to regulate UK-established video-sharing platforms (VSPs). Under the new rules, VSPs must have measures in place to protect under-18s from harmful content and all users from criminal content and incitement to hatred and violence, as well as meet standards for responsible advertising. Companies deemed VSPs must register with Ofcom before 6 May 2021. Ofcom is currently consulting on draft guidance on the specific measures providers can take to comply.
- Google acknowledged in a meeting at the World Wide Web Consortium that Federated Learning of Cohorts (FLoC) may not be compatible with European privacy laws such as GDPR and the ePrivacy Directive. FLoC is Google’s proposed replacement for third-party cookies; it enables advertisers to target groups of users with similar interests without collecting unique personal data. A Google engineer stated that the company will not proceed with FLoC origin trials in Europe, at least for now. However, after the meeting, a product manager for Chrome tweeted that Google is “100% committed to the Privacy Sandbox in Europe.” Meanwhile, a slew of browsers including Safari, Edge and Brave have decided not to support FLoC.
- The Children’s Advertising Review Unit (CARU) hired FTC veteran Mamie Kresses as its new Vice President. Kresses spent over 30 years in the Bureau of Consumer Protection working on advertising law, online privacy and COPPA enforcement. Kresses co-led the FTC’s 2012 COPPA rule review.
- The 5Rights Foundation published a report on the state of age verification technologies called ‘But how do they know it is a child?’ The comprehensive report delves into the challenge of age verification with the goal of inviting children into a richer digital world rather than restricting their access. It sets out 11 standards it would like the UK government to incorporate in the new Online Safety Bill to ensure age verification is both effective and privacy-preserving.
Stay safe and healthy.
Kind Regards,
Max Bleyleben, Managing Director
& Chief Privacy Officer