KidAware Bulletin – December 2020
To subscribe to the KidAware Bulletin, which provides useful insights into the kids’ digital privacy regulatory landscape, please register here
UK presents final Online Harms intentions; Irish data protection regulator faces social media platforms; app stores get stricter with kids’ apps; new video content rules come into force in the EU; games industry gets to grips with the Children’s Code
UK presents final Online Harms paper, promises legislation in 2021
This week the government published its long-awaited final position on Online Harms, previewing new legislation in 2021 that will go further than any other country’s to protect consumers—and particularly children—from harmful content. The publication follows growing criticism over the summer about delays in advancing the legislation, as well as industry concerns about increasing overlap (and potential confusion) among UK government initiatives to mitigate online risks to kids.
The government’s comprehensive outline sets out its intention to give Ofcom sweeping regulatory authority over the biggest digital platforms operating in the UK, sitting alongside the Information Commissioner’s Age Appropriate Design Code and subsuming the recently implemented Audiovisual Media Services Regulation.
Read on for more analysis on our blog.
Irish regulator confronts social media giants
Ireland’s Data Protection Commissioner (DPC) has been busy this autumn. In October, it was reported that the DPC is investigating Instagram to determine whether “Facebook has a legal basis for processing children’s personal data and if it employs adequate protections and restrictions on Instagram for children.” The case responds to specific consumer complaints about kids’ phone numbers and email addresses being displayed publicly when users converted their personal accounts to business accounts. According to Instagram’s terms of service, users must be at least 13 to create an account.
The investigation comes alongside the DPC’s ongoing public consultation on the processing of children’s personal data—which led to the publication of new GDPR guidance last week (more analysis of this to come on our blog)—and its preliminary ruling in September that Facebook should stop transferring EU users’ data to the US.
The DPC is the most influential data protection regulator in the EU because some of the largest platforms—including Google, Facebook, Twitter, LinkedIn and Apple—have made Ireland their European headquarters and are hence subject to the DPC as their lead supervisory authority under the GDPR. To handle this enormous responsibility, the DPC has been trying to grow its team of data protection professionals to 180 by the end of 2020. With hundreds of live investigations underway, the regulator announced a fine against Twitter for failing to secure Android users’ private tweets (a flaw disclosed in early 2019).
App Store platforms get stricter on kids’ apps
In October, Google removed three apps from its store for breaching its policies regarding kids’ data privacy. This followed publication of a report by the International Digital Accountability Council, which found that the three apps—Princess Salon, Number Coloring and Cats & Cosplay—were sharing user data, including the unique Android Advertising ID (AAID), with SDK partners. Such IDs allow users to be tracked across apps and domains and enable third parties to link the user to other app information such as geolocation. It is for this reason that, under the US law COPPA, persistent identifiers such as the AAID are considered personal information that may not be collected from kids without parental consent.
In the meantime, developers submitting apps to the Apple App Store faced new requirements as Apple implemented enhanced privacy notices. For several months now, Apple has been asking app makers to provide detailed information about the types of personal data they collect and how they use it. This information is being distilled into what is intended to be a user-friendly privacy label that consumers can use to assess an app’s privacy practices, akin to the nutrition labels found on food products. The labels went live this week across all Apple platforms.
While laudable as an improvement to the way privacy information is communicated to consumers, the move has also been criticised for potentially leading to regulatory confusion. Developers preparing for the increased disclosures face risks that their responses do not match their long-form privacy policy, and many may not realise that the Apple disclosures alone do not discharge their legal obligation to provide notice under data privacy laws including GDPR.
Meanwhile, Apple shows no signs of further delaying implementation of its controversial iOS 14 changes, most notably the requirement that apps obtain opt-in consent from consumers before using the device’s advertising identifier, known as the IDFA. This move, ostensibly intended to prevent user profiling, has drawn opposition from unexpected quarters: advertiser groups in both France and the UK have argued to regulators that the changes are designed to punish Apple’s rivals, most notably Facebook, and risk damaging the advertising ecosystem as a whole.
New video content rules come into force
In several EU countries, the revised Audio-Visual Media Services Directive (AVMSD) came into force this autumn, requiring video-sharing platforms to protect users, including kids, from harmful content. In the UK, the designated regulator, Ofcom, started enforcing the new rules on 1 November and recently issued guidance for video platforms to follow. Facebook and Instagram reacted by disabling certain features in their messaging services last week (though it is not clear why they felt those features required review under the new rules), and Google asked Gmail accountholders to opt into certain “smart features” such as sentence prediction, which require email text to be scanned.
Under the legislation, the obligation to protect users from harmful content is very broad, requiring operators to know their audience, know their content and deploy means of preventing kids from seeing videos that are not appropriate to their age. It is likely to lead to a significant increase in investment in age verification technologies and parental controls. A useful summary by Taylor Wessing can be found here.
Interestingly, in its final Online Harms paper (see above), the UK government this week confirmed its intention to replace the AVMS Regulation with its new Online Safety Bill, expected in 2021.
California’s kids’ privacy protections strengthened further
In November, California voters passed Proposition 24 and, in so doing, enacted another update to the country’s most stringent privacy law. The new law, the CPRA, is set to have a major impact on ad targeting: it “increases protections for children under the age of 16 with fines tripled up to $7,500 for violating the CCPA’s opt-in to sale right. Businesses must obtain opt-in consent to sell or share data from consumers under the age of 16.”
This goes beyond the requirement in the CCPA to obtain opt-in consent for sales of data. By including the sharing of data, the new law aims to capture the trading of targeting data that is prevalent in the ad ecosystem and one of the greatest sources of personal data ‘leakage’ leading to profiling of kids.
It follows Gov. Newsom’s decision in October to veto Assembly Bill 1138, which would have explicitly required social media platforms to obtain verifiable parental consent before collecting personal data from under-13s. Since this is already a requirement under COPPA, the governor claimed that such an overlapping state-level requirement would simply cause confusion.
Games industry begins work to comply with the Children’s Code
Now that the enforcement deadline for the ICO’s Age Appropriate Design Code has been set for 2 September 2021, games companies in particular are reviewing their approach to users under 18. The Code, which sets out 15 principles, will be used by the regulator to determine whether an operator is in compliance with the GDPR. A summary of its impact can be found in our earlier blog post, and additional context provided by Commissioner Elizabeth Denham in our podcast.
For games developers, the key challenge will be to work out who their audience is and how to take into account what is in child users’ best interests. The Code effectively extends the scope of the GDPR to:
- teenagers up to 18 years old;
- not only services aimed at kids, but those likely to be used by them;
- the passive collection of data by connected devices (such as voice assistants or toys);
- ‘inferred data’, such as that created by ad targeting platforms.
Some of the specific challenges, helpfully outlined in a post by Taylor Wessing, include the Code’s provisions on:
- Tactics to extend engagement. Providers will have to consider issues such as the need for screen breaks and general user welfare not directly related to privacy. This means that some techniques that reward users for playing for longer may become unacceptable.
- Parental controls vs children’s rights. Parental controls may conflict with kids’ rights to privacy as enshrined in the UN Convention on the Rights of the Child. For example, if the platform monitors private chats between users and shares that information with parents, this may breach a teenager’s reasonable expectation of privacy. Operators will have to work out how to balance the parental controls obligation with notices to users that their information may be shared with parents.
- Best interest of teenagers. Similar to obligations on gambling operators, this requirement goes beyond not doing users harm. It effectively requires second-guessing a user’s choices even if such decisions are not obviously harmful.
Based on our interactions with kids’ app developers, including games companies, in recent months, it seems clear that many have not yet begun the process of properly testing their kids’ experiences against the principles of the Code. Their task should be made easier once the ICO publishes additional guidance on its Children’s Code hub, expected in the coming months.
What else mattered?
- A new law to strengthen the rights of young influencers has been passed by the French parliament. The law extends to digital influencers the protections already afforded to child actors, ensuring safe working conditions and protecting their income. Among other things, it requires parents of kid influencers to register as soon as their revenues exceed a certain threshold, and to pay a portion of the income into a protected account (not dissimilar to a Coogan account under various state laws in the US) to be held on trust until the child turns 16.
- The Children’s Advertising Review Unit (CARU) has assessed popular messaging app Discord for compliance with the kids’ privacy law COPPA and the CARU guidelines, and determined that Discord is a ‘general audience’ service (not primarily directed to children) and hence compliant with COPPA so long as it continues to screen out and block under-13s and has processes in place for identifying any underage users.
- TikTok has strengthened parental controls by expanding the features of its Family Pairing functionality (introduced earlier this year), giving parents more granular control over whether their kids can accept comments on their videos from strangers or search for content, among other restrictions. Nonetheless, the ‘pairing’ function is optional for kids, who can switch it off (with a warning sent to parents).
- New Mexico AG appeals kids’ privacy case against Google. New Mexico Attorney General Hector Balderas is not giving up after the courts in September granted Google’s motion to dismiss the lawsuit he had filed against the company for breaching COPPA. The original suit claimed that Google was using G Suite for Education to “spy on New Mexico students’ online activities for its own commercial purposes, without notice to parents and without attempting to obtain parental consent.”
- The Council of Europe stepped into the debate on how to balance children’s rights with the need to protect them, launching its Handbook for policy makers on the rights of the child in the digital environment. The comprehensive set of recommendations for policymakers includes a call for states to implement new laws to require digital operators to take into account the best interests of anyone under 18; to enforce data privacy-by-design; to prohibit profiling kids for marketing purposes; to deploy age verification technologies in determining whether to seek parental consent; and to “ensure that easily accessible, meaningful, child-friendly and age-appropriate information about privacy tools, settings and remedies is made available to children.”
- Lawyers from Osborne Clarke debate whether the UK government’s proposed total ban on digital advertising of foods considered high in fat, sugar or salt (HFSS) is likely to be effective. The government launched its consultation in November with the stated objective of reducing childhood obesity. The initiative—which comes on top of the 2017 ban on HFSS advertising in media directed to children—has drawn sharp criticism from advertiser groups, led by the IAB UK.
- The Australian government has released an issues paper seeking public comment on a range of recommendations made by the Competition and Consumer Commission’s (ACCC) Digital Platforms Inquiry in 2019. The responses will inform potential changes to Australia’s data privacy laws. Among revisions being considered are a requirement to obtain parental consent for the use of personal data of children (similar to what is required under COPPA and GDPR-K).
- The FTC announced this week that it had used its so-called Section 6(b) powers to investigate the data collection practices of nine digital platforms, including Facebook, TikTok, YouTube, Snap and others. Under the orders, the companies will have to provide extensive information to the FTC on what personal data they collect, how they use it, and how their practices impact children and teens in particular. In an accompanying statement, four of the five commissioners said, “Policymakers and the public are in the dark about what social media and video streaming services do to capture and sell users’ data and attention. It is alarming that we still know so little about companies that know so much about us. […] The questions push to uncover how children and families are targeted and categorized.”
- China’s Audio-Video and Digital Publishing Association rolled out a new age rating system last week, requiring digital games to adopt colour labels indicating age appropriateness: 8+ (green), 12+ (blue), and 16+ (yellow). This move extends China’s proactive policies to protect kids, which include a parental consent obligation for under-14s, a requirement to register with real names (to enable age verification), and automatic screen time controls.
Stay safe and healthy.
Kind Regards,
Max Bleyleben, Managing Director
& Chief Privacy Officer