ICYMI: FPF Webinar Discussed The Current State of Kids’ and Teens’ Privacy
The virtual conversation, moderated by FPF’s Chloe Altieri, included a discussion about the industry’s work on compliance with a variety of regulations across jurisdictions. The webinar began with presentations from the panelists, setting the stage with their current work on youth privacy issues. The panelists were Phyllis H. Marcus, Partner at Hunton Andrews Kurth LLP, Pascale Raulin-Serrier, Senior Advisor in Digital Education and Coordinator of the DEWG at the French CNIL, Michael Murray, Head of Regulatory Policy at the U.K. ICO, and Shanna Pearce, Managing Counsel, Family Experience at Epic Games.
Phyllis led the audience through the current U.S. legislative and regulatory landscape. The U.S. states have been incredibly active in children’s and teens’ privacy legislation, with 11 states having enacted bills and an additional 35 states having considered youth online safety legislation. Four primary categories of laws are emerging from the bills under consideration by state lawmakers: Platform Accountability Laws, Age Verification Laws, Social Media Metering Laws, and the California Age-Appropriate Design Code (CA AADC). In recent actions, the Federal Trade Commission has stepped up enforcement of the Children’s Online Privacy Protection Act (COPPA). Finally, the U.S. Congress has taken an interest in enhancing youth online safety measures through the introduction of COPPA 2.0 by Sens. Markey and Cassidy and the Kids Online Safety Act (KOSA) by Sens. Blumenthal and Blackburn.
Pascale discussed the work of the French CNIL, both nationally and internationally, to make youth privacy and online safety an effective initiative. Key international initiatives include the International Resolution on Children’s Digital Rights adopted in 2021, which harmonized a regulatory vision and set of core principles around youth online safety among data protection authorities worldwide. Additionally, the CNIL is working to improve digital education to address online safety issues. In 2021, the CNIL published eight recommendations to enhance the protection of children online, with the aim of providing practical advice to a range of stakeholders. These recommendations include strengthening youth awareness of online risks and privacy rights, as well as encouraging youth to exercise their privacy rights in order to stay safe. Youth education efforts are being undertaken in conjunction with digital literacy efforts that empower parents and caretakers to have meaningful conversations about online safety with their children.
Michael discussed the United Kingdom’s Age Appropriate Design Code (U.K. Children’s Code), a statutory code of practice under the U.K. General Data Protection Regulation (GDPR). The Children’s Code is grounded in the principles established by the United Nations Convention on the Rights of the Child (UNCRC). Michael explained that the Code sets out 15 interlinked standards of age-appropriate design for online services that are likely to be accessed by children. The U.K. Information Commissioner’s Office (ICO) has undertaken a wide-ranging effort to supervise the Code by exploring ways it could provide guidance to help online services meet the Code’s objectives. The ICO looks not only to submitted complaints but also to industry engagement efforts and engagement with other regulators and government offices to inform its guidance. According to recent industry surveys, this new method of supervising Code implementation has been effective.
Shanna discussed the safety and privacy considerations that are necessary when building online experiences for kids and teens, specifically on gaming platforms. Shanna spoke about the balance and thoughtful product design required to create a positive experience for younger players and their guardians in compliance with global regulations. One product of this balancing act for Epic Games was the deployment of a suite of protections across its ecosystem of games, including age-appropriate default settings, parental controls, and Cabined Accounts, a type of Epic account designed to create a safe and inclusive space for younger players. Players with Cabined Accounts can still play Fortnite, Rocket League, or Fall Guys but won’t be able to access certain features, such as voice chat, until their parent or guardian provides verifiable parental consent (“VPC”).
How are policymakers and industry members working to resolve points of tension between privacy and safety? How is this tension and its resolution approached differently across the globe?
Michael Murray gave a three-fold answer on the differences between U.S. and U.K. child privacy and safety policy. The first key difference is that in the U.K., the ICO and the U.K.’s Office of Communications (Ofcom) lead a dual effort: the ICO focuses on privacy, while Ofcom focuses on safety and content regulation. In the U.S., there is no defined, systemic privacy and safety regulatory effort. The second is that the U.K. is a signatory to the UNCRC, which defines children as anyone under the age of 18, and that is reflected in U.K. law. In contrast, the U.S. is not a UNCRC signatory, and current U.S. federal protections define children as individuals under the age of 13. Third, the U.S. operates largely under an actual knowledge standard of whether an online site or service is directed to children, whereas the U.K. Code and, recently, the California AADC operate under a “likely to be accessed” by children standard.
Phyllis H. Marcus elaborated on some of the points Michael raised, describing the knowledge standard and the tensions between privacy and safety in the U.S. COPPA is a privacy rather than a safety regime, but safety is one of its statutory underpinnings. It is important to know that this current regime is being somewhat upended by the patchwork of state laws being passed. Additionally, the regime may change if COPPA 2.0 and KOSA make their way through Congress; if both go into effect, together they would regulate privacy and safety in tandem.
Pascale provided insight into the approach by France and the European Union (EU), where safety and privacy are not opposing or differentiated regulatory efforts. Safety is an element of privacy regulation and is considered within Data Protection legislation.
Age assurance is a large part of the policy conversation on youth privacy and safety online, especially as privacy protections for teens are expanding. What are your thoughts on the current issues around age assurance?
According to Michael, the area of age assurance and age verification is rapidly evolving, but there is “no silver bullet” for establishing or verifying user age on digital platforms and services. The U.K. Code takes a risk-based and proportionate approach to age assurance: the lower the risk of processing a child’s data, the less intensive the age assurance or verification mechanisms need to be. For example, self-declaration might be a suitable mechanism for a lower-risk service. However, where the risk is higher, stronger assurance and verification are needed to protect against the unlawful processing of children’s data under the GDPR. For higher-risk services, age estimation software with a buffer, a form of cabined accounts, or age verification through mobile contracts and digital IDs could be employed. These methods are all still immature, and there is no clear one-size-fits-all solution.
Pascale expressed that there are no clear solutions for age assurance and verification at this time. The CNIL, along with working groups such as the DEWG, is still experimenting with age assurance methods. The CNIL is working to develop a technical approach among different stakeholders in both the government and private sectors to harmonize methods across the chain of actors concerned. However, it is clear that the solution developed will not rely on biometric data collection for age verification, though scans and estimations of face shape may be permitted. The end goal is a strong verification mechanism that balances privacy concerns.
What are the important considerations when trying to strike a balance between data minimization, collecting information for age assurance, the level of accuracy that is appropriate, and the evolving landscape of age assurance technologies?
Shanna discussed the challenges the industry faces with the current state of age assurance technology in certain scenarios. Those challenges include the imprecision of some age assurance technology and methods that cannot be used across all types of devices where users access online services, such as gaming consoles. These challenges are reduced when several methods are offered for verifying adults providing parental consent, but they may be significant where a single method is used as a gate for users to access services. While alternative age assurance technologies continue to develop, Shanna observed that the industry can find creative ways to improve the reliability of existing methods like self-reported age gates, such as providing child experiences that reduce the incentive to misstate age (Epic’s Cabined Accounts are one example) and using trusted data intermediaries to reduce friction and privacy risk for parents providing parental verification.
Michael echoed concerns about data minimization complicating age verification techniques. He added that, when given a choice, many parents prefer age estimation mechanisms over providing hard identifiers or personal information, such as ID numbers or credit cards, for age verification purposes.
These questions around age assurance are sometimes linked to discourse about parental consent. Can you speak to these two topics and share a bit about the emerging methods for each?
Phyllis provided more insight into age assurance and parental consent practices in the U.S., noting that, at least in the U.S., age assurance and parental consent “are really two different things.” When it comes to children’s use of online services under federal law, there is no requirement to verify the age of users. Rather, the U.S. requirement is to obtain consent from someone the company has reasonable assurance is the parent of a child requesting access to a service. Some parental consent mechanisms have been whitelisted by the FTC and have been in place for decades, while others are still under FTC review. The idea of age assurance, on the other hand, is relatively new in the U.S., and a number of actors are considering deploying age assurance methods in the states. Key considerations for exploring the use of age assurance technology in the U.S. include looking at less intrusive, less risky verification methods and making data minimization a priority. Finally, Phyllis made clear that when using age estimation systems for age assurance, over- and under-estimation of age can be risky if the systems are not adequately tested; with a margin of error of just a few years, an estimate can be the difference between compliance and noncompliance.
There is a lot of work being done globally by lawmakers, regulators, advocates, industry leaders, and researchers to answer these policy questions we have discussed today. How are recommendations created, and how is this guidance impactful for remedying noncompliance or figuring out solutions to protect youth privacy online?
Pascale noted that in addition to the CNIL’s eight recommendations previously mentioned, global IT experts are exploring age verification technologies in order to create recommendations for compliance and enforcement. Work is also being done on digital education for parents as a way to increase awareness and understanding of child privacy and safety online. There is a balance between allowing parents to be involved and requiring online services to add protections, and there are important nuances around teen autonomy, developmental stages, and parents sharing too much of their children’s information. More is to come, including recommendations offering topics for discussion with parents and the development of voluntary cooperation with industry.
Michael agreed that digital education is a vitally important part of the solution, and research shows that parents want to have a say in the online services their kids use. Still, it cannot be the entire solution, and parents will not always be able to make informed decisions. Children’s design codes are placing an emphasis on the design of online services to avoid placing an overwhelming burden on the shoulders of parents. This emphasis works productively in tandem with developing resources for parents to have productive conversations with their children.
Recent youth privacy legislation has included a variety of standards for an online service’s level of knowledge of its audience’s age. These variations have led companies that previously did not need to consider youth privacy issues, like age assurance, to do so now. How is this impacting emerging technology and the practical implementation of new products?
Phyllis responded that the development of new standards for determining which services do or do not fall under the scope of regulatory scrutiny and age assurance requirements is one of the most hotly contested and highly discussed issues in the evolving U.S. landscape. Under COPPA, the requirements are defined in clear buckets, which in turn define the scope of the law. The current federal standard in the U.S. is actual knowledge that a service is directed to children. Phyllis cautioned that most new initiatives change this paradigm, and the jury is still out on what the new standard will ultimately entail under COPPA 2.0 and KOSA.
Shanna noted that while many services were developed and deployed prior to legislation going into effect, those services may be brought into scope later. Retrofitting an existing service to address things like parental consent and default settings may require significant design and technical effort, and the process is complicated further as the age of digital consent differs across regions. Shanna stated that engagement by regulatory bodies and issuing of guidance is invaluable to companies trying to comply with these evolving requirements in the tech space.
Pascale added that this is not the first time that the industry has faced technical difficulties like the ones we see today in age assurance and verification. According to Pascale, innovation is a key element to prioritize in each company’s approach because big innovations can guide smaller ones.
Phyllis observed that there is a lot on the horizon and that it will be easy for actors to fall behind if they are not intentionally keeping up with youth privacy. It is clear that developments in the U.K. have had an effect on U.S. policy at both the state and federal levels, and these initiatives will continue to gain momentum and are worth watching.
Pascale opined that privacy by design is one of the best policy options. While digital education is important to aid in solving these issues, integrating privacy by design at the conception of tech innovation will help to distribute the pressure of protecting youth online.
Michael noted that age assurance is an obvious area to watch. Additionally, the resolution of the First Amendment questions presented in the litigation over the California Age-Appropriate Design Code will be critical; the suit raises fundamental issues, and how to protect data without impacting U.S. constitutional rights will be an important debate.
Shanna is interested in seeing how companies balance privacy with uses of emerging technologies that improve online safety. She also observed that a variety of laws are currently taking shape around the globe, and there’s an opportunity to improve consistency and clarity of forthcoming guidance so companies can comply effectively.
Each of the panelists shared helpful resources, which we have listed and linked below, along with a few of our own. You can also find the panelists’ presentation slides and additional resources here.
Phyllis H. Marcus’ Recommended Resource:
Pascale Raulin-Serrier’s Recommended Resources:
- Global Privacy Assembly Adoption of an International Resolution on Children’s Digital Rights (2021)
- Personal Data Protection Competency Framework For School Students
- Données & Design Workshops for co-constructing ethical interfaces
- Demonstration of a privacy-preserving age verification process
- CNIL Children & Teenagers Education Resources
Michael Murray’s Recommended Resources:
- UK ICO GDPR Resources
- Best Interests of the Child Self-Assessment
- Information Commissioner’s Opinion: Age Assurance for the Children’s Code
- ‘Likely to be accessed’ by children – FAQs, list of factors & case studies
- Designing data transparency for children
Shanna Pearce’s Recommended Resources:
Additional Future of Privacy Forum Resources of note:
- Infographic: Unpacking Age Assurance: Technologies and Tradeoffs
- Policy Brief: An Analysis of the California Age-Appropriate Design Code
- Policy Brief: Comparing the UK and California Age-Appropriate Design Codes
Coming up soon! You won’t want to miss FPF’s final session in our virtual Immersive Tech Panel Series on December 6 at 11 am ET. The December session will dive into designing immersive spaces with kids and teens in mind. You can register for this event here.
For more information or to learn how to become involved with FPF’s youth privacy analysis and initiatives, please contact Chloe at [email protected]. Subscribe here to receive monthly newsletters from the Youth and Education Team.