California Age-Appropriate Design Code Aims to Address Growing Concern About Children’s Online Privacy and Safety
Authors: Chloe Altieri, Kewa Jiang
Kewa Jiang, CIPP/US, is a 2021 graduate of USC Gould School of Law and a Student Contractor with FPF’s Youth and Education Privacy team.
On May 26, 2022, AB-2273, the California Age-Appropriate Design Code Act (ADCA), unanimously passed the California Assembly and moved to the Senate for consideration. California Assembly Members Buffy Wicks (D-Oakland) and Jordan Cunningham (R-Templeton) proposed AB-2273 earlier this year. The bill is modeled after the United Kingdom’s Age Appropriate Design Code (AADC) and aims to regulate the collection, processing, storage, and transfer of children’s data. The bill takes a substantially different approach to youth privacy than the leading federal framework: it extends heightened protections to individuals under age 18, covers entities that provide services likely to be accessed by a minor (even if the provider lacks actual knowledge that children use the service), and requires age-appropriate language for privacy disclosures. The bill is moving through the California legislature at a time when young people are increasingly engaging with their peers online and using digital services for entertainment, education, and other purposes. Given the rise in young users, there is growing international concern about children’s online data privacy and the effects online content can have on children’s mental and physical health.
President Joe Biden emphasized the increasing need for children’s online protections in his State of the Union Address on March 1, 2022. He called upon Congress to provide stronger privacy protections for children, ban targeted advertising to children, and hold social media companies accountable. The Federal Trade Commission (FTC) also emphasized the need for increased protections for youth in a policy statement released on May 19, 2022. The statement signaled the agency’s renewed focus on enforcing COPPA for all covered entities, including edtech vendors. Internationally, the United Kingdom enacted the Age Appropriate Design Code (AADC), which came into force in September 2021. The UK AADC informed the creation of the ADCA, and many of the questions and proposed amendments surrounding compliance have arisen from the complications of transplanting the UK’s European-style regulatory approach into the US legal framework.
Proponents of the California ADCA are motivated in part by concerns regarding youth safety, privacy, and mental health, and the bill would significantly change the regulation of many online services in the US. The bill is scheduled for a Senate committee hearing on June 28, 2022. If the bill successfully passes the Senate and is signed into law by Governor Gavin Newsom, it would come into effect on July 1, 2024. Numerous questions remain about compliance with, enforcement of, and the potential impact of the ADCA. The sections below provide an overview and analysis of the bill’s key provisions:
- Covered Entities
- Age Assurance and Knowledge Standard
- Data Minimization
- Default Privacy Settings
- Enforcement and Regulation
Covered Entities
The ADCA would apply to businesses that provide “an online service, product, or feature likely to be accessed by a child.” This casts a wider net than COPPA’s “directed to children” standard. Under COPPA, the personal data of users under age 13 is not typically afforded heightened protection unless the service has actual knowledge that the user is a child or the service’s offerings are deemed child-directed based on factors such as direct marketing, graphics, or music that appeals to children. Without additional guidance, the ADCA standard may be difficult to interpret or implement for covered entities that target a general audience and do not traditionally host content directed to minors. This uncertainty would likely be exacerbated by the ADCA’s applicability to teens; some organizations could struggle to distinguish between services likely to be accessed by 17-year-olds and services likely to be accessed only by those 18 and older.
Baroness Beeban Kidron, an architect of the UK AADC, argued during an April hearing on the California bill that the “likely to be accessed” standard is essential to the bill’s purpose because it incentivizes service providers to design a wide range of online products and services with youth protection in mind, not just services that know they have young users or that direct their services to kids and teens. The “likely to be accessed” standard could encompass online products and services that children regularly visit that might otherwise not be covered under COPPA, like sites for video conferencing, online games, and social media.
Age Assurance and Knowledge Standard
The ADCA defines children as consumers under the age of 18, a higher age threshold than any enacted child privacy law in the US. The ADCA definition differs from those of COPPA and the California Consumer Privacy Act (CCPA), which define children as under 13. The CCPA also requires covered entities to obtain affirmative opt-in authorization to sell the data of 13- to 16-year-old users. The ADCA would require covered entities to use age-appropriate language for privacy disclosures, and the higher age threshold aims to create age-appropriate regulations for all minors. However, aggregating all children under 18 into a single group may cause implementation issues because the developmental needs and maturity of teenagers are vastly different from those of elementary school-age children. The UK AADC tailors its age-appropriate protections by using defined age ranges. The California ADCA could benefit from a similar approach, which could improve the readability of privacy notices for young people while also incentivizing services to collect less identifiable data regarding user age.
Additionally, the ADCA would require covered entities to “establish the age of consumers with a reasonable level of certainty” appropriate to the risks or to apply the highest protections to all consumers. Age assurance and verification have been ongoing concerns as companies struggle with practical implementation. Some online services already engage in some form of age identification or inference, but some advocates have critiqued COPPA’s “actual knowledge” standard, arguing that it incentivizes websites for a general audience to simply not ask users’ ages. Others have criticized age-verification requirements, arguing that such mandates compel services to collect additional information about individuals.
The ADCA’s approach diverges from COPPA: the ADCA would essentially establish a “constructive knowledge” standard, creating liability if a service knew or should have known that children are likely to access its products or services given the circumstances, such as user data, online context, and marketing, but it would not require services to collect additional personal information merely to infer user age. The ADCA standard is consistent with the UK AADC and may be more consistent with the CPRA’s knowledge standard. While the CPRA adopts an actual knowledge standard, it also states that a “business that willfully disregards the consumer’s age shall be deemed to have had actual knowledge of the consumer’s age.”
Data Minimization
The ADCA would establish strong data minimization requirements, prohibiting the collection, sale, sharing, or retention of personal information that is not necessary to provide the product or service. Children’s data that is necessarily collected may only be used for the purpose for which it was collected. The ADCA would permit the collection of data solely for age verification purposes, but would minimize its use by prohibiting that data from being used for any purpose other than verifying user age. Moreover, the ADCA states that “Age assurance shall be proportionate to the risks and data practice of a service, product, or feature.”
Default Privacy Settings
When a minor accesses digital services, the ADCA would require covered entities to configure “all default privacy settings offered by the online service, product, or feature to the settings that offer a high level of privacy protection offered by the business.” The ADCA defines “default” as “a preselected option adopted by the business for the online service, product, or feature.” Examples of features that would be disabled by default include those that profile children based on their behavior or browsing history, or that assume similarity to other children in order to offer detrimental material.
By default, the ADCA would bar a covered company from “collect[ing], sell[ing], or shar[ing] any precise geolocation information of children . . . unless the collection of that precise geolocation information is necessary.” Covered companies would be required to provide “an obvious sign to the child for the duration of that collection that precise geolocation information is being collected.” When precise geolocation information is collected, collection would be limited to the time during which it is necessary.
Enforcement and Regulation
The ADCA would task the California Privacy Protection Agency (CPPA) with establishing the California Children’s Data Protection Taskforce (the “Taskforce”) and publishing privacy information, policies, and standards. The CPPA is the agency established through the CPRA for the purpose of implementing and enforcing that law. Under the ADCA, the Taskforce would be responsible for adopting regulations by April 1, 2024 and providing compliance guidance. The Taskforce would be assembled by April 1, 2023 and consist of members appointed by the CPPA. Members would be “Californians with expertise in the areas of privacy, physical health, mental health, and well-being, technology, and children’s rights.” Companies would have three months to comply with regulations produced by the Taskforce prior to enforcement by the CPPA.
The ADCA would also require covered entities to undertake a Data Protection Impact Assessment (DPIA) for any product, service, or feature likely to be accessed by a child. This new requirement would be a large shift for covered entities, as they would also be required to submit the assessments to the CPPA for review every 24 months or before new features are offered to the public. DPIAs can serve to strengthen enforcement of privacy laws, but they are resource-intensive for both the covered entities that must complete them and the regulators that review them. The CPRA also requires risk assessments, but unlike the ADCA, it is not yet clear whether those assessments must be filed with the CPPA.
Looking Ahead
In proposing the ADCA, California expands the growing conversation on children’s online protection. Domestically and internationally, legislatures, policymakers, and advocates are attempting to balance the need to protect youth data and mitigate the negative effects of online content with the need for young people to learn and explore. Federally, the FTC has also renewed its focus on children’s privacy. These efforts may further motivate Congress to push for more comprehensive federal children’s privacy protections and to extend heightened protections to teens. Lawmakers in California and around the world have prioritized legislation that would establish individual protections, limit the potential harmful consequences of large-scale data collection and processing, and curtail abuses of targeted advertising and automated decision making.
The ADCA is currently pending in the California Senate Judiciary Committee and will be discussed in a committee hearing on June 28, 2022. Although the bill passed unanimously through the Assembly, it is not yet clear how it will be received by the Senate and what revisions may be adopted. If the ADCA is passed and signed by the governor, it will shift the children’s data privacy landscape in the United States: many online services are based in California, and nearly all have California users. For now, it remains to be seen whether the ADCA will emerge from the California Senate and, if it does, which provisions may be amended in the process.
Additional Resources:
- FPF’s Response to the FTC’s COPPA Statement (May 2022)
- FPF Brief on the Passage of CPRA (November 2020)