FPF Submits Comments to Inform New York Children’s Privacy Rulemaking Processes
At the end of the 2024 legislative session, New York State passed a pair of bills aimed at creating heightened protections for children and teens online. One, the New York Child Data Protection Act (NYCDPA), applies to a broad range of online services that are “primarily directed to children.” The NYCDPA creates novel substantive data minimization requirements, restricts the sale of children’s data (with children defined as users under 18), and requires businesses to respect “age flags,” new device signals intended to convey whether a user is a minor. The second law, the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, is more narrowly focused on social media platforms and restricts the presentation of content to minors through “addictive feeds.”
On September 30, the Future of Privacy Forum (FPF) filed comments with the New York Office of the Attorney General (OAG) to inform the forthcoming rulemaking implementing these two frameworks. While strengthening youth privacy and online safety protections has been a priority for lawmakers over the last several years, New York’s two new laws each take unique approaches. FPF’s comments seek to ensure that New York’s new laws protect and empower minors online while supporting interoperability with existing state and federal privacy frameworks.
New York Child Data Protection Act
The NYCDPA creates new restrictions on the collection and processing of the personal data of teen users who fall outside the scope of the federal Children’s Online Privacy Protection Act (COPPA). Under the law, a covered business must obtain “informed consent” to use teen data, or the processing must be strictly necessary to meet one of nine enumerated “permissible purposes,” such as conducting internal business operations or preventing cybersecurity threats. For minors aged 13 and older, the requirement that processing be strictly necessary in the absence of informed consent could be stricter than COPPA’s standards, especially with respect to many digital advertising practices. The law also restricts the sale of minors’ personal data, including permitting a third party to sell that data.
Obtaining “informed consent” under the NYCDPA requires satisfying a number of conditions, some of which diverge from comparable privacy regimes. Consent must be obtained separately from any other transaction or part of a transaction; be obtained in the absence of any ‘dark patterns;’ clearly and conspicuously state that the processing for which consent is requested is not strictly necessary and that the minor may decline without losing the ability to continue using the service; and present the option to refuse consent as the most prominent option.
The NYCDPA is also unique in providing for the use of device signals to transmit legally binding information about a user’s age and their informed consent choices. Such technologies are not commonplace in the market and raise a number of technical and policy questions and challenges.
With these unique provisions in mind, FPF’s comments recommend that the OAG:
1. Consider existing sources of law, including the COPPA Rule’s internal operations exception, state privacy laws, and the GDPR, to provide guidance on the scope of “permissible processing” activities;
2. Where appropriate, align core privacy concepts, including the definition of “personal information” and opportunities for data sharing for research, with the developing state comprehensive privacy landscape;
3. Consult with the New York State Education Department to ensure alignment with New York’s existing student privacy laws and implementing regulations and avoid disruption to both schools’ and students’ access to and use of educational products and services;
4. Mitigate privacy, technical, and practical implementation concerns with “age flags” by further consulting with stakeholders and establishing baseline criteria for qualifying signals. FPF offers technical and policy considerations the OAG should weigh in furthering this emerging technology; and
5. Explicitly distinguish informed consent device signals from “age flags,” given that providing consent at scale raises a separate set of challenges and may undermine the integrity of the NYCDPA’s opt-in consent framework.
Read our Child Data Protection Act comments here.
New York SAFE for Kids Act
The SAFE for Kids Act restricts social media platforms from offering “addictive feeds” unless the service has conducted “commercially reasonable” age verification to determine that a user is over 17 years of age, or the service has obtained verifiable parental consent (VPC). The legislative intent makes clear that ordering content in a chronological list would not be considered an “addictive feed.” A social media platform will also need to obtain VPC to provide notifications concerning an addictive feed to minors between the hours of midnight and 6 am.
An “addictive feed” is broadly defined as a feed in which user-generated content is recommended, selected, or prioritized based, in whole or in part, on user data. There are six carve-outs to the definition of “addictive feed,” such as displaying or prioritizing content that was specifically and unambiguously requested by the user or displaying content in response to a specific search inquiry from a user.
Notably, the SAFE for Kids Act relies on parental consent for teens to receive “addictive feeds.” In contrast, the Child Data Protection Act empowers teens to provide informed consent for a broad range of activities. The divergence in policy approaches between these two laws regarding who can provide consent for a teen using a service may lead to challenges in understanding individual rights and protections.
Given the critical role of age verification and parental consent within the SAFE for Kids Act, FPF’s comments to the OAG focus on highlighting the considerations, risks, and benefits of various methods for conducting age assurance and obtaining parental consent. In particular, we note that:
1. There are three primary categories of age assurance in the United States: age declaration, age estimation, and age verification. Each method has its own challenges and risks that should be carefully balanced against the state’s interest in protecting minors online, the state of current technologies, and end-user realities when developing age verification standards.
2. When exploring appropriate methods for obtaining verifiable parental consent, the OAG should consider the known problems, concerns, and friction points with the existing verifiable parental consent framework under COPPA.
3. Strong data minimization, use limitation, and data retention standards could enhance data protection and user trust in age assurance and VPC requirements.
Read our SAFE For Kids Act comments here.