Contextualizing the Kids Online Safety and Privacy Act: A Deep Dive into the Federal Kids Bill
Co-authored by Nick Alereza, FPF Policy Intern and student at Boston University School of Law. With contributions from Jordan Francis.
On July 30, 2024, the U.S. Senate passed the Kids Online Safety and Privacy Act (KOSPA) by a vote of 91-3. KOSPA is a legislative package that includes two bills that gained significant traction in the Senate in recent years: the Kids Online Safety Act (KOSA), first introduced in 2022, and the Children and Teens Online Privacy Protection Act (“COPPA 2.0”), first introduced in 2019. KOSPA contains new provisions as well as provisions that would amend, and in some cases augment, the United States’ long-standing federal children’s privacy law, the Children’s Online Privacy Protection Act (COPPA).
KOSPA’s passage in the Senate marks the most substantial advancement in federal privacy legislation in decades. In just the last two years, the children’s and teens’ privacy and online safety landscape has seen a flurry of activity. The federal executive branch has been active through efforts such as significant FTC enforcement actions and a report released just two weeks ago by the Biden-Harris Administration’s interagency Task Force on Kids Online Health and Safety. Most notably, many states have passed laws providing heightened protections for kids and teens online, some of which have been the subject of litigation.
Amid all this activity, the Kids Online Safety and Privacy Act takes an approach unlike much of what we have seen before. Like other proposals, the bill would create heightened protections for teens as well as new design and safety protections. However, KOSPA also contains a novel knowledge standard, narrow preemption, and a new “duty of care,” along with required design safeguards and a prohibition on targeted advertising to children and teens.
1. A novel knowledge standard
Like COPPA, KOSPA would establish a two-part threshold for when companies are required to comply with various data protection obligations, such as access, deletion, and parental consent: when a service is “directed to children” or when a service has “actual knowledge” that an individual is a child. However, KOSPA would modify this standard in a novel way: its protections for minors would apply when a business has “actual knowledge or knowledge fairly implied on the basis of objective circumstances.”
This language is based on the FTC’s trade regulation rules, which use the “knowledge fairly implied” standard to determine whether a company knew it was violating a trade rule. While the FTC is experienced in applying this standard, it is new in the context of children’s privacy and online safety. Currently, there is little guidance, and there are few comparable laws, to help explain how “knowledge fairly implied on the basis of objective circumstances” applies to the narrow question of whether a user of a website is a minor. The standard is arguably closer to constructive knowledge and may even be broader than the “willful disregard” standard used in state comprehensive privacy laws.
COPPA’s knowledge standard, or the question of what obligation a business has to determine which of its users are children, has long been debated. On one hand, critics of the existing standard argue that it is too narrow and that requiring actual knowledge incentivizes companies to avoid evidence suggesting that children are on their websites. On the other hand, proponents of the existing standard argue that broadening the threshold would force companies to engage in too much data collection, with the unintended result of age-gating even general audience, age-appropriate websites. In recent years, most state comprehensive privacy laws have adopted an “actual knowledge or willfully disregards” standard, which attempts to strike a balance between the two sides of this debate.
2. Narrow preemption of state laws
Preemption, or the question of which state privacy laws will be superseded by a federal standard, is one of the biggest sticking points in federal privacy debates. KOSPA’s preemption is narrow: the Act would explicitly supersede only state laws that directly conflict with it. Additionally, the Act includes a savings clause allowing states to enact laws and regulations that provide “greater protection” to minors than KOSPA does.
While any federal law is likely to carry some uncertainty when it comes to preemption of state laws, this language bodes well for states that have enacted heightened privacy and online safety protections for children and teenagers in recent years, such as Maryland, Connecticut, and New York. Part of the rationale for a federal privacy law is that it would provide one national standard for privacy rather than a “patchwork” state-by-state approach. Under KOSA and COPPA 2.0, however, the federal protections would be layered on top of existing state compliance obligations.
3. A novel “duty of care” to prevent and mitigate harms to children and teens
One of the most discussed new provisions in KOSPA (arising from KOSA) is its duty of care. The bill would require covered platforms to exercise “reasonable care” in the “creation and implementation of any design feature to prevent and mitigate [harms] to minors.” Specifically, KOSPA identifies six categories of harm, including specified mental health disorders, physical violence and online bullying, and deceptive marketing practices. (See Table 1.)
Online services owing a duty of care to minors is a novel feature of child-focused privacy laws that has emerged in recent years, appearing in the currently enjoined California Age-Appropriate Design Code, the Maryland Age-Appropriate Design Code, and recent amendments to Colorado’s and Connecticut’s comprehensive consumer privacy laws. The design codes impose an affirmative duty to act in the best interests of children, whereas KOSA, Connecticut, and Colorado impose a duty to avoid harm.
Overall, KOSPA/KOSA’s approach to a duty of care is both broader in scope and more specific in its enumeration of harms than existing state approaches. As comprehensive consumer privacy laws, Connecticut and Colorado focus on how the processing of personal data may facilitate harms, whereas KOSA applies broadly to preventing and mitigating harms. Connecticut and Colorado also require an assessment of any service, product, or feature, while KOSA is focused only on “design features.” Lastly, Connecticut’s and Colorado’s lists of harms are shorter and more narrowly focused on traditional privacy harms, while KOSA enumerates concrete harms related to modern kids’ and teens’ well-being, such as anxiety, bullying, and abuse.
None of the state laws with duties of care are yet in force, so it remains to be seen how these provisions will be implemented by companies or enforced by regulators. However, KOSA’s alignment with the specificity and narrower scope of the Colorado and Connecticut laws could mitigate the risk of legal challenges over restrictions on content, like those seen in the California AADC litigation.
| KOSA’s duty of care | Connecticut & Colorado’s duty of care |
| --- | --- |
| A covered platform shall exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate the following harms to minors: | Controllers shall use reasonable care to avoid any heightened risk of harm to minors caused by such online service, product, or feature. |
| (1) Consistent with evidence-informed medical information, the following mental health disorders: anxiety, depression, eating disorders, substance use disorders, and suicidal behaviors. (3) Physical violence, online bullying, and harassment of the minor. (4) Sexual exploitation and abuse of minors. (5) Promotion and marketing of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol. | “Heightened risk of harm to minors” means processing minors’ personal data in a manner that presents any reasonably foreseeable risk of: (A) any unfair or deceptive treatment of, or any unlawful disparate impact on, minors; (B) any financial, physical, or reputational injury to minors; (C) any physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of minors if such intrusion would be offensive to a reasonable person; or (D) unauthorized disclosure of the personal data of minors as a result of a security breach. [Note: this fourth harm is in CO, but not CT.] |
4. Changes to Verifiable Parental Consent (VPC)
KOSPA would expand the existing requirements for verifiable parental consent (VPC), requiring companies to collect it at an earlier stage than is often required under COPPA. Interestingly, both components of KOSPA (the COPPA 2.0 and KOSA portions of the bill) address VPC separately. KOSA would require a covered platform to obtain VPC before a known child’s initial use of the service. While a covered platform may consolidate this process with its process for obtaining VPC under COPPA, KOSA’s VPC requirement appears to apply even if a covered platform’s personal information practices do not necessitate VPC under COPPA.
KOSA may also differ in its approach to children who already use a covered platform. Because KOSA requires VPC prior to a known child’s “initial use,” it is unclear whether a covered platform must obtain VPC from a child whose initial use occurred before the bill’s effective date or before the platform knew the user was a child. Comparable state social media laws include provisions that prevent a minor from holding an account they could not create: Florida’s HB 3 would require a social media service to terminate all accounts that likely belong to minors younger than 16, and Tennessee’s Social Media Act would require age verification of an unverified account holder when they attempt to access their account.
5. Other Privacy and Safety Safeguards
KOSPA includes a number of requirements for companies to establish safeguards aimed at addressing “the frequency, time spent, or activity of minors” on platforms, including the ability to opt out of personalized recommendation systems. The proposal would also establish a flat ban on personalized advertising to kids and teens under the age of 17.
Design Safeguards for Time Spent and Recommendations
KOSPA requires covered platforms to “provide readily-accessible and easy-to-use safeguards” to any user or visitor that the platform knows is a minor. These safeguards must be on the most protective setting by default. KOSA requires a covered platform to make parental tools available, although a minor can change their own account settings without VPC.
Two of KOSPA’s safeguards have key differences from state social media laws with similar provisions. KOSA requires a covered platform to limit by default “design features that encourage or increase the frequency, time spent, or activity of minors.” State social media laws that regulate design features tend to do so narrowly, such as Utah’s SB 196, which would prohibit the use of infinite scroll, autoplay, and push notifications for minors, or New York’s SAFE for Kids Act, which would require VPC to enable overnight notifications for minors. Once again, KOSA’s scope more closely resembles state privacy laws: Colorado and Connecticut both broadly prohibit the use of any “system design feature to significantly increase, sustain, or extend a minor’s use of the online service, product, or feature” without a child’s VPC or a minor’s consent. Unlike all of these laws, however, KOSPA would allow minors, including children, to change any of these settings without VPC.
The second notable safeguard is a requirement that a covered platform include controls to adjust or opt out of any personalized recommendation system, meaning a suggestion or ranking algorithm that incorporates a user’s personal information as defined in COPPA. This category appears to be narrower than that of New York’s SAFE for Kids Act, which would limit feeds that rank or suggest content based on any information associated with a user or the user’s device.
Prohibition on Targeted Advertising
Finally, the COPPA 2.0 portion of the bill creates a flat prohibition on targeted advertising to children and teens 16 and under. While comparable state laws have moved in the direction of additional restrictions on advertising to minors, the federal approach goes furthest by creating a ban rather than allowing for opt-in consent. Notably, the bill takes the approach of creating and defining the term “individual-specific advertising.” The combination of the targeted advertising ban and the bill’s broader knowledge standard is likely to have significant impacts on the adtech ecosystem.
Reporting Mechanism
KOSPA requires a covered platform to provide a reporting mechanism through which minors, parents, or schools can report harms to minors. The platform must maintain an electronic point of contact specific to these matters and must substantively respond to a report within 10 or 21 days, depending on the size of the platform and the imminence of the harm to the minor. KOSPA’s attention to detail regarding reporting mechanisms stands out when compared to the Maryland AADC’s single requirement that a service’s reporting tools be “prominent, accessible, and responsive.”
Looking ahead
While KOSPA passed the Senate by an overwhelming vote of 91-3, its future in the House of Representatives is uncertain. The House began its August recess just days before the Senate vote, and the earliest KOSPA could be taken up in the House is September 9, just under two months before the November election. Whether that timing helps or hurts the bill’s chances is a matter of speculation. No matter Congress’s next move, states are poised to keep forging ahead on youth privacy and online safety.