New Age-Appropriate Design Code Framework Takes Hold in Maryland
On April 6, the Maryland legislature passed HB 603/SB 571, the “Maryland Age-Appropriate Design Code Act” (Maryland AADC), which is currently awaiting action from Governor Moore. While FPF has already written about Maryland’s potentially “paradigm-shifting” state comprehensive privacy law, the Maryland AADC may similarly pioneer a new model for other states. The Maryland AADC seeks to create heightened protections for youth aged 17 and under and will apply to businesses that provide online services, products, or features reasonably likely to be accessed by children. Businesses that may not typically fall within the scope of other statutorily created child privacy protections may find themselves with new obligations under this framework.
See our comparison chart for a full side-by-side comparison between the Maryland Age-Appropriate Design Code and California Age-Appropriate Design Code.
Who does the Maryland AADC apply to?
If enacted, the Maryland AADC will apply to businesses that provide online services, products, or features reasonably likely to be accessed by children. The applicability threshold is fairly similar to that of the California Consumer Privacy Act, as integrated into the California Age-Appropriate Design Code Act (California AADC). The Maryland AADC specifically captures businesses that conduct business in Maryland and meet one of three thresholds: 1) have annual gross revenue of at least $25,000,000, 2) buy, receive, sell, or share the personal data of 50,000 or more consumers, households, or devices (down from 100,000 consumers or households in California), or 3) derive at least 50% of annual revenue from the sale of personal data.
What about that other age-appropriate design code?
The Maryland AADC is the second “design code” bill to pass a U.S. state legislature, following the California AADC of 2022. However, FPF’s analysis finds that the Maryland AADC differs from its California predecessor in numerous critical ways. While the California AADC was slated to take effect on July 1, 2024, it was enjoined in September 2023 by the United States District Court for the Northern District of California (CA District Court) in NetChoice v. Bonta. The CA District Court held that plaintiffs were likely to succeed on their claim that several non-severable provisions of the California AADC violate the First Amendment.
While litigation over the California AADC is ongoing, proponents of the design code framework have claimed that the model can be revised to address the constitutional questions raised in the District Court’s preliminary injunction. The resulting “AADC 2.0” framework emerged during the 2024 state legislative session in several states, including Vermont, Minnesota, New Mexico, and Maryland. Maryland is the first state to pass an AADC 2.0 bill, and the Maryland AADC will, therefore, likely be the subject of considerable analysis and debate over whether the First Amendment vulnerabilities that plagued the California AADC have been removed.
Fundamental changes from California AADC
- No express age estimation mandate
One of the most significant changes in the Maryland AADC is that there is no express obligation for businesses to determine the age of individuals using a service. Under the California AADC, businesses would have been required to estimate the age of young users with “a reasonable level of certainty appropriate to the risks” arising from data management practices or, alternatively, to provide strict privacy protections by default to all individuals regardless of age. Under present methods, estimating the age of users with a high level of accuracy typically requires collecting additional personal information, such as government identifiers or facial scans. In granting a preliminary injunction against the California AADC, the CA District Court appeared greatly troubled by this age estimation requirement, noting that it was “likely to exacerbate” rather than alleviate any harm of insufficient data protections for children by requiring both children and adults to share additional personal information.
In 2023-2024, several other youth privacy laws that required collecting age information were similarly enjoined by U.S. courts, often on First Amendment grounds. Given this consistent trend, it is unsurprising that the Maryland AADC does not include such a requirement. Instead, the Maryland AADC relies solely on a “likely to be accessed by children” audience standard. Rather than collecting age information, a service will need to assess, using a variety of indicators, whether or not it is likely to be used by children. Some factors appear to be modeled after the federal Children’s Online Privacy Protection Act’s (COPPA) similar “directed to children” standard, such as empirical evidence on audience composition or whether the online product features advertisements marketed to children. However, as a reminder, the Maryland AADC applies to children and teens under 18. While businesses may have great familiarity with assessing whether advertisements appeal to children under 13 in complying with COPPA, conducting this assessment for 16- and 17-year-olds may be less familiar and potentially more complicated.
Notably, the CA District Court in Bonta also observed that the age estimation provision of the California AADC was the “linchpin” of the law because knowing the age of users is critical for applying “age-appropriate” protections. For example, the Maryland AADC requires that privacy information and community standards be provided in a language suited to the age of children likely to access the service. Therefore, it remains an open question whether Maryland’s removal of this express requirement also erases any implicit obligations for collecting age information to serve the “age-appropriate” protections mandated by the bill.
- Defining and upholding the “best interests of children”
While the Maryland AADC is an evolution of the California AADC, the California AADC is itself derived from the UK Age-Appropriate Design Code (UK AADC). A core component of the UK AADC is that businesses should consider the “best interests of the child” when designing and developing online services. The “best interests of the child” is a recognized concept adopted from the UN Convention on the Rights of the Child, which the United States is the only country not to have ratified. In the United States, the “best interests of the child” is typically not an established legal standard outside of the family law context.
While the California AADC imported the “best interests of the child” language from the UK AADC, it did not include a definition. Under the California AADC, businesses would have been permitted to avoid certain obligations if they were able to demonstrate that their alternative course of action was consistent with the undefined “best interests of children.”
In contrast, the Maryland AADC establishes a quasi-‘duty of care’ that affirmatively obligates online services to act in the best interests of children. It goes on to define the “best interests of the child” in terms of uses of a child’s data, or designs of an online product, that will not result in 1) reasonably foreseeable and material physical or financial harm to children, 2) reasonably foreseeable and severe psychological or emotional harm to children, 3) a highly offensive intrusion on the reasonable privacy expectations of children, or 4) discrimination against children based upon race, color, religion, national origin, disability, gender identity, sex, or sexual orientation. As further explained below, this switch from using the “best interests of the child” as a means to avoid obligations to instead creating affirmative obligations arguably makes the Maryland AADC less flexible in ways that could, for instance, disrupt or prevent children’s access to beneficial services.
- Changes to data protection impact assessment (DPIA) obligations
The Maryland AADC, like the California AADC before it, requires businesses to conduct DPIAs to consider how online products will impact children. However, the Maryland AADC incorporates small but potentially impactful changes from its Californian predecessor. The District Court in Bonta took issue with the California AADC’s DPIA requirement for two reasons: 1) it did not address the harms it aimed to cure, because the DPIAs addressed the risks arising from data management practices rather than the design of a service, and 2) although businesses were required to develop a plan to mitigate risks, there was no requirement to actually mitigate those risks. In light of this, the Maryland AADC requires that DPIAs include a description of the steps the company has taken and will take to comply with the duty to act in the best interests of children.
The Maryland AADC also makes small changes to what harms or risks must be assessed. The California AADC required assessing whether the service could expose children to “harmful” or “potentially harmful” content, a provision that particularly raised the ire of the news industry. Though the District Court did not reach this issue, the Maryland AADC removes any mention of content, presumably to proactively address any First Amendment free speech concerns. The Maryland AADC also lacks a requirement to assess harms related to targeted advertising; during the legislative session, drafters removed any mention of targeted advertising from the bill. The exclusion of “targeted advertising” may be less a response to Bonta and more likely a reflection of the fact that the Maryland Online Data Privacy Act, which also creates heightened protections for children and teens, explicitly addresses targeted advertising.
- Stricter processing restrictions
One area where the Maryland AADC arguably goes further than the California AADC is in placing more expansive default limitations on how businesses may process children’s data, where processing is defined broadly to include collecting, using, storing, and deleting personal information. The Maryland AADC would ban businesses from processing personal data that is not reasonably necessary to provide an online product with which the child is “actively and knowingly engaged.” While “actively and knowingly” is not defined, a strict reading would suggest that the bill forbids businesses from retaining any information about a child user beyond a single-use session, including basic details like account information and log-in credentials. This restriction would functionally deprive children of the ability to use many online products, services, and features. Even if future regulations or judicial holdings advance a more flexible interpretation of this restriction, it could significantly impact the ability of services to perform analytics, collect attribution data, or even receive health records from a parent or doctor.
Under the California AADC, there was an exemption from this prohibition if the business could demonstrate a compelling reason that the processing was in children’s best interests. However, the Maryland AADC has no similar exemption. Instead, Maryland will prohibit any processing inconsistent with children’s best interests under a separate provision, so reconciling the processing restrictions under this law may prove challenging.
- No mention of enforcing published terms
Unlike the California AADC and other state laws, the Maryland AADC does not require businesses to enforce their terms of service or other policies implemented under the law. By comparison, the California AADC would have required that businesses both publish and enforce “terms, policies, and community standards established by the business,” essentially giving the California Attorney General power to second-guess core First Amendment-protected functions such as content moderation. While different in scope, Florida’s social media law, recently argued before the Supreme Court, similarly contained a requirement to enforce community standards that a district court determined conflicted with a service’s First Amendment right to exercise editorial discretion. The absence of such a provision in the Maryland AADC may be explained by criticism of these other laws: creating liability for services that fail to enforce published community guidelines may unintentionally incentivize platforms to lower their community standards, leading to more harmful online spaces overall.
Conclusion
After the California AADC passed, some expected a flurry of similar legislation in other states. While a handful of states considered copycat legislation over the last two legislative sessions, none of those bills has ultimately been enacted, potentially due to the ongoing legal questions about that model’s constitutionality. Now that Maryland is pioneering this new “AADC 2.0” framework, stakeholders should be on high alert for new legal challenges and the potential for other states to consider and iterate upon this approach.
If enacted, the Maryland AADC will go into effect on October 1, 2024 – coincidentally the same day the Connecticut Data Privacy Act’s recently passed heightened youth protections go into effect.