Paradigm Shift in the Palmetto State: A New Approach to Online Protection-by-Design
South Carolina Governor McMaster signed HB 3431, an Age-Appropriate Design Code (AADC)-style law, on February 5, adding to the growing list of new, bipartisan state frameworks fortifying online protections for minors. Although HB 3431 is dubbed an AADC, its divergence from past models and its unique blend of requirements drawn from a variety of other state laws may signal that youth privacy- and safety-by-design frameworks are undergoing a paradigm shift away from “AADCs” and toward an entirely new model for online protections. South Carolina’s novel approach evolves the online design code schema in several respects: how it combines privacy and safety risks and requires covered services to address them, the kinds of safeguards online services must provide to users and minors, its enforcement priorities, and its strategy for navigating constitutional pitfalls.
For compliance teams, the need to unpack the law’s unique provisions is urgent: the law took effect upon the Governor’s approval, meaning these requirements are already in force. Further complicating entities’ compliance considerations, NetChoice filed a lawsuit on February 9 challenging the constitutionality of the Act on First Amendment and Commerce Clause grounds, and has requested a preliminary injunction to block enforcement as litigation progresses. With an unclear litigation timeline, several newly effective legal obligations, and significant enforcement provisions carrying personal liability for employees, compliance teams may be stuck between two high-stakes options: (1) a risk of insufficient action and consequent liability if they move slowly on compliance while monitoring litigation outcomes; or (2) a risk of sunk compliance costs that could have been invested in other important compliance and trust and safety operations if they invest more now and the law is later overturned.
This blog post covers a few key takeaways, including:
- Broad Scope & Thresholds: The Act applies to any legal entity (potentially including non-profits) providing online services “reasonably likely to be accessed by minors” that meets specific applicability thresholds, representing a broader blend of the scope of Maryland’s and Nebraska’s AADCs.
- Heightened “Duty of Care”: Unlike other states that only require risk mitigation, South Carolina mandates that entities exercise reasonable care to prevent heightened risks of harm to minors, which include compulsive use, severe psychological harm, identity theft, and discrimination, among others. This sets a notably higher compliance bar than other states’ duty of care obligations.
- Mandatory Tools & Universal Defaults: Despite only applying to services “reasonably likely to be accessed by minors,” services must provide all users of covered online services with expansive tools. For minors, these tools must be enabled by default, coupled with prescriptive parental monitoring requirements seemingly inspired by the federal Kids Online Safety Act.
- Third-Party Audits & Public Reporting: Apparently attempting to navigate the constitutional pitfalls plaguing California’s and Maryland’s AADCs, South Carolina replaces requirements for internal data protection impact assessments (DPIAs) with mandatory annual third-party audits. These audits must be submitted to the Attorney General, who will post audit reports publicly on the state website. They must include detailed information, such as descriptions of algorithms and how “covered design features” are used by the online service, potentially raising trade secret concerns.
- Significant Enforcement Provisions & Personal Liability for Employees: In a novel and extreme enforcement shift, the Attorney General is authorized to hold individual officers and employees personally liable for “willful and wanton” violations, in addition to seeking treble financial damages.
Additionally, see our comparison chart for a full side-by-side analysis of how South Carolina’s approach compares against other state law protections for minors online.
Scope
South Carolina’s Act applies to any legal entity that owns, operates, controls, or provides an online service reasonably likely to be accessed by minors. Whereas prior comparable state laws typically limited the scope to for-profit entities, South Carolina seemingly extends application to non-profit and other non-commercial entities. This approach mirrors the legal entity framing adopted in Vermont’s and Nebraska’s AADCs, though those laws include narrower applicability thresholds. With respect to applicability threshold criteria, South Carolina aligns with the model set out in Maryland’s AADC, applying to entities that meet any one of the following: (1) $25 million or more in gross annual revenue; (2) the buying, selling, receiving, or sharing of personal data of more than 50,000 individuals; or (3) deriving more than 50 percent of annual revenue from the sale or sharing of personal data.
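As an illustration only, the statute’s “any one of” threshold structure can be sketched in code. This is a simplified sketch, not legal guidance: the function name and parameters are hypothetical, and statutory terms such as “buying, selling, receiving, or sharing” of personal data require legal analysis that a boolean check cannot capture.

```python
def meets_applicability_threshold(
    gross_annual_revenue: float,
    individuals_whose_data_is_traded: int,
    share_of_revenue_from_data_sales: float,
) -> bool:
    """Hypothetical sketch of HB 3431's applicability thresholds.

    An entity is covered if it meets ANY ONE of the three criteria:
      (1) $25 million or more in gross annual revenue;
      (2) buys, sells, receives, or shares personal data of more
          than 50,000 individuals; or
      (3) derives more than 50 percent of annual revenue from the
          sale or sharing of personal data.
    """
    return (
        gross_annual_revenue >= 25_000_000
        or individuals_whose_data_is_traded > 50_000
        or share_of_revenue_from_data_sales > 0.50
    )


# A small non-profit trading data on 60,000 users would still be covered
# under criterion (2), even with negligible revenue.
print(meets_applicability_threshold(100_000, 60_000, 0.0))   # True
print(meets_applicability_threshold(1_000_000, 10_000, 0.2)) # False
```

Note that, unlike Maryland’s AADC (which borrows these same numeric criteria), the entity-type gate here is broad: the check above would apply to non-profits and other non-commercial entities, not only for-profit businesses.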
An Evolving Approach to Design Protections & Enforcement
Duty of Care
Similar to Vermont’s AADC and state comprehensive privacy laws that incorporate heightened protections for minors, such as Connecticut’s and Colorado’s, South Carolina imposes a duty of care on covered online services. Significantly, South Carolina’s duty requires entities to exercise reasonable care to prevent heightened risks of harm to minors, including compulsive use, identity theft, discrimination, and severe psychological harm, among others. The obligation to “prevent” harms to minors diverges sharply from comparable duties of care, which only require entities to “mitigate” risks–seemingly placing a higher bar on entities’ compliance efforts compared to other online protection frameworks. Moreover, South Carolina includes two disclaimers regarding the application of the duty of care: (1) clarifying that “harm” is limited to circumstances not precluded by Section 230; and (2) clarifying that entities are not required to prevent minors from intentionally “searching for content related to the mitigation of the described harms.”
Mandatory Tools & Default Settings
South Carolina takes a Nebraska AADC-style approach to requiring comprehensive tools and protective default settings for minors–but with a twist. Notably, South Carolina requires covered services to provide extensive tools to all users of an online service, such as tools for disabling unnecessary design features and limiting the amount of time spent on a service or platform. For minors, the Act requires covered services to enable all tools by default, functionally achieving the same goals as high default settings requirements in other frameworks, like Vermont’s and Maryland’s AADCs. Additionally, South Carolina includes prescriptive requirements for the kinds of parental tools businesses must build and provide for parents to monitor and further limit minors’ use of online services–seemingly inspired by the parental tools obligations proposed by the federal Kids Online Safety Act. Importantly, businesses in scope of several minor online protection frameworks should pay close attention to South Carolina’s expansive mandatory tools and default settings requirements–and the range of users for which these tools must be available–when assessing compliance impacts.
Processing Restrictions
A common component of other minor online protection frameworks, South Carolina includes several normative processing restrictions limiting the way covered online services can collect and use minors’ data, including restrictions on profiling and geolocation data tracking and a prohibition on targeted advertising. Notably, similar to Nebraska’s AADC, South Carolina also broadly prohibits covered entities’ use of dark patterns on a service. This goes far beyond many other privacy laws that instead prohibit dark patterns only insofar as they are used in obtaining consent or collecting personal data. Although the law as a whole is subject to Attorney General enforcement, South Carolina’s Act singles out the dark patterns prohibition as a violation of the South Carolina Unfair Trade Practices Act, which includes a private right of action.
Third Party Audits
One of the key issues hampering states’ pursuit of AADC frameworks has been the inclusion of data protection impact assessments (DPIAs), which typically require covered online services to assess the service’s likelihood of harm to children. For example, California’s AADC has been subject to litigation because, among other things, it included a requirement for businesses to assess and limit the exposure of children to “potentially” harmful content. While the Ninth Circuit held that assessments that require a company to opine on content-based harms are constitutionally problematic, it did not hold that DPIAs are entirely unconstitutional.
Within this dynamic constitutional landscape, South Carolina shifts away from requiring covered entities to internally assess harms through DPIAs and instead requires covered entities to undergo annual third-party audits and publicly disclose the reports. Those audits must include detailed information on various aspects of the online service as it pertains to minors, including the purpose of the online service, for what purpose the online service uses minors’ personal and sensitive data, whether the service uses “covered design features” (e.g., infinite scroll, autoplay, notifications/alerts, appearance-altering filters, etc.), and a description of algorithms (an undefined term) used by the covered online service. This shift towards public disclosure of service assessment information may cause notable compliance difficulties and raise trade secret questions for covered online services, although it is unclear whether this unique ‘third-party audits’ approach addresses the underlying constitutional concerns highlighted in state AADC litigation.
Enforcement
South Carolina authorizes the Attorney General to enforce the Act, allowing for treble financial damages for violations. Most significantly, South Carolina also authorizes the Attorney General to hold officers and employees personally liable for willful and wanton violations–a novel and extreme enforcement mechanism not employed in comparable frameworks. However, personal liability for employees and officers is not entirely unheard of in the broader consumer protection and digital services enforcement context–and might be a trending enforcement mechanism. For example, in a more aggressive enforcement approach advanced by the FTC under Lina Khan, the agency pursued personal liability against senior executives at a public company for violations of the FTC Act. In another more recent consumer protection example, the Kentucky Attorney General filed a consumer protection lawsuit against Character.AI and its founders alleging the company knowingly harmed minors in the operation of its companion chatbot product, exposing minors to “sexual conduct, exploitation, and substance abuse.”
Conclusion
By adopting this novel approach, South Carolina adds to a growing state-level experiment that seeks to balance business obligations to address and disclose risks of harm in online services and afford greater protections for minors with constitutional constraints. South Carolina’s novel blend of different state-level models, unique take on service assessments, and unusual enforcement approach signals a broader fragmentation of online youth protection frameworks into three increasingly defined models: (1) data management-oriented heightened protections for minors embedded in state privacy laws; (2) age appropriate design codes that impose a fiduciary duty to act in children’s best interests, require age appropriate design, and mandate DPIAs to assess foreseeable harms; and (3) a “protective design” model, exemplified by South Carolina, that synthesizes elements of the first two while uniquely integrating privacy and safety obligations. It remains to be seen how the emerging protections-by-design model may influence ongoing state legislative efforts, impact business compliance efforts, and measure up against potential constitutional scrutiny.