Five Big Questions (and Zero Predictions) for the U.S. State Privacy Landscape in 2025
In the enduring absence of a comprehensive national framework governing the collection, use, and transfer of personal data, state-level activity on privacy legislation has been on a consistent upward trend since the enactment of the California Consumer Privacy Act in 2018. With all 50 U.S. states scheduled to be in session in 2025, stakeholders are anticipating yet another year of expansion and divergence across the state privacy landscape. It is still too early to predict which states will adopt or amend their privacy laws, so this article instead explores the five big questions set to shape American privacy law in the coming year.
- Will a new consensus emerge on data minimization (and would it change anything)?
State privacy laws have traditionally incorporated the principle of data minimization by prohibiting data processing beyond what is reasonably necessary to accomplish the purposes that are disclosed to a user. Consumer advocates have long objected to this approach, arguing that it incentivizes companies to bury broad disclosures in dense privacy notices, resulting in little, if any, heightened protection. This year, in a potentially paradigm-shifting move, the Maryland Online Data Privacy Act became the first comprehensive state law to attempt to depart from the typical data minimization standard by tying limits on the collection and use of data to what is necessary to provide a specific product or service requested by a consumer. My colleague Jordan Francis has called the classic approach “procedural data minimization” and the Maryland approach “substantive data minimization.”
While Maryland is the only comprehensive state privacy law to adopt a substantive data minimization approach, proposals in Vermont and Maine that came close to enactment this year contained similar language. Heightened data minimization provisions are also elements of recent sectoral laws including the Washington State My Health My Data Act, the New York Child Data Protection Act, and the Virginia Child Data Privacy Amendment. Taken together, these frameworks portend a new trend toward substantive data minimization standards; however, their statutory requirements vary in subtle but consequential ways. Distinctions include whether new minimization standards (1) apply to the collection or the processing of data, (2) limit data processing to what is “reasonably” necessary, “strictly” necessary, or just plain “necessary” to provide a requested product or service, and (3) are subordinate to other bases for using personal data, such as an enumerated list of “permissible purposes” (e.g., protecting data security) or processing consistent with consumer consent.
The emergence of substantive data minimization requirements in state privacy laws represents an attempt to depart from the much maligned “notice and consent” approach to consumer privacy law. However, the ultimate impact of these emerging standards is not yet clear, and is expected to be largely shaped by future trends in interpretation, implementation, and enforcement. Consider the following, yet unanswered, questions about these “necessity” data minimization standards:
- If personal data satisfies a “necessity” standard that focuses solely on collection, can it then be processed for an unnecessary secondary purpose following initial ingestion by a business?
- What data collection is “reasonably” necessary to offer a requested product or service but would not be “strictly” necessary to offer that same service? Practically, what is the difference between these two standards?
- If a company’s business model is based on the sale of personal information or the use of data for targeted advertising, are those processing activities necessary in order to offer a product or service?
- Does providing consent for data processing make that use a “requested” product or service regardless of the processing purpose?
- Will companies have leeway to define data collection and processing activities as within the scope of the “products and services” they offer? (e.g.: “Our service is a photo hosting platform that generates revenue from selling data.”)
- Will data brokers face renewed scrutiny?
Perhaps the biggest surprise of the 2024 cycle has been the lack of legislative activity directly focused on the information collection and sharing practices of data brokers. The third-party collection and sale of sensitive data, including health and location information, has been the subject of several high-profile media investigations and is increasingly cited as a potential threat to national security. However, this year no new states passed data broker-specific privacy laws, and very few such bills were even introduced.
The scarcity of state-level attention is even more noticeable when considering national efforts. New restrictions and enforcement concerning the brokering of personal information were among the few privacy topics on which federal policymakers were particularly active this year. Examples include the Biden Administration’s Executive Order 14117, the Protecting Americans’ Data from Foreign Adversaries Act, and the Federal Trade Commission’s litigation against Kochava and settlements with Gravy Analytics and Mobilewalla.
However, privacy legislation constraining the activities of data brokers could be set for a comeback in 2025, and lawmakers have a number of options they could pursue. In November, a coalition of data brokers decisively lost a bid to strike down New Jersey’s Daniel’s Law, which empowers certain government employees to request the removal of personal information from public websites. Furthermore, data broker registry laws are now in effect – and increasingly being enforced – in California, Texas, Oregon, and Vermont. California is also attracting attention for its efforts to build a “one-stop shop” accessible deletion mechanism intended to allow individuals to request the deletion of their personal information across the entire data broker ecosystem.
On the other hand, it is possible that lawmakers will instead choose to address concerns about the data broker industry through more comprehensive regulatory approaches. Singling out a particular industry or practice for regulation poses inherent challenges, often raising complicated line-drawing issues. A possible template for such a broader approach may be the aforementioned Maryland Online Data Privacy Act, which contains a unique standalone restriction on the sale of sensitive personal data.
- Which laws will be subject to legal challenges?
Several recent state privacy laws have been met with constitutional challenges, often concerning their intersection with First Amendment protected activity and their impact on interstate commerce. To date, the most common litigation (and industry success in seeking injunctions) has involved laws requiring social media companies to conduct age verification and limit features and access for certain child users. At the same time, lawmakers have continued to iterate on these proposals in search of a framework that can reliably withstand constitutional scrutiny – industry has notably secured only a partial injunction of the Texas SCOPE Act.
Looking ahead to 2025, legislative experimentation and industry litigation concerning children’s online safety and privacy laws are likely to continue apace. However, the tenor of these challenges may evolve following the Supreme Court’s decision in Moody v. NetChoice. While that case involved state laws regulating the content moderation practices of social media companies, several Justices expressed disapproval of how the case was brought as a “facial challenge” prior to enforcement, which may shift litigation strategies toward “as-applied” challenges.
Stakeholders should also pay close attention to privacy laws in California and Maryland. In California, industry groups have already raised concerns that recent California Privacy Protection Agency rulemaking activity – on both data brokers and automated decisionmaking technology opt-outs – exceeds the bounds of the Agency’s statutory authority and violates the California Administrative Procedure Act. Separately, while the Maryland Age Appropriate Design Code was drafted to remove any direct requirements to moderate content, the law’s risk assessment requirements may still contain “proxies for content” that the Ninth Circuit found likely to violate the First Amendment in California’s version of the law.
- How will lawmakers approach artificial intelligence?
Opportunities, risks, and hype surrounding advancements in artificial intelligence (AI) technologies have impacted every domain in tech policy, and data privacy is no exception. In fact, privacy rules may emerge as one of the more successful levers for governing AI. For example, existing technology-neutral privacy laws will already apply to AI systems to the extent that they collect, process, and output personal information. In particular, transparency, security, risk assessment, and consumer choice requirements under existing laws are poised to have significant influence on the development and use of new AI tools.
It is also important to recognize that AI is not a single technology, but can encompass a range of systems, some of which have been with us for decades (such as facial recognition technology) and some of which are still emerging (such as general purpose ‘foundation’ models). Lawmakers therefore have an array of approaches through which they could address AI safety, transparency, and fairness. For example, they could comprehensively regulate a broad range of technologies and harms, which is the approach taken by the draft Texas Responsible AI Governance Act. They may also seek to regulate a particular AI technology or use case, such as “deep fakes” in political advertisements. Finally, lawmakers could bake new AI-specific requirements into comprehensive privacy laws, as Minnesota did this year by creating a new right to contest the result of significant profiling decisions.
President-elect Donald Trump’s promise to repeal the Biden Administration’s AI Executive Order and incoming FTC Chair Ferguson’s leaked agenda to “terminate all initiatives involving so called… AI ‘bias’” could also inspire state lawmakers to focus on the use of AI systems in a manner that results in unlawful discrimination. This was the focus of the Colorado AI Act, which was enacted this year and may serve as a template for similar state-level efforts. However, efforts to establish a harmonized state-level approach to regulating discriminatory outcomes in high-risk systems may be complicated: Colorado’s path-setting law is likely to be further shaped by amendments and rulemaking prior to taking effect.
- How will the new administration and Congress impact the state privacy landscape?
Next year, President-elect Trump will enjoy narrow but meaningful majorities in both chambers of Congress. The Republican Party has historically supported the enactment of broadly preemptive privacy legislation, raising the possibility – however faint – that some or all of the emerging state privacy ‘patchwork’ could be superseded by new federal legislation. Business groups may see a window of opportunity to advocate for a broadly preemptive national privacy framework modeled on existing state laws like the Texas Data Privacy and Security Act. However, at present there is little to suggest that preemptive comprehensive privacy legislation will be a top priority for Republican lawmakers during the next Congress, though bipartisan movement on child-specific online safety legislation appears more likely.
The November election results will influence not only the legislative agenda in Washington D.C., but also legislative activity in the states. Democratic governors and attorneys general are already discussing legislative and legal strategies to attempt to minimize or block various priorities of the Trump agenda. Concerns about the incoming administration’s approach to issues like immigration, law enforcement, and health care may be a motivating factor for commercial privacy legislation in Democrat-controlled states. For example, in a potential sign of things to come, Democratic Senators in Michigan rapidly sought to establish new protections for “reproductive health data” during the State’s ‘lame duck’ session immediately following the November election.
Outside of a few notable examples, recent state privacy laws have typically been enacted on an overwhelmingly bipartisan basis. However, this pattern could shift next year should commercial privacy become increasingly intertwined with other, more polarized issues. Therefore, while 2025 is likely to be as active as ever for legislative activity, this dynamic could ultimately reduce the number of bills that are enacted compared to prior years.
Do you have the answers to these questions or are you brave enough to make your own predictions? Email the author of this post at [email protected]