A Price to Pay: U.S. Lawmaker Efforts to Regulate Algorithmic and Data-Driven Pricing
“Algorithmic pricing,” “surveillance pricing,” “dynamic pricing”: in states across the U.S., lawmakers are introducing legislation to regulate a range of practices that use large amounts of data and algorithms to routinely inform decisions about the prices and products offered to consumers. These bills—targeting what this analysis collectively calls “data-driven pricing”—follow the Federal Trade Commission’s (FTC) 2024 announcement that it was conducting a 6(b) investigation to study how firms are engaging in so-called “surveillance pricing,” and the release of preliminary insights from this study in early 2025. With new FTC leadership signaling that continuing the study is not a priority, state lawmakers have stepped in to scrutinize certain pricing schemes involving algorithms and personal data.
The practice of vendors changing their prices based on data about consumers and market conditions is by no means a new phenomenon. In fact, “price discrimination”—the term in economics literature for charging different buyers different prices for largely the same product—has been documented for at least a century, and has likely played a role since the earliest forms of commerce.1 What is unique, however, about more recent forms of data-driven pricing is the granularity of data available, the ability to more easily target individual consumers at scale, and the speed at which prices can be changed. This ecosystem is enabled by the development of tools for collecting large amounts of data, algorithms that analyze this data, and digital and physical infrastructure for easily adjusting prices.
Key takeaways
- Data-driven pricing legislation generally focuses on three key elements: the use of algorithms to set prices, the individualization of prices based on personal data, and the context or sector in which the pricing occurs.
- Lawmakers are particularly concerned about the potential for data-driven pricing to cause harm to consumers or markets in housing, food establishments, and retail, echoing broader interest in the impact of AI in “high-risk” or “consequential” decisions.
- Legislation varies in the scope of pricing practices covered, depending on how key terms are defined. Lawmakers face the challenge of prohibiting practices deemed inappropriate while preserving those that consumers find beneficial, such as loyalty programs or personalized discounts.
- Beyond legislation, regulators have signaled interest in investigating certain data-driven pricing practices. The Federal Trade Commission, Department of Justice, Department of Transportation, and state Attorneys General have all stated their intentions to enforce against particular instances of algorithmic pricing.
Trends in data-driven pricing legislation
As discussed in the FPF issue brief Data-Driven Pricing: Key Technologies, Business Practices, and Policy Implications, policymakers are generally concerned with a few particular aspects of data-driven pricing strategies: the potential for unfair discrimination, a lack of transparency around pricing practices, the processing and sharing of personal data, and possible anti-competitive behavior or other market distortions. While these policy issues may also be the domain of existing consumer protection, competition, and civil rights laws, lawmakers have made a concerted effort to address them explicitly with new legislation. Crucially, these bills implicate three elements of data-driven pricing practices, raising a series of distinct but related questions for each:
- Algorithms: Was an algorithm used to set prices? Are consumers able to understand how the algorithm works? How was the algorithm trained, and how might training data implicate the model’s outputs? What impact does the algorithm have on different market segments and demographic groups, as well as markets overall?
- Personal data: Was personal data used to set prices, and are prices personalized to individuals? What kind of personal data is used? Is sensitive data or protected characteristics included? Are inferences made about individuals based on their personal data for the sake of market segmentation?
- Context: Is the pricing being implemented in a particular sector, or in regard to particular goods, that might be especially sensitive or consequential? For example, is data-driven pricing being used in the housing market, or in groceries and restaurants?
These elements generally correspond to the different terms used in legislation to refer to data-driven pricing practices. For example, a number of bills use terms such as “algorithmic pricing,” including New York S 3008, an enacted law requiring a disclosure when “personalized algorithmic pricing” is used to set prices,2 and California SB 384, which would prohibit the use of “price-setting algorithms” under certain market conditions. A number of other bills use terms like “surveillance pricing,” such as California AB 446, which would prohibit setting prices based on personal information obtained through “electronic surveillance technology,” and Colorado HB 25-1264, which would make it an unfair trade practice to use “surveillance data” to set individualized prices or workers’ wages. Finally, some bills seek to place limits on the use of “dynamic pricing” in certain circumstances, including Maine LD 1597 and New York A 3437, which would prohibit the practice in the context of groceries and other food establishments. Each of these framings, while distinct, often covers similar kinds of practices.
Given that certain purchases such as housing and food are necessary for survival, the use of data-driven pricing strategies in these contexts is of particular concern to lawmakers. Many states already have laws banning or restricting price gouging, which typically focus on products that are necessities, and specifically during emergencies or disasters. Data-driven pricing bills, on the other hand, are less prescriptive about how much sellers are allowed to change prices, but apply beyond just emergency situations. While many apply uniformly across the economy, some are focused on particular sectors, including:
- Housing: e.g., California SB 52
- Food establishments: e.g., Massachusetts S 2515 (applies to grocery stores), Hawaii HB 465 (applies to the sale of food qualifying for federal SNAP and WIC benefits programs)
- Retail: e.g., Vermont H 371
- Tickets: e.g., Illinois HB 3838
In addition to bills focused on data-driven pricing, legislation regulating artificial intelligence (AI) and automated decision-making more generally often applies specifically to “high-risk AI” and AI used to make “consequential decisions,” including educational opportunities, employment, finance or lending, healthcare, housing, insurance, and other critical services. The use of a pricing algorithm in one of these contexts may therefore trigger the requirements of certain AI regulations. For example, the Colorado AI Act defines “consequential decision” to mean “a decision that has a material legal or similarly significant effect on the provision or denial to any consumer of, or the cost or terms of…” the aforementioned categories.
Because certain data-driven pricing strategies are widespread and appeal to many consumers, there is some concern—particularly among retailers and advertisers—that overly broad restrictions could actually end up harming consumers and businesses alike. For example, widely popular and commonplace happy hours could, under certain definitions, be considered “dynamic pricing.” As such, data-driven pricing legislation often contains exemptions, which generally fall into a few categories:
- General discounts: Deals that are available to the general public, such as coupons, sales, or bona fide loyalty programs (e.g., California SB 259).
- Cost-based price differentials: Pricing differences or changes due to legitimate disparities in input or production costs across areas (e.g., Georgia SB 164).
- Insurers or financial institutions: Highly regulated entities that may engage in data-driven pricing strategies in compliance with other existing laws (e.g., Illinois SB 2255).
Key remaining questions
A number of policy and legal issues will be important to keep an eye on as policymakers continue to learn about the range of existing data-driven pricing strategies and consider potential regulatory approaches.
The importance of definitions
As policymakers attempt to articulate the contours of what they consider to be fair pricing strategies, the definitions they adopt play a major role in the scope of practices that are allowed. Crafting rules that prohibit certain undesirable practices without eliminating others that consumers and businesses rely on and enjoy is challenging, requiring policymakers to identify what specific acts or market conditions they’re trying to prevent. For example, Maine LD 1597, which is intended to stop the use of most dynamic pricing by food establishments, includes an incredibly broad definition of “dynamic pricing”:
“Dynamic pricing” means the practice of causing a price for a good or a product to fluctuate based upon demand, the weather, consumer data or other similar factors including an artificial intelligence-enabled pricing adjustment.
While the bill would exempt discounts, time-limited special prices such as happy hours, and goods that “traditionally [have] been priced based upon market conditions, such as seafood,” prohibiting price changes based on “demand” could undermine a fundamental principle of the market economy. Even with exceptions that carve out sales and other discounts—and not all bills contain such exemptions—legislation might still inadvertently capture other accepted practices such as specials aligned with seasonal changes, bulk purchase discounts, deals on goods nearing expiration, or promotions to clear inventory.
Lawmakers must also consider how any new definitions interact with definitions in existing law. For example, an early version of California AB 446, which would prohibit “surveillance pricing” based on personally identifiable information, included “deidentified or aggregated consumer information” within the definition of “personally identifiable information.” However, deidentified and aggregated information is not considered “personal information” as defined by the California Consumer Privacy Act (CCPA). In later versions, the bill authors aligned the definition in AB 446 with the text of the CCPA.
The role of AI
In line with policymakers’ increased focus on AI, and a shift toward industry use of algorithms in setting prices, a significant amount of data-driven pricing legislation applies explicitly to algorithmic pricing. Some bills, such as California SB 52 and California SB 384, are intended to address potentially anticompetitive, algorithmically driven practices, while many others are geared toward protecting consumers from discriminatory practices. Though consumer protection may be the goal, some bills focus not on preventing specific impacts but on eliminating the use of AI in pricing altogether, at least in real time. For example, Minnesota HF 2452 / SF 3098 states:
A person is prohibited from using artificial intelligence to adjust, fix, or control product prices in real time based on market demands, competitor prices, inventory levels, customer behavior, or other factors a person may use to determine or set prices for a product.
This bill would prohibit all use of AI for price setting, even when based on typical product pricing data and applied equally to all consumers. Such a ban would have a significant impact on surge pricing and on any sector that is highly reactive to market fluctuations. On the other hand, other bills focus on the use of personal data—including sensitive data like biometrics—to set prices that are personalized to each consumer. For example, Colorado HB 25-1264 would prohibit the practice of “surveillance-based price discrimination,” defined as:
Using an automated decision system to inform individualized prices based on surveillance data regarding a consumer.
…
“Surveillance data” means data obtained through observation, inference, or surveillance of a consumer or worker that is related to personal characteristics, behaviors, or biometrics of the individual or a group, band, class, or tier in which the individual belongs.
These bills are concerned not necessarily with the use of AI in pricing per se, but how the use of AI in conjunction with personal data could have a detrimental effect on individual consumers.
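The distinction drawn above can be made concrete with a minimal sketch. This is hypothetical illustrative code—the function names, numbers, and pricing logic are invented for explanation, not drawn from any bill, vendor system, or the FTC study: demand-based dynamic pricing moves a single price that every buyer sees, while individualized “surveillance-based” pricing uses data about a specific consumer to quote each person a different price for the same good at the same moment.

```python
# Hypothetical sketch of the two pricing models the bills distinguish.
# All names and numbers are illustrative assumptions, not from any real system.

def dynamic_price(base: float, demand_ratio: float) -> float:
    """Uniform price that moves with market demand; all buyers see the
    same price at a given moment (the broad 'dynamic pricing' concept)."""
    # Clamp the demand multiplier to an assumed 0.5x-2x band.
    return round(base * max(0.5, min(2.0, demand_ratio)), 2)

def individualized_price(base: float, inferred_willingness: float) -> float:
    """Per-consumer price informed by inferences about that individual --
    the kind of individualized pricing targeted by 'surveillance pricing'
    bills, where different buyers are quoted different prices."""
    return round(base * inferred_willingness, 2)

# The same $10.00 good, at the same moment in time:
uniform = dynamic_price(10.00, demand_ratio=1.3)               # every buyer: 13.00
alice = individualized_price(10.00, inferred_willingness=1.4)  # one buyer: 14.00
bob = individualized_price(10.00, inferred_willingness=0.9)    # another buyer: 9.00
```

Under broad definitions like Maine LD 1597’s, even the first function could be covered, since the price “fluctuate[s] based upon demand”; it is the second function—varying the price per consumer based on data about that consumer—that bills like Colorado HB 25-1264 and Illinois SB 2255 single out.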
The impact on consumers
While data-driven pricing legislation is generally intended to protect consumers, some approaches may unintentionally block practices that consumers enjoy and rely on. There is a wide gap between common, beneficial price-adjusting practices like sales on one hand and exploitative practices like price gouging on the other, and writing a law that draws the proper line between the two is difficult. For example, Illinois SB 2255 contains the following prohibition:
A person shall not use surveillance data as part of an automated decision system to inform the individualized price assessed to a consumer for goods or services.
The bill would exempt persons assessing price based on the cost of providing a good or service, insurers in compliance with state law, and credit-extending entities in compliance with the Fair Credit Reporting Act. However, it would not exempt bona fide loyalty programs, a popular consumer benefit that is excluded from other similar legislation (such as the enacted New York S 3008, which carves out deals provided under certain “subscription-based agreements”). While lawmakers likely intended just to prevent exploitative pricing schemes that disempower consumers, they may inadvertently restrict some favorable practices as well. As a result, if statutes aren’t clear, some businesses may forgo offering discounts for fear of noncompliance.
Legal challenges to legislation
When New York S 3008 went into effect on July 8, 2025, the National Retail Federation filed a lawsuit to block the law, alleging that the following disclosure requirement amounts to compelled speech in violation of the First Amendment:
Any entity that sets the price of a specific good or service using personalized algorithmic pricing … shall include with such statement, display, image, offer or announcement, a clear and conspicuous disclosure that states: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA”.
The New York Office of the Attorney General, in response, said it would pause enforcement until 30 days after the judge in the case makes a decision on whether to grant a preliminary injunction. Other data-driven pricing bills would not face this challenge, as they don’t contain specific language requirements, instead focusing on prohibiting certain practices.
Beyond legislation
Regulators have also been scrutinizing certain data-driven pricing strategies, particularly for potentially anticompetitive conduct. While the FTC has seemingly deprioritized the 6(b) study of “surveillance pricing” it announced in July 2024—canceling public comments after releasing preliminary insights from the report in January 2025—it could still take action on algorithmic pricing in the future under its competition authority. In fact, the FTC’s new leadership has not retracted a joint statement the Commission made in 2024 along with the Department of Justice (DOJ), European Commission, and UK Competition and Markets Authority, which affirmed “a commitment to protecting competition across the artificial intelligence (AI) ecosystem.” The FTC, along with 17 state attorneys general (AGs), also still has a pending lawsuit against Amazon, accusing the company of using algorithms to deter other sellers from offering lower prices.
Even if the FTC refrains from regulating data-driven pricing, other regulators may be interested in addressing the issue. In particular, in 2024 the DOJ, alongside eight state AGs, used its antitrust authority to sue the property management software company RealPage for allegedly using an algorithmic pricing model and nonpublic housing rental data to collude with other landlords. The anticompetitive use of algorithmic pricing tools is also a DOJ priority under new leadership, with the agency filing a statement of interest regarding the “application of the antitrust laws to claims alleging algorithmic collusion and information exchange” in a March 2025 case, and the agency’s Antitrust Division head promising an increase in probes of algorithmic pricing. Additionally, in response to reports claiming that Delta Air Lines planned to institute algorithmic pricing for tickets—and a letter to the company from Senators Gallego (D-AZ), Blumenthal (D-CT), and Warner (D-VA)—the Department of Transportation Secretary signaled that the agency would investigate such practices.
Conclusion
Policymakers are turning their attention toward certain data-driven pricing strategies, concerned about the impact—on consumers and markets—of practices that use large amounts of data and algorithms to set and adjust prices. Focused on practices such as “algorithmic,” “surveillance,” and “dynamic” pricing, these bills generally address pricing that involves the use of personal data, the deployment of AI, and/or frequent changes, particularly in critical sectors like food and housing. As access to consumer data grows, and algorithms are implemented in more domains, industry may increasingly rely on data-driven pricing tools to set prices. As such, legislators and regulators will likely continue to scrutinize the potentially harmful impacts of these tools.
- While some forms of price discrimination are illegal, many are not. The term “discrimination” as used in this context is distinct from how it’s used in the context of civil rights.
- The New York Attorney General’s office said, as of July 14, 2025, that it would pause enforcement of the law while a federal judge decides on a motion for preliminary injunction, following a lawsuit brought by the National Retail Federation.