Key Points: California Attorney General Rob Bonta announced an investigative sweep concerning so-called “surveillance pricing” or “algorithmic pricing.” The AG highlights potential CCPA privacy violations tied to individualized pricing models, including a lack of transparency and failure to comply with the CCPA’s “purpose limitation” principle. Other regulators are likely to follow suit, so now is the time to assess and mitigate potential compliance and enforcement risks.

On January 27, 2026, California Attorney General (AG) Rob Bonta announced an investigative sweep focused on businesses that use consumer data to individualize prices for their goods or services. Bonta framed the issue as follows:

Consumers have the right to understand how their personal information is being used, including whether companies are using their data to set the prices that Californians pay, whether that be for groceries, travel, or household goods. We need to know whether businesses are charging people different prices for the same good or service — and if they’re complying with the law.

The California Department of Justice (DOJ) is issuing written inquiries to businesses with substantial online operations in the retail, grocery, and hotel industries that leverage individualized pricing. It is requesting certain information on this issue, including details about:

  • Companies’ use of consumer personal information to set prices.
  • Policies and public disclosures regarding personalized pricing.
  • Any pricing experiments undertaken by companies.
  • Measures companies are taking to comply with laws governing algorithmic pricing, competition, and civil rights.

This post summarizes the basis for the California DOJ’s investigatory sweep, how it intends to apply California Consumer Privacy Act (CCPA) requirements, and how businesses can prepare for and mitigate the risk of these inquiries and potential enforcement actions.

What is “Surveillance Pricing”?

According to a 2025 Federal Trade Commission (FTC) study, some companies track consumer behavior to set pricing for goods or services that are specific to the individual consumer. Details such as a person’s precise location, demographic information, or browser history can be used to target individual consumers with different prices for the same goods and services. The study found that consumer behaviors ranging from mouse movements on a webpage to the types of products left unpurchased in an online shopping cart can be tracked and used by retailers to tailor pricing. Per the FTC, the goal of this approach is to infer and offer the highest price that a customer is willing to pay for a good or service.
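
To make these mechanics more concrete, the hypothetical sketch below (in Python) illustrates the kind of behavior-based price adjustment the FTC study describes: tracked behavioral and demographic signals are turned into a rough estimate of willingness to pay, which then drives the price an individual shopper sees. The signal names, weights, and thresholds are invented for illustration and do not reflect any company’s actual model.

```python
# Hypothetical illustration only: a toy example of the behavior-based pricing
# described in the FTC study. Signal names, weights, and thresholds are invented.

from dataclasses import dataclass


@dataclass
class ShopperSignals:
    abandoned_cart_items: int   # items left unpurchased in an online cart
    pages_viewed: int           # depth of browsing on the product page
    affluent_zip_code: bool     # coarse location/demographic inference
    returning_customer: bool    # prior purchase history


BASE_PRICE = 100.00  # the list price everyone would otherwise see


def personalized_price(s: ShopperSignals) -> float:
    """Estimate willingness to pay from tracked behavior and adjust the price."""
    multiplier = 1.0
    if s.abandoned_cart_items > 0:
        multiplier -= 0.05      # hesitant shoppers may be shown a discount
    if s.pages_viewed > 10:
        multiplier += 0.03      # heavy research can signal strong purchase intent
    if s.affluent_zip_code:
        multiplier += 0.07      # location used as a proxy for ability to pay
    if s.returning_customer:
        multiplier += 0.02      # loyalty may reduce price sensitivity
    return round(BASE_PRICE * multiplier, 2)


# Two shoppers looking at the same item can be quoted different prices.
print(personalized_price(ShopperSignals(2, 3, False, False)))   # e.g., 95.0
print(personalized_price(ShopperSignals(0, 14, True, True)))    # e.g., 112.0
```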

In recent months, several studies and reports by consumer watchdog groups have highlighted alleged price disparities resulting from companies using individualized pricing models. These reports have likely contributed to the California DOJ setting its sights on addressing individualized pricing and are also likely to spark a wave of additional scrutiny by other regulatory agencies in the coming year.

CCPA Requirements and Potential Compliance Issues

The DOJ announcement highlighted two main concerns, a lack of transparency and processing beyond consumers’ reasonable expectations, warning that “practices like surveillance pricing may undermine consumer trust, unfairly raise prices, and when conducted without proper disclosure or beyond reasonable expectations, may violate California law.”

The CCPA gives consumers the right to know, through mandated privacy notices, how businesses collect, use, and share their personal information. The AG believes that some businesses do not provide adequate transparency around individualized pricing:

Consumers have the right to understand how their personal information is being used, including whether companies are using their data to set the prices that Californians pay, whether that be for groceries, travel, or household goods.

Additionally, under the CCPA’s “purpose limitation” principle, regulated businesses must limit their collection, use, retention, and sharing of personal information to what is reasonably necessary and proportionate. The CCPA provides as follows:

A business’s collection, use, retention, and sharing of a consumer’s personal information shall be reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed, or for another disclosed purpose that is compatible with the context in which the personal information was collected, and not further processed in a manner that is incompatible with those purposes.

The CCPA’s final regulations list five factors for analyzing whether the purpose for collecting or processing a consumer’s personal information is consistent with their reasonable expectations, including:

  1. The relationship between the consumer(s) and the business.
  2. The type, nature, and amount of personal information that the business plans to collect or process.
  3. The source of the personal information and the business’s method for collecting or processing it.
  4. The specificity, explicitness, prominence, and clarity of disclosures to the consumer(s) about the purpose for collecting or processing their personal information.
  5. The degree to which the involvement of service providers, contractors, third parties, or other entities in the collecting or processing of personal information is apparent to the consumer(s).

The regulations identify other criteria for determining whether another stated purpose for using personal information is compatible with the context in which the personal information is collected or processed (and therefore consistent with a consumer’s reasonable expectations).

Notably, while the DOJ announcement was silent on this point, the CCPA also gives consumers the right to limit certain uses of sensitive personal information (SPI) (e.g., precise geolocation data, health information, biometric data, and other data types). If a business uses SPI beyond the purposes prescribed by the CCPA in order to “infer characteristics” about consumers, it must provide a “Notice of Right to Limit” and either a “Limit the Use of My Sensitive Personal Information” link or an “Alternative Opt-out Link.” To the extent the DOJ discovers businesses using SPI to infer characteristics tied to individualized pricing, it is likely to scrutinize compliance with these requirements as well.

Significantly, other state privacy laws (e.g., Colorado, Connecticut, Delaware, Iowa, Indiana, Kentucky, Maryland, Minnesota, Montana, New Hampshire, New Jersey, Oregon, Tennessee, Texas, Virginia) impose similar purpose limitations, and the FTC has weighed in on individualized pricing as a potential unfair or deceptive trade practice under Section 5 of the FTC Act.

What Can Companies Do to Prepare?

The DOJ’s focus on transparency and on consumers’ reasonable expectations (in light of the CCPA’s purpose limitation principle) suggests the following steps for mitigating compliance and enforcement risk:

  1. Conduct an individualized pricing data processing “inventory.” Take steps to determine whether and how individualized pricing is implemented by the organization. This will require the legal team to coordinate with stakeholders on the product, marketing, data, and IT/web teams. Efforts here will include identifying the personal information collected and processed for individualized pricing (including potentially SPI), determining how it is used and disclosed in support of those efforts, and investigating the operation of any in-house or third-party tools used to set pricing (a simple record format is sketched after this list).
  2. Reassess and update privacy disclosures. The CCPA generally requires businesses to disclose the categories of personal information they collect, and how they use, disclose, and process such information. Additionally, notice to consumers concerning specific data uses (including potentially individualized pricing) helps create reasonable consumer expectations for processing their data. As such, it’s important to confirm that a business’s privacy disclosures and representations (not only formal privacy policies, but also marketing materials and other customer-facing documents) accurately reflect its data processing activities with an appropriate level of detail and clarity, including how personal information may be used in this context.
  3. Analyze potential AI, ADMT, and profiling due diligence and legal obligations. Many organizations are implementing individualized pricing using personal information combined with artificial intelligence (algorithmic pricing), automated decision-making, and profiling. See, e.g., the Colorado Artificial Intelligence Act and the New York Pricing Disclosure Law. To the extent these approaches involve “consequential decisions,” additional privacy and AI legal requirements may apply, including the need to conduct risk and data processing impact assessments, make prescribed disclosures, and provide certain data subject rights.
  4. Anticipate and prepare for potential regulatory scrutiny and enforcement. Regulators are clearly interested in individualized pricing, especially in an inflationary economic environment. As such, in addition to taking stock and addressing compliance, organizations should work backwards in anticipation of regulatory activity (and perhaps class action litigation as well). This includes analyzing individualized pricing under attorney-client privilege, reviewing internal documentation or communications concerning individualized pricing, developing a governance approach to enable vetting of individualized pricing models before implementation, and mapping specific requirements to the organization’s actual practices (with an eye toward establishing reasonable positions regarding compliance).
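
To make step 1 more tangible, the simplified sketch below (in Python, with hypothetical field names and example values) shows one way to structure an inventory record for each data element that feeds individualized pricing: the category of personal information, whether it is SPI, its source, the pricing purpose, the tools involved, and whether the use is disclosed in the privacy policy. It is an illustrative record-keeping format under those assumptions, not a prescribed compliance artifact.

```python
# Simplified, hypothetical inventory record for personal information that feeds
# individualized pricing. Field names and example values are illustrative only.

from dataclasses import dataclass, field


@dataclass
class PricingDataRecord:
    data_category: str                  # e.g., "browsing history"
    is_sensitive_pi: bool               # SPI under the CCPA (e.g., precise geolocation)
    source: str                         # e.g., "first-party web analytics"
    pricing_purpose: str                # how the data feeds the pricing model
    tools: list[str] = field(default_factory=list)  # in-house or third-party pricing tools
    disclosed_in_privacy_policy: bool = False       # ties back to step 2 (disclosures)


inventory = [
    PricingDataRecord(
        data_category="precise geolocation",
        is_sensitive_pi=True,
        source="mobile app SDK",
        pricing_purpose="region- and demand-based price adjustment",
        tools=["third-party dynamic pricing platform"],
        disclosed_in_privacy_policy=False,
    ),
]

# Flag records that may require a "Limit the Use of My Sensitive Personal
# Information" link, updated notices, or further legal review.
review_queue = [r for r in inventory
                if r.is_sensitive_pi or not r.disclosed_in_privacy_policy]
```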

Overall, companies should assume any individualized pricing models they use will draw increased regulatory scrutiny for the foreseeable future. Given the intersection of multiple areas of regulatory concern (e.g., pricing, antitrust, artificial intelligence, privacy, and civil rights), it is no surprise that the FTC and California DOJ have already weighed in. Other regulators will likely follow suit, seeking fines, penalties, and injunctive relief on an issue that affects consumer pocketbooks and generates politically favorable headlines. Now is the time to take stock and implement defensible practices to get ahead of these issues.

David Navetta

David advises clients on all aspects of technology and data law, including data privacy, information security, artificial intelligence (AI), financial reporting, data governance, technology-related transactions, and data monetization and use.

Karla Ballesteros

Karla is an associate in the firm’s Privacy + Cyber practice. Her daily work includes counseling insureds on the initial incident response, potential ransom payment, restoration, data mining, and notification segments of the incident response practice. She also leads efforts to identify and remediate shortcomings in the cybersecurity and privacy practices of firm clients.

Brianna Dally

Brianna provides comprehensive advice to clients across various industries on privacy and cybersecurity issues. Her work ranges from implementing information security and incident response programs to addressing complex compliance questions. Brianna has experience advising on compliance with regulations such as the New York Department of Financial Services (NYDFS) Cybersecurity Regulation and other insurance data security laws modeled on the NAIC Insurance Data Security Model Law.

Daniel Waltz

Daniel is a member of the firm’s Regulatory Investigations, Strategy + Enforcement (RISE) Practice Group and State Attorneys General team. He counsels clients on navigating complex government investigations, regulatory compliance, and transactions involving state and federal government contracting obligations. Drawing on his broad experience as a former assistant attorney general for the state of Illinois, Daniel is a problem solver both inside and outside the courtroom.