
Shelby develops and implements comprehensive privacy programs that are tailored to the specific needs of each client, helping them to remain compliant as privacy laws continue to evolve at the state, federal, and international levels. She is well versed in all U.S. state privacy laws, laws governing social media and children’s data, AI laws and regulations, and international data privacy laws, including the GDPR.

Key point: If enacted, the bill will require GenAI systems to provide a conspicuous warning that GenAI outputs may be inaccurate.

On March 9, 2026, the New York legislature passed A 3411, which requires generative artificial intelligence (GenAI) systems to notify users that the system’s outputs may be inaccurate. The bill will next move to Governor Kathy Hochul for consideration. If it becomes law, the bill will go into effect 90 days from enactment. The bill is short (it contains only 30 lines of text) but has broad implications.

In the article below, we provide background on the bill, an overview of its requirements, and potential implications should it become law.

Key point: The California attorney general announced a $2.75 million fine against a company for CCPA violations, based on its failure to honor requests to opt out of the sale or sharing of personal information across all devices and services associated with consumer accounts.

On February 11, 2026, the California attorney general (AG) announced a settlement with a multiplatform entertainment company, resolving alleged California Consumer Privacy Act (CCPA) violations based on gaps in the company’s opt-out procedures. This is the second public CCPA enforcement settlement arising from the California Department of Justice’s 2024 investigative sweep of streaming services. It is also the largest CCPA settlement to date: roughly five times the amount of the first enforcement action and more than $1 million above the AG’s prior largest settlement. These actions reflect an escalating enforcement trajectory as the AG and the California Privacy Protection Agency develop a body of precedent that increasingly functions as operational compliance guidance for businesses. Notably, every CCPA enforcement action to date has involved the right to opt out in some way, and the AG’s expectations for what constitutes compliant opt-out implementation are becoming more granular and more demanding with each successive action.

Key point: The Connecticut Office of the Attorney General issued the third annual enforcement report under the Connecticut Data Privacy Act, focusing on the office’s privacy and security efforts, consumer complaints, data breaches, and enforcement priorities.

The Connecticut Office of the Attorney General (OAG) issued its 2025 enforcement report under the Connecticut Data Privacy Act (CTDPA) last week. This is the third report since the CTDPA went into effect in July 2023. The report provides an update on (1) privacy-related consumer complaints, (2) data breach notice review and enforcement, and (3) enforcement efforts and priorities. Importantly, the OAG emphasized that protecting “kids online remains a topmost priority” and that it would continue to pursue investigations and enforcement actions focused on companies that offer online services, products, or features to consumers under 18.

In the report, the OAG also outlined recent amendments to the CTDPA, which will take effect on July 1, 2026. For more information regarding these amendments, see the recording of our webinar on 2025 Key Updates on State Privacy and AI Laws.

This article summarizes the OAG’s report and the positions the OAG takes on various issues. While the report highlights the OAG’s strong pro-consumer stance and illustrates its expansive view of the CTDPA and its provisions, this article takes no position on the substance of the OAG’s views.

Key point: The law, which went into effect at signing, contains significant design and development requirements, requires independent third-party audits, and can be enforced against officers and employees.

On February 5, 2026, South Carolina Governor Henry McMaster signed the South Carolina Age-Appropriate Design Code Act (H 3431). South Carolina now joins California, Maryland, Nebraska, and Vermont in enacting Age-Appropriate Design Code (AADC) laws, although these laws vary widely in both scope and requirements.

South Carolina’s law has several unique requirements, including requiring covered online services to engage in independent third-party audits, which are to be publicly posted by the state attorney general. We review these requirements below.

Of further note, the law went into effect upon the governor’s signature and does not contain a right to cure. The law is generally enforceable by the state attorney general, who can seek treble damages for violations. It also specifically provides that officers and employees of covered online services can be held personally liable for willful and wanton violations. In addition, the law’s prohibition against dark patterns is enforceable under the South Carolina Unfair Trade Practices Act, which allows for a private right of action. In the post below, we provide an overview of the new law and more general context on its provisions.

Key point: The Kentucky attorney general filed a lawsuit against an artificial intelligence chatbot company eight days after the Kentucky Consumer Data Protection Act went into effect.

On January 8, the Kentucky attorney general (AG) announced its first lawsuit for violations of the Kentucky Consumer Data Protection Act (KCDPA), filed against an artificial intelligence (AI) chatbot company. The complaint alleges that the defendant violated the KCDPA through unfair, false, misleading, or deceptive acts and practices, and through unfair collection and exploitation of children’s data. The complaint also asserts claims under the state’s consumer protection law and data breach law.

The complaint is the latest in a growing trend of states regulating AI chatbots, including companion chatbots. As we recently discussed, New York and California passed laws last year specifically regulating companion chatbots. Lawmakers in other states have already proposed numerous bills this year. This comes notwithstanding the recent executive order, which seeks to preempt “onerous” state AI laws. As we foreshadowed in our analysis of that order, the instant complaint also reinforces the difficulty in defining what constitutes a state AI law, as the complaint is brought under existing state laws that are not specifically written to cover AI.

In the article below, we provide a summary of the allegations in the complaint.

Key point: Businesses subject to the CCPA now must conduct risk assessments for certain types of processing activities and, starting in 2028, must certify to California regulators that they completed the assessments.

The California Consumer Privacy Act’s (CCPA) new regulations went into effect on January 1, 2026. Although the new regulations bring many changes for businesses subject to the CCPA, one of the biggest changes is a new requirement to conduct risk assessments for processing activities that present “significant risk to consumers’ privacy.” This can encompass many types of common data processing activities, such as the use of third-party cookies and tracking technologies, processing of sensitive personal information (e.g., biometric data), and the use of AI for certain employment-related activities. Like the CCPA itself, the risk assessment requirement applies to consumer, employee, and commercial personal information.

Importantly, on April 1, 2028, businesses subject to the CCPA must file a certification with the California Privacy Protection Agency (CalPrivacy) attesting — under penalty of perjury — that they conducted the required risk assessments. The certification must be signed by a member of the business’s executive management team.

In the below article, we provide an overview of this new risk assessment requirement.

Key point: A federal court has blocked the Texas App Store Accountability Act, which was set to take effect on January 1, 2026, on constitutional grounds.

A Texas federal district court today granted a preliminary injunction enjoining the Texas App Store Accountability Act, stating that the law likely violates the First Amendment and is unconstitutionally vague. In October, an internet trade association sued the state of Texas over the act, and this month the case was consolidated with another case stating similar claims. The law was scheduled to take effect January 1, 2026, and imposes obligations on both app stores and developers providing mobile applications to Texas users. Texas will be unable to implement or enforce the act while the litigation is ongoing.

Key point: The California AG’s fifth CCPA-related enforcement action focuses on the CCPA’s right to opt out of sales/shares and on its children’s privacy provisions. With respect to the right to opt out, it should prompt businesses to reevaluate their procedures, especially the treatment of account holders and mobile apps.

On October 30, 2025, the California attorney general (AG) announced a settlement with a streaming services provider[1] over violations of the California Consumer Privacy Act (CCPA). Pursuant to the proposed final judgment and permanent injunction, the company will pay a $530,000 fine and implement several injunctive relief requirements. According to the press release, the settlement arose from a 2024 investigative sweep of streaming services.

The complaint alleges two CCPA violations: (1) failure to provide easy-to-execute methods for consumers to opt out of the selling and sharing of their personal information; and (2) failure to provide sufficient privacy protections for children. Given that these are distinct issues, we will address them in two separate articles. This first article provides a brief background of the enforcement action, an analysis of the right to opt-out violations, and a summary of the injunctive relief requirements. The next article will analyze the children’s privacy violations.

Key point: California’s new Digital Age Assurance Act will likely create significant compliance challenges for many businesses.

On October 13, 2025, California Governor Gavin Newsom signed AB 1043 — the Digital Age Assurance Act — into law. In doing so, California joins Louisiana, Texas, and Utah in passing laws this year requiring app developers to receive age bracket signals. While California’s law is more operational in nature, and in key respects narrower than the content-focused laws passed by Louisiana, Texas, and Utah, when AB 1043 goes into effect on January 1, 2027, it will likely require companies to consider unique implementation strategies and may frustrate efforts to create a uniform age-assurance compliance program. Further, the law will likely affect almost every app developer operating in California, including many that have never dealt with age verification requirements.

In the below article, we provide background and a summary of the law, discuss how it compares with other similar-in-kind laws, and outline some implications businesses will need to consider.