
Shelby develops and implements comprehensive privacy programs that are tailored to the specific needs of each client, helping them to remain compliant as privacy laws continue to evolve at the state, federal, and international levels. She is well versed in all U.S. state privacy laws, laws governing social media and children’s data, AI laws and regulations, and international data privacy laws, including the GDPR.

Key point: The amendment significantly expands the existing law’s applicability, redefines “covered design feature,” creates two new prohibitions, and adds a new deletion right.

On April 17, 2026, Nebraska Governor Jim Pillen signed LB 838 into law. The omnibus bill addresses several different topics but, as relevant here, it amends Nebraska’s Age-Appropriate Online Design Code Act, Neb. Rev. Stat. § 87-1301 et seq., which only went into effect on January 1, 2026. We provided an overview of the existing law here. The amendment goes into effect three months after the legislature’s April 17 adjournment date, on July 17, 2026.

The article below provides a summary of the changes.

A federal court in Michigan significantly narrowed Michigan Attorney General (AG) Dana Nessel’s privacy and consumer protection case against Roku, Inc. (Roku), dismissing all non-Children’s Online Privacy Protection Act (COPPA) claims for lack of standing while allowing the state’s COPPA claims to proceed. The decision highlights COPPA’s utility as a vehicle for state AGs to bring enforcement actions in federal court, while also underscoring the jurisdictional limits on bringing companion state privacy and consumer protection claims in the same forum.

Key point: Tennessee’s new law prohibits parties that develop or deploy AI systems from advertising or representing to the public that an AI system can act as a qualified mental health professional.

On April 1, 2026, Tennessee Governor Bill Lee signed SB 1580 into law, and it will go into effect on July 1, 2026. The new law is short — less than one page — but has potentially significant consequences given that it includes a private right of action.

In the following post, we provide an overview of the new law.

Key point: If enacted, the bill will require GenAI systems to provide a conspicuous warning that GenAI outputs may be inaccurate.

On March 9, 2026, the New York legislature passed A 3411, which requires generative artificial intelligence (GenAI) systems to notify users that the system’s outputs may be inaccurate. The bill will next move to Governor Kathy Hochul for consideration. If it becomes law, the bill will go into effect 90 days from enactment. The bill is short (it contains only 30 lines of text) but has broad implications.

In the article below, we provide background on the bill, an overview of its requirements, and potential implications should it become law.

Key point: The California attorney general announced a $2.75 million fine against a company for CCPA violations for failing to honor requests to opt out of the sale or sharing of personal information across all devices and services associated with consumer accounts.

On February 11, 2026, the California attorney general (AG) announced a settlement with a multiplatform entertainment company, resolving alleged California Consumer Privacy Act (CCPA) violations based on gaps in the company’s opt-out procedures. This is the second public CCPA enforcement settlement arising from the California Department of Justice’s 2024 investigative sweep of streaming services. It is also the largest CCPA settlement to date — roughly five times the amount of the first enforcement action and more than $1 million above the AG’s prior largest settlement. These actions reflect an escalating enforcement trajectory as the AG and the California Privacy Protection Agency develop a body of precedent that increasingly functions as operational compliance guidance for businesses. Notably, every CCPA enforcement action to date has involved, in some way, the right to opt out, demonstrating that the AG’s expectations for what constitutes compliant opt-out implementation are becoming both more granular and more demanding with each successive action.

Key point: The Connecticut Office of the Attorney General issued the third annual enforcement report under the Connecticut Data Privacy Act, focusing on the office’s privacy and security efforts, consumer complaints, data breaches, and enforcement priorities.

The Connecticut Office of the Attorney General (OAG) issued its 2025 enforcement report under the Connecticut Data Privacy Act (CTDPA) last week. This is the third report since the CTDPA went into effect in July 2023. The report provides an update on (1) privacy-related consumer complaints, (2) data breach notice review and enforcement, and (3) enforcement efforts and priorities. Importantly, the OAG emphasized that protecting “kids online remains a topmost priority” and that it would continue to pursue investigations and enforcement actions focused on companies that offer online services, products, or features to consumers under 18.

In the report, the OAG also outlined recent amendments to the CTDPA, which will take effect on July 1, 2026. For more information regarding these amendments, see the recording of our webinar on 2025 Key Updates on State Privacy and AI Laws.

This article summarizes the OAG’s report and the positions the OAG takes on various issues. While the report reflects the OAG’s strong pro-consumer stance and its expansive view of the CTDPA and its provisions, this article takes no position on the substance of those views.

Key point: The law, which went into effect at signing, contains significant design and development requirements, requires independent third-party audits, and can be enforced against officers and employees.

On February 5, 2026, South Carolina Governor Henry McMaster signed the South Carolina Age-Appropriate Design Code Act (H 3431). South Carolina now joins California, Maryland, Nebraska, and Vermont in enacting Age-Appropriate Design Code (AADC) laws, although these laws vary widely in both scope and requirements.

South Carolina’s law has several unique requirements, including requiring covered online services to engage in independent third-party audits, which are to be publicly posted by the state attorney general. We review these requirements below.

Of further note, the law went into effect upon the governor’s signature and does not contain a right to cure. The law is generally enforceable by the state attorney general, who can seek treble financial damages for violations. The law also specifically provides that officers and employees of covered online services can be held personally liable for willful and wanton violations. In addition, the law’s prohibition against dark patterns is enforceable under the South Carolina Unfair Trade Practices Act, which allows for a private right of action. In the below post, we provide an overview of the new law and more general context on its provisions.

Key point: Kentucky attorney general files a lawsuit against an artificial intelligence chatbot company, eight days after the Kentucky Consumer Data Protection Act went into effect.

On January 8, 2026, the Kentucky attorney general (AG) announced its first lawsuit for violations of the Kentucky Consumer Data Protection Act (KCDPA), against an artificial intelligence (AI) chatbot company. The complaint alleges that the defendant violated the KCDPA through unfair, false, misleading, or deceptive acts and practices, and through unfair collection and exploitation of children’s data. Among other claims, the complaint also states claims under the state’s consumer protection law and data breach law.

The complaint is the latest in a growing trend of states regulating AI chatbots, including companion chatbots. As we recently discussed, New York and California passed laws last year specifically regulating companion chatbots. Lawmakers in other states have already proposed numerous bills this year. This comes notwithstanding the recent executive order, which seeks to preempt “onerous” state AI laws. As we foreshadowed in our analysis of that order, the instant complaint also reinforces the difficulty in defining what constitutes a state AI law, as the complaint is brought under existing state laws that are not specifically written to cover AI.

In the article below, we provide a summary of the allegations in the complaint.

Key point: Businesses subject to the CCPA now must conduct risk assessments for certain types of processing activities and, starting in 2028, must certify to California regulators that they completed the assessments.

The California Consumer Privacy Act’s (CCPA) new regulations went into effect on January 1, 2026. Although the new regulations bring many changes for businesses subject to the CCPA, one of the biggest is a new requirement to conduct risk assessments for processing activities that present “significant risk to consumers’ privacy.” This can encompass many common data processing activities, such as the use of third-party cookies and tracking technologies, processing of sensitive personal information (e.g., biometric data), and the use of AI for certain employment-related activities. Like the CCPA generally, the risk assessment requirement applies to consumer, employee, and commercial personal information.

Importantly, on April 1, 2028, businesses subject to the CCPA must file a certification with the California Privacy Protection Agency (CalPrivacy) attesting — under penalty of perjury — that they conducted the required risk assessments. The certification must be signed by a member of the business’s executive management team.

In the below article, we provide an overview of this new risk assessment requirement.