April 2026

Key point: Alabama became the 21st state to enact a broad consumer data privacy law, Kentucky and Virginia finalized amendments to their consumer data privacy laws, and Nebraska amended its Age-Appropriate Design Code Act.

Below is the 14th update on the status of proposed state privacy legislation in 2026. This post covers updates on proposed bills dealing with consumer data privacy, children’s privacy, biometric privacy, data brokers, and consumer health data privacy. As always, the contents provided below are time-sensitive and subject to change.

On January 1, the California Delete Act went into effect, establishing new requirements for entities that qualify as "data brokers" under the law. Beginning August 1, covered businesses will need to process deletion requests through the California Privacy Protection Agency's (CalPrivacy) centralized mechanism, with fines of $200 per day for noncompliance.

Key point: Alabama becomes the 21st state to enact a broad consumer data privacy law, and its statute is one of the more business-friendly passed to date.

According to Privacy Daily, on April 16, 2026, Alabama Governor Kay Ivey signed the Alabama Personal Data Protection Act (HB 351) into law, making Alabama the 21st state to pass a broad consumer data privacy law and the second state to do so this year. It is also the second privacy law Alabama has enacted this year, following an app store law in February.

With the passage of Alabama's law, approximately 46% of the U.S. population is now covered by a broad consumer data privacy law.

The new business-friendly law is largely unremarkable. Companies that are complying with other state consumer data privacy laws will not need to do anything new to comply with Alabama’s law. However, the law does have a few nuances that we discuss in the article below — in particular, the law’s applicability standard and its definition of “sale.”

Key Points: An August 2025 federal court ruling has opened the door for plaintiffs to use alleged inaccuracies or misrepresentations in a company’s privacy policy and other privacy disclosures as the basis for a federal wiretapping claim under the Electronic Communications Privacy Act (“ECPA”).

Unlike claims under state wiretapping laws such as the California Invasion of Privacy Act (CIPA), ECPA claims can be filed by class action plaintiffs nationwide, and they carry statutory damages of $100 per day of violation or $10,000, whichever is greater. Plaintiffs' firms are increasingly leading with ECPA claims in demand letters and class action complaints.

Companies can take steps to help insulate themselves from litigation by assessing and modifying their privacy policy and other data processing disclosures.

Introduction

Any company that maintains a privacy policy and operates a website using so-called tracking technologies such as pixels, cookies, software development kits, or third-party analytics tools (which is practically every company) should be aware of the real class action risk associated with the federal wiretapping law known as the Electronic Communications Privacy Act (ECPA or Wiretap Act) and its "crime-tort" exception. We have data mined and analyzed thousands of privacy lawsuits using AI to track plaintiff lawyers' allegations and patterns.

Key point: Legislatures in Nebraska (chatbot bill), Maryland (pricing), and Maine (health) passed AI bills last week.

Below is the 13th update on the status of proposed state AI legislation in 2026. These posts track state AI bills that can directly or indirectly affect private-sector AI developers and deployers. These posts do not track AI bills that focus on government use of AI; insurance; workgroups; education; legal settings; name, image, and likeness; deepfakes; CSAM and sexual material; and election interference. As always, the contents provided below are time-sensitive and subject to change.

Key point: Last week, Alabama’s legislature passed a consumer data privacy bill.

Below is the 13th update on the status of proposed state privacy legislation in 2026. This post covers updates on proposed bills dealing with consumer data privacy, children’s privacy, biometric privacy, data brokers, and consumer health data privacy. As always, the contents provided below are time-sensitive and subject to change.

A federal court in Michigan significantly narrowed Michigan Attorney General (AG) Dana Nessel's privacy and consumer protection case against Roku, Inc. (Roku), dismissing all non-Children's Online Privacy Protection Act (COPPA) claims for lack of standing while allowing the state's privacy claims under COPPA to proceed. The decision highlights COPPA's utility as a vehicle for state AGs to bring enforcement actions in federal court, while also underscoring the jurisdictional limits on bringing companion state privacy and consumer protection claims in the same forum.

Key point: Five takeaways from March 2026 decisions: (1) Courts diverge on “purpose” requirement in ECPA’s crime-tort exception; (2) Courts consider ECPA exception outside the health care industry; (3) Contradictory statements in privacy policies can defeat consent even when tracking tech use is disclosed; (4) Courts provide guidance on website design to establish consent; and (5) Three courts allow negligence claims to proceed but nix negligence per se claims.

Welcome to our monthly update on how courts across the U.S. have handled privacy litigation involving website tools such as cookies, pixels, session replay, and similar technologies. In this post, we cover decisions from March 2026.

Key point: Businesses operating generative artificial intelligence systems in Utah and Washington may be subject to new legal obligations, such as including provenance data in content created or altered using generative artificial intelligence.

In March, Utah's Digital Content Provenance Standards Act (HB 276) was signed by Governor Spencer Cox, and Washington's HB 1170 on the regulation of AI-modified content was signed by Governor Bob Ferguson. Both laws impose certain obligations related to provenance data on covered providers that create, code, or otherwise produce a generative artificial intelligence (GenAI) system that has more than 1 million monthly users and is publicly accessible within the geographic boundaries of each state. The Utah and Washington laws largely align with the California AI Transparency Act (CAITA) and AB 853, which obligate creators of GenAI systems, large online platforms, GenAI hosting platforms, and capture device manufacturers to fulfill certain provenance data requirements. The article below provides an overview of the California, Utah, and Washington laws and compares the obligations of covered providers under each law.

Key point: Last week, chatbot bills were signed into law in Oregon and Idaho, while a health care-related AI bill was signed into law in Tennessee.

Below is the 12th update on the status of proposed state AI legislation in 2026. These posts track state AI bills that can directly or indirectly affect private-sector AI developers and deployers. These posts do not track AI bills that focus on government use of AI; insurance; workgroups; education; legal settings; name, image, and likeness; deepfakes; CSAM and sexual material; and election interference. As always, the contents provided below are time-sensitive and subject to change.