Key point: If enacted, the bill will require GenAI systems to provide a conspicuous warning that GenAI outputs may be inaccurate.

On March 9, 2026, the New York legislature passed A 3411, which requires generative artificial intelligence (GenAI) systems to notify users that the system’s outputs may be inaccurate. The bill will next move to Governor Kathy Hochul for consideration. If it becomes law, the bill will go into effect 90 days from enactment. The bill is short (it contains only 30 lines of text) but has broad implications.

In the article below, we provide background on the bill, an overview of its requirements, and potential implications should it become law.

Background

The bill is sponsored by Senator Kristen Gonzalez in the Senate (S 934) and Assemblymember Clyde Vanel in the Assembly (A 3411). Gonzalez has become a prominent voice on New York technology legislation; for example, over the past two years she spearheaded the New York Legislative Oversight of Automated Decision-making in Government (LOADinG) Act, among many other bills.

The concept of New York lawmakers requiring companies to provide warning or disclosure labels may seem familiar. Last year, the New York legislature passed the Algorithmic Pricing Disclosure Act, which requires any entity that “sets the price of a specific good or service using personalized algorithmic pricing, and that directly or indirectly, advertises, promotes, labels or publishes a statement, display, image, offer or announcement of personalized algorithmic pricing to a consumer in New York, using personal data specific to such consumer” to “include with such statement, display, image, offer or announcement, a clear and conspicuous disclosure that states: ‘This price was set by an algorithm using your personal data.’” A federal district court upheld that bill as constitutional, and the ruling is currently on appeal.

At least one trade association has similarly claimed that the current bill is unconstitutional.

Applicability

The bill applies to GenAI systems, defined as a “class of artificial intelligence models that are self-supervised and emulate the structure and characteristics of input data to generate derived synthetic content, including, but not limited to, images, videos, audio, text, and other digital content.”

As originally introduced, the bill defined a GenAI system more broadly as “any artificial intelligence system whose primary function is to generate content, which can take the form of code, text, images, and more.”

The bill defines AI to mean “a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments, and that uses machine- and human-based inputs to perceive real and virtual environments, abstract such perceptions into models through analysis in an automated manner, and use model inference to formulate options for information or action.” The bill goes on to state that the definition “includes but is not limited to systems that use machine learning, large language model, natural language processing, and computer vision technologies, including generative AI.”

The same trade association mentioned above has criticized the definition as overbroad, sweeping in many types of tools that have been widely commercially available for decades. The bill appears to apply to both consumer-facing and internal-facing uses.

Warning Requirement

The bill provides that “the owner, licensee or operator of a generative artificial intelligence system shall clearly and conspicuously display a notice on the system’s user interface that the outputs of the generative artificial intelligence system may be inaccurate.”

As originally introduced, the bill required warnings to state that outputs may be "inaccurate and/or inappropriate"; a later draft removed the "inappropriate" term. The original draft also required the warning to "consistently apprise the user," but that requirement was removed as well.

Notably, the bill does not explain what constitutes a clear and conspicuous display, perhaps in recognition of its broad applicability. In contrast, last year's Algorithmic Pricing Disclosure Act defined this phrase as "disclosure in the same medium as, and provided on, at, or near and contemporaneous with every advertisement, display, image, offer or announcement of a price for which notice is required, using lettering and wording that is easily visible and understandable to the average consumer."

Enforcement

If an owner, licensee, or operator of a GenAI system fails to provide the notice, it is subject to a $1,000 fine per violation, and each user who does not receive the required notice constitutes a separate violation. As originally introduced, the bill set a much higher civil penalty of $25,000 per violation.

Effective Date

The bill would go into effect 90 days after it becomes law.