Bill Text: CA AB3211 | 2023-2024 | Regular Session | Amended
Bill Title: California Digital Content Provenance Standards.
Spectrum: Partisan Bill (Democrat 1-0)
Status: (Engrossed - Dead) 2024-08-31 - Ordered to inactive file at the request of Senator Gonzalez. [AB3211 Detail]
Amended in Senate June 10, 2024
Amended in Assembly April 18, 2024
Amended in Assembly March 21, 2024
Introduced by Assembly Member Wicks, February 16, 2024
LEGISLATIVE COUNSEL'S DIGEST
Digest Key
Vote: MAJORITY   Appropriation: NO   Fiscal Committee: YES   Local Program: NO

Bill Text
The people of the State of California do enact as follows:
SECTION 1.
The Legislature finds and declares all of the following:

SEC. 2.
Chapter 41 (commencing with Section 22949.90) is added to Division 8 of the Business and Professions Code, to read:

CHAPTER 41. California Provenance, Authenticity, and Watermarking Standards
22949.90.
For purposes of this chapter, the following definitions apply:

(c) "Authentic content" means images, videos, audio, or text created by human beings without any modifications or with only minor modifications that do not lead to significant changes to the perceived contents or meaning of the content. Minor modifications include, but are not limited to, changes to brightness or contrast of images, removal of background noise in audio, and spelling or grammar corrections in text.
(d)
(i) "Inauthentic content" means synthetic content that is so similar to authentic content that it could be mistaken as authentic.
22949.90.1.
(a) A generative AI provider shall do all of the following:
(2) Develop downloadable watermark decoders that allow a user to determine whether a piece of content was created with the provider's system, and make those tools available to the public.
(b) A generative AI provider may continue to make available a generative AI system that was made available before the date upon which this act takes effect and that does not have watermarking capabilities as described by paragraph (1) of subdivision (a), if either of the following conditions is met:
(1) The provider is able to retroactively create and make publicly available a decoder that determines, with at least 99 percent accuracy as measured by an independent auditor, whether a given piece of content was produced by the provider's system.
(2) The provider conducts and publishes research to definitively demonstrate that the system is not capable of producing inauthentic content.
(4) The requirements under this subdivision shall not apply to conversational AI systems that do not produce inauthentic content.
(a) For purposes of this section, the following definitions apply:

(1) "Authenticity watermark" means a watermark of authentic content that includes the name of the device manufacturer.

(2) "Camera and recording device manufacturer" means the maker of a device that can record photographic, audio, or video content, including, but not limited to, video and still photography cameras, mobile phones with built-in cameras or microphones, and voice recorders.

(3) "Provenance watermark" means a watermark of authentic content that includes details about the content, including, but not limited to, the time and date of production, the name of the user, details about the device, and a digital signature.
(b) (1) Beginning

22949.90.2.
(a) (1) Beginning January 1, 2026, newly manufactured

(3) Authenticity watermarks shall be turned on by default, while provenance watermarks shall be turned off by default.
(4) Newly manufactured digital cameras and recording devices subject to the requirements of this subdivision shall clearly inform a user of the existence of the authenticity and provenance watermark settings upon the user's first use of the camera or the recording function on the recording device.
(A)
(B) A newly manufactured digital camera or recording device shall allow the user to adjust the watermark settings.
22949.90.3.
(a) Beginning March 1, 2025, a large online platform shall use labels to

(d) (1) A large online platform shall require a user that uploads or distributes content on its platform to disclose whether the content is synthetic content.