CHAPTER 25. Automated Decision Tools
22756.
As used in this chapter:
(a) “Algorithmic discrimination” means the condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people based on their actual or perceived race, color, ethnicity, sex, religion, age, national origin, limited English proficiency, disability, veteran status, genetic information, reproductive health, or any other classification protected by state law.
(b) “Artificial intelligence” means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.
(c) “Automated decision tool” means a system or service that uses artificial intelligence and has been specifically developed to, or specifically modified to, make, or be a substantial factor in making, consequential decisions.
(d) “Consequential decision” means a decision or judgment that has a legal, material, or similarly significant effect on an individual’s life relating to access to government benefits or services, assignments of penalties by government, or the impact of, or the cost, terms, or availability of, any of the following:
(1) Employment with respect to all of the following:
(A) Pay or promotion.
(B) Hiring or termination.
(C) Automated task allocation that limits, segregates, or classifies employees for the purpose of assigning or determining material terms or conditions of employment.
(2) Education and vocational training as it relates to all of the following:
(A) Assessment.
(B) Detecting student cheating or plagiarism.
(C) Accreditation.
(D) Certification.
(E) Admissions.
(F) Financial aid or scholarships.
(3) Housing or lodging, including rental or short-term housing or lodging.
(4) All of the following essential utilities:
(A) Electricity.
(B) Heat.
(C) Water.
(D) Internet or telecommunications access.
(E) Transportation.
(5) Family planning.
(6) Adoption services, reproductive services, or assessments related to child protective services.
(7) Health care or health insurance, including mental health care, dental, or vision.
(8) Financial services, including a financial service provided by a mortgage company, mortgage broker, or creditor.
(9) All of the following aspects of the criminal justice system:
(A) Risk assessments for pretrial hearings.
(B) Sentencing.
(C) Parole.
(10) Legal services.
(11) Private arbitration.
(12) Mediation.
(13) Voting.
(e) “Deployer” means a person, partnership, local government agency, developer, or corporation that uses an automated decision tool to make a consequential decision.
(f) “Developer” means a person, partnership, state or local government agency, or corporation that designs, codes, or produces an automated decision tool, or substantially modifies an artificial intelligence system or service for the intended purpose of making, or being a substantial factor in making, consequential decisions, whether for its own use or for use by a third party.
(g) “Impact assessment” means a documented risk-based evaluation of an automated decision tool that meets the criteria of Section 22756.1.
(h) “Sex” includes pregnancy, childbirth, and related conditions, gender identity, intersex status, and sexual orientation.
(i) “Significant update” means a new version, new release, or other update to an automated decision tool that materially changes its principal use, principal intended use, or outcome.
(j) “State government deployer” means a state government agency that uses an automated decision tool to make a consequential decision.
22756.1.
(a) (1) Subject to paragraph (2), a deployer shall perform an impact assessment on any automated decision tool before first using it and annually thereafter.
(2) (A) With respect to an automated decision tool that a deployer first used before January 1, 2025, the deployer shall perform an impact assessment on that automated decision tool before January 1, 2026, and annually thereafter.
(B) This subdivision does not require a deployer to perform an impact assessment on an automated decision tool before using it if all of the following are true:
(i) The deployer uses the automated decision tool only for the intended use determined by the developer of the automated decision tool.
(ii) The deployer does not make any significant updates to the automated decision tool.
(iii) The developer of the automated decision tool has performed any impact assessment on the automated decision tool required by subdivision (c).
(b) A deployer shall ensure that an impact assessment prepared pursuant to subdivision (a) includes all of the following:
(1) A statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts.
(2) A description of the automated decision tool’s outputs and how they are used to make, or be a substantial factor in making, a consequential decision.
(3) A summary of the categories of information collected from natural persons and processed by the automated decision tool when it is used to make, or be a substantial factor in making, a consequential decision, including, but not limited to, all of the following:
(A) Each category of personal information identified by reference to the applicable subparagraph enumerated under paragraph (1) of subdivision (v) of Section 1798.140 of the Civil Code.
(B) Each category of sensitive personal information identified by reference to the applicable paragraph and subparagraph enumerated under subdivision (ae) of Section 1798.140 of the Civil Code.
(C) Each category of information related to a natural person’s receipt of sensitive services, as defined in Section 56.05 of the Civil Code, identified by reference to the specific category of sensitive service enumerated in the definition.
(4) A statement of the extent to which the deployer’s use of the automated decision tool is consistent with or varies from the statement required of the developer by Section 22756.3.
(5) An analysis of potential adverse impacts on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information from the deployer’s use of the automated decision tool.
(6) A description of the safeguards implemented, or that will be implemented, by the deployer to address any reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision tool known to the deployer at the time of the impact assessment.
(7) A description of how the automated decision tool will be used by a natural person, or monitored when it is used, to make, or be a substantial factor in making, a consequential decision.
(8) A description of how the automated decision tool has been or will be evaluated for validity or relevance.
(c) (1) Subject to paragraph (2), a developer shall perform an impact assessment on any automated decision tool that it designs, codes, or produces before making it available to potential deployers, and annually thereafter.
(2) With respect to an automated decision tool that a developer first made available to potential deployers before January 1, 2025, the developer shall perform an impact assessment on the automated decision tool before January 1, 2026, and annually thereafter.
(d) A developer shall ensure that an impact assessment prepared pursuant to subdivision (c) includes all of the following:
(1) A statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts.
(2) A description of the automated decision tool’s outputs and how they are used to make, or be a substantial factor in making, a consequential decision.
(3) A summary of the categories of information collected from natural persons and processed by the automated decision tool when it is used to make, or be a substantial factor in making, a consequential decision, including, but not limited to, all of the following:
(A) Each category of personal information identified by reference to the applicable subparagraph enumerated under paragraph (1) of subdivision (v) of Section 1798.140 of the Civil Code.
(B) Each category of sensitive personal information identified by reference to the applicable paragraph and subparagraph enumerated under subdivision (ae) of Section 1798.140 of the Civil Code.
(C) Each category of information related to a natural person’s receipt of sensitive services, as defined in Section 56.05 of the Civil Code, identified by reference to the specific category of sensitive service enumerated in the definition.
(4) An analysis of a potential adverse impact on the basis of sex, race, color, ethnicity, religion, age, national origin, limited English proficiency, disability, veteran status, or genetic information from the deployer’s use of the automated decision tool.
(5) A description of the measures taken by the developer to mitigate the risk known to the developer of algorithmic discrimination arising from the use of the automated decision tool.
(6) A description of how the automated decision tool can be used by a natural person, or monitored when it is used, to make, or be a substantial factor in making, a consequential decision.
(e) A deployer or developer shall, in addition to the impact assessment required by subdivisions (a) and (c), perform, as soon as feasible, an impact assessment with respect to any significant update.
(f) This section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.
22756.2.
(a) (1) A deployer shall, at or before the time an automated decision tool is used to make a consequential decision, notify any natural person that is the subject of the consequential decision that an automated decision tool is being used.
(2) A deployer shall provide to a natural person notified pursuant to this subdivision all of the following:
(A) A statement of the purpose of the automated decision tool.
(B) Contact information for the deployer.
(C) A plain language description of the automated decision tool that includes a description of any human components and how any automated component is used to inform a consequential decision.
(D) Information sufficient to enable the natural person to request to be subject to an alternative selection process or accommodation, as applicable, in lieu of the automated decision tool, as provided in subdivision (b).
(b) (1) If a consequential decision is made solely based on the output of an automated decision tool, a deployer shall, if technically feasible, accommodate a natural person’s request to not be subject to the automated decision tool and to be subject to an alternative selection process or accommodation.
(2) After a request pursuant to paragraph (1), a deployer may reasonably request, collect, and process information from a natural person for the purposes of identifying the person and the associated consequential decision. If the person does not provide that information, the deployer shall not be obligated to provide an alternative selection process or accommodation.
22756.3.
(a) A developer shall provide a deployer with a statement regarding the intended uses of the automated decision tool and documentation regarding all of the following:
(1) The known limitations of the automated decision tool, including any reasonably foreseeable risks of algorithmic discrimination arising from its intended use.
(2) A description of the type of data used to program or train the automated decision tool.
(3) A description of how the automated decision tool was evaluated for validity and explainability before sale or licensing.
(4) A description of the deployer’s responsibilities under this chapter.
(b) This section does not require the disclosure of trade secrets, as defined in Section 3426.1 of the Civil Code.
22756.4.
(a) (1) A deployer or developer shall establish, document, implement, and maintain a governance program that contains reasonable administrative and technical safeguards to map, measure, manage, and govern the reasonably foreseeable risks of algorithmic discrimination associated with the use or intended use of an automated decision tool.
(2) The safeguards required by this subdivision shall be appropriate to all of the following:
(A) The use or intended use of the automated decision tool.
(B) The deployer’s or developer’s role as a deployer or developer.
(C) The size, complexity, and resources of the deployer or developer.
(D) The nature, context, and scope of the activities of the deployer or developer in connection with the automated decision tool.
(E) The technical feasibility and cost of available tools, assessments, and other means used by a deployer or developer to map, measure, manage, and govern the risks associated with an automated decision tool.
(b) The governance program required by this section shall be designed to do all of the following:
(1) (A) Designate at least one employee to be responsible for overseeing and maintaining the governance program and compliance with this chapter.
(B) (i) An employee designated pursuant to this paragraph shall have the authority to assert to the employee’s employer a good faith belief that the design, production, or use of an automated decision tool fails to comply with the requirements of this chapter.
(ii) An employer of an employee designated pursuant to this paragraph shall conduct a prompt and complete assessment of any compliance issue raised by that employee.
(2) Identify and implement safeguards to address reasonably foreseeable risks of algorithmic discrimination resulting from the use or intended use of an automated decision tool.
(3) If established by a deployer, provide for the performance of impact assessments as required by Section 22756.1.
(4) If established by a developer, provide for compliance with Sections 22756.2 and 22756.3.
(5) Conduct an annual and comprehensive review of policies, practices, and procedures to ensure compliance with this chapter.
(6) Maintain for five years after completion the results of an impact assessment.
(7) Evaluate and make reasonable adjustments to administrative and technical safeguards in light of material changes in technology, the risks associated with the automated decision tool, the state of technical standards, and changes in business arrangements or operations of the deployer or developer.
(c) This section does not apply to a deployer with fewer than 25 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that impacted more than 999 people per year.
22756.5.
A deployer or developer shall make publicly available, in a readily accessible manner, a clear policy that provides a summary of both of the following:
(a) The types of automated decision tools currently in use or made available to others by the deployer or developer.
(b) How the deployer or developer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of the automated decision tools it currently uses or makes available to others.
22756.6.
(a) A deployer shall not use an automated decision tool that results in algorithmic discrimination.
(b) A developer shall not make available to potential deployers an automated decision tool that results in algorithmic discrimination.
22756.7.
(a) A state government deployer shall, by January 1, 2026, provide the Civil Rights Department a list of automated decision tools initially deployed prior to January 1, 2025, by the state government deployer. The list shall identify all of the following:
(1) Each automated decision tool deployed by the state government deployer.
(2) The role of each automated decision tool in making consequential decisions.
(3) The population affected by each automated decision tool.
(b) The Civil Rights Department shall, by January 1, 2027, establish a staggered schedule that identifies when each state government deployer shall comply with Sections 22756.1, 22756.2, 22756.3, 22756.4, 22756.5, and 22756.6 for each automated decision tool deployed by the state government deployer.
(1) The schedule established by the department shall prioritize compliance for automated decision tools used by state government deployers with the highest risk for adverse impacts, including civil rights violations and other discriminatory outcomes.
(2) The schedule established by the department shall require full compliance by each state government deployer by January 1, 2031.
(3) A state government deployer that fails to comply with the requirements of this chapter as specified in the schedule established by the department may be subject to enforcement by the department, as authorized in Sections 22756.8 and 22756.9.
(c) The Civil Rights Department may adopt rules and regulations pursuant to the rulemaking provisions of the Administrative Procedure Act (Chapter 3.5 (commencing with Section 11340) of Part 1 of Division 3 of Title 2 of the Government Code) as necessary to effectuate the intent of this section.
22756.8.
(a) The Civil Rights Department may investigate a report of algorithmic discrimination or any other violation of this chapter.
(b) (1) Upon the request of the Civil Rights Department, a deployer or a developer shall, within seven days of the request, provide any impact assessment that it performed pursuant to this chapter to the Civil Rights Department.
(2) The disclosure of an impact assessment pursuant to this subdivision does not constitute a waiver of any attorney-client privilege or work-product protection that might otherwise exist with respect to the impact assessment and any information contained in the impact assessment.
(3) (A) A trade secret, as defined in Section 3426.1 of the Civil Code, contained in an impact assessment disclosed to the Civil Rights Department pursuant to this subdivision is exempt from the California Public Records Act (Division 10 (commencing with Section 7920.000) of Title 1 of the Government Code).
(B) In complying with a request pursuant to the California Public Records Act (Division 10 (commencing with Section 7920.000) of Title 1 of the Government Code), the Civil Rights Department, or a public prosecutor with whom an impact assessment was shared pursuant to subdivision (c), shall redact any trade secret, as defined in Section 3426.1 of the Civil Code, from the impact assessment.
(4) A deployer or developer who violates this subdivision shall be liable for an administrative fine of not more than ten thousand dollars ($10,000) per violation in an administrative enforcement action brought by the Civil Rights Department.
(5) Each day on which an automated decision tool is used for which an impact assessment has not been submitted pursuant to this subdivision shall give rise to a distinct violation of this subdivision.
(c) The Civil Rights Department may provide an impact assessment it receives to a public prosecutor listed in subdivision (a) of Section 22756.9 to assist that public prosecutor in initiating or litigating a civil action under that section.
(d) The Civil Rights Department may adopt rules and regulations pursuant to the rulemaking provisions of the Administrative Procedure Act (Chapter 3.5 (commencing with Section 11340) of Part 1 of Division 3 of Title 2 of the Government Code) as necessary to effectuate the intent of this section.
22756.9.
(a) (1) Any of the following public entities may bring a civil action against a deployer or developer for a violation of this chapter:
(A) The Attorney General in the name of the people of the State of California.
(B) A district attorney, county counsel, or city attorney for the jurisdiction in which the violation occurred.
(C) A city prosecutor in any city having a full-time city prosecutor, with the consent of the district attorney.
(D) The Civil Rights Department.
(2) A court may award in an action brought pursuant to this subdivision all of the following:
(A) Injunctive relief.
(B) Declaratory relief.
(C) Reasonable attorney’s fees and litigation costs.
(D) Only in an action for a violation involving algorithmic discrimination, a civil penalty of twenty-five thousand dollars ($25,000) per violation.
(b) (1) A public attorney, or the Civil Rights Department, before commencing an action pursuant to this section for injunctive relief, shall provide 45 days’ written notice to a deployer or developer of the alleged violations of this chapter.
(2) (A) The developer or deployer may cure, within 45 days of receiving the written notice described in paragraph (1), the noticed violation and provide the person who gave the notice an express written statement, made under penalty of perjury, that the violation has been cured.
(B) If the developer or deployer cures the noticed violation and provides the express written statement pursuant to subparagraph (A), a claim for injunctive relief shall not be maintained for the noticed violation.
22756.10.
It shall be unlawful for a deployer, state government deployer, or developer to retaliate against a natural person for that person’s exercise of rights provided for under this chapter.
22756.11.
This chapter does not apply to cybersecurity-related technology.
22756.12.
The rights, remedies, and penalties established by this chapter are cumulative and shall not be construed to supersede the rights, remedies, or penalties established under other laws, including, but not limited to, Chapter 6 (commencing with Section 12940) of Part 2.8 of Division 3 of Title 2 of the Government Code and Section 51 of the Civil Code.