Bill Text: IL HB5322 | 2023-2024 | 103rd General Assembly | Introduced
Bill Title: Creates the Illinois Commercial Algorithmic Impact Assessments Act. Defines "algorithmic discrimination", "artificial intelligence", "consequential decision", "deployer", "developer", and other terms. Requires that, by January 1, 2026, and annually thereafter, a deployer of an automated decision tool complete and document an assessment that summarizes the nature and extent of the tool, how it is used, and an assessment of its risks, among other things. Requires that, on or after January 1, 2026, and annually thereafter, developers of automated decision tools complete and document a similar assessment. Provides that, upon the request of the Attorney General, a developer or deployer must provide that Office with any impact assessment it has performed; assessments so provided are confidential and exempt from the Freedom of Information Act. Requires that a developer provide a deployer with a statement regarding the intended uses of the automated decision tool and documentation regarding all of the following: (i) the known limitations of the automated decision tool, including any reasonably foreseeable risks of algorithmic discrimination arising from its intended use; (ii) a description of the types of data used to program or train the automated decision tool; and (iii) a description of how the automated decision tool was evaluated for validity and the ability to be explained before sale or licensing. Exempts a deployer with fewer than 50 employees unless, as of the end of the prior calendar year, the deployer deployed an automated decision tool that affected more than 999 people per year.
Spectrum: Partisan Bill (Democrat 1-0)
Status: (Failed) 2025-01-07 - Session Sine Die

AN ACT concerning business.

Be it enacted by the People of the State of Illinois, represented in the General Assembly:

Section 1. Short title. This Act may be cited as the Illinois Commercial Algorithmic Impact Assessments Act.

Section 5. Definitions. As used in this Act:

"Algorithmic discrimination" means the condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people on the basis of race, color, national origin, citizen or immigration status, families with children, creed, religious belief or affiliation, sex, marital status, the presence of any sensory, mental, or physical disability, age, honorably discharged veteran or military status, sexual orientation, gender expression or gender identity, or any other protected class under Illinois statute.

"Artificial intelligence" means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing a real or virtual environment.

"Automated decision tool" means a system or service that uses artificial intelligence and has been specifically developed and marketed to, or specifically modified to, make, or be a controlling factor in making, consequential decisions.

"Consequential decision" means a decision or judgment that has a legal, material, or similarly significant effect on an individual's life relating to the impact of, access to, or the cost, terms, or availability of, any of the following:
(1) Employment, worker management, or self-employment, including, but not limited to:
    (A) Pay or promotion;
    (B) Hiring or termination; and
    (C) Automated task allocation that automatically limits, segregates, or classifies employees based on individual behavior or performance for the purpose of assigning or determining material terms or conditions of employment.
(2) Education and vocational training, including, but not limited to:
    (A) Assessment, including, but not limited to, detecting student cheating or plagiarism;
    (B) Accreditation;
    (C) Certification;
    (D) Admissions; and
    (E) Financial aid or scholarships.
(3) Housing or lodging, including rental or short-term housing or lodging;
(4) Essential utilities, including electricity, heat, water, internet or telecommunications access, or transportation;
(5) Family planning, including adoption services or reproductive services, as well as assessments related to child protective services;
(6) Health care or health insurance, including mental health care, dental, or vision;
(7) Financial services, including a financial service provided by a mortgage company, mortgage broker, or creditor;
(8) The criminal justice system, including, but not limited to, risk assessments for pretrial hearings, sentencing, and parole;
(9) Legal services, including private arbitration or mediation;
(10) Voting; and
(11) Access to benefits or services or assignment of penalties.

"Deployer" means a person, partnership, State or local government agency, or corporation that uses or modifies an automated decision tool to make a consequential decision.

"Developer" means a person, partnership, State or local government agency, or corporation that designs, codes, or produces an automated decision tool, or substantially modifies an artificial intelligence system or service for the known intended purpose of making, or being a controlling factor in making, consequential decisions, whether for its own use or for use by the deployer.

"Ethical artificial intelligence" means automated decision tools that are developed and deployed with reasonable efforts by the developer and the deployer to:
(1) Minimize unlawful discriminatory or biased outputs or applications;
(2) Ensure that automated decision tools are being operated reliably, safely, and consistently;
(3) Protect the data of natural persons by incorporating robust privacy and data security measures;
(4) Prioritize transparency so that the behavior and functional components of automated decision tools can be understood in order to enable the identification of performance issues, safety and privacy concerns, biases, exclusionary practices, and unintended outcomes; and
(5) Promote individual rights and minimize reasonably foreseeable harm to individuals resulting from use of the automated decision tool.

"Impact assessment" means a documented risk-based evaluation of an automated decision tool that meets the criteria of this Act.

"Sex" includes pregnancy, childbirth, and related conditions, gender identity, intersex status, and sexual orientation.

"Significant update" means a new version, new release, or other update to an automated decision tool that materially changes its principal use, principal intended use, or expected outcome.

Section 10. Assessment required.
(a) By January 1, 2026, and annually thereafter, a deployer of an automated decision tool must complete and document an impact assessment for any automated decision tool the deployer uses that includes all of the following:
(1) A statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts;
(2) A description of the automated decision tool's outputs and how they are used to make, or be a controlling factor in making, a consequential decision;
(3) A summary of the types of data collected from natural persons and processed by the automated decision tool when it is used to make, or be a controlling factor in making, a consequential decision;
(4) A statement of the extent to which the deployer's use of the automated decision tool is consistent with or varies from the statement required of the developer by Section 15 of this Act;
(5) An assessment of the reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision tool known to the deployer at the time of the impact assessment;
(6) A description of the safeguards implemented, or that will be implemented, by the deployer to align use of the automated decision tool with principles of ethical artificial intelligence and to address any reasonably foreseeable risks of algorithmic discrimination arising from the use of the automated decision tool;
(7) A description of how the automated decision tool will be used by a natural person, or monitored when it is used, to make, or be a controlling factor in making, a consequential decision; and
(8) A description of how the automated decision tool has been or will be evaluated for validity or relevance.

(b) By January 1, 2026, and annually thereafter, a developer of an automated decision tool must complete and document an impact assessment of any automated decision tool that it designs, codes, or produces that includes all of the following:
(1) A statement of the purpose of the automated decision tool and its intended benefits, uses, and deployment contexts;
(2) A description of the automated decision tool's outputs and how they are used, as intended, to make, or be a controlling factor in making, a consequential decision;
(3) A summary of the types of data collected from natural persons and processed by the automated decision tool when it is used to make, or be a controlling factor in making, a consequential decision;
(4) An assessment of the reasonably foreseeable risks of algorithmic discrimination arising from the intended use or foreseeable misuse of the automated decision tool;
(5) A description of the measures taken by the developer to incorporate principles of ethical artificial intelligence and to mitigate the risk known to the developer of algorithmic discrimination arising from the use of the automated decision tool; and
(6) A description of how the automated decision tool is intended to be used by a natural person, or monitored when it is used, to make, or be a controlling factor in making, a consequential decision.

(c) A deployer or developer must, in addition to the impact assessments required by subsections (a) and (b) of this Section, perform, as soon as feasible, an impact assessment with respect to any significant update.
(d) Upon the request of the Office of the Attorney General, a developer or deployer must provide any impact assessment that it performed pursuant to this Section to the Office of the Attorney General. If a developer or deployer fails or refuses to provide an impact assessment requested by the Attorney General, the Attorney General may seek injunctive relief, actual damages caused by the failure or refusal, and attorney's fees.
(e) Impact assessments provided pursuant to subsection (d) of this Section are confidential and exempt from disclosure under the Freedom of Information Act.

Section 15. Statements provided to deployers. A developer must provide a deployer with a statement regarding the intended uses of the automated decision tool and documentation regarding all of the following:
(a) The known limitations of the automated decision tool, including any reasonably foreseeable risks of algorithmic discrimination arising from its intended use;
(b) A description of the types of data used to program or train the automated decision tool; and
(c) A description of how the automated decision tool was evaluated for validity and the ability to be explained before sale or licensing.

Section 20. Developer policies. A developer must make publicly available, in a readily accessible manner, a clear policy that provides a summary of both of the following:
(a) The types of automated decision tools currently made available to others by the developer; and
(b) How the developer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of the automated decision tools it currently makes available to others.
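
The small-deployer exemption summarized in the bill title above reduces to a simple threshold test: a deployer with fewer than 50 employees is exempt unless, as of the end of the prior calendar year, it deployed an automated decision tool that affected more than 999 people per year. A minimal sketch of that test follows; the class and field names (Deployer, employee_count, people_affected_prior_year) are illustrative assumptions, not statutory terms.

from dataclasses import dataclass


@dataclass
class Deployer:
    employee_count: int                  # deployer's head count
    people_affected_prior_year: int      # people affected per year by its tool(s),
                                         # as of the end of the prior calendar year


def is_exempt(d: Deployer) -> bool:
    # Exempt only if BOTH conditions hold: fewer than 50 employees AND no
    # tool affecting more than 999 people per year (999 exactly still exempt).
    return d.employee_count < 50 and d.people_affected_prior_year <= 999


# Example: a 10-person firm whose tool affected 1,200 people is not exempt.
print(is_exempt(Deployer(employee_count=10, people_affected_prior_year=1200)))  # False

Note the boundary implied by "more than 999": a tool affecting exactly 999 people per year leaves the exemption intact, while 1,000 or more removes it.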
|
| |||||||
| |||||||