Bill Text: CA SB287 | 2023-2024 | Regular Session | Introduced

NOTE: There are more recent revisions of this legislation.
Bill Title: Features that harm child users: civil penalty.

Spectrum: Partisan Bill (Democrat 1-0)

Status: (Failed) 2024-02-01 - Died on file pursuant to Joint Rule 56.



CALIFORNIA LEGISLATURE— 2023–2024 REGULAR SESSION

Senate Bill
No. 287


Introduced by Senator Skinner

February 02, 2023


An act to add Section 1714.48 to the Civil Code, relating to social media platforms.


LEGISLATIVE COUNSEL'S DIGEST


SB 287, as introduced, Skinner. Features that harm child users: civil penalty.
Existing law, the California Consumer Privacy Act of 2018, prohibits a business from selling the personal information of a consumer if the business has actual knowledge that the consumer is less than 16 years of age, unless the consumer, in the case of a consumer at least 13 years of age and less than 16 years of age, or the consumer’s parent or guardian, in the case of a consumer who is less than 13 years of age, has affirmatively authorized the sale of the consumer’s personal information.
Existing law, the California Age-Appropriate Design Code Act, requires, beginning July 1, 2024, a business that provides an online service, product, or feature likely to be accessed by children to comply with specified requirements, including a requirement to configure all default privacy settings offered by the online service, product, or feature to the settings that offer a high level of privacy, as prescribed, and requires a business, before any new online services, products, or features are offered to the public, to complete a Data Protection Impact Assessment for any online service, product, or feature likely to be accessed by children and maintain documentation of this assessment as long as the online service, product, or feature is likely to be accessed by children.
This bill would prohibit a social media platform, as defined, from using a design, algorithm, or feature that the platform knows, or which by the exercise of reasonable care should have known, causes child users to do any of certain things, including experience addiction to the social media platform.
This bill would provide that a social media platform is not in violation of the bill if the social media platform instituted and maintained a program of at least quarterly audits, as defined, of its designs, algorithms, and features to detect designs, algorithms, or features that have the potential to cause or contribute to violations of the provision described above, and the social media platform corrected, within 30 days of the completion of the audit, any design, algorithm, or feature discovered by the audit to present more than a de minimis risk of violating the provision described above.
This bill would subject a social media platform that knowingly and willfully violates these provisions to a civil penalty not to exceed $250,000 per violation, an injunction, and an award of litigation costs and attorney’s fees.
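To make the mechanics of the safe harbor and the penalty cap concrete, the following sketch (Python) encodes one plausible reading of them: audits at least quarterly, corrections within 30 days of an audit's completion, and a ceiling of $250,000 per violation. The function names, the record layout, and the 92-day reading of "quarterly" are assumptions made for illustration, not anything the bill specifies.

```python
from datetime import date

# Hypothetical audit records for a platform; the field names are illustrative
# assumptions, not anything defined by the bill.
audits = [
    {"completed": date(2024, 1, 15), "corrections_done": date(2024, 2, 10)},
    {"completed": date(2024, 4, 12), "corrections_done": date(2024, 5, 5)},
]

def qualifies_for_safe_harbor(audit_records, as_of):
    """Rough reading of the digest's safe harbor: a maintained program of at
    least quarterly audits, with any design, algorithm, or feature found to
    pose more than a de minimis risk corrected within 30 days of the audit's
    completion. A sketch only, not a legal test."""
    if not audit_records:
        return False
    ordered = sorted(audit_records, key=lambda a: a["completed"])
    previous = None
    for a in ordered:
        # "At least quarterly" is read here, loosely, as no gap over ~92 days.
        if previous is not None and (a["completed"] - previous).days > 92:
            return False
        # Corrections must be completed within 30 days of the audit.
        if (a["corrections_done"] - a["completed"]).days > 30:
            return False
        previous = a["completed"]
    return (as_of - previous).days <= 92

PENALTY_CAP_PER_VIOLATION = 250_000  # "not to exceed $250,000 per violation"

def max_penalty_exposure(violation_count):
    # Upper bound only; an actual penalty would be set by a court.
    return violation_count * PENALTY_CAP_PER_VIOLATION

print(qualifies_for_safe_harbor(audits, as_of=date(2024, 6, 1)))  # True
print(max_penalty_exposure(3))                                    # 750000
```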
Vote: MAJORITY   Appropriation: NO   Fiscal Committee: NO   Local Program: NO  

The people of the State of California do enact as follows:


SECTION 1. Section 1714.48 is added to the Civil Code, immediately following Section 1714.45, to read:

1714.48. (a) A social media platform shall not use a design, algorithm, or feature that the platform knows, or which by the exercise of reasonable care should have known, causes child users to do any of the following:
(1) Receive content or messages that facilitate the purchase of fentanyl.
(2) Inflict harm on themselves or others.
(3) Develop an eating disorder.
(4) Receive content or messages that facilitate suicide by offering information on how to die by suicide.
(5) Receive content or messages offering diet pills, diet products, or ways to reduce eating, purge food that has been eaten, or lose weight.
(6) Experience addiction to the social media platform.
(b) A social media platform is not in violation of this section if both of the following are true:
(1) The social media platform instituted and maintained a program of at least quarterly audits of its designs, algorithms, and features to detect designs, algorithms, or features that have the potential to cause or contribute to violations of subdivision (a).
(2) The social media platform corrected, within 30 days of the completion of an audit described in paragraph (1), any design, algorithm, or feature discovered by the audit to present more than a de minimis risk of violating subdivision (a).
(c) A social media platform that has knowingly and willfully violated subdivision (a) shall be liable for a civil penalty not to exceed two hundred fifty thousand dollars ($250,000) per violation, an injunction, and an award of litigation costs and attorney’s fees.
(d) This section shall not be construed to impose liability on a social media platform for any of the following:
(1) Content that is generated by a user of the service or uploaded to or shared on the service by a user of the service.
(2) Passively displaying content that is created entirely by third parties.
(3) Information or content for which the social media platform was not, in whole or in part, responsible for creating or developing.
(4) Conduct by a social media platform involving child users that would otherwise be protected by Section 230 of Title 47 of the United States Code.
(e) An action to enforce a cause of action pursuant to this section shall be commenced within four years after the cause of action accrued.
(f) For purposes of this section:
(1) “Addiction” means a use of one or more social media platforms that does both of the following:
(A) Indicates preoccupation or obsession with, or withdrawal or difficulty to cease or reduce use of, a social media platform despite the user’s desire to cease or reduce that use.
(B) Causes physical, mental, emotional, developmental, or material harms to the user.
(2) “Audit” means a good faith, written, systemic review or appraisal by a social media company that provides reasonable assurance of monitoring compliance with this section that meets both of the following criteria:
(A) The review or appraisal describes and analyzes each of the social media platform’s current and forthcoming designs, algorithms, and features that have the potential to cause or contribute to the addiction of child users.
(B) The review or appraisal includes any plans to change designs, algorithms, and features that pose more than a de minimis risk of violating subdivision (a).
(3) “Child user” means a person who uses a social media platform and is younger than 18 years of age.
(4) (A) “Content” means statements or comments made by users and media that are created, posted, shared, or otherwise interacted with by users on an internet-based service or application.
(B) “Content” does not include media put on a service or application exclusively for the purpose of cloud storage, transmitting files, or file collaboration.
(5) “Eating disorder” means a behavioral condition characterized by a severe and persistent disturbance in eating behaviors and associated distressing thoughts and emotions, including anorexia nervosa, bulimia nervosa, and avoidant restrictive food intake disorder.
(6) “Public or semipublic internet-based service or application” excludes a service or application used to facilitate communication within a business or enterprise among employees or affiliates of the business or enterprise, provided that access to the service or application is restricted to employees or affiliates of the business or enterprise using the service or application.
(7) “Social media platform” means a public or semipublic internet-based service or application that has users in California and that meets both of the following criteria:
(A) (i) A substantial function of the service or application is to connect users in order to allow users to interact socially with each other within the service or application.
(ii) A service or application that provides email or direct messaging services shall not be considered to meet the criterion described in clause (i) on the basis of that function alone.
(B) The service or application allows users to do all of the following:
(i) Construct a public or semipublic profile for purposes of signing into and using the service.
(ii) Populate a list of other users with whom an individual shares a social connection within the system.
(iii) Create or post content viewable by other users, including, but not limited to, on message boards, in chat rooms, or through a landing page or main feed that presents the user with content generated by other users.
(8) “Suicidal” means likely to die by suicide and includes major depressive disorder with suicidal ideation.
(g) This section does not apply to a social media platform that is controlled by a business entity that generated less than one hundred million dollars ($100,000,000) in gross revenue during the preceding calendar year.
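Read together, the coverage rules in this section reduce to a handful of threshold tests. The sketch below (Python) strings them together for illustration only; every field and function name is an assumption invented for the example, and questions the bill leaves to interpretation (for instance, what counts as a "substantial function") are reduced to flags.

```python
from dataclasses import dataclass

@dataclass
class Service:
    # Illustrative fields only; the bill defines no data model.
    has_california_users: bool
    social_connection_is_substantial_function: bool
    email_or_dm_only: bool
    users_can_build_profiles: bool
    users_can_list_connections: bool
    users_can_post_viewable_content: bool
    controlling_entity_prior_year_revenue: float
    internal_enterprise_tool_only: bool

def is_covered_platform(s: Service) -> bool:
    """Composite of paragraph (6), paragraph (7), and subdivision (g)."""
    if s.internal_enterprise_tool_only:        # para. (6): enterprise carve-out
        return False
    if not s.has_california_users:             # para. (7): users in California
        return False
    # Para. (7)(A): social connection must be a substantial function, and
    # email or direct messaging alone does not satisfy that clause.
    if not s.social_connection_is_substantial_function or s.email_or_dm_only:
        return False
    # Para. (7)(B): all three user capabilities are required.
    if not (s.users_can_build_profiles
            and s.users_can_list_connections
            and s.users_can_post_viewable_content):
        return False
    # Subd. (g): platforms controlled by an entity with under $100,000,000 in
    # prior-year gross revenue are outside the section.
    return s.controlling_entity_prior_year_revenue >= 100_000_000

def is_child_user(age_years: int) -> bool:
    return age_years < 18                      # para. (3): younger than 18
```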
