Bill Text: CA SB1444 | 2023-2024 | Regular Session | Amended


Bill Title: Let Parents Choose Protection Act of 2024.

Spectrum: Partisan Bill (Democrat 1-0)

Status: (Introduced) 2024-05-16 - May 16 hearing: Held in committee and under submission. [SB1444 Detail]

Download: California-2023-SB1444-Amended.html

Amended  IN  Senate  April 25, 2024

CALIFORNIA LEGISLATURE— 2023–2024 REGULAR SESSION

Senate Bill
No. 1444


Introduced by Senator Stern

February 16, 2024


An act to add Chapter 22.2.8 (commencing with Section 22588.5) to Division 8 of the Business and Professions Code, relating to privacy.


LEGISLATIVE COUNSEL'S DIGEST


SB 1444, as amended, Stern. Let Parents Choose Protection Act of 2024.
Existing law establishes various online privacy rights for minors, including prohibiting the operator of an internet website, online service, online application, or mobile application from marketing or advertising specified types of products or services to a minor, and requires an operator to permit a registered user who is a minor to remove content or information posted.
This bill, beginning July 1, 2025, would require large social media platform providers, as defined, to create, maintain, and make available to specified third-party safety software providers a set of third-party-accessible application programming interfaces to allow a third-party safety software provider, upon authorization by a child or a parent or legal guardian of a child, to manage a child’s online interactions, content, and account settings and initiate secure transfers of the child’s user data for these purposes, as provided. The bill would prohibit the third-party safety software provider from disclosing user data unless specified exceptions apply, and would authorize the child or the parent or legal guardian, as applicable, to revoke the authorization with the third-party safety software provider or disable the account with the large social media platform provider.
The bill would require the third-party safety software provider to register with the Attorney General’s office as a condition of accessing an application programming interface from a large social media platform provider and would require the Attorney General to affirm that the third-party safety software provider meets specified requirements, including that it is solely engaged in the business of internet safety. The bill would also require a large social media platform to register with the Attorney General’s office within 30 days of meeting specified requirements, including that it enables a child to share images, text, or video through the internet with other users of the service, as provided, and has more than 100,000,000 monthly global active users or generates more than $1,000,000,000 in gross revenue per year, as provided. The bill would authorize the Attorney General to deregister or issue a civil penalty not to exceed $5,000 per violation to a third-party safety software provider if specified conditions occur. The bill would require the Attorney General to post both registration lists on its internet website, and to establish processes to deregister third-party safety software providers and large social media platform providers if certain criteria are met.
The bill would provide that a large social media platform provider is not liable for damages arising out of the transfer of user data to a third-party safety software provider in accordance with these provisions if the large social media platform provider has in good faith complied with specified requirements.
The bill would require the Department of Technology, before July 1, 2025, to issue guidance for large social media platform providers and third-party safety software providers regarding the implementation and maintenance of technical standards to protect user data, as specified, and would require the Department of Technology to biennially update that guidance.
This bill would require a third-party safety software provider receiving any data pursuant to these provisions to at least annually enlist a qualified independent auditing firm to audit its privacy, security, and legal compliance, as provided. The bill would require the auditor to provide the audit findings, including a summary of those findings, directly to the Attorney General and the third-party safety software provider. The bill would authorize the Attorney General to deregister, suspend, or issue a civil penalty to a third-party safety software provider based on an auditor’s findings of willful or grossly negligent conduct or a third-party safety software provider’s negligent failure to respond to an unusual finding, as specified.
The California Privacy Rights Act of 2020 authorizes the Legislature to amend the act to further the purposes and intent of the act by a majority vote of both houses of the Legislature, as specified.
This bill would declare that its provisions further the purposes and intent of the California Privacy Rights Act of 2020.
Existing constitutional provisions require that a statute that limits the right of access to the meetings of public bodies or the writings of public officials and agencies be adopted with findings demonstrating the interest protected by the limitation and the need for protecting that interest.
This bill would make legislative findings to that effect.
Vote: MAJORITY   Appropriation: NO   Fiscal Committee: YES   Local Program: NO  

The people of the State of California do enact as follows:


SECTION 1.

 Chapter 22.2.8 (commencing with Section 22588.5) is added to Division 8 of the Business and Professions Code, to read:
CHAPTER  22.2.8. Let Parents Choose Protection Act of 2024

22588.5.
 This chapter shall be known, and may be cited, as the “Let Parents Choose Protection Act of 2024” or “Sammy’s Law of 2024.”

22588.5.1.
 (a) The Legislature finds and declares all of the following:
(1) Parents and legal guardians should be empowered to use the services of third-party safety software providers to protect their children from certain harms on large social media platforms.
(2) Dangers like cyberbullying, human trafficking, illegal drug distribution, sexual harassment, and violence perpetrated, facilitated, or exacerbated through the use of certain large social media platforms have harmed children on those platforms.
(b) It is the intent of the Legislature to require large social media platform providers to create, maintain, and make available to third-party safety software providers a set of real-time application programming interfaces, through which a child or a parent or legal guardian of a child may delegate permission to a third-party safety software provider to manage the child’s online interactions, content, and account settings on the large social media platform on the same terms as the child, and for other purposes.

22588.5.2.
 For purposes of this chapter, the following definitions apply:
(a) “Child” means any individual under 17 years of age who has registered an account with a large social media platform.
(b) (1) “Large social media platform” means a service that meets all of the following:
(A) Is provided through an internet website or a mobile application, or both.
(B) The terms of service do not prohibit the use of the service by a child.
(C) The service includes features that enable a child to share images, text, or video through the internet with other users of the service whom the child has met, identified, or become aware of solely through the use of the service.
(D) The service has more than 100,000,000 monthly global active users or generates more than one billion dollars ($1,000,000,000) in gross revenue per year, adjusted yearly for inflation, or both.
(2) “Large social media platform” does not include any of the following:
(A) A service that primarily serves to facilitate the sale or provision of professional services or the sale of commercial products.
(B) A service that primarily serves to provide news or information and the service does not offer the ability for content to be sent by a user directly to a child.
(C) A service that has features that enable a user who communicates directly with a child through a message, including a text, audio, or video message, not otherwise available to other users of the service, to add other users to that message that the child may not have otherwise met, identified, or become aware of solely through the use of the service and does not have any features described in subparagraph (C) of paragraph (1).
(c) “Large social media platform provider” means any person who, for commercial purposes, provides, manages, operates, or controls a large social media platform.
(d) “Third-party safety software provider” means any person who, for commercial purposes, is authorized by a child, if the child is 13 years of age or older, or a parent or legal guardian of a child, to interact with a large social media platform to manage the child’s online interactions, content, or account settings for the sole purpose of protecting the child from harm, including physical or emotional harm.
(e) “User data” means any information needed to have a profile on a large social media platform or content on a large social media platform, including images, video, audio, or text, that is created by or sent to a child on or through the child’s account with that platform, and the information or content is created by or sent to the child while a delegation under Section 22588.5.4 is in effect with respect to the account. Information shall only be considered “user data” for 30 days, beginning on the date on which the information or content is created by or sent to the child.

22588.5.3.
 (a) The Attorney General shall administer and enforce this chapter.
(b) Before July 1, 2025, the Attorney General shall issue guidance on both of the following:
(1) Facilitating a third-party safety software provider’s ability to obtain user data or access under Section 22588.5.4 in a way that ensures that a request for user data or access on behalf of a child is a verifiable request.
(2) For large social media platform providers and third-party safety software providers, maintaining reasonable safety standards to protect user data.
(c) The Attorney General shall make publicly available on its internet website a list of the third-party safety software providers registered under Section 22588.5.5, a list of the large social media platforms registered under Section 22588.5.6, a list of the third-party safety software providers deregistered under Section 22588.5.5, and the summary of audit findings of the third-party safety software providers under Section 22588.5.9, except that no proprietary or confidential information shall be publicly disclosed.
(d) The Attorney General may adopt regulations to implement this chapter. The adoption, amendment, repeal, or readoption of a regulation authorized by this section is deemed to address an emergency, for purposes of Sections 11346.1 and 11349.6 of the Government Code, and the Attorney General is hereby exempted for this purpose from the requirements of subdivision (b) of Section 11346.1 of the Government Code.

22588.5.4.
 (a) Before August 1, 2025, or within 30 days after a service becomes a large social media platform, as applicable, the large social media platform provider shall create, maintain, and make available to any third-party safety software provider registered with the Attorney General pursuant to Section 22588.5.5 a set of third-party-accessible real-time application programming interfaces, including any information necessary to use the interfaces, by which a child, or a parent or legal guardian of a child, may delegate permission to the third-party safety software provider to do the following:
(1)  Manage the child’s online interactions, content, and account settings on the large social media platform.
(2) Initiate secure transfers of user data from the large social media platform in a commonly used and machine-readable format to the third-party safety software provider, and the frequency of the transfers may not be limited by the large social media platform provider to less than once per hour.
(b) Once a child or a parent or legal guardian of a child makes a delegation under subdivision (a), the large social media platform provider shall make the application programming interfaces and information available to the third-party safety software provider on an ongoing basis until one of the following applies:
(1) The delegation is revoked by the child or the child’s parent or legal guardian.
(2) The child’s account is disabled with the large social media platform.
(3) The third-party safety software provider rejects the delegation.
(4) One or more of the affirmations made by the third-party safety software provider under Section 22588.5.5 is no longer true.
(c) A large social media platform provider shall establish and implement reasonable policies, practices, and procedures regarding the secure transfer of user data pursuant to a delegation under subdivision (a) from the large social media platform to a third-party safety software provider in order to mitigate any risks related to user data.
(d) If a delegation is made by a child or a parent or legal guardian of a child under subdivision (a) with respect to the account of the child with a large social media platform, the large social media platform provider shall do all of the following:
(1) Disclose to the child and, if the parent or legal guardian made the delegation, the parent or legal guardian the fact that the delegation has been made.
(2) Provide to the child and, if the parent or legal guardian made the delegation, the parent or legal guardian a summary of what user data is being transferred to the third-party safety software provider.
(3)  Provide any update to the summary under paragraph (2) as necessary to reflect any change to what user data is being transferred to the third-party safety software provider.
(e) (1) A third-party safety software provider shall not disclose any user data obtained under this section to any person except as follows:
(A) Pursuant to a lawful request for law enforcement purposes or for judicial or administrative proceedings by means of a court order or a court ordered warrant, a subpoena or summons issued by a judicial officer, or a grand jury subpoena.
(B) To the extent that the disclosure is required by law and the disclosure complies with and is limited to the relevant requirements of that law.
(C) To the child, or a parent or legal guardian of the child, who made a delegation under Section 22588.5.4 and whose data is at issue. The disclosure shall be limited, by a good faith effort on the part of the third-party safety software provider, only to the user data strictly sufficient for a reasonable parent or legal guardian to understand that the child is at foreseeable risk or currently experiencing any of the following harms:
(i) Suicide.
(ii) Anxiety.
(iii) Depression.
(iv) Eating disorders.
(v) Violence, including being the victim of or planning to commit or facilitate battery as defined by Section 242 of the Penal Code and assault as defined by Section 240 of the Penal Code.
(vi) Substance abuse.
(vii) Fraud.
(viii) Human trafficking as defined by Section 236.1 of the Penal Code.
(ix) Sexual abuse.
(x) Physical injury.
(xi) Harassment, including hate-based harassment, sexual harassment, and stalking as defined by Section 646.9 of the Penal Code.
(xii) Exposure to “harmful matter” as defined by Section 313 of the Penal Code.
(xiii) Communicating with a terrorist organization designated under Section 219 of the federal Immigration and Nationality Act (8 U.S.C. Sec. 1189).
(xiv) Academic dishonesty, including cheating, plagiarism, or other forms of academic dishonesty that are intended to gain an unfair academic advantage.
(xv) Sharing personal information limited to any of the following:
(I) Home address.
(II) Telephone number.
(III) Social security number.
(IV) Personal banking information.
(D) In the case of a reasonably foreseeable serious and imminent threat to the health or safety of any individual, if the disclosure is made to a person or persons reasonably able to prevent or lessen the threat.
(2) A third-party safety software provider that makes a disclosure permitted under this subdivision shall promptly inform the child, and if a parent or legal guardian of the child made the delegation, the parent or legal guardian, that such a disclosure has been or will be made.
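The delegation lifecycle in subdivision (b) and the hourly transfer floor in paragraph (2) of subdivision (a) reduce to a small amount of state and date arithmetic. The following Python sketch is purely illustrative: the chapter mandates no particular data model or interface shape, and every name below is an assumption, not a statutory term.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from enum import Enum, auto
from typing import Optional

class TerminationReason(Enum):
    """The four terminating events listed in subdivision (b)."""
    REVOKED_BY_USER = auto()       # (b)(1) delegation revoked
    ACCOUNT_DISABLED = auto()      # (b)(2) child's account disabled
    REJECTED_BY_PROVIDER = auto()  # (b)(3) provider rejects the delegation
    AFFIRMATION_LAPSED = auto()    # (b)(4) a registration affirmation no longer true

@dataclass
class Delegation:
    """Hypothetical record of a delegation under subdivision (a)."""
    child_account_id: str
    safety_provider_id: str
    made_by_guardian: bool
    terminated: Optional[TerminationReason] = None

    @property
    def active(self) -> bool:
        return self.terminated is None

    def terminate(self, reason: TerminationReason) -> None:
        self.terminated = reason

# (a)(2): the platform may not limit transfer frequency to less than once per hour.
MIN_TRANSFER_INTERVAL = timedelta(hours=1)

def transfer_allowed(delegation: Delegation, last_transfer: datetime,
                     now: datetime) -> bool:
    """True if the platform must honor a transfer request at this moment."""
    return delegation.active and (now - last_transfer) >= MIN_TRANSFER_INTERVAL
```

For example, under this sketch a platform must honor a request made two hours after the last transfer, may decline one made thirty minutes later, and owes no access at all once the delegation is revoked.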

22588.5.5.
 (a) A third-party safety software provider shall register with the Attorney General’s office as a condition of accessing an application programming interface and any information or user data pursuant to Section 22588.5.4.
(b) The registration shall require the third-party safety software provider to affirm that the third-party safety software provider meets all of the following requirements:
(1) Is solely engaged in the business of internet safety.
(2) Will use any user data obtained under Section 22588.5.4 solely for the purpose of protecting a child from any harm.
(3) Will only disclose user data obtained under Section 22588.5.4 as permitted by that section.
(4) Will disclose, in an easy-to-understand, readable format, to each child and to the parent or legal guardian, sufficient information detailing the operation of the service and what information the third-party safety software provider is collecting to enable the child and the parent or legal guardian to make informed decisions regarding the use of the service.
(5) Will delete any user data obtained under Section 22588.5.4 no later than 21 days after receiving the data.
(c) Within 30 days after there is any change to an affirmation made under subdivision (b) by a third-party safety software provider that is registered under subdivision (a), the provider shall notify both of the following of the change:
(1) The Attorney General.
(2) Each child with respect to whose account with a large social media platform the service of the third-party safety software provider is operating and, if a parent or legal guardian of the child made the delegation under Section 22588.5.4 with respect to the account, the parent or legal guardian.
(d) (1) The Attorney General may deregister or issue a civil penalty not to exceed five thousand dollars ($5,000) per violation to a third-party safety software provider if it is determined that the provider has violated or misrepresented the affirmations made under subdivision (b) or has not notified the Attorney General, a child, or a parent or legal guardian of a child, of a change to an affirmation as required by subdivision (c).
(2) If the Attorney General deregisters a third-party safety software provider under paragraph (1), the Attorney General shall notify each large social media platform provider of the deregistration of the third-party safety software provider and the specific reason for the deregistration.
(3) A large social media platform provider that receives a notification from the Attorney General under paragraph (2) that the Attorney General has deregistered a third-party safety software provider pursuant to paragraph (1) shall notify each child with respect to whose account with the large social media platform the service of the third-party safety software provider was operating and, if a parent or legal guardian of the child made the delegation under Section 22588.5.4 with respect to the account, the parent or legal guardian of the deregistration of the third-party safety software provider and the specific reason for the deregistration provided by the Attorney General.
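Two of the concrete retention rules in this chapter, the 21-day deletion deadline in paragraph (5) of subdivision (b) above and the 30-day "user data" window in the definition at Section 22588.5.2, are simple date arithmetic. A minimal sketch, with all identifiers assumed rather than drawn from the statute:

```python
from datetime import datetime, timedelta, timezone

RECEIPT_RETENTION = timedelta(days=21)  # 22588.5.5(b)(5): delete within 21 days of receipt
USER_DATA_WINDOW = timedelta(days=30)   # 22588.5.2(e): "user data" status lasts 30 days

def must_delete_by(received_at: datetime) -> datetime:
    """Latest moment a registered provider may retain a record it received."""
    return received_at + RECEIPT_RETENTION

def is_user_data(created_at: datetime, now: datetime) -> bool:
    """Whether content still qualifies as 'user data' under Section 22588.5.2(e)."""
    return now - created_at < USER_DATA_WINDOW
```

Because the 21-day clock runs from receipt and the 30-day clock runs from creation, a provider's actual retention window for any record is the earlier of the two deadlines.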

22588.5.6.
 (a) Before August 1, 2025, or within 30 days after a service becomes a large social media platform, as applicable, the large social media platform provider of the platform shall register the platform with the Attorney General by submitting to the Attorney General a statement indicating that the platform is a large social media platform.
(b) The Attorney General shall establish a process to deregister a service registered under subdivision (a) if the service is no longer a large social media platform.

22588.5.7.
 In any civil action, other than an action brought by the Attorney General, a large social media platform provider shall not be held liable for damages arising out of the transfer of user data to a third-party safety software provider in accordance with this chapter, if the large social media platform provider has in good faith complied with the requirements of this chapter and the guidance issued by the Attorney General in accordance with this act.

22588.5.8.
 (a) Before July 1, 2025, the Department of Technology shall issue guidance for large social media platform providers and third-party safety software providers regarding the implementation and maintenance of technical standards to protect user data.
(b) For purposes of issuing the guidance, the Department of Technology shall review prevailing industry practices and applicable technical standards published by the National Institute of Standards and Technology.
(c) The Department of Technology shall update the guidance biennially.

22588.5.9.
 (a) (1) A third-party safety software provider receiving any data pursuant to the permissions required under this chapter shall at least annually enlist a qualified independent auditing firm to audit its privacy, security, and legal compliance.
(2) The Attorney General shall develop a list of qualified independent auditing firms from which a third-party safety software provider shall select a firm to perform the audit specified in paragraph (1).
(b) The audit described in subdivision (a) shall do all of the following:
(1) Assess potential risks to the confidentiality, integrity, and security of any user data.
(2) Assess operational compliance with California and federal law, including the requirements of this chapter.
(3) Be designed to determine the adequacy of business and technology-related controls, policies, procedures, and other safeguards employed by the third-party safety software provider based on the standards issued by the Department of Technology and other industry standards and best practices.
(c) (1) The auditor shall provide the audit findings, including a summary of those findings, directly to the Attorney General and the third-party safety software provider.
(2) A summary of findings shall not include any proprietary information or other trade secrets.
(d) If an audit detects an unusual finding, a third-party safety software provider shall promptly investigate and resolve the matter.
(e) The Attorney General may deregister, suspend, or issue a civil penalty not to exceed five thousand dollars ($5,000) per violation to a third-party safety software provider based on either of the following:
(1) An audit finding by an independent auditor for an audit required by this section of willful or grossly negligent conduct on the part of the third-party safety software provider.
(2) A third-party safety software provider’s negligent failure to respond to an unusual finding in an issued audit required by this section.

22588.5.10.
 This chapter shall become operative on July 1, 2025.

SEC. 2.

 The Legislature finds and declares that this act furthers the purposes and intent of the California Privacy Rights Act of 2020.

SEC. 3.

 The Legislature finds and declares that Section 1 of this act, which adds Chapter 22.2.8 (commencing with Section 22588.5) to Division 8 of the Business and Professions Code, imposes a limitation on the public’s right of access to the meetings of public bodies or the writings of public officials and agencies within the meaning of Section 3 of Article I of the California Constitution. Pursuant to that constitutional provision, the Legislature makes the following findings to demonstrate the interest protected by this limitation and the need for protecting that interest:
To protect the confidentiality of trade secrets and proprietary information, it is necessary to enact legislation that limits public access to third-party safety software providers’ proprietary information and confidential information.