Bills

AB 1137: Reporting mechanism: child sexual abuse material.

  • Session Year: 2025-2026
  • House: Assembly

Current Status:

In Progress

(2025-05-23: In committee: Held under submission.)


Version:

Existing law requires a social media platform to take certain actions with respect to child sexual abuse material on the social media platform, including by requiring the social media platform to provide, in a mechanism that is reasonably accessible to users, a means for a user who is a California resident to report material to the social media platform that the user reasonably believes meets certain criteria, including that the reported material is child sexual abuse material and that the reporting user is depicted in the material. Existing law also requires the social media platform to collect information reasonably sufficient to enable the social media platform to contact, as specified, a reporting user.

This bill would delete the requirement that the reporting user be depicted in the material, would instead require that the depicted individual be depicted as a minor, and would additionally require the reporting mechanism to be clear and conspicuous. The bill would require a social media platform to ensure that any report submitted using the reporting mechanism is reviewed through a hash matching process, and would require a social media company to ensure review by a natural person if there is not an established or known hash match to child sexual abuse material with respect to the reported material and the reported material is not otherwise blocked.

Existing law makes a noncomplying social media company liable to a reporting user for actual damages and statutory damages, as specified.

This bill would also impose a civil penalty on a noncomplying social media company to be collected in a civil action by certain public attorneys, including the Attorney General. The bill would make a social media company liable to a depicted individual, as defined, for specified violations.

Existing law prohibits a social media platform from knowingly facilitating, aiding, or abetting commercial sexual exploitation, as defined, and exempts a social media platform from being deemed in violation of that prohibition if it instituted a specified audit program and provided to each member of its board of directors a true and correct copy of each audit, as prescribed.

This bill would revise those provisions to, instead, require a social media platform to submit to third-party audits and release audit reports to the public in order to be exempt from being deemed in violation of that prohibition, as prescribed.

This bill would declare that its provisions are severable.

Discussed in Hearing

Assembly Standing Committee on Judiciary (16 min)
Apr 29, 2025

Assembly Standing Committee on Privacy and Consumer Protection (19 min)
Apr 22, 2025

