
Sammy’s Law

12/13/2025, 9:06 AM

Summary of Bill HR 2657

H.R. 2657 (119th Congress), also known as the "Social Media Safety and Accountability Act," aims to hold large social media platform providers accountable for the safety of children using their platforms. The bill requires these companies to create and maintain real-time application programming interfaces (APIs) through which third-party safety software providers can manage the online interactions, content, and account settings of children on the platform.

Under this bill, a child or the child's parent or legal guardian can delegate permission to a third-party safety software provider to monitor and manage the child's activity on a social media platform. The platform must grant the provider access on the same terms as the child, so the provider can act wherever the child can.

The main goal of this legislation is to enhance the safety of children on social media platforms by giving parents and guardians more control over their children's online interactions. By requiring platforms to expose APIs to third-party safety software providers, the bill aims to give parents and guardians better tools to protect their children from potential online dangers. Overall, H.R. 2657 seeks to address growing concerns about child safety on social media and to provide a framework for protecting children who use these platforms.
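To make the delegation mechanism concrete, here is a minimal sketch of the permission model the bill describes. Every class, field, and method name below is an illustrative invention, not part of H.R. 2657 or any real platform API; the bill only mandates that some such real-time interface exist.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a platform records which safety providers a child
# (or a parent/guardian) has authorized, and for which scopes.

@dataclass(frozen=True)
class Delegation:
    child_id: str
    provider_id: str
    granted_by: str  # "child", "parent", or "guardian"
    scopes: tuple = ("interactions", "content", "account_settings")

@dataclass
class Platform:
    delegations: dict = field(default_factory=dict)

    def grant(self, d: Delegation) -> None:
        """Record that a safety provider may act for this child."""
        self.delegations[(d.child_id, d.provider_id)] = d

    def may_manage(self, child_id: str, provider_id: str, scope: str) -> bool:
        """A provider may act only within the scopes it was delegated."""
        d = self.delegations.get((child_id, provider_id))
        return d is not None and scope in d.scopes

platform = Platform()
platform.grant(Delegation("c1", "safety-app", granted_by="parent"))
print(platform.may_manage("c1", "safety-app", "account_settings"))  # True
print(platform.may_manage("c1", "other-app", "content"))            # False
```

The key design point mirrored from the bill is that the grant can come from the child or from a parent/guardian, and that the provider's access is scoped to interactions, content, and account settings rather than being unlimited.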

Congressional Summary of HR 2657

Sammy’s Law

This bill requires large social media platforms to permit certain providers of safety software to monitor and manage the activity of children under the age of 17 on such platforms.

Specifically, large social media platforms must make available a mechanism by which a child or their parent or guardian may permit a provider of safety software to (1) manage the child’s interactions, content, and account settings on the platform; and (2) regularly access the child’s user data.

A software provider may only disclose a child’s data under limited circumstances, including to the child’s parent or guardian if the child is experiencing or is at foreseeable risk of experiencing specified harms. Such harms include suicide, eating disorders, sexual abuse, harassment, and academic dishonesty. The provider may only share data necessary for a reasonable parent or caregiver to understand that the child is experiencing or is at risk of harm.

To participate, a software provider must register with the Federal Trade Commission, undergo a security review, and demonstrate that, among other requirements, the provider is based in the United States and will use a child's data solely to protect them from harm.

Under the bill, a large social media platform is generally a service that enables a child to share content through the internet with other users that the child has become aware of solely through the platform, and which has more than 100 million monthly global active users or generates more than $1 billion in gross annual revenue.

Current Status of Bill HR 2657

H.R. 2657 was introduced in the House on April 3, 2025, during the 119th Congress, and remains in the introduced stage. Its most recent activity, as of December 11, 2025, was being forwarded by subcommittee to the full committee by voice vote.

Bipartisan Support of Bill HR 2657

Total Number of Sponsors: 1
Democrat Sponsors: 1
Republican Sponsors: 0
Unaffiliated Sponsors: 0
Total Number of Cosponsors: 22
Democrat Cosponsors: 10
Republican Cosponsors: 12
Unaffiliated Cosponsors: 0

Policy Area and Potential Impact of Bill HR 2657

Primary Policy Focus

Science, Technology, Communications

Alternate Title(s) of Bill HR 2657

To require large social media platform providers to create, maintain, and make available to third-party safety software providers a set of real-time application programming interfaces, through which a child or a parent or legal guardian of a child may delegate permission to a third-party safety software provider to manage the online interactions, content, and account settings of such child on the large social media platform on the same terms as such child, and for other purposes.
