As the House Energy and Commerce Committee considers a package of bills intended to protect children online, one proposal deserves closer scrutiny. H.R. 2657, known as “Sammy’s Law,” aims to empower parents by requiring social media services to provide real-time access to third-party safety software companies. The goal is admirable: giving parents tools to protect their kids from online harms. But in reality, the bill creates serious privacy risks that could undermine the safety and well-being of the very teenagers lawmakers want to protect.
Under this bill, platforms would have to build APIs giving commercial software providers access to everything a teen does online, including every message, photo and conversation, at least once per hour. That data would then be stored by multiple third-party companies for up to two weeks, creating numerous new opportunities for breaches, misuse or unauthorized access.
Consider what this looks like in practice. A 16-year-old is struggling and desperately searches for “signs of depression,” or finally works up the courage to confide in a friend about feeling anxious. Under Sammy’s Law, that private moment of vulnerability becomes surveillance material, with words like “anxiety” and “depression” automatically flagged to trigger alerts. A teen joking with classmates about homework help gets flagged for “academic dishonesty.” Someone privately discussing a difficult family situation, or simply navigating the complex emotions of normal adolescence, could be monitored and disclosed. This will not catch predators or stop serious threats. The bill’s harm categories are so broad they could capture virtually any teenage experience. And while teens can technically consent to this monitoring themselves starting at age 13, we’re asking kids who might not fully grasp the implications to authorize constant surveillance and storage of their most private data.
The bill tries to address these privacy and security concerns through FTC registration, security reviews and audits of third-party providers. But these safeguards may not be enough. We’ve seen countless examples of well-intentioned companies suffering data breaches or gradually expanding how they use customer data. This bill would mandate that companies share teens’ sensitive communications with multiple commercial startups, trusting that each will properly secure information about mental health struggles, relationship issues and personal development. Even if these companies act in good faith, we’re creating an entirely new attack surface for hackers and bad actors.
Beyond privacy, the proposal raises serious First Amendment questions. Sammy’s Law would wrest editorial control away from social media websites, forcing them to let others make curation decisions while still presenting those choices as the platform’s own branded decisions. By compelling social media websites to turn over their editorial decision-making to third parties, the bill amounts to impermissible governmental interference with free speech. The Supreme Court resoundingly rejected similar proposals in Moody v. NetChoice in 2024; Sammy’s Law is the same game played under a new name. Imagine a law requiring a newspaper to turn over its prospective stories and let a third party decide which ones to run and how prominently to feature them, all while the result appears under that newspaper’s masthead. In essence, that is what Sammy’s Law would require of social media services. Such a scheme would be a travesty for newspapers’ First Amendment rights, and no less so for social media.
We can do better. Parents deserve tools to keep their kids safe online, but those tools should be opt-in, transparent and under family control—not mandatory infrastructure that forces platforms to share data with third parties. We should focus on education, holding bad actors accountable for actual harms, and giving families choices rather than surveillance mandates. Congress has an opportunity to get this right. We encourage lawmakers to focus on evidence-based approaches to online safety that don’t require sacrificing teenage privacy or creating new security and constitutional vulnerabilities in the process.