
NetChoice Testimony in Opposition to South Carolina S 1037, Protecting Children from Chatbots

S 1037 is an unconstitutional overreach that forces large chatbot companies to implement mandatory age verification, restrict minor access by default and obtain parental consent — exposing South Carolina to litigation it will almost certainly lose, just as courts have struck down similar laws in Arkansas, California, Ohio and Louisiana. The bill’s vague definitions and sweeping private right of action make compliance nearly impossible while inviting abusive litigation, and its age verification regime would centralize millions of South Carolinians’ sensitive personal data in commercial databases — putting the very children it claims to protect at greater risk.


April 14, 2026

South Carolina Legislature
Members of the Senate Labor, Commerce, and Industry Committee,

NetChoice respectfully requests your opposition to S 1037, which requires large chatbot companies to verify user ages, restrict minors to a limited feature set by default and obtain parental consent before unlocking advanced features, among other provisions.  

We take seriously the goal of protecting children from online harm — it is a goal we share. But good intentions do not immunize legislation from constitutional problems. This bill, however well-meaning, is fatally flawed on multiple grounds. It will expose South Carolina to costly and losing litigation, burden law-abiding companies with unworkable compliance requirements, and create a sweeping new data collection regime that puts the very children it claims to protect at greater risk. We urge the committee to oppose this bill.

S 1037’s Core Provisions are Unconstitutional Under the First Amendment—and are Already the Subject of Litigation in Other States.

The bill’s centerpiece — mandatory age verification as a condition of accessing online features — is not a new idea. Courts across the country have been confronting this question for years, and the record is clear: age verification mandates that condition access to lawful online speech fail constitutional scrutiny. Federal courts have enjoined similar laws in Arkansas (NetChoice v. Griffin, 2023 WL 5660155 (W.D. Ark., Aug. 31, 2023) (enjoining Arkansas’s parental consent and age-verification law to access social media for violation of the First Amendment)), California (NetChoice v. Bonta, 2023 WL 6135551 (N.D. Cal., Sep. 18, 2023) (enjoining California’s Age-Appropriate Design Code Act for violation of the First Amendment)), Ohio (NetChoice v. Yost, 2024 U.S. Dist. LEXIS 24129 (S.D. Ohio, Feb. 12, 2024) (enjoining Ohio’s parental consent for social media law as unconstitutional under the First Amendment)) and Louisiana (NetChoice v. Murrill, No. 3:25-cv-00231 (M.D. La. Dec. 15, 2025) (permanently enjoining Louisiana’s age verification and parental consent law for social media as a violation of the First Amendment)).

The First Amendment protects not only the right to speak, but the right to receive information and ideas. Chatbots, like websites, search engines, books and newspapers, are conduits through which speech flows. The content they produce — information, advice, creative writing, educational material, casual conversation — is constitutionally protected expression. When the government requires a person to surrender identifying credentials before accessing that speech, it chills the very communication the First Amendment is designed to protect. Anonymous access to information is not a loophole; it is a First Amendment right the Supreme Court has protected for decades.

South Carolina has already been on the receiving end of this principle. NetChoice is currently litigating NetChoice v. Wilson over South Carolina’s prior attempt to mandate online age verification — legislation that also came with strong public support and compelling stated purposes. The constitutional defects in that bill are the constitutional defects in this one. The committee should not repeat the same mistake.

The bill cannot cure these defects through careful drafting. The problem is not how the verification is structured — it is that the structure exists at all. Requiring verified identity as a ticket to protected speech has been the core defect in every enjoined law in this space, and this bill’s “reasonable age verification” requirement — explicitly including government-issued ID, driver’s licenses, bank account verification and third-party identity verifiers — is as demanding as any that courts have struck down.

Age Verification Regimes Create Serious Privacy and Cybersecurity Risks

Supporters of S 1037 will point to Section 39-81-20’s data minimization and deletion requirements as evidence that privacy has been taken seriously. But those provisions do not solve the privacy problem — they acknowledge it. And no amount of statutory language can paper over the fundamental reality that building a system to verify the government-issued identity of every person who wants to use a chatbot creates an enormous, high-value trove of sensitive personal data that will be targeted by hackers, scammers and data brokers.

Bills whose age verification provisions require users to submit government identification or biometric data introduce new cybersecurity risks that no deletion timeline can fully eliminate. A database that exists for twenty-four hours can be breached in twenty-four hours. And the breach of an age verification database is not a minor inconvenience — it is an exposure of government identification documents linked to specific users’ online activity.

The irony is acute: a bill sold as protecting children’s safety online would require millions of South Carolinians — adults and children alike — to hand over the most sensitive forms of personal identification simply to have a conversation with a chatbot. Minors who use these platforms do not become safer when their identifying information is centralized in a commercial database. They become more vulnerable. 

The Bill is Vague, Overbroad and Invites Arbitrary Enforcement

Beyond its constitutional and privacy defects, S 1037 suffers from definitional problems that make meaningful compliance nearly impossible. The definition of “chatbot” sweeps broadly across AI systems that “maintain a conversational state” and produce “adaptive or context-responsive” output. That description could encompass customer service tools, educational platforms, coding assistants and general-purpose AI products that have no relationship to the companion chatbot use cases that animate public concern about this legislation.

The definition of “restricted features” is similarly elastic. An “extended interaction session” that “may pose an unreasonable risk” of emotional dependence is not a standard any engineer can implement or any court can reliably apply. What duration triggers unreasonable risk? What patterns of use constitute “emotional dependence”? The bill defines the term with examples — a user expressing that the chatbot is a “primary source of emotional support” — but leaves open every hard question about how a company is supposed to detect and act on that in real time without surveilling and analyzing the content of private conversations, which creates its own constitutional and privacy concerns.

The prohibition on features that “prioritize engagement… at the expense of user wellbeing” is equally unworkable. Every product feature involves tradeoffs between user engagement and other values. A law that makes it a violation to favor engagement “at the expense of wellbeing” — without defining what wellbeing means, how it is measured, or what the threshold of impermissible prioritization is — does not regulate conduct. It creates a general standard of liability that covered entities cannot satisfy through any compliance program and that enforcers can apply selectively against disfavored products.

The Private Right of Action will Generate Abusive Litigation

S 1037 creates a private right of action with actual damages, attorney fees and punitive damages for willful or grossly negligent violations — while prohibiting arbitration clauses and class action waivers. That combination deserves scrutiny.

The private right of action does not just invite legitimate suits from genuinely harmed families — it issues an open invitation to the plaintiffs’ bar. Billboard attorneys will stretch this statute’s vague definitions far beyond anything the drafters intended: a coding assistant becomes a “chatbot maintaining conversational state,” a notification becomes evidence of prioritizing engagement “at the expense of wellbeing.” Armed with attorney’s fees, punitive damages and a mandatory courthouse door, these lawyers need not win — only threaten. The companies most likely to settle are not the bad actors this bill targets, but compliant ones who cannot afford litigation over undefined standards.

Because there is no statutory damages floor, recovery requires proving that a specific chatbot interaction caused a specific cognizable harm. That is a genuinely high bar — but the availability of attorney fees, punitive damages and mandatory court proceedings creates powerful settlement pressure against well-resourced defendants regardless of merit. The result is not compensation for genuinely harmed children. It is a litigation tax on compliant companies that produces no systematic improvement in child safety.

The punitive damages exposure is particularly concerning given the bill’s vague operative standards. What design decision crosses the line into “prioritizing engagement at the expense of wellbeing”? When has a company taken insufficient “reasonable steps” to reduce emotional dependence? Companies facing open-ended liability under undefined standards do not invest in better products — they restrict services for everyone, including the minors this bill claims to protect.

Again, we respectfully ask you to oppose S 1037. As always, we offer ourselves as a resource to discuss any of these issues with you in further detail, and we appreciate the opportunity to provide the committee with our thoughts on this important matter. (The views of NetChoice expressed here do not necessarily represent the views of all NetChoice members.)

Sincerely, 

Amy Bos
Vice President of Government Affairs, NetChoice
NetChoice is a trade association that works to protect free expression and promote free enterprise online.