SCOTUS Should Uphold Critical Liability Protections for Online Speech

WASHINGTON—Today, NetChoice and a coalition of trade groups concerned about online speech filed an amicus brief supporting Twitter in Twitter, Inc. v. Taamneh, a case pending before the U.S. Supreme Court. In Taamneh, the Court will decide whether the federal Anti-Terrorism Act extends beyond punishing wrongdoers and their aiders-and-abettors to punishing third parties that neither knew of nor participated in any crimes. Although the case presents a technical question of statutory interpretation, its outcome will affect online speech and the internet’s functioning.

“With billions of pieces of content added to the internet every day, content moderation is an imperfect—but vital—tool in keeping users safe and the internet functioning,” said Chris Marchese, NetChoice Counsel. 

“Without moderation, the internet will become a cesspool of vile content, making it easier for bad actors to exploit the internet’s reach and harm its users,” added Marchese. “Imposing liability for moderating imperfectly will only chill constitutional speech, stifle the diversity of voices online and punish what this tragic case proves the internet needs: content moderation.” 

Authored by the firm of Lehotsky Keller, the coalition brief includes as signatories the Computer & Communications Industry Association, Software & Information Industry Association, Developers Alliance, Chamber of Progress, Internet Infrastructure Coalition, Consumer Technology Association, and ACT | The App Association.

* * *

The coalition brief explains that:

  • With millions of pieces of content added to the internet every minute, content moderation is a necessary editorial function for keeping the internet safe and useful. In fact, websites, digital services, and apps removed over 3 billion pieces of spam in a single six-month span in 2020.
  • Human reviewers and artificial intelligence are vital to keeping the internet working; together they remove about 90% of harmful content before any user sees it.
  • Even so, some content evades detection. Indeed, just as content moderators respond to new threats, wrongdoers adapt around new policies. Content moderation is thus an iterative process that sometimes detects content only after it has been seen.
  • While content moderation can never be perfect, its use should not be discouraged because it is vital to the open internet’s functioning.

In sum, we ask the Supreme Court to reverse the Ninth Circuit’s judgment to ensure critical liability protections for content moderation remain intact and the internet remains safe and functional.

* * *

You can read NetChoice’s brief here. Please contact Krista Chavez at press@netchoice.org with inquiries.