WASHINGTON—Today, the U.S. Supreme Court will hear oral arguments in Twitter, Inc. v. Taamneh, a case that will determine whether online services may be held liable under the Anti-Terrorism Act for unintentionally hosting pro-terrorism content.
In December, NetChoice filed an amicus brief with a coalition of trade groups, explaining that content moderation is a critical part of online services’ operations.
“With billions of pieces of content added to the internet every day, content moderation is an imperfect—but vital—tool in keeping users safe and the internet functioning,” said Chris Marchese, NetChoice Counsel.
“Without incentives to moderate, the internet will be immersed in a cesspool of exploitative content. We hope the Supreme Court will uphold this critical protection that helps online services thrive.”
Marchese continued: “Even with the best moderation systems available, a service like Twitter alone cannot screen every single piece of user-generated content with 100% accuracy. Imposing liability on such services for harmful content that unintentionally falls through the cracks would disincentivize them from hosting any user-generated content.”
While content moderation can never be perfect, its use should not be discouraged, because it is vital to the open internet’s functioning. NetChoice hopes the Court will reverse the Ninth Circuit’s holding that online services may be held liable for unintentionally hosting pro-terrorism content posted by their users.
You can find our amicus brief with the Computer & Communications Industry Association, Software & Information Industry Association, Developers Alliance, Chamber of Progress, Internet Infrastructure Coalition, Consumer Technology Association, and ACT | The App Association here.
Please contact Krista Chavez at email@example.com with inquiries.