
Content Moderation

02/28/2022

NetChoice Releases Response To Duty Of Care Proposals

Chris Cox, Member of the Board of Directors

Today, NetChoice released a response paper about the proposed liability rules in Who Moderates the Moderators?: A Law and Economics Approach to Holding Online Platforms Accountable Without Destroying the Internet by the International Center For Law And Economics.

The paper commends authors Geoffrey Manne, Kristian Stout, and Ben Sperry for declining to hold online services liable for third-party speech, while countering the duty of care proposals they recommend.

Chris Cox’s response discusses how:

  • Imposing liability even when platforms have no knowledge of illegality, but “should” have known of it, is too subjective a standard for the sheer quantity of user-created content.
  • Requiring discovery in most lawsuits over user-created content will expose platforms to open-ended liability in an ever-growing volume of litigation that few, if any, platforms could survive.
  • The authors describe their duty of care proposal as infinitely flexible, but that very flexibility will discourage platforms from hosting user-created content in the first place.
  • Guessing wrong about the consequences of such a proposal risks undermining the foundation of all user-created content on the internet.

“Grafting negligence concepts onto Section 230 would be a nightmare for consumers,” said the Hon. Christopher Cox, the author and co-sponsor of Section 230. “When almost every complaint about the thousands of postings each day can make it past the pleading stage, platforms facing unlimited liability will inevitably cut back on all user-created content, from how-to videos to consumers’ opportunities for hosted free speech.”

“Subjective negligence concepts are a poor foundation for shifting liability from internet users to the platforms hosting their content,” continued Cox. “With more than 20 million websites governed by Section 230—many of them hosting thousands, millions, and in some cases billions of pieces of user-created content each day—it is inconceivable that almost every complaint about content should proceed past the pleading stage to become a multi-year lawsuit.”

To read the full report countering Who Moderates the Moderators?, click here.
