Republicans and Democrats increasingly disagree about content moderation: do we want more of it or less? Given the debate over what social media should and should not be allowed to do with user content, it’s more important than ever to understand what content social media moderates and why.
Today, NetChoice released By The Numbers: What Content Social Media Removes And Why—a report aggregating social media transparency reports to nail down exactly what online companies remove and for what reason.
The key takeaway? Social media services routinely remove objectionable content to bolster free expression and preserve online safety. In fact, these companies are actively fighting dangerous and hateful content online, removing roughly 6 billion posts in the second half of 2020 alone.
“Social media helps every community come together, feel safe, and express themselves through posts, photos, articles, and more,” said Robert Winterton, Director of Public Affairs at NetChoice. “Contrary to the politically charged claims we’ve all heard, social media sites actively remove huge amounts of harmful content from the internet. Social media businesses are doing more than ever to defend us from harm online. It’s crucial that we ensure such practices can continue unperturbed by government intervention.”
“When we’re still dealing with global lockdowns and the aftermath of pandemic measures that have cut people off from offline outlets and support groups, it’s more important than ever that we let online avenues continue to innovate and support our communities,” said Chris Marchese, Counsel at NetChoice. “Changing Section 230 is a bad solution to a problem social media sites are already working hard to fix. We hope to continue working with lawmakers to protect the internet’s ability to supercharge American innovation, American free expression, and American constitutional values.”
In By The Numbers, NetChoice Policy Analyst Malena Dailey explains how:
- Social media removed approximately 6 billion posts in the second half of 2020 alone.
- Social media companies recognize they need to be transparent and ensure that users know how to safely and effectively use their platforms.
- Proactive removal mitigates the risk that spam, violent material, or otherwise offensive content spreads beyond the account responsible for the original post.
- Social media companies have shown they understand their responsibility to maintain a safe online environment for users while remaining transparent and instilling confidence in their services.
You can find the full NetChoice report, By The Numbers: What Content Social Media Removes And Why, here.