
Section 230: Immunizing the Internet From Coronavirus

Months after it first entered the United States, the coronavirus continues to bring disruption and destruction, and many states, cities, and counties are maintaining their shelter-in-place orders.

Imagine sheltering in place and social distancing without the internet — no video streaming, no Facebook posts, no happy birthday Zoom calls, no virtual learning, no online shopping. Thankfully, though, most Americans are able to keep life moving because the internet is there for us.

But the internet’s success isn’t just good luck. Indeed, the coronavirus threatened the internet too.

As we previously wrote, “the coronavirus pandemic has seen bad actors try to exploit Americans, especially online. From swindlers of all stripes to peddlers of disinformation, criminals, cheats, and charlatans have done their best to weaponize our channel to the outside world.”


These bad actors have largely failed, however. That’s because of Section 230 of the Communications Decency Act, a 1996 federal law that lets online platforms and marketplaces remove harmful content without being sued for doing so. So under the law, for example, Facebook is able to remove posts hawking dangerous “miracle” drugs. And YouTube is able to pull videos that encourage people to ingest deadly chemicals.

At the same time, Section 230 lets user-created content flourish. The law boils down to this: You are responsible for what you post, not the platform on which you post. Although this principle is deceptively simple, it packs a powerful punch.

If Facebook, Twitter, Yelp, YouTube — even the Washington Post and New York Times — had to fact-check and review every single post, we’d have far less content, far less discussion and debate, and far more frivolous lawsuits (we previously discussed how news outlets use their own form of intermediary liability minimization).

As we explained, “Without Section 230’s clear limitation on liability it’s hard to imagine that most of our online services would — or even could — exist. Without Section 230, Snapchat could be responsible and held legally liable for every mean message. Match.com could be responsible for every over-inflated listing. Doordash could be required to inspect every item on a restaurant’s menu for accuracy.

Without Section 230, none of these services — nor any other small business that hosts user-created content — could operate with so much potential liability.”

But because online firms are liable only for content they produce, they are empowered to host responsible content and to remove dangerous content. Section 230 has therefore immunized most of the internet from coronavirus. So although the virus continues to threaten American lives, we can take some solace in knowing that a 1996 federal law helps protect us online.