Instead of holding all online platforms exempt from liability by default, IBM believes that the exemption should be conditioned on companies exercising a standard of “reasonable care” and taking preventive measures to curb unlawful uses of their services. In a 2017 research paper, law professor Danielle Citron and legal scholar Benjamin Wittes proposed this approach as a balanced compromise to address the growing proliferation of illegal and harmful online content.
The “reasonable care” standard would create strong incentives for companies to limit illegal and illicit behavior online, while remaining flexible enough to promote continued online innovation and to adapt readily to different online business models.
Reasonable care does not mean eliminating entirely the intermediary liability protections of CDA 230, or of comparable laws in Europe and elsewhere. Nor are we calling for amending the “Good Samaritan” provision of CDA 230, which limits the liability of companies that take voluntary action to stop bad actors. We simply believe companies should also be held legally responsible for using reasonable, common-sense care in moderating online content. This means, for example, quickly identifying and deleting child pornography, violent content on child-oriented sites, or content promoting acts of mass violence, suicide, or the sale of illegal drugs. A reasonable care standard in CDA 230 would add a measure of legal responsibility to what many platforms are already doing voluntarily.