What the heck is CDA 230 anyway?

Once again, I find myself compelled to provide a primer on a law that’s getting a lot of attention these days, but that lots of folks don’t quite understand. What is CDA 230? Did Twitter go beyond it when it flagged Trump tweets? What exactly does that executive order do? Let’s take a look.

What is CDA 230?

CDA 230 is shorthand for Section 230 of the Communications Decency Act, which was enacted as Title V of the Telecommunications Act of 1996. The CDA was intended to address the increasing use of the Internet to distribute indecent and obscene materials. Although the Supreme Court struck down the CDA’s anti-indecency provisions just one year later, Section 230 remains intact.

CDA 230 has two basic parts. The first provides certain legal immunities to interactive computer services for third-party content. That’s a fancy way of saying Facebook is not liable for something I post to my page. The law achieves this grant of immunity by declaring that these services are not “publishers” of other people’s content (more on this below). CDA 230 preempts state civil and criminal laws that might otherwise expose these services to liability for illegal or prohibited content (e.g., child pornography). But these services are not immune from federal criminal law, intellectual property laws, or the Electronic Communications Privacy Act and similar state laws.

The second part of 230, known as the “Good Samaritan” clause, holds these services harmless for their good faith efforts to “restrict access to or availability of material that the provider considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

Why did Congress enact CDA 230?

While the larger statute was aimed at tamping down on online obscenity, CDA 230 was inspired by a pair of lawsuits against CompuServe and Prodigy (remember them?), both sued for defamation, not for their own speech but for content posted by users. The CompuServe suit was tossed, but not the suit against Prodigy. The distinguishing fact between the two was Prodigy’s decision to moderate content on its platform. Both lawsuits were concerning to Congress. If America’s world wide web pioneers were exposed to litigation for any and all unlawful content on their platforms, they’d never get off the ground. Equally alarming, as a result of the Prodigy suit, these platforms could actually be incentivized not to remove harmful content.

Thus, the two-part approach of “not a publisher” immunity and the “Good Samaritan” clause. And while inartfully written, CDA 230 continues to bar most lawsuits against websites, social media platforms, search engines, app stores, and the like. For now, that is.

What’s changed in the last 24 years?

A lot! Virtually nothing we do online today existed when CDA 230 was enacted. Smartphones, apps, social media platforms, and millions of websites enable a wide array of online activities. We shop, book travel, socialize, read the news and books, binge-watch TV shows, and yes, even work online. The Internet’s proliferation has also facilitated criminal activity, including an alarming increase in child pornography distribution, sex trafficking, illegal drug markets, terrorism facilitation, and more. Online platforms like Facebook and YouTube have even unwittingly hosted real-time videos of violent crimes and murders.

In recent years, online providers have dramatically increased the policing of their platforms, not only to identify and remove illegal content, but to remove or restrict speech they or their users deem “harmful,” “offensive,” or “fake news.” Americans now expect sites like Twitter or Google to police everything – hate speech, election interference or propaganda by foreign adversaries, vaccine and COVID-19 conspiracy theories, or pretty much anything at least some of us don’t agree with.

These heightened expectations have also brought an outcry for heightened accountability. Conservatives complain that content moderation practices are discriminatory; liberals complain that they don’t go far enough to address misinformation or hate speech. Family members of terror victims have sought to hold platforms like Facebook liable for terrorists’ use of their services (one such lawsuit was barred by CDA 230). And most recently, the outrage over online sex trafficking via Backpage.com led Congress in 2018 to amend CDA 230 to enable state civil and criminal sex trafficking actions against interactive computer services.

Does content moderation make an Internet company a “publisher”?

Remember, CDA 230 provides immunity to interactive computer services so long as they don’t function as a publisher. So while they can remove or restrict access to content under the “Good Samaritan” clause, if they “edit” or otherwise control the third-party content they host, they run the risk of being a publisher. In 2008, the website Roommates.com was found to be acting as a publisher, and not merely a content host, because the online questionnaire it required users to answer shaped the content posted to its site.

Online companies must walk a very fine line on whether and to what extent they edit or alter third-party content if they want to enjoy the protections of CDA 230. Removing or restricting access to content is arguably the easier course, since the “Good Samaritan” clause applies broadly to “objectionable” material.

So what about those tweets?

Ah yes, Twittergate! Last week, Twitter applied a fact-check label to two of President Trump’s tweets alleging mail-in ballot fraud. These labels did not edit, restrict, or obstruct the tweets. Users could click the link to access content that refuted Trump’s claims. It’s unlikely, then, that Twitter’s actions rose to the level of “publisher,” but it’s also unlikely that they fell under the “Good Samaritan” protections. That’s not to say that Twitter is therefore liable for its labeling. CDA 230 doesn’t create liability; that is, it doesn’t define a civil tort or criminal law under which companies like Twitter can be sued or prosecuted. Rather, it limits liability under other laws. So, President Trump or anyone else who wishes to sue Twitter for its content moderation practices (or lack thereof) must first convince a court that Twitter was acting outside of its CDA 230 immunity and then must prove that Twitter violated an applicable civil law.

Following President Trump’s CDA 230 executive order, Twitter doubled down on its content moderation by obscuring, but not removing, a Trump tweet in response to the George Floyd protests that said “when the looting starts, the shooting starts.” Again, this is not likely publishing behavior, but it could be “Good Samaritan” behavior if it meets the standard of “restricting” access to “objectionable” or perhaps “excessively violent” material.

What does Trump’s executive order do?

It’s not clear yet exactly what the executive order will do to CDA 230. Because CDA 230 is a statute that limits liability, its interpretation has fallen to the courts over the last two-and-a-half decades. This isn’t extraordinary. A number of other federal statutes that either assign or limit liability are subject not to federal regulation but to court interpretation. With this executive order, President Trump is attempting to introduce federal rulemaking into the mix. It instructs the Commerce Department to petition the FCC for a CDA 230 rulemaking, directs the FTC to address “deceptive or unfair practices,” directs federal agencies to assess and report on their online ad spending, and instructs the Attorney General to convene a working group to develop model state legislation addressing deceptive or unfair practices.

What’s next?

This week, the Center for Democracy & Technology, an advocacy group funded by technology companies, filed suit challenging the executive order on First Amendment grounds, arguing that it will chill protected speech.

Congress is also ramping up its CDA 230 rhetoric and adding more bills to the hopper to amend the law. But none are likely to be taken up before the FCC petition is filed. It’s also unclear whether and how the FCC will respond.

Caroline Lynch