Saturday, December 21, 2019

Section 230 of the Communications Decency Act is not primarily about decency at all; it's about user empowerment

Congress is currently debating whether, and if so how, to alter the provisions of Section 230 of the Communications Decency Act. Why? Because of "failures of companies to block harmful content and allegations that social media services block certain political viewpoints," writes Jeff Kosseff in Lawfare: What’s in a Name? Quite a Bit, If You’re Talking About Section 230.
Misunderstandings of Section 230’s history already have framed the current debate, including claims that Section 230 applies only to “neutral platforms” and assumptions that Congress passed the statute to censor speech through private companies. In reality, Congress passed Section 230 so that platforms could choose not to be neutral and to moderate content based on the demands of their users (rather than regulators or judges). I spent more than two years researching and writing a book about Section 230, and I found that Congress passed Section 230 to empower consumers and platforms—rather than the government—to develop the rules of the road for the nascent commercial internet.

Members of Congress introduced Section 230 to address an illogical gap in the law that applied to distributors of content created by others. Under a First Amendment and common law rule, which first emerged in obscenity prosecutions of bookstore owners in the 1950s and 1960s, a distributor of someone else’s speech is liable for that speech only if the distributor knew or had reason to know of the particular illegal content. This rule, for instance, prevented newsstands from having to prescreen every newspaper and magazine they sold for fear of obscenity prosecutions and defamation lawsuits.

This common law protection worked pretty well until the early 1990s, when online services emerged, carrying far more content than the average bookstore or newsstand. These early platforms had different approaches for moderating third-party content on their bulletin boards and chat rooms. CompuServe took a relatively hands-off approach, while Prodigy billed itself as a family-friendly alternative, establishing intricate user content policies, hiring moderators to police its sites and using automated software to screen for offensive content.

Yet both CompuServe and Prodigy were sued for defamation arising from third-party content posted on their services. CompuServe convinced a New York federal judge to dismiss the case. The judge concluded that CompuServe was nothing more than an electronic version of a bookstore. Applying the common law rule for distributors, the judge ruled that CompuServe could not be liable because it did not know and had no reason to know of the alleged defamation.

Prodigy, however, was not so fortunate. Because the platform chose to moderate user content, a New York state court judge ruled in 1995, it did not receive the protections afforded to distributors. Rather, the judge held that Prodigy was a publisher and thus faced the same potential liability as the author of the posts.

Taken together, these court rulings meant that a platform might reduce its potential liability by taking a hands-off approach, as CompuServe did.

The story has the sort of complication that resists easy summary. The provisions of Section 230 were tacked onto another bill, the Communications Decency Act, whose core anti-indecency provisions the Supreme Court later struck down in Reno v. ACLU (1997). The upshot:
Herein lies the source of the confusion surrounding the name used to refer to Section 230. Because Cox and Wyden’s legislation was placed in the same title of the Telecommunications Act of 1996 as Exon’s amendment [which was struck down], courts soon began referring to it as “Section 230 of the Communications Decency Act,” though it technically should be called “Section 230 of the Communications Act of 1934, as amended” or “Section 509 of the Telecommunications Act of 1996.” But the best descriptor of its two primary goals remains its initial title: the Internet Freedom and Family Empowerment Act.
And so we arrive at the heart of it:
It is true that Section 230 provides platforms with the breathing room to moderate (or not moderate) third-party content. [...] But it is absolutely critical to remember that both immunity provisions of Section 230 explicitly apply not only to providers of interactive computer services but also to users. This was not a typo; it reflects the very clear preference of the drafters: user empowerment.

User empowerment recognizes that some platforms may moderate more than others and users will decide which to gravitate toward. This framing ultimately favors the free market over government regulation. The 1995 Center for Democracy and Technology report that provided the foundations for Section 230 emphasized the need to allow diverse approaches: “To be sure, some system operators will want to offer services that prescreen content. However, if all systems were forced to do so, the usefulness of digital media as communication and information dissemination systems would be drastically limited. Where possible, we must avoid legal structures that force those who merely carry messages to screen their content.”
