
Tuesday, November 20, 2018

How is Facebook going to regulate hate speech on its platform? An independent Supreme Court of Facebook?


Mark Zuckerberg has suggested that an independent oversight body should "determine the boundaries of acceptable speech on the platform". This raises a host of issues, most centrally:
What standards, past decisions and values will it consider when evaluating, for example, whether a particular post is “hate speech”?

This is not an easy question. Indeed, the difficulty of answering it seems to be one of the reasons Zuckerberg wanted such an independent body in the first place. In March 2018, Zuckerberg told Recode, “I feel fundamentally uncomfortable sitting here in California at an office, making content policy decisions for people around the world. … [T]hings like where is the line on hate speech? I mean, who chose me to be the person that [decides]?” No doubt his unease was only deepened when he sparked controversy by suggesting in a later interview that he didn’t think Holocaust deniers should be removed from Facebook—a perfect example of the difficulty Facebook faces. The U.S. has a famously expansive interpretation of free speech, and the court ruling that the First Amendment protected the right of Nazis to march in Skokie is remembered as one of the “truly great victories” in American legal history. By contrast, Holocaust denial is a crime in Germany. Putting aside the wisdom of either position, how should Facebook—a global platform connecting over two billion monthly users—respect conflicting standards of free speech, of which Holocaust denial is only one example?

Unfortunately, Zuckerberg’s Nov. 15 Facebook post suggests he hasn’t given this issue enough attention. The post itself offers several, sometimes contradictory, answers. When he writes of the forthcoming independent body, he asks, “How do we ensure their independence from Facebook, but also their commitment to the principles they must uphold?”—implying that the values in question are Facebook’s. These are embodied in the company’s Community Standards—which, along with its internal guidelines, are the rules that determine what content is allowed on the platform and which the 30,000 content reviewers use to make individual calls. Given that these are the rules the first-instance decision-makers will be applying, it makes sense for the tribunal to be guided by them as well. This is consistent with Facebook’s goal that the standards “apply around the world to all types of content.”

But in his post, Zuckerberg also notes that “services must respect local content laws.” So will the Supreme Court of Facebook be charged with interpreting those local laws? In deciding whether a post was justifiably taken down, will it interpret Thailand’s lèse-majesté laws prohibiting criticism of the Thai monarchy? Will it try to interpret the sometimes differing decisions of German regional courts on what constitutes hate speech under German law?

Zuckerberg also suggests that “it's important for society to agree on how to reduce [harmful content] to a minimum—and where the lines should be drawn between free expression and safety.” If society is to decide where the lines on free expression fall, how will the independent body determine what society’s views are? Will it take polls? If so, will those polls be national, regional or global? Will Facebook take national voting ages into consideration? And doesn’t leaving these decisions to “society” risk undermining the protection of minorities?

These options by no means exhaust the possibilities raised by Zuckerberg’s proposal.
And:
Though Zuckerberg appears to be seriously pursuing the idea, his current conception of the independent body is more soundbite than substance. When he says that the SCOF [Supreme Court of Facebook] will “ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world,” he sets an impossible goal. There is no homogeneous global community whose norms can be reflected in the decisions of a single body deciding contentious issues. But that doesn’t mean the proposed body cannot be an important development in online governance: it could create a venue for appeal and redress, transparency and dialogue, through which the idea of free speech in the online global community develops a greater substantive meaning than simply “whatever the platform says it is.”

How the independent body is set up will determine whether it furthers or hinders rights to freedom of expression and due process. There is a rich literature in comparative law showing that decisions of institutional design can have significant impacts not only on outcomes but also on the stability and legitimacy of an entire governance structure.

1 comment:

  1. Look at the movie ratings systems to get a preview of effectiveness?
