Avi Tuschman, "Want to solve the misinformation crisis? We already have a proven solution at our fingertips," Fast Company, 4.8.21:
The most dystopian feature of our time is not that we face formidable challenges; in a different era we might have had enough shared beliefs to navigate a bloodless presidential transition, vaccine hesitancy, racial tension, and even climate change. Today, however, our body politic suffers from an informational infection that hinders our ability to adequately respond to these serious threats.
What do we do? We need a third entity:
How can we improve our information systems to save lives? As NYC Media Lab’s Steve Rosenbaum has pointed out, neither the tech platforms nor governments can be trusted fully to regulate the internet, “So it’s like we want this magical entity that isn’t the government, that isn’t Facebook or YouTube or Twitter.”
Rosenbaum is absolutely correct: solving the misinformation crisis requires a “magical” third entity that lacks any incentive to manipulate information for economic or political ends. However, it is the tech companies that could build a sufficiently fast and scalable system for distinguishing facts from falsehoods. Such a solution is not only theoretical—its key components are already well developed and proven.
For example, Wikipedia has become a trusted source of reliable information.
Social media platforms can leverage Wikipedia’s strengths to reduce their own weaknesses. They must provide an open-source fact-checking space in their content-moderation systems. Doing so is not only an ethical responsibility; it’s also a smart move to get ahead of the regulatory hammer.
Here’s how it could work: A tiny percentage of social media content contains viral misinformation deleterious to public health. Tech companies should start by implementing policies to make such content eligible for open-source fact-checking. The platforms could use several mechanisms to pass suspect content to a distributed review process. Here, fact-checking users would utilize the same open-source software and mechanisms that have successfully evolved on Wikipedia to adjudicate verifiability. The “visible process” of fact-checking would occur on a MediaWiki, ideally governed by a multi-stakeholder organization. The facts themselves—the “ground truth”—would be English-language Wikipedia text, from articles that meet minimum authorship and editorship thresholds.
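The pipeline the article sketches has two gates: only viral, health-related content enters the distributed review process, and only sufficiently well-edited Wikipedia articles count as "ground truth." A minimal sketch of that routing logic, where the thresholds, the `Post`/`WikiArticle` fields, and the upstream health classifier are all hypothetical (the article names no specific numbers or data model):

```python
from dataclasses import dataclass

# Hypothetical thresholds -- the article only says "minimum authorship
# and editorship thresholds"; these numbers are illustrative.
MIN_AUTHORS = 10
MIN_EDITS = 100

@dataclass
class Post:
    text: str
    shares: int            # proxy for virality (assumed metric)
    health_related: bool   # output of an assumed upstream classifier

@dataclass
class WikiArticle:
    title: str
    num_authors: int
    num_edits: int

def eligible_for_review(post: Post, viral_threshold: int = 10_000) -> bool:
    """Gate 1: only viral, health-related content enters open-source review."""
    return post.health_related and post.shares >= viral_threshold

def usable_as_ground_truth(article: WikiArticle) -> bool:
    """Gate 2: only articles meeting authorship/editorship thresholds
    serve as the 'ground truth' for fact-checkers."""
    return article.num_authors >= MIN_AUTHORS and article.num_edits >= MIN_EDITS

def route(posts: list[Post], review_queue: list[Post]) -> list[Post]:
    """Pass suspect content along to the distributed review process."""
    for post in posts:
        if eligible_for_review(post):
            review_queue.append(post)
    return review_queue
```

The actual adjudication step, Wikipedia-style consensus editing on a MediaWiki, is a human process and is not modeled here; this only illustrates how a platform might decide what gets handed to that process.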
There’s more at the link.