How big a threat does misinformation pose to democracy?


“Epistemic” is a good five-dollar word. It means, roughly, “of or relating to knowledge or knowing.” (Think epistemology, fellow liberal-arts graduates.)

The first time I remember encountering it in mainstream usage was during the early days of the Obama administration, when some of the intellectual bonds holding the Republican Party together were beginning to fracture. For those conservatives skeptical of the party’s growing Tea Party/talk radio/Fox News wing, a key phrase was “epistemic closure” — the idea that some of their fellow partisans had shut themselves off from the reality-based world. From The New York Times in 2010:

The phrase is being used as shorthand by some prominent conservatives for a kind of closed-mindedness in the movement, a development they see as debasing modern conservatism’s proud intellectual history. First used in this context by Julian Sanchez of the libertarian Cato Institute, the phrase “epistemic closure” has been ricocheting among conservative publications and blogs as a high-toned abbreviation for ideological intolerance and misinformation.

Conservative media, Mr. Sanchez wrote at juliansanchez.com — referring to outlets like Fox News and National Review and to talk-show stars like Rush Limbaugh, Mark R. Levin and Glenn Beck — have “become worryingly untethered from reality as the impetus to satisfy the demand for red meat overtakes any motivation to report accurately.” (Mr. Sanchez said he probably fished “epistemic closure” out of his subconscious from an undergraduate course in philosophy, where it has a technical meaning in the realm of logic.)

As a result, he complained, many conservatives have developed a distorted sense of priorities and a tendency to engage in fantasy, like the belief that President Obama was not born in the United States or that the health care bill proposed establishing “death panels.”

Soon conservatives across the board jumped into the debate. Jim Manzi, a contributing editor at National Review, wrote that Mr. Levin’s best seller, “Liberty and Tyranny: A Conservative Manifesto” (Threshold Editions) was “awful,” and called the section on global warming a case for “willful ignorance,” and “an almost perfect example of epistemic closure.” Megan McArdle, an editor at The Atlantic, conceded that “conservatives are often voluntarily putting themselves in the same cocoon.”

Liberals, of course, were then happy to wield the phrase in turn — a rhetorical update on “reality-based community.” Thankfully, soon after that, everyone agreed that fantastical beliefs based on misinformation were bad, and politics got normal again. (Wait, what’s that? You’re telling me Donald Trump then became president?)

Anyway, the last decade or so has been a boom time for all things epistemic, as the narratives both journalists and citizens told themselves about the role of knowledge in political decision-making got…complicated. The internet is, it turns out, a powerful engine for the creation of mistrust and a rich source of raw materials for false beliefs.

Which brings me to a new paper from a set of heavy hitters in academic misinformation research. It’s titled “Misinformation and the Epistemic Integrity of Democracy,” and it’s by Stephan Lewandowsky, Ullrich K. H. Ecker, John Cook, Sander van der Linden, Jon Roozenbeek, and Naomi Oreskes — variously of Harvard, Cambridge, and the universities of Bristol, Potsdam, Melbourne, and Western Australia. (It’s a preprint set to run in Current Opinion in Psychology.) It gets at a growing problem in these epistemic battles: rhetorical attacks on those who attempt to referee them. Here’s the abstract:

Democracy relies on a shared body of knowledge among citizens — for example, trust in elections and reliable knowledge to inform policy-relevant debate. We review the evidence for widespread disinformation campaigns that are undermining this shared knowledge. We establish a common pattern by which science and scientists are discredited and how the most recent frontier in those attacks involves researchers in misinformation itself. We list several ways in which psychology can contribute to countermeasures.

The authors introduce a concept I hadn’t encountered before: an “epistemic theory of democracy.” What makes a democratic form of government legitimate? One can certainly make an argument from fairness or civil rights — that every human deserves a voice in the state. Epistemic democrats add another argument: Democracy is legitimate because it works. The “wisdom of the crowd,” in their view, is a real thing, and decisions made at some level by a diverse, heterogeneous group of citizens are likely to be better than those made by, say, a dictator or a theocrat.

The truth of this theory is a matter of real debate. But there’s broad agreement that, if the crowd is to be wise, it needs to have access to accurate information. Like, facts and stuff. And that’s what concerns Lewandowsky et al.:

This concern is particularly acute when decisions require consideration of scientific evidence, such as in public health or regarding climate change. The ongoing organized dissemination of misinformation about scientific issues thus arguably undermines democracy in much the same way as a “big lie” about an election, albeit in a more indirect manner. We consider two domains, climate change and the COVID-19 pandemic, in which misinformation has played a crucial, and adverse, role.

The authors outline some recent history that will be familiar to most: well-funded climate change deniers, antivax crankery, 5G conspiracy theories, #StopTheSteal, and the like. They note that, in many cases, the ire of the (shall we say) epistemically closed is directed at specific individuals — often scientists and researchers — who are cast as drivers of the conspiracy:

In the case of scientists, personal attacks range from abusive emails to threats of physical harm or harassment through frivolous freedom-of-information requests. Hate mail, such as accusations of “mass murder” directed at climate scientists, tends to peak after the posting of scientists’ email addresses on websites run by political operatives. Those public attacks are often paralleled by complaints to scientists’ host institutions with allegations of research misconduct. In the case of tobacco research, there is evidence that complaints about academics are not random but organized by or on behalf of the tobacco industry.

Contrarian efforts have also focused on quote-mining scientists’ emails to construct conspiratorial narratives about alleged malfeasance, for example during the scandal arising from the release of stolen emails between climate scientists in 2009. The response to the COVID-19 pandemic similarly involved increasingly personalized attacks on public-health officials, such as Anthony Fauci, who was chief medical adviser to the president during the pandemic and became a central figure in the far-right imaginary.

The latest to face these sorts of attacks are misinformation researchers themselves. Ohio Republican Jim Jordan — last seen nearly becoming Speaker of the House — has spent substantial time and resources investigating the investigators, issuing sweeping subpoenas and painting them as part of a censorship conspiracy. When research by the Center for Countering Digital Hate found a spike in hate speech on Twitter following Elon Musk’s acquisition, Musk sued the group. The threats are working, at least to a degree: “The campaign led by Jordan has caused several prominent researchers in the field to curtail public engagements and has had a chilling effect overall on research on misinformation, at a time when the U.S. is preparing for another bruising presidential election.” (Though, thankfully, many are resisting.)

While Lewandowsky et al. diagnose the problem (and its threat to any epistemic theories of democracy), their suggested solutions are uninspiring — EU-style mandated platform reports and the sorts of small-scale behavioral interventions (media literacy tips! “nudges”!) that seem unlikely to make much of a difference. The broader forces — the ones already leaving people “worryingly untethered from reality” more than a decade ago — have proved quite resistant to nudges.

Back then, Julian Sanchez defined “epistemic closure,” in part, as “an overbroad ideological justification for treating mainstream output as intrinsically suspect.” It’s fundamentally an in-group phenomenon. But attacking these external sources of knowledge — like public health officials, like climate scientists, like misinformation researchers — has an external impact that goes beyond any “closure.” As Lewandowsky et al. put it:

At the time of this writing, it is difficult to avoid the realization that one side of politics — mainly in the U.S. but also elsewhere — appears more threatened by research into misinformation than by the risks to democracy arising from misinformation itself.


