Misinformation Mistakes (guest post)


“The mistake involves grouping together into an all-or-nothing package entire sets of claims whose epistemological credentials are quite varied. It also often involves collapsing epistemic and moral concerns.”

In the following guest post, Eric Winsberg, Professor of Philosophy at the University of South Florida and British Academy Global Professor of History and Philosophy of Science at the University of Cambridge, discusses why he has become “extremely skeptical of recent philosophical work, both in social epistemology and in philosophy of science, around such topics as misinformation or fake news.”

This is the third in a series of weekly guest posts by different authors at Daily Nous this summer.

(Posts in this series will remain pinned to the top of the homepage for several days following initial publication.)


[Robert Longo, drawings from the series “Men in the Cities”]

Misinformation Mistakes
by Eric Winsberg

I recently reviewed a book by Paul Thagard called Falsehoods Fly: Why Misinformation Spreads and How to Stop It for the Journal of Value Inquiry. (It got some uptake here.) The main reason that I was eager to review this particular book is that I have become extremely skeptical of recent philosophical work, both in social epistemology and in philosophy of science, around such topics as misinformation or fake news. If readers are interested in the topic, I urge them to go and read the review before they join the discussion of this topic below. This isn’t just because I want the clicks. It’s because to understand and appreciate the root of my skepticism requires working through some of the (admittedly) tedious scrutiny that I subject some of Thagard’s claims to. The book covers a number of topics, including the war in Ukraine, inequality, and political conspiracy theories like Pizzagate, but I focus primarily on the chapters on the Covid-19 pandemic and on climate change, both because those are areas about which I have some (admittedly limited) expertise, and because those two topics share a pair of features that I think aggravates a wound in the misinformation discourse in philosophy.

What pair of features do the Covid-19 pandemic and climate change share? First, both subject areas are founded on certain well-credentialed propositions that are nonetheless doubted by a highly vocal minority. Covid-19 was a real virus that killed millions of people. Carbon dioxide is a heat-trapping gas that has contributed to some of the observed warming of the planet over the last century and, if emissions are unabated, will continue to warm the planet into the future. Some of the people who doubt the truth of these propositions have bad intentions; some of them are simply confused.

If Thagard’s book is at all representative when it comes to claims about misinformation (and I think it is), people often want to do quite a bit more than condemn people’s denial of these epistemologically secure claims. This points us to the second feature that Covid-19 and climate change share. In both subject areas, people like Thagard want to condemn people for denying claims that are not at all epistemologically secure. These include claims, for example, to the effect that SARS-CoV-2 has a natural zoonotic origin; that all the Covid-19 vaccines stopped the spread of the virus, and provided overwhelmingly more benefit than risk to everyone; that the models that made projections of the likely impact of adopting, or failing to adopt, various non-pharmaceutical interventions (NPIs) always reflected the best available evidence; that climate change, if left unchecked, will kill a net 50,000,000 people from malnutrition, disease, and heat stress between the years 2030 and 2050.[1] My view is that all of these claims are uncertain and at least some will prove to be false. (If you are keeping score, the two I’m fairly confident are false are the one about the NPI models and the one that all the vaccines provided a net benefit to everyone, including previously infected, healthy children.)

In the review, I show that in discussing these topics, Thagard makes a very large number of factual mistakes. Indeed, Thagard often gives an argument that goes, with respect to a given claim C, “only sloppy, motivated reasoning could lead anyone to doubt C, because reasons x, y and z”, but where at least some of x, y and z are contradicted by matters of record. To give one example, Thagard claims that “[clinical] trials have found that vaccines, produced by Pfizer, Moderna, AstraZeneca, and other companies, are effective at preventing the spread and reducing the severity of COVID-19”. But while it’s obviously true that all of these trials showed that vaccines were effective at reducing the severity of Covid, not a single clinical trial ever showed that any Covid-19 vaccine was at all effective at preventing spread. Indeed, the balance of real-world evidence strongly suggests that they don’t. (If you doubt either of these claims, please see the review for the receipts.)

The reason that I tediously compared so many of Thagard’s claims to the best available evidence was to make a general point. And the general point is that a lot of the empirical claims that people fight over when it comes to policy decisions (like whether to enact mask mandates, impose lockdowns, mandate Covid-19 vaccines for children and young adults at university, or support various climate mitigation or adaptation measures) are actually hard to get to the bottom of, and, ultimately, uncertain. I worry that this is underappreciated by many of the people who work on misinformation, especially those (like Thagard) who think the solution to misinformation lies in censorship, regulation of social media companies, or even the application of consumer pressure on social media companies to do the regulation voluntarily.

I believe that many of the parties who contribute to this line of thinking are making a fundamental mistake. The mistake involves grouping together into an all-or-nothing package entire sets of claims whose epistemological credentials are quite varied. It also often involves collapsing epistemic and moral concerns. To give just one example, consider the claim that the Measles/Mumps/Rubella vaccine has not saved lives. We should all be extremely confident that this claim is false. And we should all be concerned that people who make this claim are guilty of moral wrongdoing. When these convictions kick in, there’s a natural moral-epistemic heuristic that leads many to assume that anyone who is even in the vicinity of these claims is equally factually wrong and equally morally reprehensible. Suddenly, the conviction isn’t just that MMR vaccines have saved lives, it’s that every single Covid-19 vaccine provides a net benefit to everyone, in any number of doses. Or, even, simply that there is incontrovertible evidence that climate change will take the form of a specific disaster by a specific time. We saw this over and over during the pandemic, where people equated skepticism about any particular Covid mitigation policy with “Covid denialism.”[2]

To some degree, I understand this moral-epistemic heuristic, and I’m sure that I sometimes employ it. But what I believe quite strongly is that it is not the job of philosophers, especially philosophers of science and social epistemologists, to try to find a philosophical foundation for this heuristic as if it were actually a secure way to reason. It’s not a good way to reason and there are no shortcuts for figuring out what policy-relevant facts are true by consulting the tribal affiliations of their exponents. And if Thagard’s book is any guide (and I think it is), this is exactly what much of the philosophical literature on misinformation is up to.

(Addendum: Justin specifically asked me to comment on the work of Cailin O’Connor and James Weatherall on misinformation, pointing me to this piece they wrote in the Boston Review. Justin noted that what they say about the pandemic is not that different than what I say. They wrote, for example, “This legitimate uncertainty means that pundits and journalists who treat claims supporting hydroxychloroquine [and other controversial claims during the pandemic] as akin to typical misinformation are misdiagnosing the situation.” So, one thing I want to clear up is that I did not say, above, that all of the philosophical literature on misinformation is up to this. Indeed, O’Connor and Weatherall never say anything about what “typical misinformation” is. And more importantly, they are not doing what Thagard and many other misinformationists are doing: purporting to provide simple tools for identifying misinformation and making recommendations for how it should be eliminated. It’s noteworthy that reviewers often comment that readers are likely to find this aspect of their book “disappointing”. I would actually argue that their work has little to do with “misinformation” as it is commonly understood, as it is instead simply a book about how falsehoods of any kind persist. But it is clear that Thagard doesn’t equate misinformation with falsehood. He thinks truths can be misinformation, and he thinks some misinformation is a particularly morally loaded kind of falsehood. And it is easy to see by perusing the philosophical literature on misinformation that most of the contributors think misinformation per se needs an analysis, and that this is a novel topic relative to the analysis of truth and falsehood that goes back at least to Aristotle.)

[1] See the review for evidence that Thagard condemns people either explicitly for denying these claims or for making value judgments that implicitly assume these claims are known with certainty. (For example, he condemns people for valuing corporate profits more than 50,000,000 lives, rather than considering the possibility that they consider the claim uncertain, or that they view the cost-benefit analysis in different terms.)

[2] Indeed even in the last few days, “Philosopher Jersey Flight” on Twitter/X said my review was “like the people trying to argue that Covid wasn’t real”.

35 Comments
Brad
3 days ago

a small correction to your review (linked above). Footnote 8 should be “Let me be clear …” (the “be” was dropped)
I am an editor … I could not help but help 😉

Marc Champagne
3 days ago

This is the best, most sensible post I have ever read on DailyNous. Thomas Sowell, btw, has been meticulously warning against exactly this sort of thing for decades… Good work Eric!

Eric
Reply to  Marc Champagne
2 days ago

Thanks!

Richard Hanley
3 days ago

Thanks, Eric. I found the linked review very informative (much like Peter Godfrey-Smith’s stuff on this issue). And I was with you until the penultimate sentence in the final paragraph:

Now, it’s possible to believe that the ends justified the means. Especially in the first case. Maybe labelling the Hunter Biden laptop story misinformation, keeping the New York Post story about it off of social media, and stopping Glenn Greenwald from writing about it in The Intercept, kept Donald Trump from being re-elected. Maybe this saved democracy. So maybe it was worth it. But if you think this, you should write a book entitled “Falsehoods Fly: Misinformation and how to use it for political advantage.” I would have declined to review such a book.

I am utterly opposed to the censorship you outline here. But I was surprised that you seemed so ready to equate “the ends” of the censors with the basest of motivations. I found this quite out of keeping with your overall message about epistemological responsibility.
RH

Whitney Schwab
Reply to  Richard Hanley
3 days ago

“But if you think this [viz. the substance of the previous sentence], you should write a book entitled “Falsehoods Fly: Misinformation and how to use it for political advantage.””

Unless I’m missing something, there’s no call for any censorship, but actually something close to the opposite (if there are opposites here).

Nicolas Delon
Reply to  Whitney Schwab
3 days ago

I took Richard to be referring to the calls to censor the Biden laptop story, which Eric reluctantly grants might have been justified on some view he rejects. Richard’s issue seems to be about Eric’s characterization of the ends—is it mere political advantage (securing victory for your own side) or is it something loftier (saving democracy)?

Eric
Reply to  Nicolas Delon
2 days ago

I thought I was clear enough that some of these people genuinely thought they were saving democracy.

Nicolas Delon
Reply to  Eric
2 days ago

That’s how I was reading you but also: why-not-both?.gif

Eric
Reply to  Nicolas Delon
2 days ago

Exactly

Richard Hanley
Reply to  Eric
2 days ago

Good to know. But then, I don’t see why they *should* write the book with the title suggested. Maybe they do better with a different title: “Falsehoods Fly: Misinformation and how to use it for Good” or some such. (Psychological report: for me, the piece ended on an unnecessarily cranky note.)

Eric
Reply to  Richard Hanley
1 day ago

See my reply to Kenny below

Richard Hanley
Reply to  Nicolas Delon
2 days ago

Thanks, Nicolas–that was exactly my worry.

Kenny Easwaran
Reply to  Richard Hanley
1 day ago

Are you saying that the desire to save democracy is “the basest of motivations”?

Eric
Reply to  Kenny Easwaran
1 day ago

I think he doesn’t like that I called it “political advantage”. But that was (I think–I probably don’t have first-person omniscience here) my way of being as open-ended as possible. I don’t know who’s trying to save democracy, who’s trying to get their party to win, who’s doing both, and who’s doing the latter while convincing themselves they are doing the former. It was meant as a catch-all phrase.

Richard Hanley
Reply to  Eric
1 day ago

Hey, Eric–kind of sorry I started this. Your explanation makes sense. But I wasn’t just trolling… I’m continually trying to figure out how best to present arguments to unreceptive audiences. My concern started with endnote 2 of your OP, where some person (philosopher?) seemed to completely misrepresent your review. I’m always flabbergasted at the uncharitable interpretations one finds on the internet (viz. twice on this thread about my posts!) and I think that in the case of “Jersey Flight” there might be motivated reasoning that combines with a negative reaction (like the one I myself was left with, despite my sympathy with your view) to produce a rejection of the argument given as itself motivated reasoning. So my reaction was a kind of “why give them the ammo?”

Anyway, I appreciate you fighting the good fight. We need it.
RH

Common Reader
3 days ago

Last year, Boston Review also published a highly critical review of recent misinformation research by philosopher Dan Williams. It is specifically a review of Sander van der Linden’s book Foolproof:

https://www.bostonreview.net/articles/the-fake-news-about-fake-news/

Eric
Reply to  Common Reader
2 days ago

Dan had a nice tweet thread where he said what he agreed with me about and what he didn’t.

krell_154
2 days ago

Great post.

The problem is that “misinformation” is sometimes nothing more than a censorship tool, which people whip out when they don’t want to debate the substance of some issue.

What better example than the origin of Covid? Strong evidence points to it being created in a lab, and the scientists who discussed it among themselves also suspected that it came from a lab – and yet, anyone who dared suggest that during 2020 and 2021 was labeled a conspiracy theorist.

This makes it extremely hard to efficiently label true misinformation as such.

Another Philosopher
Reply to  krell_154
1 day ago

I’m not sure that this is true—I’m pretty sure that most people who have looked into it still think that the lab theory is false.

But irrespective of the actual truth of that claim, most people who believed it *were* conspiracy theorists.

Why? Because they didn’t believe it because they looked at the evidence and concluded that it was the theory that best fit that evidence. Their reasoning was emotionally and politically motivated after all.

I’m not saying that the matter wasn’t handled badly, but let’s not pretend that people who took the opposite view from those discussed here were epistemically virtuous.

Eric
Reply to  Another Philosopher
1 day ago

Are you really a conspiracy “theorist” when the conspirators conspired over email and slack and we have detailed authenticated records of their communications?

Nicolas Delon
Reply to  Another Philosopher
1 day ago

“Why? Because they didn’t believe it because they looked at the evidence and concluded that it was the theory that best fit that evidence. Their reasoning was emotionally and politically motivated after all.”

If *that* is why they were conspiracy theorists then most people who believed the zoonotic hypothesis were also conspiracy theorists.

Another Philosopher
Reply to  Nicolas Delon
1 day ago

Obviously I wasn’t trying to define what it is to be a conspiracy theorist. My only point is that the dialectic is being set up in a very disingenuous way where one side is presented as conspiring elites and the other as plucky underdogs, seeing through all the nonsense.

It wasn’t like that at all.

By the way, I agree that many who believed the zoonotic hypothesis also did it for poor reasons.

Nicolas Delon
Reply to  Another Philosopher
1 day ago

Motte, bailey.

The dialectic of labeling people you disagree with conspiracy theorists while taking cover behind The Science ™, Experts, and what your favorite media sources say is epistemically vicious.

Another Philosopher
Reply to  Nicolas Delon
1 day ago

It is also disingenuous to frame what is happening in these cases as labelling people as conspiracy theorists *just* because they disagree with me.

Nicolas Delon
Reply to  Another Philosopher
1 day ago

I agree, I don’t know your reasons—you’ve already given different explanations of what makes them conspiracy theorists. But many people did do that just because of tribal disagreement.

Another Philosopher
Reply to  Nicolas Delon
1 day ago

I don’t think they did it just because of tribal disagreement—obviously that was part of it, but an important part of it was that they were not motivated by seeking the truth.

If you say that’s also true about those who deny the lab theory, there is still an asymmetry there: Namely there was no stipulation of a conspiracy to cover it up.

Nicolas Delon
Reply to  Another Philosopher
1 day ago

That, I agree, is a relevant difference and now we’re closer to a meaningful account of conspiracy theory. Problem is, of course, conspiracies do exist (even if that wasn’t the case here, I’m agnostic for present purposes). And there is a growing trend, which Eric rightly criticizes, of misapplying the label ‘conspiracy theory’ and ‘misinformation’ to vilify political opponents and suppress dissent. Contributing to this tendency is very bad IMO.

David Wallace
Reply to  Another Philosopher
15 hours ago

“If you say that’s also true about those who deny the lab theory, there is still an asymmetry there: Namely there was no stipulation of a conspiracy to cover it up.”

That seems to assume that there *wasn’t* a conspiracy to cover it up, which is (at the least) not obvious.

krell_154
Reply to  Another Philosopher
1 day ago

Here’s the thing: believing Covid started naturally because experts say so is rational, because following the advice of experts is usually reliable.

But, thinking that there is a good chance that a lab that studies coronaviruses, and enhances them for virulence and infectivity, has some relevance to the sudden appearance of a coronavirus that shows extreme infectivity without any trace of its recent predecessors is also rational, at least prima facie.

Yet people like Andersen, Fauci, Fouchier, Holmes and, above all, Daszak acted as if anyone who thought the latter was a conspiracy lunatic.

So both sides of this issue had some reason for their belief, and neither had sufficient reason to make their belief unproblematically rational. Yet the experts completely distorted that description and pushed the version of the origin that benefitted them.

Another Philosopher
Reply to  krell_154
1 day ago

How did it benefit them?

Eric
Reply to  Another Philosopher
1 day ago

We know they lied under oath, destroyed evidence after it was FOIAed, wrote scientific opinion pieces claiming to know with certainty that things were false which they privately said were “so friggin likely”, etc. Do I really owe a specific account of how this benefitted them?

krell_154
Reply to  Another Philosopher
21 hours ago

How does it benefit virologists that people believe something other than virological research is the cause of the worst pandemic in the last 100 years? Are we going to argue over that?

Eric
Reply to  JDRox
1 day ago

Here’s what I said about that on fb:

Yeah, there really ought to be a better word for this, but it’s definitely a thing. In fact it goes hand in hand with the version of misinformation I find pernicious because often the things MY are pointing at are actually reinforced by misinformation accusations. That’s one of the mechanisms by which we get people to believe false things: we label their denial “denialism” or “misinformation”.

Gordon
1 day ago

Two of the things that emerged during the early part of Covid (and I think they’re related) are the importance of good science communication and the difficulty the public has in assessing uncertainty. The dynamic seems to have been that public health advocates were overclaiming their evidence because they believed that if they presented the public with the actual levels of uncertainty behind their evidence, most people would abandon any effort to contain the virus, and that would be bad. I remember there was a vocal minority of public health people that kept pointing to the need for better science communication – empathetic, honest, and so forth (this was particularly around vaccine skepticism). So that’s one factor.

The more philosophically interesting one, at least to me, is that people (all of us) are really bad at dealing with epistemic uncertainty. People get that – and that’s why they “do their own research” so they feel more confident about their decisions. That of course turns out to be hard, and most people do it badly (really? microchips inside the vaccine needles?). This is a science literacy problem, as well as a science communication one. At the public policy level, which is really where a lot of this debate lies, we need second order strategies for dealing with scientific uncertainty. Some of that happens with peer review, scientific advisory panels, and so on. One of the things Covid showed isn’t just that those bodies aren’t always reliable, but that there’s a lot of uncertainty they don’t cover.

In any case, second-order strategies are almost inevitably going to be political (in the broad, not partisan sense: I mean normative, motivated by a vision of the way the world ought to be). It seems to me that they are up for legitimate debate. For example, lockdowns at the beginning of Covid struck me as a plausible way to try to buy some time until we knew what we were dealing with – but that they weren’t sustainable long-term (I said this publicly on NewAPPs). I defended that view with reference to the precautionary principle, which I interpreted to mean that it was ok to do something protective in the absence of scientific evidence (I know that’s not what the Cartagena Protocol or Rio Declaration say. I was adapting). That might or might not have been well-reasoned, but one of the things that the word “misinformation” tends to obscure is not just that we reason badly, but that we aren’t able to have decent conversations about what to do about that.

I’m not a social epistemologist, but that struck me as missing here.