As I’ve waded into the world of Facebook comments, I have noticed that many people trying to make sense of the pandemic (myself included) are running into a number of logical fallacies. As you listen to other people’s views and form your own, try to keep these fallacies in mind to avoid common but unhelpful missteps in human reasoning.
First, what are logical fallacies? My good friend Dr. Gretchen Ellefson (Assistant Professor of Philosophy) just taught a class on critical thinking that included this very topic! Here is her basic intro to fallacies:
First, at a basic level, fallacies are flaws in reasoning or faulty forms of reasoning. What we mean is that people have arguments in mind when they form beliefs: they have reasons supporting a given conclusion. An argument contains a fallacy when the reasons fail to do their job; they don’t actually support the conclusion.
The second thing is that fallacies are super normal and common and everyone uses them. The reason we want to talk about them is because recognizing where fallacies come in can help us do better at making sure we have good reasons for what we think is true. This means that when we see people using fallacies, we should recognize that this is a normal human thing, not a sign of stupidity.
That said, fallacies do tend to track forms of reasoning that human brains, for whatever reason, find compelling, even though they shouldn’t. This means that fallacies CAN be used in really manipulative ways. If someone wants to convince their audience of something, and they know that there aren’t good reasons for their audience to accept that thing, then they may look for ways to use these patterns of reasoning, which people will be less likely to immediately recognize as flawed.
Fallacy 1: Some people who have stayed home from work ended up getting COVID-19, therefore staying home doesn’t help prevent the spread of COVID-19.
This is a false dilemma fallacy — where the arguer tries to make it seem there are only two options (staying home protects 100% of people or it doesn’t protect at all), when in fact other explanations exist. In this case, an alternative explanation exists: staying home helps slow the spread of COVID-19 so that fewer people get sick, but it is not a guarantee that no one will get sick. (This also has elements of the false cause fallacy).
Fallacy 2: I have been around lots of people and I haven’t gotten sick, therefore social distancing isn’t really needed.
This is the anecdotal fallacy, which assumes that one person’s experience must be reflective of everyone in the population. A similar argument could be made for drunk driving; someone could say that because they drove drunk last night and nobody got hurt, avoiding drunk driving is not necessary. Obviously that is not true — most would realize that this person was just lucky that they didn’t cause an accident. We have to look at the effect of both social distancing and drunk driving on the population, not just individual cases. Why? Because a lot of people who are exposed to the coronavirus do not end up getting symptoms, so if you are one of those lucky people, you may be around tons of people (and get exposed) and never get sick. Or maybe you have been lucky and just haven’t been exposed yet, even if you’ve been around lots of people. This does not mean that we can assume that will be the case for everybody. (This also has elements of the false cause fallacy.)
Fallacy 3: If someone is concerned about the economy, they must not be taking the pandemic seriously. And conversely, if someone is advocating for social distancing, they must not care about the economy.
These are also false dilemma fallacies, where issues are falsely divided into simple boxes creating an either/or scenario, when reality is in fact much more complex.
Fallacy 4: There are videos of Dr. Fauci and Bill Gates discussing the likelihood of a pandemic several years ago; these videos are evidence that these men were involved in planning the pandemic.
This is a false cause fallacy (also perhaps a post hoc fallacy), which assumes that if two events are associated, one must be the cause of the other. A similar but more obvious fallacy would be an oncologist giving a cancer prognosis that turns out to be accurate, and then suspecting them of murder because they “knew” when the patient was going to die. Or accusing meteorologists of manipulating hurricanes because they accurately predicted that one would hit a major city. Science can be used to predict lots of things (with varying degrees of certainty), including the possibility of a pandemic.
Fallacy 5: If the government forces us to wear masks for the sake of public health, they will soon encroach on our liberty in more extreme ways.
This is the slippery slope fallacy, which avoids discussing the issue at hand (mandatory masks) by distracting with a far more extreme claim, assuming the two must be connected without providing any evidence of a connection.
Fallacy 6: Don’t worry about the civil liberty implications of shut downs, think about all the people who are suffering from the virus right now!
This is an appeal to emotion fallacy, where instead of providing a logical argument, the arguer attempts to manipulate the person’s emotions.
Fallacy 7: The vast majority of people support social distancing, therefore it’s the right thing to do.
This is the bandwagon fallacy, which asserts something is true because lots of people think it’s true. This is, of course, not how reality works; there have been many examples where popular opinion was flat out wrong.
Fallacy 8: Because the last argument (Fallacy 7) was a fallacy, social distancing is not the right thing to do.
This is the fallacy fallacy, which asserts that any conclusion based on a fallacious argument must be wrong. This isn’t necessarily the case; it just means that the argument given didn’t pull from valid evidence or reasoning, but valid arguments for the same conclusion may still exist.
Fallacy 9: Getting immunity naturally is better than getting a vaccine because vaccines are artificial.
This is the appeal to nature fallacy, which asserts that because many good things come from nature, anything from nature must be better than anything created by humans. This is of course untrue; there are many bad things that come from nature (like hurricanes and viruses) and many good things that are man-made (like breakfast tacos).
Fallacy 10: The person with higher scientific credentials must be right.
This is the appeal to authority fallacy, which asserts that authority figures (experts) are always right. But of course, this is not always true; reality does not bend to the will or whims of experts.

Right now I think this one is especially confusing, because the scientific community says “trust the experts!” but then when an expert says something a bit weird, they say “ignore them!” The rule of science is this: the best data and the best analysis win. It does not matter who is saying it; the data and analysis are all that really matter. Often, scientists with more credentials are better at recognizing good data and analysis, so that is why it usually makes sense to “trust the experts!” But sometimes, an expert or two might latch onto bad data for a variety of reasons (they got overly attached to their hypothesis and want to be right, they have a financial conflict of interest, the science conflicts with their ideological views, they need an audience, they are speaking outside their area of expertise). In these cases, it is best to ignore them.

So if you don’t know how to judge the quality of the data and analysis for yourself, how do you tell the difference? My best advice is to look at what the majority of scientists are saying — there are very few things that scientists dislike more than bad science (hence the existence of this blog), and we will call it out, no matter how many degrees the person promoting it may have. The consensus of the majority of scientists (who have expertise on the question at hand) will usually* lead you to the right answer.
*There are a few cases in history where the majority of scientists were wrong and the lone “crazy one” was right. Here is one of my favorite examples.