
Dichotomy Is False

A false dichotomy is a logical fallacy: you are presented with alternatives A and B as if they were the only options, when at least one other alternative C is also available. You may then be presented with arguments that force you to choose either A or B, and to reach some conclusion as a consequence. The danger lurking here is that unless you actively seek out additional alternatives, you may not notice that you're falling into a trap.

In classical logic, the principle of explosion says that anything may be proven from a contradiction. In Latin, this idea goes by the phrase 'ex falso [sequitur] quodlibet': from falsehood, anything [follows]. 'If wishes were horses, beggars would ride' is, in this sense, a true statement: since wishes are not horses, the conditional holds vacuously. And so, when you start with a false dichotomy, the conclusion you arrive at doesn't carry the force of logic. It may happen to be true or it may be false; the argument itself is null and void.
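The principle of explosion can be stated as a one-line theorem in a proof assistant. Here is a sketch in Lean 4 (the theorem name is my own choice for illustration):

```lean
-- Ex falso quodlibet: from a proof of False, any proposition P follows.
theorem ex_falso_quodlibet (P : Prop) (h : False) : P :=
  h.elim
```

The proof is just `False.elim`: since `False` has no proofs, a hypothesis `h : False` can be eliminated into any proposition whatsoever.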

In my experience, all dichotomies are false in the real world. In other words, there are always more options; you only have to look a little deeper. Often, the options presented are the extremities of reasonable ideas that actually span a spectrum of possibilities.

Checking Facts

Society is having a hard time fact-checking politicians these days, although the roots of the problem are quite ancient. There is a natural human tendency to assign categories to things. It is very easy and very human to categorize knowledge into things you believe are true versus things you believe are false. This true-or-false dichotomy, like all others in the real world, is false. To put it in more colloquial terms, there is always some truth to what you label as false (and vice versa).

Here's the catch: once you label something in your mind, you are psychologically less inclined to alter the label, even in light of new facts. The easiest way for someone to convince you of a falsehood is to club it together with something you already believe to be true. When you know a man to be virtuous, you're far more likely to attribute his sins to factors outside of his control. Over time, these falsehoods build upon each other to create a warped worldview.

“The first principle is that you must not fool yourself, and you are the easiest person to fool.”

—Richard Feynman

Every individual who wishes to avoid this sad fate should relentlessly pursue the truth, using the principle of falsifiability (or refutability) expounded by Karl Popper back in 1934. Falsifiability is the capacity for a statement or hypothesis to be contradicted by evidence. Here's one way of applying this idea: believe something to be true only if you've made a good-faith attempt to prove that it is false, and failed despite your best efforts. This is exemplified in the 'black swan' argument: if someone told you that all swans are white, you should look for non-white swans, not white ones, to test the statement. Look for evidence that disconfirms your belief.
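The black-swan procedure above can be sketched in a few lines of Python. The function name and the swan data here are my own, invented purely for illustration:

```python
def falsification_test(hypothesis, observations):
    """Popperian check: search for counterexamples, not confirmations.

    hypothesis: a predicate expected to hold for every observation.
    Returns the observations that refute it (empty list = survives, for now).
    """
    return [obs for obs in observations if not hypothesis(obs)]

# Hypothetical field data for the claim 'all swans are white'.
swans = ["white", "white", "black", "white"]
refutations = falsification_test(lambda color: color == "white", swans)

# A single black swan is enough to falsify the universal claim;
# a thousand white ones could never prove it.
print(refutations)  # ['black']
```

Note the asymmetry the code makes explicit: the test only ever collects disconfirming evidence, so an empty result means the hypothesis has survived scrutiny so far, not that it has been proven.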


In everyday situations, you might be faced with ideas that conflict with each other, some that you are more inclined to believe than others. As a relentless pursuer of truth, your objective should be to identify the most believable ideas, and look for evidence that these ideas are wrong. By critiquing the ideas you wish to believe, you make them more robust, understand the nuances and half-truths, and end up with something that is far closer to the truth than where you started. The truth will not make you feel warm and fuzzy, but it will be grounded in reality.

Avoiding Extremes

“-Ism’s in my opinion are not good. A person should not believe in an -ism, he should believe in himself.”

—Ferris Bueller, in ‘Ferris Bueller’s Day Off’

There are many forces in the world that polarize opinion, creating more of the same false dichotomies that we want to avoid in social discourse. Social media platforms have made it easy to reach millions of people and convey information — and misinformation — in real time, reinforcing the beliefs of opposing camps.

For instance, we hear debates about capitalism versus socialism (sometimes incorrectly equated with communism). Both of these are unrealistic models, figments of our imagination. In practice, every sovereign nation has to adopt policies that may be inspired by either model, or maybe something else altogether.

The real debates usually need to be on the merits and demerits of specific policies. People with ulterior motives are inclined to assign labels to these policies and associate them with existing factions. For instance, in a debate on government-sponsored healthcare, one side may be painted as being ‘socialist’, even though the actual debate is far more nuanced. Interestingly, it is easy to recognize this kind of labeling when it works against us, but when it works in our favor, we might be quick to assume it’s because we’re right.

That is the most dangerous time of all — when we are starting to believe we are right, but haven’t yet looked for evidence countering our belief.

2 thoughts on “Dichotomy Is False”

  1. This is a masterly analysis of what is going wrong in our society. People prefer to take positions and find arguments and “facts” that support them, rather than look for the truth of each matter and change their positions accordingly.

This is either due to a disinclination to devote their time and energy to digging up the truth, or because they prefer to think they belong to a particular ideology and follow their leaders blindly. In my opinion, ideologies are an outdated concept, and we should prefer to apply our minds without bias, refusing to let our thoughts be shaped by self-proclaimed leaders.

    1. This New Yorker article from 2017, Why Facts Don’t Change Our Minds, provides a thought-provoking overview of studies on confirmation bias — why we’re more inclined to be influenced by evidence that supports our own points of view. Research suggests that people experience pleasure (a rush of dopamine) when processing information that confirms their beliefs.

      The main takeaway is that we have evolved to be irrational in this way because it encourages cooperation amongst humans, thus helping survival of the species. At least in this particular dimension, natural selection seems to have favored ‘winning the argument’ over reasoning clearly about reality.

      The article also discusses a very interesting effect called illusion of explanatory depth. According to this effect, people believe they know more than they actually do. They are able to persist in this belief because of society — they get confirmation from others in their group, at which point they become even more entrenched in it.
