Uncommon Common Sense
By Bill Frayer
Being wrong is easy. Realizing and admitting we are wrong is more difficult. Yet, I cannot help but wonder how much better the world might be if we had a more refined ability to admit error.
I recently read The Undoing Project by Michael Lewis, about the genius partnership of Amos Tversky and Daniel Kahneman, who revolutionized standard thinking in economics. Most economists had assumed that people made decisions rationally. Tversky and Kahneman demonstrated that many of our decisions are, in fact, based on irrational paradigms. Kahneman, who received the Nobel Prize in Economics in 2002, later wrote the bestselling book Thinking, Fast and Slow, in which he outlines much of the work he did with Tversky for a lay audience.
Tversky and Kahneman determined that we rely on heuristics that tend to reinforce our preconceived notions about reality. Rather than dispassionately examining the evidence before making a rational decision, we often decide based on fear or other emotions. As a result, we are wrong much of the time, even though we are usually confident we are right.
One of the most common erroneous tendencies is confirmation bias, the “tendency to search for, interpret, favor, and recall information in a way that confirms our preexisting beliefs or hypotheses.” (Wikipedia) Scientists can do this when interpreting experimental results. If they are looking to prove their hypothesis, they unconsciously tend to filter out data which does not confirm it. Physicians can do this when a patient’s symptoms seem to fit a predictable pattern; they can ignore evidence which points in another direction. We may all do this when we consume news. First, we tend to get our news from sources which are already biased in our direction. Second, when presented with information which does not conform to our preconceived beliefs, we tend to reject it.
Case in point: In the 2016 US Presidential Election, many on the left refused to believe that Trump could win, despite his drawing huge crowds at his rallies and the persistent unpopularity of Hillary Clinton. Another example: Those who supported the 2003 US invasion of Iraq, as well as many competent journalists, believed that Saddam Hussein possessed weapons of mass destruction, despite conflicting evidence.
In our private lives, we jump to conclusions, react emotionally, and depend too much on what we want to be true. We are frequently wrong. The problem is not being wrong. We are fallible. We will inevitably be wrong sometimes. But believing we are NOT wrong is where the difficulty lies.
In my critical thinking class, I would pose a hypothetical situation to the class. Would you prefer to undergo surgery by a surgeon who is able to recognize and admit when he or she is wrong? Of course, we wouldn’t want our surgeon to blindly push ahead, unaware of having misjudged a medical situation.
The ability to admit we are wrong, that we do not have all the answers, is a form of intellectual humility. Good thinkers understand that they may not always have sufficient evidence and that they will tend to favor evidence which conforms to their preexisting beliefs. I would ask my students whether, when entering into an argument with a friend or loved one, they would be able to admit that they might not have all the information, that they might be wrong. Most would resist this idea. But think about it. If we do not, or cannot, consider the possibility that we might be wrong, what’s the purpose of the discussion?
Food for thought, especially these days.