The Value Of Cultivating Some Self-Doubt
Tips for how to check yourself before you wreck yourself.
While enjoying my morning coffee and googling to learn more about a random subject (Yuri Bezmenov, for those who might be interested), I came across an article that really resonated with me: Change your mind with these gateway drugs to intellectual humility by David McRaney at the site Big Think. Given its clickbaity title I wasn’t expecting much, but I was pleasantly surprised by what I read. The tl;dr at the start of the article sums things up really nicely:
There are more than 180 cognitive biases that affect our judgments and beliefs, one of the most pernicious being the notion that we understand a topic better than we actually do. We cannot escape our cognitive biases through education. In fact, the more intelligent you are and the more education you gain, the better you become at rationalizing and justifying your existing beliefs — even if they’re wrong. If you want to change your mind about all the things you are wrong about, you must first consider what might be motivating you to remain blissfully unaware of your wrongness.
– David McRaney
In the article McRaney, who recently wrote the book How Minds Change on this topic, sets up the stakes of why intellectual humility is so necessary, especially in this increasingly polarized media environment:
[C]ommon misconceptions and false beliefs… spread from conversation to conversation and survive from generation to generation to become anecdotal currency in our marketplace of ideas. They confirm our assumptions and validate our opinions and thus raise few skeptical alarms. They make sense, and they help us make sense of other things, and as Carl Jung once wrote, “The pendulum of the mind oscillates between sense and nonsense, not between right and wrong.”
– David McRaney
I think we can all get on board with that idea. Heck, even the great psychologist Carl Jung agrees. Please remind me to save that quote for future us….
Well, I used to believe [Jung] once wrote that. I’ve certainly shared that bit of wisdom many times thinking he had. But, while writing this paragraph I discovered he, in fact, did not. It turns out I was wrong about that as well.
– David McRaney
How easy it is to spread misinformation. One of the reasons for this, which McRaney details in the article, is the psychological phenomenon of motivated reasoning: that, in day-to-day operations, humans unconsciously interpret facts to support existing beliefs and minimize cognitive dissonance. Take this summary of one social psychology experiment from the article:
A great example of this comes from the work of psychologist Dan Kahan. He once brought together more than 1,000 subjects, asked them about their political dispositions, tested their math skills, and then presented them with a fake study into the effectiveness of a new rash-reducing skin cream. Their challenge? Determine if the cream worked after two weeks of use.
Subjects looked at a table full of numbers showing who got better and who got worse. The top row showed patients who used the cream, the bottom those who didn’t. The catch was that more people had used the cream than did not, so the number of people who got better after two weeks was higher in that group simply because there were more of them to count. They had made it easy to get the wrong answer if you just looked at the numbers and made a snap judgment. But if you knew how to calculate percentages, and took a moment to work out the math, you’d find that 75% of the cream group got better while 84% of the no cream group did. So, not only did the cream not work, it may have made the rash worse.
Unsurprisingly, the better people were at math, regardless of their political dispositions, the more likely they would take the extra step to calculate the percentages instead of going with their guts. If they took that step, it was less likely they would walk away with an incorrect belief. Kahan’s team then repeated the study so that the numbers showed the cream worked, and once again, the better people were at math, the more likely they arrived at the correct answer.
But here’s the twist. When researchers relabeled the exact same numbers as the results of a study into the effectiveness of gun control, the better some people were at math, the more likely they made mathematical mistakes. If the results showed gun control was effective, the more likely a conservative with good math skills would get the wrong answer; if the results showed gun control was ineffective, the more likely a liberal with good math skills would get the wrong answer.
– McRaney
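The arithmetic behind the trap is easy to verify yourself. Here is a quick sketch using hypothetical 2x2 counts chosen so that the improvement rates reproduce McRaney's quoted 75% (cream) and 84% (no cream) figures:

```python
# Hypothetical counts chosen to reproduce the 75% vs. 84% figures
# quoted in the article (not taken from Kahan's actual table).
cream_better, cream_worse = 223, 75
none_better, none_worse = 107, 21

# The snap judgment: more raw improvers in the cream group (223 > 107),
# simply because more people used the cream.
assert cream_better > none_better

# The extra step: compute the improvement *rates*, which flip the story.
cream_rate = cream_better / (cream_better + cream_worse)
none_rate = none_better / (none_better + none_worse)

print(f"cream: {cream_rate:.0%}, no cream: {none_rate:.0%}")
# prints "cream: 75%, no cream: 84%"
```

The design choice in the experiment is exactly this asymmetry: the raw counts and the rates point in opposite directions, so only subjects who take the extra step of normalizing by group size reach the correct conclusion.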
Let me call out that, in my reading, McRaney gets things a little backward in his final paragraph. Here are Kahan and team’s findings from the original article:
Subjects were more likely to correctly identify the result most supported by the data when doing so affirmed the position one would expect them to be politically predisposed to accept – that the ban decreased crime, in the case of more liberal subjects who identify with the Democratic Party; and that it increased crime, in the case of more conservative ones who identify with the Republican Party – than when the correct interpretation of the data threatened or disappointed their predispositions.
– Kahan et al.
That mistake aside, the really important point–especially for those well-read and educated among us–is that, counter to what we logical and rational thinkers would expect, increasing numeracy correlated with an increased tendency to get the interpretation wrong when it matched one’s priors:
[The] Science Comprehension Thesis, however, predicted that polarization among high-numeracy partisans would be lower than among low-numeracy ones in the gun ban conditions, consistent with the premise that political conflict over decision-relevant science is fed by defects in the capacity of ordinary members of the public to make sense of empirical evidence. The data did not support this prediction. On the contrary, numeracy magnified political polarization among high-numeracy partisans.
– Kahan et al.
[Image caption: This was too small to use as the lead image, but I knew I had to use it in the post somewhere.]
So what are those of us with big brains to do if we can’t necessarily trust them to (initially) correctly interpret information that involves our preexisting biases? McRaney provides, toward the beginning of his article, a really powerful thought exercise that can help (actually, he gets to it much faster than I did in this summary–sometimes it’s better to click the link and read the article before you read the commentary).
A few years ago the great science writer Will Storr shared with me a powerful thinking exercise, one I’ve been passing forward ever since. I’d like to share it with you now. It’s very simple, just two questions.
First, ask yourself: Do you think you are right about everything?
If your answer is “yes,” then perhaps you should consider a career in politics, but if your answer is “no,” now ask yourself this second, more crucial question: If you aren’t right about everything, then what, exactly, are you wrong about?
– McRaney
[Image caption: The image so nice, I had to use it twice.]
I’ll be honest: that second question stopped me in my tracks and made me put the article down for a bit. I have already jotted the question on a post-it and have it hanging off my work monitor (right next to the “Shame never fixes anything” one).
Crippling self-doubt is never a good thing. Yet some self-doubt, or, more accurately, acceptance that, no matter how hard we try to avoid them, we all have blind spots, can be a virtue. McRaney refers to this as intellectual humility. The article dives into some other techniques for cultivating it and is worth the read (if you want to get some time back you can jump down to the “Thinking about thinking about thinking” section, as I have now covered most of the previous material).
As for me, I’m off to ponder what, exactly, I am wrong about.
Or I will be doing that until the comments start coming in, as then I get another opportunity to prove how right I always am.