The Value Of Cultivating Some Self Doubt

Tips for how to check yourself before you wreck yourself.

[Image of a human brain]

While enjoying my morning coffee and googling to learn more about a random subject (Yuri Bezmenov, for those who might be interested), I came across an article that really resonated with me: "Change your mind with these gateway drugs to intellectual humility" by David McRaney at the site Big Think. Given its clickbaity title I wasn't expecting much, but I was pleasantly surprised by what I read. The tl;dr at the start of the article really nicely sums things up:

There are more than 180 cognitive biases that affect our judgments and beliefs, one of the most pernicious being the notion that we understand a topic better than we actually do. We cannot escape our cognitive biases through education. In fact, the more intelligent you are and the more education you gain, the better you become at rationalizing and justifying your existing beliefs — even if they’re wrong. If you want to change your mind about all the things you are wrong about, you must first consider what might be motivating you to remain blissfully unaware of your wrongness.

David McRaney

In the article McRaney, who recently wrote the book How Minds Change on this topic, sets up the stakes of why intellectual humility is so necessary, especially in this increasingly polarized media environment:

[C]ommon misconceptions and false beliefs… spread from conversation to conversation and survive from generation to generation to become anecdotal currency in our marketplace of ideas. They confirm our assumptions and validate our opinions and thus raise few skeptical alarms. They make sense, and they help us make sense of other things, and as Carl Jung once wrote, “The pendulum of the mind oscillates between sense and nonsense, not between right and wrong.”

David McRaney

I think we can all get on board with that idea. Heck, even the great psychologist Carl Jung agrees. Please remind me to save that quote for future us….

Well, I used to believe [Jung] once wrote that. I’ve certainly shared that bit of wisdom many times thinking he had. But, while writing this paragraph I discovered he, in fact, did not. It turns out I was wrong about that as well.

David McRaney

How easy it is to spread misinformation. One of the reasons for this, which McRaney details in the article, is the psychological phenomenon of motivated reasoning: that, in day-to-day operations, humans unconsciously interpret facts to support existing beliefs and minimize cognitive dissonance. Take this summary of one social psychology experiment from the article:

A great example of this comes from the work of psychologist Dan Kahan. He once brought together more than 1,000 subjects, asked them about their political dispositions, tested their math skills, and then presented them with a fake study into the effectiveness of a new rash-reducing skin cream. Their challenge? Determine if the cream worked after two weeks of use. 

Subjects looked at a table full of numbers showing who got better and who got worse. The top row showed patients who used the cream, the bottom those who didn’t. The catch was that more people had used the cream than did not, so the number of people who got better after two weeks was higher in that group simply because there were more of them to count. They had made it easy to get the wrong answer if you just looked at the numbers and made a snap judgment. But if you knew how to calculate percentages, and took a moment to work out the math, you’d find that 75% of the cream group got better while 84% of the no cream group did. So, not only did the cream not work, it may have made the rash worse.

Unsurprisingly, the better people were at math, regardless of their political dispositions, the more likely they would take the extra step to calculate the percentages instead of going with their guts. If they took that step, it was less likely they would walk away with an incorrect belief. Kahan’s team then repeated the study so that the numbers showed the cream worked, and once again, the better people were at math, the more likely they arrived at the correct answer. 

But here’s the twist. When researchers relabeled the exact same numbers as the results of a study into the effectiveness of gun control, the better some people were at math, the more likely they made mathematical mistakes. If the results showed gun control was effective, the more likely a conservative with good math skills would get the wrong answer; if the results showed gun control was ineffective, the more likely a liberal with good math skills would get the wrong answer.

McRaney
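The trap in that table is easy to sketch. Using the two-by-two cell counts commonly reported for Kahan's skin-cream task (treat the exact numbers as illustrative assumptions rather than gospel), the raw counts point one way while the rates point the other:

```python
def improvement_rate(got_better, got_worse):
    """Fraction of a group whose rash improved after two weeks."""
    return got_better / (got_better + got_worse)

# Cell counts commonly reported for Kahan's study (illustrative):
cream = {"got_better": 223, "got_worse": 75}      # used the cream
no_cream = {"got_better": 107, "got_worse": 21}   # did not use the cream

cream_rate = improvement_rate(**cream)        # ~0.75
no_cream_rate = improvement_rate(**no_cream)  # ~0.84

# The snap judgment compares raw counts (223 > 107) and concludes the
# cream works; the correct comparison is between the improvement rates.
print(f"Cream group improved:    {cream_rate:.0%}")
print(f"No-cream group improved: {no_cream_rate:.0%}")
```

Because the cream group was much larger, more of its members improved in absolute terms even though a smaller share of them did, which is exactly the snap judgment the study was designed to bait.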

Let me call out that, in my reading, McRaney gets things a little backward in his final paragraph. Here are Kahan and team’s findings from the original article:

Subjects were more likely to correctly identify the result most supported by the data when doing so affirmed the position one would expect them to be politically predisposed to accept – that the ban decreased crime, in the case of more liberal subjects who identify with the Democratic Party; and that it increased crime, in the case of more conservative ones who identify with the Republican Party – than when the correct interpretation of the data threatened or disappointed their predispositions.

Kahan et al.

That mistake aside, the really important point, especially for those well-read and educated among us, is that, counter to what we logical and rational thinkers would expect, greater numeracy correlated with an increased tendency to get the interpretation wrong whenever the correct answer cut against one's priors:

[The] Science Comprehension Thesis, however, predicted that polarization among high-numeracy partisans would be lower than among low-numeracy ones in the gun ban conditions, consistent with the premise that political conflict over decision-relevant science is fed by defects in the capacity of ordinary members of the public to make sense of empirical evidence. The data did not support this prediction. On the contrary, numeracy magnified political polarization among high-numeracy partisans.

Kahan et al.
[Spock's brain hurts]

This was too small to use as the lead image, but I knew I had to use it in the post somewhere

So what are those of us with big brains to do if we can't necessarily trust them to (initially) correctly interpret information that involves our preexisting biases? McRaney provides a really powerful thought exercise toward the beginning of his article (he gets to it much faster than I did in this summary; sometimes it's better to click the link and read the article before you read the commentary).

A few years ago the great science writer Will Storr shared with me a powerful thinking exercise, one I’ve been passing forward ever since. I’d like to share it with you now. It’s very simple, just two questions. 

First, ask yourself: Do you think you are right about everything? 

If your answer is “yes,” then perhaps you should consider a career in politics, but if your answer is “no,” now ask yourself this second, more crucial question: If you aren’t right about everything, then what, exactly, are you wrong about?

McRaney
[Spock's brain hurts]

The image so nice, I had to use it twice.

I’ll be honest: that second question stopped me in my tracks and made me put the article down for a bit. I have already jotted the question on a post-it and have it hanging off my work monitor (right next to the “Shame never fixes anything” one).

Crippling self-doubt is never a good thing. Yet some self-doubt, or, more accurately, an acceptance that no matter how hard we try to avoid them we all have blind spots, can be a virtue. McRaney refers to this as intellectual humility. The article dives into some other techniques for cultivating it and is worth the read (if you want to get some time back, you can jump down to the "Thinking about thinking about thinking" section, as I have now covered most of the previous material).

As for me, I'm off to ponder what, exactly, I am wrong about.

Or I will be doing that until the comments start coming in, as then I get another opportunity to prove how right I always am.

FILED UNDER: *FEATURED, Books, Education, Guns and Gun Control, Health, Science & Technology, US Politics
Matt Bernius
About Matt Bernius
Matt Bernius is a design researcher working to create more equitable government systems and experiences. He's currently a Principal User Researcher on Code for America's "GetCalFresh" program, helping people apply for SNAP food benefits in California. Prior to joining CfA, he worked at Measures for Justice and at Effective, a UX agency. Matt has an MA from the University of Chicago.

Comments

  1. Steven L. Taylor says:

    Good post!

    4
  2. Matt Bernius says:

    @Steven L. Taylor:

    Good post!

    Correct as always Steven!

    3
  3. Michael Reynolds says:

    As I lay on damp concrete beneath a freeway overpass, with no money, no job and two felony warrants after me, it did begin to occur to me that I might not be right about absolutely everything.

    Within weeks there followed a different correction. I had long dismissed the concept of love, arguing that self-sacrifice ran counter to evolution’s Priority #1: stay alive to reproduce. So, naturally I fell in love and immediately realized I understood nothing.

    I began to understand that intelligence (had that) is not the same thing as wisdom (nope). So, in answer to the question, ‘what are you wrong about?’ I could give you a nice long list of things I’ve been wrong about. What am I wrong about now? Well, I don’t know, obviously, but finding out is a major motivation for me reading and watching and coming here to have smart people tell me I’m full of shit.

    Just before I blew up my life I'd begun skimming my way through philosophy and was taken by the notion of presuppositionlessness, the process of examining my own beliefs and, insofar as possible, discarding the b.s. It's a lifelong thing – you can get more and more right and yet never be entirely right. I realize that here at OTB my perceived arrogance is second only to @Lounsbury, but that's a misreading of my intentions. If I state a belief, it's as a test: prove me wrong. Prove me wrong or give me a new slant and I am genuinely grateful, because the goal is not to be seen as right, but to be right.

    Everyone picks their own prime virtue, I pick Truth – ironic given that I have been a fantastic liar in my past. Since 2001, the year I bought my way out of trouble, I’ve had an almost physical reaction to lying – sort of like when you get a bad oyster and you get very leery about shellfish. I don’t lie anymore outside of the demands of politesse. What am I wrong about? Possibly everything.

    7
  4. Kathy says:

    I rely on two principles a great deal:

    1) The Socrates principle: a wise person is aware of their own ignorance and fallibility.

    2) The Gil Grissom principle: the evidence doesn’t lie.

    6
  5. Just nutha ignint cracker says:

    @Michael Reynolds:

    If I state a belief, it's as a test: prove me wrong.

    Here’s the rub: how many of us care enough about your growth as a human to dispute with an admitted arrogant [expletive deleted]head? The job of testing faith always lies with the faithful. Examine yourself. We’re not doing the work for you.

    5
  6. Mimai says:

    If you aren’t right about everything, then what, exactly, are you wrong about?

    I appreciate this question as a starting point. The problem with it as a sole prompt is that our reflections are likely to be just as susceptible to the cognitive biases that led us to be wrong in the first place. Perhaps even more susceptible.

    @Michael Reynolds: is on to something when he writes:

    I could give you a nice long list of things I’ve been wrong about.

    From here, it can be useful to identify themes (Matt Bernius, this should warm your qualitative researcher heart). What kinds of topics was I wrong about? What information sources did I privilege? What did it do for me to be wrong in this way? Who else was wrong in this way? etc…

    From here, one is better able to consider the types of things they might be wrong about right now. Still not foolproof, but it does erect some guardrails.

    At the end of the day, affiliation needs trump accuracy needs. And this isn’t necessarily a bad thing.

    4
  7. Michael Reynolds says:

    @Just nutha ignint cracker:

    We’re not doing the work for you.

    I’m not holding a gun to anyone’s head. Engage, don’t engage, it’s a free country. Sort of. In the end, I get what I need.

    3
  8. CSK says:

    The older I get, the more I realize I don’t know a goddamned thing.

    5
  9. Kurtz says:

    @Michael Reynolds:

    In the end, I get what I need.

    Eh, I dunno about that. 😉

    @Just nutha ignint cracker:

    Here’s the rub: how many of us care enough about your growth as a human to dispute with an admitted arrogant [expletive deleted]head?

    Judging by the number of thumbs and replies MR’s posts generate, as well as direct references made in Michael-less (free?) threads plus subtweets*, quite a few of us have some stake in his growth as a human.

    Also, with due respect to the odorous, odious shittiness of most of the characters in Glass Onion, who all did it, I would much rather be described as a shithead or fuckhead than a shit or a fuck.

    *forum-equivalent thereof°.
    °I really want a comma before “thereof” because it seems stylish. But I am practicing self control. (And fighting overhyphenization by eliminating them in almost all circumstances.)

  10. Mister Bluster says:

    …as Carl Jung once wrote, “The pendulum of the mind oscillates between sense and nonsense, not between right and wrong.”

    A comment I posted on a Forum Thread:

    Mister Bluster says:
    Sunday, 29 May 2022 at 11:22
    “The truth is out there”
    The X-Files aka Science Fiction.

    To which I reply:
    “Side by side with the nonsense.”

    2
  11. Tony W says:

    @Michael Reynolds: The trick is to not require somebody to directly confront you, but rather to be able to look at a body of evidence and compare its findings to your own ideas and closely held beliefs.

    Sometimes our knowledge does not stand up to scrutiny, but it is incumbent upon us to figure out what we’re missing. But that happens only if we value the truth and the personal growth that comes from it.

    Some people have found it is easier to live in their bubble of affirmation.

    That will never be easier for me, because I’ll tear my own hair out.

    But I envy those folks.

    5
  12. Mikey says:

    I heard David McRaney on a podcast and was so interested I bought “How Minds Change.” I’m about halfway through it, it’s pretty good so far. Apparently the payoff is learning a conversational style that can actually change minds. I wonder if applying it in my interactions with Trumpies will actually accomplish that.

    As far as cultivating some self-doubt, my raging impostor syndrome does that just fine for me, thank you very much…

    2
  13. grumpy realist says:

    It’s also possible that the more you stick to humility in one area, the more arrogant and certain you will be about your beliefs in other areas, mainly because it’s tiring to constantly double-check yourself on everything. There’s gotta be SOMETHING that I am certain about, so I can take it for granted.

    Also, if you’re trying to figure out “the truth” about something, there’s always the question: what of the surrounding knowledge is relevant? What, even though it may be relevant, can I ignore because the contribution is so small?

    2
  14. Michael Reynolds says:

    @Tony W:

    The trick is to not require somebody to directly confront you, but rather to be able to look at a body of evidence and compare its findings to your own ideas and closely held beliefs.

    It’s not either/or. I can read and analyze (as I do) but that’s not the stress test. Expert says X+Y=Z, so I find a quick refutation. Then what? I can almost always come up with a first round refutation, but I know better than to trust that. My ongoing cult vs. partisan sorting thing with @Steven? I’m not convinced he’s right, but I’m also not convinced that I’m right.

    Look, I have a different life than most people. I can go on Amazon, Goodreads, Reddit, Twitter or YouTube 24/7/365 and hear that I’m a genius who changed lives, even saved lives. That’s not a brag, that’s my work environment. Unlike probably everyone here, I have an endless supply of emotional fentanyl available, hooked right into a vein. That’s not good, that easy ego-gratification is a trap. So, I can go and be stroked, or I can come here and have my ass kicked.

    I’m a clever debater. I remember being maybe 14 and debating my friend Kenneth Kisser on the existence of God. He was an atheist. I won the debate, but I only won because of skill, not because I was right. He was right. I’ve never forgotten that. Clever is not the same as right.

    5
  15. gVOR08 says:

    As part of my effort to expose myself to other points of view I’ve been reading, among other things, Marginal Revolution. Tyler Cowen likes to cite academic papers supporting conservative points. His efforts, along with many others, have succeeded in convincing me. Convincing me that publish or perish and funders motivate a lot of crap. Shaky methodology applied to silly questions.

    I confess I haven’t read the link, but I regard it as obvious that we should hold our beliefs tentatively so I’m not terribly interested.

    People judge truth or falsity by how well it meshes with everything else you know about the world. If you have a well informed world view this works fairly well, and if not, not. No one has any priors about the effectiveness of a new, real, skin cream, much less a fictional one. So the more numerate are more likely to correctly analyze (fictitious) data. Great. Now, in an artificial situation, you’re presented with gun data that contradicts everything you know from elsewhere. Do you play the game within the artificial rules created by the experimenter and ignore everything else you know? Or do you react as though you’re being presented with what you know is likely made up bullshit data? Which you are. What percentage of educated people have been subjects of a psych experiment? For me it was required for Psych 101 and 102. Think they might recognize the situation for what it is?

  16. steve says:

    I thought that is why we get married. To always have someone around to remind us that we are wrong.

    Steve

    4
  17. Ken_L says:

    If you aren’t right about everything, then what, exactly, are you wrong about?

    Talk about a false dichotomy. Wise (a neglected descriptor) people hold values and beliefs with varying degrees of confidence, according to the persuasiveness of the evidence that supports them. Even those they believe are ‘right’ are held contingently, open to amendment in the light of better and further particulars. I could write a dissertation about the values and beliefs I once held, which I have subsequently altered over the years.

    1
  18. Matt Bernius says:

    @Mimai:

    I appreciate this question as a starting point. …

    From here, it can be useful to identify themes (Matt Bernius, this should warm your qualitative researcher heart). What kinds of topics was I wrong about? What information sources did I privilege? What did it do for me to be wrong in this way? Who else was wrong in this way?

    The question definitely is a starting point, not an ending one. And, as you suspected, I love this framework for identifying past patterns of wrongness to understand historic blind spots. Bravo.

    @Mikey:

    I heard David McRaney on a podcast and was so interested I bought “How Minds Change.” I’m about halfway through it, it’s pretty good so far.

    Cool. When you finish it, let us know how it was (preferably in one of my post threads, as I don’t usually check in on the dailies); I’d love to hear your opinion on it.

    @Ken_L:

    Talk about a false dichotomy.

    I guess I see it a little differently. The more I think about that question, the more I see it as something like what I understand a Zen koan to be: an idea that sparks a degree of cognitive dissonance in order to break oneself out of a pattern of thinking (apologies if I have butchered what a koan is; my experience with Zen is minimal). I also think that Mimai’s point above is really important: for many of us (I know for myself at least) there are definite patterns to where I have tended to unquestioningly accept and spread misinformation. And partisanship and ideology definitely play a role in that. Knowing that has helped me avoid doing that in the present. Equally important is also admitting and correcting the mistake as soon as I realize it.

    @steve:

    I thought that is why we get married. To always have someone around to remind us that we are wrong.

    That’s definitely been my experience at least.

    @Michael Reynolds:

    I realize that here at OTB my perceived arrogance is second only to @Lounsbury, but that’s a misreading of my intentions.

    I detect no lies here. Just kidding, Michael… first, no one’s arrogance can rival @lounsbury’s; also, while you can be a PiTA at times, I find you to always be a more productive PiTA. I don’t think I have ever successfully gotten you to significantly change an opinion, but debating you definitely has helped me sharpen my positions over the years. And that’s a real gift.

    1
  19. Tony W says:

    @Michael Reynolds: Emotional fentanyl is available to us all, perhaps not at your scale, and not in the same form, but that little rush of dopamine that comes from having tons of Reddit karma, or the neighbor admiring your lawn mowing skills, or just killing it in the line at Starbucks while the crowd waits for their drinks – all of that challenges our self-awareness and tries to make us lazy and complacent.

    The paradox of self-awareness has strong ties to the Dunning-Kruger effect, and those of us who wish to see our blind spots are always troubled by what we are missing. My primary reason for wasting time at the keyboard, poking and arguing with people on the internet, is looking for those blind spots that I know are there but can’t locate on my own.

    I think this is a byproduct of intelligence and curiosity. It has always been more difficult to be smart, and interested in being smarter, than it is to be content and dim.

    How much easier would life be if we could be content sitting on the porch sipping tea and watching the birds? And yet I seek that very contentment, that very situation, that level of peace and freedom.

    It’s a fool’s game, but I don’t know how to stop playing.

    2