Low-Information Individuals, Motivated Reasoning, and Epistemic Overconfidence
Why rational discourse seldom changes the minds of some people.
ScienceAlert (“Didn’t Read The Article Before Commenting? Science Says It Really Shows”) summarizes a recent scholarly article:
A little bit of knowledge can go straight to your head, and not in a good way. New research has found that those who only read snippets of their Facebook newsfeed often think they know more than they actually do.
By glancing through article previews, instead of reading the full piece, many users overestimate their understanding of an issue, and this is especially true for those whose knowledge is guided by strong emotions – and, therefore, strong opinions.
“Because most social media users only have a passing engagement with posted news, exposure to political information on social media may simply create the illusion of political learning,” write the researchers at the York College of Pennsylvania.
Nothing surprising there. The research design seems reasonable enough:
The first group (320 participants) was asked to read through a full article from The Washington Post about genetically modified (GM) foods. The second group (319 participants) was given a Facebook newsfeed with four article previews, one of which featured the same GM article.
The third and last group (351 participants) was given no information at all.
To assess their level of knowledge about GM foods, participants were given six factual questions, with five answers that could be found embedded in the article, and three answers that could be found in the Facebook preview.
To measure their level of confidence, the participants were also asked to estimate how many questions they got right.
Unsurprisingly, those who read the full article answered the most questions correctly, while those who read the preview scored only one more correct answer than those who were given no information at all.
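The study's key construct, epistemic overconfidence, is just the gap between how many answers a participant thinks they got right and how many they actually got right. A minimal sketch of that scoring, with invented numbers (the paper reports its own data), might look like:

```python
# Hedged sketch: scoring "overconfidence" as estimated minus actual
# correct answers, out of six factual questions.
# All participant data below are invented for illustration only.

def overconfidence(estimated_correct: int, actual_correct: int) -> int:
    """Positive = overconfident, zero = calibrated, negative = underconfident."""
    return estimated_correct - actual_correct

# Hypothetical participants: (condition, estimated, actual)
participants = [
    ("full article", 4, 4),   # calibrated
    ("preview only", 4, 1),   # overconfident
    ("no info",      1, 1),   # calibrated
]

for condition, est, act in participants:
    print(f"{condition}: {overconfidence(est, act):+d}")
```

The point of the measure is that accuracy alone cannot distinguish the preview group (who knew little but thought otherwise) from the no-information group (who knew little and knew it).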
Again, no surprises. But here’s where it gets interesting:
Additionally, the findings suggest that people who read only the previews were far too confident in their knowledge. What’s more, those participants whose cognitive style is more guided by emotion tend to be more certain of their rightness.
This “need for affect” essentially means that participants have strong gut reactions or intuitions that they don’t question. In other words, the feeling of being accurate is more satisfying for many people than actually being accurate.
“Those who are more driven by emotion allow the positive feelings associated with being right to override the need for actual accuracy,” the authors write, “thus coming away from limited exposure to information falsely overconfident in their knowledge of the subject matter.”
Unfortunately, this false confidence may have serious repercussions. Not only does it make users more susceptible to fake news and misinformation – a burgeoning issue in the modern day and age – it could also make them more polarised and politically uninformed.
This gets at recent discussions we’ve had here in the comment sections: many people are simply immune to having their minds changed upon presentation of new information or argument.
The “need for affect” aspect of the research strikes me as more interesting than the social media frame. Looking at the original article (Nicolas M. Anspach, Jay T. Jennings, and Kevin Arceneaux, “A little bit of knowledge: Facebook’s News Feed and self-perceptions of knowledge”), we see that,
Overconfidence in knowledge is often situated in the context of the classic motivated reasoning framework. Here, scholars draw an important distinction between being uninformed (recognizing one’s own ignorance) and being misinformed (confidently holding inaccurate beliefs), with overconfidence research focusing on the latter (Kuklinski et al., 2000). We argue that overconfidence stems from people’s directional goals, or their desire to reach a particular conclusion (see Kunda, 1990). Directional goals manifest in two ways. The first is the desire to hold a certain attitude or belief (Abelson, 1986). This type of directional goal is the basis of most models of motivated reasoning in politics, since people possess ideological and partisan preferences that make some beliefs preferable to others. In doing so, people place more weight on information that agrees with their views and reject information with which they disagree (Edwards and Smith, 1996; Lord, Ross and Lepper, 1979; Taber and Lodge, 2006). The second directional goal, accuracy, motivates individuals to consider the quality and quantity of the evidence before them (Petty and Cacioppo, 1986) and display less false confidence in their knowledge, either by holding confident, correct beliefs or being less confident when the correctness of their beliefs is in doubt.
Given these different motivations, we do not expect everyone to be equally overconfident in their knowledge. Though we anticipate that individuals vary in the degree to which they are driven by accuracy goals, we want to stress that for some, the feeling of being accurate satisfies directional goals more easily than actually being accurate. We argue that these individuals are driven, in part, by a high need for affect. People vary in their need for affect, with some individuals seeking out strong emotions (regardless of whether they are positive or negative), some people trying to avoid feeling strong emotions, and everyone in between (Maio and Esses, 2001). Individuals who possess a high need for affect tend to be the most certain in the rightness of their attitudes and are more likely to form firm and strong opinions (Britt et al., 2009). Those scoring high in a need for affect also tend to take the party line on policies, even when it conflicts with their values, as well as apply double standards when evaluating politicians of the opposing party (Arceneaux and Vander Wielen, 2013, 2017). We extend this logic to epistemic overconfidence, as evidence indicates that a high need for affect moderates how individuals process political news (Ryan, Wells and Acree, 2016). Specifically, we expect those who are high in need for affect to form relatively strong opinions based on the limited information gleaned from the Facebook News Feed and, therefore, be more likely to come away with an illusion of confidence in their knowledge relative to those who score lower on need for affect.
If article previews in the Facebook News Feed can inform users, it is important to understand the effects of these small increases in knowledge. Because individuals with a high need for affect (henceforth, NFA) are more likely to seek out strong emotions (Maio and Esses, 2001), they tend to form strong attitudes (Arceneaux and Vander Wielen, 2013, 2017) and exhibit certainty in the rightness of those attitudes (Britt et al., 2009). Furthermore, we have a similar prediction when we consider epistemic directional goals. We expect that individuals scoring high in NFA will be more overconfident in their knowledge, particularly when only given limited information, as is often the case in Facebook’s News Feed. When asked to consider gains in knowledge, we expect high-NFA individuals to go with their gut reaction, and to be likely to see themselves as knowledgeable even when they are not. Low-NFA individuals, on the other hand, should appreciate that they have only encountered a brief headline and, thus, should be less likely to form overconfident beliefs.
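The moderation prediction above amounts to a simple comparison: among preview-only readers, the estimated-minus-actual gap should be larger for high-NFA participants than for low-NFA participants. A sketch of that comparison, with data invented purely to illustrate the predicted pattern:

```python
# Hedged sketch of the predicted NFA moderation among preview-only readers:
# high need-for-affect (NFA) participants should show a larger gap between
# estimated and actual correct answers. All values are hypothetical.
from statistics import mean

# (nfa_group, estimated_correct, actual_correct) -- invented for illustration
preview_readers = [
    ("high", 5, 1), ("high", 4, 2), ("high", 5, 2),
    ("low",  2, 1), ("low",  3, 2), ("low",  2, 2),
]

def mean_overconfidence(group: str) -> float:
    """Average (estimated - actual) for one NFA group."""
    return mean(est - act for g, est, act in preview_readers if g == group)

print("high NFA:", mean_overconfidence("high"))
print("low NFA: ", mean_overconfidence("low"))
```

In the paper's framing, this group difference is the "illusion of knowledge": the preview raises confidence far more than it raises accuracy, and mostly for those driven by directional rather than accuracy goals.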
It would seem that using rational discourse with high-NFA, low-information individuals is all but useless. They’re seeking validation for an emotional belief, not understanding. Presumably, the way to change their minds is through emotional means. (Which is why personal experience is so effective. Having a transgender child, for example, is much more likely to change one’s perspective on the issue than a dozen scientific articles.)