Low-Information Individuals, Motivated Reasoning, and Epistemic Overconfidence
Why rational discourse seldom changes the minds of some people.
ScienceAlert (“Didn’t Read The Article Before Commenting? Science Says It Really Shows”) summarizes a recent scholarly article:
A little bit of knowledge can go straight to your head, and not in a good way. New research has found that those who only read snippets of their Facebook newsfeed often think they know more than they actually do.
By glancing through article previews, instead of reading the full piece, many users overestimate their understanding of an issue, and this is especially true for those whose knowledge is guided by strong emotions – and, therefore, strong opinions.
“Because most social media users only have a passing engagement with posted news, exposure to political information on social media may simply create the illusion of political learning,” write the researchers at the York College of Pennsylvania.
Nothing surprising there. The research design seems reasonable enough:
The first group (320 participants) was asked to read through a full article from The Washington Post about genetically modified (GM) foods. The second group (319 participants) was given a Facebook newsfeed with four article previews, one of which featured the same GM article.
The third and last group (351 participants) was given no information at all.
To assess their level of knowledge about GM foods, participants were given six factual questions, with five answers that could be found embedded in the article, and three answers that could be found in the Facebook preview.
To measure their level of confidence, the participants were also asked to estimate how many questions they got right.
Unsurprisingly, those who read the full article answered the most questions correctly, while those who read the preview scored only one more correct answer than those who were given no information at all.
Again, no surprises. But here’s where it gets interesting:
Additionally, the findings suggest that people who read only the previews were far too confident in their knowledge. What’s more, those participants whose cognitive style is more guided by emotion tend to be more certain of their rightness.
This “need for affect” essentially means that participants have strong gut reactions or intuitions that they don’t question. In other words, the feeling of being accurate is more satisfying for many people than actually being accurate.
“Those who are more driven by emotion allow the positive feelings associated with being right to override the need for actual accuracy,” the authors write, “thus coming away from limited exposure to information falsely overconfident in their knowledge of the subject matter.”
Unfortunately, this false confidence may have serious repercussions. Not only does it make users more susceptible to fake news and misinformation – a burgeoning issue in the modern day and age – it could also make them more polarised and politically uninformed.
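The study’s overconfidence measure reduces to simple arithmetic: the gap between how many questions a participant estimates they answered correctly and how many they actually did. A minimal sketch in Python; the function name and all participant numbers here are invented for illustration and are not data from the study:

```python
# Hypothetical sketch of the overconfidence measure described above.
# Each participant answers factual questions, then estimates how many
# they got right; overconfidence is the estimate minus the actual score.

def overconfidence(actual_correct: int, estimated_correct: int) -> int:
    """Positive values mean the participant thinks they know more than
    they do; negative values indicate underconfidence."""
    return estimated_correct - actual_correct

# Invented participants, one per experimental condition (numbers are
# illustrative only, not the study's results):
participants = [
    {"condition": "full article", "actual": 5, "estimated": 5},
    {"condition": "preview only", "actual": 2, "estimated": 5},
    {"condition": "no information", "actual": 1, "estimated": 2},
]

for p in participants:
    p["overconfidence"] = overconfidence(p["actual"], p["estimated"])
```

On these invented numbers, the preview-only reader shows the largest gap, which is the pattern the paper reports: modest knowledge gains paired with outsized confidence.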
This gets at recent discussions we’ve had here in the comment sections: many people are simply immune to having their minds changed upon presentation of new information or argument.
The “need for affect” aspect of the research strikes me as more interesting than the social media framing. Looking at the original article, Nicolas M. Anspach, Jay T. Jennings, and Kevin Arceneaux, “A little bit of knowledge: Facebook’s News Feed and self-perceptions of knowledge,” we see that,
Overconfidence in knowledge is often situated in the context of the classic motivated reasoning framework. Here, scholars draw an important distinction between being uninformed (recognizing one’s own ignorance) and being misinformed (confidently holding inaccurate beliefs), with overconfidence research focusing on the latter (Kuklinski et al., 2000). We argue that overconfidence stems from people’s directional goals, or their desire to reach a particular conclusion (see Kunda, 1990). Directional goals manifest in two ways. The first is the desire to hold a certain attitude or belief (Abelson, 1986). This type of directional goal is the basis of most models of motivated reasoning in politics, since people possess ideological and partisan preferences that make some beliefs preferable to others. In doing so, people place more weight on information that agrees with their views and reject information with which they disagree (Edwards and Smith, 1996; Lord, Ross and Lepper, 1979; Taber and Lodge, 2006). The second directional goal, accuracy, motivates individuals to consider the quality and quantity of the evidence before them (Petty and Cacioppo, 1986) and display less false confidence in their knowledge, either by holding confident, correct beliefs or being less confident when the correctness of their beliefs is in doubt.
Given these different motivations, we do not expect everyone to be equally overconfident in their knowledge. Though we anticipate that individuals vary in the degree to which they are driven by accuracy goals, we want to stress that for some, the feeling of being accurate satisfies directional goals more easily than actually being accurate. We argue that these individuals are driven, in part, by a high need for affect. People vary in their need for affect, with some individuals seeking out strong emotions (regardless of whether they are positive or negative), some people trying to avoid feeling strong emotions, and everyone in between (Maio and Esses, 2001). Individuals who possess a high need for affect tend to be the most certain in the rightness of their attitudes and are more likely to form firm and strong opinions (Britt et al., 2009). Those scoring high in a need for affect also tend to take the party line on policies, even when it conflicts with their values, as well as apply double standards when evaluating politicians of the opposing party (Arceneaux and Vander Wielen, 2013, 2017). We extend this logic to epistemic overconfidence, as evidence indicates that a high need for affect moderates how individuals process political news (Ryan, Wells and Acree, 2016). Specifically, we expect those who are high in need for affect to form relatively strong opinions based on the limited information gleaned from the Facebook News Feed and, therefore, be more likely to come away with an illusion of confidence in their knowledge relative to those who score lower on need for affect.
[…]
If article previews in the Facebook News Feed can inform users, it is important to understand the effects of these small increases in knowledge. Because individuals with a high need for affect (henceforth, NFA) are more likely to seek out strong emotions (Maio and Esses, 2001), they tend to form strong attitudes (Arceneaux and Vander Wielen, 2013, 2017) and exhibit certainty in the rightness of those attitudes (Britt et al., 2009). Furthermore, we have a similar prediction when we consider epistemic directional goals. We expect that individuals scoring high in NFA will be more overconfident in their knowledge, particularly when only given limited information, as is often the case in Facebook’s News Feed. When asked to consider gains in knowledge, we expect high-NFA individuals to go with their gut reaction, and to be likely to see themselves as knowledgeable even when they are not. Low-NFA individuals, on the other hand, should appreciate that they have only encountered a brief headline and, thus, should be less likely to form overconfident beliefs.
It would seem that using rational discourse with high-NFA, low-information individuals is all but useless. They’re seeking validation for an emotional belief, not understanding. Presumably, the way to change their minds is through emotional means. (Which is why personal experience is so effective. Having a transgender child, for example, is much more likely to change one’s perspective on the issue than a dozen scientific articles.)
You do realize of course that you just explained the last 10 years of the Democrat Party?
What’s striking about this is the role of affect in this formula.
As a follow-up, it would be interesting to look at the degree to which these individuals are attracted to affect-juicing modes of discourse, like talk radio, that are often designed to provide emotionally laden reinforcement of gut beliefs with little regard for actual facts.
@Eric Florack:
The poster child of epistemic closure, and the Dunning-Kruger effect, has spoken.
@Daryl and his brother Darryl: The irony is beyond strong.
(With a huge bonus for “Democrat Party,” which connects to @mattbernius‘s reference to persons who are attracted to “affect-juicing modes of discourse, like talk radio.”)
@Steven L. Taylor:
Frankly, it’s full-on broken through to pitiful/pitiable territory — both for him and especially for his immediate family as I imagine he’s just a gem to have at Thanksgiving and other gatherings.
Some people read only the article titles. Since titles are often composed not by the author of a piece, but by an editor, they can be misleading.
@CSK:
Yup. Editors are trying to catch attention and are constrained by space.
@CSK: It gets even worse with articles about research papers. Very rarely do newspaper writers actually read the peer-reviewed article; they’re often using a press release, put out by the college’s or university’s press relations office, to write the story.
So, by the time we get to the newspaper article itself, we’re two levels removed from anyone with a science background, and then three levels removed when it comes to the headline.
Many years ago I was involved in an argument about personal matters and became angry, which led to an almost out-of-body observation: I said something that I felt (self-righteously) proved my point, but even as I was saying it I realized it was wrong. Then, almost as if I were observing myself from the outside, I realized that this had absolutely no effect on how “right” it felt in the moment. Righteous anger, at least for me, was real. Later, when I calmed down, I realized how scary that is, especially since I’m sure it doesn’t only apply to me.
I try (and all too often fail) not to argue or debate when I’m angry. And I try not to engage the perpetually angry, on or off line. One on one, in a calm moment, you might be able to have a real discussion, but when someone is feeling righteous the mechanisms that allow reasoned debate have been decoupled from the engine.
@Eric Florack:
That was funny. The piece is essentially about you, and up you pop, first thing. Well-done. Thanks. Nice little morning chuckle.
@CSK:
What’s worse is that often the summary on Facebook is blatant clickbait, often with a misleading or outright unrelated photo.
On principle I don’t follow clickbait, as the result tends to be disappointment and wasted time.
And this, I must say, includes many links like “Shocking revelation about Trump’s actions!” 90% of the time it’s relatively minor, and usually something I’ve already read about from real news sources.
@Timothy Watson:
You are not kidding. A very intelligent, highly skeptical friend went through a painful medical procedure that his dermatologist told him would proactively remove skin cells that might become cancerous. I was surprised I hadn’t heard about it. He sent over an article that was positive and referenced some journal papers. When I looked into it, the papers didn’t really say what the article implied, and, more importantly, weren’t the best papers written by any stretch. A couple of hours searching around PubMed left me convinced that while this treatment may have had legitimate therapeutic benefits for a highly specific disease state, it was being used way off label based on someone’s pet theory, and the clinical studies showed little initial effect, and no effect at all after only a couple of years. Since the whole pitch is that a painful treatment today will set you up for the next 10-15 years, it was worse than useless.
This story is more of the norm than an exception.
One more note on the affect angle: there’s a lot of evidence in the field of cognitive science that our most deep-seated beliefs and learning are formed in moments of high affect (either positive or negative): https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5573739/
Likewise, high affect states can also block certain forms of learning.
So the mindset/emotional state you are in when you encounter information can have a HUGE influence on your interaction with it (far deeper than most of us realize).
@Steven L. Taylor:
Oh, Professor, we’ve moved WAAAYYY beyond irony with Florack. Honestly, though, I think he’s doing it just for the attention, anymore.
@dennis: I have been known to engage in more than a bit of understatement from time to time 😉
@Eric Florack: Do you feel you’ve added to the discussion in a useful way?
In the field of science communication, this is known as “facts don’t change minds.”
@Franklin:
This is an angle that was missing from the previous commenter discussions. It’s worth noting that some people are fundamentally NOT interested in advancing or participating in a discussion.
Their goal is to express an opinion.
Nothing more. Nothing less.
It’s helpful to try to identify opinion-sharing versus discussion posts and react accordingly.
There is plenty of evidence to suggest that the ideas about reasoning employed here are anachronistic. All people judge the salience of information using emotion. This is clear from neuroscience work by Antonio Damasio (amongst others) on the effects of damage to the ventromedial prefrontal cortex, and from the excellent recent book on reasoning by Dan Sperber and Hugo Mercier, The Enigma of Reason (building on their earlier collaborations and citing research on reasoning and decision making that goes back to the mid 1960s). I’m quite familiar with the literature on this, having just written a book on religious belief, but none of the names cited above ring any bells.
The better theories don’t invalidate this analysis; they just generalise it. Everyone makes decisions based on how they feel about the information available to them. And we do it unconsciously. Reasons for actions or decisions are only produced post hoc, and only if required. Some people come up with more rational-sounding reasons and some with more emotional-sounding reasons, but we all make our decisions the same way: unconsciously and emotionally.
In fact, Mercier and Sperber argue that we don’t use reasoning to solve problems at all until we are in social settings and required to evaluate someone else’s preferred option. That is, we don’t use reasoning to make personal decisions at all, we use it to evaluate possibilities in a social setting.
In other words, how people say they came to a conclusion is not related to how they actually came to the decision. And that might have more to do with why reasoning with people doesn’t work, especially if you annoy them right at the start.
The sweet irony of people like @Florack is that they all think they’re the shill in this game of 3-card monte. The shill is the guy the grifter allows to win so as to encourage the marks. Trumpies all think we’re the ones being played by the grifter. One problem with that: we never believe anything the grifter says. Who does? All those people like Florack, which is what makes them such perfect marks.
There is no better mark than the dummy who thinks he’s smart.
Meanwhile, money continues to fly right by people like Florack and into the pockets of the rich. The trade deficit they’re so worried about is getting worse. Farmers are seeing their carefully-nurtured markets destroyed. The debt is skyrocketing. Ain’t gonna be no wall. And the whole world is laughing at us and rightly so.
@Eric Florack: or the past 60-some years of your own life?
@Jayarava: While poking around on the BBC News website to find more information about May’s purported trip to Strasbourg, I ran across something that the BBC is doing to apparently help people learn how to have productive conversations (Crossing Divides). They have one for Brexit, and they have several others as well. I did the Brexit one, taking the role of the Estonian nurse because why not. But the best part of this is that it gives you feedback at the end about your responses and what works and doesn’t work in these types of fraught conversations. I wonder if this helps people learn how to have conversations and if anyone is using any data from this in studies.
http://crossingdivides.bbcnewslabs.co.uk/story/start/3/en
@Jayarava:
In other words how people say they came to a conclusion is not related to how they actually came to the decision.
One of the few useful things I took from my brief stay at SJ State was the idea of de-cathexis, setting aside your existing beliefs to achieve a state of pre-suppositionlessness. From that pre-suppositionless position you then carefully reassemble your belief system using only those elements that stand up to rigorous examination.
By definition it is not possible to avoid subjectivity; we are subjective beings who unavoidably filter data according to various criteria. But one thing I’ve found useful is argument, because in arguing with an intelligent counterpart, one can see the holes in one’s belief system. So before I tentatively load up another ‘belief’ I always want to subject it to crossfire. That’s one of the reasons I come here.
This leads to the question: what is the right way to change their minds?
When you have the Republican Party going all in on affect, decrying everything from Medicare for all to having to ask for a straw if they want one as “SOCIALISM!”, how do we counter it? It’s been very effective.
Do we simply point out that AOC is way cooler than Mitch McConnell, Tucker Carlson, Jared Kushner and Donald Trump (either individually or taken together)? Even Bernie is cooler than any of those.
(My phone tried to autocorrect Bernie to BernieBros. Ha ha ha)
@MarkedMan:
Hoo boy, this is a big deal. And, frankly, the publish-or-perish culture of Western science has screwed the pooch (and the rest of society) badly, by relaxing publication standards well below the danger level.
Only a tiny fraction of people are qualified to read published technical articles critically, to distinguish the wheat from the chaff from the ergot. Even then, they probably can only do so for a small subset of fields. I can vet most papers that rely on statistical inference of some sort, but don’t ask me to review a non-empirical physics or chemistry or materials science paper. (Or historical linguistics, or neurobiology, or forensic anthropology, or structural macroeconomics, or a thousand other fields.)
Everyone else has to rely on trusted experts. And when the trusted experts disagree, or are undermined by a deliberate disinformation campaign, everyone reverts to emotion to decide what is true. Worse yet, they abandon the idea that truth is accessible by objective means — next time, they don’t wait for the experts to disagree…
I wish I could remember enough details about this ad to look it up, but I can’t. Anyway a few years ago a conservative group filmed some ads with this “hipster” in a car just giving the 4-1-1 to the kids about how drilling and fracking and such were totally awesome solutions for the future.
God they were embarrassing. He didn’t call fracking radical to the max, but he may as well have.
Years ago I attended a class called “Great Decisions”. We would study issues, mainly in foreign policy. We would discuss these issues. People would give their opinions, ideas, and questions. That is what our elected leaders need to see.
@DrDaveT: “to distinguish the wheat from the chaff from the ergot.”
Nicely turned phrase. Kudos!
@DrDaveT: I remember working with a fellow grad student at one point going through the seminal papers on the Quantum Hall Effect. And then being taken aback by the multiple fractional charges that showed up in the math–because all the stuff written afterwards just gloms onto the 1/3 charge case and never mentions the possibility of 1/5, 1/7, or smaller fractions of a charge cases.
Brought home to me again that yeah, you have to go through the actual calculations and not rely on other people’s reports.
P.S. Today’s one of the make-or-break days for Brexit, for those of you following that increasingly chaotic soap opera. There’s a vote tonight on the Withdrawal Agreement. Theresa May has just come back from Brussels with what could be considered a fig leaf of a solution to the Irish “backstop” problem; the question is whether it is enough to convince MPs in the House of Commons to help the WA over the line. So far the Euro-skeptics aren’t budging.
Chances are high that the WA will be turned down once again, in which case we’re probably looking at an accidental no-deal Brexit. (The House of Commons won’t want to crash out and may in fact vote against No Deal, but since No Deal remains the default, the lack of any other alternative will mean sliding off the cliff no matter what.) The EU hasn’t expressed any interest in allowing an extension to Article 50, especially not if the extension is just going to be used by the Brits to continue their internal political squabbling and not solve anything. I think the EU is basically fed up with dealing with Brexit and would rather have the U.K. crash out and then pick over the pieces.
The pound has already taken a small dive based on Cox’s legal opinion on the updated WA; I suspect we’ll see continued drops as the whole system moves step-by-step towards an accidental crash-out.
@DrDaveT:
Actually, in my field (Medical Devices and Procedures), it’s easier than it sounds. An awful lot of papers are published with n=7 (i.e. 7 patients in the study) and are more anecdotal than based on the scientific method. These aren’t bad in and of themselves, as they often are the first step in finding new treatments, but clinicians tend to read them and build whole revenue streams around them. The number one lesson I’ve learned in medicine is that most things don’t work, and most of the ones that do have only moderate effects. The things that truly make a huge difference are few and far between and take years to suss out. It is a myth that there are hundreds of cures the FDA is sitting on, because, reasons. A more typical example is clinicians making millions off of injecting stem cells into knee joints when that “miracle” treatment was shown to be essentially worthless 10 years ago.
So tonight I was flipping through the channels, and I ran across an interview with Ken Burns, where he quoted a novelist he likes as saying (paraphrased) “No one ever had their mind changed by an argument. People only respond to stories.”
My first thought was that this is a scathing indictment of humanity. Then it occurred to me to go check the relative prevalence of Myers-Briggs types in the general population…
The combination NT, “intuitive” (as opposed to “sensing”) and “thinking” (as opposed to “feeling”), occurs in only about 10% of the general population. The four types that include both N and T are 4 of the 6 least common combinations. Since I’m an NT, I think that means most people are irrational; but it’s still a fact that the vast majority of people simply do not respond to the world with analysis and thought as a first reflex.
@DrDaveT:
While there’s quite a bit of research in the psychology literature saying the Myers-Briggs categories are hokum, there is an incredibly high concentration of one unusual personality type—INTJ—in the academy, even though we’re only 2% of the general population (and only 0.8% of women). Even we non-STEM types are often INTJs.
@James Joyner:
This depends a bit on what you mean by ‘hokum’. I agree that there is very little evidence that the categories mean what Myers and Briggs (or other later writers) have said they mean. On the other hand…
Exactly. That can’t be a coincidence, and nobody is giving Myers-Briggs tests to applicants. The test measures something, and that something is repeatable. Like an IQ test, even if you abandon any notion of IQ as being a measure of intelligence simpliciter, if you accept that IQ is what IQ tests measure you can use the results predictively.
At my previous company, roughly 80% of the technical staff evaluated as INTJ or ENTJ. It is a measurable attribute, and it is rare in the general population.