Jumping To Conclusions: Why It Happens And How To Stop It

Every day requires hundreds of tiny judgments and decisions. Usually, there’s not enough time or information to make a rigorous judgment backed by solid evidence and airtight reasoning. So we take shortcuts.

For instance, I recently had to evaluate a research proposal. But I did not have enough time to read through all the researcher’s previous work. So I trusted her reputation and a persuasive summary of the project and supported the work.

Most of the time this kind of reasoning is justified, but every now and then it goes too far. People jump to conclusions that aren’t warranted by the limited information at their disposal — or they consider conclusions solid that should actually only be regarded as very tentative.

As with other cognitive biases, jumping to conclusions is usually harmless. Even when judgments based on limited reasoning and quick thinking are wrong, the consequences are rarely dire. It’s unfortunate, for example, if you dismiss a young politician too quickly because of his age, but it’s hardly the end of the world.

But it is a problem when jumping to conclusions becomes the default, especially in decisions about complex social and political issues, where more fine-grained reasoning is typically required. The prevalence of jumping to conclusions on social media contributes to serious negative trends, like polarization and even conspiracy theorizing. This can happen to even the sharpest among us. Take, for example, a recent episode in which a number of Jeopardy contestants jumped to an extreme conclusion about a fellow contestant’s innocuous hand signal.

So, why do people end up jumping to conclusions? And what kinds of educational interventions and reminders might help us all reason better? Recent research from Professors Carmen Sanchez and David Dunning may shed light on this question.

First, what exactly is jumping to conclusions? Technically defined, it involves extrapolation from limited evidence that goes well beyond what is warranted by that evidence. People prone to jumping to conclusions take small amounts of evidence as reason for being confident about conclusions that are not warranted; they read meaningful patterns into random data; or they develop elaborate theories on the basis of one or two items of data that are not representative.
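Reading meaningful patterns into random data is easy to underestimate, because chance alone produces "streaks" far more often than intuition suggests. The following simulation is an illustrative sketch (not part of the Sanchez and Dunning study): it estimates how often 20 fair coin flips contain a run of four or more identical outcomes — the kind of streak an exuberant theorizer might treat as evidence of a pattern.

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

def longest_run(flips):
    """Length of the longest run of identical outcomes in a sequence."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

trials = 10_000
hits = sum(
    longest_run([random.choice("HT") for _ in range(20)]) >= 4
    for _ in range(trials)
)
print(f"Chance of a streak of 4+ in 20 fair flips: {hits / trials:.0%}")
```

The exact answer is about 77 percent: in roughly three out of four such sequences, pure chance produces a streak that looks like a pattern but carries no meaning at all.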

This behavior is often associated with clinical problems like schizophrenia, anxiety, and depression, and it can contribute to delusional thinking — though it is also a problem in the general population.

Sanchez and Dunning’s experiments sought to learn more about this bias in a nonclinical population and what can be done to remediate it. Specifically, they investigated what kinds of other behaviors are associated with jumping to conclusions and they tested a promising intervention strategy.

In one experiment, participants were asked to assess whether fish being caught one by one were coming from Lake 1, where the fish are 80 percent red and 20 percent gray, or Lake 2, where they are 80 percent gray and 20 percent red. Participants could either come to a decision or ask to see more fish. On average, they were ready to make a guess after seeing three to four fish. But those prone to jumping to conclusions, around 30 percent of participants, issued a verdict after just two fish or fewer.
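A simple Bayesian calculation shows how much evidence each fish actually provides. This sketch assumes equal prior odds on the two lakes (the article does not state the priors used in the study) and computes the posterior probability that the fish are coming from Lake 1 after a given sequence of draws:

```python
from math import prod

def posterior_lake1(draws, p_red_1=0.8, p_red_2=0.2, prior=0.5):
    """Posterior probability the fish come from Lake 1 (80% red),
    given observed colors ('red' or 'gray') and a prior on Lake 1."""
    like1 = prod(p_red_1 if d == "red" else 1 - p_red_1 for d in draws)
    like2 = prod(p_red_2 if d == "red" else 1 - p_red_2 for d in draws)
    return like1 * prior / (like1 * prior + like2 * (1 - prior))

print(round(posterior_lake1(["red"]), 3))         # one red fish  -> 0.8
print(round(posterior_lake1(["red", "red"]), 3))  # two red fish  -> 0.941
print(round(posterior_lake1(["red"] * 4), 3))     # four red fish -> 0.996
```

Under these assumptions, two same-colored fish already push the posterior above 94 percent — so the problem with the "high jumpers" is less the arithmetic of any one guess than the habit of never asking whether more evidence is worth gathering.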

The researchers then put participants through a battery of other tests, including a cognitive reflection test, which shows how well people can avoid intuitive, but wrong, answers. “High jumpers” (those prone to conclusion jumping) performed worse on the reflection test and other cognitive tests. They had more trouble evaluating logical arguments and accurately evaluating gambling odds, and they showed overconfidence in answering questions on civics and current events.

They also made errors caused by overconfidence when learning new tasks, like diagnosing a hypothetical medical malady. Those prone to jumping to conclusions were “exuberant theorizers,” meaning they were more likely to develop theories of how to complete the diagnosis based on little experience or evidence — leading to more mistakes.

Finally, they were also shown to be more likely to hold “oddball” or conspiratorial beliefs: including that health officials are hiding evidence that cell phones cause cancer or that the Apollo moon landings were faked. Jumping to conclusions may, therefore, help explain, as other research has suggested, the attraction of conspiracy theories and other misinformation.

Further analysis of these reasoning failures showed that jumping to a conclusion involves a problem with what is called “controlled processing.” This is opposed to “automatic processing,” where judgments are made and tasks completed without conscious awareness of what is happening. In controlled processing people are explicitly aware of the decision-making process. In other words, they “see themselves” thinking.

Sanchez and Dunning were able to isolate parts of these tasks and found that the correlation between conclusion jumping and poor cognitive performance is connected to a lack of controlled processing. Simply put, those more prone to make snap judgments don’t spend as much time thinking reflectively and analytically and are thus susceptible to the mistakes of automatic judgment.

This finding suggests that encouraging more explicit and analytical thinking might help curb jumping to conclusions. To see if educational interventions could have any impact, the researchers relied on tools that have been used successfully in schizophrenia treatment. Specifically, they used metacognitive training tools that have proven effective in combating certain thinking patterns in schizophrenic patients who are prone to jumping to conclusions and delusional beliefs.

The application of schizophrenia treatment may sound like a “jump” itself, but it makes sense. Metacognition is a core part of critical thinking, in any population. Metacognitive training is geared toward getting people to see their own thinking patterns and adjust them intentionally in line with evidence. It includes exposing trainees to examples of problematic thinking like overconfidence, jumping to conclusions, and single-cause explanations. Trainees also practice externalizing their own thinking in writing or speech, reflecting on it, and pushing back against it explicitly with more rational and analytical thinking.

The training proved successful with Sanchez and Dunning’s population, at least in part. It was particularly effective in reducing the overconfidence that is usually paired with jumping to conclusions. The researchers write that this suggests productive intersections between clinical and social psychology.

It also suggests that there may be a number of quick and easy ways to educate the general population to think more carefully and analytically. More than ever, a complex and oversaturated information environment calls on news consumers to resist easy answers, bad reasoning, and outright fake stories. Inflammatory stories and easy narratives push people exactly toward knee-jerk responses and emotional reasoning.

Developing quick and easy metacognitive interventions that could be completed online could be a significant step forward. It’s clear that people in general lack the skills they need to think through the information environment. Tools and education are needed to help them arrive at productive judgments and conclusions — while keeping their feet firmly on the ground.

