Why do people run from facts?

The truth is often unpleasant. Perhaps that is why no one looks for it in disputes. Usually a person who wants to argue about something already holds a strong conviction, and no facts will seem convincing to him.

A typical dialogue. “There was a study that proved that vaccination causes autism.” “Well, that ‘researcher’ lost his medical license a long time ago, and there is a huge body of evidence that there is no link between vaccines and autism.” “It doesn’t matter: as a parent, I have the right to decide for myself what my child needs.” Sound familiar? The argument begins with claims that could be verified if anyone cared to, but as soon as the truth becomes uncomfortable to hear, the person starts avoiding facts.

Hard facts?

On any controversial issue, such as vaccination, health care reform, or same-sex marriage, discussions flare up almost as soon as the topic is raised. At first, many participants try to back up their arguments with scientific evidence, especially since finding it and verifying its validity is not difficult today. What is strange is that biased opinions and sharp disagreements persist anyway. Why does presenting people with the facts so often fail to work? A new study published in the Journal of Personality and Social Psychology suggests that people begin to avoid facts when those facts contradict their beliefs. Sometimes, of course, a person simply questions the reliability of a particular fact. But many go further and shift the problem to another plane, where facts cannot be verified at all. In this way, potentially important facts and scientific data become useless.

The eternal question

Take same-sex marriage, for example. Facts could be decisive in deciding whether to legalize it: data might show that children raised in same-sex families have more problems than their peers from other families, or, on the contrary, that there are no differences. But what if the facts contradict a person’s beliefs? The study involved 174 people with opposing positions on same-sex marriage. They were presented with (supposedly) scientific evidence that either supported or refuted their beliefs. When the facts conflicted with their beliefs, both supporters and opponents of same-sex unions were more likely to argue that what mattered was not the facts but the moral side of the issue. When the facts supported their position, they were more likely to say that their opinion rested on facts rather than on moral considerations. In other words, what the researchers found was not just a denial of specific facts, but a denial of the importance of facts as such.

This experiment clearly shows that when people’s beliefs are threatened, they often begin to deny the relevance of facts altogether. Scientifically speaking, their ideas lose their “falsifiability”: the ability to be tested by the scientific method and confirmed or refuted.

Confirm or deny?

Another example: many people criticize government policy, claiming that the system does not work. If the facts show that it does work, such a person is unlikely to give up his views; he will go on criticizing the government simply on principle. Could it be that moving a discussion away from verifiable facts is precisely what lets people hold on to their beliefs? This hypothesis is difficult to test directly, but psychologists did run an experiment to answer a more fundamental question: is it true that when people find their important beliefs difficult to test objectively, this reinforces their adherence to those beliefs? The results speak in favor of this assumption.

The experiment involved 103 people whose degree of religiosity ranged from “medium” to “high”. It turned out that when highly religious participants were told that the existence of God could never be proven or disproved, they reported stronger belief in the ideas they desired (for example, that God cares for them) than those who were told that science might someday be able to answer the question of God’s existence.

These data show that, at least in some cases, when a question is framed as closed to fact-checking, people become more protective of their pre-existing beliefs. The results echo other studies showing that the less clear the facts are, the more tightly people cling to the beliefs they want to hold.

A few conclusions. Bias is a disease treated with good doses of facts and education: when facts enter a discussion, bias recedes. Unfortunately, the power of facts is far from unlimited. People are ready to “run away” from them and use many other tricks to defend their beliefs, all to avoid reaching conclusions that are unpleasant to them.

To vaccinate society against the virus of bias, it is worth “teaching” people not to fear uncertainty, to develop critical thinking, and to reject ideological dogma. And while we will never completely rid ourselves of bias, in ourselves, in others, or in society as a whole, we do have a chance to learn to rely less on ideology and more on facts.