
Cognitive science explains why it is easier to deny factual evidence than to renounce one’s beliefs.
Have you ever met anyone who changed their mind when you presented facts contrary to their convictions? I never have. Worse, people seem to strengthen their beliefs and defend them all the more fiercely when there is overwhelming evidence against them. The explanation lies in the fact that our view of the world feels threatened by factual evidence that does not support it.
Creationists, for example, challenge the evidence for evolution, such as fossils and genetics, because they worry that secular forces are encroaching on religious faith. Anti-vaxxers distrust pharmaceutical companies and think that money corrupts medicine. This leads them to believe, for example, in a cause-and-effect relationship between vaccines and autism, despite the inconvenient truth that the only study claiming such a link has been retracted and its lead author accused of fraud. 9/11 conspiracy theorists focus on minute details, such as the melting point of the steel in the World Trade Center towers that caused their collapse, because they believe the US government lies and conducts "false flag" operations to create a new world order. Climate deniers study tree rings, ice cores, and greenhouse gas concentrations because they are passionate about freedom, particularly the freedom of industries to conduct their business unconstrained by restrictive regulations. Obsessive "birthers" desperately dissected Barack Obama's birth certificate in search of fraud because they believed that the first African-American president of the United States was a socialist bent on destroying the country.
In these examples, the believers' deeply held worldviews are perceived as threatened by the skeptics, who become "the enemy to be defeated." This grip that belief holds over evidence is explained by two factors: cognitive dissonance and the backfire effect. In the classic 1956 book When Prophecy Fails, psychologist Leon Festinger and his co-authors described what happened to a cult worshiping UFOs after the expected extraterrestrial mother ship failed to arrive at the appointed time. Instead of admitting their mistake, "the group members frantically sought to convince the world of their beliefs," and they made "a series of desperate attempts to erase this dissonance between their belief and reality by making new predictions after the initial prophecy, in the hope that one would eventually come true." Festinger called this state cognitive dissonance, the uncomfortable tension that arises from holding two contradictory ideas simultaneously.
In their 2007 book Mistakes Were Made (But Not by Me), social psychologists Carol Tavris and Elliot Aronson (a former student of Festinger's) document thousands of experiments demonstrating how people distort and select facts to fit their pre-existing beliefs and reduce their cognitive dissonance. Their metaphor of the "pyramid of choice" illustrates how two individuals with nearly identical positions, standing side by side at the top of the pyramid, can quickly diverge once each sets out to defend a position, ending up at the base of the pyramid on opposite faces, with opposite views.
In a series of experiments, Brendan Nyhan of Dartmouth College and Jason Reifler of the University of Exeter identified a second, related factor, called the "backfire effect": correcting factual errors related to a person's beliefs is not only ineffective but actually reinforces those erroneous beliefs, because "it threatens their worldview or self-concept." In one experiment, subjects were given fictitious press articles confirming widespread misconceptions, such as the presence of weapons of mass destruction in Iraq. The participants were then given an article showing that no weapons of mass destruction had ever been found. Liberal subjects who had opposed the war accepted the new article and rejected the old ones, while conservatives who had supported the war did the opposite. Worse, the latter said they were even more convinced of the existence of weapons of mass destruction after reading the article showing there were none, on the grounds that it merely proved that Saddam Hussein had hidden or destroyed them. Indeed, Nyhan and Reifler noted that among many conservatives, "the belief that Iraq possessed weapons of mass destruction immediately before the US invasion persisted long after the Bush administration itself had eventually admitted that this was not the case."
If factual corrections only make things worse, what can we do to convince people that their beliefs are wrong? From my own experience, the following approaches can help:
- Keep your emotions out of the exchange.
- Discuss, don't attack (no ad hominem attacks, no invoking Godwin's law).
- Listen carefully and try to articulate your interlocutor's position accurately.
- Show respect.
- Acknowledge that you understand why someone might hold that opinion.
- Try to show how changing one's view of the facts does not necessarily mean changing one's worldview.
These strategies do not always succeed in convincing people to change their point of view, but at a time when dismissing the truth has become so common in public debate, they can at least help reduce unnecessary discord.
Source: Michael Shermer, ScientificAmerican.com