Ever try to use facts to win an argument, only to have the other person dig in their heels even harder? A new study suggests that there’s a “backfire effect” when you show someone evidence that contradicts their beliefs.
Researchers had two groups of people read a fake news story about the presence of weapons of mass destruction in Iraq. Both versions included a quote from President Bush that seemed to imply there were WMDs in Iraq. One version of the story also included a quote from the Duelfer Report showing no evidence of WMDs; the other version omitted this correction.
After reading one version or the other, participants were asked whether they agreed or disagreed with a statement claiming that Iraq had weapons of mass destruction prior to the US invasion. Those who rated themselves as liberal, left of center, or centrist disagreed, and whether they had read the correction had little effect on their views. Those who rated themselves as conservative agreed. In fact, they agreed even more strongly when they read the article with the correction than when they read the article without it.
The study included a few other similar experiments, all with the same theme, and identified conditions that amplified this "Backfire Effect." Participants were more likely to experience it when they sensed that the contradictory information came from a source hostile to their political views. But under many conditions, the mere existence of contradictory facts made people more sure of themselves, or at least made them claim to be more sure.