Does the Backfire Effect Really Exist?

If you’ve been active in political circles in the last year or so, you’ve probably heard about the “backfire effect”: When people are presented with facts that contradict their stated position, they dig in deeper, becoming even more convinced that they’re right.

It’s a disheartening sociological phenomenon to consider, because it suggests that appeals to facts don’t always work. But the backfire effect also turns out to … maybe not exist.

Researchers at George Washington University and Ohio State University tried to replicate the effect in a study of more than 10,100 people spanning 52 issues, and they couldn’t find strong support for it. Perhaps, it turns out, facts really do matter.

Before we delve in, some important things to understand about social psychology and this kind of research: No one study can definitively prove, or disprove, a hypothesis about how people think. For that, we need lots of different studies, conducted by different people in different settings. But a study can suggest that we explore an issue further.

Be wary of news reporting — or research — that triumphantly proclaims a definitive conclusion on the basis of one study. This research shows that a widely-accepted and studied hypothesis could benefit from a closer look, and hopefully it will set the stage for more research into how people interact with each other during arguments and discussions. That, in turn, could furnish us with some useful material for learning how to effectively change hearts and minds.

In the original “backfire effect” study, the researchers found that when participants were presented with corrections on contentious issues, they didn’t just disregard them. They grew actively more convinced that their positions were correct. For instance, when presented with information that Iraq never had weapons of mass destruction — thus eroding the case for invading the country — a person might say that the failure to find them was evidence that Iraq had successfully concealed them.

I was certainly among the many people who found this study highly persuasive, and maybe I relied on it a little too heavily — in part because I fell into the trap of a logical fallacy: It seemed to align with my beliefs about other humans, so I thought it was probably true. And I had ample anecdotal evidence to support my belief from any number of arguments over everything from abortion policy to sexism in the tech industry.

These researchers might have expected to confirm the backfire effect with their study, in which they took on charged political issues and walked their participants through a process much like that used in the original research. To their astonishment, though, it turned out that the facts were persuasive. People did, in fact, adjust their position in response to a correction — though if people received corrections that reinforced or validated their attitudes, they tended to react more strongly.

This research suggests that it’s worth it to correct misinformation — and to present people with data that contradicts a given attitude or belief. But, if that’s the case, why does it feel like confronting people with facts makes them shut down and tune out the argument? While that’s not an issue these researchers delved into, it would be fascinating to learn more.

Does the format and presentation of facts change how people interact with them? How much does tone matter? Does the authority of the person involved change their perspective? Knowing the answers to questions like these could help us have more functional and pleasant conversations with people — even when we’re talking about very contentious issues.

For now, the big takeaway is this: Keep presenting people with new information because, as it turns out, they’re listening.

Photo Credit: Tam Tran/Flickr



heather g, about a year ago

If people have never been encouraged to question facts by their parents or through schooling, they tend to be biased — and even racist in extreme cases.

Chrissie R, about a year ago

@Ian T. Agreed!

Julie W, about a year ago

No matter the topic, I've found it a complete waste of time to argue with a stupid person. They are not interested in facts.

Dan Blossfeld, about a year ago

Ian T is a prime example of the backfire effect. He seems to be digging in deeper.


Ian T, about a year ago

Ultimately, people will believe what they want to believe; evidence is a major irritation for such individuals. After all, most people follow some religion or other — the pre-eminent evidence-free zones. "Fake news" is nothing new; it has been around for thousands of years — excellent sources include the Bible and the Koran.
