Trump supporters know Trump lies.

During the campaign — and into his presidency — Donald Trump repeatedly exaggerated and distorted crime statistics.

“Decades of progress made in bringing down crime are now being reversed,” he asserted in his dark speech at the Republican National Convention in July 2016. But the data here is unambiguous: FBI statistics show crime has been going down for decades.
CNN’s Jake Tapper confronted Trump’s then-campaign manager, Paul Manafort, right before the speech. “How can the Republicans make the argument that somehow it’s more dangerous today, when the facts don’t back that up?” Tapper asked.
“People don’t feel safe in their neighborhoods,” Manafort responded, and then dismissed the FBI as a credible source of data.
This type of exchange — where a journalist fact-checks a powerful figure — is an essential task of the news media. And for a long time, political scientists and psychologists have wondered: Do these fact checks matter in the minds of viewers, particularly those whose candidate is distorting the truth? Simple question. Not-so-simple answer.
In the past, research has found that not only do facts fail to sway minds, but they can sometimes produce what’s known as a “backfire effect,” leaving people even more stubborn and sure of their preexisting beliefs.
But there’s new evidence on this question that’s a bit more hopeful. It finds backfiring is rarer than originally thought — and that fact-checks can make an impression on even the most ardent of Trump supporters.
But there’s still a big problem: Trump supporters know their candidate lies, but that doesn’t change how they feel about him. Which prompts a scary thought: Is this just a Trump phenomenon? Or can any charismatic politician get away with being called out on lies?

Earlier studies found that not only do fact-checks not work, but they can actually backfire

In 2010, political scientists Brendan Nyhan and Jason Reifler published one of the most talked about (and most pessimistic) findings in all of political psychology.
The study, conducted in the fall of 2005, split 130 participants into groups who read different versions of a news article about President George W. Bush defending his rationale for engaging in the Iraq War. One version merely summarized Bush’s rationale — “There was a risk, a real risk, that Saddam Hussein would pass weapons or materials or information to terrorist networks.” Another version of the article offered a correction that, no, there was not any evidence Saddam Hussein was stockpiling weapons of mass destruction.
The results were stunning: Staunch conservatives who saw the correction became more likely to believe Hussein had weapons of mass destruction. (In another experiment, the study found a backfire on a question about tax cuts. On other questions, like on stem cell research, there was no backfire.)
“Backfire is a pretty radical claim if you think about it,” Ethan Porter, a political scientist at George Washington University, says. Not only do attempts to correct information not sink in, but they can actually make conflicts even more intractable. It means earnest attempts to educate the public may actually make things worse. So in 2015, Porter and a colleague, Thomas Wood at the Ohio State University, set out to try to replicate the effect for a paper (which is currently undergoing peer review for publication in the journal Political Behavior).
And among 8,100 participants — and on the sort of political questions that tend to bring out hardline opinions — Porter and Wood hardly found any evidence of backfire. (The one exception, interestingly, was the question of weapons of mass destruction in Iraq. But even on that, the backfire effect went away when they tweaked the wording of the question.)
“There’s no evidence that backfire describes a common reflex of Americans” when it comes to facts, Porter assures me. (Nyhan, for his part, never asserted that backfire was ubiquitous, just that it was a possible and particularly consequential result of fact-checking.)
Stories of failed replications in social psychology often grow ugly, with accusations of bullying and scientific misconduct flying in both directions. But in this story, researchers decided to team up to test the idea again.
