"A radical environmentalist, a socialist, and an illegal alien walk into a bar. The bartender says: 'What you having, Mr. President?'"

--Sam Aanestad, Republican candidate for Congress, warming up the crowd at a Tea Party Patriots debate

The ‘continued influence effect’ is short for ‘the continued influence of misinformation.’ The term refers to the way false claims enter memory and continue to influence beliefs even after they have been corrected. Unfortunately, many people do not understand how memory works. Worse, they have little interest in the science of memory. If a false claim fits with beliefs that more or less define a person's worldview and has a strong emotional component, people instinctively accept the false claim rather than investigate it as a critical thinker would.

For example, a few people--whose motives we need not explore here--were able to manipulate the mass media into making a story out of the claims that Barack Obama was born in Kenya and is a Muslim. If these claims were true, Mr. Obama would not be eligible to be president of the United States. Actually, only the first claim--that he was born in Kenya--would make him ineligible. Despite the presentation of overwhelming evidence that President Barack Obama is a Christian and was born in Hawaii, many Americans continue to believe otherwise. On March 12, 2012, a poll of Republican voters in Alabama and Mississippi found that about half still believe he is a Muslim. The importance of emotion and worldview in sustaining this erroneous belief is indicated by the fact that about 25% of those who think Obama is a Muslim also believe that interracial marriage should be illegal. (Obama's father was Kenyan; his mother was a white American of Irish descent.) One year ago, a national poll found that one-fourth of all Americans think President Obama was not born in the United States. Among Republicans and Tea Party supporters, 45% believe he was born in another country.
It is very difficult to be fair and balanced in evaluating new information when one has a strong emotional attachment to beliefs that conflict with the new information.
Some people believed Vice President Dick Cheney when he claimed: “There is no doubt that Saddam Hussein now has weapons of mass destruction; there is no doubt that he is amassing them to use against our friends, against our allies, and against us.” The good news is that many people changed their minds when provided with good evidence contrary to what Cheney had claimed. Several years after Cheney made his false claim, with the evidence for it remaining near zero, the percentage of Americans who accepted the falsehood about Saddam and weapons of mass destruction dropped from 36 to 26 percent. Still, one out of four Americans believing something false is nothing to be proud of.
It should be obvious that most of us are not critical of claims that fit well with our prejudices and emotion-laden beliefs. Still, you would think that we would give up believing something once the evidence shows that we’re wrong, especially since most of us are encouraged in childhood to be truthful and honest. The science indicates otherwise. See, for example, my previous post on the backfire effect. Even without the science, most of us know from experience that some nuts are nearly impossible to crack. One of the more obvious examples is religion. Last year, a Gallup poll found that 3 in 10 Americans take the Bible to be the literal word of the god of Abraham. Another 49 percent say the Bible is inspired by a god but should not be taken literally. When you are taught something from childhood that is continually reinforced by your family and other communities, it is very difficult to be fair and balanced in evaluating evidence that conflicts with those teachings. On the other hand, in areas where emotion is less dominant, people faced with overwhelming evidence contrary to what they believe do correct their errors. This is what happens in science again and again, unlike what has occurred with fundamentalist religious believers.
It has long been known that false information can influence memory. Recent studies have found that correcting false information often has little effect on changing beliefs. Discredited information continues to influence reasoning and understanding even after it has been corrected. The backfire and continued influence effects should be disheartening to those who think that the first step in arguing with people who base their beliefs on misinformation is to get them to see what the facts are. Correcting errors is pointless when dealing with people who attribute their own beliefs to principled, unprejudiced inquiry while attributing the beliefs of those who disagree with them to bias and ulterior motives. And even if a person admits that those who disagree with him have integrity and are really seeking the truth, you are probably wasting your time providing data and facts that might change his mind if the claim you are trying to correct challenges his gut feelings and core beliefs.
Critical thinkers want errors corrected. At the very least, getting the facts right might prevent some faulty inferences and keep one from behaving in ways that could prove harmful. Is there any hope that those who tend to stick to their beliefs--no matter what the evidence--can change? Yes, there’s some hope, but it is very slight. A study by Ullrich Ecker, Stephan Lewandowsky, and David Tang found that giving subjects detailed information about the continued influence effect reduced their reliance on outdated information but did not eliminate it. They also found that reminding people that facts are not always properly checked before information is published in the media did little to reduce the continued influence of misinformation. Hollyn M. Johnson and Colleen M. Seifert have argued that providing a plausible causal alternative, rather than simply negating misinformation, mitigates the continued influence effect (“Sources of the continued influence effect: When misinformation in memory affects later inferences”). They may be right for some beliefs, but I have not found that providing a causal alternative to astrologers, acupuncturists, homeopaths, parapsychologists, or defenders of applied kinesiology, for example, has had much effect on true believers. Political beliefs, religious beliefs, and conspiratorial beliefs seem impenetrable to facts that contradict them. Changes in these beliefs seem more likely to occur outside of direct confrontation with opponents.
One obviously important area where reliance on misinformation can be harmful is in the courtroom. Jurors’ reasoning is influenced by misinformation. Just warning them that something they’ve been presented with is false won’t necessarily prevent the false information from affecting their thinking. We like to think that such warnings would prevent our memories from being distorted, but the way memory ordinarily works is that it often instinctively draws on misinformation—even misinformation that we know is wrong because we’ve had it corrected. In addition to the Ecker et al. study and the Johnson and Seifert study, another important study has found evidence for the continued influence of corrected misinformation: Brendan Nyhan and Jason Reifler's “When Corrections Fail: The persistence of political misperceptions.”
So, is there anything that might change the minds of those who believe Obama is a Muslim born outside of the United States or that Saddam Hussein was behind 9/11 and had weapons of mass destruction before George W. Bush ordered an invasion of Iraq? I would argue that if a person is driven by emotions, especially fear, you probably have little chance of changing his mind. If, however, a person is not driven by emotion and is flexible and open to new information, you have a good chance of changing that person’s mind by providing accurate information backed up with reliable sources. What percentage of so-called birthers are not driven by emotion and are comfortable with changing their minds as new information becomes available? Figure that out and you will know the probable odds of your success at persuading a birther to change his mind. I’d say your odds of success are about the same as those of getting someone who considers “abortion the ultimate child abuse” to engage in a rational discussion of the moral and legal issues regarding abortion.
Finally, further complicating matters are the suggestions that some people's personalities and brains are structured in ways that make them nearly impenetrable to data that conflict with what in their hearts they know to be true. Their belief armor, perhaps, makes them impervious to change unless one appeals directly to their emotions and gut feelings without challenging the core beliefs that define who they are. Data run off the backs of some people who are moved to tears by an emotional story that is merely anecdotal. Authoritarian personalities, certain that what they know is true, are driven by fear of "liberals" conspiring to take away their freedom and establish an atheistic, socialist state; they are not going to be very open-minded when it comes to things like Obama's citizenship or abortion. The belief armor is strengthened by the tendency of many people to seek out sources of misinformation, perhaps deluding themselves into thinking that they are truth seekers. But, hey, Stephen Colbert figured this out a long time ago.