The prevalence of misinformation in contemporary political discourse, on both sides of the political spectrum, is extraordinarily disconcerting. In recent years, we have seen widespread support for the conspiracy theory that Bush administration officials were complicit in the 9/11 terrorist attacks (especially among Democrats), and for the myth that President Obama was not born in this country (especially among Republicans). These myths have proven remarkably difficult to correct, and their persistence undermines citizens’ ability to cast informed votes and participate meaningfully in public debate.
Misperceptions about political candidates and government officials receive the most media coverage. If public misperceptions were limited to political candidates, perhaps we could write them off as an unhappy consequence of political campaigns. But they aren’t. Misperceptions about policy are just as widespread. For instance, surveys have repeatedly demonstrated that Americans are woefully uninformed about the amount the U.S. spends on foreign aid, the performance of the economy under presidents from the other party, and the effects of tax cuts on government revenue.
To combat this problem, in recent years we have seen the rise of political fact checking. Through devices such as “Pinocchios” and “Pants-on-Fire” verdicts, journalists have formally asserted their right to adjudicate the truth or falsehood of politicians’ carefully constructed narratives. But does fact checking work?
Fact checking in a political context rests on a relatively simple premise: a politician makes a claim, reporters investigate, and a verdict is rendered on the veracity of that claim. The fact-checking process, however, is not merely about labeling a statement “true” or “false.” Rather, the verdict is meant to matter. Effective fact checking means not only catching and correcting a falsehood, but also doing so in a way that dislodges it from the public mind.
The available evidence from social science, however, casts doubt on whether effective fact checking is even possible, particularly in situations where the misperception relates to highly salient or controversial issues.
There is some research suggesting that people’s policy opinions can sometimes be responsive to new information. One of the more optimistic studies in this area found that giving policy-specific information to survey respondents affected their policy preferences in understandable ways. Specifically, the researchers found that: (1) telling participants that the crime rate “is now lower than at any time since 1974” resulted in less support for increased federal spending on prisons; and (2) telling participants that “the amount of money we spend on foreign aid has been going down and now makes up less than one cent of every dollar that the federal government spends” resulted in increased support for federal spending on foreign aid.
However, the effect is not consistent. Similar studies have found that providing people with information about the human or financial costs of the Iraq war did little to alter judgments about whether the war was worth fighting, and that giving people the actual number of undocumented immigrants in the U.S. did not change their attitudes toward immigration. More facts, it seems, aren’t always better.
Beyond this inconsistency, a major drawback of the research in this area is that it generally does not directly measure the misperception itself, making it difficult to determine whether underlying factual beliefs actually change in response to corrective information. Studies that do focus on the misperception have found that correcting it is extraordinarily difficult, and in some cases even counterproductive.
In 2006, for example, Brendan Nyhan and Jason Reifler created fake newspaper articles about polarizing political issues. In one of their experiments, Nyhan and Reifler asked participants to read a mock news article about the Iraq war suggesting that Saddam Hussein possessed weapons of mass destruction (“WMD”) that he could have passed to terrorists after September 11, 2001 (a false claim). In the experimental condition, the story then discussed the Duelfer Report, which documented the lack of Iraqi WMD stockpiles (that is, it corrected the earlier false suggestion that Iraq possessed WMD). After reading the article, participants were asked whether they agreed with the following statement: “Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.”
Nyhan and Reifler found that, for subjects who identified as “very liberal,” the correction worked as expected, making them more likely than controls to disagree with the statement that Iraq had WMD. The correction had no statistically significant effect on individuals who described themselves as “liberal,” “somewhat left of center,” or “centrist.” Those reactions shouldn’t necessarily surprise you. What should give you pause, though, is how “conservatives” responded to the correction. After reading that there were no WMD, they reported being even more certain than before that there actually were WMD and that their original beliefs about Iraq were correct. In other words, the correction “backfired.”
Nyhan and Reifler repeated the experiment with other salient or controversial issues, such as stem cell research and tax cuts. Once again, they found that corrections strengthened participants’ misperceptions when those corrections contradicted their ideology.
Recent research points to two potential explanations for the “backfire effect.” First, a recent psychological study suggests that negating a descriptor that lacks an opposing concept (e.g., “criminal”) can backfire: repeating a false claim with a negation (“John is not a criminal”) leads people to more easily remember the false claim at the core of the sentence (“John is a criminal”), reinforcing the very association the speaker intends to negate. By contrast, negations are more effective when the descriptor has an opposing concept (e.g., rich/poor), because listeners can encode the opposite (“John is poor”) rather than the negated claim. Second, corrections may fail to reduce misperceptions because they make readers better able to repeat the rumor, and the more fluently a claim comes to mind, the more likely people are to judge it true; fluency serves as a heuristic for truth.
Regardless of the specific explanation for the “backfire effect,” however, we have long known that people engage in “selective exposure” – that is, they seek out information that is consistent with their pre-existing views and avoid information that contradicts their prior beliefs. And once people are exposed to information, they engage in “motivated reasoning” – that is, they are prone to accepting claims that reinforce their pre-existing views (confirmation bias) while rejecting or ignoring statements that undermine their beliefs or opinions (disconfirmation bias). In many ways, just as confirmation bias shields you when you actively seek information, the backfire effect defends you when information blindsides you, leading you to cling to your beliefs, and indeed strengthen them, rather than question them.
The persistence of these biases appears to be correlated with political sophistication, such that they are stronger among politically sophisticated individuals and those with stronger attitudes, and weaker among the less politically sophisticated and those with weak attitudes. In addition, there is currently a debate about whether these biases may be stronger among political conservatives than political liberals.
If the social science were boiled down to one succinct sentence, it would be this: once something is added to your collection of beliefs, you instinctively and unconsciously protect it from attitude-inconsistent information. This is the reality political “fact checkers” must confront, and the academic research paints a pessimistic picture – the most salient misperceptions are widely held, easily spread, and incredibly difficult, if not virtually impossible, to correct.
None of this should be read to suggest that fact checking is a fruitless enterprise. It is not. Rather, careful fact checking is at the heart of good journalism. But as fact checking grows beyond a cottage industry, it is important to recognize its limits. For fact checking to become the civic corrective it aims to be, it must not only recognize our cognitive biases; it must also find ways to overcome them.
In the end, truth will out. Won’t it?