Have you heard about how Al Gore claimed to have invented the Internet? What about how Iraq was responsible for the attacks on the World Trade Center? Or maybe the one about how George W. Bush has the lowest IQ of any U.S. president ever? Chances are pretty good that you might even believe one (or more) of these claims. And yet all three are false. At FactCheck.org our stock in trade is debunking these sorts of false or misleading political claims, so when the Washington Post told us that we might just be making things worse, it really made us stop and think.
A Sept. 4 article in the Post discussed several recent studies that all seemed to point to the same conclusion: Debunking myths can backfire because people tend to remember the myth but forget what the debunker said about it. As Hebrew University psychologist Ruth Mayo explained to the Post, “If you think 9/11 and Iraq, this is your association, this is what comes in your mind. Even if you say it is not true, you will eventually have this connection with Saddam Hussein and 9/11.” That leaves myth busters like us with a quandary: Could we, by exposing political malarkey, just be cementing it in voters’ minds? Are we contributing to the problem we hope to solve?
Possibly. Yet we think that what we do is still necessary. And we think the facts back us up.
The Post story wasn’t all that surprising to those who follow the findings of cognitive science research, which tells us that much of our thinking happens just below the level of consciousness. The more times we hear two particular bits of information associated, for example, the more likely we are to recall them together. This is how we learn multiplication tables – and why we still know the Big Mac jingle.
Our brains also take some surprising shortcuts. In a study published in the Journal of Personality and Social Psychology, Virginia Tech psychologist Kimberlee Weaver shows that the more easily we recall something, the more likely we are to think of it as true. It’s a useful shortcut since, typically, easily recalled information really is true. But combine this rule with the brain’s tendency to better remember bits of information that are repeated frequently, and we can run into trouble: We’re likely to believe anything we hear repeated often enough. At FactCheck.org we’ve noted how political spin-masters exploit this tendency ruthlessly, repeating dubious or false claims endlessly until, in the minds of many voters, they become true. Making matters worse, a study by Hebrew University’s Mayo shows that people often forget “denial tags.” Thus many people who hear the phrase “Iraq does not possess WMDs” will remember “Iraq” and “possess WMDs” while forgetting the “does not” part.
Countering this requires an understanding of how the brain forms beliefs.
In 1641, French philosopher René Descartes suggested that the act of understanding an idea comes first; we accept the idea only after evaluating whether or not it rings true. Thirty-six years later, the Dutch philosopher Baruch de Spinoza offered a very different account of belief formation. Spinoza proposed that understanding and believing happen simultaneously. We might come to reject something we held to be true after considering it more carefully, but belief happens prior to the examination. On Spinoza’s model, the brain forms beliefs automatically. Rejecting a belief requires a conscious act.
Unfortunately, not everyone bothers to examine the ideas they encounter. On the Cartesian model, that failure results in neither belief nor disbelief. But on the Spinozan model we end up with a lot of unexamined (and often false) convictions.
One might rightly wonder how a 17th-century philosophical dispute could possibly be relevant to modern myth-busting. Interestingly, though, Harvard psychologist Daniel T. Gilbert designed a series of experiments aimed specifically at determining whether Descartes or Spinoza got it right. Gilbert’s verdict: Spinoza is the winner. People who fail to carry through the evaluation process are likely to believe whatever statements they read. Gilbert concludes that “[p]eople do have the power to assent, to reject, and to suspend their judgment, but only after they have believed the information to which they have been exposed.”
Gilbert’s studies show that, initially at least, we do believe everything we hear. But it’s equally obvious that we reject many of those beliefs, sometimes very quickly and other times only after considerable work. We may not be skeptical by nature, but we can nonetheless learn to be skeptical. Iowa State’s Gary Wells has shown that social interaction with those who have correct information is often sufficient to counter false views. Indeed, a study published in the Journal of Applied Psychology by the University of Southern California’s Peter Kim shows that meeting a charge (regardless of its truth or falsity) with silence increases the chances that others will believe the claim. Giving false claims a free pass, in other words, is more likely to result in false beliefs (a notion all too familiar to 2004 presidential candidate John Kerry, who didn’t immediately respond to accusations about his Vietnam record made by a group called Swift Boat Veterans for Truth).
So, yes, a big ad budget often trumps the truth, but that doesn’t mean we should go slumping off in existential despair. You see, the Spinozan model shows that we go on believing whatever we hear only if the process of evaluating those beliefs is somehow short-circuited. Humans are not helpless automatons in the face of massive propaganda. We may initially believe whatever we hear, but we are fully capable of evaluating and rejecting beliefs that turn out not to be accurate. Our brains don’t do this naturally, though; maintaining a healthy skeptical attitude requires some conscious effort on our part. It also requires a basic understanding of logic – and it requires accurate information. That’s where this Web site comes in.
If busting myths has some bad consequences, allowing false information to flow unchecked is far worse. Facts are essential if we are to overcome our brain’s tendency to believe everything it hears. As a species, we’re still pretty new to that whole process. Aristotle invented logic just 2,500 years ago – a mere blink of the eye compared with the 200,000 years during which we Homo sapiens relied on our brain’s reflex responses to avoid being eaten by lions. We still have a long way to go. Throw in a tsunami of ads and Internet bluster and the path gets even harder, which is why we’re delighted to find new allies at PolitiFact.com and the Washington Post’s FactChecker. We’ll continue to bring you the facts. And you can continue to use them wisely.
Sources
Descartes, René. Principles of Philosophy. Tr. John Cottingham. Cambridge: Cambridge University Press, 1985 [1644].
Gilbert, Daniel T., Romin W. Tafarodi and Patrick S. Malone. “You Can’t Not Believe Everything You Read.” Journal of Personality and Social Psychology 65.2 (1993): 221-233.
Kim, Peter H., et al. “Silence Speaks Volumes: The Effectiveness of Reticence in Comparison to Apology and Denial for Responding to Integrity- and Competence-Based Trust Violations.” Journal of Applied Psychology 92.4 (2007): 893-908.
Mayo, Ruth, Yaacov Schul and Eugene Burnstein. “‘I Am Not Guilty’ vs. ‘I Am Innocent’: Successful Negation May Depend on the Schema Used for its Encoding.” Journal of Experimental Social Psychology 40.4 (2004): 433-449.
Spinoza, Baruch de. Ethics. Tr. Edwin Curley. Princeton: Princeton University Press, 1994 [1677].
Weaver, Kimberlee, et al. “Inferring the Popularity of an Opinion from its Familiarity: A Repetitive Voice Can Sound Like a Chorus.” Journal of Personality and Social Psychology 92.5 (2007): 821-833.
Wright, E.F. and Gary L. Wells. “Does Group Discussion Attenuate the Dispositional Bias?” Journal of Applied Social Psychology 15 (1985): 531-546.