Because this is a corporate blog and not a political one, I normally steer clear of political issues. This post is no different. Although it focuses on studies that touch on why campaigns use certain tactics, the underlying theories are important in a number of areas, including security, finance, and development. In each of those areas, perception is often as important as (or even more important than) reality. We generally dismiss much of what is said in advertisements because we know that ad agencies often ignore the truth to push a product. For example, Lehman Brothers’ advertising slogan was “Where Vision Gets Built,” and AIG has used the slogan “We Know Money.” And who could forget Merrill Lynch’s famous slogan, “We’re Bullish on America”?

Basically, this post is about lies, half-truths, and misrepresentations, and how they get accepted as truth. It’s an important topic because believing lies can lead to disastrous consequences. It was lies broadcast over the radio in Rwanda, for example, that led to the 1994 genocide there. Hutus were led to believe that their Tutsi neighbors were coming to kill them and that, to save themselves and their families, they had to strike first. As a result, hundreds of thousands of people were slaughtered with machetes and cudgels wielded by once-friendly neighbors.
There has been a lot said and written in the media about the lies, half-truths, and misrepresentations being made by U.S. presidential candidates. Even Karl Rove, who many believe is the master of half-truths, has noted that both candidates have “crossed the line” in some of their campaign ads. Washington Post columnist Shankar Vedantam reports that campaigns resort to such tactics because they work [“The Power of Political Misinformation,” 15 September 2008]. He writes:
“Have you seen the photo of Republican vice presidential nominee Sarah Palin brandishing a rifle while wearing a U.S. flag bikini? Have you read the e-mail saying Democratic presidential nominee Barack Obama was sworn into the U.S. Senate with his hand placed on the Koran? Both are fabricated — and are among the hottest pieces of misinformation in circulation. As the presidential campaign heats up, intense efforts are underway to debunk rumors and misinformation. Nearly all these efforts rest on the assumption that good information is the antidote to misinformation.”
The Bible claims “the truth will set you free.” It doesn’t promise that it will get you elected. Vedantam reports why this is so.
“A series of new experiments show that misinformation can exercise a ghostly influence on people’s minds after it has been debunked — even among people who recognize it as misinformation. In some cases, correcting misinformation serves to increase the power of bad information.”
Propagandists have known this for a long time. Say something often enough, even if it’s totally false, and people will begin to believe it. Vedantam discusses experiments that prove the point.
“In experiments conducted by political scientist John Bullock at Yale University, volunteers were given various items of political misinformation from real life. One group of volunteers was shown a transcript of an ad created by NARAL Pro-Choice America that accused John G. Roberts, Jr., President Bush’s nominee to the Supreme Court at the time, of ‘supporting violent fringe groups and a convicted clinic bomber.’ A variety of psychological experiments have shown that political misinformation primarily works by feeding into people’s preexisting views. People who did not like Roberts to begin with, then, ought to have been most receptive to the damaging allegation, and this is exactly what Bullock found. Democrats were far more likely than Republicans to disapprove of Roberts after hearing the allegation. Bullock then showed volunteers a refutation of the ad by abortion-rights supporters. He also told the volunteers that the advocacy group had withdrawn the ad. Although 56 percent of Democrats had originally disapproved of Roberts before hearing the misinformation, 80 percent of Democrats disapproved of the Supreme Court nominee afterward. Upon hearing the refutation, Democratic disapproval of Roberts dropped only to 72 percent. Republican disapproval of Roberts rose after hearing the misinformation but vanished upon hearing the correct information. The damaging charge, in other words, continued to have an effect even after it was debunked among precisely those people predisposed to buy the bad information in the first place.”
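Those percentages are worth a second look. Here is a quick back-of-the-envelope reading of Bullock’s numbers (my own arithmetic, not part of the study itself), showing how little of the damage the refutation actually undid among Democrats:

```python
# Back-of-the-envelope look at Bullock's numbers for Democrats
# (my arithmetic, not part of the study itself).
baseline = 0.56          # disapproved of Roberts before seeing the NARAL ad
after_ad = 0.80          # disapproved after hearing the misinformation
after_refutation = 0.72  # disapproved after the ad was debunked

effect = after_ad - baseline            # 0.24: damage done by the ad
residual = after_refutation - baseline  # 0.16: damage left after debunking

print(f"Share of the smear's effect that survived: {residual / effect:.0%}")
# -> Share of the smear's effect that survived: 67%
```

By this rough measure, debunking the ad erased only about a third of its effect.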
It seems that people have a predisposition to hear things or seek out opinions that support their core beliefs. Conservatives listen to Fox News and liberals read the New York Times. Vedantam continues:
“Political scientists Brendan Nyhan and Jason Reifler provided two groups of volunteers with the Bush administration’s prewar claims that Iraq had weapons of mass destruction. One group was given a refutation — the comprehensive 2004 Duelfer report that concluded that Iraq did not have weapons of mass destruction before the United States invaded in 2003. Thirty-four percent of conservatives told only about the Bush administration’s claims thought Iraq had hidden or destroyed its weapons before the U.S. invasion, but 64 percent of conservatives who heard both claim and refutation thought that Iraq really did have the weapons. The refutation, in other words, made the misinformation worse. … In a paper approaching publication, Nyhan, a PhD student at Duke University, and Reifler, at Georgia State University, suggest that Republicans might be especially prone to the backfire effect because conservatives may have more rigid views than liberals: Upon hearing a refutation, conservatives might ‘argue back’ against the refutation in their minds, thereby strengthening their belief in the misinformation. Nyhan and Reifler did not see the same ‘backfire effect’ when liberals were given misinformation and a refutation about the Bush administration’s stance on stem cell research. Bullock, Nyhan and Reifler are all Democrats. Reifler questioned attempts to debunk rumors and misinformation on the campaign trail, especially among conservatives: ‘Sarah Palin says she was against the Bridge to Nowhere,’ he said, referring to the pork-barrel project Palin once supported before she reversed herself. ‘Sending those corrections to committed Republicans is not going to be effective, and they in fact may come to believe even more strongly that she was always against the Bridge to Nowhere.’”
These experiments may show that the effects they test are real, but they don’t explain why those effects occur. Sam Wang, an associate professor of molecular biology and neuroscience at Princeton, and Sandra Aamodt, a former editor in chief of Nature Neuroscience, attempt to explain this phenomenon in an article published in the New York Times [“Your Brain Lies to You,” 27 June 2008]. They indicate that the way our brain is wired helps explain why we believe things that can be proven false:
“False beliefs are everywhere. Eighteen percent of Americans think the sun revolves around the earth, one poll has found. Thus it seems slightly less egregious that, according to another poll, 10 percent of us think that Senator Barack Obama, a Christian, is instead a Muslim. The Obama campaign has created a Web site to dispel misinformation. But this effort may be more difficult than it seems, thanks to the quirky way in which our brains store memories — and mislead us along the way. The brain does not simply gather and stockpile information as a computer’s hard drive does. Facts are stored first in the hippocampus, a structure deep in the brain about the size and shape of a fat man’s curled pinkie finger. But the information does not rest there. Every time we recall it, our brain writes it down again, and during this re-storage, it is also reprocessed. In time, the fact is gradually transferred to the cerebral cortex and is separated from the context in which it was originally learned. For example, you know that the capital of California is Sacramento, but you probably don’t remember how you learned it. This phenomenon, known as source amnesia, can also lead people to forget whether a statement is true. Even when a lie is presented with a disclaimer, people often later remember it as true.”
Apparently, the longer a falsehood resides in our memories, the truer it seems to us. Wang and Aamodt explain:
“With time, this misremembering only gets worse. A false statement from a noncredible source that is at first not believed can gain credibility during the months it takes to reprocess memories from short-term hippocampal storage to longer-term cortical storage. As the source is forgotten, the message and its implications gain strength. This could explain why, during the 2004 presidential campaign, it took some weeks for the Swift Boat Veterans for Truth campaign against Senator John Kerry to have an effect on his standing in the polls.”
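Wang and Aamodt describe this dynamic (known in psychology as the “sleeper effect”) only qualitatively. As a purely illustrative sketch, here is a toy model of it; the decay rates, the discounting rule, and the believability function are all my own assumptions, not anything measured in their article:

```python
# Toy model of the "sleeper effect" described above (illustrative only;
# the decay rates and the combination rule are invented for this sketch).
import math

def believability(t_weeks, source_distrust=0.9, source_decay=0.5, message_decay=0.05):
    """Crude sketch: distrust of the source fades faster than the message itself."""
    remembered_distrust = source_distrust * math.exp(-source_decay * t_weeks)
    remembered_message = math.exp(-message_decay * t_weeks)
    return remembered_message * (1 - remembered_distrust)

for week in (0, 2, 4, 8):
    print(f"week {week}: believability = {believability(week):.2f}")
# week 0: 0.10 -- the claim is heavily discounted at first
# week 2: 0.61 -- the source fades faster than the message, so belief rises
# week 4: 0.72 -- peak credibility, weeks after the claim was first heard
# week 8: 0.66 -- eventually the message itself begins to fade too
```

Under these made-up parameters, a claim that starts out almost wholly discounted gains force for several weeks as its source is forgotten, which is exactly the lag the authors suggest the Swift Boat ads showed in the polls.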
As I noted earlier, propagandists have long known that if you repeat a lie often enough, people begin to believe it. Nazi Germany’s propagandists exploited the “big lie” about as well as anybody has. Today’s political strategists also understand the effect.
“Even if they do not understand the neuroscience behind source amnesia, campaign strategists can exploit it to spread misinformation. They know that if their message is initially memorable, its impression will persist long after it is debunked. In repeating a falsehood, someone may back it up with an opening line like ‘I think I read somewhere’ or even with a reference to a specific source. In one study, a group of Stanford students was exposed repeatedly to an unsubstantiated claim taken from a Web site that Coca-Cola is an effective paint thinner. Students who read the statement five times were nearly one-third more likely than those who read it only twice to attribute it to Consumer Reports (rather than The National Enquirer, their other choice), giving it a gloss of credibility. Adding to this innate tendency to mold information we recall is the way our brains fit facts into established mental frameworks. We tend to remember news that accords with our worldview, and discount statements that contradict it.”
One would have hoped that the Internet would have made us all more open-minded as we gained access to many more sources of information than any previous generation had. Unfortunately, the opposite has been true. When there were just a few sources of information, like the major network news shows, Americans were all hearing basically the same news. Nowadays, as I noted earlier, people seek out sources of news that support their worldview, which means that we are listening to fewer points of view and becoming more closed-minded. At least that cuts down on the cognitive dissonance.
“In another Stanford study, 48 students, half of whom said they favored capital punishment and half of whom said they opposed it, were presented with two pieces of evidence, one supporting and one contradicting the claim that capital punishment deters crime. Both groups were more convinced by the evidence that supported their initial position. Psychologists have suggested that legends propagate by striking an emotional chord. In the same way, ideas can spread by emotional selection, rather than by their factual merits, encouraging the persistence of falsehoods about Coke — or about a presidential candidate.”
The natural tendency, of course, is to try to counter false ads as soon as they are published or broadcast. Candidates maintain political SWAT teams that react immediately to provide information to the media when the other side lets loose a broadside of half-truths. They are probably wasting their time.
“Journalists and campaign workers may think they are acting to counter misinformation by pointing out that it is not true. But by repeating a false rumor, they may inadvertently make it stronger. In its concerted effort to ‘stop the smears,’ the Obama campaign may want to keep this in mind. Rather than emphasize that Mr. Obama is not a Muslim, for instance, it may be more effective to stress that he embraced Christianity as a young man. Consumers of news, for their part, are prone to selectively accept and remember statements that reinforce beliefs they already hold. In a replication of the study of students’ impressions of evidence about the death penalty, researchers found that even when subjects were given a specific instruction to be objective, they were still inclined to reject evidence that disagreed with their beliefs. In the same study, however, when subjects were asked to imagine their reaction if the evidence had pointed to the opposite conclusion, they were more open-minded to information that contradicted their beliefs. Apparently, it pays for consumers of controversial news to take a moment and consider that the opposite interpretation may be true.”
That last point provides a glimmer of hope that we can be open-minded. I think it also indicates why a growing number of Americans identify themselves as independents rather than supporters of either major political party. The parties have made politics in America so divisive that many people are simply turning away from politics altogether. In a democracy, that is a bad thing. Wang and Aamodt conclude:
“In 1919, Justice Oliver Wendell Holmes of the Supreme Court wrote that ‘the best test of truth is the power of the thought to get itself accepted in the competition of the market.’ Holmes erroneously assumed that ideas are more likely to spread if they are honest. Our brains do not naturally obey this admirable dictum, but by better understanding the mechanisms of memory perhaps we can move closer to Holmes’s ideal.”
Most of us like to believe that we can recognize and accept the truth when we see it. The evidence indicates, however, that we may be fooling ourselves. We all suffer from “source amnesia” and it would serve us well to take a moment and consider that what we accept as the truth may not be. A little open-mindedness could go a long way to bringing civility back into politics.