Tuesday, January 4, 2011

The Truth, If You Can Handle It: Why People Continue To Believe Lies in the Face of Contrary Evidence

There's an old adage in politics that if you repeat a lie often enough, people will believe it. History's most effective propagandists - Hitler, for instance - understood this principle well. As a PR professional and student of mass communications, I am forever fascinated by the pervasiveness of lies in public discourse and the percentage of people who believe them.

I have often tried to counsel my trial lawyer clients that the truth will not set them free - that a majority of Americans believe our civil justice system is awash in frivolous litigation and no amount of evidence is going to convince them otherwise. Because they are trained to rely on evidence to prove their cases, it is a difficult concept for most attorneys to accept.

For example, in Pennsylvania, which was at the epicenter of a nationwide debate several years ago over limiting lawsuits for medical malpractice, the number of malpractice lawsuits has actually dropped by 40 percent over the past eight years and the compensation being paid to victims is down 50 percent. Meanwhile, physicians' insurance premiums have skyrocketed. If we can just get that information in front of people, trial attorneys believe, we can win the public debate over tort reform.

In a courtroom, maybe. But in the court of public opinion, whichever side in a particular debate frames the issue and gets its message out most effectively wins, regardless of the truth. Once a person adopts a particular position based on "what they've heard," it is very difficult, if not impossible, to change their mind. Why? Because people basically hear what they want to believe. The viewpoint that hews most closely to their own politics or worldview is the position they are most likely to adopt.

An op-ed from the June 27, 2008, New York Times by two college professors, Sam Wang and Sandra Aamodt, sheds some light on the question. Here are some excerpts:
"The brain does not simply gather and stockpile information as a computer’s hard drive does. Facts are stored first in the hippocampus, a structure deep in the brain about the size and shape of a fat man’s curled pinkie finger. But the information does not rest there. Every time we recall it, our brain writes it down again, and during this re-storage, it is also reprocessed. In time, the fact is gradually transferred to the cerebral cortex and is separated from the context in which it was originally learned. For example, you know that the capital of California is Sacramento, but you probably don’t remember how you learned it.

This phenomenon, known as source amnesia, can also lead people to forget whether a statement is true. Even when a lie is presented with a disclaimer, people often later remember it as true.

With time, this misremembering only gets worse. A false statement from a noncredible source that is at first not believed can gain credibility during the months it takes to reprocess memories from short-term hippocampal storage to longer-term cortical storage. As the source is forgotten, the message and its implications gain strength ...

Even if they do not understand the neuroscience behind source amnesia, campaign strategists can exploit it to spread misinformation. They know that if their message is initially memorable, its impression will persist long after it is debunked. In repeating a falsehood, someone may back it up with an opening line like “I think I read somewhere” or even with a reference to a specific source.

In one study, a group of Stanford students was exposed repeatedly to an unsubstantiated claim taken from a Web site that Coca-Cola is an effective paint thinner. Students who read the statement five times were nearly one-third more likely than those who read it only twice to attribute it to Consumer Reports (rather than The National Enquirer, their other choice), giving it a gloss of credibility.

Adding to this innate tendency to mold information we recall is the way our brains fit facts into established mental frameworks. We tend to remember news that accords with our worldview, and discount statements that contradict it.

In another Stanford study, 48 students, half of whom said they favored capital punishment and half of whom said they opposed it, were presented with two pieces of evidence, one supporting and one contradicting the claim that capital punishment deters crime. Both groups were more convinced by the evidence that supported their initial position ... 

Journalists and campaign workers may think they are acting to counter misinformation by pointing out that it is not true. But by repeating a false rumor, they may inadvertently make it stronger."

Obviously, why people believe misinformation defies easy explanation. Some theorists believe people are most effectively motivated by fear - that the reptilian part of the brain (the part that contains the survival instinct) is the part you have to activate if you want to persuade people. There is certainly ample evidence to support such a theory.

Whatever the reasons, the truth is you can get at least some people - even a majority - to believe almost anything.
