I embrace reason and a desire to understand the universe. While we certainly don't have all of the answers, that doesn't mean we stop asking questions. There is always something interesting to learn about.
I love asking questions. One that has puzzled me over the last year or so is why people cling to false beliefs so strongly. No matter how much I reason with a True Believer™ or try to educate them with facts, they just dig in their heels, sit back on their haunches and bray their myths and lies over and over. They block their ears, squeeze their eyes shut and cry out, "I can't hear you, I won't listen… lalalalalalalalala…"
Yesterday I found a new study on ScienceDaily titled "There Must Be a Reason: Osama, Saddam and Inferred Justification", published in the most recent issue of Sociological Inquiry. Researchers from four institutions set out to understand why people kept believing that Saddam was involved in the 9/11 terrorist attacks even after it became obvious that there was no evidence supporting that claim.
But that's not why I was so fascinated. What caught my attention was the underlying research about false beliefs.
Some people form and cling to false beliefs despite overwhelming evidence thanks to a mental phenomenon called motivated reasoning, says sociologist Steven Hoffman, visiting assistant professor at the University at Buffalo and co-author of the study. "Rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe. For the most part," says Hoffman, "people completely ignore contrary information" and are able to "develop elaborate rationalizations based on faulty information." (Newsweek)
Now, let's think about the belief in a god that created the world in 6 days about 6,000 to 10,000 years ago, shall we? And of course, Jesus who died on the cross to save us all from hellfire and damnation. So let's talk a bit about motivated reasoning:
On page 2 of the paper it states: This model [motivated reasoning] envisions respondents as processing and responding to information defensively, accepting and seeking out confirming information, while ignoring, discrediting the source of, or arguing against the substance of contrary information (DiMaggio 1997; Kunda 1990; Lodge and Taber 2000).
Motivated reasoning is a descendant of the social psychological theory of cognitive dissonance (Festinger and Carlsmith 1959; Kunda 1990), which posits an unconscious impulse to relieve cognitive tension when a respondent is presented with information that contradicts preexisting beliefs or preferences. Recent literature on motivated reasoning builds on cognitive dissonance theory to explain how citizens relieve cognitive dissonance: they avoid inconsistency, ignore challenging information altogether, discredit the information source, or argue substantively against the challenge (Jobe, Tourangeau, and Smith 1993; Lodge and Taber 2000; Westen et al. 2006).
What happens when one set of ideas (about god the father and his peace-loving hippie son Jesus) is repeated over and over to you when you are a child? You haven't even formed your ego before age 6, and already you know about Jesus dying on the cross for you, or whatever myths your parents and religious community have brainwashed you with. When you get older and information starts to conflict with these beliefs, how do you handle it?
Cognitive Dissonance: an uncomfortable feeling caused by holding two contradictory ideas simultaneously. The "ideas" or "cognitions" in question may include attitudes and beliefs, the awareness of one's behavior, and facts. The theory of cognitive dissonance proposes that people have a motivational drive to reduce dissonance by changing their attitudes, beliefs, and behaviors, or by justifying or rationalizing their attitudes, beliefs, and behaviors.
Of course the easiest way to deal with cognitive dissonance is to justify and rationalize your beliefs, and deny anything that might contradict them. That brings us to:
Confirmation Bias: an irrational tendency to search for, interpret or remember information in a way that confirms one's preconceptions or working hypotheses. It is a type of cognitive bias and a systematic error of inductive reasoning.
Back to the paper (page 3): This confirmation bias means that people value evidence that confirms their previously held beliefs more highly than evidence that contradicts them, regardless of the source (DiMaggio 1997; Nickerson 1998; Wason 1968).

So how do people deal with information that contradicts their beliefs? There are several options (the percentage indicates how many respondents in the study used each strategy):
- Bayesian Updating: the willingness to change your mind in the face of contradictory information from a trusted source. To successfully change your opinion you need to attend to new information, remember it, and then use it in decision making. (page 7 of the study) (2.04% of respondents actually changed their mind when given contradictory information)
- Deny you ever believed the contradictory information in the first place. (page 9) (14.29% in the study simply denied that they had ever held the false belief)
- Resist the new information: (page 11)
- Counter-argue: directly rebut the contradictory information. (12.24%)
- Attitude Bolstering: bring facts that support your position to mind without directly refuting the contradictory information. Change the subject and justify your opinion with other reasons that seem good to you. (32.65%)
- Selective Exposure: Ignore the contradictory information without rebutting or supporting other positions. Just say you don't know. (6.12%)
- Dispute Rationality: argue that opinions do not have to be grounded in facts or reasoning. (16.33%)
- Inferred Justification: infer evidence that would support your beliefs. This is a backward chain of reasoning: you start with something you believe strongly and work backward to find support for it. (14.29%) (with another 2.04% possibly also falling in this category)
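Of the strategies above, only the first actually follows the evidence. The arithmetic behind Bayesian updating can be sketched in a few lines of code; this is a hypothetical toy example of my own (the numbers are illustrative, not from the study), showing how even a strong prior belief should shift when the evidence points the other way.

```python
def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return the posterior P(belief is true | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# A believer starts out 90% confident, then meets evidence that is
# nine times more likely if the belief is false than if it is true.
posterior = bayes_update(prior=0.9,
                         p_evidence_if_true=0.1,
                         p_evidence_if_false=0.9)
print(round(posterior, 2))  # 0.5 - the strong prior is cut down to a coin flip
```

A rational updater revises downward and, with more such evidence, keeps revising; the other strategies in the list are all ways of refusing to run this calculation at all.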
Oh, and regarding attentiveness to the contradictory information by the study's respondents, the researchers said,
"We found that while all of our 'above average' respondents did attend to the information, a substantial minority of our less well-informed respondents—nearly one-third—ignored or refused to engage with information that challenged their political preferences, even when that information came from a source they favor." (page 16)

How many times have I had a conversation with a religious person only to have them pointedly ignore the information I have? Lots. Or had them quote from their own bible, picking out only the parts that support their set beliefs and blatantly ignoring the parts they don't understand or don't like? Cherry-picking, anyone?
So this is all food for thought. It at least starts us thinking about why most people cling to their gods and myths like security blankets. It certainly gives us an idea of how they resist the facts, logic and new information they come across in favor of their old, comfortable beliefs.
Note: I need to clarify something. Belief is a powerful force in our minds, and we are all susceptible to it. Quite some time ago I wrote about how knowledge and beliefs are stored differently in the brain. I am sure I have justified my beliefs many times in my life. You probably have as well. I think the difference is that when some people are confronted with conflicting information, they attend to it, maybe do some research on it, process it, then use it to change their minds as necessary. I've done that many times. The more I practice re-analyzing old beliefs and discarding ones that have no basis in reality, or that don't fit my ethics and understanding of the universe, the easier it gets. I think that's what makes some people skeptics and atheists. It can be quite challenging.
I think there is a difference between a skeptic, who is willing to look at different information and change their mind, and someone who is dogmatic. It is the latter I'm addressing here.
What are your thoughts?