cognitive dissonance

"There's no success like failure...." -- Bob Dylan, Love Minus Zero

Cognitive dissonance is a theory of human motivation that asserts that it is psychologically uncomfortable to hold contradictory cognitions. The theory is that dissonance, being unpleasant, motivates a person to change his cognition, attitude, or behavior. This theory was first explored in detail by social psychologist Leon Festinger, who described it this way:

Dissonance and consonance are relations among cognitions, that is, among opinions, beliefs, knowledge of the environment, and knowledge of one's own actions and feelings. Two opinions, or beliefs, or items of knowledge are dissonant with each other if they do not fit together; that is, if they are inconsistent, or if, considering only the particular two items, one does not follow from the other (Festinger 1956: 25).

He argued that there are three ways to deal with cognitive dissonance. He did not consider these mutually exclusive.

  1. One may try to change one or more of the beliefs, opinions, or behaviors involved in the dissonance;

  2. One may try to acquire new information or beliefs that will increase the existing consonance and thus cause the total dissonance to be reduced; or,

  3. One may try to forget or reduce the importance of those cognitions that are in a dissonant relationship (Festinger 1956: 25-26).

For example, people who smoke know smoking is a bad habit. Some rationalize their behavior by looking on the bright side: They tell themselves that smoking helps keep the weight down and that there is a greater threat to health from being overweight than from smoking. Others quit smoking. Most of us are clever enough to come up with ad hoc hypotheses or rationalizations to save cherished notions. Noting that we rationalize because we are trying to reduce or eliminate cognitive dissonance does not explain why we can't apply this cleverness more competently. Different people deal with psychological discomfort in different ways. Some ways are clearly more reasonable than others. So, why do some people react to dissonance with cognitive competence, while others respond with cognitive incompetence?

Cognitive dissonance has been called "the mind controller's best friend" (Levine 2003: 202). Yet, a cursory examination of cognitive dissonance reveals that it is not the dissonance, but how people deal with it, that would be of interest to someone trying to control others when the evidence seems against them.

For example, Marian Keech was the leader of a UFO cult in the 1950s. She claimed to get messages from extraterrestrials, known as The Guardians, through automatic writing. Like the Heaven's Gate folks forty years later, Keech and her followers, known as The Seekers or The Brotherhood of the Seven Rays, were waiting to be picked up by flying saucers. In Keech's prophecy, her group of eleven was to be saved just before the earth was to be destroyed by a massive flood on December 21, 1954. When it became evident that there would be no flood and the Guardians weren't stopping by to pick them up, Keech

became elated. She said she'd just received a telepathic message from the Guardians saying that her group of believers had spread so much light with their unflagging faith that God had spared the world from the cataclysm (Levine 2003: 206).

More important, the Seekers didn't abandon her. Most became more devoted after the failed prophecy. (Only two left the cult when the world didn't end.) "Most disciples not only stayed but, having made that decision, were now even more convinced than before that Keech had been right all along....Being wrong turned them into true believers" (ibid.). Some people will go to bizarre lengths to avoid inconsistency between their cherished beliefs and the facts. But why do people interpret the same evidence in contrary ways?

The Seekers would not have waited for the flying saucer if they thought it might not come. So, when it didn't come, one would think that a competent thinker would have seen this as falsifying Keech's claim that it would come. However, the incompetent thinkers were rendered incompetent by their devotion to Keech. Their belief that a flying saucer would pick them up was based on faith, not evidence. Likewise, their belief that the failure of the prophecy shouldn't count against their belief was another act of faith. With this kind of irrational thinking, it may seem pointless to produce evidence to try to persuade people of the error of their ways. Their belief is not based on evidence, but on devotion to a person. That devotion can be so great that even the most despicable behavior by one's prophet can be rationalized. There are many examples of people so devoted to another that they will rationalize or ignore extreme mental and physical abuse by their cult leader (or spouse or boyfriend). If the basis for a person's belief is irrational faith grounded in devotion to a powerful personality, then the only option that person has when confronted with evidence that should undermine her faith would seem to be to continue to be irrational, unless her faith was not that strong to begin with. The interesting question, then, is not about cognitive dissonance but about faith. What was it about Keech that led some people to have faith in her, and what was it about those people that made them vulnerable to Keech? And what was different about the two who left the cult?

"Research shows that three characteristics are related to persuasiveness: perceived authority, honesty, and likeability" (ibid. 31). Furthermore, if a person is physically attractive, we tend to like that person and the more we like a person the more we tend to trust him or her (ibid. 57). Research also show that "people are perceived as more credible when they make eye contact and speak with confidence, no matter what they have to say" (ibid. 33).

According to Robert Levine, "studies have uncovered surprisingly little commonality in the type of personality that joins cults: there's no single cult-prone personality type" (ibid. 144). This fact surprised Levine. When he began his investigation of cults, he "shared the common stereotype that most joiners were psychological misfits or religious fanatics" (ibid. 81). What he found instead was that many cult members are attracted to what appears to be a loving community. "One of the ironies of cults is that the craziest groups are often composed of the most caring people" (ibid. 83). Levine says of cult leader Jim Jones that he was "a supersalesman who exerted most every rule of persuasion" (ibid. 213). He had authority, perceived honesty, and likeability. It is likely the same could be said of Marian Keech. It also seems likely that many cult followers have found a surrogate family and a surrogate mother or father or both in the cult leader.

It should also be remembered that in most cases people have not arrived at their irrational beliefs overnight. They have come to them over a period of time with gradually escalated commitments (ibid. chapter 7). Nobody would join a cult if the pitch were: "Follow me. Drink this poisoned-but-flavored water and commit suicide." Yet, not everybody in the cult drank the poison, and two of Keech's followers quit the cult when the prophecy failed. How were they different from the others? The explanation seems simple: their faith in their leader was weak. According to Festinger, the two who left Keech, Kurt Freund and Arthur Bergen, were lightly committed to begin with (Festinger 1956: 208).

Even people who erroneously think their beliefs are scientific may come by their notions gradually and their commitment may escalate to the point of irrationality. Psychologist Ray Hyman provides a very interesting example of cognitive dissonance and how one chiropractor dealt with it.

Some years ago I participated in a test of applied kinesiology at Dr. Wallace Sampson's medical office in Mountain View, California. A team of chiropractors came to demonstrate the procedure. Several physician observers and the chiropractors had agreed that chiropractors would first be free to illustrate applied kinesiology in whatever manner they chose. Afterward, we would try some double-blind tests of their claims.

The chiropractors presented as their major example a demonstration they believed showed that the human body could respond to the difference between glucose (a "bad" sugar) and fructose (a "good" sugar). The differential sensitivity was a truism among "alternative healers," though there was no scientific warrant for it. The chiropractors had volunteers lie on their backs and raise one arm vertically. They then would put a drop of glucose (in a solution of water) on the volunteer's tongue. The chiropractor then tried to push the volunteer's upraised arm down to a horizontal position while the volunteer tried to resist. In almost every case, the volunteer could not resist. The chiropractors stated the volunteer's body recognized glucose as a "bad" sugar. After the volunteer's mouth was rinsed out and a drop of fructose was placed on the tongue, the volunteer, in just about every test, resisted movement to the horizontal position. The body had recognized fructose as a "good" sugar.

After lunch a nurse brought us a large number of test tubes, each one coded with a secret number so that we could not tell from the tubes which contained fructose and which contained glucose. The nurse then left the room so that no one in the room during the subsequent testing would consciously know which tubes contained glucose and which fructose. The arm tests were repeated, but this time they were double-blind: neither the volunteer, the chiropractors, nor the onlookers were aware of whether the solution being applied to the volunteer's tongue was glucose or fructose. As in the morning session, sometimes the volunteers were able to resist and other times they were not. We recorded the code number of the solution on each trial. Then the nurse returned with the key to the code. When we determined which trials involved glucose and which involved fructose, there was no connection between ability to resist and whether the volunteer was given the "good" or the "bad" sugar.

When these results were announced, the head chiropractor turned to me and said, "You see, that is why we never do double-blind testing anymore. It never works!" At first I thought he was joking. It turned out he was quite serious. Since he "knew" that applied kinesiology works, and the best scientific method shows that it does not work, then, in his mind, there must be something wrong with the scientific method. (Hyman 1999)
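
As a purely illustrative aside (not part of Hyman's account), the following minimal Python sketch shows the kind of tally such a double-blind session produces: each trial is recorded as a sugar type plus whether the volunteer resisted, and the resistance rates for the two sugars are then compared. The trial records below are invented for illustration; they are not Hyman's data.

    # Hypothetical (sugar, resisted) records standing in for double-blind trials;
    # invented for illustration only, not data from Hyman's test.
    trials = [
        ("glucose", True), ("glucose", False), ("glucose", True), ("glucose", False),
        ("fructose", False), ("fructose", True), ("fructose", False), ("fructose", True),
    ]

    counts = {}
    for sugar, resisted in trials:
        c = counts.setdefault(sugar, {"resisted": 0, "total": 0})
        c["total"] += 1
        c["resisted"] += int(resisted)

    for sugar, c in counts.items():
        rate = c["resisted"] / c["total"]
        print(f"{sugar}: resisted in {c['resisted']} of {c['total']} trials ({rate:.0%})")

    # If the resistance rates for glucose and fructose come out about the same,
    # there is no connection between sugar type and the ability to resist,
    # which is what the double-blind test found.

If the arm response really tracked the sugar, the glucose rate would fall well below the fructose rate; roughly equal rates are what "no connection" means here.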

What distinguishes the chiropractor's rationalization from the cult member's is that the latter is based on pure faith and devotion to a guru or prophet, whereas the former is based on evidence from experience. Neither belief can be falsified because the believers won't let them be falsified: Nothing can count against them. Those who base their beliefs on experience and what they take to be empirical or scientific evidence (e.g., astrologers, palm readers, mediums, psychics, the intelligent design folks, and the chiropractor) make a pretense of being willing to test their beliefs. They bother to submit their ideas to testing only to get proof for others. That is why we refer to their beliefs as pseudosciences. We do not refer to the beliefs of cult members as pseudoscientific, but as faith-based irrationality.


See also hidden persuaders, Occam's razor, and pathological science.

further reading

Festinger, Leon. A Theory of Cognitive Dissonance (Stanford University Press 1957).

Festinger, Leon. When Prophecy Fails: A Social and Psychological Study (HarperCollins 1964). (Originally published in 1956 by the University of Minnesota Press.)

Harmon-Jones, Eddie and Judson Mills, editors. Cognitive Dissonance: Progress on a Pivotal Theory in Social Psychology (American Psychological Association 1999).

Hyman, Ray. "The Mischief-Making of Ideomotor Action," in the Scientific Review of Alternative Medicine 3(2):34-43, 1999. Originally published as "How People Are Fooled by Ideomotor Action."

Levine, Robert. The Power of Persuasion: How We're Bought and Sold (John Wiley & Sons 2003).

©copyright 2005
Robert Todd Carroll


Last updated 06/29/05


 