Robert Todd Carroll
Cognitive dissonance is a theory of human motivation that asserts that it is psychologically uncomfortable to hold contradictory cognitions. The theory is that dissonance, being unpleasant, motivates a person to change his cognition, attitude, or behavior. The theory was first explored in detail by social psychologist Leon Festinger.
Festinger argued that there are three ways to deal with cognitive dissonance: change one or more of the dissonant beliefs or behaviors, acquire new information or beliefs that reduce the conflict, or reduce the importance of the dissonant cognitions. He did not consider these mutually exclusive.
For example, people who smoke know smoking is a bad habit. Some rationalize their behavior by looking on the bright side: They tell themselves that smoking helps keep the weight down and that there is a greater threat to health from being overweight than from smoking. Others quit smoking. Most of us are clever enough to come up with ad hoc hypotheses or rationalizations to save cherished notions. But noting that we rationalize in order to reduce or eliminate cognitive dissonance does not explain why we can't apply that same cleverness more competently. Different people deal with psychological discomfort in different ways. Some ways are clearly more reasonable than others. So, why do some people react to dissonance with cognitive competence, while others respond with cognitive incompetence?
Cognitive dissonance has been called "the mind controller's best friend" (Levine 2003: 202). Yet, a cursory examination of cognitive dissonance reveals that it is not the dissonance, but how people deal with it, that would be of interest to someone trying to control others when the evidence seems against them.
For example, Marian Keech was the leader of a UFO cult in the 1950s. She claimed to get messages from extraterrestrials, known as The Guardians, through automatic writing. Like the Heaven's Gate folks forty years later, Keech and her followers, known as The Seekers or The Brotherhood of the Seven Rays, were waiting to be picked up by flying saucers. In Keech's prophecy, her group of eleven was to be saved just before the earth was to be destroyed by a massive flood on December 21, 1954. When it became evident that there would be no flood and the Guardians weren't stopping by to pick them up, Keech announced that she had received a new message: the little group had spread so much light that God had called off the destruction of the world.
More important, the Seekers didn't abandon her. Most became more devoted after the failed prophecy. (Only two left the cult when the world didn't end.) "Most disciples not only stayed but, having made that decision, were now even more convinced than before that Keech had been right all along.... Being wrong turned them into true believers" (ibid.). Some people will go to bizarre lengths to avoid inconsistency between their cherished beliefs and the facts. But why do people interpret the same evidence in contrary ways?
The Seekers would not have waited for the flying saucer if they thought it might not come. So, when it didn't come, one would think that a competent thinker would have seen this as falsifying Keech's claim that it would come. However, the incompetent thinkers were rendered incompetent by their devotion to Keech. Their belief that a flying saucer would pick them up was based on faith, not evidence. Likewise, their belief that the failure of the prophecy shouldn't count against their belief was another act of faith. With this kind of irrational thinking, it may seem pointless to produce evidence to try to persuade people of the error of their ways. Their belief is not based on evidence, but on devotion to a person. That devotion can be so great that even the most despicable behavior by one's prophet can be rationalized. There are many examples of people so devoted to another that they will rationalize or ignore extreme mental and physical abuse by their cult leader (or spouse or boyfriend). If the basis for a person's belief is irrational faith grounded in devotion to a powerful personality, then the only option that person has when confronted with evidence that should undermine her faith would seem to be to continue to be irrational, unless her faith was not that strong to begin with. The interesting question, then, is not about cognitive dissonance but about faith. What was it about Keech that led some people to have faith in her, and what was it about those people that made them vulnerable to Keech? And what was different about the two who left the cult?
"Research shows that three characteristics are related to persuasiveness: perceived authority, honesty, and likeability" (ibid. 31). Furthermore, if a person is physically attractive, we tend to like that person, and the more we like a person the more we tend to trust him or her (ibid. 57). Research also shows that "people are perceived as more credible when they make eye contact and speak with confidence, no matter what they have to say" (ibid. 33).
According to Robert Levine, "studies have uncovered surprisingly little commonality in the type of personality that joins cults: there's no single cult-prone personality type" (ibid. 144). This fact surprised Levine. When he began his investigation of cults he "shared the common stereotype that most joiners were psychological misfits or religious fanatics" (ibid. 81). What he found instead was that many cult members are attracted to what appears to be a loving community. "One of the ironies of cults is that the craziest groups are often composed of the most caring people" (ibid. 83). Levine says of cult leader Jim Jones that he was "a supersalesman who exerted most every rule of persuasion" (ibid. 213). He had authority, perceived honesty, and likeability. It is likely the same could be said of Marian Keech. It also seems likely that many cult followers have found a surrogate family, and a surrogate mother or father or both in the cult leader.
It should also be remembered that in most cases people have not arrived at their irrational beliefs overnight. They have come to them over a period of time with gradually escalated commitments (ibid. chapter 7). Nobody would join a cult if the pitch were: "Follow me. Drink this poisoned-but-flavored water and commit suicide." Yet, not everybody in the cult drank the poison and two of Keech's followers quit the cult when the prophecy failed. How were they different from the others? The explanation seems simple: their faith in their leader was weak. According to Festinger, the two who left Keech--Kurt Freund and Arthur Bergen--were lightly committed to begin with (Festinger 1956: 208).
Even people who erroneously think their beliefs are scientific may come by their notions gradually, and their commitment may escalate to the point of irrationality. Psychologist Ray Hyman provides a very interesting example of cognitive dissonance and how one chiropractor dealt with it. A group of chiropractors had convinced themselves, through muscle testing, that the body could distinguish "good" sugar (fructose) from "bad" sugar (glucose). When the test was repeated under double-blind conditions, the results were no better than chance. The chief chiropractor's conclusion was not that the claim had been refuted, but that double-blind testing doesn't work.
What distinguishes the chiropractor's rationalization from the cult member's is that the latter is based on pure faith and devotion to a guru or prophet, whereas the former is based on evidence from experience. Neither belief can be falsified because the believers won't let them be falsified: Nothing can count against them. Those who base their beliefs on experience and what they take to be empirical or scientific evidence (e.g., astrologers, palm readers, mediums, psychics, the intelligent design folks, and the chiropractor) make a pretense of being willing to test their beliefs. They only bother to submit to a test of their ideas to get proof for others. That is why we refer to their beliefs as pseudosciences. We do not refer to the beliefs of cult members as pseudoscientific, but as faith-based irrationality.
Festinger, Leon, Henry W. Riecken, and Stanley Schachter. When Prophecy Fails: A Social and Psychological Study (HarperCollins, 1964; originally published in 1956 by the University of Minnesota Press).
Hyman, Ray. "The Mischief-Making of Ideomotor Action," Scientific Review of Alternative Medicine 3(2): 34-43, 1999. Originally published as "How People Are Fooled by Ideomotor Action."
Last updated 06/29/05