It was shocking news: 39 people were found dead at a luxury estate in Rancho Santa Fe, California, participants in a mass suicide. All were members of an obscure cult called Heaven’s Gate. Each body was laid out neatly, feet clad in brand-new black Nikes, face covered with a purple shroud.
The cult members died willingly and peacefully, leaving behind videotapes describing their reasons for suicide: They believed that the Hale-Bopp Comet, a recently discovered comet streaking across the night skies, was their ticket to a new life in paradise. They were convinced that in Hale-Bopp’s wake was a gigantic spaceship whose mission was to carry them off to a new incarnation.
To be picked up by the spaceship, they first needed to rid themselves of their current “containers.” That is, they needed to leave their own bodies by ending their lives. Alas, no spaceship ever came.
Several weeks before the mass suicide, some members of the cult purchased an expensive, high-powered telescope. They wanted to get a clearer view of the comet and the spaceship that they believed was traveling behind it. A few days later, they returned the telescope and politely asked for their money back. When the store manager asked them if they had problems with the scope, they replied, “Well, gosh, we found the comet, but we can’t find anything following it” (Ferris, 1997). Although the store manager tried to convince them that there was nothing wrong with the telescope and that nothing was following the comet, they remained unconvinced. Given their premise, their logic was impeccable: We know an alien spaceship is following behind the Hale-Bopp Comet. If an expensive telescope has failed to reveal that spaceship, then there is something wrong with the telescope.
Their thinking might strike you as strange, irrational, or stupid, but, generally speaking, the members of the Heaven’s Gate cult were none of those things. Neighbors who knew them considered them pleasant, smart, and reasonable. What is the process by which intelligent, sane people can succumb to such fantastic thinking and self-destructive behavior? We will attempt to explain their actions at the end of this chapter. For now, we will simply state that their behavior is not unfathomable. It is simply an extreme example of a normal human tendency: the need to justify our actions and commitments.
The Theory of Cognitive Dissonance
During the past half-century, social psychologists have discovered that one of the most powerful determinants of human behavior stems from our need to preserve a stable, positive self-image (Aronson, 1969, 1998). Most people believe they are above average—more ethical and competent, better drivers, better leaders, better judges of character, and more attractive than the majority (Fine, 2008; Gilovich, 1991). But if most of us see ourselves as reasonable, moral, and smart, what happens when we are confronted with information implying that we have behaved in ways that are unreasonable, immoral, or stupid? That is the subject of this chapter.
Maintaining a Positive Self-Image
The feeling of discomfort caused by performing an action that is discrepant from one’s self-concept is called cognitive dissonance. Leon Festinger (1957) was the first to investigate the precise workings of this phenomenon and elaborated his findings in what is arguably social psychology’s most important and most provocative theory.
Cognitive dissonance always produces discomfort, and in response we try to reduce it. The process is similar to the effects of hunger and thirst: Discomfort motivates us to eat or drink. But unlike satisfying hunger or thirst by eating or drinking, the path to reducing dissonance is not always simple or obvious. In fact, it can lead to fascinating changes in the way we think about the world and the way we behave. How can we reduce dissonance? There are three basic ways (see Figure 1):
• By changing our behavior to bring it in line with the dissonant cognition.
• By attempting to justify our behavior through changing one of the dissonant cognitions.
• By attempting to justify our behavior by adding new cognitions.
To illustrate each of these, let’s look at something that millions of people do several times a day: smoke cigarettes. If you are a smoker, you are likely to experience dissonance because you know that this behavior significantly increases the risks of lung cancer, emphysema, and earlier death. How can you reduce this dissonance? The most direct way is to change your behavior and give up smoking. Your behavior would then be consistent with your knowledge of the link between smoking and cancer. Although many people have succeeded in quitting, it’s not easy; many have tried and failed. What do these people do? It would be wrong to assume that they simply swallow hard, light up, and prepare to die. They don’t. Researchers studied the behavior and attitudes of heavy smokers who attended a smoking cessation clinic but then relapsed into heavy smoking again. What do you suppose the researchers discovered? Heavy smokers who tried to quit and failed managed to lower their perception of the dangers of smoking.
In this way, they could continue to smoke without feeling terrible about it (Gibbons, Eggleston, & Benthin, 1997). A study of more than 360 adolescent smokers found the same thing: the greater their dependence on smoking and the greater the trouble they had quitting, the more justifications they came up with to keep smoking (Kleinjan, van den Eijnden, & Engels, 2009).
Smokers can come up with some pretty creative justifications. Some convince themselves that the data linking cigarette smoking to cancer are inconclusive. Others say that smoking is worth the risk of cancer and emphysema because it is so enjoyable, and besides, it relaxes them and reduces nervous tension, and in this way actually improves their health. Some add a cognition that allows them to focus on the vivid exception: “Look at my grandfather. He’s 87 years old, and he’s been smoking a pack a day since he was 12. That proves it’s not always bad for you.” Another popular way of reducing dissonance through adding a new cognition is self-affirmation, in which a person focuses on one or more of his or her good qualities to lessen the dissonant sting caused by doing something foolish: “Yeah, I feel pretty stupid to still be smoking, but boy am I a good cook. In fact, let me tell you about this new recipe . . .” (Steele, 1988; McConnell & Brown, 2010).
These justifications may sound silly to the nonsmoker, but that is our point. As the smokers’ rationales show, people experiencing dissonance will often deny or distort reality to reduce it. People who don’t want to give up scientifically discredited ideas, refuse to practice safe sex, or receive bad news about their health can be equally “creative” in denying evidence and reducing their discomfort (Aronson, 1997; Croyle & Jemmott, 1990; Kassarjian & Cohen, 1965; Leishman, 1988).
When you understand dissonance, you will see it in action all around you: in the politician who opposes prostitution but is caught with a high-priced call girl (“Oh, a call girl isn’t really a prostitute”), in the people who predict the end of the world but who, fortunately, turn out to be wrong (“Our prediction was accurate; we just used numbers from the wrong chapter of the Bible”). In one study, researchers wondered how gay men who were strongly identified with their Christian church dealt with anti-gay pronouncements from their ministers. One way to resolve dissonance would be to change their behavior—that is, to change their church or even leave their religion. But those who decide to stay in the church resolve dissonance by focusing on the shortcomings of the minister; for example, they say, “It’s not my religion that promotes this prejudice—it’s the bigotry of this particular preacher” (Pitt, 2010).
Why We Overestimate the Pain of Disappointment
Imagine that you have just interviewed for the job of your dreams. You expect to be very disappointed if you don’t get the job. Then, to your utter amazement, you don’t get the job. How long do you think your disappointment will last? The answer is: It depends on how successfully you reduce the dissonance caused by not getting the job. When you first get the bad news, you will be disappointed; however, more than likely you will soon put a spin on it that makes you feel better. It was a dead-end job anyway. And that interviewer was a jerk.
Interestingly, people often do not anticipate how successfully they will reduce dissonance. When people think about how they will react to future negative events, they show an impact bias, whereby they overestimate the intensity and duration of their negative emotional reactions. For example, people overestimate how dreadful they will feel following a romantic breakup, loss of a job, or not getting into the dorm they wanted (Dunn, Wilson, & Gilbert, 2003; Gilbert et al., 1998; Mellers & McGraw, 2001; Wilson & Gilbert, 2005).
Given that people have successfully reduced dissonance in the past, why is it that they are not aware that they will do so in the future? The answer is that the process of reducing dissonance is largely unconscious. Indeed, dissonance reduction works better that way (Gilbert et al., 1998). It is not very effective to hear ourselves say, “I’ll try to make myself feel better by convincing myself that the person who just rejected me is an idiot.” It is more effective if we unconsciously transform our view of the interviewer; we feel better believing that anyone could see that he is an idiot (Bem & McConnell, 1970; Goethals & Reckman, 1973). Because the dissonance-reduction process is mostly unconscious, we do not anticipate that it will save us from future anguish.
Self-Esteem and Dissonance
Who do you think feels the greatest dissonance after doing something cruel, foolish, or incompetent: a person with high self-esteem or low? The answer is the former; people with the highest self-esteem experience the most dissonance when they behave in ways that are contrary to their high opinion of themselves, and they will work harder to reduce it than will those with average levels of self-esteem. When people who have low self-esteem commit a stupid or immoral action, they do not feel as much dissonance, because the cognition “I have done an awful thing” is consonant with the cognition “I am a schlunk; I’m always doing awful things.”
In a classic experiment, researchers predicted that individuals who had been given a boost to their self-esteem would be less likely to cheat, if given the opportunity to do so, than individuals who had a lower opinion of themselves (Aronson & Mettee, 1968). After all, if you think of yourself as a decent person, cheating would be dissonant with that self-concept. However, people who have had a temporary blow to their self-esteem, and thus are feeling low and worthless, might be more likely to cheat at cards, kick their dog, or do any number of things consistent with having a low opinion of themselves.
In this experiment, the self-esteem of college students was temporarily modified by giving the subjects false information about their personalities. After taking a personality test, one-third of the students were given positive feedback; they were told that the test indicated that they were mature, interesting, deep, and so forth. Another third of the students were given negative feedback; they were told that the test revealed that they were relatively immature, uninteresting, shallow, and the like. The remaining one-third of the students were not given any information about the results of the test. Immediately afterward, the students were scheduled to participate in an experiment conducted by a different psychologist who had no apparent relation to the personality inventory. As part of this second experiment, the participants played a game of cards against some of their fellow students. They were allowed to bet money and keep whatever they won. In the course of the game, they were given a few opportunities to cheat and thereby win a sizable sum of cash. The findings confirmed the prediction of dissonance theory: The students who had gotten the positive feedback were least likely to take the opportunity to cheat; the students who had gotten the negative feedback were most likely to cheat; and the control group fell in between.
If high self-esteem can serve as a buffer against dishonest or self-defeating behavior because people strive to keep their self-concepts consonant with their actions, this research has wide-ranging applications. For example, many African American children believe that they “don’t have what it takes” to succeed academically, so they don’t work hard, so they don’t do as well as they might—all of this perfectly, if tragically, consonant. A team of social psychologists conducted a simple intervention, which they replicated three times with three different classrooms (Cohen et al., 2009). They bolstered African American children’s self-esteem by having them do structured, self-affirming writing assignments. The children had to focus their attention on their good qualities in areas outside of academics and their most important values (e.g., religion, music, or love for their family). This self-affirmation raised their general self-esteem, which in turn reduced their academic anxiety, resulting in better performance. The lowest-achieving black students benefitted the most, and the benefits persisted in a follow-up study two years later. Thus, changing the students’ negative self-perceptions had long-term benefits both on self-esteem and performance on objective exams.
Do these results sound too good to be true? They are not. Still, we must be cautious in generalizing from them. Bolstering self-esteem can’t be done in an artificial way. To be effective, this kind of intervention must be grounded in reality (Kernis, 2001). If a person were to look in the mirror and say, “Boy, I sure am terrific,” it is unlikely to help much; the person has to focus on his or her actual strengths, positive values, and good qualities and then strive to make them consonant with his or her actions.
Rational Behavior versus Rationalizing Behavior
Most people think of themselves as rational beings, and generally they are right: We are certainly capable of rational thought. But as we’ve seen, the need to maintain our self-esteem leads to thinking that is not always rational; rather, it is rationalizing. People who are in the midst of reducing dissonance are so involved with convincing themselves that they are right that they frequently end up behaving irrationally and maladaptively.
During the late 1950s, when segregation was still widespread, two social psychologists did a simple experiment in a southern town (Jones & Kohler, 1959). They selected people who were deeply committed to a position on the issue of racial segregation: some strongly supported segregation; others opposed it just as strongly. Next, the researchers presented these individuals with a series of arguments on both sides of the issue. Some of the arguments were plausible, and others were rather silly. The question was: Which of the arguments would people remember best?
If the participants were behaving in a purely rational way, we would expect them to remember the plausible arguments best and the implausible arguments least, regardless of how they felt about segregation. But what does dissonance theory predict? A silly argument that supports your own position arouses some dissonance because it raises doubts about the wisdom of that position or the intelligence of people who agree with it. Likewise, a sensible argument on the other side of the issue also arouses some dissonance because it raises the possibility that the other side might be smarter or more accurate than you had thought. Because these arguments arouse dissonance, we try not to think about them. This is exactly what the researchers found. The participants remembered the plausible arguments agreeing with their own position and the implausible arguments agreeing with the opposing position. Subsequent research has yielded similar results on many issues, from whether or not the death penalty deters people from committing murder to the risks of contracting HIV through heterosexual contact (e.g., Biek, Wood, & Chaiken, 1996; Edwards & Smith, 1996; Hart et al., 2009).
In sum, we humans do not always process information in an unbiased way. Sometimes, of course, we pursue new information because we want to be accurate in our views or make the wisest decisions. But once we are committed to our views and beliefs, most of us distort new information in a way that confirms them (Hart et al., 2009; Ross, 2010).
Decisions, Decisions, Decisions
Every time we make a decision, we experience dissonance. How come? Suppose you are about to buy a car, but you are torn between a van and a subcompact. You know that each has advantages and disadvantages: The van would be more convenient. You can sleep in it during long trips, and it has plenty of power, but it gets poor mileage and it’s hard to park. The subcompact is a lot less roomy, and you wonder about its safety; but it is less expensive to buy, it’s a lot zippier to drive, and it has a pretty good repair record. Before you decide, you will probably get as much information as you can. You go online and read what the experts say about each model’s safety, gas consumption, and reliability. You’ll talk with friends who own a van or a subcompact. You’ll probably visit automobile dealers to test-drive the vehicles to see how each one feels. All this predecision behavior is perfectly rational.
Let’s assume you decide to buy the subcompact. We predict that your behavior will change in a specific way: You will begin to think more and more about the number of miles to the gallon as though it were the most important thing in the world. Simultaneously, you will almost certainly downplay the fact that you can’t sleep in your subcompact. Who wants to sleep in their car on a long trip anyway? Similarly, you will barely remember that your new small car can put you at considerable risk of harm in a collision. How does this shift in thinking happen?
Distorting Our Likes and Dislikes
In any decision, whether it is between two cars, two colleges, or two potential lovers, the chosen alternative is seldom entirely positive and the rejected alternative is seldom entirely negative. After the decision, your cognition that you are a smart person is dissonant with all the negative things about the car, college, or lover you chose; that cognition is also dissonant with all the positive aspects of the car, college, or lover you rejected. We call this postdecision dissonance. Cognitive dissonance theory predicts that to help yourself feel better about the decision, you will do some unconscious mental work to try to reduce the dissonance.
What kind of work? In a classic experiment, Jack Brehm (1956) posed as a representative of a consumer testing service and asked women to rate the attractiveness and desirability of several kinds of small appliances. Each woman was told that as a reward for having participated in the survey, she could have one of the appliances as a gift. She was given a choice between two of the products she had rated as being equally attractive. After she made her decision, each woman was asked to rerate all the products. After receiving the appliance of their choice, the women rated its attractiveness somewhat higher than they had the first time. Not only that, but they drastically lowered their rating of the appliance they might have chosen but decided to reject.
In other words, following a decision, to reduce dissonance we change the way we feel about the chosen and unchosen alternatives, cognitively spreading them apart in our own minds in order to make ourselves feel better about the choice we made.
The Permanence of the Decision
The more important the decision, the greater the dissonance. Deciding which car to buy is clearly more important than deciding between a toaster and a coffeemaker; deciding which person to marry is clearly more important than deciding which car to buy. Decisions also vary in terms of how permanent they are—that is, how difficult they are to revoke. It is a lot easier to trade in your new car for another one than it is to get out of an unhappy marriage. The more permanent and less revocable the decision, the stronger is the need to reduce dissonance.
In a simple but clever experiment, social psychologists intercepted people at a racetrack who were on their way to place $2 bets and asked them how certain they were that their horses would win (Knox & Inkster, 1968). The investigators also approached other bettors just as they were leaving the $2 window, after having placed their bets, and asked them the same question. Almost invariably, people who had already placed their bets gave their horses a much better chance of winning than did those who had not yet placed their bets. Because only a few minutes separated one group from another, nothing real had occurred to increase the probability of winning; the only thing that had changed was the finality of the decision—and hence the dissonance it produced.
Moving from the racetrack to the Harvard campus, other investigators tested the irrevocability hypothesis in a photography class (Gilbert & Ebert, 2002). In their study, participants were recruited through an advertisement for students interested in learning photography while taking part in a psychology experiment. Students were informed that they would shoot some photographs and print two of them. They would rate the two photographs and then get to choose one to keep. The other would be kept for administrative reasons. The students were randomly assigned to one of two conditions. In Condition One, students were informed that they had the option of exchanging photographs within a five-day period; in Condition Two, students were told that their choice was final. The researchers found that prior to making the choice between the two photographs, the students liked them equally. The experimenters then contacted the students two, four, and nine days after they had made their choice to find out if those who had a choice to exchange photographs liked the one they chose more or less than did those in the no-choice (irrevocable) condition. And, indeed, the students who had the option of exchanging photographs liked the one they finally ended up with less than did those who made the final choice on the first day.
Interestingly, when students were asked to predict whether keeping their options open would make them more or less happy with their decision, they predicted that keeping their options open would make them happier. They were wrong. Because they underestimated the discomfort of dissonance, they failed to realize that the finality of the decision would make them happier.
Creating the Illusion of Irrevocability
The irrevocability of a decision always increases dissonance and the motivation to reduce it. Because of this, unscrupulous salespeople have developed techniques for creating the illusion that irrevocability exists. One such technique is called lowballing (Cialdini, 2009; Cialdini et al., 1978; Weyant, 1996). Robert Cialdini, a distinguished social psychologist, temporarily joined the sales force of an automobile dealership to observe this technique closely. Here’s how it works: You enter an automobile showroom intent on buying a particular car. Having already priced it at several dealerships, you know you can purchase it for about $18,000. You are approached by a personable middle-aged man who tells you he can sell you one for $17,679.
Excited by the bargain, you agree to write out a check for the down payment so that he can take it to the manager as proof that you are a serious customer. Meanwhile, you imagine yourself driving home in your shiny new bargain. Ten minutes later the salesperson returns, looking forlorn. He tells you that in his zeal to give you a good deal, he miscalculated and the sales manager caught it. The price of the car comes to $18,178. You are disappointed. Moreover, you are pretty sure you can get it a bit cheaper elsewhere. The decision to buy is not irrevocable. And yet in this situation far more people will go ahead with the deal than if the original asking price had been $18,178, even though the reason for buying the car from this particular dealer—the bargain price—no longer exists (Cialdini, 2009; Cialdini et al., 1978).
There are at least three reasons why lowballing works. First, although the customer’s decision to buy is reversible, a commitment of sorts does exist. Signing a check for a down payment creates the illusion of irrevocability, even though, if the car buyer thought about it, he or she would quickly realize that it is a nonbinding contract. In the world of high-pressure sales, however, even a temporary illusion can have real consequences. Second, the feeling of commitment triggers the anticipation of an exciting event: driving out with a new car. To have the anticipated event thwarted (by not going ahead with the deal) would be a big letdown. Third, although the final price is substantially higher than the customer thought it would be, it is probably only slightly higher than the price at another dealership. Under these circumstances, the customer in effect says, “Oh, what the heck. I’m here, I’ve already filled out the forms, I’ve written out the check—why wait?” Thus, by using dissonance reduction and the illusion of irrevocability, high-pressure salespeople increase the probability that you will decide to buy their product at their price.
The Decision to Behave Immorally
Of course, decisions about cars, appliances, racehorses, and even presidential candidates are the easy ones. Often, however, our choices involve moral and ethical issues. When is it OK to lie to a friend, and when is it not? When is an act stealing, and when is it just “what everyone does”? How people reduce dissonance following a difficult moral decision has implications for their self-esteem and for whether they behave more or less ethically in the future.
Take the issue of cheating on an exam. Suppose you are a college sophomore taking the final exam in organic chemistry. Ever since you can remember, you have wanted to be a surgeon, and you think that your admission to medical school will depend heavily on how well you do in this course. A key question involves some material you know fairly well, but because so much is riding on this exam, you feel acute anxiety and draw a blank. You glance at your neighbor’s paper and discover that she is just completing her answer to the crucial question. Your conscience tells you it’s wrong to cheat, and yet, if you don’t cheat, you are certain to get a poor grade. And if you get a poor grade, you are convinced that there goes medical school.
Regardless of whether you decide to cheat or not, the threat to your self-esteem arouses dissonance. If you cheat, your belief or cognition “I am a decent, moral person” is dissonant with your cognition “I have just committed an immoral act.” If you decide to resist temptation, your cognition “I want to become a surgeon” is dissonant with your cognition “I could have nailed a good grade and admission to medical school, but I chose not to. Wow, was I stupid!”
Suppose that after a difficult struggle, you decide to cheat. According to dissonance theory, it is likely that you would try to justify the action by finding a way to minimize its negative aspects. In this case, an efficient path to reducing dissonance would involve changing your attitude about cheating. You would adopt a more lenient attitude toward cheating, convincing yourself that it is a victimless crime that doesn’t hurt anybody, that everybody does it, and that, therefore, it’s not really so bad.
Suppose, by contrast, after a difficult struggle, you decide not to cheat. How would you reduce your dissonance? Again, you could change your attitude about the morality of the act, but this time in the opposite direction. That is, to justify giving up a good grade, you convince yourself that cheating is a heinous sin, that it’s one of the lowest things a person can do, and that cheaters should be rooted out and severely punished.
How Dissonance Affects Personal Values
What has happened is not merely a rationalization of your own behavior, but a change in your system of values. Thus, two people acting in two different ways could have started out with almost identical attitudes toward cheating. One came within an inch of cheating but decided to resist, while the other came within an inch of resisting but decided to cheat. After they had made their decisions, however, their attitudes toward cheating would diverge sharply as a consequence of their actions (see Figure 2).
These speculations were tested by Judson Mills (1958) in an experiment he performed in an elementary school. Mills first measured the attitudes of sixth graders toward cheating. He then had them participate in a competitive exam, with prizes awarded to the winners. The situation was arranged so that it was almost impossible to win without cheating. Mills made it easy for the children to cheat and created the illusion that they could not be detected. Under these conditions, as one might expect, some of the students cheated and others did not. The next day, the sixth graders were again asked to indicate how they felt about cheating. Sure enough, the children who had cheated became more lenient toward cheating, and those who had resisted the temptation to cheat adopted a harsher attitude.
Our prediction is that as you read this, you are thinking about your own beliefs about cheating and how they might relate to your own behavior. Not long ago, a scandal broke out at a Florida business school. In one course, a professor discovered, more than half the students had cribbed from an exam stolen in advance. When interviewed, those who cheated said things like, “Hey, no big deal. Everyone does it.” Those who refrained from cheating said, “What the cheaters did was awful. They are lazy and unethical. And they are planning for careers in business?”
Take another look at Figure 2 and imagine yourself at the top of that pyramid, about to make any important decision, such as whether to stay with a current romantic partner or break up, use illegal drugs or not, choose this major or that one, get involved in politics or not. Keep in mind that once you make a decision, you are going to justify it to reduce dissonance, and that justification may later make it hard for you to change your mind . . . even when you should.
Dissonance, Culture, and the Brain (...)
Some Final Thoughts on Dissonance: Learning from Our Mistakes
At the beginning of this chapter, we raised a vital question regarding the followers of Heaven’s Gate: How could intelligent people allow themselves to be led into what the overwhelming majority of us see as senseless behavior resulting in mass suicide? Of course, many factors were operating, including the charismatic power of each of the leaders, the existence of social support for the views of the group from other members, and the relative isolation of each group from dissenting views, producing a closed system—a little like living in a roomful of mirrors.
Yet, in addition to these factors, one of the single most powerful forces was the existence of a high degree of cognitive dissonance within the minds of the participants. After reading this chapter, you now realize that when individuals make an important decision and invest heavily in that decision (in terms of time, effort, sacrifice, and commitment), the result is a strong need to justify those actions and that investment. The more they give up and the harder they work, the greater will be the need to convince themselves that their views are correct. The members of the Heaven’s Gate cult made monumental sacrifices for their beliefs: they abandoned their friends and families, left their professions, relinquished their money and possessions, moved to another part of the world, and worked hard and long for the particular cause they believed in—all increasing their commitment to the belief.
By understanding cognitive dissonance, therefore, you can understand why the Heaven’s Gate people, having bought a telescope that failed to reveal a spaceship that wasn’t there, concluded that the telescope was faulty. To have believed otherwise would have created too much dissonance to bear. That they went on to abandon their “containers,” believing that they were moving on to a higher incarnation, is not unfathomable. It is simply an extreme manifestation of a process that we have seen in operation over and over again throughout this chapter.
Perhaps you are thinking, “Well, but they were a strange, isolated cult.” But, as we have seen, dissonance reduction affects everyone. Much of the time, dissonance-reducing behavior can be useful because it allows us to maintain self-esteem. Yet if we were to spend all our time and energy defending our egos, we would never learn from our mistakes, bad decisions, and incorrect beliefs. Instead, we would ignore them, justify them, or, worse still, attempt to turn them into virtues. We would get stuck within the confines of our narrow minds and fail to grow or change. And, in extreme cases, we might end up justifying our own smaller Heaven’s Gates—mistakes that can harm ourselves and others.
It’s bad enough when ordinary people get caught up in the self-justifying cycle, but when a political leader does so, the consequences can be devastating for the nation and the world (Tavris & Aronson, 2007). In 2003, President George W. Bush wanted to believe that Iraqi leader Saddam Hussein possessed weapons of mass destruction (WMD), nuclear and biochemical weapons that posed a threat to America and Europe. He needed this belief to be true to justify his decision to launch a preemptive war, although Iraq posed no immediate threat to the United States and none of its citizens had been involved in the attacks of 9/11. According to White House insider Scott McClellan (2009), this need led the president and his advisers to interpret CIA reports as definitive proof of Iraq’s weapons of mass destruction, even though the reports were ambiguous and were contradicted by other evidence (Stewart, 2011; Wilson, 2005).
After the invasion of Iraq, administration officials, when asked “Where are the WMD?,” said that Iraq was a big country, that Saddam Hussein had hidden the weapons well, and that they were sure the weapons would be found. As the months dragged on and still no WMD were discovered, the administration officials had to admit that there were none. Now what? How did President Bush and his staff reduce dissonance between “We believed there were WMD that justified this war” and “We were wrong”? By adding new cognitions to justify the war: Now they said that the U.S. mission was to liberate the nation from a cruel dictator and give the Iraqi people the blessings of democratic institutions. Even if things were not going well now, they said, history would vindicate them in 10 or 20 or 50 years. To an observer, these justifications are inadequate; after all, there are many brutal dictators in the world, and no one can foresee the long-term results of any war begun for a short-term purpose. But to President Bush and his advisers, the justifications seemed reasonable (Bush, 2010).
Of course we cannot be certain what was going on in President Bush’s mind, but some five decades of research on cognitive dissonance suggests that the president and his advisers may not have been intentionally deceiving the American people; it is more likely that, like the members of Heaven’s Gate, they were deceiving themselves, blinding themselves to the possibility of being wrong. (...)
Few of us will ever wield the power of a world leader or end our lives in a cult waiting for a spaceship to transport us to another planet. But, on a smaller scale, in our zeal to protect our self-concept, we often make foolish mistakes and compound that failure by blinding ourselves to the possibility of learning from them. Is there hope? We think so. Although the process of self-justification is unconscious, once we know that we are prone to justify our actions, we can begin to monitor our thinking and, in effect, “catch ourselves in the act.” If we can learn to examine our behavior critically and dispassionately, we stand a chance of breaking out of the cycle of action followed by self-justification followed by more committed action. Admittedly, acknowledging our mistakes and taking responsibility for them is easier said than done. Imagine that you are a prosecutor who has worked hard for many years to put “bad guys” in prison. You’re the good guy. How will you respond to the dissonant information that DNA testing suggests that a few of those bad guys you put away might be innocent? Will you welcome this evidence with an open mind, because you would like justice to be done, or will you reject it, because it might show that you were wrong? Unfortunately—but not surprisingly for those who understand dissonance theory—many prosecutors in America make the latter choice: they resist and block the efforts by convicted prisoners to reopen their cases and get DNA tests (Tavris & Aronson, 2007). Their dissonance-reducing reasoning is something like this: “Well, even if he wasn’t guilty of this crime, he was surely guilty of something else; after all, he’s a bad guy.”
But at least one prosecutor chose to resolve that dissonance in a more courageous way. Thomas Vanes had routinely sought the death penalty or extreme prison sentences for defendants convicted of horrible crimes. One man, Larry Mayes, served more than 20 years for rape before DNA testing cleared him of the crime. “When [Mayes] requested a DNA retest on that rape kit,” he wrote, “I assisted in tracking down the old evidence, convinced that the current tests would put to rest his long-standing claim of innocence. But he was right, and I was wrong. Hard facts trumped opinion and belief, as they should. It was a sobering lesson, and none of the easy-to-reach rationalizations (just doing my job, it was the jurors who convicted him, the appellate courts had upheld the conviction) completely lessen the sense of responsibility—moral, if not legal—that comes with the conviction of an innocent man” (quoted in Tavris & Aronson, 2007, p. 157).
Throughout our lives, all of us, in our roles as family members, workers, professionals, and citizens, will be confronted with evidence that we were wrong about something important to us—something we did or something we believed. Will you step off the pyramid in the direction of justifying that mistake . . . or will you strive to correct it?
Summary
What is the theory of cognitive dissonance, and how do people avoid dissonance to maintain a stable, positive self-image?
■ The Theory of Cognitive Dissonance
Most people need to see themselves as intelligent, sensible, and decent folks who behave with integrity. This chapter is about the behavior changes and cognitive distortions that occur when we are faced with evidence that we have done something that is not intelligent, sensible, or decent—the mental effort we expend to maintain that positive self-image.
• Maintaining a positive self-image
According to cognitive dissonance theory, people experience discomfort (dissonance) when they behave in ways that are inconsistent with their conception of themselves (self-image). To reduce the dissonance, people either (1) change their behavior to bring it in line with their cognitions about themselves, (2) justify their behavior by changing one of their cognitions, or (3) attempt to justify their behavior by inventing new cognitions. One common kind of new cognition is self-affirmation, focusing on a positive quality to offset feelings of having acted foolishly. When people’s self-esteem is temporarily enhanced, they are less likely to cheat or commit other unethical acts, and more likely to work hard to improve their grades, so as to keep their behavior consonant with their self-concept. But people are not good at anticipating how they will cope with future negative events; they show an impact bias, overestimating how bad they will feel, because they don’t realize that they will be able to reduce dissonance.
• Rational behavior versus rationalizing behavior
Humans often process information in a biased way, one that fits our preconceived notions. The explanation for this is that information or ideas that disagree with our views arouse dissonance. And we humans avoid dissonance even at the expense of rational behavior.
• Decisions, decisions, decisions
Decisions arouse dissonance because they require choosing one thing and not the other. The thought that we may have made the wrong choice causes discomfort—postdecision dissonance—because it would threaten our self-image as one who makes good decisions. After the choice is final, the mind diminishes the discomfort by solidifying the case for the item chosen or the course of action taken. That is how dissonance reduction can change a person’s values and morality: once an unethical act is committed, the person experiencing dissonance justifies it, thereby increasing the likelihood of committing it again.
• Dissonance, culture, and the brain
Dissonance seems to be hardwired in the brain; different parts of the brain are activated when people are in a state of mental conflict or have made a choice. Because postdecision dissonance has been observed in monkeys but not other species, many researchers believe it must have an evolutionarily adaptive purpose in primates. However, although cognitive dissonance seems to be universal, occurring in non-Western cultures as well as Western ones, the content of what creates dissonant cognitions and the process and intensity of dissonance reduction do vary across cultures, reflecting the difference in cultural norms.
How is the justification of effort a product of cognitive dissonance, and what are some practical applications for reducing dissonance?
■ Self-Justification in Everyday Life
Researchers have studied the forms of dissonance reduction and their application in many spheres of life.
• The justification of effort
People tend to increase their liking for something they have worked hard to attain, even if the thing they have attained is not something they would otherwise like. This explains the intense loyalty that initiated recruits feel for their fraternities and military institutions after undergoing hazing.
• External versus internal justification
When we perform an action because of an ample external reward, the action has little or no effect on our attitudes or beliefs. However, if the reward is not big enough to justify the action, we experience cognitive dissonance because there is little external justification for what we did. This activates an internal justification process to justify the action to ourselves. The internal process of self-justification has a much more powerful effect on an individual’s long-term values and behaviors than does a situation where the external justifications are evident. When people publicly advocate something that is counter to what they believe or how they behave, called counterattitudinal advocacy, they will feel dissonance. Counterattitudinal advocacy has been used to change people’s attitudes in many ways, from their prejudices to self-defeating beliefs and harmful practices such as bulimia.
• Punishment and self-persuasion
Another way of getting people to change is not by administering severe punishment, but insufficient or mild punishment, as the forbidden-toy experiment demonstrated. The less severe the threat or the smaller the reward, the less external justification the person has for compliance, and thus the greater the need for internal justification. The resulting self-persuasion becomes internalized and lasts longer than temporary obedience to avoid a punishment.
• The hypocrisy paradigm
Inducing hypocrisy—making people face the difference between what they say and what they do—is one way to use the human tendency to reduce dissonance to foster socially beneficial behaviors. In an AIDS-prevention experiment, participants videotaped speeches about the importance of using condoms and were then made aware of their own failure to use them. To reduce dissonance, they changed their behavior: they purchased condoms.
• Justifying good deeds and harmful acts
A clever application of cognitive dissonance theory is to get someone to like you by having them do you a favor. The reason this works is that the person needs to internally justify the fact that they did something nice for you. The converse is true as well. If you harm another person, to reduce the threat to your self-image that could come from doing a bad deed, you will tend to justify what you did by denigrating your victim: the person deserved it, or he or she is not “one of us” anyway. In extreme cases such as conflict and war, many people will embrace the cognition that the victim or enemy deserved everything they got because they are less than human.
How can people avoid the traps of self-justification and other dissonance-reducing behavior?
■ Some Final Thoughts on Dissonance: Learning from Our Mistakes
Much of the behavior described in this chapter may seem startling: people coming to dislike others more after doing them harm, people liking others more after doing them a favor, people believing a lie they’ve told only if there is little or no reward for telling it. These behaviors would be difficult for us to understand if it weren’t for the insights provided by the theory of cognitive dissonance.
There are times when dissonance reduction is counterproductive because it solidifies negative values and behaviors, and this applies to everyone from members of small cults to national leaders. Although the process of reducing dissonance is unconscious, it is possible to intervene in the process. Knowing that humans are dissonance-reducing animals can make us more aware of the process. The next time we feel the discomfort of having acted counter to our values, we can consciously pause the self-justification process to reflect on our action.
Social Psychology
Aronson Wilson Akert