MISTAKES WERE MADE (BUT NOT BY ME) --- WHY WE JUSTIFY FOOLISH BELIEFS,
BAD DECISIONS, AND HURTFUL ACTS by Carol Tavris and Elliot Aronson. Harcourt, Inc., 2007


OUTLINE OF BOOK'S FACTS & IDEAS

    INTRODUCTION --- Knaves, fools, villains, and hypocrites (p1-10)

        How do they live with themselves? (p1-10)

      1) COGNITIVE DISSONANCE --- The engine of self-justification (p11-39)

      2) PRIDE AND PREJUDICE --- And other blind spots (p40-67)

      3) MEMORY --- The self-justifying historian (p68-96)

      4) GOOD INTENTIONS, BAD SCIENCE --- The closed loop of clinical judgment (p97-126)

      5) LAW AND DISORDER (p127-157)

      6) LOVE'S ASSASSIN --- Self-justification in marriage (p158-184)

      7) WOUNDS, RIFTS, AND WARS (p185-212)

      8) LETTING GO AND OWNING UP (p213-236)

    AFTERWORD (p237-238)

    ENDNOTES (p239-276)

    INDEX (p277-292)

BOOK'S DESCRIPTION & REVIEWS

    [1] Dr. Cathy Goodwin (Amazon Book Reviewer)

      Why do people refuse to admit mistakes so deeply that they transform their own brains? They are not kidding themselves. They really believe what they have to believe to justify their original thought.

      There are some pretty scary examples in this book. Psychologists who refuse to admit they bought into false-memory theories, causing enormous pain. Politicians. Authors. Doctors. Therapists. Alien abduction victims.

      Most terrifying: the justice system operates this way. Once someone is accused of a crime --- even under the most bizarre circumstances --- the police believe he is guilty of something. Even when the DNA shows someone is innocent, or new evidence reveals the true perpetrator, they hesitate to let the accused person go free.

      This book provides an enjoyable, accurate guide through contemporary social psychology. So many "obvious myths" are debunked as we learn the way memory really works and why revenge does not end long-term conflict.

      Readers should pay special attention to the authors' discussion of the role of science in psychology, as compared to psychiatry, which is a branch of medicine. I must admit I was shocked to realize how few psychiatrists understand the concept of control groups and disconfirmation. Psychoanalysis in particular is not scientific. The authors stop short of comparing it to astrology or New Age belief.

      This book should be required reading for everyone, especially anyone who is in a position to make policy or influence the lives of others. But after reading the book, I suspect it will not do any good. Once we hold a position, say the authors, it is almost impossible to make a change.

    [2] Roger K. Miller (a newspaperman for many years, now a freelance writer, reviewer, and editor living in Wisconsin), Sunday, May 13, 2007

      Sorry, Bogie, but you were wrong in "Casablanca" when you told Ingrid Bergman she would regret "maybe not today, maybe not tomorrow, but soon, and for the rest of your life," if she stayed with you in Morocco instead of leaving with her Nazi-fighting husband.

      Quite the contrary: She would have found reasons to justify making that choice and not the other.

      And as Carol Tavris and Elliot Aronson explain in their new book, subtitled "Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts," either decision would have suited her in the long run.

      We all struggle mightily to prove to ourselves and others that whatever we do is the right thing to have done, even -- or most especially -- when it is not.

      This team of social psychologists tackles "the inner workings of self-justification," the mental gymnastics that allow us to bemoan the mote in our brother's eye while remaining blissfully unaware of the beam in our own.

      Their prose is lively, their research is admirable, and their examples of our arrogant follies are entertaining and instructive. Two concepts are central to their study:

      Cognitive dissonance: "The hard-wired psychological mechanism that creates self-justification and protects our certainties, self-esteem and tribal affiliations."

      Pyramid of choice: When we first deal with a mistake, we are at the top of the pyramid. As we create ever more elaborate fictions that absolve us and restore our sense of self-worth and thereby remove the dissonance, we descend step by step to the base.

      The authors follow the trail of "self-justification" through the areas of family, memory, therapy, law, prejudice and conflict, but some of the juiciest examples come from politics. Think most recently of U.S. Attorney General Alberto Gonzales, echoing Ronald Reagan when he used the very words of this book's title --- "mistakes were made." (Politicians are especially fond of the passive voice.)

      What's going on is not lying, exactly, except insofar as it is lying to oneself. As Aldous Huxley said, "There is probably no such thing as a conscious hypocrite."

      Newt Gingrich surely did not say to himself, "Here I am, condemning Bill Clinton for a sexual affair while I am doing exactly the same thing."

      We begin to believe the lies to ourselves. Lyndon Johnson was a master at it. His press secretary, George Reedy, said he had a fantastic capacity to will "what was in his mind to become reality."

      This is akin to "naive realism," the conviction that we perceive objects and events "as they really are." If other people don't perceive them the same way, they must be biased.

      The authors describe a whole toolbox of "mental instruments" with which we dig the hole deeper and deeper, among them:

        1) Ethnocentrism --- us against them, or us against those not us.

        2) Confirmation bias --- finding ways to distort or dismiss evidence that disconfirms our stance.

        3) Internalizing beliefs --- assuring ourselves that we have always felt a certain way, even when we make 180-degree turns.

        4) Source confusion --- not being able to distinguish what really happened from subsequent information that crept in from elsewhere; particularly characteristic of false memories, a concept they deplore.

        5) Getting what you want by revising what you had --- "mis-remembering, for instance, that your childhood was awful, thus distorting how far you have come, to feel better about yourself now."

      The authors conclude that we don't change because we aren't aware that we need to, and because we are, like people in many other cultures, "mistake-phobic." We see the admission of a mistake not as a sign that something needs to be fixed -- even though such an admission often elicits the plaudits of others -- but as a sign that we are weak.

      We need more "light," they say, more self-awareness; we need "trusted naysayers" in our lives. Actually, though they don't say as much, it seems as simple as what Robert Burns wrote more than 200 years ago (freely rendered here):

        "Oh would some Power the giftie gie us / To see ourselves as others see us! / It would from many a blunder free us."

      But then, that never has been as simple as it seems.

    [3] SAMPLE SECTION OF CHAPTER ONE --- by Carol Tavris and Elliot Aronson (Buzzle Staff and Agencies, 4/24/2007)

      Half a century ago, a young social psychologist named Leon Festinger and two associates infiltrated a group of people who believed the world would end on December 21. They wanted to know what would happen to the group when (they hoped!) the prophecy failed. The group's leader, whom the researchers called Marian Keech, promised that the faithful would be picked up by a flying saucer and elevated to safety at midnight on December 20. Many of her followers quit their jobs, gave away their homes, and dispersed their savings, waiting for the end. Who needs money in outer space? Others waited in fear or resignation in their homes. (Mrs. Keech's own husband, a nonbeliever, went to bed early and slept soundly through the night as his wife and her followers prayed in the living room.)

      Festinger made his own prediction: The believers who had not made a strong commitment to the prophecy -- who awaited the end of the world by themselves at home, hoping they weren't going to die at midnight -- would quietly lose their faith in Mrs. Keech. But those who had given away their possessions and were waiting with the others for the spaceship would increase their belief in her mystical abilities. In fact, they would now do everything they could to get others to join them.

      At midnight, with no sign of a spaceship in the yard, the group felt a little nervous. By 2 A.M., they were getting seriously worried. At 4:45 A.M., Mrs. Keech had a new vision: The world had been spared, she said, because of the impressive faith of her little band. "And mighty is the word of God," she told her followers, "and by his word have ye been saved -- for from the mouth of death have ye been delivered and at no time has there been such a force loosed upon the Earth. Not since the beginning of time upon this Earth has there been such a force of Good and light as now floods this room."

      The group's mood shifted from despair to exhilaration. Many of the group's members, who had not felt the need to proselytize before December 21, began calling the press to report the miracle, and soon they were out on the streets, buttonholing passersby, trying to convert them. Mrs. Keech's prediction had failed, but not Leon Festinger's.

      The engine that drives "self-justification," the energy that produces the need to justify our actions and decisions --- especially the wrong ones --- is an unpleasant feeling that Festinger called "cognitive dissonance." Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as "Smoking is a dumb thing to do because it could kill me" and "I smoke two packs a day."

      Dissonance produces mental discomfort, ranging from minor pangs to deep anguish; people don't rest easy until they find a way to reduce it. In this example, the most direct way for a smoker to reduce dissonance is by quitting. But if she has tried to quit and failed, now she must reduce dissonance by convincing herself that smoking isn't really so harmful, or that smoking is worth the risk because it helps her relax or prevents her from gaining weight (and after all, obesity is a health risk, too), and so on. Most smokers manage to reduce dissonance in many such ingenious, if self-deluding, ways.

      Dissonance is disquieting because to hold two ideas that contradict each other is to flirt with absurdity and, as Albert Camus observed, we humans are creatures who spend our lives trying to convince ourselves that our existence is not absurd. At the heart of it, Festinger's theory is about how people strive to make sense out of contradictory ideas and lead lives that are, at least in their own minds, consistent and meaningful.

      The theory inspired more than 3,000 experiments that, taken together, have transformed psychologists' understanding of how the human mind works. Cognitive dissonance has even escaped academia and entered popular culture. The term is everywhere. The two of us have heard it in TV newscasts, political columns, magazine articles, bumper stickers, even on a soap opera. Alex Trebek used it on Jeopardy, Jon Stewart on The Daily Show, and President Bartlet on The West Wing. Although the expression has been thrown around a lot, few people fully understand its meaning or appreciate its enormous motivational power.

      In 1956, one of us (Elliot) arrived at Stanford University as a graduate student in psychology. Festinger had arrived that same year as a young professor, and they immediately began working together, designing experiments to test and expand dissonance theory. Their thinking challenged many notions that were gospel in psychology and among the general public, such as the behaviorist's view that people do things primarily for the rewards they bring, the economist's view that human beings generally make rational decisions, and the psychoanalyst's view that acting aggressively gets rid of aggressive impulses.

      Consider how dissonance theory challenged behaviorism. At the time, most scientific psychologists were convinced that people's actions are governed by reward and punishment. It is certainly true that if you feed a rat at the end of a maze, he will learn the maze faster than if you don't feed him; if you give your dog a biscuit when she gives you her paw, she will learn that trick faster than if you sit around hoping she will do it on her own. Conversely, if you punish your pup when you catch her peeing on the carpet, she will soon stop doing it. Behaviorists further argued that anything that was merely associated with reward would become more attractive -- your puppy will like you because you give her biscuits -- and anything associated with pain would become noxious and undesirable.

      Behavioral laws do apply to human beings, too, of course. Nobody would stay in a boring job without pay, and if you give your toddler a cookie to stop him from having a tantrum, you have taught him to have another tantrum when he wants a cookie. But, for better or worse, the human mind is more complex than the brain of a rat or a puppy. A dog may appear contrite for having been caught peeing on the carpet, but she will not try to think up justifications for her misbehavior.

      Humans think! And because we think, dissonance theory demonstrates that our behavior transcends the effects of rewards and punishments and often contradicts them.
