“Strive to keep the door open”: An Interview with “Mistakes Were Made” Co-Author Carol Tavris

As regular readers will know, I recently had one of those “books that changed my life” experiences. For Santamas, Ingrid gave me Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, the book on cognitive dissonance and our rationalizations thereof… and I quickly became fascinated, bordering on obsessive. I couldn’t shut up about the book for weeks, and I’ve already blogged about it in a two-part post.

And I was fortunate enough to get an interview this week with one of the book’s co-authors, Carol Tavris. We talked about cognitive dissonance and rationalization, and how they relate to international politics, gay sex, religion, wedding plans, and other burning issues of the day.

Greta: Thank you so much for talking with me. Let’s start with a really basic question: How did you and Elliot get interested in this topic? How long have you been researching it, and what made you decide to pursue it?

Carol: The two of us have been friends for over 30 years, sharing a passion for psychological science and its relevance to human problems. I’d gone to visit Elliot as he was beginning to lose his vision to macular degeneration, and we were talking about George W. Bush. Bush had already become the poster boy for the inability to admit a mistake — that Saddam Hussein had weapons of mass destruction that warranted preemptive action, that Iraqis would be greeting our soldiers by dancing in the streets — and Elliot and I got to talking about this universal glitch of the human mind: Why it is that most individuals, when confronted by evidence that they are wrong or have made a mistake, do not say, “Hey! Thanks for that great information!” Rather than change their point of view, they cling even more tenaciously to their beliefs or courses of action.

Elliot has been at the forefront of the scientific study of self-justification for many years; he has conducted many experiments that have illuminated its workings in all corners of our lives. His research and writings are world famous, and his understanding of cognitive dissonance began to answer questions that had motivated my own writing over the years — why so many professionals are unable to give up theories and practices that have been shown to be wrong, including therapists who cling to outdated methods or theories like “repression,” scientists who are unconsciously corrupted by conflicts of interest, and social workers who fed the daycare sex-abuse hysteria with the notion that “children never lie” about sexual matters.

And so we said to each other, in effect, “Say, we are on to something important here.” We decided to pool our areas of expertise to examine how self-justification operates across many domains, from the public sphere of politics, justice, and war to our most private lives — and how, with a little self-awareness, conscious effort, and sense of humor, we can all learn to beat the brain’s wiring. The title was Elliot’s, which is ironic, since, as he says, he’s only ever made one mistake in his life, oh, around 1973.

The book had a very strong effect on me, as you probably noticed from my review. As a writer and a thinker, of course, but also in my personal life. I’ve been much more conscious about rationalizing, and I think I’ve been better about copping to it when I make mistakes. But I’m also seeing what you mean when you say that rationalization is necessary. When I’m trying to be super-conscious about it, it can be paralyzing — it’s hard to make decisions, and I keep second-guessing myself. And I’ve been getting kind of overwhelmed with guilt over very small misdeeds. (I’ve been apologizing to my girlfriend ad nauseam. She finally had to tell me to knock it off.)

Yes, anything is bad in excess — even chocolate and apologies! OK, maybe not chocolate.

My question: Is that something you’ve dealt with as you’ve been researching and writing about this subject? And if so, how do you cope with it? You, personally — but also, what’s your professional advice about it? How do you stay conscious about rationalization so it doesn’t screw things up for you and everyone else… but still let yourself rationalize enough to get on with your life? How do you strike that balance?

When I wrote my first book, on anger, that was the hardest lesson: How do you decide which battles are worth fighting — when anger is morally and politically necessary — and when you should let things go? It is the same here. None of us could get through the day if we stopped to examine everything we do: “What, exactly, are the data for brushing your teeth?” But there are guidelines, and I try to follow them myself.

First, the more important the decision, the more vigilant we have to be. Knowing that we will start reducing dissonance the moment we make a choice, for example, means forcing ourselves to keep an open mind about disconfirming evidence that might come along later. If the decision is unimportant, it’s no big deal; let it go; reducing dissonance lets you sleep at night. If the decision could have major consequences in your life, personally or professionally, strive to keep the door open. Intellectually, this is crucial — to keep an open mind about, say, hormone replacement therapy or medical procedures or psychological beliefs that are important to us. On the latter, many developmental psychologists and parents still can’t give up the belief that parents determine everything about how their kids turn out. I’ve modified my own views about the power of genetics in human behavior — I was once a radical behaviorist.

Of course, as we say in the relationships chapter, sometimes it is good to blind ourselves to disconfirming evidence — say, to our loved ones’ flaws and foibles!

Another good example of rationalization sometimes being necessary. 🙂


So when I was reading the book, a lot of questions came up, mostly along the lines of, “How can you tell the difference between rationalization and (X)?” I wanted to ask you a couple. The first one, the one that came up for me a lot, was this: How can you tell the difference between rationalization and just having an optimistic, silver-lining attitude?

For instance: Let’s say — oh, just for a random example — that you’re planning your wedding, and you put off hiring a caterer until really late in the process… too late to get your first choice. And then you say to yourself, “Well, we really loved the caterer we wound up with, and we never would have known about them if we hadn’t procrastinated, so it all turned out for the best.” Is that rationalization… or is that just being able to see the bright side? I’d like to be able to rationalize less… but I’d hate to give up my optimistic attitude, my ability to focus on the positive side of things. How do you tell the difference?

The ability to make lemonade out of lemons is crucial to well-being. If the caterer’s food was fine, you aren’t “rationalizing,” you are, as you say, simply looking on the bright side: “The food was great! OK, so we didn’t have the salmon truffles with gold sprinkles that we would have gotten from our first choice caterer, but so what?”

But if, by delaying, you got a caterer who robbed you blind, made awful food, and stole all the muffins, “rationalizing” that you got a good one is self-defeating — and it’s a lie, to yourself. And if you now justify that choice of caterer by telling your best friend to choose the same one, you are perpetuating the lie, and the mistake. Best to say: “Boy, we sure learned a lesson in how not to choose a caterer. At our next wedding, we’ll know what to do!” Notice that the point is to accept and learn from the mistake, not to beat yourself up. You don’t want to go to the opposite extreme, saying “The food was lousy — and that means everything was horrible and no one will love me and it was a completely disastrous day, whine!”

That makes sense. That’s a useful distinction, actually: the difference between admitting mistakes and wallowing in self-recrimination. I think we sometimes forget that you can do one without the other.

On a similar topic: What’s the difference between justifying why your bad behavior wasn’t really bad — and genuinely changing your mind about what is and isn’t bad? Example: Let’s say you have homosexual feelings, which you believe are bad and wrong… but then you go ahead and have gay sex, and you start meeting gay people, and you decide there isn’t actually anything wrong with it. A lot of people, such as the Christian right, would call that a rationalization. Others — me, for instance — would call it a genuine change of heart. How do you tell the difference?

That example is a genuine change of heart, which is often what happens when we have new experiences. People have prejudices against all sorts of things (eating insects or catfish) and kinds of people they don’t know — Turks or New Yorkers, Jews and Muslims, Southerners and evangelicals. The “contact hypothesis” in social psychology has long predicted that one of the first steps in undoing a prejudice toward a group is becoming more familiar with its members. (Or eating that catfish.)

So when people meet someone from a group they formerly detested, they are often surprised to learn the person is actually human. They aren’t “rationalizing” a change of heart; they’ve had a change of heart. Many atheists, by the way, are jolted by dissonance on learning that the evangelical movement in America has been enormously influential in feeding the poor and sheltering the homeless, and, more recently, in supporting the environmental movement — those people aren’t supposed to be Christian!

Now, in the example you gave, what if a person has homosexual feelings that he or she believes are morally wrong? Or is simply a member of a religion that is anti-gay, but who has a child or friend who is gay or lesbian? That person will be in a classic, deeply felt state of dissonance. Something has to yield — the belief or the behavior. We can see how religious and political leaders struggle over how to reduce that dissonance. One way, for the person who begins a gay relationship, is to give up the belief that homosexuality is wrong — because the relationship is right. Others compartmentalize: “It’s wrong for everyone else but OK for me.” Others redefine the behavior: “Impersonal sex in bathrooms isn’t ‘homosexual sex,’ anyway” or “It isn’t homosexual if I’m just the passive recipient.” Others see it as a momentary lapse, a sin that can be atoned for.

It is interesting to see how many fundamentalists are struggling to find a way to reconcile Church doctrine with their own personal feelings of tolerance and understanding. Many, having met gay people and/or learned more about the issue, cannot live comfortably with the church view that homosexuality is a sin. David Myers, a social psychologist who is a devout Christian, wrote The Christian Case for Gay Marriage.

Which brings us to religion. I couldn’t help thinking about religion when I read your book, and I’d like to ask a couple of questions about cognitive dissonance and religion.

A lot of atheists — including me — see religion, and especially religious apologetics, as a prime example of rationalization and justification of a mistaken belief. Even if you don’t think all religious belief is like that by definition, certainly a lot of it is. Do you have any thoughts on how cognitive dissonance plays out in the religious world?

Cognitive dissonance is at the very heart and soul of the greatest theological question any religion struggles with: how can a good, kind, just God permit so much evil, cruelty, and injustice? For many, the need to believe that someone is looking over us trumps any evidence. You see this in every disaster, say when a plane crashes, killing 327 people. The three who are randomly saved always say, “God was looking out for me!” And was God mad at the other 327 who died?

I was at a Bar Mitzvah not long ago when the rabbi raised this question. How could a good, wise God, the same God who freed the Jews from slavery in Egypt, have let six million be exterminated in the Holocaust? There are only two responses: conclude that there is no God, or at least no God who cares about humanity; or believe even more strongly than before — after all, God is testing us. Believers are on the tip of that pyramid we describe in our book. When the religious belief is central to a person’s very identity and security in the world, he or she will not give it up. At this synagogue, the rabbi’s solution to the dissonance was to say: “God is responsible for all the good in the world; human beings create the evil.”

And Carol, you’d mentioned something in an email about the question of how to talk to religious believers without arousing dissonance… or too much dissonance, anyway. What thoughts do you (either of you) have about that? In debates with believers, it often seems that just pointing out evidence and logic for why a belief is probably mistaken is not very effective. What advice do you have for atheists debating with believers?

It is just as difficult for an atheist to persuade a believer to give up God as for a believer to convert an atheist. What evidence could someone give you to change your mind? This is why conversions — in both directions — generally come about through an individual’s personal quest for understanding, or because the original belief was precariously held. Julia Sweeney’s brilliant one-woman show, “Letting Go of God,” describes how that process worked for her: first she lost her religion, then she lost the other religions she thought might replace it; then, scarily, she let go of God. The opposite conversion — atheists becoming believers — is often spurred by a fear of illness or death, or a search for ultimate meaning.

Therefore, when atheists debate believers, I think there is no point in arguing matters of faith. Where science is involved — and this is crucial — the task for the atheist is not to frame the debate as one between science and religion. Give most Americans a choice like that, and they choose religion. But when — I learned this from an ACLU attorney specializing in separation of church and state issues — you frame the issue as one on which religious people themselves disagree, believers are less likely to get their backs up.

Gay marriage is an excellent example. Trying to argue with a fundamentalist about what the Bible does or doesn’t say won’t get you anywhere. Neither will shouting, “What is the matter with you? How could you be so stupid as to think that?” But framing the issue as one on which good Christians can disagree is better — as Dave Myers did, as educators do with evolution. Believers can then move to a more accepting position, on religious grounds.

And finally, what are some of the ways that you see cognitive dissonance playing out in the atheist community? Obviously, cognitive dissonance and rationalization are universal human traits, it’s not like they’re limited to religious believers.

You betcha they’re not.

Are there ways that you see rationalization and justification playing out among atheists? Are there any pitfalls that you see the newly-galvanized atheist community falling into that you’d like to warn us against?

Yes. Beware of taking on that smug tone of certainty, that “we know what’s right and you don’t, pbftttttt.” Most especially, we should all be careful in our arguments to avoid making the other person feel stupid or incompetent for holding the beliefs they do — that creates dissonance, and they will resolve it by telling you to get lost. The job of atheists is to make the case for nonbelief as clearly, unthreateningly, and persuasively as possible.

I think that’s everything. I could ask you a million more questions, but I don’t want to take up any more of your time. Thank you so much for talking with me — and thanks again for the wonderful book!

You are most happily welcome.

Carol Tavris is co-author with Elliot Aronson of Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. She is also author of The Mismeasure of Woman: Why Women Are Not the Better Sex, the Inferior Sex, or the Opposite Sex; Anger: The Misunderstood Emotion; and Psychobabble and Biobunk: Using Psychology To Think Critically About Issues in the News, as well as many other titles. She is a Fellow of the American Psychological Association, a charter Fellow of the American Psychological Society, and a Fellow of the Committee for Skeptical Inquiry. More information about the book and the authors can be found on the Mistakes Were Made (But Not By Me) website.


