Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts — A Review

I am totally having fits about this book. Everyone reading this blog has to read it. Everyone not reading this blog has to read it. I was already more or less familiar with the concepts in it before I started reading… and I am nevertheless finding it a life-changer.

And in particular, anyone interested in religion has to read it. It doesn’t talk much about religion specifically; but the ideas in it are spot-on pertinent to the topic.

For believers… and for atheists.

A quick summary. Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts is about cognitive dissonance: the uncomfortable-at-best feeling you get when things you do, or things that happen, contradict your beliefs — about yourself or the world. It’s about the unconscious justifications, rationalizations, and other defense mechanisms we use to keep that dissonance at bay. It’s about the ways that these rationalizations perpetuate and entrench themselves. And it’s about some of the ways we may be able to derail them. The book is fascinating and readable; it’s clear, well-written, well-researched, loaded with examples, and often very funny.

The basic idea: When we believe something that turns out to be untrue, it conflicts with our concept of ourselves as intelligent. When we make a decision that turns out badly, it conflicts with our concept of ourselves as competent. And when we do something that hurts someone, it conflicts with our concept of ourselves as good. That’s the dissonance.

And what we do, much if not most of the time, is rationalize. We come up with reasons why our mistake wasn’t really a mistake; why our bad deed wasn’t really so bad.

“I couldn’t help it.” “Everyone else does it.” “It’s not that big a deal.” “I was tired/sick.” “They made me do it.” “I’m sure it’ll work out in the long run.” “I work hard, I deserve this.” “History will prove me right.” “I can accept money and gifts and still be impartial.” “Actually, spending fifty thousand dollars on a car makes a lot of sense.” “When the Leader said the world was going to end on August 22, 1997, he was just speaking metaphorically.”

In fact, we have entire social structures based on supporting and perpetuating each other’s rationalizations — from patriotic fervor in wartime to religion and religious apologetics.

More on that in a bit.

I could summarize the book ad nauseam, and this could easily turn into a 5,000-word book review. But I do have my own actual points to make. So here are, IMO, the most important pieces of info to take from this book:

1) This process is unconscious. It’s incredibly easy to see when someone else is rationalizing a bad decision. It’s incredibly difficult to see when we’re doing it ourselves. The whole way that this process works hinges on it being unconscious — if we were conscious of it, it wouldn’t work.

2) This process is universal. All human beings do it. In fact, all human beings do it pretty much every day. Every time we take a pen from work and think, “Oh everyone does it, and the company can afford it”; every time we light a cigarette after deciding to quit and think, “Well, I only smoke half a pack a day, that’s not going to kill me”; every time we eat a pint of Ben and Jerry’s for dinner and think, “It’s been a long week, I deserve this”; every time we buy consumer products made in China (i.e., by slave labor) and think, “I really need new sneakers, and I just can’t afford to buy union-made”… that’s rationalization in action. It is a basic part of human mental functioning. If you think you’re immune… I’m sorry to break this to you, but you’re mistaken. (See #1 above, re: this process being unconscious, and very hard to detect when we’re in the middle of it.)

3) This process is self-perpetuating. The deeper we get into a rationalization, the more likely we are to repeat the bad decision, hang on to the mistaken belief, continue to do harm to others.

This is probably the scariest part of the book. When we hurt someone and convince ourselves that they deserved it, we’re more likely to hurt them — or other people like them — again. Partly because we’ve already convinced ourselves that they’re bad, so why not… but also, in large part, to bolster our belief that our original decision was right.

The most chilling examples of this are in the justice system and international relations. In the justice system, cops and prosecutors are powerfully resistant to the idea that they might have made a mistake and put the wrong person in prison. As a result, they actively resist revisiting cases, even when new evidence turns up. And the justice system is, in far too many ways, structured to support this pattern.

As for this process playing out in international relations, I have just three words: “The Middle East.” Any time you have a decades- or centuries-old “they started it” vendetta, you probably have one of these self-perpetuating rationalization processes on your hands. On all sides.

But this happens on a small scale as well, with individuals. I know that I’ve said snarky, mean things behind people’s backs, for no good reason other than that friends of mine didn’t like them and were being mean and snarky about them… and I’ve then convinced myself that I really couldn’t stand that person, and gone on to say even more mean things about them. And I’ve more than once tried to convince my friends to dislike the people that I disliked… because if my friends liked them, it was harder to convince myself that my dislike was objectively right and true. All unconsciously, of course. It’s taken time and perspective to see that that’s what I was doing.

4) The more we have at stake in a decision, the harder we hang on to our rationalization for it.

This is a freaky paradox, but it makes a terrible kind of sense when you think about it. The further along we’ve gone with a bad decision, and the more we’ve committed to it, the more likely we are to justify it — and to stick with it, and to invest in it even more heavily.

A perfect example of this is end-of-the-world cults. When people quit their jobs and sell their houses to follow some millennial leader, they’re more likely to hang on to their beliefs, even though the world conspicuously did not end on August 22, 1997 like they thought it would. If someone doesn’t sell their house to prepare for the end of the world — if, say, they just take a week off work — they’ll find it easier to admit that they made a mistake.

And this is true not just for bad decisions and mistaken beliefs, but for immoral acts as well. Paradoxically, the worse the thing is that you’ve done, the more likely you are to rationalize it, and to stick to your rationalization like glue. As I wrote before when I mentioned this book: It’s relatively easy to reconcile your belief that you’re a good person with the fact that you sometimes make needlessly catty remarks and forget your friends’ birthdays. It’s a lot harder to reconcile your belief that you’re a good person with the fact that you carved up a pregnant woman and smeared her blood on the front door. The more appalling your immoral act was, the more likely you are to have a rock-solid justification for it… or a justification that you think is rock-solid, even if everyone around you thinks it’s transparently self-serving or batshit loony.

5) This process is necessary.

This may be the hardest part of all this to grasp. As soon as you start learning about the unconscious rationalization of cognitive dissonance, you start wanting to take an icepick and dig out the part of your brain that’s responsible for it.

But in fact, rationalization exists for a reason. It enables us to make decisions without being paralyzed about every possible consequence. It enables us to have confidence and self-esteem, even though we’ve made mistakes in the past. And it enables us to live with ourselves. Without it, we’d be paralyzed with guilt and shame and self-doubt. Perpetually. We’d never sleep. We’d be second-guessing everything we do. We’d be having dark nights of the soul every night of our lives.

So that’s the gist of the book. Cognitive dissonance, and the unconscious rationalizations and justifications we come up with to deal with it, are a basic part of human consciousness. It’s a necessary process… but it also does harm, sometimes great harm. So we need to come up with ways, both individually and institutionally, to minimize the harm that it does. And since the process is harder to stop the farther along it’s gone, we need to find ways to catch it early.

That’s the concept. And I think it’s important.

It’s important because, in a very practical and down-to-earth way, this concept gives us a partial handle on why dumb mistakes, absurd beliefs, and harmful acts get perpetuated. And it gives us — again, in a very practical, down-to-earth way — a handle on what we can do about it.

We have a tendency to think that bad people know they’re bad. Our popular culture is full of villains cackling over their beautiful wickedness, or trying to lure their children to The Dark Side. It’s a very convenient way of positioning evil outside ourselves, as something we could never do ourselves. Evil is Out There, something done by The Other. (In fact, I’d argue that this whole cultural trope is itself a very effective support for rationalization. “Sure, I set the stove on fire/ shagged the babysitter/ gave my money to a con artist… but it’s not like I’m Darth Vader.”)

But reality isn’t like that. Genuine sociopaths are rare. Most people who do bad things — even terrible, appalling, flatly evil things — don’t think of themselves as bad people. They think of themselves as good people, and they think of their evil acts as understandable, acceptable, justifiable by the circumstances. In some cases, they even think of their evil acts as positive goods.

If we want to mitigate the effects of foolish beliefs, bad decisions, and hurtful acts, we need to look at the reality of how these things happen. We need to be vigilant about our own tendency to rationalize our mistakes. We need to be knowledgeable about how to effectively deal with other people’s rationalizations. We need to create institutional structures designed to catch both our mistakes and our rationalizations, and to support us in acknowledging them. (The scientific method is a pretty good model of this.) And especially in America, we need to create a culture that doesn’t see mistakes as proof of incompetence, misconceptions as proof of stupidity, and hurtful acts as proof of evil.

And this book offers us ways to do all of that.

The book isn’t perfect. There are, for instance, some very important questions that it neglects to answer. Specifically, I kept finding myself wondering: What’s the difference between rationalization and simple optimism, or positive thinking? What’s the difference between rationalizing a bad decision, and just having a silver-lining, “seeing the bright side” attitude? And if there is a difference, how can you tell which one you’re doing?

And, as a commenter here on the blog asked when I mentioned this book earlier: What’s the difference between justifying why your bad behavior wasn’t really bad — and genuinely changing your mind about what is and isn’t bad? Think of all the people who believed that homosexual sex was wrong and they were bad people for even thinking about it… until they actually did it, and spent time with other people who did it, and realized that there wasn’t actually anything wrong with it. How do you tell the difference between a rationalization and a genuine change of heart?

Somewhat more seriously, the section on “What can we actually do about this?” is rather shorter than I would have liked. The authors do have some excellent practical advice on dealing with cognitive dissonance and rationalization. But while their advice on dealing with other people’s rationalizations is helpful, and their ideas on creating institutional structures to nip the process in the bud are inspired, their advice for dealing with one’s own dissonance/ rationalization pretty much comes down to, “Just try to be aware of it.” Problematic — since as they themselves point out, rationalization and justification are singularly resistant to introspection.

But it’s a grand and inspiring start, an excellent foundation on an important topic. It’s been a life-changer, and I recommend it passionately to everyone.

So what does it have to do with religion?

(To be continued tomorrow.)

12 thoughts on “Mistakes Were Made (But Not By Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts — A Review”

  1.

    There’s an important counterpoint to this—if you genuinely want to change someone’s mind, it’s important to let them rationalize. It may be just a polite fiction from your point of view, but avoid asking someone to admit that they were wrong.
    Misinformed, hadn’t thought it through properly, didn’t understand the question, or whatever, but help them find a way to dodge around the blunt “wrong” that wakes up all the ego defenses.

  3.

    I can already feel myself meta-rationalizing. “Sure, I rationalize things, but everyone else does it too…it’s just part of what makes me human.”
    And I haven’t even read the book yet.

  4.

    I immediately said to myself “but I really really want a candy bar, after all it isn’t the worst vice I could have” 🙂
    It does look like a very interesting book.

  6.

    One of the most effective tools I’ve discovered for minimizing cognitive dissonance has been meditation. There’s some interesting research on the effects it has on self-regulation of emotion and self-reflection, both of which tend to balance out cognitive dissonance. Check out the book “The Mindful Brain” for an overview of some of the research.
    That also makes me think about your stance on woo. 30 years ago, meditation was seen as useless woo, other than by practitioners. Now we have evidence that it helps the immune system and enhances mental health. So I tend to think that one of the values of woo is that it questions the edges of our beliefs. Yes, a lot of it is crap, but 90% of everything is crap (according to Sturgeon’s “Law”, which I tend to agree with.) But some valuable stuff has come out of it, which I think is important to remember.

  9.

    “Just try to be aware of it.”
    I have created a sign that I’ve now posted onto my bulletin board (I work from home) that says:
    “WHAT HAVE YOU RATIONALIZED TODAY?”
    It’s a start. Excellent, btw.
    DeSwiss~
    (Long-time lurker and fan)

  10.

    It reminds me of progressive Christianity. They know that the Bible says premarital and homosexual sex (not to mention judgemental rudeness and homophobia) are un-Christian, but they mentally masturbate and say things like “we’re all sinners, we all fall short of the glory.”

  11.

    As much as possible, REMIND someone that they are wrong, coupled with setting a right example – morality springs from there. Anyway, good compilation you have there; I will be reading a couple of them, and let us see if I justify some “foolish” beliefs I have. Thank you for this.

  12.

    Does the book address the question, “What about rationalizations that are justified?” I mean, just because we rationalize things to make ourselves feel better, it may be the case as often as not that what we did really WASN’T so bad, and that we are justified in rationalizing, that the rationalizations might in fact be true. You mentioned that the book addresses the fact that rationalizations are necessary to keep us from being “paralyzed with guilt and shame and self-doubt”, but this relegates them to the status of heuristics or useful-but-false beliefs. I think that it might be the case that our rationalizations could in fact be true as often as not, that our guilt and shame and self-doubt are as much mental flaws as our tendency to rationalize is.
    Or I could just be trying to make myself feel better.
