Not so! Over the past several months, I've paid much closer attention to the phenomenology of disagreement and what it's like from the inside to change my mind. I've found that even after having been raised by scientists, earning a degree in philosophy, and putting Julia's tricks to frequent use, changing my mind really is incredibly difficult.
Along the way, I adopted the following mantra: just because I'm more rational than those around me does not mean I am in fact rational.
It's a little tricky to discover instances of my own irrationality for a stubbornly obvious reason: If I were fully aware of being wrong, I'd already be right! But it's not impossible once you get used to trying. It's just a matter of recognizing what self-deception feels like. At first, though, it's easiest to catch irrationality in retrospect, so here's a little exercise that taught me to be on the lookout for resistance to learning the truth.
Exercise One
Next time you change your mind about something, make a study of what led you to do it.
- Get a piece of paper and fold it in half. In great big letters at the top left, write down an estimate of how certain you were of your belief before you started the process of changing your mind. Beneath that, write out, in as much detail as possible, why you held the false belief in the first place. If there were several pieces of evidence, make a list.
- On the other side, write down all the evidence you collected or considered that ultimately led you to abandon your former hypothesis. Circle the one that finally did the trick.
- Then, distance yourself from the situation. Pretend that it's a story about someone else entirely. Consider each piece of weakening evidence individually, and estimate how much less certain it would make a fully rational Bayesian reasoner, both on its own and in conjunction with the other pieces of evidence you already had when you started considering this new one. If you want to be really fancy about it, plug it into Bayes' theorem and run the numbers (there's a small sketch of this right after the list). Write those estimates in a column.
- Finally, in another column, estimate how much each piece of evidence really did decrease your certainty in your false belief, and compare those numbers to the Bayesian estimates in the first column.
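
To make the "run the numbers" step concrete, here's a minimal Python sketch of the Bayesian update. The likelihood numbers are made up for illustration; in a real run you'd plug in your own estimates of how probable each piece of evidence is under each hypothesis.

```python
# A minimal sketch of the Bayesian update from the third step. The
# likelihoods below are hypothetical: P(evidence | belief true) and
# P(evidence | belief false) for each piece of weakening evidence.

def bayes_update(prior, p_given_true, p_given_false):
    """Return the posterior certainty in the belief after one piece of evidence."""
    numerator = p_given_true * prior
    return numerator / (numerator + p_given_false * (1 - prior))

certainty = 0.95  # started out 95% sure of the (false) belief
# Three pieces of weakening evidence, each three times likelier
# if the belief is false than if it's true.
for p_true, p_false in [(0.2, 0.6), (0.2, 0.6), (0.2, 0.6)]:
    certainty = bayes_update(certainty, p_true, p_false)
    print(f"certainty is now {certainty:.0%}")
# Prints: 86%, 68%, 41% -- well below 50% by the third update.
```

Notice that three only mildly unfavorable observations are enough to drag a 95% prior under 50%. That gap, between what the arithmetic says and what your gut did, is exactly what the comparison in the last step is meant to expose.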
Now, perhaps you're a whole lot more rational than I am. But here's what I find almost every time: my certainty barely changes at all until the final piece of evidence, even though the Bayesian reasoner's certainty in the false hypothesis falls well below 50% long before that.
This is what it means to cling to a belief, and it's all the more difficult to overcome in the course of a debate. Even the most rational among us have human brains full of cognitive biases; defending yourself against them takes serious effort, no matter how far you've come as a rationalist.
But you don't have to take my word for it! Go do science to it. I'll see you next time. ;-)