One of my favourite quotes comes from the great physicist Richard Feynman, who cautioned his colleagues:
The first principle is that you must not fool yourself – and you are the easiest person to fool.
Feynman was talking to other scientists about the practice of science, but his caution is just as relevant for communication practitioners. To understand why, consider the case of Donald Trump.
If you’re middle class and educated, you might still be struggling to understand how Trump became President. Or, like many of my own friends, you might still be enthralled by the daily updates of the fear and loathing coming out of the White House, eagerly anticipating the train wreck that is surely imminent.
But the more I think about this, the more I think about what Feynman said. And the surer I am that there is a larger lesson to be learned here.
Let’s start with the grim facts: According to the Pew Research Center, most of the people who voted for Trump in 2016 would likely vote for him again. As a result, Pew is convinced that, if the election were held again tomorrow, Trump would win again.
Before I go on, pause for a moment and consider your reaction to that.
Because here’s the important part of what I’m going to say: that reaction provides a powerful insight into how we are all wired to fool ourselves.
The good news is that, at some level, this is not really our fault. Evolution has wired all of us to be ‘cognitive misers’. That is, we all think about as little as we can get away with. This makes more sense when you realise that while our brains take up about 2% of our body weight, they burn at least 20% of the fuel that goes into our bodies. As a result, our brains take shortcuts wherever and whenever they can. In a very real way, our brains prefer processing fluency over accuracy.
This means that what we believe is ‘thinking’ is really the application of the same old well-worn ‘heuristics and biases’. It feels like we’re thinking but we’re mostly going through the motions. This is because those biases and heuristics are things we think with but only rarely about.
I think we need to spend more time in 2018 thinking about them. Now, there are too many biases to discuss here (you can read a fuller list at: http://researchfirst.co.nz/wordpress/wp-content/uploads/2017/06/RF_2017_Annual_v15P2-WEB.pdf) but I want to briefly outline three of the big ones. In the process, I want to show you why none of us is really as smart as we think we are.
Now you might think all of this explains what’s wrong with Trump and the confederacy of dunces he’s gathered around him but I’m actually talking about you. And me. And everyone else.
The first of the biases we need to confront, and one of the most powerful shortcuts our brains use to make sense of the world, is called confirmation bias. ‘Confirmation bias’ is, in David McRaney’s words:
A filter through which you see a reality that matches your expectations.
In other words, it’s the tendency to look for confirmation for our pre-existing ideas while ignoring any evidence that might disprove those ideas.
The evidence suggests we process confirming ideas about twice as fast as disconfirming ones. As a result, one of the most common reasons we all make mistakes is not that the right answers are too hard but that the wrong answers are too easy.
There isn’t room here to demonstrate this bias to you, but try this simple question: when did you last change your mind about anything really important?
One of the reasons it’s hard to spot ourselves using confirmation bias is because our brains tidy away our mistakes. The second of the biases I want to talk about here, ‘hindsight bias’, is a common way we do that.
Hindsight bias, also known as the ‘knew-it-all-along’ effect, is the inclination, after an event has occurred, to see that event as having been predictable, despite our failure to predict it beforehand. Think of this bias as a way of editing your memory to fit the current situation. It is particularly pernicious because it leads us to genuinely believe we knew the outcome all along, even though we didn’t.
Where hindsight bias is not enough to expunge error, your brain fails to record your own mistakes thanks to the third bias, known as ‘the fundamental attribution error’. This leads us to see our own failings as caused by external factors beyond our control while seeing the failings of others as a reflection of their character. In other words, when we make mistakes, we always have a good reason (we were tired, rushed, not really thinking), but when other people do the same thing, it’s because there is something wrong with them. Tell me again how you feel about the proposition that Trump voters would re-elect him tomorrow?
I said I’d talk about three biases but there’s a fourth that goes hand in hand with these three. If you’re like most people, then you probably don’t think you’re like most people. As a result, you might be able to appreciate these biases intellectually, and to spot them in others. But it’s unlikely that you think they apply to you.
Well, guess what? That itself is a bias. It’s known as the ‘bias blind spot’. It occurs when we can recognise the impact of biases on the judgement of others but fail to see the impact of those same biases on our own judgement. The fact that we are often cognitively blind is no surprise, but the extent to which we can be blind to our own blindness should be.
This is why Feynman’s caution is so powerful and so important. For all of us, our ability to make sense of the world rests on what Daniel Kahneman called “an almost unlimited ability to ignore our ignorance”. Your challenge for 2018 is to work hard to notice and arrest that impulse. As the old adage puts it: Fool me once, shame on you. Fool me twice, shame on me.
– Carl Davidson, Research First