Paved With Good Intentions


We’ve all heard the aphorism that ‘the road to hell is paved with good intentions’. It warns us that, in trying to make something better, we often end up making it worse. In many ways this is another warning about the hubris that comes with believing we understand how the world around us truly works. In reality, the rules of science are clear that our understanding is always incomplete and open to revision. This is why George Box famously observed that while some of our models are ‘useful’, they are all ultimately ‘wrong’.

One beautiful illustration of this is provided by “Braess’s Paradox”. This began life as a mathematical model of traffic congestion. It shows that adding extra capacity to a network can, when users choose their own routes, reduce the network’s overall performance. In other words, attempting to improve congestion by adding more roads can actually make the congestion worse. Interestingly, this also suggests that you could reduce congestion by removing roads (it’s not called a ‘paradox’ for nothing). Subsequent experience in a number of large cities across the world demonstrates that this is exactly what happens. There’s a nice summary in the New York Times here:

But we said that Braess’s Paradox ‘began life’ as a mathematical model because it has subsequently become a description of all those cases where attempts to improve a situation end up making it worse. In particular, those cases where individuals’ choices leave the group worse off. In this regard, it’s a great scientific demonstration of the good old-fashioned ‘tragedy of the commons’ problem.

Braess’s Paradox (and the Tragedy of the Commons) reminds us that we rarely know as much as we think we do. Research published this month in the Harvard Business Review notes that this is particularly a problem for beginners. The authors of that study, Carmen Sanchez and David Dunning, call this “the beginner’s bubble”, reflecting how confidence builds much faster than competence when learning a new task (and just in case you are wondering, yes that is the same David Dunning who gave his name to the Dunning-Kruger Effect).

As with most of the ways our brains play tricks on us, these biases and effects are much easier to see in others than in ourselves. You’re unlikely to spot them on your own, or if you’ve been trained to see criticism as a barometer of failure. Yet we’d all be better off if we embraced Voltaire’s truism that “doubt may be an uncomfortable position but certainty is a ridiculous one”.



Time to Stop Fooling Yourself


One of my favourite quotes comes from the great physicist Richard Feynman, who cautioned his colleagues:

The first principle is that you must not fool yourself, and you are the easiest person to fool.

Feynman was talking to other scientists about the practice of science, but his caution is just as relevant for communication practitioners. To understand why, consider the case of Donald Trump.

If you’re middle class and educated, you might still be struggling to understand how Trump became President. Or, like many of my own friends, you might still be enthralled by the daily updates of the fear and loathing coming out of the White House, eagerly anticipating the train wreck that is surely imminent.

But the more I think about this, the more I think about what Feynman said. And the surer I am that there is a larger lesson to be learned here.

Let’s start with the grim facts: According to the Pew Research Center, most of those people who voted for Trump in 2016 would likely vote for him again. As a result, Pew is convinced that, if the election were held again tomorrow, Trump would win again.

Before I go on, pause for a moment and consider your reaction to that.

Because here’s the important part of what I’m going to say: that reaction provides a powerful insight into how we are all wired to fool ourselves.

The good news is that, at some level, this is not really our fault. Evolution has wired all of us to be ‘cognitive misers’. That is, we all think about as little as we can get away with. This makes more sense when you realise that while our brains take up about 2% of our body weight, they burn at least 20% of the fuel that goes into our bodies. As a result, our brains take shortcuts wherever and whenever they can. In a very real way, our brains prefer processing fluency over accuracy.

This means that what we believe is ‘thinking’ is really the application of the same old well-worn ‘heuristics and biases’. It feels like we’re thinking but we’re mostly going through the motions. This is because those biases and heuristics are things we think with but only rarely about.

I think we need to spend more time in 2018 thinking about them. Now, there are too many biases to discuss here (you can read a fuller list at: ), but I want to briefly outline three of the big ones. In the process, I want to show you why none of us is really as smart as we think we are.

Now you might think all of this explains what’s wrong with Trump and the confederacy of dunces he’s gathered around him but I’m actually talking about you. And me. And everyone else.

The first of the biases we need to confront, and one of the most powerful shortcuts our brains use to make sense of the world, is called confirmation bias. ‘Confirmation bias’ is, in David McRaney’s words:

A filter through which you see a reality that matches your expectations.

In other words, it’s the tendency to look for confirmation for our pre-existing ideas while ignoring any evidence that might disprove those ideas.

The evidence suggests we process confirming ideas about twice as fast as dis-confirming ones. As a result, one of the most common reasons we all make mistakes is not because the right answers are too hard but because the wrong answers are too easy.

There isn’t room here to demonstrate this bias to you, but try this simple question: when did you last change your mind about anything really important?

One of the reasons it’s hard to spot ourselves using confirmation bias is because our brains tidy away our mistakes. The second of the biases I want to talk about here, ‘hindsight bias’, is a common way we do that.

Hindsight bias, also known as the ‘I-knew-it-all-along effect’, is the inclination, after an event has occurred, to see the event as having been predictable. We do this all the time, despite having failed to predict those events beforehand. Think of this bias as a way of editing your memory to fit the current situation. This is a particularly pernicious bias because it leads us to genuinely believe we knew the outcome all along, even though we didn’t.

Where hindsight bias is not enough to expunge error, your brain fails to record your own mistakes thanks to the third bias, known as ‘the fundamental attribution error’. This leads us to see our own failings as being caused by external factors beyond our control while seeing the failings of others as a reflection of their character. In other words, when we make mistakes, we always have a good reason (we were tired, rushed, not really thinking) but when other people do the same thing, it’s because there is something wrong with them. Tell me again how you feel about the proposition that Trump voters would re-elect him tomorrow?

I said I’d talk about three biases but there’s a fourth that goes hand in hand with these three. If you’re like most people, then you probably don’t think you’re like most people. As a result, you might be able to appreciate these biases intellectually, and to spot them in others. But it’s unlikely that you think they apply to you.

Well guess what? That itself is a bias. It’s known as the ‘bias blind spot’. This occurs when we can recognise the impact of biases on the judgement of others but fail to see the impact of those same biases on our own judgement. The fact that we are often cognitively blind is no surprise, but the extent to which we can be blind to our own blindness should be.

This is why Feynman’s caution is so powerful and so important. For all of us, our ability to make sense of the world rests on what Daniel Kahneman called “an almost unlimited ability to ignore our ignorance”. Your challenge for 2018 is to work hard to notice and arrest that impulse. As the old adage puts it: Fool me once, shame on you. Fool me twice, shame on me.

– Carl Davidson, Research First


Free to Choose?


If you’re like most people, you probably think you’re good at making decisions and pretty much always know what you want (and why). The evidence from psychology, on the other hand, points the other way.

For instance, Barry Schwartz’s The Paradox of Choice shows that the more choices we are faced with when making a decision, the slower we are to make that decision and the more unhappy we are with our choice (if you haven’t read the book you can watch the TED talk here).

Because Schwartz’s work flies in the face of common sense and classical economics (where more choice is always a good thing), his research has attracted its fair share of critics. In addition, attempts to replicate the jam experiment at the heart of Schwartz’s argument have not been an unqualified success. However, the notion that more choices slow down decision making has been demonstrated many times. It forms the basis of Hick’s Law, which states there is a logarithmic relationship between the number of options presented to someone and their reaction time.
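The logarithmic relationship Hick's Law describes can be sketched in a few lines of code. The coefficients below are illustrative placeholders, not empirical values; the point is the shape of the curve, where each doubling of the options adds only a constant increment to reaction time:

```python
import math

def hick_reaction_time(n_choices, a=0.2, b=0.15):
    """Estimate reaction time (seconds) under the Hick-Hyman law.

    RT = a + b * log2(n + 1), where `a` is a base response time and
    `b` is the per-bit processing cost. The +1 accounts for the extra
    uncertainty of whether to respond at all. The default values of
    `a` and `b` here are assumptions for illustration only.
    """
    return a + b * math.log2(n_choices + 1)

# Each (roughly) doubling of the options adds a constant 0.15s,
# rather than doubling the reaction time:
for n in (1, 3, 7, 15):
    print(n, round(hick_reaction_time(n), 3))
```

This is why trimming a menu of options from fifteen to seven buys less time than you might expect, while trimming from three to one buys proportionally more.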

Hick’s Law is often used when designing control systems (‘user interfaces’) and, more recently, websites. Just like the heart of Schwartz’s argument, Hick’s Law tells us that the key when presenting options is not to remove choice but to reduce it.

But don’t think having fewer choices automatically means greater agency in our decision-making: one of the most useful insights from behavioural economics is that people don’t respond to choices so much as to how those choices are framed. Clever marketers know this and so frame choices in ways that silently influence your decision making.

The most famous of these is the so-called decoy price: the use of high-priced alternatives to reset your expectation of what ‘reasonable’ prices are. Restaurants don’t really expect to sell those $400 bottles of wine but they use them to influence you to buy the $40 bottles (as an aside, always avoid the second-cheapest bottle on a wine list, because this is the one the owner knows you are most likely to buy, and it is often lower quality than the price suggests).

Menus are a masterclass in the use of options to shape the choices you make. There are even menu engineers to help restaurants (and we are not making this up) “build value and increase profits through menus”. The lesson here is that the menu is trying to manipulate you and every little detail helps.

And the greater insight here is that ‘freedom to choose’ doesn’t always mean you’re choosing freely.


Warming Our Hands on a Dumpster Fire


The Sunk Cost Trap is another of those cognitive heuristics that can catch us all out. It describes how the more we have invested in something, the less willing we are to let that investment go. So instead of ‘cutting our losses’, we often find ourselves ‘doubling down’.

We literally do this with financial investments but it is also present in other positions we hold. In other words, when we are faced with negative outcomes from a decision we have made, we are likely to escalate our commitment to that decision. This is totally irrational, but then so is much of what really happens in our heads.

If you have any interest in global politics then you’re probably already thinking that the notions of sunk costs and escalation of commitments are useful in understanding why support for Trump remains strong among his admirers. How strong, you ask? According to a poll published recently by Reuters, 85 percent of those who voted for Trump in 2016 said they would do so again.

These cognitive traps also explain why fact-checking Trump’s most fanciful claims makes no difference to those supporters. Or, as Vox put it, “Trump supporters know Trump lies – they just don’t care”.

In this regard Trump’s presidency is a remarkable gift to social science, using the largest stage in the world to demonstrate one cognitive bias after another.

But the uncomfortable truth is that we are all Donald Trump to some extent. Like him and his supporters, we are unlikely to change our minds when the facts contradict our own treasured assumptions. Just like him, we’re all confident idiots at heart.

It’s just that some of us are more confident (and more idiotic) than others.


Wouldn’t You Rather Be Fishing?


How many hours do you spend at work each week?

Data from the OECD suggest that New Zealanders are a hard-working bunch, with an ‘average’ full-time work week of 43.3 hours. But at the same time, StatsNZ data show that the proportion of people working 50 hours or more a week has reduced over the last 15 years. Indeed, the data show that New Zealanders tend to be working less than they were in 2001.

Of course, what we do and how we feel about it are often very different. Social scientists have spent quite a bit of time working out why so many of us feel like we are working harder when we don’t seem to be. We’ve talked about that research elsewhere but the short version is that the number of choices we have about how we use our time influences how we feel about that time.

But here’s where tracking working hours gets interesting: the international research shows that reducing working hours will probably increase your productivity. The economies that are held up as powerhouses of productivity, such as Germany’s, demonstrably work fewer hours than their competitors. Similarly, many of history’s most famously productive people worked very few hours a day.

The argument for working four hours a day makes sense from a psychological point of view. As does the one for having three-day weekends. This is because you only have so much ‘cognitive bandwidth’ available to work with, and when you focus on one thing you have much less left over to focus on something else.

Eldar Shafir and Sendhil Mullainathan call this problem ‘tunneling’, and highlight how we tend to do more of it when we are stressed or time-poor. The irony here is that we try to work our way through periods of stress by working longer hours. But, as with gambling, that means chasing losses we’re never going to win back.

In contrast, working smarter seems to involve both working fewer hours (so we can spend more time recovering and leave more room for serendipity) and getting smarter about how we structure those hours. The research here is also clear: multitasking is a myth (and one that is exploiting you), and we have found ourselves in a world where office designs are perfectly suited to enable the kind of interruptions that ravage our productivity.

We’d love to talk more about it but we’re off to go fishing …




Caveat Lector!


You probably know that advertisements are carefully crafted to influence how you think, feel, or act. But the subtlety of the deception used in ads is often easy to overlook.

For instance, ads for therapeutic products that claim their product ‘may’ prevent, reduce, or slow the signs of a condition are really signalling that they can’t prove any effect (because products that claim healing properties are obliged to demonstrate those effects scientifically).

But you also need to be on your guard whenever you hear an ad that makes reference to a product being ‘recommended’ by professionals. The problem for consumers here is that ‘recommend’ and ‘prefer’ sound like synonyms, while for advertisers they remain distinct (and, presumably, their assumption is that most people will not notice the difference).

Colgate ran into trouble in the UK for using precisely this technique. They ran ads claiming that “80 per cent of dentists recommend” their toothpaste, but failed to mention that those dentists recommended more than one brand of toothpaste. So while it was true that ‘80 per cent of dentists recommend Colgate’, it was certainly not the case that they preferred Colgate. In fact, it turned out that those dentists were just as likely to recommend a competitor’s product.

In cases like this, it is as important to think about what the ad is not saying as what it does. Jeffrey Schrank’s The Language of Advertising Claims remains one of the best things written about this topic, and is particularly good at identifying the ‘weasel words’ advertisers often use. These include “helps,” “virtually,” “acts,” “can be,” “up to,” “refreshes,” “comforts,” “fights,” “the feel of,” “the look of,” “fortified,” “enriched,” and “strengthened.” The whole point of these is to modify the claim that follows (and, indeed, to empty that claim of any real meaning) but to be subtle enough that most consumers won’t notice them.

Interestingly, we are all familiar with the caution ‘caveat emptor’ (‘let the buyer beware’) but the work of Schrank and others reminds us that we also need ‘caveat lector’ (‘let the reader beware’). Because, as H. G. Wells warned us long before colour television, advertising is a form of legalised lying.


The Secret to Changing Behaviour


A great deal of our work at Research First is about changing behaviour. But as anyone who works in this field will tell you, getting people to change their behaviour is hard.

For some sense of just how hard it is, think about the last time you tried to change one of your own habits. How are those New Year’s resolutions working out for you? According to one measure, for two-thirds of us those resolutions don’t even make it to February intact. If we struggle to change our own behaviour, how can we hope to encourage others to change theirs?

The answer seems to be ‘by planning what we’ll do when the initial motivation runs out’. Another way to put this is that the secret to effecting lasting change is with what are called implementation intention plans.

As the name suggests, these outline how you plan to implement your intended change. As this article points out, they are if-then plans that spell out in advance how you will strive for a set goal.

These plans are particularly useful for changing health behaviours, and for local councils that want to get more people using public transport. The Travel Smart behaviour change approach is built on implementation intention planning. It involves working with households to help them plan their weekly transport use to reduce reliance on private cars. The weekly plans outline clear implementation intentions, as well as what the participants will do if those plans cannot be realised.
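The structure of an if-then plan with a fallback can be captured in a few lines of code. This is a hypothetical sketch (the class name, fields, and example plan are our own invention, not part of any Travel Smart tooling), but it makes the anatomy of an implementation intention concrete: a cue, a committed action, and a pre-agreed fallback for when the action is impossible:

```python
from dataclasses import dataclass

@dataclass
class ImplementationIntention:
    """A hypothetical if-then plan: a triggering cue, the intended
    action, and a fallback for when the plan cannot be realised."""
    cue: str        # the 'if' situation that triggers the plan
    action: str     # the 'then' behaviour committed to in advance
    fallback: str   # what to do if the action is impossible

    def describe(self) -> str:
        return (f"If {self.cue}, then I will {self.action}; "
                f"if I can't, I will {self.fallback}.")

plan = ImplementationIntention(
    cue="it is 8am on a weekday",
    action="cycle to work",
    fallback="take the bus instead of driving",
)
print(plan.describe())
```

The design point is that the decision is made once, in advance, so that in the moment there is nothing left to decide, only a cue to notice.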

Implicit in the success of implementation intention plans is the notion that ‘motivation is over-rated‘. Or, as Chuck Close put it, “inspiration is for amateurs”.

“Inspiration is for amateurs. The rest of us just show up and get to work” – Chuck Close

Along with planning what to do once the motivation has run out, implementation intention plans help track progress towards the desired goal. By breaking down larger goals into smaller achievements, and enabling you to measure each one, the plans provide a clear roadmap towards your goal.

In this regard, the plans provide the process. As Deming said in a different context:

If you can’t describe what you are doing as a process, you don’t know what you’re doing – W. Edwards Deming

And the best thing about having a process? It means that even when you don’t know what you’re doing, you still know what to do next.
