A Short List of Essential Reading for a World Full of Fake News


Charlie Jones once said that “you are the same today as you will be in five years, except for two things: the people you meet and the books you read. Choose both carefully”. At Research First we have always believed that critical thinking skills are essential to both thrive in business and fully participate in civic life.

If anything, those skills have become even more important given the batshit crazy world we find ourselves in today. Now, seemingly more than ever, the ability to think critically about how arguments are constructed, supported, and presented is an essential antidote to a world awash with bullshit (in Harry Frankfurt’s sense of the word).

Three of our favourite books for helping develop that antidote are Darrell Huff’s How to Lie with Statistics; John Allen Paulos’s A Mathematician Reads the Newspaper; and Cynthia Crossen’s Tainted Truth.

The first of these – How to Lie with Statistics – is now called ‘a classic’ but don’t let that put you off. As the entry on Wikipedia puts it, Huff’s book is “a brief, breezy, illustrated volume outlining common errors, both intentional and unintentional, associated with the interpretation of statistics”. One reason why Huff’s book is so easy to read and follow is that he was a journalist rather than a statistician. And, obviously, this book isn’t really about how to lie with statistics but about how to know when you’re being lied to with statistics. If you work with numbers and statistics (and who doesn’t?) you’ll love Huff’s book.

We’d be tempted to say ‘keep a copy handy whenever you read the newspaper’ if John Allen Paulos hadn’t beaten us to it with his book. A Mathematician Reads the Newspaper is just as ‘breezy’ as Huff’s and just as insightful. It shows how maths and numbers are central to many of the articles we read every day (Paulos takes stories that don’t seem to involve maths and – as Amazon puts it – ‘demonstrates how a lack of mathematical knowledge can hinder our understanding of them’). In the process, he demonstrates how maths and statistics are often abused in the support of bullshit and bluff.

Tainted Truth rounds out this collection by showing how sponsored studies have become a powerful tool of persuasion. These studies look like real science to the casual observer but they manipulate truth to reflect the intentions of their sponsors. Tainted Truth also shows that how an argument is presented and communicated can have a significant effect on how persuasive we find it (regardless of its real merits). One of the quotes we love in it says “everybody gets so much information all day long that they lose their common sense”. All three books here help us retain (or reclaim?) that common sense.

And, God knows, now more than ever, that’s something we all need. Check out these three books and let’s make critical thinking great again!

Cambridge Analytica and the Limits of Big Data


In case you missed it, the Cambridge Analytica scandal revolved around an innocuous-looking survey posted on Facebook that deceitfully collected users’ details. The scandal caught the attention of the world because Facebook’s “privacy” settings enabled Cambridge Analytica to also harvest the details of the friends of anyone who completed the survey. This meant Cambridge Analytica collected data from over 50 million users (and maybe as many as 87 million) to help in the company’s stated aim of ‘changing audience behaviour’.

Given that most of these millions of users had never given permission for their data to be accessed, most of the coverage of this scandal has focused on the ethical and legal transgressions involved. But other commentators have noted that the scandal provides a powerful lens into what we have given up in order to have access to the wonders of social media. Think of it as a contemporary Faustian bargain, just one where Mephistopheles is disguised in full hipster mode and Faust forgot to read the fine print. No-one has run this argument more persuasively or entertainingly than Bob Hoffman (of Ad Contrarian fame). To give you a taste, in a recent blog post Bob wrote:

We used to be able to dismiss Zuckerberg and his gang as greedy, silly brats with no perspective and no ethical compass. But he is far more dangerous than that.  

As important as these debates are, there is another side to the Cambridge Analytica story that needs to be better known. Long before it started its Facebook deceit, Cambridge Analytica boasted that “data drives all we do”. The company promised to be able to form ‘psychographic’ profiles from these data, and to use these psychological insights to better influence opinions, preferences, and behaviour. In this regard, Cambridge Analytica was hitching a ride on the wave of hype created around ‘big data’ and ‘data analytics’.

Because of the association with Steve Bannon and the Trump campaign, it’s easy to think that Cambridge Analytica delivered on that promise. But the evidence clearly points the other way. The company worked on campaigns for both Ted Cruz and Ben Carson (neither of which ended well), and was fired from Trump’s campaign too. According to CBS News, Trump’s campaign ended its relationship with Cambridge Analytica because the data it was supplying were of “suspect quality and value”. In other words, even with all those stolen data, Cambridge Analytica wasn’t able to change much audience behaviour.

Worse still, investigations by The Atlantic and others suggest that not even Cambridge Analytica believed its data could do what it was telling clients they could. An undercover investigation by Channel 4 News showed staff from Cambridge Analytica promoting the virtues of blackmail and bribery over bits and bytes as the way to change behaviour. As The Atlantic noted:

If the consulting firm’s “psychographic” modeling was really the key to winning campaigns, why would it even flirt with sketchier skullduggery?

As The Atlantic also notes, Cambridge Analytica found so many willing buyers for its psychographic claptrap because we’ve all become suckers for what Bob Hoffman calls “buffoon[s] with a Powerpoint and a bag full of clichés”. After all, it wasn’t so long ago the world was agog at how Obama’s campaign used data to drive microtargeting to influence voters.

All of which means the other important lesson in the Cambridge Analytica story is this: we should always remain skeptical about revolutionary techniques claiming to wring unique insights from data.

At Research First we love data, and we like our data big, but what matters is the quality of the science not the size of the data file. While we follow Celia Green’s counsel that “the way to do research is to attack the facts at the point of greatest astonishment”, we are also acutely conscious of Heisenberg’s warning that:

What we observe is not nature herself, but nature exposed to our method of questioning.

What the Cambridge Analytica scandal reminds us is that with big data, as with so much in research, we can be our own worst enemies. The real problem is not that the right answers are too hard but that the wrong answers are often too easy.

Paved With Good Intentions


We’ve all heard the aphorism that ‘the road to hell is paved with good intentions’. It warns us that, in trying to make something better, we often end up making it worse. In many ways this is another warning about the hubris that comes with believing we understand how the world around us truly works. In reality, the rules of science are clear that our understanding is always incomplete and open to revision. This is why George Box famously observed that while some of our models are ‘useful’, they are all ultimately ‘wrong’.

One beautiful illustration of this is provided by “Braess’s Paradox”. This began life as a mathematical model about traffic congestion. It shows that adding extra capacity to a network, when its users choose their routes selfishly, can reduce the overall performance of the network. In other words, attempting to improve congestion by adding more roads can actually make the congestion worse. Interestingly, this also suggests that you could reduce congestion by removing roads (it’s not called a ‘paradox’ for nothing). Subsequent experience in a number of large cities across the world demonstrates that this is exactly what happens. There’s a nice summary in the New York Times.
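
If you want to see the arithmetic behind the paradox, here’s a minimal sketch in Python of the standard textbook network. The numbers (4,000 drivers, the n/100 and 45-minute links) are the classic illustrative example, not data from any real city.

```python
# Braess's Paradox, textbook version: 4000 drivers travel from Start to End.
#
# Without the shortcut there are two routes:
#   Route 1: Start -> A (n/100 min, n = drivers on the link), then A -> End (45 min)
#   Route 2: Start -> B (45 min), then B -> End (n/100 min)

DRIVERS = 4000


def equilibrium_without_shortcut() -> float:
    # Selfish drivers split evenly: any imbalance makes the busier route
    # slower, so drivers switch until both routes cost the same.
    per_route = DRIVERS / 2
    return per_route / 100 + 45  # 2000/100 + 45 = 65 minutes


def equilibrium_with_shortcut() -> float:
    # Add a free shortcut A -> B. Start -> A costs at most 4000/100 = 40
    # minutes, which always beats the fixed 45-minute links, so every
    # selfish driver takes Start -> A -> B -> End and both
    # congestion-dependent links end up carrying all 4000 drivers.
    return DRIVERS / 100 + 0 + DRIVERS / 100  # 40 + 0 + 40 = 80 minutes


print(f"Without the new road: {equilibrium_without_shortcut():.0f} minutes per driver")
print(f"With the new road:    {equilibrium_with_shortcut():.0f} minutes per driver")
# The 'improvement' costs every driver an extra 15 minutes; removing the
# shortcut would make everyone better off, which is the paradox.
```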

But we said that Braess’s Paradox ‘began life’ as a mathematical model because it has subsequently become a description of all those cases where attempts to improve a situation end up making it worse. In particular, those cases where individuals’ choices end up leaving the group worse off. In this regard, it’s a neat formal demonstration of the good old-fashioned ‘tragedy of the commons’ problem.

Braess’s Paradox (and the Tragedy of the Commons) reminds us that we rarely know as much as we think we do. Research published this month in the Harvard Business Review notes that this is a particular problem for beginners. The authors of that study, Carmen Sanchez and David Dunning, call this “the beginner’s bubble”, reflecting how confidence builds much faster than competence when learning a new task (and just in case you are wondering, yes, that is the same David Dunning who gave his name to the Dunning-Kruger Effect).

As with most of the ways our brains play tricks on us, these biases and effects are much easier to see in others than in ourselves. You’re unlikely to spot them on your own, especially if you’ve been trained to see criticism as a barometer of failure. Yet we’d all be better off if we embraced Voltaire’s truism that “doubt may be an uncomfortable position but certainty is a ridiculous one”.


Time to Stop Fooling Yourself


One of my favourite quotes comes from the great physicist Richard Feynman, who cautioned his colleagues:

The first principle is that you must not fool yourself and you are the easiest person to fool.

Feynman was talking to other scientists about the practice of science, but his caution is just as relevant for communication practitioners. To understand why, consider the case of Donald Trump.

If you’re middle class and educated, you might still be struggling to understand how Trump became President. Or, like many of my own friends, you might still be enthralled by the daily updates of the fear and loathing coming out of the White House, eagerly anticipating the train wreck that is surely imminent.

But the more I think about this, the more I think about what Feynman said. And the surer I am that there is a larger lesson to be learned here.

Let’s start with the grim facts: according to the Pew Research Center, most of those people who voted for Trump in 2016 would likely vote for him again. As a result, Pew is convinced that, if the election were held again tomorrow, Trump would win again.

Before I go on, pause for a moment and consider your reaction to that.

Because here’s the important part of what I’m going to say: that reaction provides a powerful insight into how we are all wired to fool ourselves.

The good news is that, at some level, this is not really our fault. Evolution has wired all of us to be ‘cognitive misers’. That is, we all think about as little as we can get away with. This makes more sense when you realise that while our brains take up about 2% of our body weight, they burn at least 20% of the fuel that goes into our bodies. As a result, our brains take shortcuts wherever and whenever they can. In a very real way, our brains prefer processing fluency over accuracy.

This means that what we believe is ‘thinking’ is really the application of the same old well-worn ‘heuristics and biases’. It feels like we’re thinking but we’re mostly going through the motions. This is because those biases and heuristics are things we think with but only rarely about.

I think we need to spend more time in 2018 thinking about them. Now, there are too many biases to discuss here (you can read a fuller list at: http://researchfirst.co.nz/wordpress/wp-content/uploads/2017/06/RF_2017_Annual_v15P2-WEB.pdf) but I want to briefly outline three of the big ones. In the process, I want to show you why none of us is really as smart as we think we are.

Now you might think all of this explains what’s wrong with Trump and the confederacy of dunces he’s gathered around him but I’m actually talking about you. And me. And everyone else.

The first of the biases we need to confront, and one of the most powerful shortcuts our brains use to make sense of the world, is called confirmation bias. ‘Confirmation bias’ is, in David McRaney’s words:

A filter through which you see a reality that matches your expectations.

In other words, it’s the tendency to look for confirmation for our pre-existing ideas while ignoring any evidence that might disprove those ideas.

The evidence suggests we process confirming ideas about twice as fast as disconfirming ones. As a result, one of the most common reasons we all make mistakes is not because the right answers are too hard but because the wrong answers are too easy.

There isn’t room here to demonstrate this bias to you, but try this simple question: when did you last change your mind about anything really important?

One of the reasons it’s hard to spot ourselves using confirmation bias is because our brains tidy away our mistakes. The second of the biases I want to talk about here, ‘hindsight bias’, is a common way we do that.

Hindsight bias, also known as the ‘knew-it-all-along’ effect, is the inclination, after an event has occurred, to see that event as having been predictable. We do this all the time, despite failing to predict those events beforehand. Think of this bias as a way of editing your memory to fit the current situation. It is a particularly pernicious bias because it leads us to genuinely believe we knew the outcome all along, even though we didn’t.

Where hindsight bias is not enough to expunge error, your brain fails to record your own mistakes thanks to the third bias, known as ‘the fundamental attribution error’. This leads us to see our own failings as being caused by external factors beyond our control while seeing the failings of others as a reflection of their character. In other words, when we make mistakes, we always have a good reason (we were tired, rushed, not really thinking) but when other people do the same thing, it’s because there is something wrong with them. Tell me again how you feel about the proposition that Trump voters would re-elect him tomorrow?

I said I’d talk about three biases but there’s a fourth that goes hand in hand with these three. If you’re like most people, then you probably don’t think you’re like most people. As a result, you might be able to appreciate these biases intellectually, and to spot them in others. But it’s unlikely that you think they apply to you.

Well guess what? That itself is a bias. It’s known as the ‘bias blind spot’. This occurs when we can recognise the impact of biases on the judgement of others but fail to see the impact of those biases on our own judgement. The fact that we are often cognitively blind is no surprise, but the extent to which we can be blind to our own blindness should be.

This is why Feynman’s caution is so powerful and so important. For all of us, our ability to make sense of the world rests on what Daniel Kahneman called “an almost unlimited ability to ignore our ignorance”. Your challenge for 2018 is to work hard to notice and arrest that impulse. As the old adage puts it: fool me once, shame on you. Fool me twice, shame on me.

– Carl Davidson, Research First


Free to Choose?


If you’re like most people, you probably think you’re good at making decisions and pretty much always know what you want (and why). The evidence from psychology, on the other hand, points the other way.

For instance, Barry Schwartz’s The Paradox of Choice shows that the more choices we are faced with when making a decision, the slower we are to make that decision and the more unhappy we are with our choice (if you haven’t read the book, you can watch his TED talk instead).

Because Schwartz’s work flies in the face of common sense and classical economics (where more choice is always a good thing), his research has attracted its fair share of critics. In addition, attempts to replicate the jam experiment at the heart of Schwartz’s argument have not been an unqualified success. However, the notion that more choices slow down decision making has been demonstrated many times. It forms the basis of Hick’s Law, which states there is a logarithmic relationship between the number of options presented to someone and their reaction time.
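
To make that logarithmic relationship concrete, here’s a minimal sketch in Python. The constants a and b in it are purely illustrative assumptions; in practice they are fitted empirically for each person and task.

```python
import math

# Hick's Law: reaction time grows with the logarithm of the number of
# equally likely options. The usual form is RT = a + b * log2(n + 1),
# where the '+ 1' accounts for the extra 'option' of not responding
# at all. The a and b values below are illustrative, not measured.


def hicks_law_reaction_time(n_options: int, a: float = 0.2, b: float = 0.15) -> float:
    """Predicted reaction time in seconds for n equally likely options."""
    return a + b * math.log2(n_options + 1)


for n in (2, 4, 8, 16, 32):
    print(f"{n:2d} options -> {hicks_law_reaction_time(n):.2f} s")
# Each doubling of the options adds a roughly constant increment, so
# reaction time grows slowly rather than linearly. That is why the aim
# is to reduce choice rather than to remove it entirely.
```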

Hick’s Law is often used when designing control systems (‘user interfaces’) and, more recently, websites. Just like the heart of Schwartz’s argument, Hick’s Law tells us that the key when presenting options is not to remove choice but to reduce it.

But don’t think having fewer choices automatically means greater agency in our decision-making: one of the most useful insights from behavioural economics is that people don’t respond to choices so much as to how those choices are framed. Clever marketers know this, and so frame choices in ways that silently influence your decision making.

The most famous of these framing tricks is the so-called decoy price: the use of high-priced alternatives to reset your expectation of what a ‘reasonable’ price is. Restaurants don’t really expect to sell those $400 bottles of wine, but they use them to influence you to buy the $40 bottles (as an aside, always avoid the second-cheapest bottle on a wine list, because it is the one the owner knows you are most likely to buy, and it is often of lower quality than its price suggests).

Menus are a masterclass in the use of options to shape the choices you make. There are even menu engineers to help restaurants (and we are not making this up) “build value and increase profits through menus.” The lesson here is that the menu is trying to manipulate you and every little detail helps.

And the greater insight here is that ‘freedom to choose’ doesn’t always mean you’re choosing freely.


Warming Our Hands on a Dumpster Fire


The Sunk Cost Trap is another of those cognitive heuristics that can catch us all out. It describes how the more we have invested in something, the less willing we are to let that investment go. So instead of ‘cutting our losses’, we often find ourselves ‘doubling down’.

We do this literally with financial investments, but the same trap is present in other positions we hold. In other words, when we are faced with negative outcomes from a decision we have made, we are likely to escalate our commitment to that decision. This is totally irrational, but then so is much of what really happens in our heads.

If you have any interest in global politics then you’re probably already thinking that the notions of sunk costs and escalation of commitment are useful in understanding why support for Trump remains strong among his admirers. How strong, you ask? According to a poll published recently by Reuters, 85 percent of those who voted for Trump in 2016 said they would do so again.

These cognitive traps also explain why fact-checking Trump’s most fanciful claims makes no difference to those supporters. Or, as Vox put it, “Trump supporters know Trump lies – they just don’t care”.

In this regard Trump’s presidency is a remarkable gift to social science, using the largest stage in the world to show off one cognitive bias after another.

But the uncomfortable truth is that we are all Donald Trump to some extent. Like him and his supporters, we are unlikely to change our minds when facts contradict our own treasured assumptions. Just like him, we’re all confident idiots at heart.

It’s just that some of us are more confident (and more idiotic) than others.


Wouldn’t You Rather Be Fishing?


How many hours do you spend at work each week?

Data from the OECD suggest that New Zealanders are a hard-working bunch, with an ‘average’ full-time work week of 43.3 hours. But at the same time, StatsNZ data show that the proportion of people working 50 hours or more a week has fallen over the last 15 years. Indeed, the data show that New Zealanders tend to be working less than they were in 2001.

Of course, what we do and how we feel about it are often very different. Social scientists have spent quite a bit of time working out why so many of us feel like we are working harder when we don’t seem to be. We’ve talked about that research elsewhere, but the short version is that the number of choices we have about how we use our time influences how we feel about that time.

But here’s where tracking working hours gets interesting: the international research shows that reducing working hours will probably increase your productivity. The economies that are held up as powerhouses of productivity, such as Germany’s, demonstrably work fewer hours than their competitors. Similarly, many of history’s most famously productive people worked remarkably few hours.

The argument for working four hours a day makes sense from a psychological point of view. As does the one for having three-day weekends. This is because you only have so much ‘cognitive bandwidth’ available to work with, and when you focus on one thing you have much less left over to focus on anything else.

Eldar Shafir and Sendhil Mullainathan call this problem ‘tunneling’, and highlight how we tend to do more of it when we are stressed or time-poor. The irony here is that we try to work our way through periods of stress by working longer hours. But, as with gambling, that means chasing losses we’re never going to win back.

In contrast, working smarter seems to mean both working fewer hours (so we can spend more time recovering and leave more room for serendipity) and getting smarter about how we structure those hours. The research here is also clear: multitasking is a myth (and one that is exploiting you), and we have found ourselves in a world where office designs are perfectly suited to enabling the kind of interruptions that ravage our productivity.

We’d love to talk more about it but we’re off to go fishing …