Framing Effect Bias – Part 3

In my last two articles we looked at framing biases: specifically, how the language that is used, and its level of aggregation, can significantly affect the way we process information.

Both articles were written in the last week, so they have been at the forefront of my mind when punting. I’ve been betting on US politics – specifically Donald Trump’s exit date – and have been careful not to over-interpret the highly political language used around the Russia investigation, and to rely on longer-term, aggregated ratings of his job approval rather than daily polls that are potentially politically biased and oversensitive to whatever Twitter feud is dominating the headlines in any given hour.

However, I’ve been much less aware of my tendency towards overconfidence when betting on politics (I’ve got a degree in it, so why shouldn’t I be confident?), as well as my tendency to over-weight certain sources of authority (I think talk-show host Seth Meyers is brilliant, so I tend to agree with whatever he says).

But then it’s been nearly a year since I wrote about the overconfidence effect and our tendency to obey authority, so why would I be thinking about them?

Like all humans, my brain tends towards the recent, and this recency bias can lead to poor judgement.

University of Cambridge economist Ha-Joon Chang neatly explores this when he asks students to choose the more important invention: the internet or the washing machine. Most, as you’d expect, choose the internet. This is because, Chang argues, humans tend to overvalue the significance of more recent events: it feels as if the internet has revolutionised our lives, but compared to household appliances, which allowed a whole gender the time to find paid employment outside the home, the small efficiencies the web allows us seem insignificant.

This tendency to weigh recent information more heavily has been repeatedly demonstrated by psychologists (who refer to it as the “availability heuristic”), and it’s important to recognise this in-built bias if we are to be better punters.

Markets – in betting and in finance – are often skewed because of it. In 2010, after the BP oil spill in the Gulf of Mexico, the company’s share price dropped to less than half of its pre-spill value, as traders overemphasised news of the spill relative to the context of the company as a whole.

Now, it’s easy to see this with the benefit of hindsight – at the time, there were suggestions that regulatory fines would be so punitive that the company might face bankruptcy – but it serves as a reminder that new data is no more important or valuable than that which already existed.

So, when a rugby team unexpectedly lose, or a horse produces a career-best performance, or a tennis player bombs out in the first round, or Donald Trump takes to Twitter to attack a Republican senator, we shouldn’t overreact. The information is important – of course it is – but it needs to be assessed in a measured way as part of everything you know, rather than being given special prominence.

The medical profession is well aware that the order in which information is framed can bias us towards certain pieces of it.

Let's Look At Another Example

A son brings his ageing father to the hospital and says his dad is having a heart attack. The patient is triaged and prioritised as having a suspected heart attack. A junior doctor orders tests to confirm the diagnosis. They then present the results to a consultant, who is unsure, as several of the tests are inconclusive. A final test is more convincing, though, and the consultant treats the patient as if they are having a heart attack.

In the scenario, four errors in thinking occur.

First, the consultant is guilty of the same recency bias that Ha-Joon Chang outlines with his washing machine versus the internet dilemma. The last medical test should be objectively assessed in conjunction with all the other tests, but, in this case, it is given greater prominence.

Second, the medical professionals are swayed by the initial assumption that the man is having a heart attack, and this directs much of the action they take. This is known as the primacy effect, or anchoring, where the first piece of information we receive is given greater prominence in our thinking. Although subsequent information may cause us to adjust our thinking, we tend to only adjust it in relation to the initial anchor, but find it very hard to rid ourselves of the anchor altogether.

The cognitive bias of anchoring has been repeatedly demonstrated in different environments, from how much we are prepared to pay for goods in negotiation to how we judge other people. Most stark, though, is the work of University of Virginia researchers, Wilson et al, who found that, even when participants were specifically told about anchoring and the effect it would have on their judgement, their estimates of how many doctors lived locally were still dramatically affected by how large or small the initial anchor was that they were given.

Third, the junior doctor exhibits a confirmation bias, a desire in all of us to corroborate what we already believe to be true, which we have explored in more detail elsewhere on this site.

And fourth, the consultant undervalues the tests that were done in the middle of the process of diagnosis. Work done on memory, specifically on what is known as the serial position effect, suggests this may be because we struggle to recall information in the middle of a list. In the case of the consultant, they have clear recall of both the anchor and the most recent test, but the results and significance of those interim tests, which have now been supplanted in their working memory, cease to be given the same weight as when they were causing the consultant to doubt the initial diagnosis.

Back To Betting

These information-order effects are crucial for punters to be aware of. If a cricket match is billed as a grudge match between one team’s star batsman and the other team’s star bowler, and this discussion dominates much of the pre-match media coverage, that framing can easily become the anchor by which we assess the likely outcome of the match. If we then go on to look at a wider range of information, only for it to be supplanted by late news that the pitch is deteriorating and will favour the bowler, we can easily make judgements that are wildly distorted from reality.

As with most cognitive biases, implementing fixed, repeatable processes by which we bet is our best defence against these kinds of framing errors. Medical professionals are encouraged, when presenting any case to a colleague, to vary the order in which they provide the information they have, and would-be successful punters can borrow the technique.

One way to do this is to write out the information upon which you are basing any bet in a circle, in the same way that you might write down the letters of an anagram when trying to solve it. This information should be both positive (data encouraging you to make the bet) and negative (data discouraging you from making it). Once done, run through the reasoning for your bet several times, starting at a different place in the circle and heading in a different direction each time. If you consistently find the bet to be good value, no matter your starting point or direction of travel, you can be more confident that your thinking is clear and sound.
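The circle technique can even be sketched in code. In this toy model (the factor names, weights, and the anchor/recency multipliers are all invented for illustration), a judgement that exaggerates the first and last pieces of information it considers flips its verdict depending on where on the circle it starts – exactly the instability the technique is designed to expose:

```python
# A toy sketch of the 'circle' technique. Positive weights encourage
# the bet, negative weights discourage it; all values are invented.
factors = {
    "star batsman in form": +2,
    "pitch deteriorating": -3,
    "opponent missing a bowler": +1,
    "poor head-to-head record": -1,
}

def biased_verdict(ordering, anchor_boost=2.0, recency_boost=1.5):
    """Sum the weights, but exaggerate the first item seen (anchoring)
    and the last item seen (recency), as the article describes."""
    total = 0.0
    for i, name in enumerate(ordering):
        weight = factors[name]
        if i == 0:
            weight *= anchor_boost
        elif i == len(ordering) - 1:
            weight *= recency_boost
        total += weight
    return total > 0  # True = the bet looks like value

names = list(factors)
verdicts = set()
for start in range(len(names)):
    for step in (1, -1):
        # Start at a different point on the circle, travel both ways.
        ordering = (names[start:] + names[:start])[::step]
        verdicts.add(biased_verdict(ordering))

# If the verdict changes depending on where we begin, our reasoning is
# being driven by information order, not by the evidence itself.
print("order-sensitive:", len(verdicts) > 1)
```

Note that an unbiased reading (a plain sum of the weights) gives the same answer regardless of order; it is only the anchoring and recency boosts that make the conclusion unstable, which is why cycling the circle catches them.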

Another benefit of this technique, especially for punters who are new to trying to make the game pay, is that it illuminates the way your mind works – in particular, when it can be trusted and when it can’t. This is likely to make you more sceptical of your own ability, and more cautious as a result, which, as we will see in an upcoming article, is no bad thing.


As a passionate sports fan and punter, Jack has written about sports and betting for over a decade, winning the Martin Wills Award for racing journalism in 2002 and writing Winning on Betfair for Dummies, first published in 2006 and now in its second edition, having sold over 35,000 copies in two languages.
