If you happened to catch Chernobyl on television recently, then like us you may well have been struck by the Alice in Wonderland-like logic of some of the most senior characters involved in the disaster – the conviction that this simply could not be happening because it was inconceivable it should ever happen.
Or as one character puts it: “The official position of the state is that global nuclear catastrophe is not possible in the Soviet Union.”
This is grimly illustrated early on in the miniseries, just after the core of Chernobyl’s number-four reactor ‘apparently’ blows up, setting in motion a chain of events that would give off twice the radiation of the Hiroshima bomb every single hour it went on.
The plant supervisor sends a junior colleague to see what has happened, and the colleague reports back that the core must have exploded because he can see graphite everywhere.
New facts, same opinion
The supervisor struggles to process the new information. ‘Reactors don’t explode,’ he reasons, so his colleague must have been mistaken about seeing any graphite. The facts fail to change his opinion on what he views as an impossibility.
To put it another way, the supervisor’s approach was the polar opposite of the sentiment famously attributed to economist John Maynard Keynes – “When the facts change, I change my mind. What do you do?”
It also embodies not one but two behavioural finance sins that investors are all too frequently guilty of committing.
The first relates to ‘confirmation bias’, which can see investors specifically seeking out information that supports their current view on companies and ignoring everything else.
An undiversified portfolio is one example of how an investor could be subconsciously harbouring an element of confirmation bias, as we pointed out in Instinctive biases.
The other behavioural finance sin the Chernobyl supervisor’s actions – or inaction – brought to mind is ‘narrative fallacy’, which we have discussed in articles such as Once upon a time.
This boils down to the idea that human beings love a story – and indeed they can become so attached to a particular narrative that, when presented with information that runs counter to it, they are strongly inclined to ignore that information.
As we noted in Tale of woe, this idea prompted a paper from a team of US academics led by Yale Law School professor Dan Kahan, which considered how people’s numerical skills can be affected when they are asked questions that conflict with firmly held beliefs.
These “disputed empirical issues”, as Kahan calls them – climate change, nuclear energy and gun control, for example – occupy a conspicuous place in US political debate.
So, for example, where the numbers contradicted their view on gun control, Kahan and his team found the most numerate test subjects were every bit as bad at answering the questions correctly as their less numerate counterparts.
On the other hand, where the numbers served to endorse their view on gun control – for example, liberal subjects who believed gun control reduced crime – the answers were almost 100% correct.
The paper highlights what Kahan dubs the “identity-protective cognition thesis”: “a self-sabotage of cognitive ability where it conflicts with a deeply-held belief”.
While such a description hardly rolls off the tongue, that should not prevent investors from recognising the very real risk to the health of their portfolios that comes from clinging to a particular narrative and thus failing to analyse fresh data objectively.