Three excuses experts make to explain away wrong predictions

The future may be impossible to forecast. Ironically, though, the explanations experts tend to offer when their own pronouncements turn out to be inaccurate are all too predictable.


Ben Arnold


Investment Specialist, Equity Value

Teaching is increasingly described as an impossible job because of the ever-greater demands placed upon the men and women who do it.

Here on The Value Perspective, beyond our own experiences with schools, we are not exactly qualified to offer an opinion – except perhaps on one specific area.

When teachers are asked to predict exam grades for students applying to university, they really are being tasked with the impossible.

That, of course – as we have argued in the context of fields ranging from finance to sport, weather to politics and beyond – is because the future is impossible to predict.

Predicting the future is impossible 

As such, we were unsurprised to find this BBC article, which highlights a number of studies that suggest most of the A-level grades predicted for students by their teachers as an integral part of the UK’s university-application process turn out to be incorrect.

University admissions system operator Ucas, for example, says its most recent figures show predicted grades are usually higher than the actual results, with almost three-quarters (73%) of applicants performing less well than their teachers had forecast.

For its part, the University and College Union (UCU) points to research from 2016 that indicated just 16% of predictions for three A-levels or equivalent had proved accurate.

The UCU’s own analysis of university admissions systems across 30 developed countries found no others using this predicted-grade approach, which was described as “no longer fit for purpose” and in need of an “urgent overhaul”.

We shall see what happens – nothing came of a review that recommended change back in 2004 – but for now the UK’s beleaguered teachers are left with a system that literally asks the impossible.

Still, any teacher looking for a few crumbs of comfort in this matter might find them in the pages of Nassim Taleb’s classic The Black Swan: The Impact of the Highly Improbable (2008) – and specifically the chapter entitled ‘The Scandal of Prediction’.

Here, among other things, he considers the big explanations – excuses might be a better word – experts tend to offer when their predictions prove wrong.

Referencing the research of US psychologist Philip E Tetlock, which we have also discussed in articles such as Polls apart, Taleb highlights three common ways experts seek to explain away the awkward revelation that they might not be quite as expert as they thought they were.

Or “how they spun their own stories”, as he puts it – “mostly in the form of belief defence or the protection of self-esteem”.

#1: 'Playing a different game' 

The first big excuse is to “tell yourself you were playing a different game” – if you were a political pundit who failed to call an election result, for example, you might claim the result really came down to economic issues and, well, you are not an economist.

#2: 'The outlier'

Or you might “invoke the outlier” – in other words, you argue something happened that was outside the scope of your expertise. You know … something unpredictable.

#3: 'Close, but no cigar'

And if people are still looking at you with a raised eyebrow, you can always fall back on “the ‘almost right’ defence”.

As Taleb explains: “Retrospectively, with the benefit of a revision of values and an informational framework, it is easy to feel that it was a close call.”

Close but no cigar, then – if only it had not been for that one small turn of events …

None of which helps the universities, the applicants or the teachers, of course, but they do say that misery loves company.

And as Taleb concludes: “These ‘experts’ were lopsided: on the occasions when they were right, they attributed it to their own depth of understanding and expertise; when wrong, it was either the situation that was to blame, since it was unusual, or, worse, they did not recognise that they were wrong and spun stories around it.

"They found it difficult to accept that their grasp was a little short. But this attribute is universal to all our activities: there is something in us designed to protect our self-esteem.”


Ben Arnold


Investment Specialist, Equity Value

Ben joined Schroders in 2016 after spending three years as an analyst at the Royal Bank of Scotland. He moved into the Value team in January 2018 as an investment specialist after working for two years in Schroders' Distribution division. He is a CFA Charterholder and holds an MSc in Corporate Strategy from The University of Nottingham.

Important Information:

The views and opinions displayed are those of Nick Kirrage, Andrew Lyddon, Kevin Murphy, Andrew Williams, Andrew Evans, Simon Adler, Juan Torres Rodriguez, Liam Nunn, Vera German and Roberta Barr, members of the Schroder Global Value Equity Team (the Value Perspective Team), and other independent commentators where stated.

They do not necessarily represent views expressed or reflected in other Schroders' communications, strategies or funds. The Team has expressed its own views and opinions on this website and these may change.

This article is intended to be for information purposes only and it is not intended as promotional material in any respect. Reliance should not be placed on the views and information on the website when taking individual investment and/or strategic decisions. Nothing in this article should be construed as advice. The sectors/securities shown above are for illustrative purposes only and are not to be considered a recommendation to buy/sell.

Past performance is not a guide to future performance and may not be repeated. The value of investments and the income from them may go down as well as up and investors may not get back the amounts originally invested.