Outcomes and timeframes – with Annie Duke
In the last of three pieces based on our podcast conversation with US poker player and author Annie Duke, we focus on outcomes and timeframes in the context of decision-making
This time last year, here on The Value Perspective, we enjoyed an extended chat with Annie Duke, the professional poker player turned business consultant, whose 2018 book Thinking in Bets we have referenced in articles such as How to have no regrets as an investor and Lessons from the controversial 2015 Super Bowl. Her next book, How to Decide: Simple Tools for Making Better Choices, is due out this September.
More recently, we have been running a series of podcasts looking at how people from walks of life beyond investment approach decision-making in uncertain and complex environments. While our conversation with Duke was not originally intended to feature among these podcasts, we hope you will agree its discussion of probabilistic thinking, base rates and a focus on outcomes makes it an excellent addition to the series.
Having talked through the benefits of thinking in terms of probabilities, the importance of base rates and how to counter being ‘on tilt’, we turn to the subject of outcomes. As humans, we tend to judge any decision by whether its outcome is good or bad – something poker players call ‘resulting’ – so how, as investors, can we look to fight that instinct and avoid learning the wrong lessons from the way a decision turns out?
For Duke, the solution is “process, process, process”. “If we get our focus on process then, over the long run, the outcomes are going to come,” she says. “What’s interesting, however, is that if you focus on outcomes, it can be counterproductive. A big part of my next book is the ‘paradox of experience’, which is that we need experience to learn and yet any experience we have can interfere with learning as we tend to overrate it.
Change of focus
“So if we want to do well, we need to get our focus off outcomes – certainly short-term and individual outcomes – and start to focus on process and decision quality.” Unfortunately, this is not something human beings are very good at so Duke’s advice to investors is to work to change the outcome they are focusing or “pegging” on – though she concedes that is easier said than done.
Often, Duke says, she will meet managers of financial businesses who are at a loss to understand why, even though they put an emphasis on process over outcomes, those at the coalface – the traders, say – tend to be hugely outcome-oriented. “Process language matters insofar as it gets people thinking in the right space,” she adds, “but it means nothing if you are not also behaving as if the process matters to you.”
If a business calls a ‘post-mortem’ or an ‘all hands on deck’ meeting, Duke explains, the chances are it is because something negative has happened. “An unexpected loss, say, will have triggered a meeting to explore why the trade was made,” she says. “Was the trade good? Does our investment strategy need a tweak? And so on. What is interesting, though, is that when a trade overperforms, the same meeting doesn’t occur.
“But this is a symmetrical issue. In either case, to be so far off your forecast, there may be risks you didn’t see and you would want to explore that equally on both sides. Yet if what’s getting us ‘in the room’ is only a bad outcome, what you are telling people making decisions around you is ‘Be really afraid of bad outcomes as that’s all we care about around here’. You have inadvertently put a bunch of emphasis on downside outcome.”
Attitude to risk
Duke identifies a couple of issues that can flow from this, the first being that people’s attitude to risk can be distorted to the extent that, when faced with a choice between high risk and low risk, they will always pick the latter. Another issue is that people follow what is seen to be a consensus or a ‘company line’ so they have an excuse to fall back on if they do end up ‘in the room’.
A fixation with outcomes also means any post-mortem following a trading loss is unlikely to feature the very valid – if, on the face of it, odd – question: “Should we have lost more money?” “People are only asking how they could have lost less,” says Duke. “But sometimes we should have lost more – sometimes you explore and you realise your position was actually too small and you should have lost more.
“Likewise, when you win, then – assuming anyone is even asking questions – it will be: ‘How could we have done better?’ Nobody’s saying, ‘Maybe we should have won less’ – and yet, where you mis-assessed the risk, often you may have had too big a position and you should actually have won less. Or sometimes you win for a reason you weren’t expecting to and maybe you shouldn’t have had the position on at all.”
Striving to introduce this sort of symmetry into the analysis of decision-making, argues Duke, helps people understand it is not outcome that matters but process. “People will peg on outcome,” she explains. “They’re human beings. But they need to know you don’t want them worrying about win or loss so much – that’s going to come out in the long run. What you care about is how close the result is to what was expected.”
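Duke’s symmetric review rule – care not about win or loss but about how far the result lands from the forecast, in either direction – can be sketched in a few lines of Python. This is purely illustrative: the function name, trade fields and the 25% tolerance are our assumptions, not anything Duke specifies.

```python
# Illustrative sketch: hold the post-mortem whenever a trade's outcome
# deviates from its forecast by more than a tolerance -- in EITHER
# direction. Names and the 0.25 tolerance are hypothetical.

def needs_review(forecast_return: float, actual_return: float,
                 tolerance: float = 0.25) -> bool:
    """Flag the trade if |actual - forecast| exceeds the tolerance."""
    return abs(actual_return - forecast_return) > tolerance

trades = [
    {"name": "A", "forecast": 0.10, "actual": -0.30},  # big loss: review
    {"name": "B", "forecast": 0.10, "actual": 0.60},   # big win: ALSO review
    {"name": "C", "forecast": 0.10, "actual": 0.15},   # close to forecast: no meeting
]

for t in trades:
    if needs_review(t["forecast"], t["actual"]):
        print(f"Trade {t['name']}: far from forecast - hold the meeting")
```

The point of the symmetry is in the middle line: the unexpected win triggers exactly the same review as the unexpected loss, so nobody learns to fear only the downside.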
Influence of timeframes
Mention of the long run brings us to our final question. Following a value strategy means that, here on The Value Perspective, we have to be very patient and will only know how our decisions have turned out some years down the line. This, of course, is in stark contrast to the poker table, where you will discover how a hand turns out in a matter of minutes – so what influence do different timeframes have on thinking probabilistically?
For Duke, this can be seen as a question of data-gathering – in other words, figuring out how to get enough data to make a properly informed decision over an expanded timeframe. “When you’re getting a lot of feedback, if you can sit back and take time to aggregate, there is a lot more information to be had,” she says. “That’s clearly helpful so the question then becomes, how do I deal with this data problem?”
One approach Duke suggests is to work out “interim predictions” to peg against along the way, while another is to try and build up a better picture of the paths we did not take. “We tend to ignore all the positions we did not put on,” she explains. “But if you do not put many positions on, they take a long time to realise and you feel you don’t have a lot of data, then how do you start to really refine your model?”
Duke’s answer is to group investments into three categories. In addition to the ‘hits’ – the investments you actually made and will obviously be tracking – there are the ‘near-misses’ and the ‘clear-misses’. The former are the investments you came close to making but ultimately decided against and, to Duke’s mind, it is important to keep what she calls a “shadow book” of these tipping-point ‘yes’ versus ‘no’ decisions.
Track your misses
By the same token, Duke goes on to argue, investors should also look to keep track of their ‘clear-misses’ – that is, everything your process is telling you not to touch with a bargepole. Clearly there is a resource issue here, she acknowledges – especially if you are also tracking all your near-misses – but even a sample of your clear-misses should offer valuable insights.
“It’s incredibly helpful for understanding the world and your data because these are things your model very strongly predicted were not going to win,” Duke explains. “So, first of all, just confirming that is really helpful – but then, sometimes, a clear-miss will do really well. That could be because it’s a tail event or an outlier or whatever but sometimes it will tell you something really important that is going to help your model along.”
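As a rough illustration only – Duke describes the idea in words, not code – the three-category “shadow book” could be kept as simply as the sketch below. Every class, method and field name here is our own assumption about how one might implement it.

```python
# Hypothetical "shadow book": log every decision into one of three buckets --
# hits (positions taken), near-misses (tipping-point rejections) and
# clear-misses (strong 'no's) -- so omissions generate data just as
# commissions do. All names are illustrative.
from collections import defaultdict

class ShadowBook:
    CATEGORIES = ("hit", "near-miss", "clear-miss")

    def __init__(self):
        self._entries = defaultdict(list)

    def record(self, category: str, ticker: str, rationale: str) -> None:
        """File a decision, with its rationale, under one of the three buckets."""
        if category not in self.CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        self._entries[category].append({"ticker": ticker, "rationale": rationale})

    def review(self, category: str) -> list:
        """Return the decisions logged in a bucket, ready for a post-mortem."""
        return list(self._entries[category])

book = ShadowBook()
book.record("hit", "XYZ", "passed every screen; position taken")
book.record("near-miss", "ABC", "tipping-point 'no' on valuation")
book.record("clear-miss", "DEF", "model strongly predicted it would not win")

print(len(book.review("near-miss")))  # the omissions are now data too
```

The design choice is simply that rejected ideas are first-class records, not absences: a near-miss that subsequently soars, or a clear-miss that does really well, can then be reviewed with exactly the same machinery as a position actually taken.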
As well as generating more data on your process, says Duke, tracking these hits, near-misses and clear-misses can help change how people think. “In situations where, say, you are on tilt and don’t want to take on more risk or those tipping-point decisions or where you’re trying to stay out of ‘that room’ or where you know a firm is tracking the ‘no’s – the omissions,” she adds, “this extra data can help ‘equalise’ decision-making.
“And that starts to get people to focus on the right thing. Instead of quality of outcomes – did I win or lose? – they start to think more about their decision-making quality and accuracy. So trying to solve this data problem has this really good side-effect, where you get people to understand that an omission is the same as a commission.”