One of the great aims of value investing is to take out of the equation the emotions – principally fear and greed – that can negatively influence the way human beings form judgements and make decisions. That is why The Value Perspective likes to highlight any new work in this area that bears the name of Dan Kahan, a professor of both law and psychology at Yale University.
In Open, not shut, case, for example, we looked at a paper of Kahan’s on the degree to which we see ourselves and what we believe can affect what we think we objectively know. Before that, in Tale of woe, we touched on some work of his into how people’s numerical skills can be affected when they are asked questions that conflict with firmly held beliefs.
Now Kahan has turned his attention to the way the US legal profession and, in particular, its judges reach decisions. Certainly the US public would appear to have some strong views here, with opinion polls suggesting some three-quarters of Americans believe judges – whether US Supreme Court justices or those at lower levels – base their decisions on their personal political views.
Kahan and his team do not share this view. This is partly because many Americans do not have a great understanding of the basics of their legal system – another survey, for example, found fewer than a third know US Supreme Court rulings cannot be appealed – but mainly because Kahan and the others do not believe “existing empirical evidence” offers “secure grounds for crediting it”.
Good social scientists that they are, they set about obtaining some – creating, as they put it, “a study to examine the impact of ideology on judicial reasoning”. Of the 1,554 participants in the resulting experiment, outlined in ‘Ideology’ or ‘Situation sense’?, 253 were judges, 225 were lawyers, 250 were law students and the rest were non-lawyers drawn from the general public.
The participants were assessed on their “cultural outlooks” – that is, “their worldviews or preferences for how society or other collective enterprises should be organised” – and then asked to judge certain cases. These offered some leeway for being decided in different ways on legal grounds as well as, importantly, enough scope for those cultural outlooks perhaps to play a part in the judgements reached.
In one case, for example, participants were asked to decide whether 400 10-gallon reusable plastic water containers left unattended in a wildlife preserve on the US-Mexico border constituted the offence of littering under US law. The defence of the group of defendants – and thus the legal ambiguity – was that the containers were intended for use and were periodically refilled.
The cultural ambiguity meanwhile came down to the defendants’ identity, with half the study’s subjects told they were activists, who hoped the water would be drunk by people crossing the desert to enter the US illegally. For the other half, the defendants were said to be construction workers, who would use the containers themselves while building a border fence designed to prevent illegal entry into the US.
Although no particular judgement was implied as to whether one group of defendants might be better or worse than the other, a significant number of participants apparently felt otherwise. According to Kahan and his team, this would depend on where they stood on the two axes shown in the chart below, which run from ‘Individualism’ to ‘Communitarianism’ and from ‘Hierarchy’ to ‘Egalitarianism’.
Source: The Cultural Cognition Project, ‘Ideology’ or ‘Situation Sense’? – Dan M. Kahan, David Hoffman, Danieli Evans, Neal Devins, Eugene Lucci and Katherine Cheng
Thus, for example, people who were disposed to be hierarchical and individualistic put the chance of a violation by the activists at 75% but the chance of a violation by the construction workers at less than 50%. For those with a more egalitarian and communitarian worldview, however, the reverse held and the construction workers were more likely to be seen as guilty of littering.
Clearly the facts of the matter did not change – only the person making the judgement. Now, those figures were for the study overall but what about the judges as a sub-group? Did they live down to the low opinion most Americans appear to have of them? Not at all. As a group, the judges held the view there was only a 25% chance of the containers counting as litter – regardless of the identity of the defendants.
In other words, the judges were capable of suppressing any inherent biases they might have and focusing only on the facts of the matter. Great news for truth, justice and the American way, you might think but not so fast – how do we square these findings with an academic study from 2006, Playing dice with criminal sentences, which does not seem to paint judges in such a positive light?
This time, judges were asked what sentence they would impose for a particular offence – but only after they had rolled dice that were, unknown to them, loaded. According to the paper, the median sentence imposed by judges who rolled a high number was eight months while, for those who rolled a low number, it was five months. The judges had, it would appear, ‘anchored’ their decision on the number they saw.
One way to rationalise the two studies is to recognise there are types of bias that can be overcome if you know you are venturing into an area where they might have an influence. On the other hand, if you are unaware the biases are out there, then there is a greater chance of your being subconsciously influenced.
This is ‘System One’ and ‘System Two’ territory – what we might effectively think of as, respectively, the subconscious and conscious parts of the human mind – which Nobel Prize-winning behavioural theorist Daniel Kahneman addressed in his 2011 book, Thinking, Fast and Slow, and which we have discussed on The Value Perspective in pieces such as Inherent risk.
Indeed, in his latest paper, Kahan mentions “rapid, unconscious, affective reactions” – which might cover the dice experiment – before going on: “Labelled ‘System 1’, this form of reasoning is an alternative to ‘System 2’ information processing, which is conscious, effortful and analytic, and which is understood to counteract the biases that the behavioural economics inventory comprises.”
In other words, ‘System 2’ decision-making – understanding, evaluating and rationalising the available evidence before acting – can help people suppress biases that may impair their judgement. Translated into an investment context, decisions should be less arbitrary if some quantitative systems are first put in place – something value investors already do by building up in advance objective benchmarks of what would be a ‘safe’ balance sheet, say, or indeed what constitutes ‘good’ or ‘poor’ value.
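To make that last point concrete, the logic of a pre-committed screen can be sketched in a few lines of code. This is purely illustrative – the thresholds below are hypothetical, not The Value Perspective’s actual criteria – but it captures the point: the benchmarks are fixed before any individual stock is examined, so ‘System 1’ reactions to a company’s name or story cannot move the goalposts.

```python
# Hypothetical, pre-committed benchmarks (all figures illustrative only)
MAX_DEBT_TO_EQUITY = 0.5   # a 'safe' balance sheet: gearing below 50%
MAX_PE_RATIO = 10.0        # 'good' value: price/earnings under 10x
MIN_INTEREST_COVER = 4.0   # operating profit at least 4x interest expense

def passes_screen(debt_to_equity: float, pe_ratio: float,
                  interest_cover: float) -> bool:
    """Return True only if a stock meets every benchmark set in advance.

    Because the thresholds were fixed before the stock was looked at,
    the decision cannot be swayed by how the investor feels about it.
    """
    return (debt_to_equity <= MAX_DEBT_TO_EQUITY
            and pe_ratio <= MAX_PE_RATIO
            and interest_cover >= MIN_INTEREST_COVER)

# A lowly geared, cheaply rated stock with comfortable cover passes...
print(passes_screen(debt_to_equity=0.3, pe_ratio=8.0, interest_cover=6.0))
# ...while a highly geared one fails, however compelling its story.
print(passes_screen(debt_to_equity=1.2, pe_ratio=8.0, interest_cover=6.0))
```

The design choice is the point: the judgement lives in the thresholds, which are debated and set calmly in advance, while the per-stock decision is reduced to a mechanical check – the ‘System 2’ work is done before emotion has anything to anchor on.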