It has long been known that people generally prefer a bet with known odds to one where the odds themselves are uncertain. The classic example is a choice between two bets.
Imagine two urns. In the first urn there are 100 balls, 50 white and 50 red. If you reach into the urn and pull out a red ball, I’ll give you £100. In the second urn there are also 100 balls, but the proportion of red and white balls is unknown. If you reach in and pull out a red ball, I’ll still give you £100.
Assuming I’m not trying to cheat you (I’m not, honest), which bet would you rather take? When surveyed, people tend to take (or pay more to take) the first bet, even though, absent any further information, the two bets offer the same expected payoff. This preference is called ambiguity aversion, and apparently Keynes knew about it. It is the foundation of the Ellsberg paradox, which uses slightly more complex bets to expose a genuine paradox, rather than just a preference.
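For what it’s worth, you can check the “same expected payoff” claim with a quick simulation. This is only a sketch under one assumption of mine: that the unknown urn’s composition is drawn from a uniform prior over 0 to 100 red balls (any prior symmetric in red and white would give the same answer). The function names are mine, not from any reference.

```python
import random

def draw_red_known():
    """Urn 1: 50 red, 50 white, so P(red) is exactly 0.5."""
    return random.random() < 0.5

def draw_red_unknown():
    """Urn 2: unknown composition. Assumption: a uniform prior
    over 0..100 red balls (not part of the original bet)."""
    n_red = random.randint(0, 100)
    return random.random() < n_red / 100

def expected_payoff(draw, payout=100, trials=100_000):
    """Monte Carlo estimate of the expected payout in pounds."""
    return payout * sum(draw() for _ in range(trials)) / trials

print(expected_payoff(draw_red_known))    # ~£50
print(expected_payoff(draw_red_unknown))  # also ~£50
```

Both estimates come out at around £50, so a risk-neutral bettor should be indifferent; the observed preference for the first urn is what ambiguity aversion names.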
Why am I looking at this stuff? Well, I’ve an idea that this kind of thing might influence the way we look at abrupt climate change: particularly “low probability – high impact” events. Looking at that literature, there are very few studies that try to pin down the probability of such an event occurring. When they do try, they often end up with very wide bounds on the probability estimates, as the dynamics of the systems are often poorly known. In our recent review paper, we argued that it might be better to classify these as “high uncertainty – high impact” events. Nevertheless, these bounds (and best guesses) often come in with probabilities that it would be difficult to describe as “low”. I mean, would you get on a plane if there was a 1 in 10 (or 100, or 1000) chance of it crashing?
There might be lots wrong with these probability estimates, but assuming they are in the right ballpark, why is the literature on cost-benefit analysis, impact studies, adaptation and so on for abrupt climate change so thin? Because we are averse to the ambiguous? Because we (as scientists) don’t like to comment and speculate on risks that we feel we don’t fully understand? Because we don’t like to be seen to be crying wolf?
In a lovely paper, Fox and Tversky (1995) look at ambiguity aversion from a different perspective. Using a series of simple studies, they find that the effect tends to disappear if you don’t immediately contrast an ambiguous bet with an unambiguous one. If you ask one group of people to pay to bet on the first urn, and another group to bet on the second urn, they seem to pay about the same. Does this mean that if we want people to seriously consider abrupt climate change, they need to clear their minds of other, less uncertain issues that they might be focussed on?