The standard Bayesian account of rational belief and decision in terms of probability is often argued to be unable to cope properly with severe uncertainty, of the sort ubiquitous in some areas of policy making. Confidence in Beliefs and Rational Decision Making (to appear in the July 2019 issue of the journal Economics and Philosophy; pre-print available here) asks what should replace it as a guide for rational decision making. It provides the most comprehensive defense to date of an account of rational belief and decision that reserves a role for the decision maker’s confidence in beliefs (previously endorsed here and here). Beyond being able to cope with severe uncertainty, the account has strong normative credentials on the main fronts typically invoked as relevant for rational belief and decision. It fares particularly well in comparison to other prominent non-Bayesian models.
Reports by the IPCC employ an evolving framework of calibrated language for assessing and communicating degrees of certainty in scientific findings. One challenge for this framework has been ambiguity in the relationship between multiple degree-of-certainty metrics. A new paper in the journal Climatic Change aims to better systematize the interrelation between the IPCC’s probability language and their confidence language, with benefits for consistency among findings and for usability in downstream modeling and decision analysis. [link to the article]
All decision making involves figuring out what decision needs to be made, what your options are, and what information you need in order to evaluate those options. These activities are a part of what’s called the framing, or structuring, of decisions. A new paper in the journal Topoi argues that this aspect of decision making is even more important where uncertainty is severe, and that improving decision making and policy analysis under severe uncertainty requires better integration of knowledge about good framing practices and knowledge about other aspects of decision making, such as weighing the options. [link to the article]
Environmental modelling researchers use a range of technical approaches to manage and characterise uncertainties in their models and in the conclusions they draw from them. In addition, researchers also communicate informally about uncertainty in the text of the journal articles in which they report their findings. This informal communication is a part of the ‘craft’ of research, and shapes the way that readers — including referees, editors, researchers and decision makers — will perceive the work and judge its usefulness.
An interdisciplinary team of environmental modellers and philosophers has been studying this informal communication, and has developed a catalogue of different ways in which authors present, or ‘frame’, uncertainties in their work. The study, which also reports the frequencies of different uncertainty frames in a corpus of scientific abstracts, is forthcoming in the journal Water Resources Research: “Towards best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts” by Joseph Guillaume, Casey Helgeson, Sondoss Elsawah, Anthony Jakeman, and Matti Kummu.
Cities and regions around the world are facing the impacts of climate change and planning for even greater impacts in the near future. These efforts to manage water supplies, respond to sea level rise, or protect infrastructure can be informed by regional climate change projections on decadal and multi-decadal timescales. While informative, these climate change projections are also highly uncertain, and national meteorological offices and other climate services providers are searching for the best approaches to quantifying and communicating this uncertainty.
The formal elicitation of experts’ judgements is a method of uncertainty quantification that has been used widely in other areas of application, but is so far under-utilised in climate change adaptation. Expert judgements are subjective, but they are also evidence-based, and eliciting these judgements in a careful, controlled, scientific manner is often the best — or even the only — way to integrate evidence from a variety of sources. Climate scientist Erica Thompson and philosophers Roman Frigg and Casey Helgeson argue that structured expert elicitation should now be put to use for characterising uncertainties in regional climate change projections intended to inform adaptation decisions: Expert Judgment for Climate Change Adaptation, Philosophy of Science 83 (5): 1110–1121 (2016). [link to journal article] [pdf from Philpapers archive]
How can the Intergovernmental Panel on Climate Change’s (IPCC) framework for assessing and communicating uncertainty be related to decision making? One way to attack this question is by confronting the framework with recent decision models proposed mainly by economists working on the theory of decision under uncertainty. The confidence-based decision model developed in the context of this project (see here or here) emerges as best equipped, among major existing approaches, to fully utilise the information provided by the IPCC. Moreover, connecting IPCC conclusions with decision making via such a model brings out some apparently novel recommendations for future uncertainty reporting.
For more details, see Climate Change Assessments: Confidence, Probability and Decision by R. Bradley, C. Helgeson & B. Hill (forthcoming, Philosophy of Science).