Uncertain About Uncertainty: How Qualitative Expressions of Forecaster Confidence Impact Decision-Making With Uncertainty Visualizations

Frontiers in Psychology 11:579267 (2021)

Abstract

When forecasting events, multiple types of uncertainty are often inherently present in the modeling process. Various uncertainty typologies exist, and each type of uncertainty has different implications that a scientist might want to convey. In this work, we focus on one such distinction: between direct quantitative uncertainty and indirect qualitative uncertainty. Direct quantitative uncertainty describes uncertainty about facts, numbers, and hypotheses that can be communicated in absolute quantitative forms such as probability distributions or confidence intervals. Indirect qualitative uncertainty describes the quality of knowledge concerning how effectively facts, numbers, or hypotheses represent reality, as in the evidence confidence scales proposed by the Intergovernmental Panel on Climate Change. A large body of research demonstrates that both experts and novices have difficulty reasoning with quantitative uncertainty, and that visualizations of uncertainty can help with such traditionally challenging concepts. However, the question of whether, and how, people reason with multiple types of uncertainty associated with a forecast remains largely unexplored. In this series of studies, we seek to understand whether individuals can integrate indirect uncertainty about how “good” a model is (operationalized as a qualitative expression of forecaster confidence) with quantified uncertainty in a prediction (operationalized as a quantile dotplot visualization of a predicted distribution). The results of our first study suggest that participants utilize both direct quantitative uncertainty and indirect qualitative uncertainty when these are conveyed as quantile dotplots and forecaster confidence. In manipulations where forecasters were less sure about their prediction, participants made more conservative judgments.
In our second study, we varied the amount of quantified uncertainty (the SD of the visualized distributions) to examine how participants’ decisions changed under different combinations of quantified uncertainty (variance) and qualitative uncertainty (low, medium, and high forecaster confidence). The results of the second study suggest that participants updated their judgments in the directions predicted by both the qualitative confidence information (e.g., becoming more conservative when forecaster confidence was low) and the quantitative uncertainty (e.g., becoming more conservative when the variance was increased). Based on the findings from both experiments, we recommend that forecasters present qualitative expressions of model confidence alongside quantified uncertainty whenever possible.
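The abstract does not include code, but the quantile dotplot it describes has a simple construction: take a fixed number of evenly spaced quantiles from the predicted distribution and draw one dot per quantile, so each dot represents an equal slice of probability. The sketch below is a minimal, hypothetical illustration (function name, dot count, and the choice of a normal predictive distribution are assumptions, not from the paper) showing how increasing the SD — the paper's quantified-uncertainty manipulation — spreads the dots:

```python
from statistics import NormalDist

def quantile_dots(mean, sd, n_dots=20):
    """Return n_dots evenly spaced quantiles of a Normal(mean, sd)
    predictive distribution -- the x-positions of a quantile dotplot.
    Each dot stands for 1/n_dots of the total probability mass."""
    dist = NormalDist(mu=mean, sigma=sd)
    # Use midpoints of n_dots equal-probability bins: (i + 0.5) / n_dots
    return [dist.inv_cdf((i + 0.5) / n_dots) for i in range(n_dots)]

# A forecast centered at 10: a larger SD (more quantified uncertainty)
# spreads the same 20 dots over a wider range.
low_var = quantile_dots(10, 1)
high_var = quantile_dots(10, 3)
print(max(low_var) - min(low_var) < max(high_var) - min(high_var))  # True
```

Because the dot positions are quantiles at midpoints of equal-probability bins, a viewer can read off probabilities by counting dots (e.g., 3 of 20 dots beyond a threshold ≈ 15% chance), which is the frequency-framing property that makes quantile dotplots useful for lay decision-making.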

Links

PhilArchive





Similar books and articles

How Uncertain Do We Need to Be? Jon Williamson - 2014 - Erkenntnis 79 (6):1249-1271.
Decision Making Under Great Uncertainty. Sven Ove Hansson - 1996 - Philosophy of the Social Sciences 26 (3):369-386.
Uncertain Science...: Uncertain World. Henry N. Pollack - 2003 - New York: Cambridge University Press.
Moral Uncertainty. Krister Bykvist - 2017 - Philosophy Compass 12 (3):e12408.
Scientific Uncertainty: A User's Guide. Seamus Bradley - 2012 - Grantham Institute on Climate Change Discussion Paper.
Types of Uncertainty. Richard Bradley & Mareile Drechsler - 2014 - Erkenntnis 79 (6):1225-1248.
Varieties of Uncertainty Monitoring. John H. Flavell - 2003 - Behavioral and Brain Sciences 26 (3):344-344.

Analytics

Added to PP
2021-01-25

Downloads
15 (#919,495)

6 months
7 (#425,192)


Citations of this work

No citations found.

