Introduction -- Sanctioning models : theories and their scope -- Methodology for a virtual world -- A tale of two methods -- When theories shake hands -- Models of climate : values and uncertainties -- Reliability without truth -- Conclusion.
There continues to be a vigorous public debate in our society about the status of climate science. Much of the skepticism voiced in this debate suffers from a lack of understanding of how the science works, in particular the complex interdisciplinary scientific modeling activities that are at the heart of climate science. In this book Eric Winsberg shows clearly and accessibly how philosophy of science can contribute to our understanding of climate science, and how it can also shape climate policy debates and provide a starting point for research. Covering a wide range of topics including the nature of scientific data, modeling, and simulation, his book provides a detailed guide for those willing to look beyond ideological proclamations, and enriches our understanding of how climate science relates to important concepts such as chaos, unpredictability, and the extent of what we know.
Simulations (both digital and analog) and experiments share many features. But what essential features distinguish them? I discuss two proposals in the literature. On one proposal, experiments investigate nature directly, while simulations merely investigate models. On another proposal, simulations differ from experiments in that simulationists manipulate objects that bear only a formal (rather than material) similarity to the targets of their investigations. Both of these proposals are rejected. I argue that simulations fundamentally differ from experiments with regard to the background knowledge that is invoked to argue for the “external validity” of the investigation.
In this article we argue for the existence of ‘analogue simulation’ as a novel form of scientific inference with the potential to be confirmatory. This notion is distinct from the modes of analogical reasoning detailed in the literature, and draws inspiration from fluid dynamical ‘dumb hole’ analogues to gravitational black holes. For that case, which is considered in detail, we defend the claim that the phenomenon of gravitational Hawking radiation could be confirmed if its counterpart is detected within experiments conducted on diverse realizations of the analogue model. A prospectus is given for further potential cases of analogue simulation in contemporary science. 1 Introduction; 2 Physical Background; 2.1 Hawking radiation in semi-classical gravity; 2.2 Modelling sound in fluids; 2.3 The acoustic analogue model of Hawking radiation; 3 Simulation and Analogy in Physical Theory; 3.1 Analogical reasoning and analogue simulation; 3.2 Confirmation via analogue simulation; 3.3 Recapitulation; 4 The Sound of Silence: Analogical Insights into Gravity; 4.1 Experimental realization of analogue models; 4.2 Universality and the Hawking effect; 4.3 Confirmation of gravitational Hawking radiation; 5 Prospectus.
We present a Bayesian analysis of the epistemology of analogue experiments with particular reference to Hawking radiation. Provided such experiments can be externally validated via universality arguments, we prove that they are confirmatory in Bayesian terms. We then provide a formal model for the scaling behaviour of the confirmation measure for multiple distinct realisations of the analogue system and isolate a generic saturation feature. Finally, we demonstrate that different potential analogue realisations could provide different levels of confirmation. Our results thus provide a basis both to formalise the epistemic value of analogue experiments that have been conducted and to advise scientists as to the respective epistemic value of future analogue experiments.
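The saturation feature described in this abstract can be illustrated with a toy Bayesian update. The following sketch is purely illustrative and is not the paper's formal model: the likelihood values are assumed numbers, standing in for the idea that detecting the counterpart effect in a new analogue realisation is more probable if the universality argument holds.

```python
# Toy illustration of Bayesian confirmation by repeated analogue experiments.
# All numbers are illustrative assumptions, not taken from the paper.

def posterior(prior, likelihood_h, likelihood_not_h):
    """Bayes' rule: P(H|E) = P(E|H)P(H) / P(E)."""
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

p = 0.5  # assumed prior credence in gravitational Hawking radiation
for n in range(1, 6):
    # Each new analogue realisation that detects the counterpart effect
    # updates the credence; the same likelihood ratio is assumed each time.
    p = posterior(p, likelihood_h=0.9, likelihood_not_h=0.6)
    print(n, round(p, 3))
```

Because each update multiplies the odds by a fixed ratio while the posterior is bounded by 1, successive increments shrink: this is the generic "saturation" behaviour of the confirmation measure under repeated realisations.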
In computer simulations of physical systems, the construction of models is guided, but not determined, by theory. At the same time, simulation models are often constructed precisely because data are sparse. They are meant to replace experiments and observations as sources of data about the world; hence they cannot be evaluated simply by being compared to the world. So what can be the source of credibility for simulation models? I argue that the credibility of a simulation model comes not only from the credentials supplied to it by the governing theory, but also from the antecedently established credentials of the model building techniques employed by the simulationists. In other words, there are certain sorts of model building techniques which are taken, in and of themselves, to be reliable. Some of these model building techniques, moreover, incorporate what are sometimes called “falsifications.” These are contrary-to-fact principles that are included in a simulation model and whose inclusion is taken to increase the reliability of the results. The example of a falsification that I consider, called artificial viscosity, is in widespread use in computational fluid dynamics. Artificial viscosity, I argue, is a principle that is successfully and reliably used across a wide domain of fluid dynamical applications, but it does not offer even an approximately “realistic” or true account of fluids. Artificial viscosity, therefore, is a counter-example to the principle that success implies truth – a principle at the foundation of scientific realism. It is an example of reliability without truth.
Using an example of a computer simulation of the convective structure of a red giant star, this paper argues that simulation is a rich inferential process, and not simply a "number crunching" technique. The scientific practice of simulation, moreover, poses some interesting and challenging epistemological and methodological issues for the philosophy of science. I will also argue that these challenges would be best addressed by a philosophy of science that places less emphasis on the representational capacity of theories (and ascribes that capacity instead to models) and more emphasis on the role of theory in guiding (rather than determining) the construction of models.
Sovereign is he who provides the exception.…The exception is more interesting than the rule. The rule proves nothing; the exception proves everything. In the exception the power of real life breaks through the crust of a mechanism that has become torpid by repetition. In spring 2020, in response to the COVID-19 crisis, world leaders imposed severe restrictions on citizens’ civil, political, and economic liberties. These restrictions went beyond less controversial and less demanding social distancing measures seen in past epidemics. Many states and countries imposed universal lockdowns. Lockdowns, as we define them here, require people to stay home; in some countries and places, citizens must have ad hoc licenses to...
This paper discusses a crisis of accountability that arises when scientific collaborations are massively epistemically distributed. We argue that social models of epistemic collaboration, which are social analogs to what Patrick Suppes called a “model of the experiment,” must play a role in creating accountability in these contexts. We also argue that these social models must accommodate the fact that the various agents in a collaborative project often have ineliminable, messy, and conflicting interests and values; any story about accountability in a massively distributed collaboration must therefore involve models of such interests and values and their methodological and epistemic effects.
There are a variety of topics in the philosophy of science that need to be rethought, in varying degrees, after one pays careful attention to the ways in which computer simulations are used in the sciences. There are a number of conceptual issues internal to the practice of computer simulation that can benefit from the attention of philosophers. This essay surveys some of the recent literature on simulation from the perspective of the philosophy of science and argues that philosophers have a lot to learn by paying closer attention to the practice of simulation.
We call attention to an underappreciated way in which non-epistemic values influence evidence evaluation in science. Our argument draws upon some well-known features of scientific modeling. We show that, when scientific models stand in for background knowledge in Bayesian and other probabilistic methods for evidence evaluation, conclusions can be influenced by the non-epistemic values that shaped the setting of priorities in model development. Moreover, it is often infeasible to correct for this influence. We further suggest that, while this value influence is not particularly prone to the problem of wishful thinking, it could have problematic non-epistemic consequences in some cases.
1. Introduction; Elisabeth A. Lloyd and Eric Winsberg.
Section 1: Confirmation and Evidence.
2. The Scientific Consensus on Climate Change: How Do We Know We’re Not Wrong?; Naomi Oreskes.
3. Satellite Data and Climate Models Redux.
3a. Introduction to Chapter 3: Satellite Data and Climate Models; Elisabeth A. Lloyd.
3b. Fact Sheet to "Consistency of Modelled and Observed Temperature Trends in the Tropical Troposphere"; Benjamin D. Santer et al.
3c. Reprint of "Consistency of Modelled and Observed Temperature Trends in the Tropical Troposphere"; Benjamin D. Santer et al.
4. The Role of ‘Complex’ Empiricism in the Debates about Satellite Data and Climate Models; Elisabeth A. Lloyd.
5. Reconciling Climate Model/Data Discrepancies: The Case of the Trees That Didn’t Bark; Michael Mann.
6. Downscaling of Climate Information; Linda O. Mearns et al.
Section 2: Uncertainties and Robustness.
7. The Significance of Robust Model Projections; Wendy S. Parker.
8. Building Trust, Removing Doubt? Robustness Analysis and Climate Modeling; Jay Odenbaugh.
Section 3: Climate Models as Guides to Policy.
9. Climate Model Confirmation: From Philosophy to Predicting Climate in the Real World; Reto Knutti.
10. Uncertainty in Climate Science and Climate Policy; Jonathan Rougier and Michel Crucifix.
11. Communicating Uncertainty to Policy Makers: The Ineliminable Role of Values; Eric Winsberg.
12. Modeling Climate Policies: A Critical Look at Integrated Assessment Models; Mathias Frisch.
13. Modelling Mitigation and Adaptation Policies to Predict their Effectiveness: The Limits of Randomized Controlled Trials; Alexandre Marcellesi and Nancy D. Cartwright.
In his recent book, Time and Chance, David Albert claims that by positing that there is a uniform probability distribution defined, on the standard measure, over the space of microscopic states that are compatible with both the current macrocondition of the world, and with what he calls the “past hypothesis”, we can explain the time asymmetry of all of the thermodynamic behavior in the world. The principal purpose of this paper is to dispute this claim. I argue that Albert's proposal fails in his stated goal—to show how to use the time‐reversible dynamics of Newtonian physics to “underwrite the actual content of our thermodynamic experience” (Albert 2000, 159). Albert's proposal can satisfactorily explain why the overall entropy of the universe as a whole is increasing, but it does not and cannot explain the increasing entropy of relatively small, relatively short‐lived systems in energetic isolation without making use of a principle that leads to reversibility objections.
This paper deals with the question of whether uncertainty regarding model structure, especially in climate modeling, exhibits a kind of “chaos.” Do small changes in model structure, in other words, lead to large variations in ensemble predictions? More specifically, does model error destroy forecast skill faster than the ordinary or “classical” chaos inherent in the real-world attractor? In some cases, the answer to this question seems to be “yes.” But how common is this state of affairs? Are there precise mathematical results that can help us answer this question? And is dependence on model structure “sensitive” in that arbitrarily small errors can destroy forecast skill? We examine some efforts in the literature to answer this last question in the affirmative and find them to be unconvincing.
Statistical Mechanics (SM) involves probabilities. At the same time, most approaches to the foundations of SM—programs whose goal is to understand the macroscopic laws of thermal physics from the point of view of microphysics—are classical; they begin with the assumption that the underlying dynamical laws that govern the microscopic furniture of the world are (or can without loss of generality be treated as) deterministic. This raises some potential puzzles about the proper interpretation of these probabilities. It also raises, more generally, the question of what kinds, if any, of objective probabilities can exist in a deterministic world.
Violations of the Bell inequalities in EPR-Bohm type experiments have set the literature on the metaphysics of microscopic systems flirting with some sort of metaphysical holism regarding spatially separated, entangled systems. The rationale for this behavior comes in two parts. The first part relies on the proof, due to Jon Jarrett, that the experimentally observed violations of the Bell inequalities entail violations of the conjunction of two probabilistic constraints. Jarrett called these two constraints locality and completeness. We prefer the terminology of locality and factorizability. The first part of the rationale for metaphysical holism urges that only Jarrett’s locality allows for “peaceful coexistence” between any model of EPR-Bohm type experiments and special relativity. Factorizability, it is suggested, must be jettisoned.
Should philosophers of science be paying attention to developments in "nanoscience"? Undoubtedly, it is too early to tell for sure. The goal of this paper is to take a preliminary look. In particular, I look at the use of computational models in the study of nano-sized solid-state materials. What I find is that there are features of these models that appear on their face to be at odds with some basic philosophical intuitions about the relationships between different theories and between theories and their models. My conclusion is that developments in nanoscience are not an unlikely place for novel insights in the philosophy of science to emerge.
In the philosophy of climate science, debate surrounding the issue of variety of evidence has mostly taken the form of attempting to connect these issues in climate science and climate modeling with philosophical accounts of what has come to be known as “robustness analysis.” I argue that an “explanatory” conception of robustness is the best candidate for understanding variety of evidence in climate science. I apply the analysis both to examples of model agreement and to the convergence of evidence from model and non-model sources.
This paper explores some connections between competing conceptions of scientific laws on the one hand, and a problem in the foundations of statistical mechanics on the other. I examine two proposals for understanding the time asymmetry of thermodynamic phenomena: David Albert's recent proposal and a proposal that I outline based on Hans Reichenbach's “branch systems”. I sketch an argument against the former, and mount a defense of the latter by showing how to accommodate statistical mechanics to recent developments in the philosophy of scientific laws.
Roman Frigg and others have developed a general epistemological argument designed to cast doubt on the capacity of a broad range of mathematical models to generate “decision relevant predictions.” In this article, we lay out the structure of their argument—an argument by analogy—with an eye to identifying points at which certain epistemically significant distinctions might limit the force of the analogy. Finally, some of these epistemically significant distinctions are introduced and defended as relevant to a great many of the predictive mathematical modeling projects employed in contemporary climate science.
A collection of newly commissioned papers on themes from David Albert's Time and Chance (HUP, 2000), with replies by Albert. Confirmed contributors: Sean Carroll, Sidney Felder, Alison Fernandes, Mathias Frisch, Nick Huggett, Jenann Ismael, Doug Kutach, Barry Loewer, Tim Maudlin, Chris Meacham, David Wallace, and Eric Winsberg.
Climate science evaluates hypotheses about the climate using computer simulations and complex models. The models that drive these simulations, moreover, represent the efforts of many different agents, and they arise from a compounding set of methodological choices whose effects are epistemically inscrutable. These facts, I argue in this chapter, make it extremely difficult for climate scientists to estimate degrees of uncertainty associated with these hypotheses that are free from the influences of past preferences—preferences both with regard to the importance of one prediction over another and with regard to the avoidance of false positives over false negatives and vice versa. This leaves an imprint of non-epistemic values in the nooks and crannies of climate science.
Scientific modelling is a value-laden process: the decisions involved can seldom be made using 'scientific' criteria alone, but rather draw on social and ethical values. In this paper, we draw on a body of philosophical literature to analyze a COVID-19 vaccination model, presenting a case study of social and ethical value judgments in health-oriented modelling. This case study urges us to make value judgments in health-oriented models explicit and interpretable by non-experts and to invite public involvement in making them.
In a large and impressive body of published work, Quayshawn Spencer has meticulously articulated and defended a metaphysical project aimed at resuscitating a biological conception of race—one free from many of the pitfalls of biological essentialism. If successful, such a project would be highly rewarding, since it would provide a compelling response to philosophers who have denied the genuine existence of race while avoiding the very dangers that they sought to avoid. I argue that if a “new biologism” about race is a live and attractive possibility, it will have to employ many of the moves that Spencer employs. The aim of this paper is to subject those moves to careful scrutiny and thereby appraise the prospects for a new biologism about race.
One of the most interesting things about science and engineering at the nanoscale, from the point of view of the philosophy of science, is the frequent use they make of models constructed out of theories belonging to different levels of description. We usually take it for granted that every level of description falls under the domain of its own theory. For example, we generally presume there is some fundamental level of description. And with that presumption comes the hope that we will be able to find a general theory of how things work at that level. But we also often take it for granted that at every other level of description that interests us—whether it be at the level of subatomic particles, atoms, molecules, fluids, or mechanical solids—there will be some non-fundamental theory available to us that will be practically serviceable for explaining, predicting, and controlling the various phenomena that live at that level of description.