Much has been written on the role of causal notions and causal reasoning in the so-called 'special sciences' and in common sense. But does causal reasoning also play a role in physics? Mathias Frisch argues that, contrary to what influential philosophical arguments purport to show, the answer is yes. Time-asymmetric causal structures are as integral a part of the representational toolkit of physics as a theory's dynamical equations. Frisch develops his argument partly through a critique of anti-causal arguments and partly through a detailed examination of actual examples of causal notions in physics, including causal principles invoked in linear response theory and in representations of radiation phenomena. Offering a new perspective on the nature of scientific theories and causal reasoning, this book will be of interest to professional philosophers, graduate students, and anyone interested in the role of causal thinking in science.
Mathias Frisch provides the first sustained philosophical discussion of conceptual problems in classical particle-field theories. Part of the book focuses on the problem of a satisfactory equation of motion for charged particles interacting with electromagnetic fields. As Frisch shows, the standard equation of motion results in a mathematically inconsistent theory, yet there is no fully consistent and conceptually unproblematic alternative theory. Frisch describes in detail how the search for a fundamental equation of motion is partly driven by pragmatic considerations (like simplicity and mathematical tractability) that can override the aim for full consistency. The book also offers a comprehensive review and criticism of both the physical and philosophical literature on the temporal asymmetry exhibited by electromagnetic radiation fields, including Einstein's discussion of the asymmetry and Wheeler and Feynman's influential absorber theory of radiation. Frisch argues that attempts to derive the asymmetry from thermodynamic or cosmological considerations fail and proposes that we should understand the asymmetry as due to a fundamental causal constraint. The book's overarching philosophical thesis is that standard philosophical accounts that strictly identify scientific theories with a mathematical formalism and a mapping function specifying the theory's ontology are inadequate, since they permit neither inconsistent yet genuinely successful theories nor thick causal notions to be part of fundamental physics.
Models not only represent but may also influence their targets in important ways. While models' ability to influence outcomes has been studied in the context of economic models, often under the label 'performativity', we argue that this phenomenon also pertains to epidemiological models, such as those used for forecasting the trajectory of the Covid-19 pandemic. After identifying three ways in which a model by the Covid-19 Response Team at Imperial College London may have influenced scientific advice, policy, and individual responses, we consider the implications of epidemiological models' performative capacities. We argue, first, that performativity may impair models' ability to successfully predict the course of an epidemic; but second, that it may provide an additional sense in which these models can be successful, namely by changing the course of an epidemic.
According to a widespread view, which can be traced back to Russell's famous attack on the notion of cause, causal notions have no legitimate role to play in how mature physical theories represent the world. In this paper I first critically examine a number of arguments for this view that center on the asymmetry of the causal relation and argue that none of them succeed. I then argue that embedding the dynamical models of a theory into richer causal structures can allow us to decide between models in cases where our observational data severely underdetermine our choice of dynamical models.
Many climate scientists have made claims that may suggest that evidence used in tuning or calibrating a climate model cannot be used to evaluate the model. By contrast, the philosophers Katie Steele and Charlotte Werndl have argued that, at least within the context of Bayesian confirmation theory, tuning is simply an instance of hypothesis testing. In this paper I argue for a weak predictivism and in support of a nuanced reading of climate scientists' concerns about tuning: there are cases, model-tuning among them, in which predictive successes are more highly confirmatory of a model than accommodation of evidence.
In recent work on the foundations of statistical mechanics and the arrow of time, Barry Loewer and David Albert have developed a view that defends both a best system account of laws and a physicalist fundamentalism. I argue that there is a tension between their account of laws, which emphasizes the pragmatic element in assessing the relative strength of different deductive systems, and their reductivism or fundamentalism. If we take the pragmatic dimension in their account seriously, then the laws of the special sciences should be part of our best explanatory system of the world, as well.
I examine Harvey Brown's account of relativity as a dynamic and constructive theory and Michel Janssen's recent criticism of it. By contrasting Einstein's principle-constructive distinction with a related distinction by Lorentz, I argue that Einstein's distinction presents a false dichotomy. Appealing to Lorentz's distinction, I argue that there is less of a disagreement between Brown and Janssen than appears initially and, hence, that Brown's view presents less of a departure from orthodoxy than it may seem. Neither the kinematics-dynamics distinction nor Einstein's principle- and constructive theory distinction ultimately captures their disagreement, which may instead be a disagreement about the role of modality in science and the explanatory force of putatively nomic constraints.
According to a view widely held among philosophers of science, the notion of cause has no legitimate role to play in mature theories of physics. In this paper I investigate the role of what physicists themselves identify as causal principles in the derivation of dispersion relations. I argue that this case study constitutes a counterexample to the popular view and that causal principles can function as genuine factual constraints. Contents: Introduction; Causality and Dispersion Relations; Norton's Skepticism; Conclusion.
I show that Albert Einstein's distinction between principle and constructive theories was predated by Hendrik A. Lorentz's equivalent distinction between mechanism- and principle-theories. I further argue that Lorentz's views toward realism similarly prefigure what Arthur Fine identified as Einstein's 'motivational realism.'
David Albert and Barry Loewer have argued that the temporal asymmetry of our concept of causal influence or control is grounded in the statistical mechanical assumption of a low-entropy past. In this paper I critically examine Albert's and Loewer's accounts.
I criticize two accounts of the temporal asymmetry of electromagnetic radiation - that of Huw Price, whose account centrally involves a reinterpretation of Wheeler and Feynman's infinite absorber theory, and that of Dieter Zeh. I then offer some reasons for thinking that the purported puzzle of the arrow of radiation does not present a genuine puzzle in need of a solution.
Climate change presents us with a problem of intergenerational justice. While any costs associated with climate change mitigation measures will have to be borne by the world's present generation, the main beneficiaries of mitigation measures will be future generations. This raises the question to what extent present generations have a responsibility to shoulder these costs. One influential approach for addressing this question is to appeal to neo-classical economic cost–benefit analyses and so-called economy-climate "integrated assessment models" to determine what course of action a principle of intergenerational welfare maximization would require of us. I critically examine a range of problems for this approach. First, integrated assessment models face a problem of underdetermination and induction: They are very sensitive to a number of highly conjectural assumptions about economic responses to a temperature and climate regime, for which we have no empirical evidence. Second, they involve several simplifying assumptions which cannot be justified empirically. And third, some of the assumptions underlying the construction of economic models are intrinsically normative assumptions that reflect value judgments of the modeler. I conclude that, while integrated assessment models may play a useful role as "toy models," their use as tools for policy optimization is highly problematic.
I show that the standard approach to modeling phenomena involving microscopic classical electrodynamics is mathematically inconsistent. I argue that there is no conceptually unproblematic and consistent theory covering the same phenomena to which this inconsistent theory can be thought of as an approximation; and I propose a set of conditions for the acceptability of inconsistent theories.
A Tale of Two Arrows. Mathias Frisch - 2006 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 37 (3): 542-558.
In this paper I propose a reasonably sharp formulation of the temporal asymmetry of radiation. I criticize accounts that propose to derive the asymmetry from a low-entropy assumption characterizing the state of the early universe and argue that these accounts fail, since they presuppose the very asymmetry they are intended to derive.
Classical dispersion relations are derived from a time-asymmetric constraint. I argue that the standard causal interpretation of this constraint plays a scientifically legitimate role in dispersion theory, and hence provides a counterexample to the causal skepticism advanced by John Norton and others. Norton ([2009]) argues that the causal interpretation of the time-asymmetric constraint is an empty honorific and that the constraint can be motivated by purely non-causal considerations. In this paper I respond to Norton's criticisms and argue that Norton's skepticism derives its force partly by holding causal principles to a standard too high to be met by other scientifically legitimate constraints.
How could the initial, drastic decisions to implement "lockdowns" to control the spread of COVID-19 infections be justifiable, when they were made on the basis of such uncertain evidence? We defend the imposition of lockdowns in some countries by first, and focusing on the UK, looking at the evidence that undergirded the decision, second, arguing that this provided us with sufficient grounds to restrict liberty given the circumstances, and third, defending the use of poorly-empirically-constrained epidemiological models as tools that can legitimately guide public policy.
In Frisch 2004 and 2005 I showed that the standard ways of modeling particle-field interactions in classical electrodynamics, which exclude the interactions of a particle with its own field, result in a formal inconsistency, and I argued that attempts to include the self-field lead to numerous conceptual problems. In this paper I respond to criticism of my account in Belot 2007 and Muller 2007. I concede that this inconsistency in itself is less telling than I suggested earlier but argue that existing solutions to the theory's foundational problems do not support the kind of traditional philosophical conception of scientific theorizing defended by Muller and Belot.
As the record-breaking heat of 2016 continues into 2017, making it likely that 2017 will be the second hottest year on record just behind the El Niño year 2016, and as Arctic heat waves pushing the sea ice extent to record lows are mirrored by large-scale sheets of meltwater and even rain in Antarctica—the Trump administration is taking dramatic steps to undo the Obama administration's climate legacy. In its final years, the Obama administration pursued two principal strategies toward climate policy. First, by signing the Paris Accord it committed the U.S. to contribute to global efforts to hold "the increase in the global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to...
When do probability distribution functions (PDFs) about future climate misrepresent uncertainty? How can we recognise when such misrepresentation occurs and thus avoid it in reasoning about or communicating our uncertainty? And when we should not use a PDF, what should we do instead? In this paper we address these three questions. We start by providing a classification of types of uncertainty and using this classification to illustrate when PDFs misrepresent our uncertainty in a way that may adversely affect decisions. We then discuss when it is reasonable and appropriate to use a PDF to reason about or communicate uncertainty about climate. We consider two perspectives on this issue. On one, which we argue is preferable, available theory and evidence in climate science basically excludes using PDFs to represent our uncertainty. On the other, PDFs can legitimately be provided when resting on appropriate expert judgement and recognition of associated risks. Once we have specified the border between appropriate and inappropriate uses of PDFs, we explore alternatives to their use. We briefly describe two formal alternatives, namely imprecise probabilities and possibilistic distribution functions, as well as informal possibilistic alternatives. We suggest that the possibilistic alternatives are preferable.
This article defends a pragmatic and structuralist account of scientific representation of the kind recently proposed by Bas van Fraassen against criticisms of both the structuralist and the pragmatist plank of the account. I argue that the account appears to have the unacceptable consequence that the domain of a theory is restricted to phenomena for which we actually have constructed a model—a worry arising from the account's pragmatism, which is exacerbated by its structuralism. Yet, the account has the resources, at least partially, to address the worry. What remains as an implication is a strong anti-foundationalism. Contents: 1. Introduction; 2. 'No Representation without Representer'; 3. Representational Structuralism; 3.1 Do structural models need to be concretely fitted out?; 3.2 The triviality objection; 4. Anti-foundationalism; 5. Conclusion.
I discuss two case studies from classical electrodynamics challenging the distinction between laws that delineate physically possible worlds and initial conditions. First, for many reasonable initial conditions there exist no global solutions to the Maxwell‐Lorentz equations for continuous charge distributions. Second, in deriving an equation of motion for a charged point particle one needs to invoke an asymptotic condition that seems to express a physically contingent fact even though it is mathematically necessary for the derivation.
This paper provides a survey of several philosophical issues arising in classical electrodynamics, arguing that there is a philosophically rich set of problems in theories of classical physics that have not yet received the attention from philosophers that they deserve. One issue, which is connected to the philosophy of causation, concerns the temporal asymmetry exhibited by radiation fields in the presence of wave sources. Physicists and philosophers disagree on whether this asymmetry reflects a fundamental causal asymmetry or is due to statistical or thermodynamic considerations. I suggest that an explanation appealing to the asymmetry of causation is more promising. Another issue concerns the conceptual structure of the theory. Despite its empirical success, classical electrodynamics faces serious foundational problems. Models of charged particles involve what by the theory's own lights are idealizations, I maintain, and this is a feature that is not readily captured by traditional philosophical accounts of scientific theories. Other issues I discuss concern (i) the relation between Lorentz's theory of the electron and Einstein's Theory of Special Relativity; (ii) the notion of the domain of a theory, the question of theory reduction, and the relation between classical and more fundamental quantum theories; and (iii) the role of locality constraints, their relation to the concept of causation, and the status of locality conditions in the semi-classical theory of the Aharonov-Bohm effect.
Albert provides a sketch of an entropy account of the causal and counterfactual asymmetries. This paper critically examines a proposal that may be thought to fill in some of the lacunae in Albert’s account.
In an illuminating article, Claus Beisbart argues that the recently-popular thesis that the probabilities of statistical mechanics (SM) are Best System chances runs into a serious obstacle: there is no one axiomatization of SM that is robustly best, as judged by the theoretical virtues of simplicity, strength, and fit. Beisbart takes this 'no clear winner' result to imply that the probabilities yielded by the competing axiomatizations simply fail to count as Best System chances. In this reply, we express sympathy for the 'no clear winner' thesis. However, we argue that an importantly different moral should be drawn from this. We contend that the implication for Humean chances is not that there are no SM chances, but rather that SM chances fail to be sharp.
This chapter examines two approaches to climate policy: expected utility calculations and a precautionary approach. The former provides the framework for attempts to calculate the social cost of carbon. The latter approach has provided the guiding principle for the United Nations Conference of Parties from the 1992 Rio Declaration to the Paris Agreement. The chapter argues that the deep uncertainties concerning the climate system and climate damages make the exercise of trying to calculate a well-supported value for the SCC impossible. Moreover, cost-benefit analyses are blind to important moral dimensions of the climate problem. Yet it is an open question to what extent an alternative, precautionary approach can result in specific policy recommendations such as the temperature targets of the Paris Agreement.
In order to motivate the thesis that there is no single concept of causation that can do justice to all of our core intuitions concerning that concept, Ned Hall has argued that there is a conflict between a counterfactual criterion of causation and the condition of causal locality. In this paper I critically examine Hall's argument within the context of a more general discussion of the role of locality constraints in a causal conception of the world. I present two strategies that defenders of counterfactual accounts of causation can pursue to respond to Hall's challenge, including the adoption of a counterfactual condition that is sufficient for causal action-at-a-distance in place of Hall's 'process' condition, and conclude that Hall's argument against counterfactual accounts of causation is unsuccessful.
I have argued that the standard ways of modeling classical particle-field interactions rely on a set of inconsistent assumptions. This claim has been criticized in (Muller forthcoming). In this paper I respond to some of Muller's criticism.
in Dirac's classical theory of the electron—is causally non-local. I distinguish two distinct causal locality principles and argue, using Dirac's theory as my main case study, that neither can be reduced to a non-causal principle of local determinism.
This chapter examines the role of parameter calibration in the confirmation and validation of complex computer simulation models. I examine the question to what extent calibration data can confirm or validate the calibrated model, focusing in particular on Bayesian approaches to confirmation. I distinguish several different Bayesian approaches to confirmation and argue that complex simulation models exhibit a predictivist effect: Complex computer simulation models constitute a case in which predictive success, as opposed to the mere accommodation of evidence, provides a more stringent test of the model. Data used in tuning do not validate or confirm a model to the same extent as data successfully predicted by the model do.
Bas van Fraassen has recently argued for a "dissolution" of Hilary Putnam's well-known model-theoretic argument. In this paper I argue that, as it stands, van Fraassen's reply to Putnam is unsuccessful. Nonetheless, it suggests the form a successful response might take.
The study of similarity is fundamental to biological inquiry. Many homology concepts have been formulated that function successfully to explain similarity in their native domains, but fail to provide an overarching account applicable to variably interconnected and independent areas of biological research despite the monistic standpoint from which they originate. The use of multiple, explicitly articulated homology concepts, applicable at different levels of the biological hierarchy, allows a more thorough investigation of the nature of biological similarity. Responsible epistemological pluralism as advocated herein is generative of fruitful and innovative biological research, and is appropriate given the metaphysical pluralism that underpins all of biology.
Many contemporary philosophers of physics (and philosophers of science more generally) follow Bertrand Russell in arguing that there is no room for causal notions in physics. Causation, as James Woodward has put it, has a 'human face', which makes causal notions sit ill with fundamental theories of physics. In this paper I examine a range of anti-causal arguments and show that the human face of causation is the face of scientific representations much more generally. Physics, like other sciences, is deeply permeated with causal reasoning.
What are laws of nature? During much of the eighteenth and nineteenth centuries Newton's laws of motion were taken to be the paradigm of scientific laws, thought to constitute universal and necessary eternal truths. But since the turn of the twentieth century we know that Newton's laws are not universally valid. Does this mean that their status as laws of physics has changed? Have we discovered that the principles, which were once thought to be laws of nature, are not in fact laws?
I argue that if we make explicit the role of the user of scientific representations not only in the application but also in the construction of a model or representation, then inconsistent modeling assumptions do not pose an insurmountable obstacle to our representational practices.
Using climate policy debates as a case study, I argue that a certain response to the argument from inductive risk, the hedging defense, runs afoul of a reasonable ethical principle: the no-passing-...