Various forms of underdetermination that might threaten the realist stance are examined. That which holds between different 'formulations' of a theory (such as the Hamiltonian and Lagrangian formulations of classical mechanics) is considered in some detail, as is the 'metaphysical' underdetermination invoked to support 'ontic structural realism'. The problematic roles of heuristic fruitfulness and surplus structure in attempts to break these forms of underdetermination are discussed and an approach emphasizing the relevant structural commonalities is defended.
String theory promises to provide us with a working theory of quantum gravity and a unified description of all fundamental forces. In string theory there are so-called ‘dualities’; i.e. different theoretical formulations that are physically equivalent. In this article these dualities are investigated from a philosophical point of view. Semantic and epistemic questions relating to the problem of underdetermination of theories by data and the debate on realism concerning scientific theories are discussed. Depending on one's views on semantic issues and realism, different interpretations of the dualities are possible.
The underdetermination of scientific theory choice by evidence is a familiar but multifaceted concept in the philosophy of science. I answer two pressing questions about underdetermination: “What is underdetermination?” and “Why should we care about underdetermination?” To answer the first question, I provide a general definition of underdetermination, identify four forms of underdetermination, and discuss major criticisms of each form. To answer the second question, I then survey two common uses of underdetermination in broader arguments against scientific realism and in support of the use of values in scientific theory choice. I conclude that philosophers should also care about underdetermination because it impacts scientists in their practice.
This article investigates the implications of string theory for the conception of scientific theory confirmation. The classical understanding of theory confirmation is based on the assumption that scientific theory building is underdetermined by the available empirical data. Several arguments are presented, which suggest a devaluation of this ‘principle of scientific underdetermination’ in the context of string theory. An altered conception of scientific progress emerges that is not based on the notion of theory succession.
Are theories ‘underdetermined by the evidence’ in any way that should worry the scientific realist? I argue that no convincing reason has been given for thinking so. A crucial distinction is drawn between data equivalence and empirical equivalence. Duhem showed that it is always possible to produce a data equivalent rival to any accepted scientific theory. But there is no reason to regard such a rival as equally well empirically supported and hence no threat to realism. Two theories are empirically equivalent if they share all consequences expressed in purely observational vocabulary. This is a much stronger requirement than has hitherto been recognised: two such ‘rival’ theories must in fact agree on many claims that are clearly theoretical in nature. Given this, it is unclear how much of an impact on realism a demonstration that there is always an empirically equivalent ‘rival’ to any accepted theory would have, even if such a demonstration could be produced. Certainly in the case of the version of realism that I defend, structural realism, such a demonstration would have precisely no impact: two empirically equivalent theories are, according to structural realism, cognitively indistinguishable.
Several feminist philosophers of science have tried to open up the possibility that feminist ethical or political commitments could play a positive role in good science by appealing to the Duhem-Quine thesis and underdetermination of theories by observation. I examine several different interpretations of the claim that feminist values could play a legitimate role in theory justification and show that none of them follow from a logical gap between theory and observation. Finally, I sketch an alternative approach for defending the possibility that feminist political commitments could play a legitimate role in science.
Permissivism is the thesis that, for some body of evidence and a proposition p, there is more than one rational doxastic attitude any agent with that evidence can take toward p. Proponents of uniqueness deny permissivism, maintaining that every body of evidence always determines a single rational doxastic attitude. In this paper, we explore the debate between permissivism and uniqueness about evidence, outlining some of the major arguments on each side. We then consider how permissivism can be understood as an underdetermination thesis, and show how this moves the debate forward in fruitful ways: in distinguishing between different types of permissivism, in dispelling classic objections to permissivism, and in shedding light on the relationship between permissivism and evidentialism.
Duhem-Quine underdetermination plays a constructive role in epistemology by pinpointing the impact of non-empirical virtues or cognitive values on theory choice. Underdetermination thus contributes to illuminating the nature of scientific rationality. Scientists prefer and accept one account among empirically equivalent alternatives. The non-empirical virtues operating in science are laid open in such theory choice decisions. The latter act as an epistemological test tube in making explicit commitments to what scientific knowledge should be like.
In this paper, I argue (i) that there are certain methodological practices that are epistemically significant, and (ii) that we can test for the success of these practices empirically by examining case-studies in the history of science. Analysing a particular episode from the history of medicine, I explain how this can help us resolve specific cases of underdetermination. I conclude that, while the anti-realist is (more or less legitimately) able to construct underdetermination scenarios on a case-by-case basis, he will have to abandon the strategy of using algorithms to do so, thus losing the much needed guarantee that there will always be rival cases of the required kind.
According to Duncan Pritchard, there are two kinds of radical sceptical problem; the closure-based problem, and the underdetermination-based problem. He argues that distinguishing these two problems leads to a set of desiderata for an anti-sceptical response, and that the way to meet all of these desiderata is by supplementing a form of Wittgensteinian contextualism with disjunctivist views about factivity. I agree that an adequate response should meet most of the initial desiderata Pritchard puts forward, and that some version of Wittgensteinian contextualism shows the most promise as a starting point for this, but I argue, contra Pritchard, that the addition of disjunctivism is unnecessary and potentially counter-productive. If we draw on lessons from Michael Williams's inferential contextualism then it is both possible, and preferable, to meet the most important of Pritchard's desiderata, undercutting both closure-based and underdetermination-based sceptical problems in a unified way, without the need to resort to disjunctivism.
The problem of scientific disregard is the problem of accounting for why some putative theories that appear to be well-supported by empirical evidence nevertheless play no role in the scientific enterprise. Laudan and Leplin suggest (and Hoefer and Rosenberg concur) that at least some of these putative theories fail to be genuine theoretical rivals because they lack some non-empirical property of theoreticity. This solution also supports their repudiation of the thesis of underdetermination. I argue that the attempt to provide criteria of theoreticity fails, that there is a Bayesian solution to the problem of scientific disregard that fares better, and that this successful solution supports a distinctively Bayesian version of the underdetermination thesis.
Widespread causal overdetermination is often levied as an objection to nonreductive theories of minds and objects. In response, nonreductive metaphysicians have argued that the type of overdetermination generated by their theories is different from the sorts of coincidental cases involving multiple rock-throwers, and thus not problematic. This paper pushes back. I argue that attention to differences between types of overdetermination discharges very few explanatory burdens, and that overdetermination is a bigger problem for the nonreductive metaphysician than previously thought.
David Lewis defends the thesis of the asymmetry of overdetermination: later affairs are seldom overdetermined by earlier affairs, but earlier affairs are usually overdetermined by later affairs. Recently, Carol Cleland has argued that since the distinctive methodologies of historical science and experimental science exploit different aspects of this asymmetry, the methodology of historical science is just as good, epistemically speaking, as that of experimental science. This paper shows, first, that Cleland's epistemological conclusion does not follow from the thesis of the asymmetry of overdetermination, because overdetermination is compatible with epistemic underdetermination. The paper also shows, contra Cleland, that there is at least one interesting sense in which historical science is epistemically inferior to experimental science, after all, because local underdetermination problems are more widespread in historical than in experimental science.
This paper examines the underdetermination between the Ptolemaic, Copernican, and Tychonic theories of planetary motions and its attempted resolution by Kepler. I argue that past philosophical analyses of the problem of the planetary motions have not adequately grasped a method through which the underdetermination might have been resolved. This method involves a procedure of what I characterize as decomposition and identification. I show that this procedure is used by Kepler in the first half of the Astronomia Nova, where he ultimately claims to have refuted the Ptolemaic theory, thus partially overcoming the underdetermination. Finally, I compare this method with other views of scientific inference such as bootstrapping.
Proponents of the value ladenness of science rely primarily on arguments from underdetermination or inductive risk, which share the premise that we should only consider values where the evidence runs out or leaves uncertainty; they adopt a criterion of lexical priority of evidence over values. The motivation behind lexical priority is to avoid reaching conclusions on the basis of wishful thinking rather than good evidence. This is a real concern; I argue, however, that giving lexical priority to evidential considerations over values is a mistake and unnecessary for avoiding wishful thinking. Values have a deeper role to play in science.
Classical and quantum field theory provide not only realistic examples of extant notions of empirical equivalence, but also new notions of empirical equivalence, both modal and occurrent. A simple but modern gravitational case goes back to the 1890s, but there has been apparently total neglect of the simplest relativistic analog, with the result that an erroneous claim has taken root that Special Relativity could not have accommodated gravity even if there were no bending of light. The fairly recent acceptance of nonzero neutrino masses shows that widely neglected possibilities for nonzero particle masses have sometimes been vindicated. In the electromagnetic case, there is permanent underdetermination at the classical and quantum levels between Maxwell's theory and the one-parameter family of Proca's electromagnetisms with massive photons, which approximate Maxwell's theory in the limit of zero photon mass. While Yang–Mills theories display similar approximate equivalence classically, quantization typically breaks this equivalence. A possible exception, including unified electroweak theory, might permit a mass term for the photons but not the Yang–Mills vector bosons. Underdetermination between massive and massless (Einstein) gravity even at the classical level is subject to contemporary controversy.
The underdetermination of theory by evidence must be distinguished from holism. The latter is a doctrine about the testing of scientific hypotheses; the former is a thesis about empirically adequate logically incompatible global theories or "systems of the world". The distinction is crucial for an adequate assessment of the underdetermination thesis. The paper shows how some treatments of underdetermination are vitiated by failure to observe this distinction, and identifies some necessary conditions for the existence of multiple empirically equivalent global theories. We consider how empiricists should respond to the possibility of such systems of the world.
As climate policy decisions are decisions under uncertainty, being based on a range of future climate change scenarios, it becomes a crucial question how to set up this scenario range. Failing to comply with the precautionary principle, the scenario methodology widely used in the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) seems to violate international environmental law, in particular a provision of the United Nations Framework Convention on Climate Change. To place climate policy advice on a sound methodological basis would imply that climate simulations which are based on complex climate models had, in stark contrast to their current hegemony, hardly an epistemic role to play in climate scenario analysis at all. Their main function might actually consist in ‘foreseeing future ozone-holes’. In order to argue for these theses, I explain first of all the plurality of climate models used in climate science by the failure to avoid the problem of underdetermination. As a consequence, climate simulation results have to be interpreted as modal sentences, stating what is possibly true of our climate system. This indicates that climate policy decisions are decisions under uncertainty. Two general methodological principles which may guide the construction of the scenario range are formulated and contrasted with each other: modal inductivism and modal falsificationism. I argue that modal inductivism, being the methodology implicitly underlying the third IPCC report, is severely flawed. Modal falsificationism, representing the sound alternative, would in turn require an overhaul of the IPCC practice.
The view that quantum particles cannot be regarded as individuals was articulated in the early days of the 'quantum revolution' and became so well-entrenched that French and Krause called it 'the Received View'. However it was subsequently shown that quantum statistics is in fact compatible with a metaphysics of particle individuality, subject to certain caveats. As a consequence it has been claimed that there exists a kind of underdetermination of the metaphysics by the physics, which in turn has been used to motivate a form of 'ontic' structural realism. In this essay I will review this purported underdetermination and the motivation for structural realism that it purportedly provides in the context of recent developments in both the philosophy of physics and metaphysics. I conclude that such developments reinforce the underdetermination and allow one to respond to certain critical concerns regarding its motivational power.
I examine the argument that scientific theories are typically 'underdetermined' by the data, an argument which has often been used to combat scientific realism. I deal with two objections to the underdetermination argument: (i) that the argument conflicts with the holistic nature of confirmation, and (ii) that the argument rests on an untenable theory/data dualism. I discuss possible responses to both objections, and argue that in both cases the proponent of underdetermination can respond in ways which are individually plausible, but that the best response to the first objection conflicts with the best response to the second. Consequently underdetermination poses less of a problem for scientific realism than has often been thought.
If two theory formulations are merely different expressions of the same theory, then any problem of choosing between them cannot be due to the underdetermination of theories by data. So one might suspect that we need to be able to tell distinct theories from mere alternate formulations before we can say anything substantive about underdetermination, that we need to solve the problem of identical rivals before addressing the problem of underdetermination. Here I consider two possible solutions: Quine proposes that we call two theories identical if they are equivalent under a reconstrual of predicates, but this would mishandle important cases. Another proposal is to defer to the particular judgements of actual scientists. Consideration of an historical episode, the alleged equivalence of wave and matrix mechanics, shows that this second proposal also fails. Nevertheless, I suggest, the original suspicion is wrong; there are ways to enquire into underdetermination without having solved the problem of identical rivals.
The goal of this article is to show that the structuralist approach provides a powerful framework for the analysis of certain holistic phenomena in empirical theories. We focus on two aspects of holism. The first refers to the involvement of comprehensive complexes of hypotheses in the theoretical treatment of systems regarded in isolation. By contrast, the second refers to the correlation between the theoretical descriptions of different systems. It is demonstrated how these two aspects can be analysed by making use of the structuralist notion of theory-nets, and how they are reflected by a refined version of the Ramsey sentence. Furthermore, it is argued that there exists a tight correlation between the occurrence of these two holistic phenomena, a specific form of underdetermination of terms which occur in the fundamental principles of an empirical theory, and the shaping of the theory's protective belt. After having dealt with these questions in abstracto, the relevance of these considerations for a better understanding of the dynamics of empirical theories is demonstrated in a concrete case study. It refers to the role holistic phenomena played in the investigation of the anomalous advance of Mercury's perihelion and in the various attempts to eliminate this anomaly.
I identify a controversial hypothesis in evolutionary biology called the plasticity-first hypothesis. I argue that the plasticity-first hypothesis is underdetermined and that the most popular means of studying the plasticity-first hypothesis are insufficient to confirm or disconfirm it. I offer a strategy for overcoming this problem. Researchers need to develop a richer middle range theory of plasticity-first evolution that allows them to identify distinctive empirical traces of the hypothesis. They can then use those traces to discriminate between rival explanations of evolutionary episodes. The best tools for developing such a middle range theory are experimental evolution and formal modelling.
I discuss how modern cosmology illustrates underdetermination of theoretical hypotheses by data, in ways that are different from most philosophical discussions. I confine the discussion to the history of the observable universe from about one second after the Big Bang, as described by the mainstream cosmological model: in effect, what cosmologists in the early 1970s dubbed the ‘standard model’, as elaborated since then. Or rather, the discussion is confined to a few aspects of that history. I emphasize that despite the underdetermination, a scientific realist can, and should, endorse this description.
The underdetermination of theory by evidence is supposed to be a reason to rethink science. It is not. Many authors claim that underdetermination has momentous consequences for the status of scientific claims, but such claims are hidden in an umbra of obscurity and a penumbra of equivocation. So many various phenomena pass for 'underdetermination' that it's tempting to think that it is no unified phenomenon at all, so I begin by providing a framework within which all these worries can be seen as species of one genus: A claim of underdetermination involves (at least implicitly) a set of rival theories, a standard of responsible judgment, and a scope of circumstances in which responsible choice between the rivals is impossible. Within this framework, I show that one variety of underdetermination motivated modern scepticism and thus is a familiar problem at the heart of epistemology. I survey arguments that infer from underdetermination to some reëvaluation of science: top-down arguments infer a priori from the ubiquity of underdetermination to some conclusion about science; bottom-up arguments infer from specific instances of underdetermination, to the claim that underdetermination is widespread, and then to some conclusion about science. The top-down arguments either fail to deliver underdetermination of any great significance or (as with modern scepticism) deliver some well-worn epistemic concern. The bottom-up arguments must rely on cases. I consider several promising cases and find them to either be so specialized that they cannot underwrite conclusions about science in general or not be underdetermined at all. Neither top-down nor bottom-up arguments can motivate any deep reconsideration of science.
The underdetermination of theory by data argument (UD) is traditionally construed as an argument that tells us that we ought to favour an anti-realist position over a realist position. I argue that when UD is construed as an argument saying that theory choice is to proceed between theories that are empirically equivalent and adequate to the phenomena up until now, the argument will not favour constructive empiricism over realism. A constructive empiricist cannot account for why scientists are reasonable in expecting one theory to be empirically adequate rather than another, given the criteria he suggests for theory choice.
Consciousness scientists have not reached consensus on two of the most central questions in their field: first, on whether consciousness overflows reportability; second, on the physical basis of consciousness. I review the scientific literature of the 19th century to provide evidence that disagreement on these questions has been a feature of the scientific study of consciousness for a long time. Based on this historical review, I hypothesize that a unifying explanation of disagreement on these questions, up to this day, is that scientific theories of consciousness are underdetermined by the evidence, namely, that they can be preserved “come what may” in front of (seemingly) disconfirming evidence. Consciousness scientists may have to find a way of solving the persistent underdetermination of theories of consciousness to make further progress.
This paper examines the epistemological significance of the present situation of underdetermination in quantum mechanics. After analyzing this underdetermination at three levels---formal, ontological, and methodological---the paper considers implications for a number of variants of the thesis of scientific realism in fundamental physics and reassesses Lakatos's characterization of progress in physical theory in light of the present situation. Next, this paper considers the implications of underdetermination for Weinberg's ‘dream of a final theory’. Finally, the paper concludes by suggesting how one might still think of realism and progress in fundamental physics despite the possibility of persistent underdetermination in quantum mechanics.
Metaphysical underdetermination arises when we are not able to decide, through purely theoretical criteria, between competing interpretations of scientific theories with different metaphysical commitments. This is the case in which non-relativistic quantum mechanics (QM) finds itself. Among several available interpretations, there is the one that states that the interaction with the conscious mind of a human observer causes a change in the dynamics of quantum objects, which pass from indefinite to definite states. In this paper, we argue that there seems to be also a metaphysical underdetermination concerning London and Bauer’s theory of measurement between two methods of phenomenological reduction: the eidetic and the transcendental approaches. Recently, Steven French argued that both methods can be combined in order to interpret London and Bauer’s formalism. However, in this paper we argue that the eidetic one is the only viable phenomenological way to interpret this particular theory of measurement in QM based on the formalism presented by London and Bauer, hence breaking this phenomenological underdetermination.
This paper criticizes the attempt to found the epistemological doctrine that all theories are evidentially underdetermined on the thesis that all theories have empirically equivalent rivals. The criticisms focus on the role of auxiliary hypotheses in prediction. It is argued, in particular, that if auxiliaries are underdetermined, then the thesis of empirical equivalence is undecidable. The inference from empirical equivalence to the underdetermination of total theories would seem to survive the criticisms, because total theories do not require auxiliaries to yield observational consequences. It is shown that, nevertheless, underdetermination cannot be established for total theories.
Linguistic meaning underdetermines what is said. This has consequences for philosophical accounts of meaning, communication, and propositional attitude reports. I argue that the consequence we should endorse is that utterances typically express many propositions, that these are what speakers mean, and that the correct semantics for attitude reports will handle this fact while being relational and propositional.
What attitude should we take toward a scientific theory when it competes with other scientific theories? This question elicited different answers from instrumentalists, logical positivists, constructive empiricists, scientific realists, holists, theory-ladenists, antidivisionists, falsificationists, and anarchists in the philosophy of science literature. I will summarize the diverse philosophical responses to the problem of underdetermination, and argue that there are different kinds of underdetermination, and that they should be kept apart from each other because they call for different responses.
Where there are cases of underdetermination in scientific controversies, such as the case of the molecular clock, scientists may direct the course and terms of dispute by playing off the multidimensional framework of theory evaluation. This is because assessment strategies themselves are underdetermined. Within the framework of assessment, there are a variety of trade-offs between different strategies as well as shifting emphases as specific strategies are given more or less weight in assessment situations. When a strategy is underdetermined, scientists can change the dynamics of a controversy by making assessments using different combinations of evaluation strategies and/or weighting whatever strategies are in play in different ways. Following an underdetermination strategy does not end or resolve a scientific dispute. Consequently, manipulating underdetermination is a feature of controversy dynamics and not controversy closure.
Advocates of the "strong programme" in the sociology of knowledge have argued that, because scientific theories are "underdetermined" by data, sociological factors must be invoked to explain why scientists believe the theories they do. I examine this argument, and the responses to it by J.R. Brown (1989) and L. Laudan (1996). I distinguish between a number of different versions of the underdetermination thesis, some trivial, some substantive. I show that Brown's and Laudan's attempts to refute the sociologists' argument fail. Nonetheless, the sociologists' argument falls to a different criticism, for the version of the underdetermination thesis that the argument requires has not been shown to be true.
The old antagonism between the Quinean and the Duhemian view on underdetermination is reexamined. In this respect, two theses will be defended. First, it is argued that the main differences between Quine's and Duhem's versions of underdetermination derive from a different attitude towards the history of science. While Quine considered underdetermination from an ahistorical, logical point of view, Duhem approached it as a distinguished historian of physics. On this basis, a logical and a historical version of the underdetermination thesis can be distinguished. The second thesis of the article is that the main objections against underdetermination are fatal only to the logical rendering. Taken together, the two theses constitute a defence of underdetermination.
A number of philosophers have attempted to solve the problem of null-probability possible events in Bayesian epistemology by proposing that there are infinitesimal probabilities. Hájek and Easwaran have argued that because there is no way to specify a particular hyperreal extension of the real numbers, solutions to the regularity problem involving infinitesimals, or at least hyperreal infinitesimals, involve an unsatisfactory ineffability or arbitrariness. The arguments depend on the alleged impossibility of picking out a particular hyperreal extension of the real numbers and/or of a particular value within such an extension due to the use of the Axiom of Choice. However, it is false that the Axiom of Choice precludes a specification of a hyperreal extension—such an extension can indeed be specified. Moreover, for all we know, it is possible to explicitly specify particular infinitesimals within such an extension. Nonetheless, I prove that because any regular probability measure that has infinitesimal values can be replaced by one that has all the same intuitive features but other infinitesimal values, the heart of the arbitrariness objection remains.
It is often possible to know what a speaker intends to communicate without knowing what they intend to say. In such cases, speakers need not intend to say anything at all. Stanley and Szabó's influential survey of possible analyses of quantifier domain restriction is, therefore, incomplete and the arguments made by Clapp and Buchanan against Truth Conditional Compositionality and propositional speaker-meaning are flawed. Two theories should not always be viewed as incompatible when they associate the same utterance with different propositions, as there may be many ways to interpret speakers that are compatible with their intentions.
Kyle Stanford’s arguments against scientific realism are assessed, with a focus on the underdetermination of theory by evidence. I argue that discussions of underdetermination have neglected a possible symmetry which may ameliorate the situation.
Fallibilism about knowledge and justification is a widely held view in epistemology. In this paper, I will try to arrive at a proper formulation of fallibilism. Fallibilists often hold that Cartesian skepticism is a view that deserves to be taken seriously and dealt with somehow. I argue that it turns out that a canonical form of skeptical argument depends upon the denial of fallibilism. I conclude by considering a response on behalf of the skeptic.
According to the thesis of semantic underdetermination, most sentences of a natural language lack a definite semantic interpretation. This thesis supports an argument against the use of natural language as an instrument of thought, based on the premise that cognition requires a semantically precise and compositional instrument. In this paper we examine several ways to construe this argument, as well as possible ways out for the cognitive view of natural language in the introspectivist version defended by Carruthers. Finally, we sketch a view of the role of language in thought as a specialized tool, showing how it avoids the consequences of semantic underdetermination.
The truth-conditions of utterances are often underdetermined by the meaning of the sentence uttered, as suggested by the observation that the same sentence has different intuitive truth-values in different contexts. The intuitive difference is usually explained by assigning different truth-conditions to different utterances. This paper poses a problem for explanations of this kind: These truth-conditions, if they exist, are epistemically inaccessible. I suggest instead that truth-conditional underdetermination is ineliminable and these utterances have no truth-conditions. Intuitive truth-values are explained by the effect that all the most reasonable interpretations have on the common ground: An utterance is intuitively true when it is true on all interpretations that answer the question under discussion.