Probabilistic models have much to offer to philosophy. We continually receive information from a variety of sources: from our senses, from witnesses, from scientific instruments. When considering whether we should believe this information, we assess whether the sources are independent, how reliable they are, and how plausible and coherent the information is. Bovens and Hartmann provide a systematic Bayesian account of these features of reasoning. Simple Bayesian Networks allow us to model alternative assumptions about the nature of the information sources. Measurement of the coherence of information is a controversial matter: arguably, the more coherent a set of information is, the more confident we may be that its content is true, other things being equal. The authors offer a new treatment of coherence which respects this claim and shows its relevance to scientific theory choice. Bovens and Hartmann apply this methodology to a wide range of much discussed issues regarding evidence, testimony, scientific theories, and voting. Bayesian Epistemology is an essential tool for anyone working on probabilistic methods in philosophy, and has broad implications for many other disciplines.
Jan Sprenger and Stephan Hartmann offer a fresh approach to central topics in philosophy of science, including causation, explanation, evidence, and scientific models. Their Bayesian approach uses the concept of degrees of belief to explain and to elucidate manifold aspects of scientific reasoning.
Models are of central importance in many scientific contexts. The centrality of models such as the billiard ball model of a gas, the Bohr model of the atom, the MIT bag model of the nucleon, the Gaussian-chain model of a polymer, the Lorenz model of the atmosphere, the Lotka-Volterra model of predator-prey interaction, the double helix model of DNA, agent-based and evolutionary models in the social sciences, or general equilibrium models of markets in their respective domains is a case in point. Scientists spend a great deal of time building, testing, comparing and revising models, and much journal space is dedicated to introducing, applying and interpreting these valuable tools. In short, models are one of the principal instruments of modern science.
Some naturalistic philosophers of mind subscribing to the predictive processing theory of mind have adopted a realist attitude towards the results of Bayesian cognitive science. In this paper, we argue that this realist attitude is unwarranted. The Bayesian research program in cognitive science does not possess special epistemic virtues over alternative approaches for explaining mental phenomena involving uncertainty. In particular, the Bayesian approach is not simpler, more unifying, or more rational than alternatives. It is also contentious that the Bayesian approach is overall better supported by the empirical evidence. So, to develop philosophical theories of mind on the basis of a realist interpretation of results from Bayesian cognitive science is unwarranted. Naturalistic philosophers of mind should instead adopt an anti-realist attitude towards these results and remain agnostic as to whether Bayesian models are true. An exclusive focus on, and praise of, Bayes within debates about the predictive processing theory will impede progress in philosophical understanding of scientific practice in computational cognitive science as well as of the architecture of the mind.
We reconsider the Nagelian theory of reduction and argue that, contrary to a widely held view, it is the right analysis of intertheoretic reduction. The alleged difficulties of the theory either vanish upon closer inspection or turn out to be substantive philosophical questions rather than knock-down arguments.
It is often claimed that the greatest value of the Bayesian framework in cognitive science consists in its unifying power. Several Bayesian cognitive scientists assume that unification is obviously linked to explanatory power. But this link is not obvious, as unification in science is a heterogeneous notion, which may have little to do with explanation. While a crucial feature of most adequate explanations in cognitive science is that they reveal aspects of the causal mechanism that produces the phenomenon to be explained, the kind of unification afforded by the Bayesian framework to cognitive science does not necessarily reveal aspects of a mechanism. Bayesian unification, nonetheless, can place fruitful constraints on causal–mechanical explanation. 1 Introduction; 2 What a Great Many Phenomena Bayesian Decision Theory Can Model; 3 The Case of Information Integration; 4 How Do Bayesian Models Unify?; 5 Bayesian Unification: What Constraints Are There on Mechanistic Explanation?; 5.1 Unification constrains mechanism discovery; 5.2 Unification constrains the identification of relevant mechanistic factors; 5.3 Unification constrains confirmation of competitive mechanistic models; 6 Conclusion; Appendix.
Toy models are highly idealized and extremely simple models. Although they are omnipresent across scientific disciplines, toy models are a surprisingly under-appreciated subject in the philosophy of science. The main philosophical puzzle regarding toy models concerns what the epistemic goal of toy modelling is. One promising proposal for answering this question is the claim that the epistemic goal of toy models is to provide individual scientists with understanding. The aim of this article is to precisely articulate and to defend this claim. In particular, we will distinguish between autonomous and embedded toy models, and then argue that important examples of autonomous toy models are sometimes best interpreted to provide how-possibly understanding, while embedded toy models yield how-actually understanding, if certain conditions are satisfied. 1 Introduction; 2 Embedded and Autonomous Toy Models; 2.1 Embedded toy models; 2.2 Autonomous toy models; 2.3 Qualification; 3 A Theory of Understanding for Toy Models; 3.1 Preliminaries and requirements; 3.2 The refined simple view; 4 Two Kinds of Understanding with Toy Models; 4.1 Embedded toy models and how-actually understanding; 4.2 Against a how-actually interpretation of all autonomous toy models; 4.3 The how-possibly interpretation of some autonomous toy models; 5 Conclusion.
Scientific theories are hard to find, and once scientists have found a theory, H, they often believe that there are not many distinct alternatives to H. But is this belief justified? What should scientists believe about the number of alternatives to H, and how should they change these beliefs in the light of new evidence? These are some of the questions that we will address in this article. We also ask under which conditions failure to find an alternative to H confirms the theory in question. This kind of reasoning is frequently used in science and therefore deserves a careful philosophical analysis. 1 Introduction; 2 The Conceptual Framework; 3 The No Alternatives Argument; 4 Discussion I: A Quantitative Analysis of the No Alternatives Argument; 5 Discussion II: The Number of Alternatives and the Problem of Underdetermination; 6 Conclusions; Appendix A; Appendix B.
Toy models are highly idealized and extremely simple models. Although they are omnipresent across scientific disciplines, toy models are a surprisingly under-appreciated subject in the philosophy of science. The main philosophical puzzle regarding toy models is the unsettled question of what the epistemic goal of toy modeling is. One promising proposal for answering this question is the claim that the epistemic goal of toy models is to provide individual scientists with understanding. The aim of this paper is to precisely articulate and to defend this claim. In particular, we will distinguish between autonomous and embedded toy models, and then argue that important examples of autonomous toy models are sometimes best interpreted to provide how-possibly understanding, while embedded toy models yield how-actually understanding, if certain conditions are satisfied.
Simulation techniques, especially those implemented on a computer, are frequently employed in natural as well as in social sciences with considerable success. There is mounting evidence that the "model-building era" (J. Niehans) that dominated the theoretical activities of the sciences for a long time is about to be succeeded or at least lastingly supplemented by the "simulation era". But what exactly are models? What is a simulation and what is the difference and the relation between a model and a simulation? These are some of the questions addressed in this article. I maintain that the most significant feature of a simulation is that it allows scientists to imitate one process by another process. "Process" here refers solely to a temporal sequence of states of a system. Given the observation that processes are dealt with by all sorts of scientists, it is apparent that simulations prove to be a powerful tool acknowledged across disciplines. Accordingly, simulations are well suited to investigating the various research strategies in different sciences more closely. To this end, I focus on the function of simulations in the research process. Finally, a somewhat detailed case-study from nuclear physics is presented which, in my view, illustrates elements of a typical simulation in physics.
According to the Bayesian paradigm in the psychology of reasoning, the norms by which everyday human cognition is best evaluated are probabilistic rather than logical in character. Recently, the Bayesian paradigm has been applied to the domain of argumentation, where the fundamental norms are traditionally assumed to be logical. Here, we present a major generalisation of extant Bayesian approaches to argumentation that utilizes a new class of Bayesian learning methods that are better suited to modelling dynamic and conditional inferences than standard Bayesian conditionalization, is able to characterise the special value of logically valid argument schemes in uncertain reasoning contexts, greatly extends the range of inferences and argumentative phenomena that can be adequately described in a Bayesian framework, and undermines some influential theoretical motivations for dual function models of human cognition. We conclude that the probabilistic norms given by the Bayesian approach to rationality are not necessarily at odds with the norms given by classical logic. Rather, the Bayesian theory of argumentation can be seen as justifying and enriching the argumentative norms of classical logic.
Bayesianism is our leading theory of uncertainty. Epistemology is defined as the theory of knowledge. So “Bayesian Epistemology” may sound like an oxymoron. Bayesianism, after all, studies the properties and dynamics of degrees of belief, understood to be probabilities. Traditional epistemology, on the other hand, places the singularly non-probabilistic notion of knowledge at centre stage, and to the extent that it traffics in belief, that notion does not come in degrees. So how can there be a Bayesian epistemology?
Bayesian epistemology addresses epistemological problems with the help of the mathematical theory of probability. It turns out that the probability calculus is especially suited to represent degrees of belief (credences) and to deal with questions of belief change, confirmation, evidence, justification, and coherence. Compared to the informal discussions in traditional epistemology, Bayesian epistemology allows for a more precise and fine-grained analysis which takes the gradual aspects of these central epistemological notions into account. Bayesian epistemology therefore complements traditional epistemology; it does not replace it or aim at replacing it.
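For readers less familiar with the formal machinery, the two norms at the heart of the approach are the probability axioms and conditionalization; the following is a standard textbook statement, not specific to any one of the works listed here:

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)}, \qquad P_{\text{new}}(H) \;=\; P_{\text{old}}(H \mid E),
\]

where E is the evidence the agent has just learned. Confirmation, justification and coherence are then analysed in terms of such probability assignments and their dynamics.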
We present a Bayesian analysis of the epistemology of analogue experiments with particular reference to Hawking radiation. Provided such experiments can be externally validated via universality arguments, we prove that they are confirmatory in Bayesian terms. We then provide a formal model for the scaling behaviour of the confirmation measure for multiple distinct realisations of the analogue system and isolate a generic saturation feature. Finally, we demonstrate that different potential analogue realisations could provide different levels of confirmation. Our results thus provide a basis both to formalise the epistemic value of analogue experiments that have been conducted and to advise scientists as to the respective epistemic value of future analogue experiments.
A coherent story is a story that fits together well. This notion plays a central role in the coherence theory of justification and has been proposed as a criterion for scientific theory choice. Many attempts have been made to give a probabilistic account of this notion. A proper account of coherence must not start from some partial intuitions, but should pay attention to the role that this notion is supposed to play within a particular context. Coherence is a property of an information set that boosts our confidence that its content is true ceteris paribus when we receive information from independent and partially reliable sources. We construct a measure c_r that relies on hypothetical sources with certain idealized characteristics. A maximally coherent information set, i.e. a set with equivalent propositions, affords a maximal confidence boost. c_r is the ratio of the actual confidence boost over the confidence boost that we would have received, had the information been presented in the form of maximally coherent information, ceteris paribus. This measure is functionally dependent on the degree of reliability r of the sources. We use c_r to construct a coherence quasi-ordering over information sets S and S′: S is no less coherent than S′ just in case c_r(S) is not smaller than c_r(S′) for any value of the reliability parameter r. We show that, on our account, the coherence of the story about the world gives us a reason to believe that the story is true and that the coherence of a scientific theory, construed as a set of models, is a proper criterion for theory choice.
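Schematically, and only as a gloss on the ratio construction described in the abstract (the paper supplies the precise definition in terms of the hypothetical, partially reliable sources), the measure has the form

\[
c_r(S) \;=\; \frac{\text{confidence boost actually afforded by } S}{\text{confidence boost a maximally coherent counterpart of } S \text{ would afford}},
\]

and the induced quasi-ordering is: S is no less coherent than S′ just in case c_r(S) ≥ c_r(S′) for every value of the reliability parameter r in (0, 1).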
Effective field theories have been a very popular tool in quantum physics for almost two decades. And there are good reasons for this. I will argue that effective field theories share many of the advantages of both fundamental theories and phenomenological models, while avoiding their respective shortcomings. They are, for example, flexible enough to cover a wide range of phenomena, and concrete enough to provide a detailed story of the specific mechanisms at work at a given energy scale. So will all of physics eventually converge on effective field theories? This paper argues that good scientific research can be characterised by a fruitful interaction between fundamental theories, phenomenological models and effective field theories. All of them have their appropriate functions in the research process, and all of them are indispensable. They complement each other and hang together in a coherent way which I shall characterise in some detail. To illustrate all this I will present a case study from nuclear and particle physics. The resulting view about scientific theorising is inherently pluralistic, and has implications for the debates about reductionism and scientific explanation.
Bayes nets are a powerful tool for researchers in statistics and artificial intelligence. This chapter demonstrates that they are also of much use for philosophers and psychologists interested in (Bayesian) rationality. To do so, we outline the general methodology of Bayes nets modeling in rationality research and illustrate it with several examples from the philosophy and psychology of reasoning and argumentation. Along the way, we discuss the normative foundations of Bayes nets modeling and address some of the methodological problems it raises.
We argue that social deliberation may increase an agent’s confidence and credence under certain circumstances. An agent considers a proposition H and assigns a probability to it. However, she is not fully confident that she herself is reliable in this assignment. She then endorses H during deliberation with another person, expecting him to raise serious objections. To her surprise, however, the other person does not raise any objections to H. How should her attitudes toward H change? It seems plausible that she should increase the credence she assigns to H and, at the same time, increase the reliability she assigns to herself concerning H. A Bayesian model helps us to investigate under what conditions, if any, this is rational.
Fundamental theories are hard to come by. But even if we had them, they would be too complicated to apply. Quantum chromodynamics is a case in point. This theory is supposed to govern all strong interactions, but it is extremely hard to apply and test at energies where protons, neutrons and pions are the effective degrees of freedom. Instead, scientists typically use highly idealized models such as the MIT Bag Model or the Nambu-Jona-Lasinio Model to account for phenomena in this domain, to explain them and to gain understanding. Based on these models, which typically isolate a single feature of QCD and disregard many others, scientists attempt to get a better understanding of the physics of strong interactions. But does this practice make sense? Is it justified to use these models for the purposes at hand? Interestingly, these models do not even describe the mass spectrum of protons, neutrons and pions and their lowest-lying excitations well, despite several adjustable parameters. And yet, the models are heavily used. I'll argue that a qualitative story, which establishes an explanatory link between the fundamental theory and a model, plays an important role in model acceptance in these cases.
A widely shared view in the cognitive sciences is that discovering and assessing explanations of cognitive phenomena whose production involves uncertainty should be done in a Bayesian framework. One assumption supporting this modelling choice is that Bayes provides the best approach for representing uncertainty. However, it is unclear that Bayes possesses special epistemic virtues over alternative modelling frameworks, since a systematic comparison has yet to be attempted. It is therefore currently premature to assert that cognitive phenomena involving uncertainty are best explained within the Bayesian framework. As a forewarning, progress in cognitive science may be hindered if too many scientists continue to focus their efforts on Bayesian modelling, which risks monopolizing scientific resources that may be better allocated to alternative approaches.
In this article, we address a major outstanding question of probabilistic Bayesian epistemology: how should a rational Bayesian agent update their beliefs upon learning an indicative conditional? A number of authors have recently contended that this question is fundamentally underdetermined by Bayesian norms, and hence that there is no single update procedure that rational agents are obliged to follow upon learning an indicative conditional. Here we resist this trend and argue that a core set of widely accepted Bayesian norms is sufficient to identify a normatively privileged updating procedure for this kind of learning. Along the way, we justify a privileged formalization of the notion of ‘epistemic conservativity’, offer a new analysis of the Judy Benjamin problem, and emphasize the distinction between interpreting the content of new evidence and updating one’s beliefs on the basis of that content.
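To fix ideas about what "learning an indicative conditional" amounts to formally (this is a generic formulation found in this literature, not necessarily the privileged procedure the article defends): an agent with prior P who learns "if A then B" is often modelled as adopting a posterior P* that satisfies a constraint such as P*(B | A) = 1, or some high threshold, while otherwise departing minimally from P, for instance by minimizing the Kullback-Leibler divergence:

\[
P^{*} \;=\; \operatorname*{arg\,min}_{Q:\; Q(B \mid A) = 1} \;\sum_{\omega} Q(\omega)\,\log\frac{Q(\omega)}{P(\omega)}.
\]

Different ways of spelling out "epistemic conservativity" correspond to different choices of the distance measure and of the constraint.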
Theoretical models are an important tool for many aspects of scientific activity. They are used, inter alia, to structure data, to apply theories or even to construct new theories. But what exactly is a model? It turns out that there is no proper definition of the term "model" that covers all these aspects. Thus, I restrict myself here to evaluating the function of models in the research process while using "model" in the loose way physicists do. To this end, I distinguish four kinds of models. These are (1) models as special theories, (2) models as a substitute for a theory, (3) toy models and (4) developmental models. I argue that models of types (3) and (4) are particularly useful in the process of theory construction. This will be demonstrated in an extended case-study from High-Energy Physics.
Various scientific theories stand in a reductive relation to each other. In a recent article, we have argued that a generalized version of the Nagel-Schaffner model (GNS) is the right account of this relation. In this article, we present a Bayesian analysis of how GNS impacts on confirmation. We formalize the relation between the reducing and the reduced theory before and after the reduction using Bayesian networks, and thereby show that, post-reduction, the two theories are confirmatory of each other. We then ask when a purported reduction should be accepted on epistemic grounds. To do so, we compare the prior and posterior probabilities of the conjunction of both theories before and after the reduction and ask how well each is confirmed by the available evidence.
In this discussion note, we explain how to relax some of the standard assumptions made in Garber-style solutions to the Problem of Old Evidence. The result is a more general and explanatory Bayesian approach.
Life-science phenomena are often explained by specifying the mechanisms that bring them about. The new mechanistic philosophers have done much to substantiate this claim and to provide us with a better understanding of what mechanisms are and how they explain. Although there is disagreement among current mechanists on various issues, they share a common core position and a seeming commitment to some form of scientific realism. But is such a commitment necessary? Is it the best way to go about mechanistic explanation? In this article, we propose an alternative antirealist account that also fits explanatory practice in the life sciences. We pay special attention to mechanistic models, i.e. scientific models that involve a mechanism, and to the role of coherence considerations in building such models. To illustrate our points, we consider the mechanism for the action potential. 1 Introduction; 2 Some Core Features of Mechanistic Explanation; 3 Scientific Realism and Mechanistic Explanation; 4 Antirealist Mechanistic Explanation: The Case of the Action Potential; 5 Some Outstanding Issues for the Antirealist Mechanist; 6 Two Problems for the Realist Mechanist; 7 Conclusions.
The aggregation of consistent individual judgments on logically interconnected propositions into a collective judgment on those propositions has recently drawn much attention. Seemingly reasonable aggregation procedures, such as propositionwise majority voting, cannot ensure an equally consistent collective conclusion. The literature on judgment aggregation refers to that problem as the discursive dilemma. In this paper, we argue that many groups want not only to reach a factually right conclusion, but also to correctly evaluate the reasons for that conclusion. In other words, we address the problem of tracking the true situation instead of merely selecting the right outcome. We set up a probabilistic model analogous to Bovens and Rabinowicz (2006) and compare several aggregation procedures by means of theoretical results, numerical simulations and practical considerations. Among them are the premise-based, the situation-based and the distance-based procedure. Our findings confirm the conjecture in Hartmann, Pigozzi and Sprenger (2008) that the premise-based procedure is a crude, but reliable and sometimes even optimal form of judgment aggregation.
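A minimal sketch of the discursive dilemma and of the premise-based procedure, using the standard three-judge textbook profile rather than anything taken from the paper itself:

```python
# Toy illustration of the discursive dilemma (classic doctrinal-paradox profile)
# and of the premise-based procedure. The profile is the standard textbook
# example, not data from the paper.

from collections import Counter

# Each judgment set is (P, Q, C), where each judge accepts the conclusion C iff P and Q.
judges = [
    (True,  True,  True),   # judge 1
    (True,  False, False),  # judge 2
    (False, True,  False),  # judge 3
]

def majority(values):
    """Return the verdict accepted by more than half of the voters."""
    return Counter(values).most_common(1)[0][0]

# Propositionwise majority voting.
maj_P = majority([p for p, q, c in judges])
maj_Q = majority([q for p, q, c in judges])
maj_C = majority([c for p, q, c in judges])
print("Majority on P, Q, C:", maj_P, maj_Q, maj_C)              # True True False
print("Collectively consistent?", maj_C == (maj_P and maj_Q))   # False: the dilemma

# Premise-based procedure: vote only on the premises, then derive the conclusion.
print("Premise-based conclusion:", maj_P and maj_Q)             # True
```

Propositionwise majority voting accepts both premises but rejects the conclusion, which is inconsistent with the rule C = P-and-Q; the premise-based procedure restores consistency by letting the aggregated premises settle the conclusion.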
According to an argument by Colin Howson, the no-miracles argument (NMA) commits the base-rate fallacy and is therefore bound to fail. We demonstrate that Howson’s argument only applies to one of two versions of the NMA. The other version, which resembles the form in which the argument was initially presented by Putnam and Boyd, remains unaffected by his line of reasoning. We provide a formal reconstruction of that version of the NMA and show that it is valid. Finally, we demonstrate that the use of subjective priors is consistent with the realist implication of the NMA and show that a core worry with respect to the suggested form of the NMA can be dispelled.
We appeal to the theory of Bayesian Networks to model different strategies for obtaining confirmation for a hypothesis from experimental test results provided by less than fully reliable instruments. In particular, we consider (i) repeated measurements of a single test consequence of the hypothesis, (ii) measurements of multiple test consequences of the hypothesis, (iii) theoretical support for the reliability of the instrument, and (iv) calibration procedures. We evaluate these strategies on their relative merits under idealized conditions and show some surprising repercussions on the variety-of-evidence thesis and the Duhem-Quine thesis.
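As a toy illustration of the kind of calculation such Bayesian networks support (a minimal sketch with made-up numbers, not the models analysed in the chapter): a hypothesis H has a test consequence C, and an instrument reports on C; with probability rho the instrument is reliable and reports C truthfully, otherwise it returns a positive reading with probability a.

```python
# Toy, hand-rolled calculation in the spirit of partially reliable instruments:
# hypothesis H, test consequence C, report R. All numbers are illustrative.

from itertools import product

p_H    = 0.3   # prior probability of the hypothesis
p_C_H  = 0.9   # P(C | H): how strongly H predicts the test consequence
p_C_nH = 0.2   # P(C | not-H)
rho    = 0.7   # probability that the instrument is reliable (reports C truthfully)
a      = 0.5   # if unreliable, the instrument reports "positive" with probability a

def p_report_positive(c):
    """P(R = positive | C = c), mixing a reliable and a randomizing instrument."""
    return rho * (1.0 if c else 0.0) + (1 - rho) * a

# Enumerate the joint distribution over (H, C), accumulating P(R=pos) and P(R=pos, H).
joint_pos, joint_pos_and_H = 0.0, 0.0
for h, c in product([True, False], repeat=2):
    p = (p_H if h else 1 - p_H) * ((p_C_H if c else 1 - p_C_H) if h
                                   else (p_C_nH if c else 1 - p_C_nH))
    p *= p_report_positive(c)
    joint_pos += p
    if h:
        joint_pos_and_H += p

print("P(H) prior:", p_H)
print("P(H | positive report):", joint_pos_and_H / joint_pos)
```

With these illustrative numbers a positive report raises the probability of H from 0.30 to roughly 0.54; repeated reports, multiple test consequences, or independent support for the reliability parameter can then be compared within the same scheme.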
Conditionals and conditional reasoning have been a long-standing focus of research across a number of disciplines, ranging from psychology through linguistics to philosophy. But almost no work has concerned itself with the question of how hearing or reading a conditional changes our beliefs. Given that we acquire much—perhaps most—of what we believe through the testimony of others, the simple matter of acquiring conditionals via others’ assertion of a conditional seems integral to any full understanding of the conditional and conditional reasoning. In this paper we detail a number of basic intuitions about how beliefs might change in response to a conditional being uttered, and show how these are backed by behavioral data. In the remainder of the paper, we then show how these deceptively simple phenomena pose a fundamental challenge to present theoretical accounts of the conditional and conditional reasoning – a challenge which no account presently fully meets.
Bayesian Coherence Theory of Justification or, for short, Bayesian Coherentism, is characterized by two theses, viz. (i) that our degree of confidence in the content of a set of propositions is positively affected by the coherence of the set, and (ii) that coherence can be characterized in probabilistic terms. There has been a longstanding question of how to construct a measure of coherence. We will show that Bayesian Coherentism cannot rest on a single measure of coherence, but requires a vector whose components exhaustively characterize the coherence properties of the set. Our degree of confidence in the content of the information set is a function of the reliability of the sources and the components of the coherence vector. The components of this coherence vector are weakly but not strongly separable, which blocks the construction of a single coherence measure.
This paper focuses on the question of how to resolve disagreement and uses the Lehrer-Wagner model as a formal tool for investigating consensual decision-making. The main result consists in a general definition of when agents treat each other as epistemic peers (Kelly 2005; Elga 2007), and a theorem vindicating the “equal weight view” to resolve disagreement among epistemic peers. We apply our findings to an analysis of the impact of social network structures on group deliberation processes, and we demonstrate their stability with the help of numerical simulations.
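For orientation, the Lehrer-Wagner model referred to above has agents repeatedly revise their credences by weighted averaging over the group, with weights expressing mutual respect; the following sketch uses illustrative weights and credences, not values from the paper:

```python
# Minimal sketch of the Lehrer-Wagner consensus model: each agent repeatedly
# revises her probability for H by taking a weighted average of everybody's
# current values. Weights and initial credences below are made up.

import numpy as np

# Row-stochastic weight matrix: W[i, j] is the weight agent i gives to agent j.
W = np.array([
    [0.4, 0.3, 0.3],
    [0.2, 0.5, 0.3],
    [0.3, 0.3, 0.4],
])

c = np.array([0.9, 0.4, 0.6])   # initial credences in the proposition H

for _ in range(50):             # iterate the revision rule c <- W c
    c = W @ c

print("Consensus credences:", c)  # all entries converge to a common value

# Equal-weight special case: every agent gives every agent weight 1/n, so the
# consensus is reached in one step and equals the straight average.
n = len(c)
W_equal = np.full((n, n), 1.0 / n)
print("Equal-weight consensus:", W_equal @ np.array([0.9, 0.4, 0.6]))
```

When every agent assigns every agent the weight 1/n, the consensus is simply the straight average of the initial credences, which is one natural formal reading of the "equal weight view" discussed in the paper.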
We provide a Bayesian justification of the idea that, under certain conditions, the absence of an argument in favour of the truth of a hypothesis H constitutes a good argument against the truth of H.
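The probabilistic core of such a justification can be stated in one line: if a good argument E for H is more to be expected when H is true than when it is false, then failing to find E lowers the probability of H (this is the generic Bayesian observation; the paper spells out the conditions under which it legitimately applies to the absence of arguments):

\[
P(E \mid H) > P(E \mid \neg H)
\;\;\Longrightarrow\;\;
P(\neg E \mid H) < P(\neg E \mid \neg H)
\;\;\Longrightarrow\;\;
P(H \mid \neg E) < P(H),
\]

provided 0 < P(H) < 1 and the relevant conditional probabilities are well defined.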
Scientific theories are used for a variety of purposes. For example, physical theories such as classical mechanics and electrodynamics have important applications in engineering and technology, and we trust that this results in useful machines, stable bridges, and the like. Similarly, theories such as quantum mechanics and relativity theory have many applications as well. Beyond that, these theories provide us with an understanding of the world and address fundamental questions about space, time, and matter. Here we trust that the answers scientific theories give are reliable and that we have good reason to believe that the features of the world are similar to what the theories say about them. But why do we trust scientific theories, and what counts as evidence in favor of them?
The problem of old evidence, first described by Glymour [1980], is still widely regarded as one of the most pressing foundational challenges to the Bayesian account of scientific reasoning. Many so...
If we receive information from multiple independent and partially reliable information sources, then whether we are justified to believe these information items is affected by how reliable the sources are, by how well the information coheres with our background beliefs and by how internally coherent the information is. We consider the following question. Is coherence a separable determinant of our degree of belief, i.e. is it the case that the more coherent the new information is, the more justified we are in believing the new information, ceteris paribus? We show that if we consider sets of information items of any size (Holism), and if we assume that there exists a coherence Ordering over such sets and that coherence is a function of the probability distribution over the propositions in such sets (Probabilism), then Separability fails to hold.
A descriptive norm is a behavioral rule that individuals follow when their empirical expectations of others following the same rule are met. We aim to provide an account of the emergence of descriptive norms by first looking at a simple case, that of the standing ovation. We examine the structure of a standing ovation, and show it can be generalized to describe the emergence of a wide range of descriptive norms.
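A toy threshold model in the spirit of the standing-ovation literature may help fix ideas; it is an illustrative sketch only, and the thresholds and update rule below are stand-ins rather than the authors' own account of empirical expectations:

```python
# Toy threshold model of a standing ovation: an agent stands up once the share
# of the audience already standing exceeds her personal threshold. This is an
# illustrative sketch, not the account of descriptive norms developed above.

import random

random.seed(1)
n = 100
thresholds = [random.random() for _ in range(n)]  # each agent's trigger point
standing = [t < 0.05 for t in thresholds]         # a few enthusiasts start it off

changed = True
while changed:
    share = sum(standing) / n
    changed = False
    for i in range(n):
        if not standing[i] and share >= thresholds[i]:
            standing[i] = True
            changed = True

print("Fraction standing at the end:", sum(standing) / n)
```

Whether the ovation spreads through the whole audience depends on the distribution of thresholds and on how many enthusiasts rise first, which is one simple way of seeing how behaviour conditioned on expectations about others can generate a population-wide regularity.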
On Correspondence. Stephan Hartmann - 2002 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 33 (1): 79-94.
This paper is an essay review of Steven French and Harmke Kamminga (eds.), Correspondence, Invariance and Heuristics. Essays in Honour of Heinz Post (Dordrecht: Kluwer, 1993). I distinguish a variety of correspondence relations between scientific theories (exemplified by cases from the book under review) and examine how one can make sense of the prevailing continuity in scientific theorizing.
Models are of central importance in many scientific contexts. The roles the MIT bag model of the nucleon, the billiard ball model of a gas, the Bohr model of the atom, the Gaussian-chain model of a polymer, the Lorenz model of the atmosphere, the Lotka-Volterra model of predator-prey interaction, agent-based and evolutionary models of social interaction, or general equilibrium models of markets play in their respective domains are cases in point.
This paper explores various functions of idealizations in quantum field theory. To this end it is important to first distinguish between different kinds of theories and models of or inspired by quantum field theory. Idealizations have pragmatic and cognitive functions. Analyzing a case-study from hadron physics, I demonstrate the virtues of studying highly idealized models for exploring the features of theories with an extremely rich structure such as quantum field theory and for gaining some understanding of the physical processes in the system under consideration.
Nancy Cartwright is one of the most distinguished and influential contemporary philosophers of science. Despite the profound impact of her work, there is neither a systematic exposition of Cartwright’s philosophy of science nor a collection of articles that contains in-depth discussions of the major themes of her philosophy. This book is devoted to a critical assessment of Cartwright’s philosophy of science and contains contributions from Cartwright's champions and critics. Broken into three parts, the book begins by addressing Cartwright's views on the practice of model building in science and the question of how models represent the world before moving on to a detailed discussion of methodologically and metaphysically challenging problems. Finally, the book addresses Cartwright's original attempts to clarify profound questions concerning the metaphysics of science. With contributions from leading scholars, such as Ronald N. Giere and Paul Teller, this unique volume will be extremely useful to philosophers of science the world over.
In a famous experiment by Tversky and Kahneman (Psychol Rev 90:293–315, 1983), featuring Linda the bank teller, the participants assign a higher probability to a conjunction of propositions than to one of the conjuncts, thereby seemingly committing a probabilistic fallacy. In this paper, we discuss a slightly different example featuring someone named Walter, who also happens to work at a bank, and argue that, in this example, it is rational to assign a higher probability to the conjunction of suitably chosen propositions than to one of the conjuncts. By pointing out the similarities between Tversky and Kahneman’s experiment and our example, we argue that the participants in the experiment may assign probabilities to the propositions in question in such a way that it is also rational for them to give the conjunction a higher probability than one of the conjuncts.
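For reference, the rule the participants appear to violate is the conjunction rule of probability theory: for any propositions A and B and any single probability function P,

\[
P(A \wedge B) \;\le\; P(A),
\]

so a higher probability for the conjunction counts as a fallacy only if both assignments concern exactly those propositions under one and the same probability function; the argument above is that, on a natural reading of what participants are actually assigning probabilities to, no violation of this rule need be involved.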
In ‘Corroborating Testimony, Probability and Surprise’, Erik J. Olsson ascribes to L. Jonathan Cohen the claims that if two witnesses provide us with the same information, then the less probable the information is, the more confident we may be that the information is true (C), and the stronger the information is corroborated (C*). We question whether Cohen intends anything like claims (C) and (C*). Furthermore, he discusses the concurrence of witness reports within a context of independent witnesses, whereas the witnesses in Olsson's model are not independent in the standard sense. We argue that there is much more than, in Olsson's words, ‘a grain of truth’ to claim (C), both on his own characterization as well as on Cohen's characterization of the witnesses. We present an analysis for independent witnesses in the contexts of decision-making under risk and decision-making under uncertainty and generalize the model for n witnesses. As to claim (C*), Olsson's argument is contingent on the choice of a particular measure of corroboration and is not robust in the face of alternative measures. Finally, we delimit the set of cases to which Olsson's model is applicable. 1 Claim (C) examined for Olsson's characterization of the relationship between the witnesses; 2 Claim (C) examined for two or more independent witnesses; 3 Robustness and multiple measures of corroboration; 4 Discussion.
The aggregation of consistent individual judgments on logically interconnected propositions into a collective judgment on the same propositions has recently drawn much attention. Seemingly reasonable aggregation procedures, such as propositionwise majority voting, cannot ensure an equally consistent collective conclusion. The literature on judgment aggregation refers to such a problem as the discursive dilemma. In this paper we assume that the decision which the group is trying to reach is factually right or wrong. Hence, we address the question of how good the various approaches are at selecting the right conclusion. We focus on two approaches: distance-based procedures and a Bayesian analysis. They correspond to group-internal and group-external decision-making, respectively. We compare those methods in a probabilistic model, demonstrate the robustness of our results over various generalizations and discuss their applicability in different situations. The findings vindicate (i) that in judgment aggregation problems, reasons should carry higher weight than conclusions and (ii) that considering members of an advisory board to be highly competent is a better strategy than to underestimate their advice.
There are various ways to reach a group decision on a factual yes–no question. One way is to vote and decide what the majority votes for. This procedure receives some epistemological support from the Condorcet Jury Theorem. Alternatively, the group members may prefer to deliberate and will eventually reach a decision that everybody endorses—a consensus. While the latter procedure has the advantage that it makes everybody happy, it has the disadvantage that it is difficult to implement, especially for larger groups. Besides, the resulting consensus may be far away from the truth. And so we ask: Is deliberation truth-conducive in the sense that majority voting is? To address this question, we construct a highly idealized model of a particular deliberation process, inspired by the movie Twelve Angry Men, and show that the answer is ‘yes’. Deliberation procedures can be truth-conducive just as the voting procedure is. We then explore, again on the basis of our model and using agent-based simulations, under which conditions it is better epistemically to deliberate than to vote. Our analysis shows that there are contexts in which deliberation is epistemically preferable and we will provide reasons for why this is so.
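The voting benchmark invoked here can be made concrete with a few lines of simulation of the Condorcet Jury Theorem setting (independent voters, each correct with probability p > 1/2); this is only the baseline against which the deliberation model is compared, not that model itself:

```python
# Simulation of the majority-voting benchmark (Condorcet Jury Theorem setting):
# n independent voters, each correct with probability p > 0.5.

import random

def majority_correct(n, p, trials=20_000):
    hits = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < p for _ in range(n))
        if correct_votes > n / 2:
            hits += 1
    return hits / trials

random.seed(0)
for n in (3, 11, 51):
    print(f"n = {n:2d}, individual competence 0.6 -> "
          f"majority correct with prob ~ {majority_correct(n, 0.6):.3f}")
```

As the theorem predicts, the probability that the majority is right grows with group size; the question addressed in the paper is when a deliberation procedure can match or beat this benchmark.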
This volume is a serious attempt to open up the subject of European philosophy of science to real thought, and provide the structural basis for the ...
According to orthodoxy, there are two basic moods of supposition: indicative and subjunctive. The most popular formalizations of the corresponding norms of suppositional judgement are given by Bayesian conditionalization and Lewisian imaging, respectively. It is well known that Bayesian conditionalization can be generalized to provide a model for the norms of partial indicative supposition. This raises the question of whether imaging can likewise be generalized to model the norms of ‘partial subjunctive supposition’. The present article casts doubt on whether the most natural generalizations of imaging are able to provide a plausible account of the norms of partial subjunctive supposition.
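For reference, the two update rules contrasted here are standardly defined over a finite set of worlds as follows, where P is the prior, A the supposed proposition, and w_A the closest A-world to w under the relevant similarity ordering:

\[
P^{\mathrm{cond}}_A(w) \;=\;
\begin{cases}
P(w)/P(A) & \text{if } w \in A,\\
0 & \text{otherwise},
\end{cases}
\qquad
P^{\mathrm{img}}_A(w') \;=\; \sum_{w} P(w)\,\mathbf{1}\!\left[w_A = w'\right].
\]

Conditionalization rescales the probabilities within A; imaging shifts each world's probability to its closest A-world. The question raised in the article is whether imaging generalizes to merely partial subjunctive supposition as smoothly as conditionalization generalizes to partial indicative supposition (for example, via Jeffrey-style updating).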
This book successfully serves two different purposes. On the one hand, it is a readable physics-based introduction to the philosophy of science, written in an informal and accessible style. The author, himself a professor of physics at the University of Notre Dame and active in the philosophy of science for almost twenty years, carefully develops his metatheoretical arguments on a solid basis provided by an extensive survey along the lines of the historical development of physics. On the other hand, this book supplies one long argument for Cushing's own attitude in the philosophy of science. While earlier studies by the author, from which this book draws in part, each focused on one special episode in the history of science, this book gathers case material from many different areas of physics and many epochs. The main goal of this book is "to impress upon the reader the essential and ineliminable role that philosophical considerations have played in the actual practice of science" (p. xv).
Quantum mechanical entangled configurations of particles that do not satisfy Bell’s inequalities, or equivalently, do not have a joint probability distribution, are familiar in the foundational literature of quantum mechanics. Nonexistence of a joint probability measure for the correlations predicted by quantum mechanics is itself equivalent to the nonexistence of local hidden variables that account for the correlations (for a proof of this equivalence, see Suppes and Zanotti, 1981). From a philosophical standpoint it is natural to ask what sort of concept can be used to provide a “joint” analysis of such quantum correlations. In other areas of application of probability, similar but different problems arise. A typical example is the introduction of upper and lower probabilities in the theory of belief. A person may feel uncomfortable assigning a precise probability to the occurrence of rain tomorrow, but feel comfortable saying the probability should be greater than ½ and less than ⅞. Rather extensive statistical developments have occurred for this framework. A thorough treatment can be found in Walley (1991) and an earlier measurement-oriented development in Suppes (1974). It is important to note that this focus on beliefs, or related Bayesian ideas, is not concerned, as we are here, with the nonexistence of joint probability distributions. Yet earlier work with no relation to quantum mechanics, but focused on conditions for existence, has been published by many people. For some of our own work on this topic, see Suppes and Zanotti (1989). Still, this earlier work naturally suggested the question of whether or not upper and lower measures could be used in quantum mechanics, as a generalization of...
Many results of modern physics—those of quantum mechanics, for instance—come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fields. In particular, the Bayesian and Humean views of probabilities and the varieties of Boltzmann's typicality approach are examined. The contributions on quantum mechanics discuss the special character of quantum correlations, the justification of the famous Born Rule, and the role of probabilities in a quantum field theoretic framework. Finally, the connections between probabilities and foundational issues in physics are explored. The Reversibility Paradox, the notion of entropy, and the ontology of quantum mechanics are discussed. Other essays consider Humean supervenience and the question whether the physical world is deterministic.