This book explores a question central to philosophy--namely, what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. In this book this view is rejected and replaced with another: in order for one to have justification for believing a proposition, one's evidence must normically support it--roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
This paper is about teaching probability to students of philosophy who don’t aim to do primarily formal work in their research. These students are unlikely to seek out classes about probability or formal epistemology for various reasons, for example because they don’t realize that this knowledge would be useful for them or because they are intimidated by the material. However, most areas of philosophy now contain debates that incorporate probability, and basic knowledge of it is essential even for philosophers whose work isn’t primarily formal. In this paper, I explain how to teach probability to students who are not already enthusiastic about formal philosophy, taking into account the common phenomena of math anxiety and the lack of reading skills for formal texts. I address course design, lesson design, and assignment design. Most of my recommendations also apply to teaching formal methods other than probability theory.
_Probability: A Philosophical Introduction_ introduces and explains the principal concepts and applications of probability. It is intended for philosophers and others who want to understand probability as we all apply it in our working and everyday lives. The book is not a course in mathematical probability, of which it uses only the simplest results, and avoids all needless technicality. The role of probability in modern theories of knowledge, inference, induction, causation, laws of nature, action and decision-making makes an understanding of it especially important to philosophers and students of philosophy, to whom this book will be invaluable both as a textbook and a work of reference. In this book D. H. Mellor discusses the three basic kinds of probability – physical, epistemic, and subjective – and introduces and assesses the main theories and interpretations of them. The topics and concepts covered include:
* chance
* frequency
* possibility
* propensity
* credence
* confirmation
* Bayesianism
_Probability: A Philosophical Introduction_ is essential reading for all philosophy students and others who encounter or need to apply ideas of probability.
In Probability Designs, Karin Kukkonen presents the predictive processing model of cognition as a means of exploring narrative structure and reader experience. Utilizing the literary canon of various cultures, Kukkonen combines theory and cognitive science to analyze how reader expectation and prediction shape literature, and how literature accomplishes cognitive feats that determine the human capacity for free, exploratory thought.
According to the Lockean thesis, a proposition is believed just in case it is highly probable. While this thesis enjoys strong intuitive support, it is known to conflict with seemingly plausible logical constraints on our beliefs. One way out of this conflict is to make probability 1 a requirement for belief, but most have rejected this option for entailing what they see as an untenable skepticism. Recently, two new solutions to the conflict have been proposed that are alleged to be non-skeptical. We compare these proposals with each other and with the Lockean thesis, in particular with regard to the question of how much we gain by adopting any one of them instead of the probability 1 requirement, that is, of how likely it is that one believes more than the things one is fully certain of.
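A minimal formal gloss of the thesis and of the conflict it generates, assuming a threshold value $t$ and a standard lottery-style case (neither is specified in the abstract itself):
$$\text{Lockean thesis:}\qquad \mathrm{Bel}(p) \iff P(p) \geq t, \qquad \tfrac{1}{2} < t < 1.$$
$$\text{Lottery case:}\qquad P(\text{ticket } i \text{ loses}) = 1 - \tfrac{1}{n} \geq t \ \text{ for each } i \ \text{once } n \text{ is large, yet } P\Big(\bigwedge_{i=1}^{n} \text{ticket } i \text{ loses}\Big) = 0,$$
so beliefs licensed by the thesis are not closed under conjunction unless $t = 1$, the probability 1 requirement the abstract describes as skeptical.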
A. J. Ayer was one of the foremost analytical philosophers of the twentieth century, and was known as a brilliant and engaging speaker. In essays based on his influential Dewey Lectures, Ayer addresses some of the most critical and controversial questions in epistemology and the philosophy of science, examining the nature of inductive reasoning and grappling with the issues that most concerned him as a philosopher. This edition contains revised and expanded versions of the lectures and two additional essays. Ayer begins by considering Hume's formulation of the problem of induction and then explores the inferences on which we base our beliefs in factual matters. In other essays, he defines the three kinds of probability that inform inductive reasoning and examines the various criteria for verifiability and falsifiability. In his extensive introduction, Graham Macdonald discusses the arguments in _Probability and Evidence_, how they relate to Ayer's other works, and their influence in contemporary philosophy. He also provides a brief biographical sketch of Ayer, and includes a bibliography of works about and in response to _Probability and Evidence_.
The book was planned and written as a single, sustained argument, but earlier versions of a few parts of it have appeared separately. Its object is both to establish the existence of the paradoxes and to describe a non-Pascalian concept of probability in terms of which one can analyse the structure of forensic proof without giving rise to such typical signs of theoretical misfit. Neither the complementational principle for negation nor the multiplicative principle for conjunction applies to the central core of any forensic proof in the Anglo-American legal system. The book comprises four parts, written in such a way that they may be read in different orders by different kinds of reader.
When a doctor tells you there’s a one percent chance that an operation will result in your death, or a scientist claims that his theory is probably true, what exactly does that mean? Understanding probability is clearly very important, if we are to make good theoretical and practical choices. In this engaging and highly accessible introduction to the philosophy of probability, Darrell Rowbottom takes the reader on a journey through all the major interpretations of probability, with reference to real-world situations. In lucid prose, he explores the many fallacies of probabilistic reasoning, such as the ‘gambler’s fallacy’ and the ‘inverse fallacy’, and shows how we can avoid falling into these traps by using the interpretations presented. He also illustrates the relevance of the interpretation of probability across disciplinary boundaries, by examining which interpretations of probability are appropriate in diverse areas such as quantum mechanics, game theory, and genetics. Using entertaining dialogues to draw out the key issues at stake, this unique book will appeal to students and scholars across philosophy, the social sciences, and the natural sciences.
Richard Jeffrey is beyond dispute one of the most distinguished and influential philosophers working in the field of decision theory and the theory of knowledge. His work is distinctive in showing the interplay of epistemological concerns with probability and utility theory. Not only has he made use of standard probabilistic and decision theoretic tools to clarify concepts of evidential support and informed choice, he has also proposed significant modifications of the standard Bayesian position in order that it provide a better fit with actual human experience. Probability logic is viewed not as a source of judgment but as a framework for explaining the implications of probabilistic judgments and their mutual compatibility. This collection of essays spans a period of some 35 years and includes what have become some of the classic works in the literature. There is also one completely new piece, while in many instances Jeffrey includes afterthoughts on the older essays.
The term probability can be used in two main senses. In the frequency interpretation it is a limiting ratio in a sequence of repeatable events. In the Bayesian view, probability is a mental construct representing uncertainty. This 2002 book is about these two types of probability and investigates how, despite being adopted by scientists and statisticians in the eighteenth and nineteenth centuries, Bayesianism was discredited as a theory of scientific inference during the 1920s and 1930s. Through the examination of a dispute between two British scientists, the author argues that a choice between the two interpretations is not forced by pure logic or the mathematics of the situation, but depends on the experiences and aims of the individuals involved. The book should be of interest to students and scientists interested in statistics and probability theories and to general readers with an interest in the history, sociology and philosophy of science.
In this influential study of central issues in the philosophy of science, Paul Horwich elaborates on an important conception of probability, diagnosing the failure of previous attempts to resolve these issues as stemming from a too-rigid conception of belief. Adopting a Bayesian strategy, he argues for a probabilistic approach, yielding a more complete understanding of the characteristics of scientific reasoning and methodology. Presented in a fresh twenty-first-century series livery, and including a specially commissioned preface written by Colin Howson, illuminating its enduring importance and relevance to philosophical enquiry, this engaging work has been revived for a new generation of readers.
Maxwell's deduction of the probability distribution over the velocity of gas molecules—one of the most important passages in physics (Truesdell)—presents a riddle: a physical discovery of the first importance was made in a single inferential leap without any apparent recourse to empirical evidence. Tychomancy proposes that Maxwell's derivation was not made a priori; rather, he inferred his distribution from non-probabilistic facts about the dynamics of intermolecular collisions. Further, the inference is of the same sort as everyday reasoning about the physical probabilities attached to such canonical chance setups as tossed coins or rolled dice. The structure of this reasoning is investigated and some simple rules for inferring physical probabilities from symmetries and other causally relevant properties of physical systems are proposed. Not only physics but evolutionary biology and population ecology, the science of measurement error, and climate modeling have benefited enormously from the same kind of reasoning, the book goes on to argue. Inferences from dynamics to probability are so obvious to us, however, that their methodological importance has been largely overlooked.
This book offers a concise survey of basic probability theory from a thoroughly subjective point of view whereby probability is a mode of judgment. Written by one of the greatest figures in the field of probability theory, the book is both a summation and synthesis of a lifetime of wrestling with these problems and issues. After an introduction to basic probability theory, there are chapters on scientific hypothesis-testing, on changing your mind in response to generally uncertain observations, on expectations of the values of random variables, on de Finetti's dissolution of the so-called problem of induction, and on decision theory.
In _Probability and Evidence_, one of Britain's foremost twentieth-century philosophers addresses central questions in the theory of knowledge and the philosophy of science. This book contains A.J. Ayer's John Dewey Lectures delivered at Columbia University, together with two additional essays, "Has Harrod Answered Hume?" and "The Problem of Conditionals."
Historical records show that there was no real concept of probability in Europe before the mid-seventeenth century, although the use of dice and other randomizing objects was commonplace. Ian Hacking presents a philosophical critique of early ideas about probability, induction, and statistical inference and the growth of this new family of ideas in the fifteenth, sixteenth, and seventeenth centuries. Hacking invokes a wide intellectual framework involving the growth of science, economics, and the theology of the period. He argues that the transformations that made it possible for probability concepts to emerge have constrained all subsequent development of probability theory and determine the space within which philosophical debate on the subject is still conducted. First published in 1975, this edition includes an introduction that contextualizes his book in light of developing philosophical trends. Ian Hacking is the winner of the Holberg International Memorial Prize 2009.
Time travel is metaphysically possible. Nikk Effingham contends that arguments for the impossibility of time travel are not sound. Focusing mainly on the Grandfather Paradox, Effingham explores the ramifications of taking this view, discusses issues in probability and decision theory, and considers the potential dangers of travelling in time.
APA PsycNET abstract: This is the first volume of a two-volume work on Probability and Induction. Because the writer holds that probability logic is identical with inductive logic, this work is devoted to philosophical problems concerning the nature of probability and inductive reasoning. The author rejects a statistical frequency basis for probability in favor of a logical relation between two statements or propositions. Probability "is the degree of confirmation of a hypothesis (or conclusion) on the basis of some given evidence (or premises)." Furthermore, all principles and theorems of inductive logic are analytic, and the entire system is to be constructed by means of symbolic logic and semantic methods. This means that the author confines himself to the formalistic procedures of word and symbol systems. The resulting sentence or language structures are presumed to separate off logic from all subjectivist or psychological elements. Despite the abstractionism, the claim is made that if an inductive probability system of logic can be constructed it will have its practical application in mathematical statistics, and in various sciences. 16-page bibliography.
This paper develops an information-sensitive theory of the semantics and probability of conditionals and statements involving epistemic modals. The theory validates a number of principles linking probability and modality, including the principle that the probability of a conditional If A, then C equals the probability of C, updated with A. The theory avoids so-called triviality results, which are standardly taken to show that principles of this sort cannot be validated. To achieve this, we deny that rational agents update their credences via conditionalization. We offer a new rule of update, Hyperconditionalization, which agrees with Conditionalization whenever nonmodal statements are at stake but differs for modal and conditional sentences.
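For reference, the principle the abstract describes can be stated in the usual textbook notation (this formulation is the standard one, not the authors' own), together with the Conditionalization rule that the paper replaces for modal and conditional sentences:
$$P(\text{if } A \text{ then } C) = P(C \mid A) = \frac{P(A \wedge C)}{P(A)}, \qquad \text{provided } P(A) > 0,$$
$$\text{Conditionalization:}\qquad P_{\text{new}}(\cdot) = P_{\text{old}}(\cdot \mid A) \ \text{ upon learning } A.$$
Hyperconditionalization itself is not spelled out in the abstract and so is left unformalized here.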
This volume, the third in this Springer series, contains selected papers from the four workshops organized by the ESF Research Networking Programme "The Philosophy of Science in a European Perspective" (PSE) in 2010:
* Pluralism in the Foundations of Statistics
* Points of Contact between the Philosophy of Physics and the Philosophy of Biology
* The Debate on Mathematical Modeling in the Social Sciences
* Historical Debates about Logic, Probability and Statistics
The volume is accordingly divided into four sections, each of them containing papers coming from the workshop focussing on one of these themes. While the programme's core topic for the year 2010 was probability and statistics, the organizers of the workshops embraced the opportunity of building bridges to more or less closely connected issues in general philosophy of science, philosophy of physics and philosophy of the special sciences. However, papers that analyze the concept of probability for various philosophical purposes are clearly a major theme in this volume, as it was in the previous volumes of the same series. This reflects the impressive productivity of probabilistic approaches in the philosophy of science, which form an important part of what has become known as formal epistemology - although, of course, there are non-probabilistic approaches in formal epistemology as well. It is probably fair to say that Europe has been particularly strong in this area of philosophy in recent years.
With this treatise, an insightful exploration of the probabilistic connection between philosophy and the history of science, the famous economist breathed new life into studies of both disciplines. Originally published in 1921, this important mathematical work represented a significant contribution to the theory regarding the logical probability of propositions. Keynes effectively dismantled the classical theory of probability, launching what has since been termed the “logical-relationist” theory. In so doing, he explored the logical relationships between classifying a proposition as “highly probable” and as a “justifiable induction.” Unabridged republication of the classic 1921 edition.
Ian Hacking here presents a philosophical critique of early ideas about probability, induction and statistical inference and the growth of this new family of ...
Decision theory and the theory of rational choice have recently been the subjects of considerable research by philosophers and economists. However, no adequate anthology exists which can be used to introduce students to the field. This volume is designed to meet that need. The essays included are organized into five parts covering the foundations of decision theory, the conceptualization of probability and utility, philosophical difficulties with the rules of rationality and with the assessment of probability, and causal decision theory. The editors provide an extensive introduction to the field and introductions to each part.
This book is meant to be a primer, that is, an introduction, to probability logic, a subject that appears to be in its infancy. Probability logic is a subject envisioned by Hans Reichenbach and largely created by Adams. It treats conditionals as bearers of conditional probabilities and discusses an appropriate sense of validity for arguments with such conditionals, as well as ordinary statements, as premisses. This is a clear, well-written text on the subject of probability logic, suitable for advanced undergraduates or graduates, but also of interest to professional philosophers. There are well-thought-out exercises, and a number of advanced topics treated in appendices, while some are brought up in exercises and some are alluded to only in footnotes. By this means, it is hoped that the reader will at least be made aware of most of the important ramifications of the subject and its tie-ins with current research, and will have some indications concerning recent and relevant literature.
This volume explores the conceptual terrain defined by the Greek word eikos: the probable, likely, or reasonable. A term of art in Greek rhetoric, a defining feature of literary fiction, a seminal mode of historical, scientific, and philosophical inquiry, eikos was a way of thinking about the probable and improbable, the factual and counterfactual, the hypothetical and the real. These thirteen original and provocative essays examine the plausible arguments of courtroom speakers and the 'likely stories' of philosophers, verisimilitude in art and literature, the likelihood of resemblance in human reproduction, the limits of human knowledge and the possibilities of ethical and political agency. The first synthetic study of probabilistic thinking in ancient Greece, the volume illuminates a fascinating chapter in the history of Western thought.
In this article, I aim to show how powers may ground different types of probability in the universe. In Section 1 I single out several dimensions along which the probability of something can be determined. Each of these dimensions can be further specified at the type-level or at the token-level. In Section 2 I introduce some metaphysical assumptions about powers. In Section 3 I show how powers can ground single-case probabilities and frequency-probabilities in a deterministic setting. Later on, in Section 4, I move to a theoretical framework where the falsity of determinism is assumed. Within such a framework, I first argue that some probabilities are grounded on basic powers. Moreover, in Section 5, I introduce tendencies and suggest that they are endowed with specific degrees of activation that may change over time. Such degrees explain why tendencies are more likely to be activated than non-activated, or vice versa. In Section 6 I compare my account of tendencies with other accounts. Finally, in Section 7, I anticipate some general objections against my account – objections that it shares with propensity-accounts of probability – and against degrees of activation of tendencies.
In this book Pollock deals with the subject of probabilistic reasoning, making general philosophical sense of objective probabilities and exploring their ...
Why is understanding causation so important in philosophy and the sciences? Should causation be defined in terms of probability? Whilst causation plays a major role in theories and concepts of medicine, little attempt has been made to connect causation and probability with medicine itself. Causality, Probability, and Medicine is one of the first books to apply philosophical reasoning about causality to important topics and debates in medicine. Donald Gillies provides a thorough introduction to and assessment of competing theories of causality in philosophy, including action-related theories, causality and mechanisms, and causality and probability. Throughout the book he applies them to important discoveries and theories within medicine, such as germ theory; tuberculosis and cholera; smoking and heart disease; the first ever randomized controlled trial designed to test the treatment of tuberculosis; the growing area of philosophy of evidence-based medicine; and philosophy of epidemiology. This book will be of great interest to students and researchers in philosophy of science and philosophy of medicine, as well as those working in medicine, nursing and related health disciplines where a working knowledge of causality and probability is required.
We suggest a rigorous theory of how objective single-case transition probabilities fit into our world. The theory combines indeterminism and relativity in the “branching space-times” pattern, and relies on the existing theory of causae causantes (originating causes). Its fundamental suggestion is that (at least in simple cases) the probabilities of all transitions can be computed from the basic probabilities attributed individually to their originating causes. The theory explains when and how one can reasonably infer from the probabilities of one “chance set-up” to the probabilities of another such set-up that is located far away.
An aspect of Peirce’s thought that may still be underappreciated is his resistance to what Levi calls _pedigree epistemology_, to the idea that a central focus in epistemology should be the justification of current beliefs. Somewhat more widely appreciated is his rejection of the subjective view of probability. We argue that Peirce’s criticisms of subjectivism, to the extent they grant such a conception of probability is viable at all, revert back to pedigree epistemology. A thoroughgoing rejection of pedigree in the context of probabilistic epistemology, however, _does_ challenge prominent subjectivist responses to the problem of the priors.
This is a study in the meaning of natural language probability operators, sentential operators such as probably and likely. We ask what sort of formal structure is required to model the logic and semantics of these operators. Along the way we investigate their deep connections to indicative conditionals and epistemic modals, probe their scalar structure, observe their sensitivity to contextually salient contrasts, and explore some of their scopal idiosyncrasies.
This chapter presents probability logic as a rationality framework for human reasoning under uncertainty. Selected formal-normative aspects of probability logic are discussed in the light of experimental evidence. Specifically, probability logic is characterized as a generalization of bivalent truth-functional propositional logic (short "logic"), as being connexive, and as being nonmonotonic. The chapter discusses selected argument forms and associated uncertainty propagation rules. Throughout the chapter, the descriptive validity of probability logic is compared to logic, which was used as the gold standard of reference for assessing the rationality of human reasoning in the 20th century.
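One standard example of such an uncertainty propagation rule, probabilistic modus ponens, runs as follows; the coherence interval given here is the familiar one from the probability logic literature and is offered only as an illustration of what such rules look like, not as the chapter's own formulation:
$$\text{From } P(A) = a \ \text{ and } \ P(C \mid A) = b, \ \text{ coherence requires } \ P(C) \in \big[\,ab,\ ab + (1 - a)\,\big],$$
since $P(C) = P(C \mid A)\,P(A) + P(C \mid \neg A)\,P(\neg A)$ and $P(C \mid \neg A)$ may lie anywhere in $[0, 1]$.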
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. _Philosophical Theories of Probability_ is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
I should begin by warning the reader that many of the views presented in this book are decidedly unfashionable; the theory of probability I ...
This collection of essays is on the relation between probabilities, especially conditional probabilities, and conditionals. It provides negative results which sharply limit the ways conditionals can be related to conditional probabilities. There are also positive ideas and results which will open up areas of research. The collection is intended to honour Ernest W. Adams, whose seminal work is largely responsible for creating this area of inquiry. As well as describing, evaluating, and applying Adams's work the contributions extend his ideas in directions he may or may not have anticipated, but that he certainly inspired. In addition to a wide range of philosophers of science, the volume should interest computer scientists and linguists.
Another title in the reissued Oxford Classic Texts in the Physical Sciences series, Jeffreys' Theory of Probability, first published in 1939, was the first to develop a fundamental theory of scientific inference based on the ideas of Bayesian statistics. His ideas were way ahead of their time and it is only in the past ten years that the subject of Bayes factors has been significantly developed and extended. Until recently the two schools of statistics were distinctly different and set apart. Recent work has changed all that and today's graduate students and researchers all require an understanding of Bayesian ideas. This book is their starting point.
According to what is now commonly referred to as “the Equation” in the literature on indicative conditionals, the probability of any indicative conditional equals the probability of the consequent of the conditional given the antecedent of the conditional. Philosophers widely agree in their assessment that the triviality arguments of Lewis and others have conclusively shown the Equation to be tenable only at the expense of the view that indicative conditionals express propositions. This study challenges the correctness of that assessment by presenting data that cast doubt on an assumption underlying all triviality arguments.
It is well known that classical, aka ‘sharp’, Bayesian decision theory, which models belief states as single probability functions, faces a number of serious difficulties with respect to its handling of agnosticism. These difficulties have led to the increasing popularity of so-called ‘imprecise’ models of decision-making, which represent belief states as sets of probability functions. In a recent paper, however, Adam Elga has argued in favour of a putative normative principle of sequential choice that he claims to be borne out by the sharp model but not by any promising incarnation of its imprecise counterpart. After first pointing out that Elga has fallen short of establishing that his principle is indeed uniquely borne out by the sharp model, I cast aspersions on its plausibility. I show that a slight weakening of the principle is satisfied by at least one, but interestingly not all, varieties of the imprecise model and point out that Elga has failed to motivate his stronger commitment.
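For readers unfamiliar with the contrast, the two models can be glossed roughly as follows; the notation is a common convention in the imprecise-probability literature, not Elga's or the author's own:
$$\text{Sharp:}\qquad \text{a belief state is a single probability function } P, \ \text{so each proposition } A \text{ receives a point value } P(A);$$
$$\text{Imprecise:}\qquad \text{a belief state is a set } \mathcal{P} \text{ of probability functions, with } \underline{P}(A) = \inf_{P \in \mathcal{P}} P(A), \quad \overline{P}(A) = \sup_{P \in \mathcal{P}} P(A).$$
Agnosticism about $A$ can then be modelled by a wide interval $[\underline{P}(A), \overline{P}(A)]$ rather than by a single middling number.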
The Ramseyan thesis that the probability of an indicative conditional is equal to the corresponding conditional probability of its consequent given its antecedent is both widely confirmed and subject to attested counterexamples (e.g., McGee 2000, Kaufmann 2004). This raises several puzzling questions. For instance, why are there interpretations of conditionals that violate this Ramseyan thesis in certain contexts, and why are they otherwise very rare? In this paper, I raise some challenges to Stefan Kaufmann's account of why the Ramseyan thesis sometimes fails, and motivate my own theory. On my theory, the proposition expressed by an indicative conditional is partially determined by a background partition, and hence its probability depends on the choice of such a partition. I hold that this background partition is contextually determined, and in certain conditions is set by a salient question under discussion in the context. I show how the resulting theory offers compelling answers to the puzzling questions raised by failures of the Ramseyan thesis.