The twentieth century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. _Philosophical Theories of Probability_ is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, together with an intersubjective interpretation that develops the subjective theory.
Evidence-based medicine (EBM) makes use of explicit procedures for grading evidence for causal claims. Normally, these procedures categorise evidence of correlation produced by statistical trials as better evidence for a causal claim than evidence of mechanisms produced by other methods. We argue, in contrast, that evidence of mechanisms needs to be viewed as complementary to, rather than inferior to, evidence of correlation. In this paper we first set out the case for treating evidence of mechanisms alongside evidence of correlation in explicit protocols for evaluating evidence. Next we provide case studies which exemplify the ways in which evidence of mechanisms complements evidence of correlation in practice. Finally, we put forward some general considerations as to how the two sorts of evidence can be more closely integrated by EBM.
Part I: Inductivism and its Critics: 1. Some Historical Background: Inductivism, Russell and the Cambridge School, the Vienna Circle and Popper. 2. Popper’s Critique of Inductivism. 3. Duhem’s Critique of Inductivism. Part II: Conventionalism and the Duhem-Quine Thesis: 4. Poincaré’s Conventionalism of 1902. 5. The Duhem Thesis and the Quine Thesis. Part III: The Nature of Observation: 6. Observation Statements: the Views of Carnap, Neurath, Popper and Duhem. 7. Observation Statements: Some Psychological Findings. Part IV: The Demarcation between Science and Metaphysics: 8. Is Metaphysics Meaningless? Wittgenstein, the Vienna Circle and Popper’s Critique. 9. Metaphysics in relation to Science: the Views of Popper, Duhem and Quine. 10. Falsification in the light of the Duhem-Quine Thesis.
According to current hierarchies of evidence for EBM, evidence of correlation is always more important than evidence of mechanisms when evaluating and establishing causal claims. We argue that evidence of mechanisms needs to be treated alongside evidence of correlation. This is for three reasons. First, correlation is always a fallible indicator of causation, subject in particular to the problem of confounding; evidence of mechanisms can in some cases be more important than evidence of correlation when assessing a causal claim. Second, evidence of mechanisms is often required in order to obtain evidence of correlation. Third, evidence of mechanisms is often required in order to generalise and apply causal claims. While the EBM movement has been enormously successful in making explicit and critically examining one aspect of our evidential practice, i.e., evidence of correlation, we wish to extend this line of work to make explicit and critically examine a second aspect of our evidential practices: evidence of mechanisms.
The propensity interpretation of probability was introduced by Popper ([1957]), but has subsequently been developed in different ways by quite a number of philosophers of science. This paper does not attempt a complete survey, but discusses a number of different versions of the theory, thereby giving some idea of the varieties of propensity. Propensity theories are classified into (i) long-run and (ii) single-case. The paper argues for a long-run version of the propensity theory, but this is contrasted with two single-case propensity theories, one due to Miller and the later Popper, and the other to Fetzer. The three approaches are compared by examining how they deal with a key problem for the propensity approach, namely the relationship between propensity and causality and Humphreys' paradox.
Social revolutions--that is, critical periods of decisive, qualitative change--are a commonly acknowledged historical fact. But can the idea of revolutionary upheaval be extended to the world of ideas and theoretical debate? The publication of Kuhn's The Structure of Scientific Revolutions in 1962 led to an exciting discussion of revolutions in the natural sciences. A fascinating, but little known, off-shoot of this was a debate which began in the United States in the mid-1970s as to whether the concept of revolution could be applied to mathematics as well as science. Michael Crowe declared that revolutions never occur in mathematics, while Joseph Dauben argued that there have been mathematical revolutions and gave some examples. This book is the first comprehensive examination of the question. It reprints the original papers of Crowe, Dauben, and Mehrtens, together with additional chapters giving their current views. To this are added new contributions from nine further experts in the history of mathematics, who each discuss an important episode and consider whether it was a revolution. The whole question of mathematical revolutions is thus examined comprehensively and from a variety of perspectives. This thought-provoking volume will interest mathematicians, philosophers, and historians alike.
Why is understanding causation so important in philosophy and the sciences? Should causation be defined in terms of probability? Whilst causation plays a major role in theories and concepts of medicine, little attempt has been made to connect causation and probability with medicine itself. Causality, Probability, and Medicine is one of the first books to apply philosophical reasoning about causality to important topics and debates in medicine. Donald Gillies provides a thorough introduction to and assessment of competing theories of causality in philosophy, including action-related theories, causality and mechanisms, and causality and probability. Throughout the book he applies them to important discoveries and theories within medicine, such as germ theory; tuberculosis and cholera; smoking and heart disease; the first ever randomized controlled trial designed to test the treatment of tuberculosis; the growing area of philosophy of evidence-based medicine; and philosophy of epidemiology. This book will be of great interest to students and researchers in philosophy of science and philosophy of medicine, as well as those working in medicine, nursing and related health disciplines where a working knowledge of causality and probability is required.
This reissue of D. A. Gillies’ highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises’ views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises’ definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatic approaches. This of course raises the problem of how the abstract calculus of probability should be connected with the ‘actual world of experiments’. It is suggested that this link should be established, not by a definition of probability, but by an application of Popper’s concept of falsifiability. In addition to formulating his own interesting theory, Dr Gillies gives a detailed criticism of the generally accepted Neyman-Pearson theory of testing, as well as of alternative philosophical approaches to probability theory. The reissue will be of interest both to philosophers with no previous knowledge of probability theory and to mathematicians interested in the foundations of probability theory and statistics.
Semmelweis’s investigations of puerperal fever are some of the most interesting in the history of medicine. This paper considers Hempel’s analysis of the Semmelweis case. It argues that this analysis is inadequate and needs to be supplemented by some Kuhnian ideas. Kuhn’s notion of paradigm needs to be modified to apply to medicine in order to take account of the classification schemes involved in medical theorising. However, with a suitable modification, it provides an explanation of Semmelweis’s failure which is argued to be superior to some of the external reasons often given. Despite this success in applying Kuhn’s ideas to medicine, it is argued that these ideas must be further modified to take account of the fact that medicine is not a natural science but primarily a practice designed to prevent and cure diseases.
Artificial Intelligence and Scientific Method examines the remarkable advances made in the field of AI over the past twenty years, discussing their profound implications for philosophy. Taking a clear, non-technical approach, Donald Gillies shows how current views on scientific method are challenged by this recent research, and suggests a new framework for the study of logic. Finally, he draws on work by such seminal thinkers as Bacon, Gödel, Popper, Penrose, and Lucas, to address the hotly-contested question of whether computers might become intellectually superior to human beings.
In their 1983 article, Popper and Miller present an argument against inductive probability. This argument is criticized by Redhead in his 1985 article. The aim of the present note is to state one form of the Popper-Miller argument, and defend it against Redhead's criticisms.
Wilson, Mark. _Innovation and Certainty_. Cambridge Elements in the Philosophy of Mathematics. Cambridge University Press, 2020. Pp. 74. ISBN: 978-1-108-74229-0; 978-1-108-59290-1. doi.org/10.1017/9781108592901.
This paper introduces what is called the intersubjective interpretation of the probability calculus. Intersubjective probabilities are related to subjective probabilities, and the paper begins with a particular formulation of the familiar Dutch Book argument. This argument is then extended, in Section 3, to social groups, and this enables the concept of intersubjective probability to be introduced in Section 4. It is then argued that the intersubjective interpretation is the appropriate one for the probabilities which appear in confirmation theory, whether of a Bayesian or a Popperian variety. The final section of the paper states and tries to answer an objection due to Putnam.
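The Dutch Book argument mentioned in this abstract can be illustrated numerically: if an agent's betting quotients on an event and its complement fail to sum to 1, a bookmaker can set stakes that guarantee the agent a loss whatever happens. The sketch below is a minimal illustration with invented quotients and stakes; it is not the paper's own formulation.

```python
# Illustrative Dutch Book: an agent whose betting quotients on E and not-E
# sum to more than 1 can be made to lose money however the event turns out.
# A betting quotient q means the agent pays q*S for a bet that returns S if won.

def payoff(q, stake, wins):
    """Agent's net gain on a single bet at quotient q."""
    return stake - q * stake if wins else -q * stake

# Incoherent quotients: q_E + q_notE = 1.2 > 1 (hypothetical values).
q_E, q_notE = 0.7, 0.5
stake = 10.0

# The bookmaker sells the agent both bets.
# Case 1: E occurs (the bet on E wins, the bet on not-E loses).
loss_if_E = payoff(q_E, stake, True) + payoff(q_notE, stake, False)
# Case 2: E does not occur.
loss_if_not_E = payoff(q_E, stake, False) + payoff(q_notE, stake, True)

print(loss_if_E, loss_if_not_E)  # -2.0 -2.0: a sure loss in both cases
```

The guaranteed loss is (q_E + q_notE - 1) * stake, which vanishes exactly when the quotients are coherent, i.e. satisfy the probability axioms.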
The paper begins with a discussion of Russell's view that the notion of cause is unnecessary for science and can therefore be eliminated. It is argued that this is true for theoretical physics but untrue for medicine, where the notion of cause plays a central role. Medical theories are closely connected with practical action (attempts to cure and prevent disease), whereas theoretical physics is more remote from applications. This suggests the view that causal laws are appropriate in a context where there is a close connection to action. This leads to the development of an action-related theory of causality which is similar to the agency theory of Menzies and Price, but differs from it in a number of respects, one of which is the following. Menzies and Price connect ‘A causes B’ with an action to produce B by instantiating A, but, particularly in the case of medicine, the law can also be linked to the action of trying to avoid B by ensuring that A is not instantiated. The action-related theory has in common with the agency theory of Menzies and Price the ability to explain causal asymmetry in a simple fashion, but the introduction of avoidance actions, together with some ideas taken from Russell, enables some of the objections to agency accounts of causality to be met. Contents: Introduction; Russell on causality; Preliminary exposition of the action-related theory; Differences between the action-related theory and the agency theory of Menzies and Price; Explanation of causal asymmetry; Objections to the action-related theory; Extension of the theory to the indeterminate case.
This volume is a collection of papers on philosophy of mathematics which deal with a series of questions quite different from those which occupied the minds of the proponents of the three classic schools: logicism, formalism, and intuitionism. The questions of the volume are not to do with justification in the traditional sense, but with a variety of other topics. Some are concerned with discovery and the growth of mathematics. How does the semantics of mathematics change as the subject develops? What heuristics are involved in mathematical discovery, and do such heuristics constitute a logic of mathematical discovery? What new problems have been introduced by the development of mathematics since the 1930s? Other questions are concerned with the applications of mathematics both to physics and to the new field of computer science. Then there is the new question of whether the axiomatic method is really so essential to mathematics as is often supposed, and the question, which goes back to Wittgenstein, of the sense in which mathematical proofs are compelling. Taken together, these questions give part of an emerging agenda which is likely to carry philosophy of mathematics forward into the twenty-first century.
First published in 1982, this reissue contains a critical exposition of the views of Frege, Dedekind and Peano on the foundations of arithmetic. The last quarter of the 19th century witnessed a remarkable growth of interest in the foundations of arithmetic. This work analyses the reasons for this growth of interest within both mathematics and philosophy, and the ways in which this study of the foundations of arithmetic led to new insights in philosophy and striking advances in logic. This historical-critical study provides an excellent introduction to the problems of the philosophy of mathematics - problems which have wide implications for philosophy as a whole. This reissue will appeal to students of both mathematics and philosophy who wish to improve their knowledge of logic.
This paper takes up a suggestion made by Floridi that the digital revolution is bringing about a profound change in our metaphysics. The paper aims to bring some older views from philosophy of mathematics to bear on this problem. The older views are concerned principally with mathematical realism—that is, the claim that mathematical entities such as numbers exist. The new context for the discussion is informational realism, where the problem shifts to the question of the reality of information. Mathematical realism is perhaps a special case of informational realism. The older views concerned with mathematical realism are the various theories of World 3. The concept of World 3 was introduced by Frege, whose position was close to Plato’s original views. Popper developed the theory of World 3 in a different direction which is characterised as ‘constructive Platonism’. But how is World 3 constructed? This is explored by means of two analogies: the analogy with money, and the analogy with meaning, as explicated by the later Wittgenstein. This leads to the development of an account of informational realism as constructive Aristotelianism. Finally, this version of informational realism is compared with the informational structural realism which Floridi develops in his 2008 and 2009 papers in Synthese. Similarities and differences between the two positions are noted.
This paper begins by developing a causal theory of mechanisms in medicine, and illustrates the theory with the example of the mechanism of the disease anthrax as elucidated by Koch. The causal approach to mechanisms is then compared to the Machamer, Darden, Craver approach. At first sight the two approaches appear to be very different, but it is argued that the divergence is less than it initially seems. There are some differences, however, and it is argued that, where these differences occur, the causal approach to mechanisms is superior.
This paper considers whether philosophy of mathematics could benefit by the introduction of some sociology. It begins by considering Lakatos's arguments that philosophy of science should be kept free of any sociology. An attempt is made to criticize these arguments, and then a positive argument is given for introducing a sociological dimension into the philosophy of mathematics. This argument is illustrated by considering Brouwer's account of numbers as mental constructions. The paper concludes with a critical discussion of Azzouni's view that mathematics differs fundamentally from other social practices.
Hume bases his argument against miracles on an informal principle. This paper gives a formal explication of this principle of Hume’s, and then shows that this explication can be rigorously proved in a Bayesian framework.
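Hume's informal principle is roughly that no testimony suffices to establish a miracle unless the falsehood of the testimony would be more miraculous than the event it reports. A simple Bayesian gloss can be computed directly from Bayes' theorem. The sketch below uses invented numbers purely for illustration; the paper's formal explication and proof may differ in detail.

```python
# Bayesian reading of Hume's principle (illustrative numbers only):
# the posterior P(M|T) exceeds 1/2 only if, roughly,
# P(T|not-M) * P(not-M) < P(T|M) * P(M), i.e. only if the testimony's
# falsehood would be more improbable than the miracle itself.

def posterior(p_m, p_t_given_m, p_t_given_not_m):
    """P(M|T) by Bayes' theorem."""
    num = p_t_given_m * p_m
    return num / (num + p_t_given_not_m * (1.0 - p_m))

# A very small prior for the miracle, and a witness who is reliable
# but not perfectly so (hypothetical values).
p_m = 1e-6               # prior probability of the miracle
p_t_given_m = 0.99       # witness reports M when M occurred
p_t_given_not_m = 1e-3   # witness reports M when M did not occur

p = posterior(p_m, p_t_given_m, p_t_given_not_m)
print(p)  # far below 1/2: the false-testimony hypothesis wins
```

On these numbers the posterior is below 0.001: even a witness wrong only once in a thousand reports cannot overcome a one-in-a-million prior, which is the quantitative core of Hume's point.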
This commentary focuses on the authors’ treatment of Koch’s postulates. It argues in favour of a modification of Koch’s postulates and their analysis in terms of necessary and sufficient conditions. This leads to a criticism of the authors’ treatment of the C. difficile case, and to a query about the need for the criteria of specificity and proportionality.
This paper seeks to analyse the effects on Economics of Research Assessment Systems, such as the Research Assessment Exercise which was carried out in the UK between 1986 and 2008. The paper begins by pointing out that, in the 2008 RAE, economics turned out to be the research area which was accorded the highest valuation of any subject in the UK, even though economists were then under attack for failing to predict the global financial crash which had occurred a few months earlier. One aim of the paper is to explain this economics anomaly in research assessment. The paper goes on to point out a key difference between economics and the natural sciences. Most areas of the natural sciences are dominated for most of the time by a single, generally accepted, paradigm, whereas there are always in economics different schools of thought which have different and highly conflicting paradigms. Given this situation, it is argued that the effect of research assessment systems in economics is to strengthen the majority school in the subject, and weaken the minority schools. This conclusion is supported by empirical data collected by Frederic Lee for the UK. It is then shown that the greater the dominance of the majority school, the higher the overall valuation of the subject is likely to be, and this is used to explain the anomaly noted earlier. It is argued that research in economics flourishes better in a situation in which there are a number of different schools treated equally, than in one in which a single school dominates. The conclusion is that research assessment systems have a negative effect on research in economics and give misleading results. Instead of such systems, an attempt should be made to encourage pluralism in the subject.
The Einsteinian revolution, which began around 1905, was one of the most remarkable in the history of physics. It replaced Newtonian mechanics, which had been accepted as completely correct for nearly 200 years, by the special and general theories of relativity. It also eliminated the aether, which had dominated physics throughout the nineteenth century. This paper poses the question of why this momentous scientific revolution began. The suggested answer is in terms of the remarkable series of discoveries and inventions which occurred in the preceding decade and which were the result of technological developments in instrumentation. The paper gives a survey of these inventions and discoveries, which include X-rays, radioactivity, the electron, wireless transmissions across the Atlantic and the patenting of the first thermionic valve. An attempt is then made to show that it was these developments that gave rise to special relativity.
The development of causal modelling since the 1950s has been accompanied by a number of controversies, the most striking of which concerns the Markov condition. Reichenbach's conjunctive forks did satisfy the Markov condition, while Salmon's interactive forks did not. Subsequently some experts in the field have argued that adequate causal models should always satisfy the Markov condition, while others have claimed that non-Markovian causal models are needed in some cases. This paper argues for the second position by considering the multi-causal forks, which are widespread in contemporary medicine (Section 2). A non-Markovian causal model for such forks is introduced and shown to be mathematically tractable (Sections 6, 7, and 8). The paper also gives a general discussion of the controversy about the Markov condition (Section 1), and of the related controversy about probabilistic causality (Sections 3, 4, and 5).
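The Markov condition at issue can be illustrated with a small simulation. In a Markovian fork, conditioning on the common cause C screens off the two effects A and B from each other; in a non-Markovian fork the effects stay correlated even given C. The toy model and all its parameters below are invented for illustration and are not the model introduced in the paper.

```python
import random

random.seed(0)

def simulate(markov, n=200_000):
    """Estimate Cov(A, B | C=1); approximately 0 means C screens A off from B."""
    a_vals, b_vals = [], []
    for _ in range(n):
        if random.random() >= 0.5:   # keep only trials where the cause C occurs
            continue
        if markov:
            # Markovian fork: given C, A and B are produced independently.
            a = random.random() < 0.8
            b = random.random() < 0.8
        else:
            # Non-Markovian fork: a shared disturbance U links A and B
            # even after conditioning on C (hypothetical mechanism).
            u = random.random() < 0.5
            p = 0.95 if u else 0.65
            a = random.random() < p
            b = random.random() < p
        a_vals.append(a)
        b_vals.append(b)
    m = len(a_vals)
    pa = sum(a_vals) / m
    pb = sum(b_vals) / m
    pab = sum(x and y for x, y in zip(a_vals, b_vals)) / m
    return pab - pa * pb

print(simulate(markov=True))    # close to 0: screening off holds
print(simulate(markov=False))   # clearly positive: screening off fails
```

In the non-Markovian case the theoretical conditional covariance is 0.5*(0.95**2 + 0.65**2) - 0.8**2 = 0.0225, so the simulated value should sit near that; the Markovian value should hover near zero.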
Dynamic interaction is said to occur when two significantly different fields A and B come into relation, and their interaction is dynamic in the sense that at first the flow of ideas is principally from A to B, but later ideas from B come to influence A. Two examples are given of dynamic interactions with the philosophy of mathematics. The first is with philosophy of science, and the second with computer science. The analysis enables Lakatos to be characterised as the first to develop the philosophy of mathematics using ideas taken from the philosophy of science.
The Research Assessment Exercise was introduced in 1986 by Thatcher, and was continued by Blair. So it has now been running for 21 years. During this time, the rules governing the RAE have changed considerably, and the interval between successive RAEs has also varied. These changes are not of great importance as far as the argument of this paper is concerned. We will concentrate on the main features of the RAE which can be summarised as follows.