We present a conservative extension of a Bayesian account of confirmation that can deal with the problem of old evidence and new theories. So-called open-minded Bayesianism challenges the assumption—implicit in standard Bayesianism—that the correct empirical hypothesis is among the ones currently under consideration. It requires the inclusion of a catch-all hypothesis, which is characterized by means of sets of probability assignments. Upon the introduction of a new theory, the former catch-all is decomposed into a new empirical hypothesis and a new catch-all. As will be seen, this motivates a second update rule, besides Bayes’ rule, for updating probabilities in light of a new theory. This rule conserves probability ratios among the old hypotheses. This framework allows for old evidence to confirm a new hypothesis due to a shift in the theoretical context. The result is a version of Bayesianism that, in the words of Earman, “keep[s] an open mind, but not so open that your brain falls out”.
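A minimal sketch of the second update rule described here, under the simplifying assumption that the catch-all carries a point probability rather than a set of assignments (the function name and that simplification are mine, not the paper's): the new theory is carved out of the catch-all, so the probabilities, and hence the mutual ratios, of the previously formulated hypotheses are untouched.

```python
def introduce_theory(hyps, catchall, new_theory, mass):
    """Decompose the catch-all into a new hypothesis and a residual catch-all.

    hyps: dict mapping the current empirical hypotheses to probabilities;
    catchall: probability of "none of the above"; mass: probability carved
    out of the catch-all for the new theory. The old hypotheses keep their
    probabilities, so their mutual ratios are conserved.
    """
    assert 0.0 <= mass <= catchall
    new_hyps = dict(hyps)
    new_hyps[new_theory] = mass
    return new_hyps, catchall - mass

hyps, rest = introduce_theory({"H1": 0.5, "H2": 0.3}, 0.2, "H3", 0.15)
```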
This paper offers a new angle on the common idea that the process of science does not support epistemic diversity. Under minimal assumptions on the nature of journal editing, we prove that editorial procedures, even when impartial in themselves, disadvantage less prominent research programs. This purely statistical bias in article selection further skews existing differences in the success rate and hence attractiveness of research programs, and exacerbates the reputation difference between the programs. After a discussion of the modeling assumptions, the paper ends with a number of recommendations that may help promote scientific diversity through editorial decision making.
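The paper's result is a theorem about editorial procedures; as a loose illustration of how an existing reputation difference can compound, here is a toy rich-get-richer simulation, a Pólya-urn dynamic I am assuming purely for illustration, not the paper's model: each round one paper is published, with probability proportional to each program's current publication count, so every round is impartial given the counts and yet the absolute gap between programs tends to grow.

```python
import random

random.seed(0)
counts = {"prominent": 10, "marginal": 5}   # initial reputation difference
for _ in range(1000):
    total = sum(counts.values())
    # Impartial given the counts, but proportional to visibility:
    winner = "prominent" if random.random() * total < counts["prominent"] else "marginal"
    counts[winner] += 1
print(counts)  # expected shares stay 2:1, so the absolute gap keeps widening
```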
List and Pettit have stated an impossibility theorem about the aggregation of individual opinion states. Building on recent work on the lottery paradox, this paper offers a variation on that result. The present result places different constraints on the voting agenda and the domain of profiles, but it covers a larger class of voting rules, which need not satisfy the proposition-wise independence of votes.
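The flavour of such impossibility results can be seen in the discursive dilemma, a standard example and not the paper's new result: propositionwise majority voting over p, q, and their conjunction can produce an inconsistent collective judgment from individually consistent voters.

```python
# Three consistent voters judging (p, q, p-and-q):
profile = [
    (True,  True,  True),
    (True,  False, False),
    (False, True,  False),
]
majority = tuple(sum(voter[i] for voter in profile) >= 2 for i in range(3))
print(majority)  # (True, True, False): p and q accepted, their conjunction rejected
```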
We represent consensus formation processes based on iterated opinion pooling as a dynamic approach to common knowledge of posteriors (Aumann in Ann Stat 4:1236–1239, 1976; Geanakoplos and Polemarchakis in J Econ Theory 28:192–200, 1982). We thus provide a concrete and plausible Bayesian rationalization of consensus through iterated pooling. The link clarifies the conditions under which iterated pooling can be rationalized from a Bayesian perspective, and offers an understanding of iterated pooling in terms of higher-order beliefs.
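A minimal sketch of iterated opinion pooling in the DeGroot style (my illustrative setup, not the paper's Bayesian rationalization): each round, every agent replaces her probability with a weighted average of everyone's probabilities, and under mild conditions on the weights the group converges to consensus.

```python
import numpy as np

W = np.array([[0.6, 0.3, 0.1],    # row-stochastic weight matrix:
              [0.2, 0.6, 0.2],    # how much each agent heeds the others
              [0.1, 0.4, 0.5]])
q = np.array([0.9, 0.4, 0.2])     # initial probabilities for some event

for _ in range(50):               # iterate the pooling step
    q = W @ q
print(q)                          # all three agents end up with (near) equal values
```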
The frequent occurrence of comorbidity has brought about an extensive theoretical debate in psychiatry. Why are the rates of psychiatric comorbidity so high and what are their implications for the ontological and epistemological status of comorbid psychiatric diseases? Current explanations focus either on classification choices or on causal ties between disorders. Based on empirical and philosophical arguments, we propose a conventionalist interpretation of psychiatric comorbidity instead. We argue that a conventionalist approach fits well with research and clinical practice and resolves two problems for psychiatric diseases: experimenter’s regress and arbitrariness.
This article presents a generalization of the Condorcet Jury Theorem. All results to date assume a fixed value for the competence of jurors, or alternatively, a fixed probability distribution over the possible competences of jurors. In this article, we develop the idea that we can learn the competence of the jurors from the jury vote. We assume a uniform prior probability assignment over the competence parameter, and we adapt this assignment in the light of the jury vote. We then compute the posterior probability, conditional on the jury vote, of the hypothesis voted over. We thereby retain the central results of Condorcet, but we also show that the posterior probability depends on the size of the jury as well as on the absolute margin of the majority.
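A numerical sketch of the computation described here, under two illustrative assumptions of mine: the uniform prior over juror competence is supported on [1/2, 1] (jurors are no worse than chance), and prior odds on the hypothesis are even. It exhibits the stated dependence: juries with the same absolute margin but different sizes yield different posteriors.

```python
import numpy as np

def posterior(n, k, grid=200001):
    """Posterior probability of H after k of n jurors vote for it,
    integrating out a competence p with uniform prior on [1/2, 1]."""
    p = np.linspace(0.5, 1.0, grid)
    like_true = p**k * (1 - p)**(n - k)      # if H is true, each votes for H w.p. p
    like_false = (1 - p)**k * p**(n - k)     # if H is false, w.p. 1 - p
    return like_true.sum() / (like_true.sum() + like_false.sum())

# Same absolute margin of 3, different jury sizes, different posteriors:
print(posterior(5, 4), posterior(25, 14))
```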
This paper explores the fact that linear opinion pooling can be represented as a Bayesian update on the opinions of others. It uses this fact to propose a new interpretation of the pooling weights. Relative to certain modelling assumptions, the weights can be equated with the truth-conduciveness known from the context of Condorcet's jury theorem. This suggests a novel way to elicit the weights.
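As a sketch of the intended use of the weights (the weighting scheme below is a simple placeholder of mine, not the paper's elicitation procedure): set each agent's weight proportional to her assumed truth-conduciveness and pool linearly.

```python
def linear_pool(opinions, competences):
    """Weighted average of probabilities, with weights proportional to
    each agent's (assumed) truth-conduciveness."""
    total = sum(competences)
    return sum(c / total * q for c, q in zip(competences, opinions))

print(linear_pool([0.9, 0.4, 0.2], [0.8, 0.6, 0.55]))
```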
This paper develops a probabilistic model of belief change under interpretation shifts, in the context of a problem case from dynamic epistemic logic. Van Benthem [4] has shown that a particular kind of belief change, typical for dynamic epistemic logic, cannot be modelled by standard Bayesian conditioning. I argue that the problems described by van Benthem come about because the belief change alters the semantics in which the change is supposed to be modelled: the new information induces a shift in the interpretation of the sentences. In this paper I show that interpretation shifts can be modelled in terms of updating by conditioning. The model derives from the knowledge structures developed by Fagin et al. [8], and hinges on a distinction between the propositional and informational content of sentences. Finally, I show that Dempster-Shafer theory provides the appropriate probability kinematics for the model.
This paper studies the use of hypotheses schemes in generating inductive predictions. After discussing Carnap–Hintikka inductive logic, hypotheses schemes are defined and illustrated with two partitions. One partition results in the Carnapian continuum of inductive methods, the other results in predictions typical for hasty generalization. Following these examples I argue that choosing a partition comes down to making inductive assumptions on patterns in the data, and that by choosing appropriately any inductive assumption can be made. Further considerations on partitions make clear that they do not suggest any solution to the problem of induction. Hypotheses schemes provide the tools for making inductive assumptions, but they also reveal the need for such assumptions.
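For reference, the predictive rule of the Carnapian continuum that the first partition gives rise to: with k categories, tuning parameter λ, and n_i observations in category i out of n in total, the probability that the next observation falls in category i is (n_i + λ/k)/(n + λ).

```python
def carnap_prediction(counts, i, lam):
    """Carnapian continuum of inductive methods: probability that the next
    observation falls in category i, given counts per category and lambda."""
    n, k = sum(counts), len(counts)
    return (counts[i] + lam / k) / (n + lam)

print(carnap_prediction([8, 2], 0, lam=2))   # (8 + 1) / (10 + 2) = 0.75
```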
This paper addresses the problem that Bayesian statistical inference cannot accommodate theory change, and proposes a framework for dealing with such changes. It first presents a scheme for generating predictions from observations by means of hypotheses. An example shows how the hypotheses represent the theoretical structure underlying the scheme. This is followed by an example of a change of hypotheses. The paper then presents a general framework for hypotheses change, and proposes the minimization of the distance between hypotheses as a rationality criterion. Finally the paper discusses the import of this for Bayesian statistical inference.
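A sketch of one way to cash out the proposed rationality criterion, reading "distance" as Kullback-Leibler divergence between predictive distributions (that reading, and the softmax parameterization, are my illustrative assumptions): choose weights for the new hypotheses so that the new predictive distribution stays as close as possible to the old one.

```python
import numpy as np
from scipy.optimize import minimize

def nearest_weights(p_old, Q):
    """Weights v over the new hypotheses (rows of Q: predictive distributions
    over the same outcomes) minimizing KL(p_old || v @ Q)."""
    def kl(z):
        v = np.exp(z) / np.exp(z).sum()      # softmax keeps v a probability vector
        return float(np.sum(p_old * np.log(p_old / (v @ Q))))
    z = minimize(kl, np.zeros(len(Q))).x
    return np.exp(z) / np.exp(z).sum()

p_old = np.array([0.5, 0.3, 0.2])            # old predictive distribution
Q = np.array([[0.7, 0.2, 0.1],               # predictive distributions of the
              [0.2, 0.5, 0.3]])              # two new hypotheses
print(nearest_weights(p_old, Q))
```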
This article investigates a problem for statistical model evaluation, in particular for curve fitting: by employing a different family of curves we can fit any scatter plot almost perfectly at apparently minor cost in terms of model complexity. The problem is resolved by an appeal to prior probabilities. This leads to some general lessons about how to approach model evaluation.
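The problem in miniature, using a standard polynomial example assumed here for illustration: with ten data points from a noisy line, a degree-nine polynomial, which costs only a handful of extra parameters on common complexity scores, fits the scatter plot exactly; only a prior that penalizes such families blocks the move.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 10)
y = 2 * x + rng.normal(0, 0.1, size=10)   # data generated by a simple line

line = np.polyfit(x, y, deg=1)            # 2 parameters, small residuals
wild = np.polyfit(x, y, deg=9)            # 10 parameters, zero residuals
print(np.polyval(wild, x) - y)            # residuals are (numerically) zero everywhere
```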
Until now, antirealists have offered sketches of a theory of truth, at best. In this paper, we present a probabilist account of antirealist truth in some formal detail, and we assess its ability to deal with the problems that are standardly taken to beset antirealism.
This paper investigates the viability of the Bayesian model of belief change. Van Benthem (2003) has shown that a particular kind of information change typical for dynamic epistemic logic cannot be modelled by Bayesian conditioning. I argue that the problems described by van Benthem come about because the information change alters the semantics in which the change is supposed to be modelled by conditioning: it induces a shift in meanings. I then show that meaning shifts can be modelled in terms of conditioning by employing a semantics that makes these changes in meaning explicit, and that the appropriate probability kinematics can be described by Dempster’s rule. The new model thereby facilitates a better understanding of the relation between probabilistic epistemology and dynamic epistemic logic.
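For concreteness, Dempster's rule of combination, which the paper identifies as the appropriate kinematics (the mass functions below are arbitrary toy values of mine): masses of intersecting focal sets are multiplied, and the mass landing on empty intersections is renormalized away.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over frozensets by Dempster's rule."""
    combined, conflict = {}, 0.0
    for a, x in m1.items():
        for b, y in m2.items():
            c = a & b
            if c:
                combined[c] = combined.get(c, 0.0) + x * y
            else:
                conflict += x * y          # mass on empty intersections
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

m1 = {frozenset({"p"}): 0.7, frozenset({"p", "q"}): 0.3}
m2 = {frozenset({"q"}): 0.4, frozenset({"p", "q"}): 0.6}
print(dempster_combine(m1, m2))
```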
This paper presents the progicnet programme. It proposes a general framework for probabilistic logic that can guide inference based on both logical and probabilistic input. After an introduction to the framework as such, it is illustrated by means of a toy example from psychometrics. It is shown that the framework can accommodate a number of approaches to probabilistic reasoning: Bayesian statistical inference, evidential probability, probabilistic argumentation, and objective Bayesianism. The framework thus provides insight into the relations between these approaches, illustrates how the results of different approaches can be combined, and provides a basis for doing efficient inference in each of the approaches.
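A sketch of the kind of question the framework addresses (the toy premises are my own choosing): given premises with probabilities attached, the strength of a conclusion comes out as an interval, computable here by linear programming over the probabilities of the four possible worlds.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Premises: P(a) = 0.8 and P(a -> b) = 0.9. What interval does P(b) get?
worlds = list(itertools.product([0, 1], repeat=2))          # values of (a, b)
a_col = np.array([float(a) for a, b in worlds])
impl = np.array([float((1 - a) or b) for a, b in worlds])   # material a -> b
b_col = np.array([float(b) for a, b in worlds])

A_eq = np.vstack([np.ones(4), a_col, impl])                 # normalization + premises
b_eq = np.array([1.0, 0.8, 0.9])
lo = linprog(b_col, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4)
hi = linprog(-b_col, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 4)
print(lo.fun, -hi.fun)   # P(b) is confined to [0.7, 0.9]
```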
In this paper I discuss probabilistic models of experimental intervention, and I show that such models elucidate the intuition that observations during intervention are more informative than observations per se. Because of this success, it seems attractive to also cast other problems addressed by the philosophy of experimentation in terms of such probabilistic models. However, a critical examination of the models reveals that some of the aspects of experimentation are covered up rather than resolved by probabilistic modelling. I end by drawing a number of general lessons on the use of formal methods in the philosophy of science.
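A standard confounding example of the sort such models formalize (the numbers are arbitrary, and this is not one of the paper's own cases): conditioning on an observed X = 1 mixes in information about the confounder Z, while intervening severs the arrow from Z to X, which is one way to see why interventions carry different, and often more useful, information.

```python
pz = {0: 0.5, 1: 0.5}                       # confounder Z
px1_given_z = {0: 0.2, 1: 0.8}              # Z raises the chance of X = 1
py1_given_xz = {(0, 0): 0.1, (1, 0): 0.5,   # Y depends on both X and Z
                (0, 1): 0.4, (1, 1): 0.8}

# Observation: P(Y=1 | X=1) weighs Z by how likely it makes X = 1.
num = sum(pz[z] * px1_given_z[z] * py1_given_xz[(1, z)] for z in (0, 1))
den = sum(pz[z] * px1_given_z[z] for z in (0, 1))
print(num / den)                            # 0.74

# Intervention: P(Y=1 | do(X=1)) averages over Z's own distribution.
print(sum(pz[z] * py1_given_xz[(1, z)] for z in (0, 1)))   # 0.65
```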
An inductive logic is a system of inference that describes the relation between propositions on data, and propositions that extend beyond the data, such as predictions over future data, and general conclusions on all possible data. Statistics, on the other hand, is a mathematical discipline that describes procedures for deriving results about a population from sample data. These results include predictions on future samples, decisions on rejecting or accepting a hypothesis about the population, the determination of probability assignments over such hypotheses, the selection of a statistical model for studying the population, and so on. Both inductive logic and statistics are calculi for getting from the given data to propositions or results that transcend the data.
Summary. This paper proposes a common framework for various probabilistic logics. It consists of a set of uncertain premises with probabilities attached to them. This raises the question of the strength of a conclusion, but without imposing a particular semantics, no general solution is possible. The paper discusses several possible semantics by looking at the framework from the perspective of probabilistic argumentation.
Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer specific issues that arise from the study of processes, one cannot expect them to provide constraints in general.
In psychiatry, many scientists desire to move from a classification system based on symptoms toward a system based on biological causes. The idea is that psychiatric diseases should be redefined such that each disease would be associated with specific biological causes. This desire is intelligible because causal disease models often facilitate understanding and identification of new ways to intervene in disease processes. In its attempt to move from syndromal to specific etiological definitions, psychiatry follows the trend of general medicine. Current psychiatric...
We consider the use of interventions for resolving a problem of unidentified statistical models. The leading examples are from latent variable modelling, an influential statistical tool in the social sciences. We first explain the problem of statistical identifiability and contrast it with the identifiability of causal models. We then draw a parallel between the latent variable models and Bayesian networks with hidden nodes. This allows us to clarify the use of interventions for dealing with unidentified statistical models. We end by discussing the philosophical and methodological import of our result.
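The problem of statistical identifiability in miniature, using a textbook one-factor example rather than one drawn from the paper: two different parameter settings imply exactly the same observed covariance matrix, so no amount of data can distinguish them.

```python
import numpy as np

lam = np.array([[0.8], [0.6], [0.7]])       # factor loadings
psi = np.diag([0.36, 0.64, 0.51])           # residual variances

def implied_cov(lam, phi, psi):
    """Covariance implied by a one-factor model with factor variance phi."""
    return lam * phi @ lam.T + psi

c = 2.0
same = np.allclose(implied_cov(lam, 1.0, psi),
                   implied_cov(lam / c, c**2, psi))
print(same)   # True: rescaled loadings and factor variance are indistinguishable
```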
Over the past few decades the probabilistic model of rational belief has enjoyed increasing interest from researchers in epistemology and the philosophy of science. Of course, such probabilistic models were used for much longer in economics, in game theory, and in other disciplines concerned with decision making. Moreover, Carnap and co-workers used probability theory to explicate philosophical notions of confirmation and induction, thereby targeting epistemic rather than decision-theoretic aspects of rationality. However, following Carnap’s early applications, philosophy has more recently seen an increased popularity of probabilistic models in other areas concerned with the philosophical analysis of belief: there are models targeting coherence, informativeness, simplicity, and so on. In brief, the probabilistic model of belief comprises a language, detailing the propositions about which an agent is supposed to have beliefs, and a function over the language that expresses beliefs.
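A toy rendering of the two components just mentioned (the example language and the numbers are mine): worlds are valuations of the atomic sentences, propositions are sets of worlds, and beliefs are given by a probability function over them, updated by conditioning.

```python
from itertools import product
from fractions import Fraction

worlds = list(product([False, True], repeat=2))   # valuations of (rain, wind)
P = {w: Fraction(1, 4) for w in worlds}           # a uniform credence function

def belief(P, prop):
    """Degree of belief in a proposition, given as a predicate on worlds."""
    return sum(p for w, p in P.items() if prop(w))

def condition(P, ev):
    """Bayesian conditioning: renormalize the probability on the evidence."""
    z = sum(p for w, p in P.items() if ev(w))
    return {w: (p / z if ev(w) else Fraction(0)) for w, p in P.items()}

rains = lambda w: w[0]
print(belief(condition(P, lambda w: w[1]), rains))   # belief in rain, given wind
```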
This edited collection showcases some of the best recent research in the philosophy of science. It comprises thematically arranged papers presented at the 5th conference of the European Philosophy of Science Association, covering a broad variety of topics within general philosophy of science, and philosophical issues pertaining to specific sciences. The collection will appeal to researchers with an interest in the philosophical underpinnings of their own discipline, and to philosophers who wish to study the latest work on the themes discussed.
This article comments on the article by Thorn and Schurz in this volume and focuses on what we call the problem of parasitic experts. We argue that both meta-induction and crowd wisdom can be understood as pertaining to absolute reliability rather than comparative optimality, and we suggest that this involvement of reliability provides a handle on the problem.
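To make the problem concrete, here is a simple success-weighted rule of my own construction standing in for Thorn and Schurz's meta-inductive strategies: a parasitic expert simply repeats the meta-inductivist's previous prediction, and thereby accumulates a success record without contributing any independent reliability.

```python
import random

random.seed(1)
events = [random.random() < 0.7 for _ in range(500)]   # the truth, round by round
success = {"good": 0, "bad": 0, "parasite": 0}
mi_prev = False                                        # meta-inductivist's last call
for e in events:
    preds = {"good": True, "bad": False, "parasite": mi_prev}
    total = sum(success.values()) or 1
    weight_for_true = sum(success[x] / total for x in preds if preds[x])
    mi_pred = weight_for_true >= 0.5                   # success-weighted vote
    for x in preds:
        success[x] += preds[x] == e
    mi_prev = mi_pred
print(success)   # the parasite's record shadows the best expert's
```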
This article argues that time-asymmetric processes in spacetime are enantiomorphs. Subsequently, the Kantian puzzle concerning enantiomorphs in space is reviewed to introduce a number of positions concerning enantiomorphy, and to arrive at a dilemma: one must either reject that orientations of enantiomorphs are determinate, or furnish space or objects with orientation. The discussion on space is then used to derive two problems in the debate on the direction of time. First, it is shown that certain kinds of reductionism about the direction of time are at variance with the claim that orientation of enantiomorphic objects is intrinsic. Second, it is argued that reductive explanations of time-asymmetric processes presuppose that enantiomorphic processes do not have determinate orientation.