  • Objective Bayesianism, Bayesian conditionalisation and voluntarism. Jon Williamson - 2011 - Synthese 178 (1):67-85.
    Objective Bayesianism has been criticised on the grounds that objective Bayesian updating, which on a finite outcome space appeals to the maximum entropy principle, differs from Bayesian conditionalisation. The main task of this paper is to show that this objection backfires: the difference between the two forms of updating reflects negatively on Bayesian conditionalisation rather than on objective Bayesian updating. The paper also reviews some existing criticisms and justifications of conditionalisation, arguing in particular that the diachronic Dutch book justification fails (...)
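The updating rule at issue in Williamson's paper can be sketched concretely. The snippet below is my own minimal illustration of standard Bayesian conditionalisation on a finite outcome space (not Williamson's objective Bayesian machinery): on learning evidence E, worlds outside E are zeroed out and the rest renormalised.

```python
def conditionalise(prior, evidence):
    """Bayesian conditionalisation on a finite outcome space.

    prior: dict mapping world -> probability; evidence: set of worlds learned true.
    """
    total = sum(p for w, p in prior.items() if w in evidence)
    if total == 0:
        raise ValueError("evidence has prior probability zero")
    return {w: (p / total if w in evidence else 0.0) for w, p in prior.items()}

# Fair die; learn that the outcome is even.
prior = {i: 1 / 6 for i in range(1, 7)}
post = conditionalise(prior, {2, 4, 6})
print(post[2])  # each even outcome now carries probability 1/3
```

Objective Bayesian updating differs precisely where this rule is silent, e.g. when the evidence constrains probabilities rather than singling out an event.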
  • Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem. Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized on a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can help to (...)
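The "puzzling fact" Vasudevan discusses can be checked numerically. The sketch below is my own illustration using the standard three-cell presentation of van Fraassen's example (cells A1, A2, A3 with a uniform prior; the new information is P(A2 | A2 ∪ A3) = 3/4). It grid-searches the one free parameter of the constrained family, minimizing Kullback-Leibler divergence from the uniform prior, and confirms that the region conditioned on drops below its uniform value of 2/3.

```python
import math

def kl_from_uniform(ps):
    # Kullback-Leibler divergence from the uniform prior on 3 cells.
    return sum(p * math.log(3 * p) for p in ps if p > 0)

best_r, best_kl = None, float("inf")
for k in range(1, 10000):
    r = k / 10000                      # r = posterior probability of A2 ∪ A3
    ps = (1 - r, 0.75 * r, 0.25 * r)   # enforces P(A2 | A2 ∪ A3) = 3/4
    kl = kl_from_uniform(ps)
    if kl < best_kl:
        best_r, best_kl = r, kl

# The closed-form minimum is r = 4 / (4 + 3**0.75) ≈ 0.637, strictly
# below the uniform value 2/3 -- the fact the Judy Benjamin problem turns on.
print(round(best_r, 3))
```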
  • Updating Probability: Tracking Statistics as Criterion. Bas C. van Fraassen & Joseph Y. Halpern - 2016 - British Journal for the Philosophy of Science:axv027.
    ABSTRACT For changing opinion, represented by an assignment of probabilities to propositions, the criterion proposed is motivated by the requirement that the assignment should have, and maintain, the possibility of matching in some appropriate sense statistical proportions in a population. This ‘tracking’ criterion implies limitations on policies for updating in response to a wide range of types of new input. Satisfying the criterion is shown equivalent to the principle that the prior must be a convex combination of the possible posteriors. (...)
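The abstract's closing principle, that the prior must be a convex combination of the possible posteriors, holds for conditionalisation by the law of total probability: the weights are the prior probabilities of the evidence cells. A quick numerical check (my own numbers, not from the paper):

```python
# Prior on four worlds; the possible evidence forms the partition {E1, E2}.
prior = [0.1, 0.3, 0.2, 0.4]
E1, E2 = [0, 1], [2, 3]

def posterior(cell):
    # Posterior obtained by conditionalising on the given evidence cell.
    total = sum(prior[w] for w in cell)
    return [prior[w] / total if w in cell else 0.0 for w in range(4)]

post1, post2 = posterior(E1), posterior(E2)
w1 = sum(prior[w] for w in E1)   # weight = prior probability of E1
w2 = 1 - w1

# The convex combination of the possible posteriors recovers the prior.
mixture = [w1 * a + w2 * b for a, b in zip(post1, post2)]
print(mixture)  # equals the prior, up to float rounding
```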
  • Can the maximum entropy principle be explained as a consistency requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
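A concrete instance of the principle Uffink examines is the dice example from Jaynes' Brandeis lectures: assign probabilities to a die's faces given only that the mean is 4.5. The maximum-entropy solution lies in an exponential family, p_i ∝ exp(λ·i); the sketch below (my own implementation) solves for λ by bisection, exploiting the fact that the mean is increasing in λ.

```python
import math

def maxent_die(target_mean, lo=-5.0, hi=5.0):
    """Maximum-entropy distribution on {1,...,6} with a fixed mean."""
    def mean(lam):
        ws = [math.exp(lam * i) for i in range(1, 7)]
        z = sum(ws)
        return sum(i * w for i, w in zip(range(1, 7), ws)) / z
    # Bisection on lam: mean(lam) grows monotonically from ~1 to ~6.
    for _ in range(100):
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    ws = [math.exp(lam * i) for i in range(1, 7)]
    z = sum(ws)
    return [w / z for w in ws]

p = maxent_die(4.5)
print(sum(i * q for i, q in zip(range(1, 7), p)))  # mean constraint met: 4.5
```

With no constraint beyond the mean 3.5 the method returns the uniform distribution, which is how it extends the classical principle of insufficient reason.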
  • The Application of Constraint Semantics to the Language of Subjective Uncertainty. Eric Swanson - 2016 - Journal of Philosophical Logic 45 (2):121-146.
    This paper develops a compositional, type-driven constraint semantic theory for a fragment of the language of subjective uncertainty. In the particular application explored here, the interpretation function of constraint semantics yields not propositions but constraints on credal states as the semantic values of declarative sentences. Constraints are richer than propositions in that constraints can straightforwardly represent assessments of the probability that the world is one way rather than another. The richness of constraints helps us model communicative acts in essentially the (...)
  • Peer Disagreement: A Call for the Revision of Prior Probabilities. Sven Rosenkranz & Moritz Schulz - 2015 - Dialectica 69 (4):551-586.
    The current debate about peer disagreement has so far mainly focused on the question of whether peer disagreements provide genuine counterevidence to which we should respond by revising our credences. By contrast, comparatively little attention has been devoted to the question by which process, if any, such revision should be brought about. The standard assumption is that we update our credences by conditionalizing on the evidence that peer disagreements provide. In this paper, we argue that non-dogmatist views have good reasons (...)
  • A new resolution of the Judy Benjamin Problem. Igor Douven & Jan-Willem Romeijn - 2011 - Mind 120 (479):637-670.
A paper on how to adapt your probabilistic beliefs when learning a conditional.
  • The principle of maximum entropy and a problem in probability kinematics. Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of MAXENT. This article shows that an intuitive approach to Judy Benjamin’s case supports MAXENT. This (...)
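Jeffrey conditioning, the second rule Lukits mentions, handles evidence that shifts the probability of a partition cell without making it certain: each cell's internal conditional profile is preserved while its total weight moves to the new value. A minimal sketch (my own toy numbers):

```python
def jeffrey(prior, partition, new_weights):
    """Jeffrey conditioning on a finite outcome space.

    prior: dict world -> probability; partition: list of lists of worlds;
    new_weights: the evidence-driven new probability of each cell.
    """
    post = {}
    for cell, q in zip(partition, new_weights):
        cell_mass = sum(prior[w] for w in cell)
        for w in cell:
            # Rescale within the cell so P(w | cell) is unchanged.
            post[w] = q * prior[w] / cell_mass
    return post

prior = {"a": 0.2, "b": 0.3, "c": 0.5}
# A glimpse in dim light shifts the probability of {a, b} from 0.5 to 0.8.
post = jeffrey(prior, [["a", "b"], ["c"]], [0.8, 0.2])
print({w: round(p, 2) for w, p in post.items()})  # {'a': 0.32, 'b': 0.48, 'c': 0.2}
```

The Judy Benjamin case is troublesome precisely because the evidence there constrains a conditional probability, which neither this rule nor standard conditioning directly accommodates.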
  • On Indeterminate Updating of Credences. Leendert Huisman - 2014 - Philosophy of Science 81 (4):537-557.
    The strategy of updating credences by minimizing the relative entropy has been questioned by many authors, most strongly by means of the Judy Benjamin puzzle. I present a new analysis of Judy Benjamin–like forms of new information and defend the thesis that in general the rational posterior is indeterminate, meaning that a family of posterior credence functions rather than a single one is the rational response when that type of information becomes available. The proposed thesis extends naturally to all cases (...)
  • Bayesian conditionalization and probability kinematics. Colin Howson & Allan Franklin - 1994 - British Journal for the Philosophy of Science 45 (2):451-466.
  • Learning from Conditionals. Benjamin Eva, Stephan Hartmann & Soroush Rafiee Rad - 2020 - Mind 129 (514):461-508.
    In this article, we address a major outstanding question of probabilistic Bayesian epistemology: how should a rational Bayesian agent update their beliefs upon learning an indicative conditional? A number of authors have recently contended that this question is fundamentally underdetermined by Bayesian norms, and hence that there is no single update procedure that rational agents are obliged to follow upon learning an indicative conditional. Here we resist this trend and argue that a core set of widely accepted Bayesian norms is (...)
  • Bayesian argumentation and the value of logical validity. Benjamin Eva & Stephan Hartmann - 2018 - Psychological Review 125 (5):806-821.
    According to the Bayesian paradigm in the psychology of reasoning, the norms by which everyday human cognition is best evaluated are probabilistic rather than logical in character. Recently, the Bayesian paradigm has been applied to the domain of argumentation, where the fundamental norms are traditionally assumed to be logical. Here, we present a major generalisation of extant Bayesian approaches to argumentation that utilizes a new class of Bayesian learning methods that are better suited to modelling dynamic and conditional inferences than (...)
  • Ramsey’s test, Adams’ thesis, and left-nested conditionals. Richard Dietz & Igor Douven - 2010 - Review of Symbolic Logic 3 (3):467-484.
    Adams famously suggested that the acceptability of any indicative conditional whose antecedent and consequent are both factive sentences amounts to the subjective conditional probability of the consequent given the antecedent. The received view has it that this thesis offers an adequate partial explication of Ramsey’s test, which characterizes graded acceptability for conditionals in terms of hypothetical updates on the antecedent. Some results in van Fraassen may raise hope that this explicatory approach to Ramsey’s test is extendible to left-nested conditionals, that (...)
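Adams' thesis, as the abstract states it, equates the acceptability of an indicative conditional with the subjective conditional probability of its consequent given its antecedent. A small numerical illustration (my own example, not from the paper):

```python
# Adams' thesis: acceptability of "if A then B" = P(B | A).
# Fair die; conditional: "if the roll is even, it is at least 4".

worlds = range(1, 7)
P = {w: 1 / 6 for w in worlds}
A = {2, 4, 6}   # antecedent: the roll is even
B = {4, 5, 6}   # consequent: the roll is at least 4

p_A = sum(P[w] for w in A)
p_AB = sum(P[w] for w in A & B)
acceptability = p_AB / p_A
print(acceptability)  # 2/3: two of the three even outcomes are >= 4
```

The left-nested case the paper takes up, conditionals whose antecedents are themselves conditionals, is exactly where this simple ratio analysis gives out.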