  1. On the A Priori and A Posteriori Assessment of Probabilities. Anubav Vasudevan - 2013 - Journal of Applied Logic 11 (4): 440-451.
    We argue that in spite of their apparent dissimilarity, the methodologies employed in the a priori and a posteriori assessment of probabilities can both be justified by appeal to a single principle of inductive reasoning, viz., the principle of symmetry. The difference between these two methodologies consists in the way in which information about the single-trial probabilities in a repeatable chance process is extracted from the constraints imposed by this principle. In the case of a posteriori reasoning, these constraints inform (...)
  2. Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem. Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3): 1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event's conditional probability, the maximum entropy distribution will almost always assign to the conditioning event a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can help to (...)
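    A minimal numeric sketch of the puzzling fact the abstract describes (my illustration, using van Fraassen's standard three-cell setup and an assumed constraint value of 3/4; scipy is assumed available):

```python
# Sketch of the Judy Benjamin problem (illustrative values, not from the paper).
# Three equiprobable cells: A1 = Red HQ, A2 = Red 2nd company, A3 = Blue.
# The agent learns only a conditional constraint: P(A1 | Red) = 3/4, i.e. q1 = 3*q2.
# The maximum entropy update minimizes KL(q || uniform) subject to that constraint.
from math import log

from scipy.optimize import minimize_scalar

def kl_to_uniform(t):
    # Parameterize the constraint set: q2 = t, q1 = 3t, q3 = 1 - 4t.
    q = [3 * t, t, 1 - 4 * t]
    return sum(qi * log(3 * qi) for qi in q)  # KL(q || (1/3, 1/3, 1/3))

t = minimize_scalar(kl_to_uniform, bounds=(1e-9, 0.25 - 1e-9), method="bounded").x

print(f"maxent  P(Red) = {4 * t:.4f}")  # ~0.6370
print(f"uniform P(Red) = {2 / 3:.4f}")  # 0.6667
# The conditioning event (Red) loses probability relative to the uniform prior,
# even though nothing was learned about Red vs. Blue directly.
```

    The conditioning event drops from 2/3 to roughly 0.637, which is the effect the entry calls apparently puzzling.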
  3. Learning and Pooling, Pooling and Learning. Rush T. Stewart & Ignacio Ojea Quintana - 2018 - Erkenntnis 83 (3): 1-21.
    We explore which types of probabilistic updating commute with convex IP pooling. Positive results are stated for Bayesian conditionalization, imaging, and a certain parameterization of Jeffrey conditioning. This last observation is obtained with the help of a slight generalization of a characterization of externally Bayesian pooling operators due to Wagner (2009: 336-345). These results strengthen the case that pooling should go by imprecise probabilities, since no precise pooling method is as versatile.
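    As a hedged illustration of the commutativity at issue (the priors, evidence, and numbers below are mine, not the authors'): Bayesian conditionalization is a fractional-linear map, so it carries the convex hull of two priors onto the convex hull of their conditioned versions, and pooling-then-learning yields the same set as learning-then-pooling:

```python
# Sketch (mine, not the authors') of why Bayesian conditionalization commutes with
# convex IP pooling: conditionalization is fractional-linear, so it maps the segment
# between two priors onto the segment between their conditioned versions.
import numpy as np

p1 = np.array([0.5, 0.3, 0.2])  # two precise priors over three states
p2 = np.array([0.2, 0.2, 0.6])
E = np.array([1.0, 1.0, 0.0])   # evidence: the event {state 0, state 1}

def conditionalize(p, E):
    return p * E / (p * E).sum()

c1, c2 = conditionalize(p1, E), conditionalize(p2, E)

for t in np.linspace(0.0, 1.0, 11):
    # Pool first (take a convex mixture), then learn E:
    posterior = conditionalize(t * p1 + (1 - t) * p2, E)
    # Learn first, then pool: the same posterior is a mixture of c1 and c2,
    # with weight s = t*p1(E) / (t*p1(E) + (1-t)*p2(E)).
    s = t * (p1 * E).sum() / (t * (p1 * E).sum() + (1 - t) * (p2 * E).sum())
    assert np.allclose(posterior, s * c1 + (1 - s) * c2)

print("pool-then-learn coincides with learn-then-pool (as sets)")
```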
  4. Speed-Optimal Induction and Dynamic Coherence. Michael Nielsen & Eric Wofsey - 2022 - British Journal for the Philosophy of Science 73 (2): 439-455.
    A standard way to challenge convergence-based accounts of inductive success is to claim that they are too weak to constrain inductive inferences in the short run. We respond to such a challenge by answering some questions raised by Juhl (1994). When it comes to predicting limiting relative frequencies in the framework of Reichenbach, we show that speed-optimal convergence—a long-run success condition—induces dynamic coherence in the short run.
  5. Learning as Hypothesis Testing: Learning Conditional and Probabilistic Information. Jonathan Vandenburgh - manuscript.
    Complex constraints like conditionals ('If A, then B') and probabilistic constraints ('The probability that A is p') pose problems for Bayesian theories of learning. Since these propositions do not express constraints on outcomes, agents cannot simply conditionalize on the new information. Furthermore, a natural extension of conditionalization, relative information minimization, leads to many counterintuitive predictions, evidenced by the sundowners problem and the Judy Benjamin problem. Building on the notion of a 'paradigm shift' and empirical research in psychology and economics, I (...)
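    For context on what relative information minimization does in the benign case (a sketch of my own, not the manuscript's code): for a simple probabilistic constraint P(A) = p, minimizing relative information (KL divergence) from the prior reproduces Jeffrey conditioning:

```python
# Sketch (mine, not the manuscript's) of relative information minimization for a
# simple probabilistic constraint P(A) = p: the minimizer is the Jeffrey update,
# which rescales the prior separately inside and outside A.
import numpy as np
from scipy.optimize import minimize

prior = np.array([0.4, 0.1, 0.3, 0.2])
A = np.array([True, True, False, False])  # the constrained event
p = 0.7                                   # learned: P(A) = 0.7

# Jeffrey conditioning in closed form.
jeffrey = np.where(A, prior * p / prior[A].sum(),
                      prior * (1 - p) / prior[~A].sum())

# Numeric check: minimize KL(q || prior) subject to q summing to 1 and q(A) = p.
kl = lambda q: np.sum(q * np.log(q / prior))
cons = [{"type": "eq", "fun": lambda q: q.sum() - 1.0},
        {"type": "eq", "fun": lambda q: q[A].sum() - p}]
res = minimize(kl, x0=prior, bounds=[(1e-9, 1.0)] * len(prior), constraints=cons)

assert np.allclose(res.x, jeffrey, atol=1e-5)
print(jeffrey)  # [0.56 0.14 0.18 0.12]
```

    The counterintuitive predictions the abstract mentions arise for conditional constraints like the Judy Benjamin case, where no such clean equivalence holds.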