  • Common sense and maximum entropy.Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete and (...)
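The maximum entropy inference process discussed in this entry can be illustrated with a small numerical sketch (the classic Brandeis dice setup, an assumption for illustration, not an example from the paper): given only a constraint on the expected value of a die roll, the maxent distribution takes a Gibbs (exponential-family) form, and the Lagrange multiplier can be found by bisection since the mean is monotone in it. The function name and tolerances are illustrative choices.

```python
import math

def maxent_dice(target_mean, faces=6, tol=1e-10):
    """Maximum entropy distribution over die faces 1..faces subject to a
    fixed mean. The solution has Gibbs form p_i proportional to
    exp(lam * i); lam is found by bisection (mean is increasing in lam)."""
    def mean(lam):
        w = [math.exp(lam * i) for i in range(1, faces + 1)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, faces + 1), w)) / z

    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in range(1, faces + 1)]
    z = sum(w)
    return [wi / z for wi in w]

# with a mean constraint of 4.5 the maxent distribution tilts toward high faces;
# with 3.5 (no effective constraint) it reduces to the uniform distribution
p = maxent_dice(4.5)
```

With no constraint beyond normalization the same construction recovers the uniform distribution, which is how maxent extends the principle of insufficient reason.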
  • Uncertainty, Rationality, and Agency.Wiebe van der Hoek - 2006 - Dordrecht, Netherland: Springer.
    This volume concerns Rational Agents - humans, players in a game, software or institutions - which must decide the proper next action in an atmosphere of partial information and uncertainty. The book collects formal accounts of Uncertainty, Rationality and Agency, and also of their interaction. It will benefit researchers in artificial systems which must gather information, reason about it and then make a rational decision on which action to take.
  • Statistical Learning Model of the Sense of Agency.Shiro Yano, Yoshikatsu Hayashi, Yuki Murata, Hiroshi Imamizu, Takaki Maeda & Toshiyuki Kondo - 2020 - Frontiers in Psychology 11.
    A sense of agency (SoA) is the experience of subjective awareness regarding the control of one’s actions. Humans have a natural tendency to generate prediction models of the environment and adapt their models according to changes in the environment. The SoA is associated with the degree of the adaptation of the prediction models, e.g., insufficient adaptation causes low predictability and lowers the SoA over the environment. Thus, identifying the mechanisms behind the adaptation process of a prediction model related to the (...)
  • Conditional Ranking Revision: Iterated Revision with Sets of Conditionals.Emil Weydert - 2012 - Journal of Philosophical Logic 41 (1):237-271.
    In the context of a general framework for belief dynamics which interprets revision as doxastic constraint satisfaction, we discuss a proposal for revising quasi-probabilistic belief measures with finite sets of graded conditionals. The belief states are ranking measures with divisible values (generalizing Spohn’s epistemology), and the conditionals are interpreted as ranking constraints. The approach is inspired by the minimal information paradigm and based on the principle-guided canonical construction of a ranking model of the input conditionals. This is achieved by extending (...)
  • Integrating inconsistent data in a probabilistic model.Jiří Vomlel - 2004 - Journal of Applied Non-Classical Logics 14 (3):367-386.
    In this paper we discuss the process of building a joint probability distribution from an input set of low-dimensional probability distributions. Since the solution of the problem for a consistent input set of probability distributions is known we concentrate on a setup where the input probability distributions are inconsistent. In this case the iterative proportional fitting procedure, which converges in the consistent case, tends to come to cycles. We propose a new algorithm that converges even in inconsistent case. The important (...)
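The iterative proportional fitting procedure mentioned in this abstract can be sketched for the consistent case (a minimal illustration; the paper's own contribution concerns the inconsistent case, which this sketch does not cover): alternately rescale the rows and columns of a joint table until its marginals match the given low-dimensional targets. The function name and example table are assumptions.

```python
def ipf(joint, row_marg, col_marg, iters=200):
    """Iterative proportional fitting: alternately rescale the rows and
    columns of a joint probability table so its marginals match the
    targets. Converges when the target marginals are consistent."""
    m = [row[:] for row in joint]
    for _ in range(iters):
        for i, t in enumerate(row_marg):        # fit row marginals
            s = sum(m[i])
            m[i] = [x * t / s for x in m[i]]
        for j, t in enumerate(col_marg):        # fit column marginals
            s = sum(row[j] for row in m)
            for row in m:
                row[j] *= t / s
    return m

# starting from a uniform table, IPF recovers the independent product
fitted = ipf([[0.25, 0.25], [0.25, 0.25]], [0.3, 0.7], [0.6, 0.4])
```

When the input marginals cannot be satisfied simultaneously, this plain alternation tends to cycle rather than converge, which is the failure mode the paper's algorithm is designed to fix.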
  • Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem.Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized on a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem that can help to (...)
  • Can the maximum entropy principle be explained as a consistency requirement?Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with (...)
  • On probabilistic inference in relational conditional logics.M. Thimm & G. Kern-Isberner - 2012 - Logic Journal of the IGPL 20 (5):872-908.
  • Updating, supposing, and maxent.Brian Skyrms - 1987 - Theory and Decision 22 (3):225-246.
  • Maximum entropy inference as a special case of conditionalization.Brian Skyrms - 1985 - Synthese 63 (1):55 - 74.
  • A new resolution of the Judy Benjamin Problem.Igor Douven & Jan-Willem Romeijn - 2011 - Mind 120 (479):637 - 670.
    A paper on how to adapt your probabilistic beliefs when learning a conditional.
  • Features of the Expert-System-Shell SPIRIT.Wilhelm Rödder, Elmar Reucher & Friedhelm Kulmann - 2006 - Logic Journal of the IGPL 14 (3):485-500.
    The inference process in a probabilistic and conditional environment under minimum relative entropy, permits the acquisition of basic knowledge, the consideration of - even uncertain - ad hoc knowledge, and the response to queries. Even if these procedures are well known in the relevant literature their realisation for large-scale applications needs a sophisticated tool, allowing the communication with the user as well as all relevant logical transformations and numerical calculations. SPIRIT is an Expert-System-Shell for these purposes. Even for hundreds of (...)
  • Probabilistic Belief Contraction.Raghav Ramachandran, Arthur Ramer & Abhaya C. Nayak - 2012 - Minds and Machines 22 (4):325-351.
    Probabilistic belief contraction has been a much neglected topic in the field of probabilistic reasoning. This is due to the difficulty in establishing a reasonable reversal of the effect of Bayesian conditionalization on a probabilistic distribution. We show that indifferent contraction, a solution proposed by Ramer to this problem through a judicious use of the principle of maximum entropy, is a probabilistic version of a full meet contraction. We then propose variations of indifferent contraction, using both the Shannon entropy measure (...)
  • The principle of maximum entropy and a problem in probability kinematics.Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of maxent. This article shows that an intuitive approach to Judy Benjamin’s case supports maxent. This (...)
  • Formal Epistemology Meets Mechanism Design.Jürgen Landes - 2023 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 54 (2):215-231.
    This article connects recent work in formal epistemology to work in economics and computer science. Analysing the Dutch Book Arguments, Epistemic Utility Theory and Objective Bayesian Epistemology we discover that formal epistemologists employ the same argument structure as economists and computer scientists. Since similar approaches often have similar problems and have shared solutions, opportunities for cross-fertilisation abound.
  • Two information measures for inconsistent sets.Kevin M. Knight - 2003 - Journal of Logic, Language and Information 12 (2):227-248.
    I present two measures of information for both consistentand inconsistent sets of sentences in a finite language ofpropositional logic. The measures of information are based onmeasures of inconsistency developed in Knight (2002).Relative information measures are then provided corresponding to thetwo information measures.
  • Causal versions of maximum entropy and principle of insufficient reason.Dominik Janzing - 2021 - Journal of Causal Inference 9 (1):285-301.
    The principle of insufficient reason (PIR) assigns equal probabilities to each alternative of a random experiment whenever there is no reason to prefer one over the other. The maximum entropy principle generalizes PIR to the case where statistical information, such as expectations, is given. It is known that both principles result in paradoxical probability updates for joint distributions of cause and effect. This is because constraints on the conditional P(effect | cause) result in changes of P(cause) that assign higher probability to those (...)
  • A Logic For Inductive Probabilistic Reasoning.Manfred Jaeger - 2005 - Synthese 144 (2):181-248.
    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from “70% of As are Bs” and “a is an A” infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey’s rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system (...)
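The direct-inference pattern in this abstract, and its generalization by Jeffrey's rule, can be put in a short numerical sketch (an illustrative fragment; the function name is an assumption): Jeffrey's rule computes P_new(H) as the sum over a partition of P(H | E_i) weighted by the shifted probabilities P_new(E_i), and direct inference is the degenerate case where one cell of the partition receives probability 1.

```python
def jeffrey_update(p_h_given_e, new_p_e):
    """Jeffrey's rule: P_new(H) = sum_i P(H | E_i) * P_new(E_i),
    for a partition E_1..E_n whose probabilities shift to new_p_e."""
    return sum(p * q for p, q in zip(p_h_given_e, new_p_e))

# direct inference: "70% of As are Bs" and "a is an A" yield P(B) = 0.7,
# i.e. the partition {A, not-A} collapses to certainty about A
p_direct = jeffrey_update([0.7, 0.2], [1.0, 0.0])

# a proper Jeffrey shift: evidence moves P(A) to 0.8 rather than to 1
p_soft = jeffrey_update([0.7, 0.2], [0.8, 0.2])
```

Cross-entropy minimization generalizes this further, which is the step the paper's logic is built to formalize.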
  • The Semantics Latent in Shannon Information.Alistair M. C. Isaac - 2019 - British Journal for the Philosophy of Science 70 (1):103-125.
    The lore is that standard information theory provides an analysis of information quantity, but not of information content. I argue this lore is incorrect, and there is an adequate informational semantics latent in standard theory. The roots of this notion of content can be traced to the secret parallel development of an information theory equivalent to Shannon’s by Turing at Bletchley Park, and it has been suggested independently in recent work by Skyrms and Bullinaria and Levy. This paper explicitly articulates (...)
  • Bayesian conditionalization and probability kinematics.Colin Howson & Allan Franklin - 1994 - British Journal for the Philosophy of Science 45 (2):451-466.
  • Representing preorders with injective monotones.Pedro Hack, Daniel A. Braun & Sebastian Gottwald - 2022 - Theory and Decision 93 (4):663-690.
    We introduce a new class of real-valued monotones in preordered spaces, injective monotones. We show that the class of preorders for which they exist lies in between the class of preorders with strict monotones and preorders with countable multi-utilities, improving upon the known classification of preordered spaces through real-valued monotones. We extend several well-known results for strict monotones (Richter–Peleg functions) to injective monotones, we provide a construction of injective monotones from countable multi-utilities, and relate injective monotones to classic results concerning (...)
  • Information Dynamics.Amos Golan - 2014 - Minds and Machines 24 (1):19-36.
    Though we have access to a wealth of information, the main issue is always how to process the available information. How to make sense of all we observe and know. Just like the English alphabet: we know there are 26 letters but unless we put these letters together in a meaningful way, they convey no information. There are infinitely many ways of putting these letters together. Only a small number of those make sense. Only some of those convey exactly what (...)
  • Deceptive updating and minimal information methods.Haim Gaifman & Anubav Vasudevan - 2012 - Synthese 187 (1):147-178.
    The technique of minimizing information (infomin) has been commonly employed as a general method for both choosing and updating a subjective probability function. We argue that, in a wide class of cases, the use of infomin methods fails to cohere with our standard conception of rational degrees of belief. We introduce the notion of a deceptive updating method and argue that non-deceptiveness is a necessary condition for rational coherence. Infomin has been criticized on the grounds that there are no higher (...)
  • A priori probability and localized observers.Matthew J. Donald - 1992 - Foundations of Physics 22 (9):1111-1172.
    A physical and mathematical framework for the analysis of probabilities in quantum theory is proposed and developed. One purpose is to surmount the problem, crucial to any reconciliation between quantum theory and space-time physics, of requiring instantaneous “wave-packet collapse” across the entire universe. The physical starting point is the idea of an observer as an entity, localized in space-time, for whom any physical system can be described at any moment, by a set of (not necessarily pure) quantum states compatible with (...)
  • Probability kinematics, conditionals, and entropy principles.Zoltan Domotor - 1985 - Synthese 63 (1):75 - 114.
  • Towards an Informational Pragmatic Realism.Ariel Caticha - 2014 - Minds and Machines 24 (1):37-70.
    I discuss the design of the method of entropic inference as a general framework for reasoning under conditions of uncertainty. The main contribution of this discussion is to emphasize the pragmatic elements in the derivation. More specifically: (1) Probability theory is designed as the uniquely natural tool for representing states of incomplete information. (2) An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. (3) The method of updating from a (...)
  • On Quantum-Classical Transition of a Single Particle.Agung Budiyono - 2010 - Foundations of Physics 40 (8):1117-1133.
    We discuss the issue of quantum-classical transition in a system of a single particle with and without external potential. This is done by elaborating the notion of self-trapped wave function recently developed by the author. For a free particle, we show that there is a subset of self-trapped wave functions which is particle-like. Namely, the spatially localized wave packet is moving uniformly with undistorted shape as if the whole wave packet is indeed a classical free particle. The length of the (...)
  • Explaining default intuitions using maximum entropy.Rachel A. Bourne - 2003 - Journal of Applied Logic 1 (3-4):255-271.
  • Varieties of Bayesianism.Jonathan Weisberg - 2011
    Handbook of the History of Logic, vol. 10, eds. Dov Gabbay, Stephan Hartmann, and John Woods, forthcoming.