Results for 'logical entropy'

972 found
  1. Carnap on Entropy.Abner Shimony - 1975 - In Jaakko Hintikka (ed.), Rudolf Carnap, Logical Empiricist: Materials and Perspectives. D. Reidel Pub. Co. pp. 381.
     
  2. Logical Entropy: Introduction to Classical and Quantum Logical Information theory.David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The (...)
    4 citations
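    The comparison drawn in this abstract can be made concrete in a few lines. A minimal sketch, assuming only the standard formulas h(p) = 1 - Σ pᵢ² (logical entropy) and H(p) = -Σ pᵢ log₂ pᵢ (Shannon entropy); the example distribution is invented:

```python
import math

def logical_entropy(p):
    """Logical entropy: the probability that two independent draws
    from p fall in different outcomes, h(p) = 1 - sum(p_i^2)."""
    return 1.0 - sum(pi ** 2 for pi in p)

def shannon_entropy(p):
    """Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i))."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]      # hypothetical distribution
print(logical_entropy(p))  # 0.625
print(shannon_entropy(p))  # 1.5
```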
  3. An introduction to logical entropy and its relation to Shannon entropy.David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based (...)
    5 citations
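    The "normalized counting measure of the set of distinctions" has a direct computational reading; a sketch with an invented partition, using the formula h(π) = |dit(π)|/n² described in the abstract:

```python
from itertools import product

def dit_count(blocks):
    """Count the distinctions (dits) of a partition: ordered pairs
    of elements lying in distinct blocks."""
    block_of = {x: i for i, block in enumerate(blocks) for x in block}
    universe = list(block_of)
    return sum(1 for u, v in product(universe, repeat=2)
               if block_of[u] != block_of[v])

def logical_entropy(blocks):
    """h(pi) = |dit(pi)| / n^2 for a partition of an n-element set."""
    n = sum(len(block) for block in blocks)
    return dit_count(blocks) / n ** 2

print(logical_entropy([{1, 2}, {3, 4}]))  # 8 dits / 16 pairs = 0.5
```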
  4. On Classical and Quantum Logical Entropy.David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements (...)
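    On the quantum side, the counterpart of 1 - Σ pᵢ² is obtained from a density matrix; a minimal sketch, assuming the usual identification of quantum logical entropy with 1 - tr(ρ²) (the two test states are invented):

```python
import numpy as np

def quantum_logical_entropy(rho):
    """h(rho) = 1 - tr(rho^2): the density-matrix analogue of
    1 - sum(p_i^2); it vanishes exactly on pure states."""
    return 1.0 - float(np.trace(rho @ rho).real)

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state |0><0|
mixed = np.eye(2) / 2.0                     # maximally mixed qubit
print(quantum_logical_entropy(pure))        # 0.0
print(quantum_logical_entropy(mixed))       # 0.5
```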
  5. New Foundations for Information Theory: Logical Entropy and Shannon Entropy.David Ellerman - 2021 - Springer Verlag.
    This monograph offers a new foundation for information theory based on the notion of information-as-distinctions, directly measured by logical entropy, and on its re-quantification as Shannon entropy, the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable, so they represent the pre-probability notion of information. (...)
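    The "re-quantification" mentioned here is, in outline, a term-by-term substitution (a sketch of the published idea, sometimes called the dit-bit transform):

```latex
% Logical entropy averages the distinction probabilities (1 - p_i);
% replacing each (1 - p_i) by the bit-count log(1/p_i) in the same
% average yields Shannon entropy:
h(p) = \sum_i p_i (1 - p_i)
  \quad\longmapsto\quad
H(p) = \sum_i p_i \log \frac{1}{p_i}
```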
  6. Paraconsistent conjectural deduction based on logical entropy measures I: C-systems as non-standard inference framework.Paola Forcheri & Paolo Gentilini - 2005 - Journal of Applied Non-Classical Logics 15 (3):285-319.
    A conjectural inference is proposed, aimed at producing conjectural theorems from formal conjectures assumed as axioms, as well as admitting contradictory statements as conjectural theorems. To this end, we employ Paraconsistent Informational Logic, which provides a formal setting where the notion of conjecture formulated by an epistemic agent can be defined. The paraconsistent systems on which conjectural deduction is based are sequent formulations of the C-systems presented in Carnielli-Marcos [CAR 02b]. Thus, conjectural deduction may also be considered to be a (...)
    1 citation
  7. Logic and entropy.Orly R. Shenker - unknown
    A remarkable thesis prevails in the physics of information, saying that the logical properties of operations that are carried out by computers determine their physical properties. More specifically, it says that logically irreversible operations are dissipative by k log 2 per bit of lost information. (A function is logically irreversible if its input cannot be recovered from its output. An operation is dissipative if it turns useful forms of energy into useless ones, such as heat energy.) This is Landauer's dissipation thesis, (...)
    14 citations
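    The quantity named in the abstract is simple to evaluate; a sketch, assuming the usual statement of the bound (k is Boltzmann's constant; the temperature is an arbitrary example):

```python
import math

K_BOLTZMANN = 1.380649e-23  # J/K

def landauer_min_heat(bits, temperature):
    """Landauer bound: erasing `bits` of information at temperature T
    dissipates at least bits * k * T * ln(2) of heat; the entropy
    increase itself is k * log(2) per erased bit."""
    return bits * K_BOLTZMANN * temperature * math.log(2)

print(landauer_min_heat(1, 300.0))  # ~2.87e-21 J per bit at 300 K
```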
  8. Conditional logic and the Principle of Entropy.Wilhelm Rödder - 2000 - Artificial Intelligence 117 (1):83-106.
  9. Information, entropy and inductive logic.S. Pakswer - 1954 - Philosophy of Science 21 (3):254-259.
    It has been shown by several authors that in operations involving information a quantity appears which is the negative of the quantity usually defined as entropy in similar situations. This quantity ℜ = − KI has been termed “negentropy” and it has been shown that the negentropy of information and the physical entropy S are mirrorlike representations of the same train of events. In physical terminology the energy is degraded by an increase in entropy due to an (...)
  10. Entropy in operational statistics and quantum logic.Carl A. Hein - 1979 - Foundations of Physics 9 (9-10):751-786.
    In a series of recent papers, Randall and Foulis have developed a generalized theory of probability (operational statistics) which is based on the notion of a physical operation. They have shown that the quantum logic description of quantum mechanics can be naturally imbedded into this generalized theory of probability. In this paper we shall investigate the role of entropy (in the sense of Shannon's theory of information) in operational statistics. We shall find that there are several related entropy (...)
    1 citation
  11. Maximum Entropy Applied to Inductive Logic and Reasoning.Jürgen Landes & Jon Williamson (eds.) - 2015 - Ludwig-Maximilians-Universität München.
    This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.
    1 citation
  12. Determining Maximal Entropy Functions for Objective Bayesian Inductive Logic.Juergen Landes, Soroush Rafiee Rad & Jon Williamson - 2022 - Journal of Philosophical Logic 52 (2):555-608.
    According to the objective Bayesian approach to inductive logic, premisses inductively entail a conclusion just when every probability function with maximal entropy, from all those that satisfy the premisses, satisfies the conclusion. When premisses and conclusion are constraints on probabilities of sentences of a first-order predicate language, however, it is by no means obvious how to determine these maximal entropy functions. This paper makes progress on the problem in the following ways. Firstly, we introduce the concept of a (...)
    1 citation
  13. Combining probabilistic logic programming with the power of maximum entropy.Gabriele Kern-Isberner & Thomas Lukasiewicz - 2004 - Artificial Intelligence 157 (1-2):139-202.
  14. Probability Sequent Calculi and Entropy Based Nonclassical Logics Classification.Marija Boričić - 2019 - Bulletin of Symbolic Logic 25 (4):446-447.
  15. Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.
    Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the (...)
    21 citations
  16. First-order probabilistic conditional logic and maximum entropy.J. Fisseler - 2012 - Logic Journal of the IGPL 20 (5):796-830.
  17. Maximum Entropy Inference with Quantified Knowledge.Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
    9 citations
  18. Entropy of formulas.Vera Koponen - 2009 - Archive for Mathematical Logic 48 (6):515-522.
    A probability distribution can be given to the set of isomorphism classes of models with universe {1, ..., n} of a sentence in first-order logic. We study the entropy of this distribution and derive a result from the 0–1 law for first-order sentences.
  19. Logical information theory: new logical foundations for information theory.David Ellerman - 2017 - Logic Journal of the IGPL 25 (5):806-835.
    There is a new theory of information based on logic. The definition of Shannon entropy, as well as the notions of joint, conditional, and mutual entropy as defined by Shannon, can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the probability (...)
    5 citations
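    Because logical entropy is the value of a probability measure, the compound notions mentioned here satisfy inclusion-exclusion; a sketch with an invented joint distribution:

```python
import numpy as np

def h(p):
    """Logical entropy of a (marginal or joint) distribution."""
    return 1.0 - float(np.sum(np.asarray(p) ** 2))

joint = np.array([[0.3, 0.2],   # hypothetical joint distribution p(x, y)
                  [0.1, 0.4]])

h_x = h(joint.sum(axis=1))     # marginal logical entropy of X
h_y = h(joint.sum(axis=0))     # marginal logical entropy of Y
h_xy = h(joint)                # joint logical entropy
mutual = h_x + h_y - h_xy      # logical mutual information, by inclusion-exclusion
print(h_x, h_y, h_xy, mutual)  # 0.5 0.48 0.7 0.28
```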
  20. Entropy of knowledge.Hillary Jay Kelley - 1969 - Philosophy of Science 36 (2):178-196.
    Entropy is proposed as a concept which, in its broader scope, can contribute to the study of the General Information System. This paper attempts to identify a few fundamental subconcepts and lemmas which will serve to facilitate further study of system order. The paper discusses: partitioning order into logical and arbitrary kinds; the relationship of order to pattern; and suggested approaches to evaluating and improving the General Information System.
    1 citation
  21. The Entropy-Limit (Conjecture) for Σ₂-Premisses.Jürgen Landes - 2020 - Studia Logica 109 (2):423-442.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same probabilities. While (...)
    2 citations
  22. Characterizing the principle of minimum cross-entropy within a conditional-logical framework.Gabriele Kern-Isberner - 1998 - Artificial Intelligence 98 (1-2):169-208.
  23. Entropy, prediction and the cultural ecosystem of human cognition.Pablo Fernandez Velasco - 2023 - Synthese 201 (3):1-18.
    Major proponents of both Distributed Cognition and Predictive Processing have argued that the two theoretical frameworks are strongly compatible. An important conjecture supporting the union of the two frameworks is that cultural practices tend to reduce entropy (that is, to increase predictability) at all scales in a cultural cognitive ecosystem. This conjecture connects Distributed Cognition with Predictive Processing because it shows how cultural practices facilitate prediction. The present contribution introduces the following challenge to the union of Distributed Cognition and (...)
  24. Analysis of the maximum entropy principle “debate”.John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
    Jaynes's maximum entropy principle (MEP) is analyzed by considering in detail a recent controversy. Emphasis is placed on the inductive logical interpretation of “probability” and the concept of “total knowledge.” The relation of the MEP to relative frequencies is discussed, and a possible realm of its fruitful application is noted.
    4 citations
  25. Unitarity as Preservation of Entropy and Entanglement in Quantum Systems.Florian Hulpke, Uffe V. Poulsen, Anna Sanpera, Aditi Sen, Ujjwal Sen & Maciej Lewenstein - 2006 - Foundations of Physics 36 (4):477-499.
    The logical structure of Quantum Mechanics (QM) and its relation to other fundamental principles of Nature has been for decades a subject of intensive research. In particular, the question of whether the dynamical axiom of QM can be derived from other principles has often been considered. In this contribution, we show that unitary evolutions arise as a consequence of demanding preservation of entropy in the evolution of a single pure quantum system, and preservation of entanglement in the evolution of (...)
    3 citations
  26. How to exploit parametric uniformity for maximum entropy reasoning in a relational probabilistic logic.Marc Finthammer & Christoph Beierle - 2012 - In Luis Farinas del Cerro, Andreas Herzig & Jerome Mengin (eds.), Logics in Artificial Intelligence. Springer. pp. 189-201.
  27. Towards the entropy-limit conjecture.Jürgen Landes, Soroush Rafiee Rad & Jon Williamson - 2020 - Annals of Pure and Applied Logic 172 (2):102870.
    The maximum entropy principle is widely used to determine non-committal probabilities on a finite domain, subject to a set of constraints, but its application to continuous domains is notoriously problematic. This paper concerns an intermediate case, where the domain is a first-order predicate language. Two strategies have been put forward for applying the maximum entropy principle on such a domain: applying it to finite sublanguages and taking the pointwise limit of the resulting probabilities as the size n of (...)
    3 citations
  28. Common sense and maximum entropy.Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this complete (...)
    20 citations
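    A toy numerical rendering of the Maximum Entropy Inference Process on a finite domain (the three-world domain and the single premiss are invented for illustration; this is a sketch, not the paper's formal apparatus):

```python
import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    """Negative Shannon entropy; minimizing it maximizes entropy."""
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

constraints = [
    {"type": "eq",   "fun": lambda p: np.sum(p) - 1.0},    # normalization
    {"type": "ineq", "fun": lambda p: p[0] + p[1] - 0.8},  # premiss: P(w1 or w2) >= 0.8
]
result = minimize(neg_entropy, x0=np.full(3, 1 / 3),
                  bounds=[(0.0, 1.0)] * 3, constraints=constraints)
print(result.x)  # ~[0.4, 0.4, 0.2]: as close to uniform as the premiss allows
```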
  29. A New Logic, a New Information Measure, and a New Information-Based Approach to Interpreting Quantum Mechanics.David Ellerman - 2024 - Entropy 26 (2) (Special Issue: Information-Theoretic Concepts in Physics).
    The new logic of partitions is dual to the usual Boolean logic of subsets (usually presented only in the special case of the logic of propositions) in the sense that partitions and subsets are category-theoretic duals. The new information measure of logical entropy is the normalized quantitative version of partitions. The new approach to interpreting quantum mechanics (QM) is showing that the mathematics (not the physics) of QM is the linearized Hilbert space version of the mathematics of partitions. (...)
  30. Information Processing and Thermodynamic Entropy.Owen Maroney - unknown
    Are principles of information processing necessary to demonstrate the consistency of statistical mechanics? Does the physical implementation of a computational operation have a fundamental thermodynamic cost, purely by virtue of its logical properties? These two questions lie at the centre of a large body of literature concerned with the Szilard engine (a variant of the Maxwell's demon thought experiment), Landauer's principle (supposed to embody the fundamental principle of the thermodynamics of computation) and possible connections between the two. A variety (...)
    18 citations
  31. The political theology of entropy: A Katechon for the cybernetic age.David Bates - 2020 - History of the Human Sciences 33 (1):109-127.
    The digital revolution invites a reconsideration of the very essence of politics. How can we think about decision, control, and will at a time when technologies of automation are transforming every dimension of human life, from military combat to mental attention, from financial systems to the intimate lives of individuals? This article looks back to a moment in the 20th century when the concept of the political as an independent logic was developed, in a time when the boundaries and operations (...)
    1 citation
  32. Computing the topological entropy of shifts.Christoph Spandl - 2007 - Mathematical Logic Quarterly 53 (4):493-510.
    Different characterizations of classes of shift dynamical systems via labeled digraphs, languages, and sets of forbidden words are investigated. The corresponding naming systems are analyzed according to reducibility and particularly with regard to the computability of the topological entropy relative to the presented naming systems. It turns out that all examined natural representations separate into two equivalence classes and that the topological entropy is not computable in general with respect to the defined natural representations. However, if a specific (...)
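    For the simplest case treated in this literature, a shift of finite type presented by a digraph, the topological entropy is the logarithm of the Perron eigenvalue of the adjacency matrix; a sketch (the golden-mean shift is a standard example):

```python
import numpy as np

def sft_entropy(adjacency):
    """Topological entropy of the shift of finite type presented by a
    digraph: log of the spectral radius of its adjacency matrix."""
    spectral_radius = max(abs(np.linalg.eigvals(adjacency)))
    return float(np.log(spectral_radius))

# Golden-mean shift: binary sequences with no two consecutive 1s.
A = np.array([[1, 1],
              [1, 0]])
print(sft_entropy(A))  # log((1 + sqrt(5)) / 2) ~ 0.4812
```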
  33. Self, Logic, and Figurative Thinking.Harwood Fisher - 2008 - Columbia University Press.
    Introduction: Major terms, their classification, and their relation to the book's objective -- The problem of analogous forms -- Natural logic, categories, and the individual -- Shift to individual categories, dynamics, and a psychological look at identity form versus function -- What is the difference between the logic governing a figure of speech and the logic that is immature or unconscious? -- What are the role and function of the self vis-à-vis consciousness? -- Development in the logic from immature to (...)
  34. The Acceleration of Time, Presentism and Entropy.Bernard Ancori - 2019 - In The Carousel of Time. Hoboken, NJ, USA: Wiley. pp. 187-212.
    This chapter presents our model: that of a network whose time is historically constructed by the events that occur within it (inter-individual communications, intra-individual categorization, forgetting by erasure or reinforcement) and not the other way around: far from being part of a temporal framework that is always there, these events produce time. The historical-sociological register shows that the psychological and historical determinations of the perceived acceleration of time, thus interpreted, have been over-determined by the primacy (...)
  35. Is the maximum entropy production just a heuristic principle? Metaphysics on natural determination.Javier Sánchez-Cañizares - 2023 - Synthese 201 (4):1-15.
    The Maximum Entropy Production Principle (MEPP) stands out as an overarching principle that rules life phenomena in Nature. However, its explanatory power beyond heuristics remains controversial. On the one hand, the MEPP has been successfully applied principally to non-living systems far from thermodynamic equilibrium. On the other hand, the underlying assumptions to lay the MEPP’s theoretical foundations and range of applicability increase the possibilities of conflicting interpretations. More interestingly, from a metaphysical stance, the MEPP’s philosophical status is hotly debated: (...)
  36. Probabilistic stability, AGM revision operators and maximum entropy.Krzysztof Mierzewski - 2020 - Review of Symbolic Logic:1-38.
    Several authors have investigated the question of whether canonical logic-based accounts of belief revision, and especially the theory of AGM revision operators, are compatible with the dynamics of Bayesian conditioning. Here we show that Leitgeb's stability rule for acceptance, which has been offered as a possible solution to the Lottery paradox, allows us to bridge AGM revision and Bayesian update: using the stability rule, we prove that AGM revision operators emerge from Bayesian conditioning by an application of the principle of maximum (...)
    4 citations
  37. There’s Plenty of Boole at the Bottom: A Reversible CA Against Information Entropy.Francesco Berto, Jacopo Tagliabue & Gabriele Rossi - 2016 - Minds and Machines 26 (4):341-357.
    “There’s Plenty of Room at the Bottom”, said the title of Richard Feynman’s 1959 seminal conference at the California Institute of Technology. Fifty years on, nanotechnologies have led computer scientists to pay close attention to the links between physical reality and information processing. Not all the physical requirements of optimal computation are captured by traditional models—one still largely missing is reversibility. The dynamic laws of physics are reversible at microphysical level, distinct initial states of a system leading to distinct final (...)
    1 citation
  38. Information, Genetics and Entropy.Julio Ernesto Rubio Barrios - 2015 - Principia: An International Journal of Epistemology 19 (1):121.
    The consolidation of the informational paradigm in molecular biology research resulted in a system for converting the epistemic object into an operational technological object and a stable epistemic product. However, the acceptance of the informational properties of genetic acids failed to clarify the meaning of the concept of information. “Information” as a property of the genetic molecules remained an informal notion that allows the description of the mechanism of inheritance, but it was not specified in a logic-semantic structure. (...)
  39. Qualitative probabilistic inference under varied entropy levels.Paul D. Thorn & Gerhard Schurz - 2016 - Journal of Applied Logic 19 (2):87-101.
    In previous work, we studied four well known systems of qualitative probabilistic inference, and presented data from computer simulations in an attempt to illustrate the performance of the systems. These simulations evaluated the four systems in terms of their tendency to license inference to accurate and informative conclusions, given incomplete information about a randomly selected probability distribution. In our earlier work, the procedure used in generating the unknown probability distribution (representing the true stochastic state of the world) tended to yield (...)
    4 citations
  40. Bayesian model learning based on predictive entropy.Jukka Corander & Pekka Marttinen - 2006 - Journal of Logic, Language and Information 15 (1-2):5-20.
    The Bayesian paradigm has been widely acknowledged as a coherent approach to learning putative probability model structures from a finite class of candidate models. Bayesian learning is based on measuring the predictive ability of a model in terms of the corresponding marginal data distribution, which equals the expectation of the likelihood with respect to a prior distribution for model parameters. The main controversy related to this learning method stems from the necessity of specifying proper prior distributions for all unknown parameters of (...)
  41. A short note on the logico-conceptual foundations of information theory in partition logic.David Ellerman - 2009 - The Reasoner 3 (7):4-5.
    A new logic of partitions has been developed that is dual to ordinary logic when the latter is interpreted as the logic of subsets of a fixed universe rather than the logic of propositions. For a finite universe, the logic of subsets gave rise to finite probability theory by assigning to each subset its relative size as a probability. The analogous construction for the dual logic of partitions gives rise to a notion of logical entropy that is precisely (...)
  42. The temporal foundation of the principle of maximal entropy.Vasil Penchev - 2020 - Logic and Philosophy of Mathematics eJournal 12 (11):1-3.
    The principle of maximal entropy (further abbreviated as “MaxEnt”) can be founded on the formal mechanism in which future transforms into past by the mediation of present. This allows MaxEnt to be investigated by the theory of quantum information. MaxEnt can be considered as an inductive analog or generalization of “Occam’s razor”. It depends crucially on choice and thus on information, just as all inductive methods of reasoning do. The essence shared by Occam’s razor and MaxEnt is for the (...)
  43. The Inert vs. the Living State of Matter: Extended Criticality, Time Geometry, Anti-Entropy - An Overview.Giuseppe Longo & Maël Montévil - 2012 - Frontiers in Physiology 3:39.
    The physical singularity of life phenomena is analyzed by means of comparison with the driving concepts of theories of the inert. We outline conceptual analogies, transferals of methodologies and theoretical instruments between physics and biology, in addition to indicating significant differences and sometimes logical dualities. In order to make biological phenomenalities intelligible, we introduce theoretical extensions to certain physical theories. In this synthetic paper, we summarize and propose a unified conceptual framework for the main conclusions drawn from work spanning (...)
    4 citations
  44. An Introduction to Partition Logic.David Ellerman - 2014 - Logic Journal of the IGPL 22 (1):94-125.
    Classical logic is usually interpreted as the logic of propositions. But from Boole's original development up to modern categorical logic, there has always been the alternative interpretation of classical logic as the logic of subsets of any given (nonempty) universe set. Partitions on a universe set are dual to subsets of a universe set in the sense of the reverse-the-arrows category-theoretic duality--which is reflected in the duality between quotient objects and subobjects throughout algebra. Hence the idea arises of a dual (...)
    16 citations
  45. Bayesian estimation of Shannon entropy.Lin Yuan & H. K. Kesavan - 1997 - History and Philosophy of Logic 26 (1):139-148.
  46. The logic of the past hypothesis.David Wallace - 2023 - In Barry Loewer, Brad Weslake & Eric B. Winsberg (eds.), The Probability Map of the Universe: Essays on David Albert’s _time and Chance_. Cambridge MA: Harvard University Press. pp. 76-109.
    I attempt to get as clear as possible on the chain of reasoning by which irreversible macrodynamics is derivable from time-reversible microphysics, and in particular to clarify just what kinds of assumptions about the initial state of the universe, and about the nature of the microdynamics, are needed in these derivations. I conclude that while a “Past Hypothesis” about the early Universe does seem necessary to carry out such derivations, that Hypothesis is not correctly understood as a constraint on the (...)
    39 citations
  47. The Logical Consistency of Simultaneous Agnostic Hypothesis Tests.Julio Michael Stern - 2016 - Entropy 8 (256):1-22.
    Simultaneous hypothesis tests can fail to provide results that meet logical requirements. For example, if A and B are two statements such that A implies B, there exist tests that, based on the same data, reject B but not A. Such outcomes are generally inconvenient to statisticians (who want to communicate the results to practitioners in a simple fashion) and non-statisticians (confused by conflicting pieces of information). Based on this inconvenience, one might want to use tests that satisfy logical requirements. However, Izbicki and Esteves show that the only tests that are in accordance with three logical requirements (monotonicity, invertibility and consonance) are trivial tests based on point estimation, which generally lack statistical optimality. As a possible solution to this dilemma, this paper adapts the above logical requirements to agnostic tests, in which one can accept, reject or remain agnostic with respect to a given hypothesis. Each of the logical requirements is characterized in terms of a Bayesian decision theoretic perspective. Contrary to the results obtained for regular hypothesis tests, there exist agnostic tests that satisfy all logical requirements and also perform well statistically. In particular, agnostic tests that fulfill all logical requirements are characterized as region estimator-based tests. Examples of such tests are provided.
    7 citations
  48. A Logic For Inductive Probabilistic Reasoning.Manfred Jaeger - 2005 - Synthese 144 (2):181-248.
    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from “70% of As are Bs” and “a is an A” infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey’s rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous (...)
    2 citations
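    The two inference patterns named in the abstract are short enough to sketch (the numbers are invented; this is the textbook form of Jeffrey's rule, not the paper's full logic):

```python
def jeffrey_update(p_x_given_partition, new_partition_probs):
    """Jeffrey's rule: P_new(X) = sum_i P(X | E_i) * q_i, where evidence
    shifts the probabilities of the partition cells E_i to q_i."""
    return sum(px * qi for px, qi in
               zip(p_x_given_partition, new_partition_probs))

# Direct inference: "70% of As are Bs" and "a is an A" give P(B) = 0.7.
print(jeffrey_update([0.7], [1.0]))            # 0.7
# Evidence shifts P(A) to 0.6; with P(B | not-A) = 0.2 assumed:
print(jeffrey_update([0.7, 0.2], [0.6, 0.4]))  # 0.42 + 0.08 = 0.5
```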
  49. À la suite de L'évolution créatrice : Les deux sources de la morale et de la religion. L'entropie, un principe social ?Brigitte Sitbon-Peillon - 2008 - Archives de Philosophie 2 (2):289-308.
    Should Les deux sources de la morale et de la religion be read as the sequel to L'évolution créatrice, where questions of a moral and religious order are suggested but left there as a "pierre d'attente" (toothing stone)? It is not a theodicy that takes shape in Bergson's last work, and its continuity with L'évolution créatrice must no doubt be sought elsewhere. It could notably be considered starting from the "application" to the social theory of the Deux sources of the second (...)
  50. Explaining default intuitions using maximum entropy.Rachel A. Bourne - 2003 - Journal of Applied Logic 1 (3-4):255-271.
Results 1-50 of 972