Logical information theory: new logical foundations for information theory

Logic Journal of the IGPL 25 (5):806-835 (2017)

Abstract

There is a new theory of information based on logic. The definition of Shannon entropy, as well as the notions of joint, conditional, and mutual entropy as defined by Shannon, can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the probability measure on the sets of distinctions. The compound notions of joint, conditional, and mutual entropies are obtained as the values of the measure, respectively, on the union, difference, and intersection of the sets of distinctions. These compound notions of logical entropy satisfy the usual Venn diagram relationships since they are values of a measure. The uniform transformation into the formulas for Shannon entropy is linear, so it explains the long-noted fact that the Shannon formulas satisfy the Venn diagram relations--as an analogy or mnemonic--since Shannon entropy is not a measure on a given set. What is the logic that gives rise to logical information theory? Partitions are dual to subsets, and the logic of partitions was recently developed in a dual/parallel relationship to the Boolean logic of subsets. Boole developed logical probability theory as the normalized counting measure on subsets. Similarly, the normalized counting measure on partitions is logical entropy--when the partitions are represented as the set of distinctions that is the complement to the equivalence relation for the partition. In this manner, logical information theory provides the set-theoretic and measure-theoretic foundations for information theory. The Shannon theory is then derived by the transformation that replaces the counting of distinctions with the counting of the number of binary partitions it takes, on average, to make the same distinctions by uniquely encoding the distinct elements--which is why the Shannon theory dovetails perfectly with coding and communications theory.
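The compound notions described in the abstract are easy to check numerically. Below is a minimal sketch in Python, assuming a small hypothetical joint distribution; the numbers and helper names are illustrative and do not come from the paper. It computes logical entropies as values of the product measure on sets of distinctions (dit sets), verifies that the conditional and mutual notions satisfy the Venn diagram identities exactly, and shows the term-by-term "dit-bit" substitution 1 - p -> log2(1/p) that carries the logical formula into the Shannon one.

```python
# A minimal sketch, assuming a made-up joint distribution p(x, y);
# none of the numbers or helper names below come from the paper itself.
from math import log2

# Hypothetical joint distribution over X in {0, 1} and Y in {0, 1, 2}; probabilities sum to 1.
p_xy = {
    (0, 0): 0.2, (0, 1): 0.1, (0, 2): 0.1,
    (1, 0): 0.1, (1, 1): 0.3, (1, 2): 0.2,
}

def marginal(p_xy, axis):
    """Marginal distribution over one coordinate of the joint distribution."""
    out = {}
    for key, p in p_xy.items():
        out[key[axis]] = out.get(key[axis], 0.0) + p
    return out

def logical_entropy(p):
    """h(p) = 1 - sum_i p_i^2: probability that two independent draws are distinct."""
    return 1.0 - sum(pi * pi for pi in p.values())

def shannon_entropy(p):
    """H(p) = sum_i p_i * log2(1/p_i): the dit-bit transform of h(p) = sum_i p_i * (1 - p_i),
    replacing the factor (1 - p_i) term by term with log2(1/p_i)."""
    return sum(pi * log2(1.0 / pi) for pi in p.values() if pi > 0)

def conditional_logical_entropy(p_xy):
    """h(X|Y): product measure of dit(X) minus dit(Y), i.e. the probability that two
    independent draws of (X, Y) differ in X but agree on Y."""
    return sum(p1 * p2 for (x1, y1), p1 in p_xy.items()
                       for (x2, y2), p2 in p_xy.items()
                       if x1 != x2 and y1 == y2)

def mutual_logical_entropy(p_xy):
    """m(X,Y): product measure of the intersection of dit(X) and dit(Y), i.e. the
    probability that two independent draws of (X, Y) differ in both coordinates."""
    return sum(p1 * p2 for (x1, y1), p1 in p_xy.items()
                       for (x2, y2), p2 in p_xy.items()
                       if x1 != x2 and y1 != y2)

p_x, p_y = marginal(p_xy, 0), marginal(p_xy, 1)
h_x, h_y, h_xy = logical_entropy(p_x), logical_entropy(p_y), logical_entropy(p_xy)

# Because the compound logical entropies are values of one measure on differences and
# intersections of dit sets, the Venn diagram identities hold exactly:
assert abs(conditional_logical_entropy(p_xy) - (h_xy - h_y)) < 1e-12   # h(X|Y) = h(X,Y) - h(Y)
assert abs(mutual_logical_entropy(p_xy) - (h_x + h_y - h_xy)) < 1e-12  # m(X,Y) = h(X) + h(Y) - h(X,Y)

# The Shannon formulas obtained by the dit-bit transform satisfy the same-shaped identities,
# but only as an analogy/mnemonic, since Shannon entropy is not itself a measure on a set.
H_x, H_y, H_xy = shannon_entropy(p_x), shannon_entropy(p_y), shannon_entropy(p_xy)
print(f"h(X)={h_x:.3f} h(Y)={h_y:.3f} h(X,Y)={h_xy:.3f} m(X,Y)={h_x + h_y - h_xy:.3f}")
print(f"H(X)={H_x:.3f} H(Y)={H_y:.3f} H(X,Y)={H_xy:.3f} I(X;Y)={H_x + H_y - H_xy:.3f}")
```

The two assertions capture the measure-theoretic point of the abstract: for logical entropy the Venn identities are computed directly from the dit sets, whereas the analogous Shannon identities are carried over by the transform rather than by an underlying measure.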

Links

PhilArchive

Similar books and articles

Logical pluralism and semantic information. Patrick Allo - 2007 - Journal of Philosophical Logic 36 (6):659-694.
Do logical truths carry information? Manuel E. Bremer - 2003 - Minds and Machines 13 (4):567-575.
A Generalization of Shannon's Information Theory. Chenguang Lu - 1999 - International Journal of General Systems 28 (6):453-490.
Hard and Soft Logical Information. Patrick Allo - 2017 - Journal of Logic and Computation:1-20.
Information: Does it have to be true? [REVIEW] James H. Fetzer - 2004 - Minds and Machines 14 (2):223-229.
Search for syllogistic structure of semantic information. Marcin J. Schroeder - 2012 - Journal of Applied Non-Classical Logics 22 (1-2):83-103.
The Information Medium. Orlin Vakarelov - 2012 - Philosophy and Technology 25 (1):47-65.


Author's Profile

David Ellerman
University of Ljubljana