Information, entropy and inductive logic

Philosophy of Science 21 (3):254-259 (1954)

Abstract

It has been shown by several authors that in operations involving information a quantity appears which is the negative of the quantity usually defined as entropy in similar situations. This quantity ℜ = −KI has been termed “negentropy,” and it has been shown that the negentropy of information and the physical entropy S are mirror-like representations of the same train of events. In physical terminology, energy is degraded by an increase in entropy due to increased randomness in the positions or velocities of components, wave functions, or complexions in phase space; in informational terminology, some information about the same components has been lost, or the negentropy has been decreased. In equilibrium the system has, for a given energy content, maximum randomness. One consequence of this dual aspect was the idea of applying the methods of statistical mechanics to problems of communication, and Brillouin showed that Fermi-Dirac statistics, or generalized Fermi statistics, are applicable, for example, to the transmission of signals such as telegrams.
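The mirror-like relation ℜ = −KI described above can be illustrated with a small sketch: Shannon information I is computed from the empirical symbol frequencies of a telegram-like message (in natural units), and the negentropy is then obtained by scaling with a constant K and negating. The function names, the choice of Boltzmann's constant for K, and the sample message are illustrative assumptions, not taken from the paper.

```python
import math
from collections import Counter

# Illustrative choice for K: Boltzmann's constant (J/K), which converts
# information in natural units (nats) to the thermodynamic-entropy scale.
K = 1.380649e-23

def information_nats(message: str) -> float:
    """Shannon information per symbol, in nats, from empirical frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def negentropy(message: str) -> float:
    """Negentropy R = -K * I, the informational mirror of physical entropy S."""
    return -K * information_nats(message)

# A telegram-like message (hypothetical example): the more uniform the symbol
# frequencies, the larger I, and the more negative the negentropy R = -K*I.
msg = "ARRIVING TUESDAY STOP"
print(information_nats(msg), negentropy(msg))
```

Note how the duality in the abstract shows up directly: as randomness (here, the spread of symbol frequencies) increases, the information I grows and ℜ = −KI decreases, mirroring the increase of the physical entropy S.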

