Can the maximum entropy principle be explained as a consistency requirement?

Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261 (1995)


The principle of maximum entropy is a general method for assigning values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, extends the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as the unique methods of statistical inference that agree with certain compelling consistency requirements. This paper reviews these consistency arguments and the surrounding controversy. It is shown that the uniqueness proofs are flawed or rest on unreasonably strong assumptions. A more general class of inference rules, maximizing the so-called Rényi entropies, is exhibited which also fulfills the reasonable part of the consistency assumptions.
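To make the abstract's subject concrete, the following is a minimal numerical sketch (not from the paper itself) of maximum-entropy inference in Jaynes's classic dice setting: given only a prescribed mean for a six-sided die, the maximum-entropy distribution has the exponential form p_i ∝ exp(λ·x_i), and λ can be found by bisection. The prescribed mean of 4.5 and the Rényi-entropy helper are illustrative assumptions, not values from the paper.

```python
import math

def maxent_die(mean, xs=tuple(range(1, 7))):
    """Maximum-entropy distribution over die faces with a prescribed mean.

    The maximizer of Shannon entropy under a mean constraint is exponential,
    p_i proportional to exp(lam * x_i); we solve for lam by bisection,
    using the fact that the constrained mean is increasing in lam.
    """
    def mean_of(lam):
        w = [math.exp(lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z

    lo, hi = -10.0, 10.0  # bracket for the Lagrange multiplier
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if mean_of(mid) < mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

def shannon_entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy H_a(p) = log(sum_i p_i^a) / (1 - a).

    As alpha -> 1 this recovers the Shannon entropy, so maximizing a Rényi
    entropy generalizes the maximum-entropy rule discussed in the abstract.
    """
    if abs(alpha - 1.0) < 1e-12:
        return shannon_entropy(p)
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

# With no constraint beyond the trivial mean 3.5, the result is uniform,
# recovering the classical principle of insufficient reason.
p = maxent_die(4.5)
```

Note that for the unconstrained mean 3.5 the bisection drives λ to 0 and returns the uniform distribution, which is how the method reduces to the principle of insufficient reason mentioned above.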





Similar books and articles

Analysis of the maximum entropy principle “debate”.John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
Entropia a modelovanie [Entropy and Modelling].Ján Paulov - 2002 - Organon F: Medzinárodný Časopis Pre Analytickú Filozofiu 9 (2):157-175.
Maximum Entropy Inference with Quantified Knowledge.Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
The constraint rule of the maximum entropy principle.Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
Application of the maximum entropy principle to nonlinear systems far from equilibrium.H. Haken - 1993 - In E. T. Jaynes, Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. pp. 239.
Common sense and maximum entropy.Jeff Paris - 1998 - Synthese 117 (1):75-93.



Author's Profile

Jos Uffink
University of Minnesota

Citations of this work

Compendium of the foundations of classical statistical physics.Jos Uffink - 2005 - In Jeremy Butterfield & John Earman (eds.), Handbook of the Philosophy of Physics. Elsevier.
Generalizing the lottery paradox.Igor Douven & Timothy Williamson - 2006 - British Journal for the Philosophy of Science 57 (4):755-779.
Entropy - A Guide for the Perplexed.Roman Frigg & Charlotte Werndl - 2011 - In Claus Beisbart & Stephan Hartmann (eds.), Probabilities in Physics. Oxford University Press. pp. 115-142.

(3 of 24 citations shown.)

References found in this work

A treatise on probability.John Maynard Keynes - 1921 - Mineola, N.Y.: Dover Publications.
A Mathematical Theory of Communication.Claude Shannon - 1948 - Bell System Technical Journal 27 (3):379-423.
The Well-Posed Problem.Edwin T. Jaynes - 1973 - Foundations of Physics 3 (4):477-493.
Bayesian conditionalisation and the principle of minimum information.P. M. Williams - 1980 - British Journal for the Philosophy of Science 31 (2):131-144.
A problem for relative information minimizers in probability kinematics.Bas C. van Fraassen - 1981 - British Journal for the Philosophy of Science 32 (4):375-379.

(5 of 18 references shown.)