F-Information, a Unitless Variant of Fisher Information

Foundations of Physics 29 (10):1521-1541 (1999)

Abstract

A new information matrix $[F]$ with elements $F_{mn} = \langle (y_m - a_m)(y_n - a_n)\,(\partial \ln p(y \mid a)/\partial a_m)\,(\partial \ln p(y \mid a)/\partial a_n) \rangle$ is analyzed. The PDF $p(y \mid a)$ is the usual likelihood law. $[F]$ differs from the Fisher information matrix by the presence of the first two factors in the given expectation. These factors make $F_{mn}$ unitless, in contrast with the Fisher information. This lack of units allows $F_{mn}$ values from entirely different phenomena to be compared, just as, for example, Shannon information values can be compared. Each element $F_{mn}$ defines an error inequality analogous to the Cramér-Rao inequality. In the scalar case $F_{mn} \equiv F$: for a normal $p(y \mid a)$ law $F = 3$, while for an exponential law $F = 9$. A variational principle $F = \min$ (called FMIN) allows an unknown PDF $p(x)$ to be estimated in the presence of weak information. Under certain conditions $F$ obeys a "Boltzmann F-theorem" $\partial F/\partial t \leqslant 0$, indicating that $F$ is a physical entropy. Finally, the trace $\mathcal{F}$ of $[F]$ may be used as the scalar information quantity in an information-based principle for deriving distribution laws $p$ of physics.
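The quoted scalar values can be checked numerically. Below is a minimal Monte Carlo sketch, assuming one reading consistent with the stated results: $a$ is the mean of the normal law (with known spread $\sigma$) and the mean of the exponential law. The variable names and parameter choices are illustrative, not taken from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2_000_000

# Normal law: y ~ N(a, sigma^2), score d(ln p)/da = (y - a)/sigma^2.
# F = <(y - a)^2 * score^2> = <(y - a)^4>/sigma^4, which should be 3.
a, sigma = 1.0, 2.0
y = rng.normal(a, sigma, N)
score = (y - a) / sigma**2
F_normal = np.mean((y - a)**2 * score**2)

# Exponential law with mean a: p(y|a) = (1/a) exp(-y/a), y >= 0,
# score d(ln p)/da = (y - a)/a^2, so F = <(y - a)^4>/a^4, which should be 9.
a = 1.5
y = rng.exponential(a, N)
score = (y - a) / a**2
F_exp = np.mean((y - a)**2 * score**2)

print(f"F (normal):      {F_normal:.3f}  (expected 3)")
print(f"F (exponential): {F_exp:.3f}  (expected 9)")
```

Note that both estimates are unitless, as claimed: the first two factors carry the units of $y^2$, while the two score factors carry the inverse units, so they cancel regardless of the measurement scale.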
