Information as a measure of variation
In many applications of information theory, information measures the reduction in uncertainty that results from learning that an event has occurred. However, an item of learned information need not be the occurrence of an event; it may instead be a change in the probability distribution associated with an ensemble of events. This paper examines the basic account of information, which focuses on events, and reviews how it can be naturally generalized to probability distributions or measures. The resulting information measure is a special case of the Rényi information divergence (also known as the Rényi relative entropy). This measure, herein dubbed the variational information, meaningfully assigns a numerical bit-value to arbitrary state transitions of physical systems. The information topology of these state transitions is characterized canonically by right and left continuity spectra defined in terms of the Kantorovich-Wasserstein metric. These continuity spectra provide a theoretical framework for characterizing the informational continuity of evolving systems and for rigorously assessing the degree to which such systems exhibit, or fail to exhibit, continuous change.
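For concreteness, the following is a brief sketch of the standard quantities this passage invokes: the information (surprisal) of an event, the order-\(\alpha\) Rényi divergence, and the Kantorovich-Wasserstein metric. These are the textbook definitions; the paper's variational information is stated only to be a special case of the Rényi divergence, so its exact form is not fixed by this section.
\[
I(E) \;=\; -\log_2 p(E),
\qquad
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,\log_2 \sum_i p_i^{\alpha}\, q_i^{1-\alpha}
\quad (\alpha > 0,\ \alpha \neq 1),
\]
\[
W_1(\mu, \nu) \;=\; \inf_{\gamma \in \Gamma(\mu,\nu)} \int d(x, y)\, \mathrm{d}\gamma(x, y),
\]
where \(\Gamma(\mu,\nu)\) denotes the set of couplings of \(\mu\) and \(\nu\). The Rényi divergence recovers the Kullback-Leibler divergence in the limit \(\alpha \to 1\), and reduces to the event-based quantity \(I(E)\) when \(Q\) is conditioned on the occurrence of \(E\).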