On transmitted information as a measure of explanatory power

Philosophy of Science 45 (4):531-562 (1978)

Abstract

This paper contrasts two information-theoretic approaches to statistical explanation: (1) an analysis, originating in my earlier research on testing stochastic models of learning, based on an entropy-like measure of expected transmitted information (here referred to as the Expected-Information, or E-I, Model), and (2) the analysis proposed by James Greeno (closely related to Wesley Salmon's Statistical Relevance Model), based on the information transmitted by a system. The substantial differences between these analyses can be traced to one basic difference. On Greeno's view, the essence of explanation lies in the relevance relations expressed by the conditional probabilities relating the explanans variables to the explanandum variables; on my view, in contrast, it lies in theories viewed as hypothetical structures which deductively entail the conditional probability distributions linking the explanans variables to the explanandum variables. The explanatory power of a stochastic theory is identified with the information (regarding the values of the explanandum variables) that is "absorbed from" the explanans variables, while other information "absorbed from" the explanandum variables themselves (through parameter estimation, for example) reflects the descriptive power of the theory. I prove that Greeno's measure of transmitted information is a limiting special case of the E-I Model, but that the former, unlike the latter, makes no distinction between explanatory power and descriptive power.
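
Since the abstract turns on the contrast between Greeno's information-transmitted-by-a-system and the Expected-Information Model, the sketch below (not drawn from the paper) illustrates the standard Shannon quantity that Greeno's measure rests on: the transmitted information I(S; M) = H(M) - H(M|S) between an explanans partition S and an explanandum partition M, computed for a toy joint distribution. The variable names, function names, and example numbers are illustrative assumptions, not the paper's.

    # Minimal sketch (illustrative only): Shannon transmitted information
    # I(S; M) = H(M) - H(M | S), the quantity underlying Greeno's
    # "information transmitted by a system". S = explanans partition,
    # M = explanandum partition; these labels are assumptions for the example.
    import math

    def entropy(probs):
        """Shannon entropy in bits of a probability vector (zero terms skipped)."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def transmitted_information(joint):
        """Mutual information I(S; M) from a joint table joint[s][m] = p(s, m)."""
        p_s = [sum(row) for row in joint]        # marginal over explanans cells
        p_m = [sum(col) for col in zip(*joint)]  # marginal over explanandum cells
        h_m = entropy(p_m)                       # prior uncertainty H(M)
        # Expected conditional entropy H(M | S), each row weighted by p(s)
        h_m_given_s = sum(
            ps * entropy([pm_s / ps for pm_s in row])
            for ps, row in zip(p_s, joint) if ps > 0
        )
        return h_m - h_m_given_s

    # Toy example: two explanans cells, two explanandum cells
    joint = [[0.4, 0.1],
             [0.1, 0.4]]
    print(round(transmitted_information(joint), 4))  # about 0.2781 bits

On the abstract's account, this quantity is the limiting special case of the E-I Model in which explanatory power is not distinguished from descriptive power.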

References found in this work

The Logic of Scientific Discovery. Karl Popper - 1959 - Studia Logica 9:262-265.
Studies in the logic of explanation. Carl Gustav Hempel & Paul Oppenheim - 1948 - Philosophy of Science 15 (2):135-175.
Statistical explanation. Wesley C. Salmon - 1970 - In Robert Colodny (ed.), The Nature and Function of Scientific Theories. University of Pittsburgh Press. pp. 173-231.
A Mathematical Theory of Communication. Claude Elwood Shannon - 1948 - Bell System Technical Journal 27:379-423.
A logical measure function. John G. Kemeny - 1953 - Journal of Symbolic Logic 18 (4):289-308.
