The use of information theory in epistemology

Philosophy of Science 65 (3):472-501 (1998)

Abstract

Information theory offers a measure of "mutual information" that provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual information places both upper and lower bounds on payoffs. This constitutes a non-trivial relationship to utility.
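As a concrete illustration of the abstract's central quantity: mutual information I(X;Y) measures, in bits, how much an organism's internal state Y reduces uncertainty about a world state X. The sketch below computes Shannon's standard definition from a joint distribution; the "tracker" framing and the example distributions are illustrative assumptions, not taken from the paper itself.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint distribution.

    joint[x][y] is P(X=x, Y=y); rows index world states X,
    columns index the tracker's internal states Y.
    """
    px = [sum(row) for row in joint]          # marginal P(X)
    py = [sum(col) for col in zip(*joint)]    # marginal P(Y)
    mi = 0.0
    for x, row in enumerate(joint):
        for y, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# A perfect tracker: the internal state mirrors the world state exactly.
perfect = [[0.5, 0.0],
           [0.0, 0.5]]
# A useless tracker: the internal state is independent of the world.
useless = [[0.25, 0.25],
           [0.25, 0.25]]

print(mutual_information(perfect))  # 1.0 bit
print(mutual_information(useless))  # 0.0 bits
```

On this reading, "tracking efficiency" is high when the joint distribution concentrates probability on matched world/internal states, and zero under statistical independence, regardless of what the internal states mean or what they pay off.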

Links

PhilArchive




Analytics

Added to PP
2009-01-28


Author's Profile

William F. Harms
Seattle Central Community College

References found in this work

Knowledge and the Flow of Information. Fred I. Dretske - 1981 - Cambridge, MA: MIT Press.
A Mathematical Theory of Communication. Claude E. Shannon - 1948 - Bell System Technical Journal 27:379–423.
Signal, Decision, Action. Peter Godfrey-Smith - 1991 - Journal of Philosophy 88 (12):709–722.
Misinformation. Peter Godfrey-Smith - 1989 - Canadian Journal of Philosophy 19 (4):533–550.
