Theory and Decision 72 (2):273-285 (2012)

Abstract
The expected utility maximization problem is one of the most useful tools in mathematical finance, decision analysis, and economics. Motivated by statistical model selection via the principle of expected utility maximization, Friedman and Sandow (J Mach Learn Res 4:257–291, 2003a) considered the model performance question from the point of view of an investor who evaluates models based on the performance of the optimal strategies that the models suggest. They interpreted their performance measures in information-theoretic terms, providing new generalizations of Shannon entropy and Kullback–Leibler relative entropy, which they called U-entropy and U-relative entropy. In this article, a utility-based criterion for the independence of two random variables is defined. Markov's inequality for probabilities is then extended from the U-entropy viewpoint. Moreover, a lower bound for the U-relative entropy is obtained. Finally, a link between conditional U-entropy and conditional Rényi entropy is derived.
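The abstract notes that U-entropy and U-relative entropy generalize Shannon entropy and Kullback–Leibler relative entropy. In the best-known special case, the logarithmic-utility investor studied by Friedman and Sandow, the U-quantities reduce to the classical ones, and the U-relative entropy coincides with the expected log-growth of a Kelly bettor who knows the true distribution and faces odds set by the model. A minimal numerical sketch of those classical quantities (function names are illustrative, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p || q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Investor reading (the Kelly / horse-race special case): against "fair"
# odds 1/q_i set by a model q, the log-utility investor who believes the
# true distribution p bets fractions b_i = p_i, and the expected log-growth
# of wealth is sum_i p_i log(p_i / q_i) = D(p || q). A better model q
# (closer to p) leaves the investor a smaller edge, which is the
# model-performance interpretation of relative entropy.
truth = [0.5, 0.3, 0.2]
model = [0.4, 0.4, 0.2]
edge = kl_divergence(truth, model)
```

Under log utility this investor's edge is exactly the Kullback–Leibler divergence; for a general utility U, Friedman and Sandow's U-relative entropy plays the same role for the U-optimal strategy.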
Keywords: Expected utility maximization · U-entropy · U-relative entropy · Conditional relative U-entropy · Conditional U-entropy · Mutual U-information · Data processing inequality
DOI 10.1007/s11238-010-9232-5
References found in this work

Theory of Games and Economic Behavior [Review]. E. N. - 1945 - Journal of Philosophy 42 (20):550–554.
A Mathematical Theory of Communication. Claude E. Shannon - 1948 - Bell System Technical Journal 27:379–423.



Similar books and articles

Entropy, Its Language, and Interpretation. Harvey S. Leff - 2007 - Foundations of Physics 37 (12):1744-1766.
Quantum Mutual Entropy Defined by Liftings. Satoshi Iriyama & Masanori Ohya - 2011 - Foundations of Physics 41 (3):406-413.
Time Evolution in Macroscopic Systems. II. The Entropy. W. T. Grandy - 2004 - Foundations of Physics 34 (1):21-57.
How Does the Entropy/Information Bound Work? Jacob D. Bekenstein - 2005 - Foundations of Physics 35 (11):1805-1823.
Choosing a Definition of Entropy That Works. Robert H. Swendsen - 2012 - Foundations of Physics 42 (4):582-593.
Maxwell's Demon and the Entropy Cost of Information. Paul N. Fahn - 1996 - Foundations of Physics 26 (1):71-93.
