Semantic information conveyed by daily language has been researched for many years; yet, we still lack a practical formula to measure the information of a simple sentence or prediction, such as "There will be heavy rain tomorrow". For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh's fuzzy set theory and P. Z. Wang's random set falling shadow theory. It carries forward the thought of C. E. Shannon and K. Popper. The fuzzy set's probability defined by Zadeh is treated as the logical probability sought by Popper, and the membership grade is treated as the truth value of a proposition and also as the posterior logical probability. The classical relative information formula (Information = log(Posterior probability / Prior probability)) is revised into the SIF by replacing the posterior probability with the membership grade and the prior probability with the fuzzy set's probability. The SIF can be explained as "Information = Testing severity − Relative square deviation" and hence can serve as Popper's information criterion for testing scientific theories or propositions. The information measure defined by the SIF also means the saved codeword length, as with the classical information measure. This paper introduces the set-Bayes' formula, which establishes the relationship between statistical probability and logical probability, derives the Fuzzy Information Criterion (FIC) for the optimization of the semantic channel, and discusses applications of the SIF and FIC in areas such as linguistic communication, prediction, estimation, testing, GPS, translation, and fuzzy reasoning. In particular, through a detailed example of reasoning, it is shown that we can improve the semantic channel with proper fuzziness so that the average semantic information increases toward its upper limit: the Shannon mutual information.
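The core quantity described above — information as the log-ratio of the membership grade (posterior logical probability) to the fuzzy set's probability (prior logical probability) — can be sketched numerically. The following is a minimal illustration, not the paper's own implementation; the distributions, the membership function for "heavy rain", and the variable names are assumptions chosen for the example.

```python
import numpy as np

# Hypothetical sketch of the Semantic Information Formula (SIF) as described
# in the abstract. Assumed notation: T(y|x) is the membership grade (truth
# value) of proposition y given outcome x; T(y) is the fuzzy set's
# probability (Zadeh's probability of a fuzzy event), i.e. the prior-weighted
# average of membership grades, here treated as the prior logical probability.

def fuzzy_set_probability(membership, prior):
    """Logical probability T(y) = sum over x of P(x) * T(y|x)."""
    return float(np.dot(prior, membership))

def semantic_information(membership_at_x, logical_prob):
    """SIF: I(x; y) = log2( T(y|x) / T(y) ), in bits."""
    return float(np.log2(membership_at_x / logical_prob))

# Example: x ranges over tomorrow's rainfall levels; y = "heavy rain tomorrow".
prior = np.array([0.50, 0.30, 0.15, 0.05])   # assumed P(x): none, light, heavy, storm
membership = np.array([0.0, 0.1, 0.9, 1.0])  # assumed T(y|x) for "heavy rain"

T_y = fuzzy_set_probability(membership, prior)       # prior logical probability
info_if_heavy = semantic_information(membership[2], T_y)
print(round(T_y, 3), round(info_if_heavy, 3))        # -> 0.215 2.066
```

When the prediction turns out true with high membership grade (here 0.9 against a prior logical probability of 0.215), the semantic information is positive; had the outcome been "no rain" (membership 0), the formula would yield negative (indeed unbounded) information, reflecting Popper's idea that a falsified prediction is penalized.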
Keywords: Shannon; Popper; semantic information; logical probability; factual test; semantic channel; fuzzy truth function