  • A Mathematical Theory of Communication. Claude Elwood Shannon - 1948 - Bell System Technical Journal 27:379–423.
  • Content in Simple Signalling Systems. Nicholas Shea, Peter Godfrey-Smith & Rosa Cao - 2018 - British Journal for the Philosophy of Science 69 (4):1009-1035.
    Our understanding of communication and its evolution has advanced significantly through the study of simple models involving interacting senders and receivers of signals. Many theorists have thought that the resources of mathematical information theory are all that are needed to capture the meaning or content that is being communicated in these systems. However, the way theorists routinely talk about the models implicitly draws on a conception of content that is richer than bare informational content, especially in contexts where false content (...)
  • Measuring consciousness: relating behavioural and neurophysiological approaches. Anil K. Seth, Zoltán Dienes, Axel Cleeremans, Morten Overgaard & Luiz Pessoa - 2008 - Trends in Cognitive Sciences 12 (8):314-321.
  • The Hard Problem Of Content: Solved (Long Ago). Marcin Miłkowski - 2015 - Studies in Logic, Grammar and Rhetoric 41 (1):73-88.
    In this paper, I argue that even if the Hard Problem of Content, as identified by Hutto and Myin, is important, it was already solved in naturalized semantics, and satisfactory solutions to the problem do not rely merely on the notion of information as covariance. I point out that Hutto and Myin have double standards for linguistic and mental representation, which leads to a peculiar inconsistency. Were they to apply the same standards to basic and linguistic minds, they would (...)
  • The use of information theory in epistemology. William F. Harms - 1998 - Philosophy of Science 65 (3):472-501.
    Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual (...)
  • Functional Information: a Graded Taxonomy of Difference Makers. Nir Fresco, Simona Ginsburg & Eva Jablonka - 2020 - Review of Philosophy and Psychology 11 (3):547-567.
    There are many different notions of information in logic, epistemology, psychology, biology and cognitive science, which are employed differently in each discipline, often with little overlap. Since our interest here is in biological processes and organisms, we develop a taxonomy of functional information that extends the standard cue/signal distinction. Three general, main claims are advanced here. This new taxonomy can be useful in describing learning and communication. It avoids some problems that the natural/non-natural information distinction faces. Functional information is produced (...)
  • Knowledge and the flow of information. F. Dretske - 1989 - Trans/Form/Ação 12:133-139.
  • The Intentional Stance. Daniel Clement Dennett - 1987 - MIT Press.
    Through the use of such "folk" concepts as belief, desire, intention, and expectation, Daniel Dennett asserts in this first full scale presentation of...
  • Content and Consciousness. D. C. Dennett - 1969 - Journal of Philosophy 69 (18):604-604.
  • Is coding a relevant metaphor for the brain? Romain Brette - 2019 - Behavioral and Brain Sciences 42:1-44.
    “Neural coding” is a popular metaphor in neuroscience, where objective properties of the world are communicated to the brain in the form of spikes. Here I argue that this metaphor is often inappropriate and misleading. First, when neurons are said to encode experimental parameters, the neural code depends on experimental details that are not carried by the coding variable. Thus, the representational power of neural codes is much more limited than generally implied. Second, neural codes carry information only by reference (...)
  • Information: a very short introduction. Luciano Floridi - 2010 - New York: Oxford University Press.
    This book helps us understand the true meaning of the concept and how it can be used to understand our world.
  • Signals: Evolution, Learning, and Information. Brian Skyrms - 2010 - Oxford, GB: Oxford University Press.
    Brian Skyrms offers a fascinating demonstration of how fundamental signals are to our world. He uses various scientific tools to investigate how meaning and communication develop. Signals operate in networks of senders and receivers at all levels of life, transmitting and processing information. That is how humans and animals think and interact.
  • A Philosophical Letter of Alfred Tarski. Morton White - 1987 - Journal of Philosophy 84 (1):28-32.
  • From Bacteria to Bach and Back: The Evolution of Minds. Daniel C. Dennett - 2017 - W. W. Norton & Company.
  • On Quantifying Semantic Information. Simon D'Alfonso - 2011 - Information 2 (1):61-101.
    The purpose of this paper is to look at some existing methods of semantic information quantification and suggest some alternatives. It begins with an outline of Bar-Hillel and Carnap’s theory of semantic information before going on to look at Floridi’s theory of strongly semantic information. The latter then serves to initiate an in-depth investigation into the idea of utilising the notion of truthlikeness to quantify semantic information. Firstly, a couple of approaches to measure truthlikeness are drawn from the literature and (...)
  • What is information? David J. Israel & John Perry - 1990 - In Philip P. Hanson (ed.), Information, Language and Cognition. University of British Columbia Press.
  • Representational content in humans and machines. Mark H. Bickhard - 1993 - Journal of Experimental and Theoretical Artificial Intelligence 5:285-33.
    This article focuses on the problem of representational content. Accounting for representational content is the central issue in contemporary naturalism: it is the major remaining task facing a naturalistic conception of the world. Representational content is also the central barrier to contemporary cognitive science and artificial intelligence: it is not possible to understand representation in animals nor to construct machines with genuine representation given current (lack of) understanding of what representation is. An elaborated critique is offered to current approaches to (...)
  • Information, Mechanism and Meaning. Donald M. Mackay - 1972 - Synthese 24 (3):472-474.