Information is a notion of wide use and great intuitive appeal, and hence, not surprisingly, different formal paradigms claim part of it, from Shannon channel theory to Kolmogorov complexity. Information is also a widely used term in logic, but a similar diversity repeats itself: there are several competing logical accounts of this notion, ranging from semantic to syntactic. In this chapter, we will discuss three major logical accounts of information.
Brutus wanted to kill Caesar. He believed that Caesar was an ordinary mortal, and that, given this, stabbing him (by which we mean plunging a knife into his heart) was a way of killing him. He thought that he could stab Caesar, for he remembered that he had a knife and saw that Caesar was standing next to him on his left, in the Forum. So Brutus was motivated to stab the man to his left. He did so, thereby killing Caesar.
Kaplan says that monsters violate Principle 2 of his theory: that indexicals, pure and demonstrative alike, are directly referential. In explaining why there are no monsters, Kaplan holds that his theory has an advantage over double-indexing theories like Kamp's, Segerberg's, or Stalnaker's, which either embrace monsters or avoid them only by ad hoc stipulation: his theory draws a sharp conceptual distinction between circumstances of evaluation and contexts of utterance. We shall argue that Kaplan's prohibition is also essentially stipulative, and that it is too general. The main difference between us and Kaplan is that for him the basic carriers of truth-value are sentences-in-contexts, whereas our account is utterance-based.
We sketch the historical and conceptual context of Turing's analysis of algorithmic or mechanical computation. We then discuss two responses to that analysis, by Gödel and by Gandy, both of which raise, though in very different ways, the possibility of computational procedures that cannot be reduced to the basic procedures into which Turing decomposed computation. Along the way, we touch on some of Cleland's views.
Situation theory is the result of an interdisciplinary effort to create a full-fledged theory of information. Created by scholars and scientists from cognitive science, computer science and AI, linguistics, logic, philosophy, and mathematics, it aims to provide a common set of tools for the analysis of phenomena from all these fields. Unlike Shannon–Weaver-type theories of information, which are purely quantitative, situation theory seeks to provide tools for analyzing the specific content of a situation. The question addressed is not how much information is carried, but what information is carried.