About this topic
Summary: The main focus of this category is Shannon's mathematical theory of information and its broader philosophical uses. This includes, in the first place, cybernetics, signalling theories, the sender-receiver communication model, and Kolmogorov complexity. More general uses of information theory that overlap with other domains of the philosophy of information may also belong to this category. Examples include different philosophical conceptions of information (semantic conceptions, semiotic approaches), as well as applications in specific domains of theoretical philosophy, such as the philosophy of science and the philosophy of mind.
  1. Falsification and Future Performance.David Balduzzi - manuscript
    We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary we show that empirical VC-entropy quantifies the message (...)
  2. On Classical and Quantum Logical Entropy.David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
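The abstract's central notion is easy to make concrete: a distinction of a partition is a pair of elements lying in different blocks. The following sketch is our own illustration of that definition, not code from Ellerman's paper:

```python
from itertools import combinations

def distinctions(partition):
    """All pairs of elements that lie in different blocks of the partition."""
    block_of = {x: i for i, block in enumerate(partition) for x in block}
    return [(a, b) for a, b in combinations(sorted(block_of), 2)
            if block_of[a] != block_of[b]]

# On {1, 2, 3}, the partition {{1, 2}, {3}} distinguishes 1 from 3 and 2 from 3;
# the one-block partition {{1, 2, 3}} distinguishes nothing.
print(distinctions([{1, 2}, {3}]))   # [(1, 3), (2, 3)]
print(distinctions([{1, 2, 3}]))     # []
```

Normalizing the count of distinctions by the total number of ordered pairs yields the counting measure that parallels Boole's logical probability on subsets.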
  3. A Semantic Information Formula Compatible with Shannon and Popper's Theories.Chenguang Lu - manuscript
    Semantic information conveyed by daily language has been researched for many years; yet, we still need a practical formula to measure the information of a simple sentence or prediction, such as "There will be heavy rain tomorrow". For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh’s fuzzy set theory and P. Z. Wang’s random set falling shadow theory. It carries forward C. E. Shannon and K. Popper’s thought. The fuzzy set’s (...)
  4. Information, Learning and Falsification.David Balduzzi - 2011
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning (...)
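The first two approaches listed in the abstract can be sketched in a few lines. This is our toy illustration, not the paper's code; zlib compression stands in as a crude, computable proxy for Kolmogorov complexity, which is itself uncomputable:

```python
import math
import random
import zlib

def self_information(p: float) -> float:
    """Shannon self-information in bits: the surprise of an event with probability p."""
    return -math.log2(p)

def compressed_length(s: bytes) -> int:
    """Length of a zlib encoding: a rough upper bound in the spirit of
    Kolmogorov complexity (the true quantity is uncomputable)."""
    return len(zlib.compress(s))

# Observing a fair-coin outcome rules out half the alternatives: exactly 1 bit.
print(self_information(0.5))  # 1.0

# A highly regular string admits a much shorter description than a
# pseudo-random one of the same length.
regular = b"ab" * 500
noisy = random.Random(0).randbytes(1000)
print(compressed_length(regular) < compressed_length(noisy))  # True
```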
  5. Minimum Message Length as a Truth-Conducive Simplicity Measure.Steve Petersen - manuscript
    Given at the 2007 Formal Epistemology Workshop at Carnegie Mellon, June 2nd. Good compression must track higher versus lower probability of inputs, and this is one way to approach how simplicity tracks truth.
  6. Information, Meaning and Physics: The Intellectual Transformation of the English School of Information Theory During 1946-1956.Javier Anta - forthcoming - Science in Context.
    In this comparative historical analysis, we will analyze the intellectual tendency that emerged between 1946 and 1956 to take advantage of the popularity of communication theory to develop a kind of informational epistemology of statistical mechanics. We will argue that this tendency results from a historical confluence in the early 1950s of certain theoretical claims of the so-called English School of Information Theory, championed by authors such as Gabor (1956) or MacKay (1969), and the search to extend the profound success (...)
  7. Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy.Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the beginning of (...)
  8. Algorithmic Randomness and Measures of Complexity.George Barmpalias - forthcoming - Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    We survey recent advances on the interface between computability theory and algorithmic randomness, with special attention on measures of relative complexity. We focus on (weak) reducibilities that measure (a) the initial segment complexity of reals and (b) the power of reals to compress strings, when they are used as oracles. The results are put into context and several connections are made with various central issues in modern algorithmic randomness and computability.
  9. Strengthening Weak Emergence.Nora Berenstain - forthcoming - Erkenntnis:1-18.
    Bedau's influential (1997) account analyzes weak emergence in terms of the non-derivability of a system’s macrostates from its microstates except by simulation. I offer an improved version of Bedau’s account of weak emergence in light of insights from information theory. Non-derivability alone does not guarantee that a system’s macrostates are weakly emergent. Rather, it is non-derivability plus the algorithmic compressibility of the system’s macrostates that makes them weakly emergent. I argue that the resulting information-theoretic picture provides a metaphysical account of (...)
  10. Filled/Non-Filled Pairs: An Empirical Challenge to the Integrated Information Theory of Consciousness.Amber R. Hopkins & Kelvin J. McQueen - forthcoming - Consciousness and Cognition.
    Perceptual filling-in for vision is the insertion of visual properties (e.g., color, contour, luminance, or motion) into one’s visual field, when those properties have no corresponding retinal input. This paper introduces and provides preliminary empirical support for filled/non-filled pairs, pairs of images that appear identical, yet differ by amount of filling-in. It is argued that such image pairs are important to the experimental testing of theories of consciousness. We review recent experimental research and conclude that filling-in involves brain activity with (...)
  11. Algorithm, Information.A. N. Kolmogorov - forthcoming - Complexity.
  12. Information Before Information Theory: The Politics of Data Beyond the Perspective of Communication.Colin Koopman - forthcoming - New Media and Society.
    Scholarship on the politics of new media widely assumes that communication functions as a sufficient conceptual paradigm for critically assessing new media politics. This article argues that communication-centric analyses fail to engage the politics of information itself, limiting information only to its consequences for communication, and neglecting information as it reaches into our selves, lives, and actions beyond the confines of communication. Furthering recent new media historiography on the “information theory” of Shannon and Wiener, the article reveals both the primacy (...)
  13. Objects and Processes: Two Notions for Understanding Biological Information.Agustín Mercado-Reyes, Pablo Padilla Longoria & Alfonso Arroyo-Santos - forthcoming - Journal of Theoretical Biology.
    In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as pernicious metaphors. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that quantitative theoretical frameworks do (...)
  14. Reviewed Work(s): Some Theorems on the Algorithmic Approach to Probability Theory and Information Theory (1971 Dissertation Directed by A. N. Kolmogorov), Annals of Pure and Applied Logic, Vol. 162, by L. A. Levin. [REVIEW] Jan Reimann - forthcoming - Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    Review by: Jan Reimann The Bulletin of Symbolic Logic, Volume 19, Issue 3, Page 397-399, September 2013.
  15. Real Patterns and Indispensability.Abel Suñé & Manolo Martínez - forthcoming - Synthese 198 (5):4315-4330.
    While scientific inquiry crucially relies on the extraction of patterns from data, we still have a far from perfect understanding of the metaphysics of patterns—and, in particular, of what makes a pattern real. In this paper we derive a criterion of real-patternhood from the notion of conditional Kolmogorov complexity. The resulting account belongs to the philosophical tradition, initiated by Dennett (1991), that links real-patternhood to data compressibility, but is simpler and formally more perspicuous than other proposals previously defended in (...)
  16. Can Informational Thermal Physics Explain the Approach to Equilibrium?Javier Anta - 2021 - Synthese 199 (1-2):4015–4038.
    In this paper I will defend the incapacity of the informational frameworks in thermal physics, mainly those that historically and conceptually derive from the work of Brillouin (1962) and Jaynes (1957a), to robustly explain the approach of certain gaseous systems to their state of thermal equilibrium from the dynamics of their molecular components. I will further argue that, since their various interpretative, conceptual and technical-formal resources (e.g. epistemic interpretations of probabilities and entropy measures, identification of thermal entropy as Shannon information, (...)
  17. Make Information in Science Meaningful Again.Javier Anta - 2021 - Logos and Episteme: An International Journal of Epistemology (3):263-286.
    Although the everyday notion of information has clear semantic properties, the all-pervasive technical concept of Shannon information is usually considered as a non-semantic concept. In this paper I show how this concept was implicitly ‘semantized’ in the early 1950s by many authors, such as Rothstein or Brillouin, in order to explain the knowledge dynamics underlying certain scientific practices such as measurement. On the other hand, I argue that the main attempts in the literature to develop a quantitative measure of semantic (...)
  18. Integrated Information Theory, Intrinsicality, and Overlapping Conscious Systems.James C. Blackmon - 2021 - Journal of Consciousness Studies 28 (11-12):31-53.
  19. Critique of the Integrated Information Theory of Consciousness: Or, the Relevance of Ontological Information.A. Peuhu - 2021 - Journal of Consciousness Studies 28 (5-6):58-78.
  20. Nature's Operating System.Ilexa Yardley - 2021 - https://medium.com/the-circular-theory.
  21. Two Informational Theories of Memory: A Case From Memory-Conjunction Errors.Danilo Fraga Dantas - 2020 - Disputatio 12 (59):395-431.
    The causal and simulation theories are often presented as very distinct views about declarative memory, their major difference lying in the causal condition. The causal theory states that remembering involves an accurate representation causally connected to an earlier experience. In the simulation theory, remembering involves an accurate representation generated by a reliable memory process. I investigate how to construe detailed versions of these theories that correctly classify memory errors as misremembering or confabulation. Neither causalists nor simulationists have paid attention to (...)
  22. A Contingency Interpretation of Information Theory as a Bridge Between God’s Immanence and Transcendence.Philippe Gagnon - 2020 - In Michael Fuller, Dirk Evers, Anne L. C. Runehov, Knut-Willy Sæther & Bernard Michollet (eds.), Issues in Science and Theology: Nature – and Beyond. Cham: Springer. pp. 169-185.
    This paper investigates the degree to which information theory, and the derived uses that make it work as a metaphor of our age, can be helpful in thinking about God’s immanence and transcendence. We ask when it is possible to say that a consciousness has to be behind the information we encounter. If God is to be thought about as a communicator of information, we need to ask whether a communication system has to pre-exist the divine and impose itself (...)
  23. Channels’ Confirmation and Predictions’ Confirmation: From the Medical Test to the Raven Paradox.Chenguang Lu - 2020 - Entropy 22 (4):384.
    After long arguments between positivism and falsificationism, the verification of universal hypotheses was replaced with the confirmation of uncertain major premises. Unfortunately, Hempel proposed the Raven Paradox. Then, Carnap used the increment of logical probability as the confirmation measure. So far, many confirmation measures have been proposed. Measure F, proposed by Kemeny and Oppenheim, possesses the symmetries and asymmetries proposed by Eells and Fitelson, the monotonicity proposed by Greco et al., and the normalizing property suggested by many researchers. Based on the (...)
  24. The P–T Probability Framework for Semantic Communication, Falsification, Confirmation, and Bayesian Reasoning.Chenguang Lu - 2020 - Philosophies 5 (25):25-0.
    Many researchers want to unify probability and logic by defining logical probability or probabilistic logic reasonably. This paper tries to unify statistics and logic so that we can use both statistical probability and logical probability at the same time. For this purpose, this paper proposes the P–T probability framework, which is assembled with Shannon’s statistical probability framework for communication, Kolmogorov’s probability axioms for logical probability, and Zadeh’s membership functions used as truth functions. Two kinds of probabilities are connected by an (...)
  25. Conscious Matter and Matters of Conscience.Matthew Owen - 2020 - Philosophia Christi 22 (1):145-156.
    In recent decades consciousness science has become a prominent field of research. This essay analyzes the most recent book by a leading pioneer in the scientific study of consciousness. In The Feeling of Life Itself, Christof Koch presents the integrated information theory and applies it to multiple pressing topics in consciousness studies. This essay considers the philosophical basis of the theory and Koch’s application of it, from neurobiology to animal ethics.
  26. The Elementary Basis of Language [Элементарная основа языка].Andrej Poleev - 2020 - Enzymes 18.
    The Russian language has come a long way in its formation, in the course of which its alphabet, its conceptual and semantic content, and its culture of speech were refined. In the 20th century, the Russian language became and remains the most developed language of the modern era, and in this capacity it serves as a kind of benchmark for evaluating other languages.
  27. Two Kinds of Information Processing in Cognition.Mark Sprevak - 2020 - Review of Philosophy and Psychology 11 (3):591-611.
    What is the relationship between information and representation? Dating back at least to Dretske, an influential answer has been that information is a rung on a ladder that gets one to representation. Representation is information, or representation is information plus some other ingredient. In this paper, I argue that this approach oversimplifies the relationship between information and representation. If one takes current probabilistic models of cognition seriously, information is connected to representation in a new way. It enters as a property (...)
  28. Does Semantic Information Need to Be Truthful?Björn Lundgren - 2019 - Synthese 196 (7):2885-2906.
    The concept of information has well-known difficulties. Among the many issues that have been discussed is the alethic nature of a semantic conception of information. Floridi (2004: 197–222; Philosophy and Phenomenological Research 70: 351–370, 2005; EUJAP 3: 31–41, 2007; The Philosophy of Information, Oxford University Press, Oxford, 2011) argued that semantic information must be truthful. In this article, arguments will be presented in favor of an alethically neutral conception of semantic information, and it will be shown that such a conception can withstand Floridi’s (...)
  29. Defining Information Security.Björn Lundgren & Niklas Möller - 2019 - Science and Engineering Ethics 25 (2):419-441.
    This article proposes a new definition of information security, the ‘Appropriate Access’ definition. Apart from providing the basic criteria for a definition—correct demarcation and meaning concerning the state of security—it also aims at being a definition suitable for any information security perspective. As such, it bridges the conceptual divide between so-called ‘soft issues’ of information security and more technical issues. Because of this it is also suitable for various analytical purposes, such as analysing possible security breaches, or for studying conflicting (...)
  30. The Semantics Latent in Shannon Information.Alistair M. C. Isaac - 2019 - British Journal for the Philosophy of Science 70 (1):103-125.
    The lore is that standard information theory provides an analysis of information quantity, but not of information content. I argue this lore is incorrect, and there is an adequate informational semantics latent in standard theory. The roots of this notion of content can be traced to the secret parallel development of an information theory equivalent to Shannon’s by Turing at Bletchley Park, and it has been suggested independently in recent work by Skyrms and Bullinaria and Levy. This paper explicitly articulates (...)
  31. Semantic Information G Theory and Logical Bayesian Inference for Machine Learning.Chenguang Lu - 2019 - Information 10 (8):261.
    An important problem with machine learning is that when the number of labels n > 2, it is very difficult to construct and optimize a group of learning functions, and we wish that optimized learning functions remain useful when the prior distribution P(x) (where x is an instance) is changed. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists (...)
  32. A Simplicity Criterion for Physical Computation.Tyler Millhouse - 2019 - British Journal for the Philosophy of Science 70 (1):153-178.
    The aim of this paper is to offer a formal criterion for physical computation that allows us to objectively distinguish between competing computational interpretations of a physical system. The criterion construes a computational interpretation as an ordered pair of functions mapping (1) states of a physical system to states of an abstract machine, and (2) inputs to this machine to interventions in this physical system. This interpretation must ensure that counterfactuals true of the abstract machine have appropriate counterparts which are (...)
  33. El breve “Discurso del método” de Claude Shannon.Juan Ramón Álvarez - 2018 - Principia: An International Journal of Epistemology 22 (3):393-410.
    The following study departs from the lecture entitled “Creative Thinking”, delivered by Claude Shannon in 1952 at the Bell Laboratories. This paper includes an interpretive and critical account of the necessary conditions, as well as the desirable procedures, which must be satisfied in scientific and technological invention, within the frame of the so-called scientist’s spontaneous philosophy.
  34. Intervening on the Causal Exclusion Problem for Integrated Information Theory.Matthew Baxendale & Garrett Mindt - 2018 - Minds and Machines 28 (2):331-351.
    In this paper, we examine the causal framework within which integrated information theory of consciousness makes it claims. We argue that, in its current formulation, IIT is threatened by the causal exclusion problem. Some proponents of IIT have attempted to thwart the causal exclusion problem by arguing that IIT has the resources to demonstrate genuine causal emergence at macro scales. In contrast, we argue that their proposed solution to the problem is damagingly circular as a result of inter-defining information and (...)
  35. An Informational Theory of Counterfactuals.Danilo Dantas - 2018 - Acta Analytica 33 (4):525-538.
    Backtracking counterfactuals are problem cases for the standard, similarity-based theories of counterfactuals (e.g., Lewis’s). These theories usually need to employ extra assumptions to deal with those cases. Hiddleston (2005) proposes a causal theory of counterfactuals that, supposedly, deals well with backtracking. The main advantage of the causal theory is that it provides a unified account for backtracking and non-backtracking counterfactuals. In this paper, I present a backtracking counterfactual that is a problem case for Hiddleston’s account. Then I propose an (...)
  36. A Scientific Metaphysical Naturalisation of Information.Bruce Long - 2018 - Dissertation, University of Sydney
    The objective of this thesis is to present a naturalised metaphysics of information, or to naturalise information, by way of deploying a scientific metaphysics according to which contingency is privileged and a priori conceptual analysis is excluded (or at least greatly diminished) in favour of contingent and defeasible metaphysics. The ontology of information is established according to the premises and mandate of the scientific metaphysics by inference to the best explanation, and in accordance with the idea that the primacy of physics (...)
  37. Information and Inaccuracy.William Roche & Tomoji Shogenji - 2018 - British Journal for the Philosophy of Science 69 (2):577-604.
    This article proposes a new interpretation of mutual information. We examine three extant interpretations of MI by reduction in doubt, by reduction in uncertainty, and by divergence. We argue that the first two are inconsistent with the epistemic value of information assumed in many applications of MI: the greater is the amount of information we acquire, the better is our epistemic position, other things being equal. The third interpretation is consistent with EVI, but it is faced with the problem of (...)
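For readers unfamiliar with mutual information (MI), the standard identity I(X;Y) = H(X) + H(Y) - H(X,Y) can be estimated directly from joint samples. A small sketch of our own, unrelated to the authors' formal apparatus:

```python
import math
from collections import Counter

def entropy(counts: Counter) -> float:
    """Shannon entropy (bits) of the empirical distribution given by counts."""
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def mutual_information(pairs) -> float:
    """Estimate I(X;Y) = H(X) + H(Y) - H(X,Y) from a list of (x, y) samples."""
    xs = Counter(x for x, _ in pairs)
    ys = Counter(y for _, y in pairs)
    joint = Counter(pairs)
    return entropy(xs) + entropy(ys) - entropy(joint)

# Perfectly correlated fair bits share exactly 1 bit;
# uniformly distributed independent pairs share none.
print(mutual_information([(0, 0), (1, 1)] * 4))              # 1.0
print(mutual_information([(0, 0), (0, 1), (1, 0), (1, 1)]))  # 0.0
```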
  38. Patterns, Information, and Causation.Holly Andersen - 2017 - Journal of Philosophy 114 (11):592-622.
    This paper articulates an account of causation as a collection of information-theoretic relationships between patterns instantiated in the causal nexus. I draw on Dennett’s account of real patterns to characterize potential causal relata as patterns with specific identification criteria and noise tolerance levels, and actual causal relata as those patterns instantiated at some spatiotemporal location in the rich causal nexus as originally developed by Salmon. I develop a representation framework using phase space to precisely characterize causal relata, including their degree (...)
  39. The Deluge of Spurious Correlations in Big Data.Cristian S. Calude & Giuseppe Longo - 2017 - Foundations of Science 22 (3):595-612.
    Very large databases are a major opportunity for science and data analytics is a remarkable new field of investigation in computer science. The effectiveness of these tools is used to support a “philosophy” against the scientific method as developed throughout history. According to this view, computer-discovered correlations should replace understanding and guide prediction and action. Consequently, there will be no need to give scientific meaning to phenomena, by proposing, say, causal relations, since regularities in very large databases are enough: “with (...)
  40. Information-Not-Thing: Further Problems with and Alternatives to the Belief That Information is Physical.Jesse David Dinneen & Christian Brauner - 2017 - Proceedings of 2017 CAIS-ACSI Conference.
    In this short paper, we show that a popular view in information science, information-as-thing, fails to account for a common example of information that seems physical. We then demonstrate how the distinction between types and tokens, recently used to analyse Shannon information, can account for this same example by viewing information as abstract, and discuss existing definitions of information that are consistent with this approach.
  41. Logical Information Theory: New Logical Foundations for Information Theory.David Ellerman - 2017 - Logic Journal of the IGPL 25 (5):806-835.
    There is a new theory of information based on logic. The definition of Shannon entropy as well as the notions of joint, conditional, and mutual entropy as defined by Shannon can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the probability measure on the sets (...)
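The two entropies the abstract relates are each one-liners over a probability distribution: Shannon entropy H(p) = -Σ pᵢ log₂ pᵢ, and logical entropy h(p) = 1 - Σ pᵢ², the probability that two independent draws are distinguished. A quick sketch (our illustration, not the paper's code):

```python
import math

def shannon_entropy(p):
    """H(p) = -sum p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def logical_entropy(p):
    """h(p) = 1 - sum p_i**2: the probability that two independent
    draws from p fall in different blocks, i.e. are distinguished."""
    return 1 - sum(pi ** 2 for pi in p)

dist = [0.5, 0.25, 0.25]
print(shannon_entropy(dist))  # 1.5
print(logical_entropy(dist))  # 0.625
```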
  42. Information Flow in the Brain: Ordered Sequences of Metastable States.Andrew A. Fingelkurts & Alexander A. Fingelkurts - 2017 - Information 8 (1):22.
    In this brief overview paper, we analyse information flow in the brain. Although Shannon’s information concept, in its pure algebraic form, has made a number of valuable contributions to neuroscience, information dynamics within the brain is not fully captured by its classical description. These additional dynamics consist of self-organisation, interplay of stability/instability, timing of sequential processing, coordination of multiple sequential streams, circular causality between bottom-up and top-down operations, and information creation. Importantly, all of these processes are dynamic, hierarchically nested and (...)
  43. Information and Veridicality: Information Processing and the Bar-Hillel/Carnap Paradox.Nir Fresco & Michaelis Michael - 2016 - Philosophy of Science 83 (1):131-151.
    Floridi’s Theory of Strongly Semantic Information posits the Veridicality Thesis. One motivation is that it can serve as a foundation for information-based epistemology being an alternative to the tripartite theory of knowledge. However, the Veridicality thesis is false, if ‘information’ is to play an explanatory role in human cognition. Another motivation is avoiding the so-called Bar-Hillel/Carnap paradox. But this paradox only seems paradoxical, if ‘information’ and ‘informativeness’ are synonymous, logic is a theory of inference, or validity suffices for rational inference; (...)
  44. On the Eigenvalue and Shannon's Entropy of Finite Length Random Sequences.Lingfeng Liu, Suoxia Miao, Hanping Hu & Yashuang Deng - 2016 - Complexity 21 (2):154-161.
  45. Deflating the Deflationary View of Information.Olimpia Lombardi, Sebastian Fortin & Cristian López - 2016 - European Journal for Philosophy of Science 6 (2):209-230.
    Christopher Timpson proposes a deflationary view about information, according to which the term ‘information’ is an abstract noun and, as a consequence, information is not part of the material contents of the world. The main purpose of the present article consists in supplying a critical analysis of this proposal, which will lead us to conclude that information is an item even more abstract than what Timpson claims. From this view, we embrace a pluralist stance that recognizes the legitimacy of different (...)
  46. What is Quantum Information?Olimpia Lombardi, Federico Holik & Leonardo Vanni - 2016 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 56:17-26.
    In the present paper we develop several arguments to show that there is no reason to regard quantum information as qualitatively different from Shannon information. There is only one kind of information, which can be coded by means of orthogonal or non-orthogonal states. The analogy between Shannon’s theory and Schumacher’s theory is confined to coding theorems. The attempt to extend the analogy beyond this original scope leads to a concept of quantum information that becomes indistinguishable from that (...)
  47. What is Shannon Information?Olimpia Lombardi, Federico Holik & Leonardo Vanni - 2016 - Synthese 193 (7):1983-2012.
    Despite its formal precision and its many applications, Shannon’s theory still offers an active terrain of debate when the interpretation of its main concepts is at issue. In this article we analyze certain points that remain obscure or are matters of discussion, and whose elucidation contributes to the assessment of the different interpretive proposals about the concept of information. In particular, we argue for a pluralist position, according to which the different views about information (...)
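    The concept at issue in this entry can be made concrete with a minimal sketch (illustrative only, not drawn from the paper): Shannon’s entropy H(X) = −Σ p(x) log₂ p(x) measures the average information, in bits, carried by a source’s output.

    ```python
    import math

    def shannon_entropy(probs):
        """Shannon entropy in bits: H = -sum p * log2(p).
        Zero-probability outcomes contribute nothing, so they are skipped."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally uncertain: 1 bit per toss.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    # A biased coin is more predictable, so it carries less information.
    print(shannon_entropy([0.9, 0.1]))   # ~0.469
    ```

    On any interpretation of ‘information’, this formal quantity is what the interpretive proposals surveyed in the paper are proposals about.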
  48. A review on a peer review.Andrej Poleev - 2016 - Enzymes 14.
    Peer review is an opportunity to perform unlawful censorship, ensuring that no apostate notion ever gets published in mainstream journals. Such peer-review censorship is also an opportunity to steal content and afterwards claim priority of first publication. And last but not least, peer review is an academic tool for promoting mainstream pseudoscience.
  49. Information-Theoretic Philosophy of Mind.Jason Winning & William Bechtel - 2016 - In Luciano Floridi (ed.), The Routledge Handbook of Philosophy of Information. London and New York: Routledge. pp. 347-360.
  50. A Quantitative-Informational Approach to Logical Consequence.Marcos Antonio Alves & Ítala M. Loffredo D'Otaviano - 2015 - In Jean-Yves Beziau (ed.), The Road to Universal Logic (Studies in Universal Logic). Switzerland: Springer International Publishing. pp. 105-24.
    In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon’s quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of (...)
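    The idea of attaching a Shannon-style quantity of information to propositional formulae can be sketched in a few lines. This is a minimal illustration under a simplifying assumption the paper does not make (a uniform probability over classical valuations); all helper names are hypothetical, and the paper’s own probabilistic semantics is more general.

    ```python
    import itertools
    import math

    def probability(formula, atoms):
        """Probability of a formula under a uniform distribution over valuations:
        the fraction of truth assignments that satisfy it. `formula` is a Python
        predicate over an assignment dict mapping atom names to booleans."""
        assignments = [dict(zip(atoms, vals))
                       for vals in itertools.product([False, True], repeat=len(atoms))]
        satisfying = sum(1 for a in assignments if formula(a))
        return satisfying / len(assignments)

    def information(formula, atoms):
        """Shannon-style quantity of information: -log2 of the formula's probability."""
        p = probability(formula, atoms)
        if p == 0:
            return math.inf  # a contradiction would carry infinite information
        return -math.log2(p)

    atoms = ["p", "q"]
    # p AND q holds in 1 of 4 valuations: probability 1/4, i.e. 2 bits.
    print(information(lambda v: v["p"] and v["q"], atoms))  # 2.0
    # p OR not-p is a tautology: probability 1, so it carries no information.
    print(information(lambda v: v["p"] or not v["p"], atoms))
    ```

    On such a semantics, the more a formula rules out, the less probable and the more informative it is, which is the intuition the paper builds its definition of logical consequence on.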
1 — 50 / 220