Results for 'entropy measure'

991 found
  1. Entropy Measures Can Add Novel Information to Reveal How Runners' Heart Rate and Speed Are Regulated by Different Environments. Juliana Exel, Nuno Mateus, Bruno Gonçalves, Catarina Abrantes, Julio Calleja-González & Jaime Sampaio - 2019 - Frontiers in Psychology 10.
  2. Paraconsistent conjectural deduction based on logical entropy measures I: C-systems as non-standard inference framework. Paola Forcheri & Paolo Gentilini - 2005 - Journal of Applied Non-Classical Logics 15 (3):285-319.
    A conjectural inference is proposed, aimed at producing conjectural theorems from formal conjectures assumed as axioms, as well as admitting contradictory statements as conjectural theorems. To this end, we employ Paraconsistent Informational Logic, which provides a formal setting where the notion of conjecture formulated by an epistemic agent can be defined. The paraconsistent systems on which conjectural deduction is based are sequent formulations of the C-systems presented in Carnielli-Marcos [CAR 02b]. Thus, conjectural deduction may also be considered to be a (...)
    1 citation
  3. Entropy and compression: two measures of complexity. Teresa Henriques, Hernâni Gonçalves, Luís Antunes, Mara Matias, João Bernardes & Cristina Costa-Santos - 2013 - Journal of Evaluation in Clinical Practice 19 (6):1101-1106.
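Entropy and compression can disagree as complexity measures, which is the contrast this paper trades on. Below is a minimal, self-contained sketch (not the authors' clinical pipeline; the function names and toy strings are invented for illustration): a first-order Shannon entropy estimate, which is blind to symbol order, against a zlib compression ratio, which exploits repeating structure.

```python
import random
import zlib
from collections import Counter
from math import log2

def entropy_bits_per_symbol(s: str) -> float:
    """First-order Shannon entropy of the character distribution (bits/symbol)."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def compression_ratio(s: str) -> float:
    """Compressed size / original size; lower means more structure was found."""
    raw = s.encode("utf-8")
    return len(zlib.compress(raw, 9)) / len(raw)

random.seed(0)
periodic = "ab" * 500                                      # perfectly ordered
noisy = "".join(random.choice("ab") for _ in range(1000))  # no structure

# Both strings have ~1 bit/symbol first-order entropy (it ignores order),
# but the compressor detects the periodicity and ranks them differently:
for s in (periodic, noisy):
    print(round(entropy_bits_per_symbol(s), 3), round(compression_ratio(s), 3))
```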
  4. On Measuring the Complexity of Networks: Kolmogorov Complexity versus Entropy. Mikołaj Morzy, Tomasz Kajdanowicz & Przemysław Kazienko - 2017 - Complexity:1-12.
    1 citation
  5. IN-cross Entropy Based MAGDM Strategy under Interval Neutrosophic Set Environment. Shyamal Dalapati, Surapati Pramanik, Shariful Alam, Florentin Smarandache & Tapan Kumar Roy - 2017 - Neutrosophic Sets and Systems 18:43-57.
    The cross entropy measure is one of the best ways to calculate the divergence of a variable from a prior variable. We define a new cross entropy measure under the interval neutrosophic set environment.
    5 citations
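For orientation, the classical Shannon cross entropy that work in this line generalizes is a one-liner; the sketch below shows only that baseline (the paper's interval-neutrosophic measure is not reproduced, and the toy distributions are invented):

```python
from math import log2

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i * log2(q_i): expected code length when outcomes
    drawn from p are encoded with a code that is optimal for q."""
    return -sum(pi * log2(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]  # "true" distribution (toy values)
q = [0.25, 0.25, 0.5]  # prior / model distribution (toy values)
# H(p, q) >= H(p, p); the gap is the Kullback-Leibler divergence D(p || q).
print(cross_entropy(p, p))  # 1.5  (the entropy of p itself)
print(cross_entropy(p, q))  # 1.75
```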
  6. Maximum-entropy spectral analysis of extended energy-loss fine structure and its application to time-resolved measurement. Shunsuke Muto - 2004 - Philosophical Magazine 84 (25-26):2793-2808.
  7. Entropy and a sub-group of geometric measures of paths predict the navigability of an environment. D. Yesiltepe, P. Fernández Velasco, A. Coutrot, A. Ozbil Torun, J. M. Wiener, C. Holscher, M. Hornberger, R. Conroy Dalton & H. J. Spiers - 2023 - Cognition 236 (C):105443.
  8. Improved Permutation Entropy for Measuring Complexity of Time Series under Noisy Condition. Zhe Chen, Yaan Li, Hongtao Liang & Jing Yu - 2019 - Complexity 2019 (3):1-12.
    1 citation
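As background for this entry, here is a minimal sketch of the standard Bandt-Pompe permutation entropy, the baseline that the authors' noise-robust variant improves on (their improved estimator is not reproduced; parameter values are illustrative):

```python
import random
from collections import Counter
from math import factorial, log

def permutation_entropy(x, m=3, tau=1):
    """Normalized permutation entropy in [0, 1]: the Shannon entropy of the
    distribution of ordinal patterns of length m (time delay tau) in x."""
    patterns = Counter(
        tuple(sorted(range(m), key=lambda k: x[i + k * tau]))
        for i in range(len(x) - (m - 1) * tau)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * log(c / total) for c in patterns.values())
    return h / log(factorial(m))

random.seed(1)
print(permutation_entropy(list(range(100))))                        # 0.0 (monotone)
print(permutation_entropy([random.random() for _ in range(5000)]))  # ~1.0 (noise)
```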
  9. Entropy in Relation to Incomplete Knowledge. Michael J. Zenzen - 1985 - Cambridge University Press.
    This book is about an important issue which has arisen within two of the branches of physical science - namely thermodynamics and statistical mechanics - where the notion of entropy plays an essential role. A number of scientists and information theorists have maintained that entropy is a subjective concept and is a measure of human ignorance. Such a view, if it is valid, would create some profound philosophical problems and would tend to undermine the objectivity of the (...)
    4 citations
  10. In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour?—Bridging the Gap Between Dynamical Systems Theory and Communication Theory. Roman Frigg - 2004 - British Journal for the Philosophy of Science 55 (3):411-434.
    On an influential account, chaos is explained in terms of random behaviour; and random behaviour in turn is explained in terms of having positive Kolmogorov-Sinai entropy (KSE). Though intuitively plausible, the association of the KSE with random behaviour needs justification since the definition of the KSE does not make reference to any notion that is connected to randomness. I provide this justification for the case of Hamiltonian systems by proving that the KSE is equivalent to a generalized version of (...)
    18 citations
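For reference, the textbook definition of the Kolmogorov-Sinai entropy the paper starts from, for a measure-preserving map T on a probability space (X, μ), with the supremum taken over finite measurable partitions α and ∨ denoting the common refinement of partitions:

```latex
h_{\mathrm{KS}}(T) \;=\; \sup_{\alpha}\,\lim_{n\to\infty}\frac{1}{n}\,
H\!\Big(\bigvee_{i=0}^{n-1} T^{-i}\alpha\Big),
\qquad
H(\beta) \;=\; -\sum_{B\in\beta}\mu(B)\log\mu(B).
```

Positive h_KS is the property the paper connects to randomness in the communication-theoretic sense.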
  11. Logical Entropy: Introduction to Classical and Quantum Logical Information Theory. David Ellerman - 2018 - Entropy 20 (9):679.
    Logical information theory is the quantitative version of the logic of partitions just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences and distinguishability and is formalized using the distinctions of a partition. All the definitions of simple, joint, conditional and mutual entropy of Shannon information theory are derived by a uniform transformation from the corresponding definitions at the logical level. The purpose of this (...)
    4 citations
  12. Entropy of Polysemantic Words for the Same Part of Speech. Mihaela Colhon, Florentin Smarandache & Dan Valeriu Voinea - unknown
    In this paper, a special type of polysemantic words, that is, words with multiple meanings for the same part of speech, is analyzed under the name of neutrosophic words. These words represent the most difficult cases for disambiguation algorithms, as they are the most ambiguous natural language utterances. To approximate their meanings, we developed a semantic representation framework built from concepts of neutrosophic theory and an entropy measure, into which we incorporate sense-related data. We (...)
  13. On the Possibility to Observe Relations Between Quantum Measurements and the Entropy of Phase Transitions in Zn2(BDC)2. Svetlana G. Kozlova & Denis P. Pishchur - 2021 - Foundations of Physics 51 (1):1-9.
    The work interprets experimental data for the heat capacity of Zn2(BDC)2 in the region of second-order phase transitions. The proposed understanding of the processes occurring during phase transitions may be helpful to reveal quantum Zeno effects in metal–organic frameworks with evolving structural subsystems and to establish relations between quantum measurements and the entropy of phase transitions.
  14. Brain Entropy During Aging Through a Free Energy Principle Approach. Filippo Cieri, Xiaowei Zhuang, Jessica Z. K. Caldwell & Dietmar Cordes - 2021 - Frontiers in Human Neuroscience 15.
    Neural complexity and brain entropy (BEN) have gained greater interest in recent years. The dynamics of neural signals and their relations with information processing continue to be investigated through different measures in a variety of noteworthy studies. The BEN of spontaneous neural activity decreases during states of reduced consciousness. This evidence has been shown in primary consciousness states, such as psychedelic states, under the name of “the entropic brain hypothesis.” In this manuscript we propose an extension of this hypothesis to (...)
    2 citations
  15. Entropy and uncertainty. Teddy Seidenfeld - 1986 - Philosophy of Science 53 (4):467-491.
    This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an "a priori" probability that is uniform, and where all MAXENT constraints are limited to 0-1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 establishes a sensitivity (...)
    54 citations
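The MAXENT mechanics the essay examines can be sketched for the textbook case of a finite support and a single mean constraint (Jaynes' die with a biased average); the bisection solver and the numbers below are illustrative assumptions, not Seidenfeld's:

```python
from math import exp

def maxent_given_mean(xs, mu, lo=-50.0, hi=50.0, tol=1e-12):
    """Distribution on support xs maximizing Shannon entropy subject to
    E[X] = mu. The solution has the Gibbs form p_i ∝ exp(lam * x_i), and
    E[X] is monotone increasing in lam, so lam is found by bisection."""
    def mean(lam):
        w = [exp(lam * x) for x in xs]
        return sum(x * wi for x, wi in zip(xs, w)) / sum(w)
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        lo, hi = (mid, hi) if mean(mid) < mu else (lo, mid)
    w = [exp(lo * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

# A six-sided die constrained to mean 4.5 instead of the fair 3.5:
print([round(p, 4) for p in maxent_given_mean(range(1, 7), 4.5)])
```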
  16. Incorporation of phylogeny in biological diversity measurement: Drawbacks of extensively used indices, and advantages of quadratic entropy. Bastien Mérigot & Jean-Claude Gaertner - 2011 - Bioessays 33 (11):819-822.
    Using the indices δ+ and δ* for assessing phylogenetic diversity may lead to spurious results and interpretations; it can bias recommendations for conservation and lead to inappropriate management decisions. Therefore these indices should be avoided and other indices based on quadratic entropy (Qδ+ and Q) should be used instead.
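The quadratic entropy recommended here is Rao's Q, the expected pairwise (e.g., phylogenetic) distance between two individuals drawn at random from the community. A minimal sketch with invented abundances and distances (the paper's specific indices are not reproduced):

```python
def quadratic_entropy(p, d):
    """Rao's quadratic entropy Q = sum_ij d[i][j] * p[i] * p[j]."""
    n = len(p)
    return sum(d[i][j] * p[i] * p[j] for i in range(n) for j in range(n))

# Toy community: three species with relative abundances p and a symmetric
# matrix d of pairwise distances (illustrative values).
p = [0.5, 0.3, 0.2]
d = [[0.0, 2.0, 6.0],
     [2.0, 0.0, 5.0],
     [6.0, 5.0, 0.0]]
print(quadratic_entropy(p, d))  # 2.4
```

With d[i][j] = 1 for all distinct pairs, Q reduces to the familiar Gini-Simpson diversity index, which is the sense in which Q generalizes abundance-only measures.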
  17. Thermodynamic Entropy and Its Relation to Probability in Classical Mechanics. Kevin Davey - 2011 - Philosophy of Science 78 (5):955-975.
    A gas relaxing into equilibrium is often taken to be a process in which a system moves from an “improbable” to a “probable” state. Given that the thermodynamic entropy increases during such a process, it is natural to conjecture that the thermodynamic entropy is a measure of the probability of a macrostate. For nonideal classical gases, however, I claim that there is no clear sense in which the thermodynamic entropy of a macrostate measures its probability. We (...)
    1 citation
  18. Defining synergy thermodynamically using quantitative measurements of entropy and free energy. Klaus Jaffe & Gerardo Febres - 2016 - Complexity 21 (S2):235-242.
  19. Physical entropy and the senses. Kenneth H. Norwich - 2005 - Acta Biotheoretica 53 (3):167-180.
    With reference to two specific modalities of sensation, the taste of saltiness of chloride salts, and the loudness of steady tones, it is shown that the laws of sensation (logarithmic and power laws) are expressions of the entropy per mole of the stimulus. That is, the laws of sensation are linear functions of molar entropy. In partial verification of this hypothesis, we are able to derive an approximate value for the gas constant, a fundamental physical constant, directly from (...)
    3 citations
  20. A Decision-Making Approach Incorporating TODIM Method and Sine Entropy in q-Rung Picture Fuzzy Set Setting. Büşra Aydoğan, Murat Olgun, Florentin Smarandache & Mehmet Ünver - 2024 - Journal of Applied Mathematics 2024.
    In this study, we propose a new approach based on fuzzy TODIM (Portuguese acronym for interactive and multicriteria decision-making) for decision-making problems in uncertain environments. Our method incorporates group utility and individual regret, which are often ignored in traditional multicriteria decision-making (MCDM) methods. To enhance the analysis and application of fuzzy sets in decision-making processes, we introduce novel entropy and distance measures for q-rung picture fuzzy sets. These measures include an entropy measure based on the sine function (...)
  21. Generalized graph entropies. Matthias Dehmer & Abbe Mowshowitz - 2011 - Complexity 17 (2):45-50.
  22. An introduction to logical entropy and its relation to Shannon entropy. David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the (...)
    5 citations
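The definition quoted above is concrete enough to compute. The logical entropy of a partition is the normalized count of distinctions (ordered pairs of elements in distinct blocks), which simplifies to 1 minus the sum of squared block proportions; the example partition below is invented:

```python
def logical_entropy(blocks):
    """Logical entropy of a partition of a finite set: the fraction of
    ordered pairs (u, u') whose members lie in distinct blocks.
    Since |dit| = n^2 - sum(|B|^2), h = 1 - sum((|B|/n)^2)."""
    n = sum(len(b) for b in blocks)
    return 1.0 - sum((len(b) / n) ** 2 for b in blocks)

# Partition of a 6-element set into blocks of sizes 3, 2, and 1:
print(logical_entropy([{0, 1, 2}, {3, 4}, {5}]))  # 1 - 14/36 = 22/36 ≈ 0.611
```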
  23. Entropy of eye movement during rapid automatized naming. Hongan Wang, Fulin Liu, Yuhong Dong & Dongchuan Yu - 2022 - Frontiers in Human Neuroscience 16.
    Numerous studies have focused on understanding rapid automatized naming (RAN), which can be applied to predict reading abilities and developmental dyslexia in children. Eye tracking, which characterizes essential ocular activities, may be able to reveal the visual and cognitive features of RAN. However, traditional measures of eye movements ignore many dynamical details of the visual and cognitive processing of RAN and are usually associated with the duration of time spent on some particular areas of interest, fixation counts, (...)
  24. Entropy, Disorder, and Traces. Ayhan Sol - 2007 - The Proceedings of the Twenty-First World Congress of Philosophy 12:149-153.
    Traces are generally considered to constitute an ontologically distinct class of objects that can be distinguished from other objects. However, it can be observed on close inspection that the principles to demarcate traces from other objects are quite general, imprecise and intuitively unclear, except perhaps the entropic account envisaging traces as low entropy states. This view was developed by Hans Reichenbach, Adolf Grünbaum, and J. J. C. Smart on the basis of Reichenbach's theory of branch systems that are subsystems (...)
  25. The Measurement Problem is a Feature, Not a Bug – Schematising the Observer and the Concept of an Open System on an Informational, or (neo-)Bohrian, Approach. Michael E. Cuffaro - 2023 - Entropy 25:1410.
    I flesh out the sense in which the informational approach to interpreting quantum mechanics, as defended by Pitowsky and Bub and lately by a number of other authors, is (neo-)Bohrian. I argue that on this approach, quantum mechanics represents what Bohr called a “natural generalisation of the ordinary causal description” in the sense that the idea (which philosophers of science like Stein have argued for on the grounds of practical and epistemic necessity) that understanding a theory as a theory of (...)
  26. Von Neumann’s Entropy Does Not Correspond to Thermodynamic Entropy. Meir Hemmo & Orly Shenker - 2006 - Philosophy of Science 73 (2):153-174.
    Von Neumann argued by means of a thought experiment involving measurements of spin observables that the quantum mechanical quantity -k Tr(ρ ln ρ) is conceptually equivalent to thermodynamic entropy. We analyze Von Neumann's thought experiment and show that his argument fails. Over the past few years there has been a dispute in the literature regarding the Von Neumann entropy. It turns out that each contribution to this dispute addressed a different special case. In this paper we generalize the discussion and examine the (...)
    13 citations
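For reference, the disputed quantity is the von Neumann entropy of a density operator ρ (standard definition; the thermodynamic comparison multiplies it by Boltzmann's constant):

```latex
S_{\mathrm{vN}}(\rho) \;=\; -\,\mathrm{Tr}(\rho\ln\rho) \;=\; -\sum_i \lambda_i \ln\lambda_i ,
```

where the λ_i are the eigenvalues of ρ; the paper's thesis is that this quantity fails to correspond to thermodynamic entropy.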
  27. Maximum Shannon Entropy, Minimum Fisher Information, and an Elementary Game. Shunlong Luo - 2002 - Foundations of Physics 32 (11):1757-1772.
    We formulate an elementary statistical game which captures the essence of some fundamental quantum experiments such as photon polarization and spin measurement. We explore and compare the significance of the principle of maximum Shannon entropy and the principle of minimum Fisher information in solving such a game. The solution based on the principle of minimum Fisher information coincides with the solution based on an invariance principle, and provides an informational explanation of Malus' law for photon polarization. There is no (...)
    1 citation
  28. Darwinian fitness, evolutionary entropy and directionality theory. Klaus Dietz - 2005 - Bioessays 27 (11):1097-1101.
    Two recent articles [1, 2] provide computational and empirical validation of the following analytical fact: the outcome of competition between an invading genotype and that of a resident population is determined by the rate at which the population returns to its original size after a random perturbation. This phenomenon can be quantitatively described in terms of the demographic parameter termed “evolutionary entropy”, a measure of the variability in the age at which individuals produce offspring and die. The two articles (...)
    3 citations
  29. Aspects concerning entropy and utility. A. R. Hoseinzadeh, G. R. Mohtashami Borzadaran & G. H. Yari - 2012 - Theory and Decision 72 (2):273-285.
    The expected utility maximization problem is one of the most useful tools in mathematical finance, decision analysis and economics. Motivated by statistical model selection, via the principle of expected utility maximization, Friedman and Sandow (J Mach Learn Res 4:257–291, 2003a) considered the model performance question from the point of view of an investor who evaluates models based on the performance of the optimal strategies that the models suggest. They interpreted their performance measures in information theoretic terms and provided new generalizations of (...)
  30. Measuring Causal Invariance Formally. Pierrick Bourrat - 2021 - Entropy 23 (6):690.
    Invariance is one of several dimensions of causal relationships within the interventionist account. The more invariant a relationship between two variables, the more the relationship should be considered paradigmatically causal. In this paper, I propose two formal measures to estimate invariance, illustrated by a simple example. I then discuss the notion of invariance for causal relationships between non-nominal (i.e., ordinal and quantitative) variables, for which information theory, and hence the formalism proposed here, is not well suited. Finally, I propose how (...)
    2 citations
  31. Sustaining Action and Optimizing Entropy: Coupling Efficiency for Energy and the Sustainability of Global Ecosystems. Ivan R. Kennedy, Angus N. Crossan & Michael T. Rose - 2008 - Bulletin of Science, Technology and Society 28 (3):260-272.
    Consideration of the property of action is proposed to provide a more meaningful definition of efficient energy use and sustainable production in ecosystems. Action has physical dimensions similar to angular momentum, its magnitude varying with mass, spatial configuration and relative motion. In this article, the relationship of action to thermodynamic processes such as the spontaneous increase in entropy of the second law is explained and the utility of action for measuring changes in energy and material distribution is promoted. In (...)
  32. Clausius versus Sackur–Tetrode entropies. Thomas Oikonomou & G. Baris Bagci - 2013 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 44 (2):63-68.
    Based on the property of extensivity, we derive in a mathematically consistent manner the explicit expressions of the chemical potential μ and the Clausius entropy S for the case of monoatomic ideal gases in open systems within phenomenological thermodynamics. Neither information theoretic nor quantum mechanical statistical concepts are invoked in this derivation. Considering a specific expression of the constant term of S, the derived entropy coincides with the Sackur–Tetrode entropy in the thermodynamic limit. We demonstrate, however, (...)
  33. Aspects concerning entropy and utility. Gholam Reza Mohtashami Borzadaran - 2012 - Theory and Decision 72 (2):273-285.
    The expected utility maximization problem is one of the most useful tools in mathematical finance, decision analysis and economics. Motivated by statistical model selection, via the principle of expected utility maximization, Friedman and Sandow (J Mach Learn Res 4:257–291, 2003a) considered the model performance question from the point of view of an investor who evaluates models based on the performance of the optimal strategies that the models suggest. They interpreted their performance measures in information theoretic terms and provided new generalizations of (...)
  34. Uncertainty Reduction as a Measure of Cognitive Load in Sentence Comprehension. Stefan L. Frank - 2013 - Topics in Cognitive Science 5 (3):475-494.
    The entropy-reduction hypothesis claims that the cognitive processing difficulty on a word in sentence context is determined by the word's effect on the uncertainty about the sentence. Here, this hypothesis is tested more thoroughly than has been done before, using a recurrent neural network for estimating entropy and self-paced reading for obtaining measures of cognitive processing load. Results show a positive relation between reading time on a word and the reduction in entropy due to processing that word, (...)
    11 citations
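The entropy-reduction quantity itself is simple to state; the paper's contribution is estimating it with a recurrent neural network over realistic sentence distributions. A toy illustration with invented numbers:

```python
from math import log2

def entropy(ps):
    """Shannon entropy (bits) of a probability distribution."""
    return -sum(p * log2(p) for p in ps if p > 0)

# Before the next word: four sentence continuations, equally likely.
# The word rules out two of them, leaving two equally likely ones.
h_before = entropy([0.25, 0.25, 0.25, 0.25])  # 2 bits
h_after = entropy([0.5, 0.5])                 # 1 bit
print(h_before - h_after)  # 1.0 bit of uncertainty removed by this word
```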
  35. Bayesian model learning based on predictive entropy. Jukka Corander & Pekka Marttinen - 2006 - Journal of Logic, Language and Information 15 (1-2):5-20.
    The Bayesian paradigm has been widely acknowledged as a coherent approach to learning putative probability model structures from a finite class of candidate models. Bayesian learning is based on measuring the predictive ability of a model in terms of the corresponding marginal data distribution, which equals the expectation of the likelihood with respect to a prior distribution for model parameters. The main controversy related to this learning method stems from the necessity of specifying proper prior distributions for all unknown parameters of (...)
  36. On Classical and Quantum Logical Entropy. David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements (...)
  37. Measuring the complexity of the law: the United States Code. Daniel Martin Katz & M. J. Bommarito - 2014 - Artificial Intelligence and Law 22 (4):337-374.
    Einstein’s razor, a corollary of Ockham’s razor, is often paraphrased as follows: make everything as simple as possible, but not simpler. This rule of thumb describes the challenge that designers of a legal system face—to craft simple laws that produce desired ends, but not to pursue simplicity so far as to undermine those ends. Complexity, simplicity’s inverse, taxes cognition and increases the likelihood of suboptimal decisions. In addition, unnecessary legal complexity can drive a misallocation of human capital toward comprehending and (...)
    6 citations
  38. Dynamic Measurement Analysis of Urban Innovation Ability and Ecological Efficiency in China. Xing Li & Fuzhou Luo - 2022 - Complexity 2022:1-14.
    To establish an evaluation index system for urban innovation level and ecological efficiency, the entropy method is applied to measure the comprehensive index of each, a VAR model is established, and empirical measurement is used to study the internal relationship and dynamic development between urban innovation level and ecological efficiency. The empirical results show the following: the overall development of the innovation level in 30 cities in China is uneven; there is a large (...)
    1 citation
  39. On measurement and irreversible processes. Gunnar Sperber - 1974 - Foundations of Physics 4 (2):163-179.
    The nature of physical measurements performed on microscopic systems is discussed, and it is suggested that the procedures which are conventionally referred to as “measurements” fall into at least three different categories. The connection between observation processes and irreversible processes is stressed. The customary quantum mechanical treatment of irreversible processes is discussed, and its deficiencies from the philosophical point of view are criticized. The standpoint that quantum mechanics should not be considered as a basic philosophical system but rather as an (...)
  40. Information-Theoretic Measures Predict the Human Judgment of Rhythm Complexity. Remi Fleurian, Tim Blackwell, Oded Ben-Tal & Daniel Müllensiefen - 2017 - Cognitive Science 41 (3):800-813.
    To formalize the human judgment of rhythm complexity, we used five measures from information theory and algorithmic complexity to measure the complexity of 48 artificially generated rhythmic sequences. We compared these measurements to human prediction accuracy and easiness judgments obtained from a listening experiment, in which 32 participants guessed the last beat of each sequence. We also investigated the modulating effects of musical expertise and general pattern identification ability. Entropy rate and Kolmogorov complexity were correlated with prediction accuracy, (...)
    2 citations
  41. On Curvilinear Regression Analysis via Newly Proposed Entropies for Some Benzene Models. Guangwu Liu, Muhammad Kamran Siddiqui, Shazia Manzoor, Muhammad Naeem & Douhadji Abalo - 2022 - Complexity 2022:1-14.
    To avoid exorbitant and extensive laboratory experiments, QSPR analysis, based on topological descriptors, is a very constructive statistical approach for analyzing the numerous physical and chemical properties of compounds. Therefore, we presented some new entropy measures which are based on the sum of the neighborhood degree of the vertices. Firstly, we made the partition of the edges of benzene derivatives which are based on the degree sum of neighboring vertices and then computed the neighborhood version of entropies. Secondly, we (...)
  42. Quantifying privacy in terms of entropy for context aware services. Athanasios S. Voulodimos & Charalampos Z. Patrikakis - 2009 - Identity in the Information Society 2 (2):155-169.
    In this paper, we address the issue of privacy protection in context aware services, through the use of entropy as a means of measuring the capability of locating a user’s whereabouts and identifying personal selections. We present a framework for calculating levels of abstraction in location and personal preferences reporting in queries to a context aware services server. Finally, we propose a methodology for determining the levels of abstraction in location and preferences that should be applied in user data (...)
    1 citation
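Stripped of the paper's service framework, the core idea is that the entropy of an adversary's distribution over candidate locations quantifies privacy, and reporting at a coarser abstraction level raises it. A minimal sketch with invented numbers (not the authors' methodology):

```python
from math import log2

def privacy_bits(ps):
    """Entropy (bits) of the adversary's distribution over a user's
    candidate locations; higher means the user is harder to pinpoint."""
    return -sum(p * log2(p) for p in ps if p > 0)

print(privacy_bits([1.0]))          # 0.0 bits: exact cell disclosed
print(privacy_bits([1 / 16] * 16))  # 4.0 bits: only a 16-cell district disclosed
```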
  43. Uncertainty Relations for General Canonically Conjugate Observables in Terms of Unified Entropies. Alexey E. Rastegin - 2015 - Foundations of Physics 45 (8):923-942.
    We study uncertainty relations for a general class of canonically conjugate observables. It is known that such variables can be approached within a limiting procedure of the Pegg–Barnett type. We show that uncertainty relations for conjugate observables in terms of generalized entropies can be obtained on the basis of a genuinely finite-dimensional consideration. Due to the Riesz theorem, there exists an inequality between norm-like functionals of two probability distributions in finite dimensions. Using a limiting procedure of the Pegg–Barnett type, we take (...)
  44. Complexity of grammatical metaphor: an entropy-based approach. Jiangping Zhou - 2023 - Semiotica 2023 (252):173-185.
    Grammatical metaphor in M. A. K. Halliday's sense has long been investigated extensively in both theoretical and empirical studies. The empirical studies have predominantly employed observed or normalized frequencies of grammatical metaphor to uncover its distribution in different text types. Few studies, however, have quantitatively examined the complexity of grammatical metaphor, as no indicator has yet been proposed to measure its degree of complexity. This paper aims to investigate the (...)
  45. Modeling Urban Growth and Form with Spatial Entropy. Yanguang Chen - 2020 - Complexity 2020:1-14.
    Entropy is one of the physical bases for the definition of fractal dimension, and the generalized fractal dimension is defined in terms of Renyi entropy. Using the fractal dimension, we can describe urban growth and form and characterize spatial complexity. A number of fractal models and measurements have been proposed for urban studies. However, the precondition for fractal dimension application is to find scaling relations in cities. In the absence of the scaling property, we can make use of the entropy (...)
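The link the abstract appeals to is the standard one between Renyi entropy and the generalized (multifractal) dimension. For a spatial measure partitioned into boxes of size ε with box probabilities p_i(ε):

```latex
H_q(\varepsilon) \;=\; \frac{1}{1-q}\,\ln\sum_i p_i(\varepsilon)^{\,q},
\qquad
D_q \;=\; \lim_{\varepsilon\to 0}\frac{H_q(\varepsilon)}{\ln(1/\varepsilon)} ,
```

with q → 1 recovering Shannon entropy and the information dimension. When no scaling relation holds, D_q is undefined but the fixed-scale entropy H_q(ε) remains usable, which appears to be the substitution the abstract describes.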
  46. Parametric scaling from species relative abundances to absolute abundances in the computation of biological diversity: A first proposal using Shannon's entropy. Carlo Ricotta - 2003 - Acta Biotheoretica 51 (3):181-188.
    Traditional diversity measures such as the Shannon entropy are generally computed from the species' relative abundance vector of a given community to the exclusion of species' absolute abundances. In this paper, I first mention some examples where the total information content associated with a given community may be more adequate than Shannon's average information content for a better understanding of ecosystem functioning. Next, I propose a parametric measure of statistical information that contains both Shannon's entropy and total (...)
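The contrast drawn here is easy to state computationally: the usual Shannon index is the average information per individual, a function of relative abundances only, while the total information content also scales with community size. A minimal sketch with toy abundances (Ricotta's parametric family interpolating between the two endpoints is not reproduced):

```python
from math import log

def shannon_index(abundances):
    """Shannon entropy (nats) of the relative-abundance vector."""
    n = sum(abundances)
    return -sum((a / n) * log(a / n) for a in abundances if a > 0)

community = [50, 30, 20]          # absolute species abundances
h = shannon_index(community)      # average information per individual
total = sum(community) * h        # total information content, N * H
# A community of [500, 300, 200] has the same h but 10x the total content.
print(round(h, 4), round(total, 2))
```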
  47. LMC and SDL Complexity Measures: A Tool to Explore Time Series. José Roberto C. Piqueira & Sérgio Henrique Vannucchi Leme de Mattos - 2019 - Complexity 2019:1-8.
    This work is a generalization of the López-Ruiz, Mancini, and Calbet (LMC) and Shiner, Davison, and Landsberg (SDL) complexity measures, considering that the state of a system or process is represented by a continuous temporal series of a dynamical variable. As the two complexity measures are based on the calculation of informational entropy, an equivalent information source is defined by using partitions of the dynamical variable range. During the time intervals, the information associated with the measured dynamical variable is (...)
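Both families generalized in this paper have simple closed forms in the original discrete setting: LMC complexity multiplies normalized entropy (disorder) by disequilibrium, the squared distance from the uniform distribution, while a common SDL form is disorder times order, so both vanish at perfect order and at perfect disorder. A sketch with invented distributions:

```python
from math import log

def disorder(ps):
    """Shannon entropy normalized by its maximum log(N), in [0, 1]."""
    return -sum(p * log(p) for p in ps if p > 0) / log(len(ps))

def lmc(ps):
    """Lopez-Ruiz/Mancini/Calbet complexity C = H * D, where D is the
    squared distance of p from the uniform distribution."""
    n = len(ps)
    return disorder(ps) * sum((p - 1.0 / n) ** 2 for p in ps)

def sdl(ps, alpha=1.0, beta=1.0):
    """Shiner/Davison/Landsberg complexity: disorder**alpha * order**beta."""
    h = disorder(ps)
    return h ** alpha * (1.0 - h) ** beta

for ps in ([1.0, 0.0, 0.0], [1 / 3] * 3, [0.6, 0.3, 0.1]):
    print(round(lmc(ps), 4), round(sdl(ps), 4))  # both are 0 at the extremes
```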
  48. Competing Definitions of Information Versus Entropy in Physics. Thomas Durt - 2011 - Foundations of Science 16 (4):315-318.
    As was mentioned by Nicolas Lori in his (Found Sci, 2010) commentary, the definition of Information in Physics is something about which not all authors agree. According to physicists like me, Information decreases when Entropy increases (so entropy would be a negative measure of information), while many physicists, seemingly the majority of them, are convinced of the contrary (even in the camp of Quantum Information Theoreticians). In this reply I reproduce, and make more precise, some of (...)
    1 citation
  49. A Quantitative Approach to Measuring Assurance with Uncertainty in Data Provenance. Stephen Bush, Moitra F., Crapo Abha, Barnett Andrew, Dill Bruce & J. Stephen - manuscript
    A data provenance framework is subject to security threats and risks, which increase the uncertainty, or lack of trust, in provenance information. Information assurance is challenged by incomplete information; one cannot exhaustively characterize all threats or all vulnerabilities. One technique that specifically incorporates a probabilistic notion of uncertainty is subjective logic. Subjective logic allows belief and uncertainty, due to incomplete information, to be specified and operated upon in a coherent manner. A mapping from the standard definition of information assurance to (...)
  50. Complexity Measures for Maxwell–Boltzmann Distribution. Nicholas Smaal & José Roberto C. Piqueira - 2021 - Complexity 2021:1-6.
    This work presents a discussion about the application of the Kolmogorov; López-Ruiz, Mancini, and Calbet (LMC); and Shiner, Davison, and Landsberg (SDL) complexity measures to a common situation in physics described by the Maxwell–Boltzmann distribution. The first idea about a complexity measure started in computer science and was proposed by Kolmogorov, calculated similarly to the informational entropy. The Kolmogorov measure, when applied to natural phenomena, presents higher values associated with disorder and lower ones with order. However, it is considered that high (...)
Results 1–50 of 991