Results for 'Maximum entropy'

999 found
1. Maximum Entropy Applied to Inductive Logic and Reasoning.Jürgen Landes & Jon Williamson (eds.) - 2015 - Ludwig-Maximilians-Universität München.
    This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.
    1 citation
2. Maximum Entropy Inference with Quantified Knowledge.Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
    9 citations
3. Maximum Entropy and Probability Kinematics Constrained by Conditionals.Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates (...)
    1 citation
4. Probabilistic stability, AGM revision operators and maximum entropy.Krzysztof Mierzewski - 2020 - Review of Symbolic Logic:1-38.
    Several authors have investigated the question of whether canonical logic-based accounts of belief revision, and especially the theory of AGM revision operators, are compatible with the dynamics of Bayesian conditioning. Here we show that Leitgeb's stability rule for acceptance, which has been offered as a possible solution to the Lottery paradox, allows us to bridge AGM revision and Bayesian update: using the stability rule, we prove that AGM revision operators emerge from Bayesian conditioning by an application of the principle of maximum entropy. In situations of information loss, or whenever the agent relies on a qualitative description of her information state - such as a plausibility ranking over hypotheses, or a belief set - the dynamics of AGM belief revision are compatible with Bayesian conditioning; indeed, through the maximum entropy principle, conditioning naturally generates AGM revision operators. This mitigates an impossibility theorem of Lin and Kelly for tracking Bayesian conditioning with AGM revision, and suggests an approach to the compatibility problem that highlights the information loss incurred by acceptance rules in passing from probabilistic to qualitative representations of beliefs.
    4 citations
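
The stability rule mentioned above can be made concrete in a small finite setting. Below is an illustrative sketch, not Mierzewski's own construction, assuming the standard threshold-1/2 formulation of Leitgeb's rule: a hypothesis H (a set of worlds) is stable iff P(H | E) > 1/2 for every evidence set E that overlaps H and has positive probability. The world probabilities are invented.

```python
# Illustrative sketch of Leitgeb's stability rule (threshold 1/2); the toy
# probability space below is an assumption, not taken from the paper.
from itertools import combinations

P = {"w1": 0.6, "w2": 0.2, "w3": 0.15, "w4": 0.05}

def prob(s):
    return sum(P[w] for w in s)

def is_stable(H):
    """H is stable iff P(H | E) > 1/2 for all E overlapping H with P(E) > 0."""
    worlds = list(P)
    for r in range(1, len(worlds) + 1):
        for E in map(set, combinations(worlds, r)):
            if E & H and prob(E) > 0 and prob(H & E) / prob(E) <= 0.5:
                return False
    return True

print(is_stable({"w1"}))  # True: P(w1 | E) exceeds 1/2 under every evidence set
print(is_stable({"w2"}))  # False: e.g. P(w2 | {w1, w2}) = 0.25
```
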
5. In defense of the maximum entropy inference process.J. Paris & A. Vencovská - 1997 - International Journal of Approximate Reasoning 17 (1):77-103.
    This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the maximum entropy inference process, ME, is the only inference process respecting “common sense.” This result was criticized on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. (...)
    23 citations
6. Maximum entropy inference as a special case of conditionalization.Brian Skyrms - 1985 - Synthese 63 (1):55-74.
  7. Can the maximum entropy principle be explained as a consistency requirement?Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical (...)
    27 citations
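
Since many entries here turn on how maximum entropy inference actually operates, a minimal numerical sketch may help. It assumes Jaynes's familiar Brandeis dice setting (a six-sided die whose expected roll is constrained to 4.5) and scipy's constrained optimizer; both choices are illustrative and not drawn from any of the papers listed.

```python
# Maximum entropy inference: maximize H(p) = -sum_i p_i ln p_i over die
# distributions, subject to the partial information E[X] = 4.5.
import numpy as np
from scipy.optimize import minimize

faces = np.arange(1, 7)
target_mean = 4.5

def neg_entropy(p):
    return np.sum(p * np.log(p))  # minimizing this maximizes Shannon entropy

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},            # normalization
    {"type": "eq", "fun": lambda p: p @ faces - target_mean},  # mean constraint
]
res = minimize(neg_entropy, np.full(6, 1 / 6), bounds=[(1e-9, 1)] * 6,
               constraints=constraints)
print(res.x.round(4))  # mass shifts toward high faces; uniform if E[X] were 3.5
```
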
  8. The principle of maximum entropy and a problem in probability kinematics.Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of MAXENT. This article shows that an intuitive approach to Judy Benjamin’s case supports (...)
    3 citations
9. Maximum-entropy spectral analysis of extended energy-loss fine structure and its application to time-resolved measurement.Shunsuke Muto - 2004 - Philosophical Magazine 84 (25-26):2793-2808.
  10. Objective Bayesianism and the maximum entropy principle.Jürgen Landes & Jon Williamson - 2013 - Entropy 15 (9):3528-3591.
    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three (...)
    17 citations
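
A schematic statement of the principle this abstract invokes, with notation assumed here (Ω a finite set of basic possibilities, 𝔼 the set of probability functions calibrated to the evidence):

```latex
P^{\dagger} \;=\; \operatorname*{arg\,max}_{P \,\in\, \mathbb{E}} H(P),
\qquad
H(P) \;=\; -\sum_{\omega \in \Omega} P(\omega) \log P(\omega).
```
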
11. Common sense and maximum entropy.Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this (...)
    20 citations
12. Maximum power and maximum entropy production: finalities in nature.Stanley Salthe - 2010 - Cosmos and History 6 (1):114-121.
    I begin with the definition of power, and find that it is finalistic inasmuch as work directs energy dissipation in the interests of some system. The maximum power principle of Lotka and Odum implies an optimal energy efficiency for any work; optima are also finalities. I advance a statement of the maximum entropy production principle, suggesting that most work of dissipative structures is carried out at rates entailing energy flows faster than those that would associate with maximum power. This is finalistic in the sense that the out-of-equilibrium universe, taken as an isolated system, entrains work in the interest of global thermodynamic equilibration. I posit an evolutionary scenario, with a development on Earth from abiotic times, when promoting convective energy flows could be viewed as the important function of dissipative structures, to biotic times when the preservation of living dissipative structures was added to the teleology. Dissipative structures are required by the equilibrating universe to enhance local energy gradient dissipation.
    6 citations
  13. Analysis of the maximum entropy principle “debate”.John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
    Jaynes's maximum entropy principle (MEP) is analyzed by considering in detail a recent controversy. Emphasis is placed on the inductive logical interpretation of “probability” and the concept of “total knowledge.” The relation of the MEP to relative frequencies is discussed, and a possible realm of its fruitful application is noted.
    4 citations
14. Is the maximum entropy production just a heuristic principle? Metaphysics on natural determination.Javier Sánchez-Cañizares - 2023 - Synthese 201 (4):1-15.
    The Maximum Entropy Production Principle (MEPP) stands out as an overarching principle that rules life phenomena in Nature. However, its explanatory power beyond heuristics remains controversial. On the one hand, the MEPP has been successfully applied principally to non-living systems far from thermodynamic equilibrium. On the other hand, the underlying assumptions to lay the MEPP’s theoretical foundations and range of applicability increase the possibilities of conflicting interpretations. More interestingly, from a metaphysical stance, the MEPP’s philosophical status is hotly (...)
15. Model and Simulation of Maximum Entropy Phrase Reordering of English Text in Language Learning Machine.Weifang Wu - 2020 - Complexity 2020:1-9.
    This paper proposes a feature extraction algorithm based on the maximum entropy phrase reordering model in statistical machine translation in language learning machines. The algorithm can extract more accurate phrase reordering information, especially the feature information of reversed phrases, which solves the problem of imbalance of feature data during maximum entropy training in the original algorithm, and improves the accuracy of phrase reordering in translation. In the experiment, they were combined with linguistic features such as parts (...)
16. Causal versions of maximum entropy and principle of insufficient reason.Dominik Janzing - 2021 - Journal of Causal Inference 9 (1):285-301.
    The principle of insufficient reason (PIR) assigns equal probabilities to each alternative of a random experiment whenever there is no reason to prefer one over the other. The maximum entropy principle generalizes PIR to the case where statistical information like expectations is given. It is known that both principles result in paradoxical probability updates for joint distributions of cause and effect. This is because constraints on the conditional P(effect | cause) result in changes of P(cause) that assign higher probability (...)
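
The kind of paradoxical update the abstract refers to can be checked numerically. The sketch below assumes an invented setup, a binary cause C and effect E with the single constraint P(E=1 | C=1) = 0.9: maximizing the entropy of the joint distribution under this purely conditional constraint shifts the marginal P(C=1) away from the uniform 1/2.

```python
# Maximum entropy on a joint P(C, E) under a constraint on P(E=1 | C=1);
# the constraint value 0.9 is an illustrative assumption.
import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    return np.sum(p * np.log(p))

# p = [p(C=0,E=0), p(C=0,E=1), p(C=1,E=0), p(C=1,E=1)]
cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    # P(E=1 | C=1) = 0.9  <=>  p(1,1) = 0.9 * (p(1,0) + p(1,1))
    {"type": "eq", "fun": lambda p: p[3] - 0.9 * (p[2] + p[3])},
]
res = minimize(neg_entropy, np.full(4, 0.25), bounds=[(1e-9, 1)] * 4,
               constraints=cons)
print("P(C=1) =", res.x[2] + res.x[3])  # ~0.41, no longer the uniform 0.5
```
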
17. Explaining default intuitions using maximum entropy.Rachel A. Bourne - 2003 - Journal of Applied Logic 1 (3-4):255-271.
  18. Bertrand's Paradox and the Maximum Entropy Principle.Nicholas Shackel & Darrell P. Rowbottom - 2019 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the (...)
    2 citations
  19. The constraint rule of the maximum entropy principle.Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates (...)
    21 citations
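
Schematically, and with notation assumed rather than quoted from the paper: given observations x_1, ..., x_n of a quantity f, the constraint rule equates the expectation of f under the sought distribution with the observed sample average, and the maximum entropy distribution is then chosen subject to that constraint:

```latex
\sum_{i} p_i \, f(a_i) \;=\; \frac{1}{n} \sum_{k=1}^{n} f(x_k),
```

where the a_i are the possible outcomes and the p_i their sought probabilities.
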
20. The W systems: between maximum entropy and minimal ranking….Michael Freund - 1994 - Journal of Applied Non-Classical Logics 4 (1):79-90.
21. Application of the maximum entropy principle to nonlinear systems far from equilibrium.H. Haken - 1993 - In Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. pp. 239.
22. Rumor Identification with Maximum Entropy in MicroNet.Suisheng Yu, Mingcai Li & Fengming Liu - 2017 - Complexity:1-8.
23. A Novel Chinese Entity Relationship Extraction Method Based on the Bidirectional Maximum Entropy Markov Model.Chengyao Lv, Deng Pan, Yaxiong Li, Jianxin Li & Zong Wang - 2021 - Complexity 2021:1-8.
    To identify relationships among entities in natural language texts, extraction of entity relationships technically provides a fundamental support for knowledge graph, intelligent information retrieval, and semantic analysis, promotes the construction of knowledge bases, and improves efficiency of searching and semantic analysis. Traditional methods of relationship extraction, either those proposed at the earlier times or those based on traditional machine learning and deep learning, have focused on keeping relationships and entities in their own silos: extracting relationships and entities are conducted in (...)
    1 citation
  24. Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy.J. E. Shore & R. W. Johnson - 1980 - IEEE Transactions on Information Theory:26-37.
25. Combining probabilistic logic programming with the power of maximum entropy.Gabriele Kern-Isberner & Thomas Lukasiewicz - 2004 - Artificial Intelligence 157 (1-2):139-202.
26. Enriching the knowledge sources used in a maximum entropy part-of-speech tagger.Kristina Toutanova & Christopher D. Manning - manuscript
    4 citations
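
At its core, a maximum entropy tagger of this kind is a multinomial logistic regression over contextual features. The toy sketch below rests on that reading; the feature templates and six-token training set are invented, and scikit-learn's estimator stands in for the paper's own.

```python
# Toy maximum entropy (multinomial logistic regression) POS tagger; the data
# and features are illustrative assumptions, not the paper's knowledge sources.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train = [
    ({"word": "the", "prev": "<s>", "suffix2": "he"}, "DT"),
    ({"word": "dog", "prev": "the", "suffix2": "og"}, "NN"),
    ({"word": "barks", "prev": "dog", "suffix2": "ks"}, "VBZ"),
    ({"word": "a", "prev": "<s>", "suffix2": "a"}, "DT"),
    ({"word": "cat", "prev": "a", "suffix2": "at"}, "NN"),
    ({"word": "sleeps", "prev": "cat", "suffix2": "ps"}, "VBZ"),
]
X, y = zip(*train)
tagger = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
tagger.fit(X, y)
# An unseen word: the prev="a" cue should push the prediction toward "NN".
print(tagger.predict([{"word": "bird", "prev": "a", "suffix2": "rd"}]))
```
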
  27. The status of the principle of maximum entropy.Abner Shimony - 1985 - Synthese 63 (1):35 - 53.
28. Vehicle Text Data Compression and Transmission Method Based on Maximum Entropy Neural Network and Optimized Huffman Encoding Algorithms.Jingfeng Yang, Zhenkun Zhang, Nanfeng Zhang, Ming Li, Yanwei Zheng, Li Wang, Yong Li, Ji Yang, Yifei Xiang & Yu Zhang - 2019 - Complexity 2019:1-9.
    1 citation
29. A fuzzy neuron based upon maximum entropy ordered weighted averaging.Michael O'Hagan - 1991 - In B. Bouchon-Meunier, R. R. Yager & L. A. Zadeh (eds.), Uncertainty in Knowledge Bases. Springer. pp. 598-609.
30. A look back: Early applications of maximum entropy estimation to quantum statistical mechanics.D. J. Scalapino - 1993 - In Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. pp. 9.
31. First-order probabilistic conditional logic and maximum entropy.J. Fisseler - 2012 - Logic Journal of the IGPL 20 (5):796-830.
32. How to exploit parametric uniformity for maximum entropy reasoning in a relational probabilistic logic.Marc Finthammer & Christoph Beierle - 2012 - In Luis Farinas del Cerro, Andreas Herzig & Jerome Mengin (eds.), Logics in Artificial Intelligence. Springer. pp. 189-201.
33. Noise induced phase transition between maximum entropy production structures and minimum entropy production structures?Alfred Hubler, Andrey Belkin & Alexey Bezryadin - 2015 - Complexity 20 (3):8-11.
34. Quantum Model of Classical Mechanics: Maximum Entropy Packets. [REVIEW]P. Hájíček - 2009 - Foundations of Physics 39 (9):1072-1096.
    In a previous paper, a statistical method of constructing quantum models of classical properties has been described. The present paper concludes the description by turning to classical mechanics. The quantum states that maximize entropy for given averages and variances of coordinates and momenta are called ME packets. They generalize the Gaussian wave packets. A non-trivial extension of the partition-function method of probability calculus to quantum mechanics is given. Non-commutativity of quantum variables limits its usefulness. Still, the general form of (...)
    4 citations
  35. Maximum Shannon Entropy, Minimum Fisher Information, and an Elementary Game.Shunlong Luo - 2002 - Foundations of Physics 32 (11):1757-1772.
    We formulate an elementary statistical game which captures the essence of some fundamental quantum experiments such as photon polarization and spin measurement. We explore and compare the significance of the principle of maximum Shannon entropy and the principle of minimum Fisher information in solving such a game. The solution based on the principle of minimum Fisher information coincides with the solution based on an invariance principle, and provides an informational explanation of Malus' law for photon polarization. There is (...)
    1 citation
36. The Entropy-Limit (Conjecture) for Σ₂-Premisses.Jürgen Landes - 2020 - Studia Logica 109 (2):423-442.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same (...)
    2 citations
37. Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem.Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, when updating on information about an event’s conditional probability, the maximum entropy distribution will almost always assign to the conditioning event a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem (...)
    3 citations
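
The puzzling fact the abstract describes can be verified numerically. Assuming the usual three-cell rendering of the case, cells {Blue, Red-HQ, Red-2nd} with the learned constraint P(HQ | Red) = 3/4, the maximum entropy distribution gives the conditioning event Red probability about 0.637, strictly less than the 2/3 the uniform distribution assigns it.

```python
# Judy Benjamin check (three-cell setup assumed): maximize entropy subject to
# P(R1 | R) = 3/4, where R = R1 or R2, and compare P(R) with the uniform 2/3.
import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    return np.sum(p * np.log(p))

cons = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    # P(R1 | R) = 3/4  <=>  p[R1] = 3 * p[R2]
    {"type": "eq", "fun": lambda p: p[1] - 3.0 * p[2]},
]
res = minimize(neg_entropy, np.full(3, 1 / 3), bounds=[(1e-9, 1)] * 3,
               constraints=cons)   # p = [P(Blue), P(R1), P(R2)]
print("P(R) =", res.x[1] + res.x[2])  # ~0.637 < 2/3
```
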
38. Towards the entropy-limit conjecture.Jürgen Landes, Soroush Rafiee Rad & Jon Williamson - 2020 - Annals of Pure and Applied Logic 172 (2):102870.
    The maximum entropy principle is widely used to determine non-committal probabilities on a finite domain, subject to a set of constraints, but its application to continuous domains is notoriously problematic. This paper concerns an intermediate case, where the domain is a first-order predicate language. Two strategies have been put forward for applying the maximum entropy principle on such a domain: applying it to finite sublanguages and taking the pointwise limit of the resulting probabilities as the size (...)
    3 citations
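
In schematic form, with notation assumed here rather than quoted from the papers: write P_n for the maximum entropy function on the n-th finite sublanguage satisfying the premisses, and P† for the function of maximal entropy, in the comparative sense, defined on the whole language. The conjecture discussed in this entry and in entry 36 is that the two strategies agree,

```latex
P^{\dagger}(\varphi) \;=\; \lim_{n \to \infty} P_{n}(\varphi),
```

for quantifier-free sentences φ, whenever both sides are defined.
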
39. Entropy and Vacuum Radiation.Jean E. Burns - 1998 - Foundations of Physics 28 (7):1191-1207.
    It is shown that entropy increase in thermodynamic systems can plausibly be accounted for by the random action of vacuum radiation. A recent calculation by Rueda using stochastic electrodynamics (SED) shows that vacuum radiation causes a particle to undergo a rapid Brownian motion about its average dynamical trajectory. It is shown that the magnitude of spatial drift calculated by Rueda can also be predicted by assuming that the average magnitudes of random shifts in position and momentum of a particle (...)
    2 citations
40. Entropy and sign conventions.G. M. Anderson - 2023 - Foundations of Chemistry 25 (1):119-125.
    It is a fundamental cornerstone of thermodynamics that entropy (S_{U,V}) increases in spontaneous processes in isolated systems (often called closed or thermally closed systems when the transfer of energy as work is considered to be negligible) and achieves a maximum when the system reaches equilibrium. But with a different sign convention entropy could just as well be said to decrease to a minimum in spontaneous constant U, V processes. (...)
41. Information, entropy and inductive logic.S. Pakswer - 1954 - Philosophy of Science 21 (3):254-259.
    It has been shown by several authors that in operations involving information a quantity appears which is the negative of the quantity usually defined as entropy in similar situations. This quantity ℜ = − KI has been termed “negentropy” and it has been shown that the negentropy of information and the physical entropy S are mirrorlike representations of the same train of events. In physical terminology the energy is degraded by an increase in entropy due to an (...)
42. Predictive Statistical Mechanics and Macroscopic Time Evolution: Hydrodynamics and Entropy Production.Domagoj Kuić - 2016 - Foundations of Physics 46 (7):891-914.
    In the previous papers, it was demonstrated that applying the principle of maximum information entropy by maximizing the conditional information entropy, subject to the constraint given by the Liouville equation averaged over the phase space, leads to a definition of the rate of entropy change for closed Hamiltonian systems without any additional assumptions. Here, we generalize this basic model and, with the introduction of the additional constraints which are equivalent to the hydrodynamic continuity equations, show that (...)
43. Newtonian Dynamics from the Principle of Maximum Caliber.Diego González, Sergio Davis & Gonzalo Gutiérrez - 2014 - Foundations of Physics 44 (9):923-931.
    The foundations of Statistical Mechanics can be recovered almost in their entirety from the principle of maximum entropy. In this work we show that its non-equilibrium generalization, the principle of maximum caliber (Jaynes, Phys Rev 106:620–630, 1957), when applied to the unknown trajectory followed by a particle, leads to Newton’s second law under two quite intuitive assumptions (both the expected square displacement in one step and the spatial probability distribution of the particle are known at all times). (...)
    1 citation
44. Spencer-Brown vs. Probability and Statistics: Entropy’s Testimony on Subjective and Objective Randomness.Julio Michael Stern - 2011 - Information 2 (2):277-301.
    This article analyzes the role of entropy in Bayesian statistics, focusing on its use as a tool for detection, recognition and validation of eigen-solutions. “Objects as eigen-solutions” is a key metaphor of the cognitive constructivism epistemological framework developed by the philosopher Heinz von Foerster. Special attention is given to some objections to the concepts of probability, statistics and randomization posed by George Spencer-Brown, a figure of great influence in the field of radical constructivism.
    2 citations
45. New Foundations for Information Theory: Logical Entropy and Shannon Entropy.David Ellerman - 2021 - Springer Verlag.
    This monograph offers a new foundation for information theory that is based on the notion of information-as-distinctions, being directly measured by logical entropy, and on the re-quantification as Shannon entropy, which is the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable so they represent the pre-probability notion of information. Then (...)
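
A quick contrast of the two measures in the title, under their standard definitions (logical entropy as the probability that two independent draws from p fall in different blocks; Shannon entropy in bits); the example distribution is invented:

```python
import numpy as np

def logical_entropy(p):
    # h(p) = 1 - sum_i p_i^2: chance two independent draws are distinct
    return 1.0 - np.sum(np.square(p))

def shannon_entropy(p):
    # H(p) = -sum_i p_i log2 p_i: expected code length in bits
    return -np.sum(p * np.log2(p))

p = np.array([0.5, 0.25, 0.25])
print(logical_entropy(p))  # 0.625
print(shannon_entropy(p))  # 1.5
```
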
46. Statistical models of syntax learning and use.Mark Johnson & Stefan Riezler - 2002 - Cognitive Science 26 (3):239-253.
    This paper shows how to define probability distributions over linguistically realistic syntactic structures in a way that permits us to define language learning and language comprehension as statistical problems. We demonstrate our approach using lexical‐functional grammar (LFG), but our approach generalizes to virtually any linguistic theory. Our probabilistic models are maximum entropy models. In this paper we concentrate on statistical inference procedures for learning the parameters that define these probability distributions. We point out some of the practical problems (...)
    3 citations
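
The maximum entropy models the abstract refers to are log-linear: a candidate structure x receives probability proportional to exp(sum_j lambda_j * f_j(x)). A minimal sketch, with invented feature counts and weights:

```python
import numpy as np

features = np.array([        # rows: candidate parses, columns: feature counts
    [1.0, 0.0, 2.0],
    [0.0, 1.0, 1.0],
    [1.0, 1.0, 0.0],
])
weights = np.array([0.5, -0.2, 0.8])    # one lambda per feature

scores = features @ weights
probs = np.exp(scores - scores.max())   # subtract max for numerical stability
probs /= probs.sum()
print(probs)                            # normalized parse probabilities
```
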
47. A Bayesian Interpretation of First-Order Phase Transitions.Sergio Davis, Joaquín Peralta, Yasmín Navarrete, Diego González & Gonzalo Gutiérrez - 2016 - Foundations of Physics 46 (3):350-359.
    In this work we review the formalism used in describing the thermodynamics of first-order phase transitions from the point of view of maximum entropy inference. We present the concepts of transition temperature, latent heat and entropy difference between phases as emergent from the more fundamental concept of internal energy, after a statistical inference analysis. We explicitly demonstrate this point of view by making inferences on a simple game, resulting in the same formalism as in thermodynamical phase transitions. (...)
48. Carnap on Entropy.Abner Shimony - 1975 - In Jaakko Hintikka (ed.), Rudolf Carnap, Logical Empiricist: Materials and Perspectives. D. Reidel Pub. Co. pp. 381.
49. Phonological Concept Learning.Elliott Moreton, Joe Pater & Katya Pertsova - 2017 - Cognitive Science 41 (1):4-69.
    Linguistic and non-linguistic pattern learning have been studied separately, but we argue for a comparative approach. Analogous inductive problems arise in phonological and visual pattern learning. Evidence from three experiments shows that human learners can solve them in analogous ways, and that human performance in both cases can be captured by the same models. We test GMECCS, an implementation of the Configural Cue Model in a Maximum Entropy phonotactic-learning framework with a single free parameter, against the alternative hypothesis (...)
    2 citations
1 — 49 / 999