Results for 'Maximum entropy modeling'

997 found
  1. Modeling the Past Hypothesis: A Mechanical Cosmology. Jordan Scharnhorst & Anthony Aguirre - 2023 - Foundations of Physics 54 (1):1-24.
    There is a paradox in the standard model of cosmology. How can matter in the early universe have been in thermal equilibrium, indicating maximum entropy, but the initial state also have been low entropy (the “past hypothesis”), so as to underpin the second law of thermodynamics? The problem has been highly contested, with the only consensus being that gravity plays a role in the story, but with the exact mechanism undecided. In this paper, we construct a well-defined (...)
  2. Statistical models of syntax learning and use. Mark Johnson & Stefan Riezler - 2002 - Cognitive Science 26 (3):239-253.
    This paper shows how to define probability distributions over linguistically realistic syntactic structures in a way that permits us to define language learning and language comprehension as statistical problems. We demonstrate our approach using lexical‐functional grammar (LFG), but our approach generalizes to virtually any linguistic theory. Our probabilistic models are maximum entropy models. In this paper we concentrate on statistical inference procedures for learning the parameters that define these probability distributions. We point out some of the practical problems (...)
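    Since the abstract names the model family but the listing truncates the details, here is a minimal sketch of a conditional maximum entropy (log-linear) model of the general kind described, trained by gradient ascent on the log-likelihood. The feature vectors and the tiny three-parse candidate set are hypothetical, not taken from Johnson & Riezler's LFG experiments.

```python
import numpy as np

def maxent_probs(weights, features):
    """P(y | x) proportional to exp(w . f(x, y)) over candidate parses y."""
    scores = features @ weights          # one score per candidate parse
    scores -= scores.max()               # subtract max for numerical stability
    p = np.exp(scores)
    return p / p.sum()

# Hypothetical data: 3 candidate parses for one sentence, 2 features each.
F = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
gold = 2                                 # index of the observed (correct) parse
w = np.zeros(2)

# Maximum likelihood for a log-linear model: the gradient is
# (observed feature counts) - (expected feature counts under the model).
for _ in range(200):
    p = maxent_probs(w, F)
    w += 0.5 * (F[gold] - p @ F)

print(maxent_probs(w, F))                # mass concentrates on the gold parse
```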
  3. Maximum Entropy Inference with Quantified Knowledge. Owen Barnett & Jeff Paris - 2008 - Logic Journal of the IGPL 16 (1):85-98.
    We investigate uncertain reasoning with quantified sentences of the predicate calculus treated as the limiting case of maximum entropy inference applied to finite domains.
  4. Maximum Entropy and Probability Kinematics Constrained by Conditionals. Stefan Lukits - 2015 - Entropy 17 (4):1690-1700.
    Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (pme) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey’s updating principle (jup) contradicts pme? Majerník shows that pme provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether pme also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates (...)
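    For reference, the principle abbreviated pme above has a compact standard formulation (textbook notation, not Lukits's own): among all distributions $p$ in the set $C$ of distributions compatible with the given constraints, select

    $$ p^{*} = \arg\max_{p \in C} H(p), \qquad H(p) = -\sum_{i} p_{i} \log p_{i}. $$

    The debates in this entry and several entries below concern which constraint sets $C$ make this selection reasonable.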
  5. 3D Face Modeling Algorithm for Film and Television Animation Based on Lightweight Convolutional Neural Network. Cheng Di, Jing Peng, Yihua Di & Siwei Wu - 2021 - Complexity 2021:1-10.
    Through the analysis of facial feature extraction technology, this paper designs a lightweight convolutional neural network. The LW-CNN model adopts a separable convolution structure, which can produce more accurate features with fewer parameters and can extract 3D feature points of a human face. In order to enhance the accuracy of feature extraction, a face detection method based on the inverted triangle structure is used to detect the face frame of the images in the training set before the model extracts the (...)
  6. In defense of the maximum entropy inference process. J. Paris & A. Vencovská - 1997 - International Journal of Approximate Reasoning 17 (1):77-103.
    This paper is a sequel to an earlier result of the authors that in making inferences from certain probabilistic knowledge bases the maximum entropy inference process, ME, is the only inference process respecting “common sense.” This result was criticized on the grounds that the probabilistic knowledge bases considered are unnatural and that ignorance of dependence should not be identified with statistical independence. We argue against these criticisms and also against the more general criticism that ME is representation dependent. (...)
  7. Maximum entropy inference as a special case of conditionalization. Brian Skyrms - 1985 - Synthese 63 (1):55-74.
  8. Analysis of the maximum entropy principle “debate”. John F. Cyranski - 1978 - Foundations of Physics 8 (5-6):493-506.
    Jaynes's maximum entropy principle (MEP) is analyzed by considering in detail a recent controversy. Emphasis is placed on the inductive logical interpretation of “probability” and the concept of “total knowledge.” The relation of the MEP to relative frequencies is discussed, and a possible realm of its fruitful application is noted.
  9. Can the maximum entropy principle be explained as a consistency requirement? Jos Uffink - 1995 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 26 (3):223-261.
    The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical (...)
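    The generalization Uffink mentions, minimum information (maximum relative entropy), is standardly stated as follows (generic notation, not the paper's): given a prior $p$ and a set $C$ of distributions compatible with the new information, choose

    $$ q^{*} = \arg\min_{q \in C} D(q \,\|\, p), \qquad D(q \,\|\, p) = \sum_{i} q_{i} \log \frac{q_{i}}{p_{i}}, $$

    which reduces to ordinary maximum entropy when the prior $p$ is uniform.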
  10. Maximum Entropy Applied to Inductive Logic and Reasoning. Jürgen Landes & Jon Williamson (eds.) - 2015 - Ludwig-Maximilians-Universität München.
    This editorial explains the scope of the special issue and provides a thematic introduction to the contributed papers.
  11. Common sense and maximum entropy. Jeff Paris - 1998 - Synthese 117 (1):75-93.
    This paper concerns the question of how to draw inferences common sensically from uncertain knowledge. Since the early work of Shore and Johnson (1980), Paris and Vencovská (1990), and Csiszár (1989), it has been known that the Maximum Entropy Inference Process is the only inference process which obeys certain common sense principles of uncertain reasoning. In this paper we consider the present status of this result and argue that within the rather narrow context in which we work this (...)
  12. The principle of maximum entropy and a problem in probability kinematics. Stefan Lukits - 2014 - Synthese 191 (7):1-23.
    Sometimes we receive evidence in a form that standard conditioning (or Jeffrey conditioning) cannot accommodate. The principle of maximum entropy (MAXENT) provides a unique solution for the posterior probability distribution based on the intuition that the information gain consistent with assumptions and evidence should be minimal. Opponents of objective methods to determine these probabilities prominently cite van Fraassen’s Judy Benjamin case to undermine the generality of MAXENT. This article shows that an intuitive approach to Judy Benjamin’s case supports (...)
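    For context, Jeffrey conditioning handles evidence that shifts the probabilities of a partition $\{E_{i}\}$ to new values $q_{i}$ without making any cell certain; in its standard form,

    $$ P_{\text{new}}(A) = \sum_{i} P(A \mid E_{i})\, q_{i}. $$

    MAXENT is invoked precisely for evidence, such as a constraint on a conditional probability, that even this rule cannot accommodate.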
  13. Maximum-entropy spectral analysis of extended energy-loss fine structure and its application to time-resolved measurement. Shunsuke Muto - 2004 - Philosophical Magazine 84 (25-26):2793-2808.
  14. Objective Bayesianism and the maximum entropy principle. Jürgen Landes & Jon Williamson - 2013 - Entropy 15 (9):3528-3591.
    Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities, they should be calibrated to our evidence of physical probabilities, and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three (...)
  15. Maximum power and maximum entropy production: finalities in nature. Stanley Salthe - 2010 - Cosmos and History 6 (1):114-121.
    I begin with the definition of power, and find that it is finalistic inasmuch as work directs energy dissipation in the interests of some system. The maximum power principle of Lotka and Odum implies an optimal energy efficiency for any work; optima are also finalities. I advance a statement of the maximum entropy production principle, suggesting that most work of dissipative structures is carried out at rates entailing energy flows faster than those that would associate with maximum power. This is finalistic in the sense that the out-of-equilibrium universe, taken as an isolated system, entrains work in the interest of global thermodynamic equilibration. I posit an evolutionary scenario, with a development on Earth from abiotic times, when promoting convective energy flows could be viewed as the important function of dissipative structures, to biotic times when the preservation of living dissipative structures was added to the teleology. Dissipative structures are required by the equilibrating universe to enhance local energy gradient dissipation.
  16. Is the maximum entropy production just a heuristic principle? Metaphysics on natural determination. Javier Sánchez-Cañizares - 2023 - Synthese 201 (4):1-15.
    The Maximum Entropy Production Principle (MEPP) stands out as an overarching principle that rules life phenomena in Nature. However, its explanatory power beyond heuristics remains controversial. On the one hand, the MEPP has been successfully applied principally to non-living systems far from thermodynamic equilibrium. On the other hand, the underlying assumptions to lay the MEPP’s theoretical foundations and range of applicability increase the possibilities of conflicting interpretations. More interestingly, from a metaphysical stance, the MEPP’s philosophical status is hotly (...)
  17. Probabilistic stability, AGM revision operators and maximum entropy. Krzysztof Mierzewski - 2020 - Review of Symbolic Logic:1-38.
    Several authors have investigated the question of whether canonical logic-based accounts of belief revision, and especially the theory of AGM revision operators, are compatible with the dynamics of Bayesian conditioning. Here we show that Leitgeb's stability rule for acceptance, which has been offered as a possible solution to the Lottery paradox, allows one to bridge AGM revision and Bayesian update: using the stability rule, we prove that AGM revision operators emerge from Bayesian conditioning by an application of the principle of maximum entropy. In situations of information loss, or whenever the agent relies on a qualitative description of her information state - such as a plausibility ranking over hypotheses, or a belief set - the dynamics of AGM belief revision are compatible with Bayesian conditioning; indeed, through the maximum entropy principle, conditioning naturally generates AGM revision operators. This mitigates an impossibility theorem of Lin and Kelly for tracking Bayesian conditioning with AGM revision, and suggests an approach to the compatibility problem that highlights the information loss incurred by acceptance rules in passing from probabilistic to qualitative representations of beliefs.
  18. Causal versions of maximum entropy and principle of insufficient reason. Dominik Janzing - 2021 - Journal of Causal Inference 9 (1):285-301.
    The principle of insufficient reason (PIR) assigns equal probabilities to each alternative of a random experiment whenever there is no reason to prefer one over the other. The maximum entropy principle (MaxEnt) generalizes PIR to the case where statistical information like expectations are given. It is known that both principles result in paradoxical probability updates for joint distributions of cause and effect. This is because constraints on the conditional P(effect | cause) result in changes of P(cause) that assign higher probability (...)
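    The generalization from PIR to MaxEnt that the abstract describes is easy to see in a toy computation: with no constraints the maximum entropy distribution is uniform (PIR), and adding an expectation constraint tilts it. A minimal sketch, assuming a six-sided die with a prescribed mean of 4.5; the example is illustrative, not from Janzing's paper.

```python
import numpy as np
from scipy.optimize import minimize

xs = np.arange(1, 7)            # outcomes of a six-sided die
target_mean = 4.5               # the given expectation constraint

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)  # avoid log(0)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},         # normalization
    {"type": "eq", "fun": lambda p: p @ xs - target_mean},  # E[X] = 4.5
]
p0 = np.full(6, 1 / 6)          # the PIR answer, used as the starting point
res = minimize(neg_entropy, p0, bounds=[(0, 1)] * 6, constraints=constraints)
print(res.x)                    # a Gibbs-like distribution tilted toward 6
```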
  19. Explaining default intuitions using maximum entropy. Rachel A. Bourne - 2003 - Journal of Applied Logic 1 (3-4):255-271.
  20. Model and Simulation of Maximum Entropy Phrase Reordering of English Text in Language Learning Machine. Weifang Wu - 2020 - Complexity 2020:1-9.
    This paper proposes a feature extraction algorithm based on the maximum entropy phrase reordering model in statistical machine translation for language learning machines. The algorithm can extract more accurate phrase reordering information, especially the feature information of reversed phrases, which solves the problem of imbalanced feature data during maximum entropy training in the original algorithm and improves the accuracy of phrase reordering in translation. In the experiments, these features were combined with linguistic features such as parts (...)
  21. Bertrand's Paradox and the Maximum Entropy Principle. Nicholas Shackel & Darrell P. Rowbottom - 2019 - Philosophy and Phenomenological Research 101 (3):505-523.
    An important suggestion of objective Bayesians is that the maximum entropy principle can replace a principle which is known to get into paradoxical difficulties: the principle of indifference. No one has previously determined whether the maximum entropy principle is better able to solve Bertrand’s chord paradox than the principle of indifference. In this paper I show that it is not. Additionally, the course of the analysis brings to light a new paradox, a revenge paradox of the (...)
  22. The constraint rule of the maximum entropy principle. Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (1):47-79.
    The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates (...)
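    The rule in question equates an expectation value with an observed sample average (stated here in generic notation, not Uffink's own): given data $x^{(1)}, \ldots, x^{(N)}$ and a quantity $f$, candidate distributions are required to satisfy

    $$ \sum_{i} p_{i}\, f(x_{i}) = \bar{f} = \frac{1}{N} \sum_{k=1}^{N} f\bigl(x^{(k)}\bigr), $$

    and the paper examines what, if anything, justifies this identification of empirical data with a constraint on probabilities.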
  23. Application of the maximum entropy principle to nonlinear systems far from equilibrium. H. Haken - 1993 - In E. T. Jaynes, Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. pp. 239.
  24. The W systems: between maximum entropy and minimal ranking…. Michael Freund - 1994 - Journal of Applied Non-Classical Logics 4 (1):79-90.
  25. Rumor Identification with Maximum Entropy in MicroNet. Suisheng Yu, Mingcai Li & Fengming Liu - 2017 - Complexity:1-8.
  26. A Novel Chinese Entity Relationship Extraction Method Based on the Bidirectional Maximum Entropy Markov Model. Chengyao Lv, Deng Pan, Yaxiong Li, Jianxin Li & Zong Wang - 2021 - Complexity 2021:1-8.
    To identify relationships among entities in natural language texts, extraction of entity relationships provides fundamental technical support for knowledge graphs, intelligent information retrieval, and semantic analysis; it promotes the construction of knowledge bases and improves the efficiency of searching and semantic analysis. Traditional methods of relationship extraction, whether proposed in earlier times or based on traditional machine learning and deep learning, have focused on keeping relationships and entities in their own silos: extracting relationships and entities are conducted in (...)
  27. Axiomatic Derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy. J. E. Shore & R. W. Johnson - 1980 - IEEE Transactions on Information Theory:26-37.
  28. Combining probabilistic logic programming with the power of maximum entropy. Gabriele Kern-Isberner & Thomas Lukasiewicz - 2004 - Artificial Intelligence 157 (1-2):139-202.
  29. Enriching the knowledge sources used in a maximum entropy part-of-speech tagger. Christopher Manning - manuscript
    By Kristina Toutanova and Christopher D. Manning, Stanford University.
  30. The status of the principle of maximum entropy. Abner Shimony - 1985 - Synthese 63 (1):35-53.
  31. A look back: Early applications of maximum entropy estimation to quantum statistical mechanics. D. J. Scalapino - 1993 - In E. T. Jaynes, Walter T. Grandy & Peter W. Milonni (eds.), Physics and Probability: Essays in Honor of Edwin T. Jaynes. Cambridge University Press. pp. 9.
  32. Vehicle Text Data Compression and Transmission Method Based on Maximum Entropy Neural Network and Optimized Huffman Encoding Algorithms. Jingfeng Yang, Zhenkun Zhang, Nanfeng Zhang, Ming Li, Yanwei Zheng, Li Wang, Yong Li, Ji Yang, Yifei Xiang & Yu Zhang - 2019 - Complexity 2019:1-9.
  33. First-order probabilistic conditional logic and maximum entropy. J. Fisseler - 2012 - Logic Journal of the IGPL 20 (5):796-830.
  34. How to exploit parametric uniformity for maximum entropy reasoning in a relational probabilistic logic. Marc Finthammer & Christoph Beierle - 2012 - In Luis Farinas del Cerro, Andreas Herzig & Jerome Mengin (eds.), Logics in Artificial Intelligence. Springer. pp. 189-201.
  35. A fuzzy neuron based upon maximum entropy ordered weighted averaging. Michael O'Hagan - 1991 - In B. Bouchon-Meunier, R. R. Yager & L. A. Zadeh (eds.), Uncertainty in Knowledge Bases. Springer. pp. 598-609.
  36. Noise induced phase transition between maximum entropy production structures and minimum entropy production structures? Alfred Hubler, Andrey Belkin & Alexey Bezryadin - 2015 - Complexity 20 (3):8-11.
  37. Quantum Model of Classical Mechanics: Maximum Entropy Packets. [REVIEW] P. Hájíček - 2009 - Foundations of Physics 39 (9):1072-1096.
    In a previous paper, a statistical method of constructing quantum models of classical properties has been described. The present paper concludes the description by turning to classical mechanics. The quantum states that maximize entropy for given averages and variances of coordinates and momenta are called ME packets. They generalize the Gaussian wave packets. A non-trivial extension of the partition-function method of probability calculus to quantum mechanics is given. Non-commutativity of quantum variables limits its usefulness. Still, the general form of (...)
  38. Maximum Shannon Entropy, Minimum Fisher Information, and an Elementary Game. Shunlong Luo - 2002 - Foundations of Physics 32 (11):1757-1772.
    We formulate an elementary statistical game which captures the essence of some fundamental quantum experiments such as photon polarization and spin measurement. We explore and compare the significance of the principle of maximum Shannon entropy and the principle of minimum Fisher information in solving such a game. The solution based on the principle of minimum Fisher information coincides with the solution based on an invariance principle, and provides an informational explanation of Malus' law for photon polarization. There is (...)
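    The two quantities being compared have standard definitions; for a probability density $p$ on the real line (generic notation, not necessarily Luo's),

    $$ H(p) = -\int p(x) \log p(x)\, dx, \qquad I(p) = \int \frac{\bigl[p'(x)\bigr]^{2}}{p(x)}\, dx. $$

    Maximizing $H$ and minimizing $I$ are two different ways of selecting a least-informative distribution, and they need not pick out the same solution, which is what makes comparing them in a single game instructive.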
  39. Modeling Urban Growth and Form with Spatial Entropy. Yanguang Chen - 2020 - Complexity 2020:1-14.
    Entropy is one of the physical bases for the definition of fractal dimension, and the generalized fractal dimension is defined in terms of Renyi entropy. Using fractal dimension, we can describe urban growth and form and characterize spatial complexity. A number of fractal models and measurements have been proposed for urban studies. However, the precondition for applying fractal dimension is to find scaling relations in cities. In the absence of the scaling property, we can make use of the entropy (...)
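    The Renyi entropy referred to here is the one-parameter family (standard definition, with order parameter $q$)

    $$ H_{q} = \frac{1}{1-q} \log \sum_{i} p_{i}^{q}, $$

    which recovers Shannon entropy in the limit $q \to 1$; the generalized (multifractal) dimensions arise from evaluating $H_{q}$ over boxes of size $\varepsilon$ and examining how it scales with $\log(1/\varepsilon)$.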
  40. The Entropy-Limit (Conjecture) for $\Sigma_2$-Premisses. Jürgen Landes - 2020 - Studia Logica 109 (2):423-442.
    The application of the maximum entropy principle to determine probabilities on finite domains is well-understood. Its application to infinite domains still lacks a well-studied comprehensive approach. There are two different strategies for applying the maximum entropy principle on first-order predicate languages: applying it to finite sublanguages and taking a limit; comparing finite entropies of probability functions defined on the language as a whole. The entropy-limit conjecture roughly says that these two strategies result in the same (...)
  41. Entropy and Insufficient Reason: A Note on the Judy Benjamin Problem. Anubav Vasudevan - 2020 - British Journal for the Philosophy of Science 71 (3):1113-1141.
    One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen. The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized upon a probability strictly less than that assigned to it by the uniform distribution. In this article, I present an analysis of the Judy Benjamin problem (...)
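    The "apparently puzzling fact" is easy to reproduce numerically in the standard three-cell version of the problem: a uniform prior over Blue, Red-HQ, and Red-Second-Company territory is updated on the constraint P(HQ | Red) = 3/4 by minimizing relative entropy. The setup is van Fraassen's; the code is an illustrative sketch, not taken from Vasudevan's paper.

```python
import numpy as np
from scipy.optimize import minimize

prior = np.array([1/3, 1/3, 1/3])   # cells: Blue, Red-HQ, Red-2nd

def kl(q):
    q = np.clip(q, 1e-12, 1.0)
    return np.sum(q * np.log(q / prior))

constraints = [
    {"type": "eq", "fun": lambda q: q.sum() - 1.0},
    # P(HQ | Red) = q[1] / (q[1] + q[2]) = 3/4, i.e. q[1] = 3 * q[2]
    {"type": "eq", "fun": lambda q: q[1] - 3 * q[2]},
]
res = minimize(kl, prior, bounds=[(0, 1)] * 3, constraints=constraints)
print(res.x)   # Blue rises above 1/3 (to about 0.36), even though the
               # evidence said nothing about Blue: the puzzle at issue
```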
  42. Towards the entropy-limit conjecture. Jürgen Landes, Soroush Rafiee Rad & Jon Williamson - 2020 - Annals of Pure and Applied Logic 172 (2):102870.
    The maximum entropy principle is widely used to determine non-committal probabilities on a finite domain, subject to a set of constraints, but its application to continuous domains is notoriously problematic. This paper concerns an intermediate case, where the domain is a first-order predicate language. Two strategies have been put forward for applying the maximum entropy principle on such a domain: applying it to finite sublanguages and taking the pointwise limit of the resulting probabilities as the size (...)
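    Schematically, and with notation simplified from the papers in this cluster: writing $L_{1} \subseteq L_{2} \subseteq \cdots$ for the finite sublanguages, $H_{n}$ for entropy computed on $L_{n}$, and $\chi$ for the premisses, the first strategy takes

    $$ P^{\infty} = \lim_{n \to \infty} \arg\max_{P \models \chi} H_{n}(P), $$

    while the second selects the function $P^{\dagger} \models \chi$ whose $H_{n}$-values dominate those of its rivals for all sufficiently large $n$. The entropy-limit conjecture asserts that $P^{\infty} = P^{\dagger}$ whenever both are well defined.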
  43. Entropy and Vacuum Radiation. Jean E. Burns - 1998 - Foundations of Physics 28 (7):1191-1207.
    It is shown that entropy increase in thermodynamic systems can plausibly be accounted for by the random action of vacuum radiation. A recent calculation by Rueda using stochastic electrodynamics (SED) shows that vacuum radiation causes a particle to undergo a rapid Brownian motion about its average dynamical trajectory. It is shown that the magnitude of spatial drift calculated by Rueda can also be predicted by assuming that the average magnitudes of random shifts in position and momentum of a particle (...)
  44. Application of quorum response and information entropy to animal collective motion modeling. Feng Hu, Carlos Escudero, Jerome Buhl & Stephen J. Simpson - 2016 - Complexity 21 (S1):584-592.
  45. Entropy and sign conventions. G. M. Anderson - 2023 - Foundations of Chemistry 25 (1):119-125.
    It is a fundamental cornerstone of thermodynamics that entropy ($S_{U,V}$) increases in spontaneous processes in isolated systems (often called closed or thermally closed systems when the transfer of energy as work is considered to be negligible) and achieves a maximum when the system reaches equilibrium. But with a different sign convention entropy could just as well be said to decrease to a minimum in spontaneous constant U, V processes. (...)
  46. Holistic modeling: an objection to Weisberg’s weighted feature-matching account. Wei Fang - 2017 - Synthese 194 (5):1743-1764.
    Michael Weisberg’s account of scientific models concentrates on the ways in which models are similar to their targets. He intends not merely to explain what similarity consists in, but also to capture similarity judgments made by scientists. In order to scrutinize whether his account fulfills this goal, I outline one common way in which scientists judge whether a model is similar enough to its target, namely the maximum likelihood estimation method. Then I consider whether Weisberg’s account could capture the judgments (...)
  47. Computational Modeling of Cognition and Behavior. Simon Farrell & Stephan Lewandowsky - 2017 - Cambridge University Press.
    Computational modeling is now ubiquitous in psychology, and researchers who are not modelers may find it increasingly difficult to follow the theoretical developments in their field. This book presents an integrated framework for the development and application of models in psychology and related disciplines. Researchers and students are given the knowledge and tools to interpret models published in their area, as well as to develop, fit, and test their own models. Both the development of models and key features of (...)
  48. Information, entropy and inductive logic. S. Pakswer - 1954 - Philosophy of Science 21 (3):254-259.
    It has been shown by several authors that in operations involving information a quantity appears which is the negative of the quantity usually defined as entropy in similar situations. This quantity ℜ = − KI has been termed “negentropy” and it has been shown that the negentropy of information and the physical entropy S are mirrorlike representations of the same train of events. In physical terminology the energy is degraded by an increase in entropy due to an (...)
  49. Predictive Statistical Mechanics and Macroscopic Time Evolution: Hydrodynamics and Entropy Production. Domagoj Kuić - 2016 - Foundations of Physics 46 (7):891-914.
    In previous papers, it was demonstrated that applying the principle of maximum information entropy by maximizing the conditional information entropy, subject to the constraint given by the Liouville equation averaged over the phase space, leads to a definition of the rate of entropy change for closed Hamiltonian systems without any additional assumptions. Here, we generalize this basic model and, with the introduction of additional constraints which are equivalent to the hydrodynamic continuity equations, show that (...)
1 — 49 / 997