Roughly speaking, classical statistical physics is the branch of theoretical physics that aims to account for the thermal behaviour of macroscopic bodies in terms of a classical mechanical model of their microscopic constituents, with the help of probabilistic assumptions. In the last century and a half, a fair number of approaches have been developed to meet this aim. This study of their foundations assesses their coherence and analyzes the motivations for their basic assumptions and the interpretations of their central concepts. The most outstanding foundational problems are the explanation of time-asymmetry in thermal behaviour, the relative autonomy of thermal phenomena from their microscopic underpinning, and the meaning of probability. A more or less historical survey is given of the work of Maxwell, Boltzmann and Gibbs in statistical physics, and of the problems and objections to which their work gave rise. Next, we review some modern approaches to (i) equilibrium statistical mechanics, such as ergodic theory and the theory of the thermodynamic limit; and to (ii) non-equilibrium statistical mechanics, as provided by Lanford's work on the Boltzmann equation, the so-called Bogolyubov-Born-Green-Kirkwood-Yvon approach, and stochastic approaches such as 'coarse-graining' and the 'open systems' approach. In all cases, we focus on the subtle interplay between probabilistic assumptions, dynamical assumptions, initial conditions and other ingredients used in these approaches.
The aim of this article is to analyse the relation between the second law of thermodynamics and the so-called arrow of time. For this purpose, a number of different aspects of this arrow of time are distinguished, in particular those of time-reversal (non-)invariance and of (ir)reversibility. Next, I review versions of the second law in the work of Carnot, Clausius, Kelvin, Planck, Gibbs, Carathéodory, and Lieb and Yngvason, and investigate their connection with these aspects of the arrow of time. It is shown that this connection varies considerably across these formulations of the second law. According to the famous formulation by Planck, the second law expresses the irreversibility of natural processes. But in many other formulations, irreversibility or even time-reversal non-invariance plays no role. I therefore argue for the view that the second law has nothing to do with the arrow of time.
This paper investigates the source of time-asymmetry in thermodynamics and comments on the question of whether a time-symmetric formulation of the Second Law is possible.
Quantum mechanics is generally regarded as the physical theory that is our best candidate for a fundamental and universal description of the physical world. The conceptual framework employed by this theory differs drastically from that of classical physics. Indeed, the transition from classical to quantum physics marks a genuine revolution in our understanding of the physical world.
It has been a longstanding problem to show how the irreversible behaviour of macroscopic systems can be reconciled with the time-reversal invariance of these same systems when considered from a microscopic point of view. A result by Lanford shows that, under certain conditions, the famous Boltzmann equation, describing the irreversible behaviour of a dilute gas, can be obtained from the time-reversal invariant Hamiltonian equations of motion for the hard-sphere model. Here, we examine how and in what sense Lanford's theorem succeeds in deriving this remarkable result. Many authors have expressed different views on the question of which ingredient in Lanford's theorem is responsible for the emergence of irreversibility. We claim that these interpretations miss the target. In fact, we argue that there is no time-asymmetric ingredient at all.
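For orientation (our own schematic rendering, not part of the abstract): for a dilute hard-sphere gas with velocity distribution f(x, v, t), sphere diameter a and particle number N, the Boltzmann equation reads, up to conventional constant factors,

\[
\frac{\partial f}{\partial t} + \vec{v}\cdot\nabla_{\vec{x}} f
= N a^{2} \int\!\!\int \big[ f(\vec{x},\vec{v}\,',t)\, f(\vec{x},\vec{v}_{1}',t) - f(\vec{x},\vec{v},t)\, f(\vec{x},\vec{v}_{1},t) \big]\, \big|(\vec{v}-\vec{v}_{1})\cdot\vec{\omega}\big|\; d\vec{\omega}\, d\vec{v}_{1},
\]

where primes denote post-collision velocities and the product N a^{2} is kept fixed in the Boltzmann-Grad limit employed in Lanford's theorem. Via the H-theorem, solutions of this equation display the irreversible behaviour referred to above, even though the underlying hard-sphere dynamics is time-reversal invariant.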
The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with certain compelling consistency requirements. This paper reviews these consistency arguments and the surrounding controversy. It is shown that the uniqueness proofs are flawed, or rest on unreasonably strong assumptions. A more general class of inference rules, maximizing the so-called Rényi entropies, is exhibited which also fulfils the reasonable part of the consistency assumptions.
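For reference (notation ours): for a discrete probability distribution p = (p_1, ..., p_n), the Shannon entropy and the Rényi entropy of order α are

\[
H(p) = -\sum_{i=1}^{n} p_i \log p_i, \qquad
H_{\alpha}(p) = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} p_i^{\alpha} \quad (\alpha > 0,\ \alpha \neq 1),
\]

with H_{\alpha} \to H as \alpha \to 1. The maximum entropy principle selects, among all distributions compatible with the given partial information, the one maximizing H(p); the more general class of rules mentioned at the end of the abstract instead maximizes some H_{\alpha}(p).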
I consider the problem of extending Reichenbach's principle of the common cause to more than two events, vis-à-vis an example posed by Bernstein. It is argued that the only reasonable extension of Reichenbach's principle stands in conflict with a recent proposal due to Horwich. I also discuss the prospects of the principle of the common cause in the light of these and other difficulties known in the literature, and argue that a more viable version of the principle is the one provided by Penrose and Percival (1962).
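For the two-event case, Reichenbach's principle requires a common cause C of correlated events A and B to satisfy the screening-off and relevance conditions (standard formulation; notation ours):

\[
P(A \wedge B \mid C) = P(A \mid C)\, P(B \mid C), \qquad
P(A \wedge B \mid \neg C) = P(A \mid \neg C)\, P(B \mid \neg C),
\]
\[
P(A \mid C) > P(A \mid \neg C), \qquad
P(B \mid C) > P(B \mid \neg C),
\]

from which the correlation P(A \wedge B) > P(A)\, P(B) follows. The problem addressed in the paper is how such conditions should be extended when more than two events are correlated.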
The starting point of the present paper is Bell's notion of local causality and his own sharpening of it so as to provide for mathematical formalisation. Starting with Norsen's analysis of this formalisation, it is subjected to a critique that reveals two crucial aspects that have so far not been properly taken into account: the correct understanding of the notions of sufficiency, completeness and redundancy involved, and the fact that the apparatus settings and measurement outcomes have very different theoretical roles in the candidate theories under study. Neither aspect is adequately incorporated in the standard formalisation, and we therefore amend it. The upshot of our analysis is a more detailed, sharp and clean mathematical expression of the condition of local causality. A preliminary analysis of the repercussions of our proposal shows that it is able to locate exactly where and how the notions of locality and causality are involved in formalising Bell's condition of local causality.
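The standard formalisation referred to here is usually written as a factorisation condition (notation as commonly used in the Bell literature): for measurement outcomes A, B, apparatus settings a, b, and a specification λ of the relevant state of affairs in the past,

\[
P(A, B \mid a, b, \lambda) = P(A \mid a, \lambda)\; P(B \mid b, \lambda).
\]

The paper's claim is that this familiar expression does not adequately capture the sufficiency and completeness of λ, nor the different theoretical roles of settings and outcomes, and therefore needs to be refined.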
The principle of maximum entropy is a method for assigning values to probability distributions on the basis of partial information. In usual formulations of this and related methods of inference, one assumes that this partial information takes the form of a constraint on allowed probability distributions. In practical applications, however, the information consists of empirical data. A constraint rule is then employed to construct constraints on probability distributions out of these data. Usually one adopts the rule that equates the expectation values of certain functions with their empirical averages. There are, however, various other ways in which one can construct constraints from empirical data, and these can lead the maximum entropy principle to very different probability assignments. This paper shows that an argument by Jaynes to justify the usual constraint rule is unsatisfactory, and investigates several alternative choices. The choice of a constraint rule is also shown to be of crucial importance to the debate on whether there is a conflict between the methods of inference based on maximum entropy and Bayesian conditionalization.
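Concretely, the usual constraint rule mentioned here works as follows (notation ours): given empirical data d_1, ..., d_N and functions f_k, the candidate distribution p over outcomes x_i is required to satisfy

\[
\sum_{i} p_i\, f_k(x_i) \;=\; \frac{1}{N} \sum_{j=1}^{N} f_k(d_j), \qquad k = 1, \dots, m,
\]

i.e., expectation values are set equal to sample averages, and maximum entropy is then applied within the set of distributions obeying these constraints. A different rule for turning the same data into constraints picks out a different set, and hence in general a different maximum entropy distribution.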
Nought but Molecules in Motion. Jos Uffink - 1996 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 27 (3): 373-387.
Shan Gao recently presented a critical reconsideration of a paper I wrote on the subject of protective measurement. Here, I take the occasion to reply to his objections.
The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in inference can be obtained by means of the so-called statistical distance between probability distributions. When applied to quantum mechanics, this distance leads to a measure of the distinguishability of quantum states, which is essentially the absolute value of the matrix element between the states. The importance of this result for the quantum mechanical uncertainty principle is noted. The second part of the paper provides a derivation of the statistical distance on the basis of the so-called method of support.
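A common expression for the statistical distance between two discrete probability distributions p and q is the Bhattacharyya-Wootters angle (given here for orientation; the paper's own derivation may differ in detail):

\[
d(p, q) = \arccos \sum_{i} \sqrt{p_i\, q_i}\,,
\]

which, for the distributions generated by two pure quantum states, yields a distinguishability measure governed by |\langle \psi \mid \phi \rangle|, the absolute value of the matrix element mentioned above.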
Bohr and Heisenberg suggested that the thermodynamical quantities of temperature and energy are complementary in the same way as position and momentum in quantum mechanics. Roughly speaking, their idea was that a definite temperature can be attributed to a system only if it is submerged in a heat bath, in which case energy fluctuations are unavoidable. On the other hand, a definite energy can be assigned only to systems in thermal isolation, thus excluding the simultaneous determination of its temperature. Rosenfeld extended this analogy with quantum mechanics and obtained a quantitative uncertainty relation in the form ΔU Δ(1/T) ≥ k, where k is Boltzmann's constant. The two “extreme” cases of this relation would then characterize this complementarity between isolation (U definite) and contact with a heat bath (T definite). Other formulations of the thermodynamical uncertainty relations were proposed by Mandelbrot (1956, 1989), Lindhard (1986), and Lavenda (1987, 1991). This work, however, has not led to a consensus in the literature. It is shown here that the uncertainty relation for temperature and energy in the version of Mandelbrot is indeed exactly analogous to modern formulations of the quantum mechanical uncertainty relations. However, his relation holds only for the canonical distribution, describing a system in contact with a heat bath. There is, therefore, no complementarity between this situation and a thermally isolated system.
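A sketch of the estimation-theoretic route behind Mandelbrot's version (our own summary, with assumptions stated): for a system in contact with a heat bath, the energy U is canonically distributed,

\[
p_{\beta}(U) \propto \omega(U)\, e^{-\beta U}, \qquad \beta = \frac{1}{kT},
\]

and the Cramér-Rao inequality gives \Delta\hat{\beta}\, \Delta U \ge 1 for any unbiased estimator \hat{\beta} of the inverse temperature, hence \Delta U\, \Delta(1/T) \ge k. Since the canonical distribution is presupposed, the relation applies only to systems in contact with a heat bath, which is the limitation noted at the end of the abstract.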
Gao presents a critical reconsideration of a paper I wrote on the subject of protective measurement. Here, I take the occasion to reply to his objections. In particular, I retract my previous claim to have proven that, in a protective measurement, the observable being measured on a system must commute with the system's Hamiltonian. However, against Gao's objections, I do maintain the viability of the interpretation I offered for protective measurements, as well as my analysis of a thought experiment proposed by Aharonov, Anandan and Vaidman.
Time and Chance. Jos Uffink - 2002 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 33 (3): 555-563.
A recent argument by Hawthorne and Lasonen-Aarnio purports to show that we can uphold the principle that competently forming conjunctions is a knowledge-preserving operation only at the cost of a rampant skepticism about the future. A key premise of their argument is that, in light of quantum-mechanical considerations, future contingents never quite have chance 1 of being true. We argue, by drawing attention to the order of magnitude of the relevant quantum probabilities, that the skeptical threat of Hawthorne and Lasonen-Aarnio’s argument is illusory.
Introduction. Jos Uffink - 2005 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 36 (2): 219-223.