David Wallace argues that we should take quantum theory seriously as an account of what the world is like, which means accepting the idea that the universe is constantly branching into new universes. He presents an accessible but rigorous account of the 'Everett interpretation', the best way to make coherent sense of quantum physics.
What would it mean to apply quantum theory, without restriction and without involving any notion of measurement and state reduction, to the whole universe? What would realism about the quantum state then imply? This book brings together an illustrious team of philosophers and physicists to debate these questions. The contributors broadly agree on the need, or aspiration, for a realist theory that unites micro- and macro-worlds. But they disagree on what this implies. Some argue that if unitary quantum evolution has unrestricted application, and if the quantum state is taken to be something physically real, then this universe emerges from the quantum state as one of countless others, constantly branching in time, all of which are real. The result, they argue, is many-worlds quantum theory, also known as the Everett interpretation of quantum mechanics. No other realist interpretation of unitary quantum theory has ever been found. Others argue in reply that this picture of many worlds is in no sense inherent to quantum theory, or fails to make physical sense, or is scientifically inadequate. The stuff of these worlds, what they are made of, is never adequately explained, nor are the worlds precisely defined; ordinary ideas about time and identity over time are compromised; no satisfactory role or substitute for probability can be found in many-worlds theories; they cannot explain experimental data; and in any case there are attractive realist alternatives to many worlds. Twenty original essays, accompanied by commentaries and discussions, examine these claims and counterclaims in depth. They consider questions of ontology (the existence of worlds); probability (whether and how probability can be related to the branching structure of the quantum state); alternatives to many worlds (whether there are one-world realist interpretations of quantum theory that leave quantum dynamics unchanged); and open questions even given many worlds, including the multiverse concept as it has arisen elsewhere in modern cosmology. A comprehensive introduction lays out the main arguments of the book, which provides a state-of-the-art guide to many-worlds quantum theory and its problems.
According to Bayesian epistemology, the epistemically rational agent updates her beliefs by conditionalization: that is, her posterior subjective probability after taking account of evidence X, $p_{\text{new}}$, is to be set equal to her prior conditional probability $p_{\text{old}}(\cdot \mid X)$. Bayesians can be challenged to provide a justification for their claim that conditionalization is recommended by rationality: whence the normative force of the injunction to conditionalize? There are several existing justifications for conditionalization, but none directly addresses the idea that conditionalization will be epistemically rational if and only if it can reasonably be expected to lead to epistemically good outcomes. We apply the approach of cognitive decision theory to provide a justification for conditionalization using precisely that idea. We assign epistemic utility functions to epistemically rational agents; an agent's epistemic utility is to depend both upon the actual state of the world and on the agent's credence distribution over possible states. We prove that, under independently motivated conditions, conditionalization is the unique updating rule that maximizes expected epistemic utility.
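For orientation, the rule and the result can be stated schematically (my notation, not quoted from the paper). Conditionalization on evidence X requires

$$p_{\text{new}}(\cdot) \;=\; p_{\text{old}}(\cdot \mid X) \;=\; \frac{p_{\text{old}}(\,\cdot \wedge X)}{p_{\text{old}}(X)},$$

and the theorem is that, among update policies defined over an evidence partition $\{E_i\}$, the policy that conditionalizes on whichever $E_i$ obtains uniquely maximizes expected epistemic utility $\sum_w p_{\text{old}}(w)\, U(p_w, w)$, where $p_w$ is the posterior the policy recommends at world $w$ and $U$ satisfies the paper's independently motivated conditions.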
Using as a starting point recent and apparently incompatible conclusions by Saunders and Knox, I revisit the question of the correct spacetime setting for Newtonian physics. I argue that, understood correctly, these two versions of Newtonian physics make the same claims both about the background geometry required to define the theory and about the inertial structure of the theory. In doing so I illustrate and explore in detail the view, espoused by Knox and also by Brown, that inertial structure is defined by the dynamics governing subsystems of a larger system. This clarifies some interesting features of Newtonian physics, notably the distinction between using the theory to model subsystems of a larger whole and using it to model complete universes, and the scale-relativity of spacetime structure. Contents: 1 Introduction; 2 Newtonian Mechanics and Galilean Spacetime; 3 Vector Relationism and Maxwellian Spacetime; 4 Recovering the Galilei Group: Dynamics of Subsystems; 5 Knox on Inertial Structure; 6 Connections on Maxwellian Spacetime; 7 Knox on Newtonian Gravity; 8 Vector Relationism and Newton–Cartan Theory; 9 Inertial Structure in Newton–Cartan Gravity; 10 Reconciling Knox and Saunders; 11 Conclusions.
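For reference (standard presentations, not drawn from the paper itself): Galilean spacetime has as its symmetries the Galilei transformations

$$t' = t + \tau, \qquad \mathbf{x}' = R\mathbf{x} + \mathbf{v}t + \mathbf{a},$$

with $R$ a fixed rotation and $\mathbf{v}$, $\mathbf{a}$, $\tau$ constants, whereas Maxwellian spacetime admits the larger group in which the boost and translation terms are replaced by an arbitrary time-dependent translation, $\mathbf{x}' = R\mathbf{x} + \mathbf{c}(t)$. The paper's question is which of these structures Newtonian dynamics really requires.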
What ontology does realism about the quantum state suggest? The main extant view in contemporary philosophy of physics is wave-function realism. We elaborate the sense in which wave-function realism does provide an ontological picture, and defend it from certain objections that have been raised against it. However, there are good reasons to be dissatisfied with wave-function realism, as we go on to elaborate. This motivates the development of an opposing picture: what we call spacetime state realism, a view which takes the states associated with spacetime regions as fundamental. This approach enjoys a number of beneficial features, although, unlike wave-function realism, it involves non-separability at the level of fundamental ontology. We investigate the pros and cons of this non-separability, arguing that it is a quite acceptable feature, even one which proves fruitful in the context of relativistic covariance. A companion paper discusses the prospects for combining a spacetime-based ontology with separability, along lines suggested by Deutsch and Hayden.
Coordinate-based approaches to physical theories remain standard in mainstream physics but are largely eschewed in foundational discussion in favour of coordinate-free differential-geometric approaches. I defend the conceptual and mathematical legitimacy of the coordinate-based approach for foundational work. In doing so, I provide an account of the Kleinian conception of geometry as a theory of invariance under symmetry groups; I argue that this conception continues to play a very substantial role in contemporary mathematical physics, and indeed that supposedly "coordinate-free" differential geometry relies centrally on this conception of geometry. I discuss some foundational and pedagogical advantages of the coordinate-based formulation and briefly connect it to some remarks of Norton on the historical development of geometry in physics during the establishment of the general theory of relativity.
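A standard toy example of the Kleinian conception (mine, not the paper's): Euclidean plane geometry is the theory of quantities invariant under the Euclidean group acting on $\mathbb{R}^2$ by

$$\mathbf{x} \mapsto R\mathbf{x} + \mathbf{a}, \qquad R \in O(2),$$

so that, for instance, the distance $\lVert \mathbf{x} - \mathbf{y} \rVert$ between two points is geometrically meaningful while the coordinates of an individual point are not.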
Everett and Structure. David Wallace - 2003 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 34 (1): 87-105.
I address the problem of indefiniteness in quantum mechanics: the problem that the theory, without changes to its formalism, seems to predict that macroscopic quantities have no definite values. The Everett interpretation is often criticised along these lines, and I shall argue that much of this criticism rests on a false dichotomy: that the macroworld must either be written directly into the formalism or be regarded as somehow illusory. By means of analogy with other areas of physics, I develop the view that the macroworld is instead to be understood in terms of certain structures and patterns which emerge from quantum theory (given appropriate dynamics, in particular decoherence). I extend this view to the observer, and in doing so make contact with functionalist theories of mind.
What is called "orthodox" quantum mechanics, as presented in standard foundational discussions, relies on two substantive assumptions (the projection postulate and the eigenvalue-eigenvector link) that do not in fact play any part in practical applications of quantum mechanics. I argue for this conclusion on a number of grounds, but primarily on the grounds that the projection postulate fails correctly to account for repeated, continuous and unsharp measurements, and that the eigenvalue-eigenvector link implies that virtually all interesting properties are maximally indefinite pretty much always. I present an alternative way of conceptualising quantum mechanics that does a better job of representing quantum mechanics as it is actually used, and in particular that eliminates use of either the projection postulate or the eigenvalue-eigenvector link, and I reformulate the measurement problem within this new presentation of orthodoxy.
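For reference, the two assumptions in their textbook forms (standard statements, not quoted from the paper): the eigenvalue-eigenvector link holds that a system possesses the value $\lambda$ of an observable $\hat{A}$ if and only if its state satisfies $\hat{A}|\psi\rangle = \lambda|\psi\rangle$; the projection postulate holds that a measurement of $\hat{A}$ with outcome $\lambda$ induces the collapse

$$|\psi\rangle \;\mapsto\; \frac{\hat{P}_\lambda|\psi\rangle}{\lVert \hat{P}_\lambda|\psi\rangle \rVert},$$

where $\hat{P}_\lambda$ projects onto the $\lambda$-eigenspace.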
I argue against the currently prevalent view that algebraic quantum field theory (AQFT) is the correct framework for philosophy of quantum field theory and that "conventional" quantum field theory (CQFT), of the sort used in mainstream particle physics, is not suitable for foundational study. In doing so, I defend the position that AQFT and CQFT should be understood as rival programs to resolve the mathematical and physical pathologies of renormalization theory, and that CQFT has succeeded in this task where AQFT has failed. I also defend CQFT from recent criticisms made by Doreen Fraser.
I present a proof of the quantum probability rule from decision-theoretic assumptions, in the context of the Everett interpretation. The basic ideas behind the proof are those presented in Deutsch's recent proof of the probability rule, but the proof is simpler and proceeds from weaker decision-theoretic assumptions. This makes it easier to discuss the conceptual ideas involved in the proof, and to show that they are defensible.
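The probability rule in question is the Born rule; in its standard textbook form (included for orientation, not quoted from the paper) it assigns to outcome $\lambda$ of a measurement of $\hat{A}$ on state $|\psi\rangle$ the probability

$$\Pr(\lambda) = \langle\psi|\hat{P}_\lambda|\psi\rangle = \lVert\hat{P}_\lambda|\psi\rangle\rVert^2,$$

where $\hat{P}_\lambda$ is the projector onto the $\lambda$-eigenspace of $\hat{A}$.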
I analyse the conceptual and mathematical foundations of Lagrangian quantum field theory (QFT) (that is, the 'naive' QFT used in mainstream physics, as opposed to algebraic quantum field theory). The objective is to see whether Lagrangian QFT has a sufficiently firm conceptual and mathematical basis to be a legitimate object of foundational study, or whether it is too ill-defined. The analysis covers renormalisation and infinities, inequivalent representations, and the concept of localised states; the conclusion is that Lagrangian QFT (at least as described here) is a perfectly respectable physical theory, albeit somewhat different in certain respects from most of those studied in foundational work.
Quantum mechanics, and classical mechanics, are framework theories that incorporate many different concrete theories which in general cannot be arranged in a neat hierarchy, but discussion of ‘the ontology of quantum mechanics’ tends to proceed as if quantum mechanics were a single concrete theory, specifically the physics of nonrelativistically moving point particles interacting by long-range forces. I survey the problems this causes and make some suggestions for how a more physically realistic perspective ought to influence the metaphysics of quantum mechanics.
I give a fairly systematic and thorough presentation of the case for regarding black holes as thermodynamic systems in the fullest sense, aimed at students and non-specialists and not presuming advanced knowledge of quantum gravity. I pay particular attention to the availability in classical black hole thermodynamics of a well-defined notion of adiabatic intervention; the power of the membrane paradigm to make black hole thermodynamics precise and to extend it to local-equilibrium contexts; the central role of Hawking radiation in permitting black holes to be in thermal contact with one another; the wide range of routes by which Hawking radiation can be derived and its back-reaction on the black hole calculated; the interpretation of Hawking radiation close to the black hole as a gravitationally bound thermal atmosphere. In an appendix I discuss recent criticisms of black hole thermodynamics by Dougherty and Callender. This paper confines its attention to the thermodynamics of black holes; a sequel will consider their statistical mechanics.
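For orientation, the central quantitative results being interpreted (standard formulas, stated here for reference rather than derived) are the Bekenstein-Hawking entropy and Hawking temperature of a Schwarzschild black hole of horizon area $A$ and mass $M$:

$$S_{BH} = \frac{k c^3 A}{4 G \hbar}, \qquad T_H = \frac{\hbar c^3}{8\pi G M k}.$$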
Following Lewis, it is widely held that branching worlds differ in important ways from diverging worlds. There is, however, a simple and natural semantics under which ordinary sentences uttered in branching worlds have much the same truth values as they conventionally have in diverging worlds. Under this semantics, whether branching or diverging, speakers cannot say in advance which branch or world is theirs. They are uncertain as to the outcome. This same semantics ensures the truth of utterances typically made about quantum mechanical contingencies, including statements of uncertainty, if the Everett interpretation of quantum mechanics is true. The 'incoherence problem' of the Everett interpretation, that it can give no meaning to the notion of uncertainty, is thereby solved. Contents: Introduction; Metaphysics; Personal fission; Branching worlds; Physics; Objections.
'Quantum theory' is not a single physical theory but a framework in which many different concrete theories fit. As such, a solution to the quantum measurement problem ought to provide a recipe to interpret each such concrete theory, in a mutually consistent way. But with the exception of the Everett interpretation, the main extant solutions either try to make sense of the abstract framework as if it were concrete, or else interpret one particular quantum theory under the fiction that it is fundamental and exact. In either case, these approaches are unable to help themselves to the very theory-laden, level-relative ways in which quantum theory makes contact with experiment in mainstream physics, and so are committed to major revisionary projects which have not been carried out even in outline. As such, only the Everett interpretation is currently suited to make sense of quantum physics as we find it.
This is a discussion of how we can understand the world-view given to us by the Everett interpretation of quantum mechanics, and in particular the role played by the concept of 'world'. The view presented is that we are entitled to use 'many-worlds' terminology even if the theory does not specify the worlds in the formalism; this is defended by means of an extensive analogy with the concept of an 'instant' or moment of time in relativity, with the lack of a preferred foliation of spacetime being compared with the lack of a preferred basis in quantum theory. Implications for identity of worlds over time, and for relativistic quantum mechanics, are discussed.
I provide a fairly systematic analysis of when quantities that are variant under a dynamical symmetry transformation should be regarded as unobservable, or redundant, or unreal; of when models related by a dynamical symmetry transformation represent the same state of affairs; and of when mathematical structure that is variant under a dynamical symmetry transformation should be regarded as surplus. In most of these cases the answer is 'it depends': depends, that is, on the details of the symmetry in question. A central feature of the analysis is that in order to draw any of these conclusions for a dynamical symmetry it needs to be understood in terms of its possible extensions to other physical systems, in particular to measurement devices.
An analysis is made of Deutsch's recent claim to have derived the Born rule from decision-theoretic assumptions. It is argued that Deutsch's proof must be understood in the explicit context of the Everett interpretation, and that in this context, it essentially succeeds. Some comments are made about the criticism of Deutsch's proof by Barnum, Caves, Finkelstein, Fuchs, and Schack; it is argued that the flaw which they point out in the proof does not apply if the Everett interpretation is assumed.
I consider exactly what is involved in a solution to the probability problem of the Everett interpretation, in the light of recent work on applying considerations from decision theory to that problem. I suggest an overall framework for understanding probability in a physical theory, and conclude that this framework, when applied to the Everett interpretation, yields the result that that interpretation satisfactorily solves the measurement problem. Contents: 1 Introduction; 2 What is probability? 2.1 Objective probability and the Principal Principle; 2.2 Three ways of satisfying the functional definition; 2.3 Cautious functionalism; 2.4 Is the functional definition complete? 3 The Everett interpretation and subjective uncertainty; 3.1 Interpreting quantum mechanics; 3.2 The need for subjective uncertainty; 3.3 Saunders' argument for subjective uncertainty; 3.4 Objections to Saunders' argument; 3.5 Subjective uncertainty again: arguments from interpretative charity; 3.6 Quantum weights and the functional definition of probability; 4 Rejecting subjective uncertainty; 4.1 The fission program; 4.2 Against the fission program; 5 Justifying the axioms of decision theory; 5.1 The primitive status of the decision-theoretic axioms; 5.2 Holistic scepticism; 5.3 The role of an explanation of decision theory; 6 Conclusion.
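Lewis's Principal Principle, referred to in the contents above, is standardly stated (my notation) as the requirement that rational initial credence defer to objective chance:

$$Cr(A \mid \mathrm{ch}(A) = x \;\wedge\; E) = x$$

for any admissible evidence $E$; the Everettian analogue discussed in this literature replaces objective chance with quantum branch weight.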
I attempt to get as clear as possible on the chain of reasoning by which irreversible macrodynamics is derivable from time-reversible microphysics, and in particular to clarify just what kinds of assumptions about the initial state of the universe, and about the nature of the microdynamics, are needed in these derivations. I conclude that while a “Past Hypothesis” about the early Universe does seem necessary to carry out such derivations, that Hypothesis is not correctly understood as a constraint on the early Universe’s entropy.
I present in detail the case for regarding black hole thermodynamics as having a statistical-mechanical explanation in exact parallel with the statistical-mechanical explanation believed to underlie the thermodynamics of other systems. I focus on three lines of argument: zero-loop and one-loop calculations in quantum general relativity, understood as a quantum field theory, using the path-integral formalism; calculations in string theory of the leading-order terms, higher-derivative corrections, and quantum corrections in the black hole entropy formula for extremal and near-extremal black holes; and recovery of the qualitative and quantitative structure of black hole statistical mechanics via the AdS/CFT correspondence. In each case I briefly review the content of, and arguments for, the form of quantum gravity being used, at an introductory level: the paper is aimed at students and non-specialists and does not presume advanced knowledge of quantum gravity. My conclusion is that the evidence for black hole statistical mechanics is as solid as we could reasonably expect it to be in the absence of a directly empirically verified theory of quantum gravity.
An examination is made of the way in which particles emerge from linear, bosonic, massive quantum field theories. Two different constructions of the one-particle subspace of such theories are given, both illustrating the importance of the interplay between the quantum-mechanical linear structure and the classical one. Some comments are made on the Newton-Wigner representation of one-particle states, and on the relationship between the approach of this paper and those of Segal, and of Haag and Ruelle.
The quantum theory of de Broglie and Bohm solves the measurement problem, but the hypothetical corpuscles play no role in the argument. The solution finds a more natural home in the Everett interpretation.
NGC 1300 (shown in figure 1) is a spiral galaxy 65 million light years from Earth. We have never been there, and (although I would love to be wrong about this) we will never go there; all we will ever know about NGC 1300 is what we can see of it from sixty-five million light years away, and what we can infer from our best physics. Fortunately, “what we can infer from our best physics” is actually quite a lot. To take a particular example: our best theory of galaxies tells us that that hazy glow is actually made up of the light of hundreds of billions of stars; our best theories of planetary formation tell us that a sizable fraction of those stars…
I make the case that the Universe according to unitary quantum theory has a branching structure, and so can literally be regarded as a "many-worlds" theory. These worlds are not part of the fundamental ontology of quantum theory; instead, they are to be understood as structures, or patterns, emergent from the underlying theory through the dynamical process of decoherence. That they are structures in this sense does not mean that they are in any way unreal: indeed, pretty much all higher-level ontology in science, from tables to phonons to tigers, is likewise emergent. Unitary quantum theory is therefore a "many-worlds" theory without any modification of the mathematical structure of the theory: the Everett interpretation does not consist in adding worlds to the formalism, but in realising that they are there already. Our grounds for accepting the reality of those worlds are no more, but no less, than our grounds for accepting any other not-directly-observable consequence of an empirically very successful theory.
In discussions of the foundations of statistical mechanics, it is widely held that the Gibbsian and Boltzmannian approaches are incompatible but empirically equivalent; the Gibbsian approach may be calculationally preferable, but only the Boltzmannian approach is conceptually satisfactory. I argue against both assumptions. Gibbsian statistical mechanics is applicable to a wide variety of problems and systems, such as the calculation of transport coefficients and the statistical mechanics and thermodynamics of mesoscopic systems, in which the Boltzmannian approach is inapplicable. And the supposed conceptual problems with the Gibbsian approach are either misconceived, or apply only to certain versions of the Gibbsian approach, or apply with equal force to both approaches. I conclude that Boltzmannian statistical mechanics is best seen as a special case of, and not an alternative to, Gibbsian statistical mechanics.
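For reference, the two entropy concepts at issue (standard definitions, not quoted from the paper): the Gibbs entropy of a phase-space distribution $\rho$, and the Boltzmann entropy of a macrostate $M$ occupying phase-space region $\Gamma_M$,

$$S_G[\rho] = -k \int \rho \ln \rho \, d\Gamma, \qquad S_B(M) = k \ln \mathrm{vol}(\Gamma_M).$$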
I point out a radical indeterminism in potential-based formulations of Newtonian gravity once we drop the condition that the potential vanishes at infinity. This indeterminism, which is well known in theoretical cosmology but has received little attention in foundational discussions, can be removed only by specifying boundary conditions at all instants of time, which undermines the theory's claim to be fully cosmological, i.e., to apply to the Universe as a whole. A recent alternative formulation of Newtonian gravity due to Saunders provides a conceptually satisfactory cosmology but fails to reproduce the Newtonian limit of general relativity in homogeneous but anisotropic universes. I conclude that Newtonian gravity lacks a fully satisfactory cosmological formulation.
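The indeterminism can be illustrated schematically (my gloss, under the standard potential formulation): the field equation

$$\nabla^2 \varphi = 4\pi G \rho$$

determines $\varphi$ from $\rho$ only up to a harmonic function, e.g. $\varphi \mapsto \varphi + \mathbf{a}(t)\cdot\mathbf{x}$. Without the boundary condition that $\varphi$ vanishes at infinity, the same mass distribution is therefore compatible with an arbitrary additional uniform time-dependent force $-\mathbf{a}(t)$ on every body, and trajectories are underdetermined.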
I give an introduction to the conceptual structure of quantum field theory as it is used in mainstream theoretical physics today, aimed at non-specialists. The article focuses on two themes: the common structure of quantum field theory as it is applied in solid-state physics and as it is applied in high-energy physics; and the modern theory of renormalisation.
An investigation is made into how the foundations of statistical mechanics are affected once we treat classical mechanics as an approximation to quantum mechanics in certain domains rather than as a theory in its own right; this is necessary if we are to understand statistical-mechanical systems in our own world. Relevant structural and dynamical differences are identified between classical and quantum mechanics (partly through analysis of technical work on quantum chaos by other authors). These imply that quantum mechanics significantly affects a number of foundational questions, including the nature of statistical probability and the direction of time.
I develop an account of naturalness in physics which demonstrates that naturalness assumptions are not restricted to narrow cases in high-energy physics but are a ubiquitous part of how interlevel relations are derived in physics. After exploring how and to what extent we might justify such assumptions on methodological grounds or through appeal to speculative future physics, I consider the apparent failure of naturalness in cosmology and in the Standard Model. I argue that any such naturalness failure threatens to undermine the entire structure of our understanding of intertheoretic reduction, and so risks a much larger crisis in physics than is sometimes suggested; I briefly review some currently-popular strategies that might avoid that crisis.
I discuss the statistical mechanics of gravitating systems and in particular its cosmological implications, and argue that many conventional views on this subject in the foundations of statistical mechanics embody significant confusion; I attempt to provide a clearer and more accurate account. In particular, I observe that (i) the role of gravity in entropy calculations must be distinguished from the entropy of gravity, that (ii) although gravitational collapse is entropy-increasing, this is not usually because the collapsing matter itself increases in entropy, and that (iii) the Second Law of thermodynamics does not owe its validity to the statistical mechanics of gravitational collapse.
Decoherence is widely felt to have something to do with the quantum measurement problem, but getting clear on just what is made difficult by the fact that the "measurement problem", as traditionally presented in foundational and philosophical discussions, has become somewhat disconnected from the conceptual problems posed by real physics. This, in turn, is because quantum mechanics as discussed in textbooks and in foundational discussions has become somewhat removed from scientific practice, especially where the analysis of measurement is concerned. This paper has two goals: firstly, to present an account of how quantum measurements are actually dealt with in modern physics and to state the measurement problem from the perspective of that account; and secondly, to clarify what role decoherence plays in modern measurement theory and what effect it has on the various strategies that have been proposed to solve the measurement problem.
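As a schematic illustration of the decoherence mechanism at issue (a toy model, not the paper's own example): coupling a two-state superposition $\alpha|0\rangle + \beta|1\rangle$ to an environment drives the system's reduced density matrix, in the pointer basis, toward diagonal form,

$$\rho(t) = \begin{pmatrix} |\alpha|^2 & \alpha\beta^{*}\,e^{-t/\tau_d} \\ \alpha^{*}\beta\,e^{-t/\tau_d} & |\beta|^2 \end{pmatrix},$$

with the interference terms suppressed on the decoherence timescale $\tau_d$.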
I argue that wavefunction realism (the view that quantum mechanics reveals the fundamental ontology of the world to be a field on a high-dimensional spacetime) must be rejected as relying on artefacts of too-simple versions of quantum mechanics, and as not conceptually well-motivated even were those too-simple versions exactly correct. I end with some brief comments on the role of spacetime in any satisfactory account of the metaphysics of extant quantum theories.
This is a preliminary version of an article to appear in the forthcoming Ashgate Companion to the New Philosophy of Physics. In it, I aim to review, in a way accessible to foundationally interested physicists as well as physics-informed philosophers, just where we have got to in the quest for a solution to the measurement problem. I don't advocate any particular approach to the measurement problem (not here, at any rate!) but I do focus on the importance of decoherence theory to modern attempts to solve the measurement problem, and I am fairly sharply critical of some aspects of the "traditional" formulation of the problem.
QFT, Antimatter, and Symmetry. David Wallace - 2009 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 40 (3): 209-222.
A systematic analysis is made of the relations between the symmetries of a classical field and the symmetries of the one-particle quantum system that results from quantizing that field in regimes where interactions are weak. The results are applied to gain a greater insight into the phenomenon of antimatter.
Quantum theory plays an increasingly significant role in contemporary early-universe cosmology, most notably in the inflationary origins of the fluctuation spectrum of the microwave background radiation. I consider the two main strategies for interpreting standard quantum mechanics in the light of cosmology. I argue that the conceptual difficulties of the approaches based around an irreducible role for measurement, already very severe, become intolerable in a cosmological context, whereas the approach based around Everett's original idea of treating quantum systems as closed systems handles cosmological quantum theory satisfactorily. Contemporary cosmology, which indeed applies standard quantum theory without supplementation or modification, is thus committed, tacitly or explicitly, to the Everett interpretation.
I distinguish between two versions of the black hole information-loss paradox. The first arises from the apparent failure of unitarity on the spacetime of a completely evaporating black hole, which appears to be non-globally-hyperbolic; this is the most commonly discussed version of the paradox in the foundational and semipopular literature, and the case for calling it 'paradoxical' is less than compelling. But the second arises from a clash between a fully-statistical-mechanical interpretation of black hole evaporation and the quantum-field-theoretic description used in derivations of the Hawking effect. This version of the paradox arises long before a black hole completely evaporates, seems to be the version that has played a central role in quantum gravity, and is genuinely paradoxical. After explicating the paradox, I discuss the implications of more recent work on AdS/CFT duality and on the 'Firewall paradox', and conclude that the paradox is if anything now sharper. The article is written at an introductory level and does not assume advanced knowledge of quantum gravity.
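The unitarity worry in the first version can be put in one line (a standard observation, not quoted from the paper): complete evaporation appears to map a pure state to a thermal mixed state, $|\psi\rangle\langle\psi| \mapsto \rho_{\text{mixed}}$, yet no unitary evolution $\rho \mapsto U\rho U^{\dagger}$ can do this, since unitary maps preserve the purity $\mathrm{Tr}\,\rho^2 = 1$.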
Mathematically, gauge theories are extraordinarily rich: so rich, in fact, that it can become all too easy to lose track of the connections between results, and become lost in a mass of beautiful theorems and properties: indeterminism, constraints, Noether identities, local and global symmetries, and so on.

One purpose of this short article is to provide some sort of a guide through the mathematics, to the conceptual core of what is actually going on. Its focus is on the Lagrangian, variational-problem description of classical mechanics, from which the link between gauge symmetry and the apparent violation of determinism is easy to understand; only towards the end will the Hamiltonian description be considered.

The other purpose is to warn against adopting too unified a perspective on gauge theories. It will be argued that the meaning of the gauge freedom in a theory like general relativity is (at least from the Lagrangian viewpoint) significantly different from its meaning in theories like electromagnetism. The Hamiltonian framework blurs this distinction, and orthodox methods of quantization obliterate it; this may, in fact, be genuine progress, but it is dangerous to be guided by mathematics into conflating two conceptually distinct notions without appreciating the physical consequences.
An extended analysis is given of the program, originally suggested by Deutsch, of solving the probability problem in the Everett interpretation by means of decision theory. Deutsch's own proof is discussed, and alternatives are presented which are based upon different decision theories and upon Gleason's Theorem. It is argued that decision theory gives Everettians most or all of what they need from 'probability'. Contact is made with Lewis's Principal Principle linking subjective credence with objective chance: an Everettian Principal Principle is formulated, and shown to be at least as defensible as the usual Principle. Some consequences of (Everettian) quantum mechanics for decision theory itself are also discussed.

[NB: this long (70 pages) and occasionally rambling online-only paper has been almost entirely superseded by material in subsequent publications; if something is not included in them it usually means that I have had second thoughts. I include it for completeness only.]
In this article, I briefly explain the quantum measurement problem and the Everett interpretation, in a way that is faithful to modern physics and yet accessible to readers without any physics training. I then consider the metaphysical lessons for ontology from quantum mechanics under the Everett interpretation. My conclusions are largely negative: I argue that very little can be said in full generality about the ontology of quantum mechanics, because quantum mechanics, like abstract classical mechanics, is a framework within which we can consider different physical theories which have very little in common at the level of ontology. Along the way I discuss, and criticise, several positive ontological proposals that have been made in the context of the Everett interpretation: ontologies based on the so-called "eigenstate-eigenvalue link", ontologies based on taking the "many-worlds" language seriously at the fundamental level, and ontologies that treat the wavefunction as a complex field on a high-dimensional space.
I provide a self-contained introduction to the problem of the arrow of time in physics, concentrating on the irreversibility of dynamical processes as described in statistical mechanics.
I investigate the consequences for semantics, and in particular for the semantics of tense, if time is assumed to have a branching structure not out of metaphysical necessity (to solve some philosophical problem) but just as a contingent physical fact, as is suggested by a currently-popular approach to the interpretation of quantum mechanics.
It seems to be widely assumed that the only effect of the Ghirardi-Rimini-Weber dynamical collapse mechanism on the 'tails' of the wavefunction is to reduce their weight. In consequence it seems to be generally accepted that the tails behave exactly as do the various branches in the Everett interpretation except for their much lower weight. These assumptions are demonstrably inaccurate: the collapse mechanism has substantial and detectable effects within the tails. The relevance of this misconception for the dynamical-collapse theories is debatable, though.
I explore the reduction of thermodynamics to statistical mechanics by treating the former as a control theory: a theory of which transitions between states can be induced on a system by means of operations from a fixed list. I recover the results of standard thermodynamics in this framework on the assumption that the available operations do not include measurements which affect subsequent choices of operations. I then relax this assumption and use the framework to consider the vexed questions of Maxwell's demon and Landauer's principle. Throughout I assume rather than prove the basic irreversibility features of statistical mechanics, taking care to distinguish them from the conceptually distinct assumptions of thermodynamics proper.
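Landauer's principle, referred to above, is standardly stated (for orientation, not quoted from the paper) as the claim that erasing one bit of information in an environment at temperature $T$ dissipates at least

$$W \geq kT\ln 2$$

of work as heat; this bound is the usual basis for exorcisms of Maxwell's demon.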