We show that three fundamental information-theoretic constraints -- the impossibility of superluminal information transfer between two physical systems by performing measurements on one of them, the impossibility of broadcasting the information contained in an unknown physical state, and the impossibility of unconditionally secure bit commitment -- suffice to entail that the observables and state space of a physical theory are quantum-mechanical. We demonstrate the converse derivation in part, and consider the implications of alternative answers to a remaining open question about nonlocality and bit commitment.
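The no-broadcasting constraint invoked above can be illustrated by the elementary no-cloning obstruction. The following sketch is my own illustration, not part of the original argument; it assumes only numpy and the standard CNOT gate. A device that copies the basis states |0> and |1> into a blank register fails, by linearity, to copy a superposition:

```python
import numpy as np

# CNOT gate: flips the second (target) qubit when the first (control) is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)

# Basis states ARE copied correctly into the blank |0> register:
assert np.allclose(CNOT @ np.kron(ket0, ket0), np.kron(ket0, ket0))
assert np.allclose(CNOT @ np.kron(ket1, ket0), np.kron(ket1, ket1))

# ...but by linearity the superposition |+> is NOT copied: the output is
# the entangled state (|00> + |11>)/sqrt(2), not the product |+>|+>.
out = CNOT @ np.kron(plus, ket0)
assert not np.allclose(out, np.kron(plus, plus))
```

The same linearity argument underlies the general no-broadcasting theorem for mixed states.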
I argue that quantum mechanics is fundamentally a theory about the representation and manipulation of information, not a theory about the mechanics of nonclassical waves or particles. The notion of quantum information is to be understood as a new physical primitive—just as, following Einstein’s special theory of relativity, a field is no longer regarded as the physical manifestation of vibrations in a mechanical medium, but recognized as a new physical primitive in its own right.
We argue that the intractable part of the measurement problem -- the 'big' measurement problem -- is a pseudo-problem that depends for its legitimacy on the acceptance of two dogmas. The first dogma is John Bell's assertion that measurement should never be introduced as a primitive process in a fundamental mechanical theory like classical or quantum mechanics, but should always be open to a complete analysis, in principle, of how the individual outcomes come about dynamically. The second dogma is the view that the quantum state has an ontological significance analogous to the significance of the classical state as the 'truthmaker' for propositions about the occurrence and non-occurrence of events, i.e., that the quantum state is a representation of physical reality. We show how both dogmas can be rejected in a realist information-theoretic interpretation of quantum mechanics as an alternative to the Everett interpretation. The Everettian, too, regards the 'big' measurement problem as a pseudo-problem, because the Everettian rejects the assumption that measurements have definite outcomes, in the sense that one particular outcome, as opposed to other possible outcomes, actually occurs in a quantum measurement process. By contrast with the Everettians, we accept that measurements have definite outcomes. By contrast with the Bohmians and the GRW 'collapse' theorists who add structure to the theory and propose dynamical solutions to the 'big' measurement problem, we take the problem to arise from the failure to see the significance of Hilbert space as a new kinematic framework for the physics of an indeterministic universe, in the sense that Hilbert space imposes kinematic objective probabilistic constraints on correlations between events.
We prove a uniqueness theorem showing that, subject to certain natural constraints, all 'no collapse' interpretations of quantum mechanics can be uniquely characterized and reduced to the choice of a particular preferred observable as determinate (definite, sharp). We show how certain versions of the modal interpretation, Bohm's 'causal' interpretation, Bohr's complementarity interpretation, and the orthodox (Dirac-von Neumann) interpretation without the projection postulate can be recovered from the theorem. Bohr's complementarity and Einstein's realism appear as two quite different proposals for selecting the preferred determinate observable--either settled pragmatically by what we choose to observe, or fixed once and for all, as the Einsteinian realist would require, in which case the preferred observable is a 'beable' in Bell's sense, as in Bohm's interpretation (where the preferred observable is position in configuration space).
John von Neumann (1903-1957) was undoubtedly one of the scientific geniuses of the 20th century. The main fields to which he contributed include various disciplines of pure and applied mathematics, mathematical and theoretical physics, logic, theoretical computer science, and computer architecture. Von Neumann was also actively involved in politics and science management and he had a major impact on US government decisions during, and especially after, the Second World War. There exist several popular books on his personality and various collections focusing on his achievements in mathematics, computer science, and economics. Strangely enough, to date no detailed appraisal of his seminal contributions to the mathematical foundations of quantum physics has appeared. Von Neumann's theory of measurement and his critique of hidden variables became the touchstone of most debates in the foundations of quantum mechanics. Today, his name also figures most prominently in the mathematically rigorous branches of contemporary quantum mechanics of large systems and quantum field theory. And finally - as one of his last lectures, published in this volume for the first time, shows - he considered the relation of quantum logic and quantum mechanical probability as his most important problem for the second half of the twentieth century. The present volume embraces both historical and systematic analyses of his methodology of mathematical physics, and of the various aspects of his work in the foundations of quantum physics, such as theory of measurement, quantum logic, and quantum mechanical entropy. The volume is rounded off by previously unpublished letters and lectures documenting von Neumann's thinking about quantum theory after his 1932 Mathematical Foundations of Quantum Mechanics. The general part of the Yearbook contains papers emerging from the Institute's annual lecture series and reviews of important publications of philosophy of science and its history.
Since the analysis by John Bell in 1965, the consensus in the literature is that von Neumann’s ‘no hidden variables’ proof fails to exclude any significant class of hidden variables. Bell raised the question whether it could be shown that any hidden variable theory would have to be nonlocal, and in this sense ‘like Bohm’s theory.’ His seminal result provides a positive answer to the question. I argue that Bell’s analysis misconstrues von Neumann’s argument. What von Neumann proved was the impossibility of recovering the quantum probabilities from a hidden variable theory of dispersion free (deterministic) states in which the quantum observables are represented as the ‘beables’ of the theory, to use Bell’s term. That is, the quantum probabilities could not reflect the distribution of pre-measurement values of beables, but would have to be derived in some other way, e.g., as in Bohm’s theory, where the probabilities are an artefact of a dynamical process that is not in fact a measurement of any beable of the system.
It is generally accepted, following Landauer and Bennett, that the process of measurement involves no minimum entropy cost, but the erasure of information in resetting the memory register of a computer to zero requires dissipating heat into the environment. This thesis has been challenged recently in a two-part article by Earman and Norton. I review some relevant observations in the thermodynamics of computation and argue that Earman and Norton are mistaken: there is in principle no entropy cost to the acquisition of information, but the destruction of information does involve an irreducible entropy cost.
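Landauer's principle, as discussed above, puts the minimum dissipation for erasure at kT ln 2 per bit. A minimal illustrative calculation (the function name is my own):

```python
import math

def landauer_heat(bits, temperature_kelvin):
    """Minimum heat (in joules) dissipated into the environment when erasing
    `bits` bits at temperature T, per Landauer's principle: Q >= N * k_B * T * ln 2."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
    return bits * k_B * temperature_kelvin * math.log(2)

# Erasing a single bit at room temperature (300 K) costs roughly 2.87e-21 J:
q = landauer_heat(1, 300.0)
```

The number is tiny on everyday scales, which is why the bound matters only in principle (and for computation near the thermodynamic limit), exactly the level at which the Earman-Norton debate is conducted.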
I show how quantum mechanics, like the theory of relativity, can be understood as a 'principle theory' in Einstein's sense, and I use this notion to explore the approach to the problem of interpretation developed in my book Interpreting the Quantum World.
In 2018, Daniela Frauchiger and Renato Renner published an article in Nature Communications entitled ‘Quantum theory cannot consistently describe the use of itself.’ The argument has been attacked as flawed from a variety of interpretational perspectives. I clarify the significance of the result as a sequence of actions and inferences by agents modeled as quantum systems evolving unitarily at all times. At no point does the argument appeal to a ‘collapse’ of the quantum state following a measurement.
I show that the quantum state ω can be interpreted as defining a probability measure on a subalgebra of the algebra of projection operators that is not fixed (as in classical statistical mechanics) but changes with ω and appropriate boundary conditions, hence with the dynamics of the theory. This subalgebra, while not embeddable into a Boolean algebra, will always admit two-valued homomorphisms, which correspond to the different possible ways in which a set of “determinate” quantities (selected by ω and the boundary conditions) can have values. The probabilities defined by ω (via the Born rule) are probabilities over these two-valued homomorphisms or value assignments. So any universe of interacting systems, including those functioning as measuring instruments, can be modelled quantum mechanically without the projection postulate.
The aim of cognitive neuropsychology is to articulate the functional architecture underlying normal cognition, on the basis of cognitive performance data involving brain-damaged subjects. Throughout the history of the subject, questions have been raised as to whether the methods of neuropsychology are adequate to its goals. The question has been reopened by Glymour, who formulates a discovery problem for cognitive neuropsychology, in the sense of formal learning theory, concerning the existence of a reliable methodology. It appears that the discovery problem may be insoluble in principle. I propose a modified formulation of Glymour's discovery problem and argue that a sceptical conclusion about the possibility of cognitive neuropsychology as an empirical science is not warranted.
A solution to the measurement problem of quantum mechanics is proposed within the framework of an interpretation according to which only quantum systems with an infinite number of degrees of freedom have determinate properties, i.e., determinate values for (some) observables of the theory. The important feature of the infinite case is the existence of many inequivalent irreducible Hilbert space representations of the algebra of observables, which leads, in effect, to a restriction on the superposition principle, and hence the possibility of defining (macro-) observables which commute with every observable. Such observables have determinate values which are not subject to quantum interference effects. A measurement process is schematized as an interaction between a microsystem and a macrosystem, idealized as an infinite quantum system, and it is shown that there exists a unitary transformation which transforms the initial pure state of the composite system in a finite time (the duration of the interaction) into the required mixture of disjoint states.
Unconditionally secure two-party bit commitment based solely on the principles of quantum mechanics (without exploiting special relativistic signalling constraints, or principles of general relativity or thermodynamics) has been shown to be impossible, but the claim is repeatedly challenged. The quantum bit commitment theorem is reviewed here and the central conceptual point, that an “Einstein–Podolsky–Rosen” attack or cheating strategy can always be applied, is clarified. The question of whether following such a cheating strategy can ever be disadvantageous to the cheater is considered and answered in the negative. There is, indeed, no loophole in the theorem.
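The "Einstein-Podolsky-Rosen" cheating strategy at the heart of the theorem can be conveyed in a toy protocol (an illustrative sketch of my own, not the general proof). Suppose the honest protocol has Alice commit to 0 by sending Bob |0> or |1> with equal probability, and to 1 by sending |+> or |-> with equal probability. A cheating Alice instead keeps a purification of the relevant mixture; since Bob's reduced state is the same either way, a purely local unitary on Alice's side steers one commitment into the other:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I2 = np.eye(2)

# Purifications Alice keeps: first tensor factor is Alice's qubit, second is Bob's.
phi0 = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)          # "commit 0"
phi1 = (np.kron(ket0, H @ ket0) + np.kron(ket1, H @ ket1)) / np.sqrt(2)  # "commit 1"

def bob_state(psi):
    """Reduced density operator on Bob's qubit (partial trace over Alice)."""
    m = psi.reshape(2, 2)   # row index: Alice's basis, column index: Bob's
    return m.T @ m.conj()

# Bob's state carries no information about the commitment (both equal I/2):
assert np.allclose(bob_state(phi0), bob_state(phi1))

# Alice can flip her commitment at the last moment with a LOCAL unitary:
assert np.allclose(np.kron(H, I2) @ phi0, phi1)
```

The general theorem shows that whenever the commitment is concealing in this sense, such a local steering unitary always exists, so the protocol cannot also be binding.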
We present an exegesis of the Einstein-Podolsky-Rosen argument for the incompleteness of quantum mechanics, and defend it against the critique in Fine. We contend, contra Fine, that it compares favorably with an argument reconstructed by him from a letter by Einstein to Schrödinger; and also with one given by Einstein in a letter to Popper. All three arguments turn on a dubious assumption of “separability,” which accords separate elements of reality to space-like separated systems. We discuss how this assumption figures in the literature spawned by the Bell inequalities.
We define a family of ‘no signaling’ bipartite boxes with arbitrary inputs and binary outputs, and with a range of marginal probabilities. The defining correlations are motivated by the Klyachko version of the Kochen-Specker theorem, so we call these boxes Kochen-Specker-Klyachko boxes or, briefly, KS-boxes. The marginals cover a variety of cases, from those that can be simulated classically to the superquantum correlations that saturate the Clauser-Horne-Shimony-Holt inequality, when the KS-box is a generalized PR-box (hence a vertex of the ‘no signaling’ polytope). We show that for certain marginal probabilities a KS-box is classical with respect to nonlocality as measured by the Clauser-Horne-Shimony-Holt correlation, i.e., no better than shared randomness as a resource in simulating a PR-box, even though such KS-boxes cannot be perfectly simulated by classical or quantum resources for all inputs. We comment on the significance of these results for contextuality and nonlocality in ‘no signaling’ theories.
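For orientation, the three CHSH benchmarks in play -- the classical (shared-randomness) bound 2, the quantum Tsirelson bound 2√2, and the PR-box value 4 -- can be checked directly. An illustrative computation (the measurement angles are the standard optimal CHSH settings for the singlet state):

```python
import numpy as np

def chsh(E):
    """CHSH combination S = E(0,0) + E(0,1) + E(1,0) - E(1,1)
    for a correlation function E(x, y) with inputs x, y in {0, 1}."""
    return E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)

# PR-box: outputs satisfy a XOR b = x AND y with uniform marginals,
# so E(x, y) = +1 except E(1, 1) = -1, saturating the no-signaling maximum.
pr_box = lambda x, y: -1.0 if (x == 1 and y == 1) else 1.0

# Singlet-state strategy: E(x, y) = -cos(a_x - b_y) with the optimal angles.
a = [0.0, np.pi / 2]           # Alice's settings
b = [np.pi / 4, -np.pi / 4]    # Bob's settings
singlet = lambda x, y: -np.cos(a[x] - b[y])

S_pr = chsh(pr_box)            # 4.0: the no-signaling (PR-box) value
S_quantum = abs(chsh(singlet)) # 2*sqrt(2): the Tsirelson bound
# Any local hidden variable model (shared randomness alone) obeys |S| <= 2.
```

These are the reference points against which the nonlocality of the KS-boxes is measured.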
Friedman and Putnam have argued (Friedman and Putnam 1978) that the quantum logical interpretation of quantum mechanics gives us an explanation of interference that the Copenhagen interpretation cannot supply without invoking an additional ad hoc principle, the projection postulate. I show that it is possible to define a notion of equivalence of experimental arrangements relative to a pure state φ, or (correspondingly) equivalence of Boolean subalgebras in the partial Boolean algebra of projection operators of a system, which plays a role in the Copenhagen explanation of interference analogous to the role played by the material equivalence, given φ, of certain propositions in the Friedman-Putnam quantum logical analysis. I also show that the quantum logical interpretation and the Copenhagen interpretation are equally capable of avoiding the paradoxical conclusion of the Einstein-Podolsky-Rosen argument (Einstein, Podolsky, and Rosen 1935). Thus, neither interference phenomena nor the correlations between separated systems provide a test case for distinguishing between the relative acceptability of the Copenhagen interpretation and the quantum logical interpretation as explanations of quantum effects.
J. S. Bell's argument that only “nonlocal” hidden variable theories can reproduce the quantum statistical correlations of the singlet spin state in the case of two separated spin-1/2 particles is examined in terms of Wigner's formulation. It is shown that a similar argument applies to a single spin-1/2 particle, and that the exclusion of hidden variables depends on an obviously untenable assumption concerning conditional probabilities. The problem of completeness is discussed briefly, and the grounds for rejecting a phase-space reconstruction of the quantum statistics are clarified.
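Wigner's formulation turns on the singlet probability P(a+, b+) = (1/2) sin²(θ/2) for 'up' outcomes along axes separated by angle θ. A short illustrative check (the 60-degree coplanar axes are a standard choice, not specific to this paper) shows the quantum statistics violating the hidden-variable bound:

```python
import math

def p_both_up(theta_deg):
    """Singlet-state probability that both spin measurements give 'up'
    along axes separated by theta degrees: P = (1/2) * sin^2(theta / 2)."""
    return 0.5 * math.sin(math.radians(theta_deg) / 2) ** 2

# Wigner's form of Bell's inequality for three coplanar axes a, c, b:
#   P(a+, b+) <= P(a+, c+) + P(c+, b+)   in any local hidden variable model.
# Take the a-c and c-b angles to be 60 degrees, so the a-b angle is 120 degrees:
lhs = p_both_up(120)                   # 0.375
rhs = p_both_up(60) + p_both_up(60)    # 0.25
violated = lhs > rhs                   # True: quantum statistics violate the bound
```

The violation (0.375 > 0.25) is the probabilistic core of Bell's nonlocality result in Wigner's version.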
I define sublattices of quantum propositions that can be taken as having determinate (but perhaps unknown) truth values for a given quantum state, in the sense that sufficiently many two-valued maps satisfying a Boolean homomorphism condition exist on each determinate sublattice to generate a Kolmogorov probability space for the probabilities defined by the state. I show that these sublattices are maximal, subject to certain constraints, from which it follows easily that they are unique. I discuss the relevance of this result for the measurement problem, relating it to an early proposal by Jauch and Piron for defining a new notion of state for quantum systems, to a recent uniqueness proof by Clifton for the sublattice of propositions specified as determinate by modal interpretations of quantum mechanics that exploit the polar decomposition theorem, and to my own previous suggestions for interpreting quantum mechanics without the projection postulate.
I present a new 33-ray proof of the Kochen and Specker “no-go” hidden variable theorem in ℋ3, based on a classical tautology that corresponds to a contingent quantum proposition in ℋ3 proposed by Kurt Schütte in an unpublished letter to Specker in 1965. I discuss the relation of this proof to a 31-ray proof by Conway and Kochen, and to a 33-ray proof by Peres.
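The flavor of such "no-go" arguments can be conveyed numerically by the compact Peres-Mermin square in four dimensions -- not the 33-ray construction discussed above, but a related state-independent proof that no noncontextual assignment of ±1 values to the observables is possible:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]])
kron = np.kron

# Nine two-qubit observables, each with eigenvalues +/-1; the three in each
# row and each column mutually commute.
square = [
    [kron(Z, I2), kron(I2, Z), kron(Z, Z)],
    [kron(I2, X), kron(X, I2), kron(X, X)],
    [kron(Z, X),  kron(X, Z),  kron(Y, Y)],
]

row_prods = [r[0] @ r[1] @ r[2] for r in square]
col_prods = [square[0][j] @ square[1][j] @ square[2][j] for j in range(3)]

I4 = np.eye(4)
assert all(np.allclose(p, I4) for p in row_prods)     # each row multiplies to +I
assert np.allclose(col_prods[0], I4)
assert np.allclose(col_prods[1], I4)
assert np.allclose(col_prods[2], -I4)                 # last column multiplies to -I

# A noncontextual +/-1 value assignment would make the product of all nine
# values equal to +1 (multiplying by rows) and -1 (multiplying by columns):
# a contradiction, so no such assignment exists.
```

The 33-ray and 31-ray proofs achieve the analogous contradiction with projectors in three dimensions rather than products of observables in four.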
A quantum algorithm succeeds not because the superposition principle allows ‘the computation of all values of a function at once’ via ‘quantum parallelism’, but rather because the structure of a quantum state space allows new sorts of correlations associated with entanglement, with new possibilities for information‐processing transformations between correlations, that are not possible in a classical state space. I illustrate this with an elementary example of a problem for which a quantum algorithm is more efficient than any classical algorithm. I also introduce the notion of ‘pseudotelepathic’ games and show how the difference between classical and quantum correlations plays a similar role here for games that can be won by quantum players exploiting entanglement, but not by classical players whose only allowed common resource consists of shared strings of random numbers (common causes of the players’ correlated responses in a game). Received October 2008.
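An elementary example of the kind described is Deutsch's problem: deciding with a single oracle query whether f: {0,1} → {0,1} is constant or balanced, which classically requires two evaluations of f. A minimal simulation (my illustration in the spirit of the abstract, not necessarily the paper's own worked example):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def deutsch(f):
    """One-query Deutsch algorithm: returns 'constant' or 'balanced' for
    f: {0,1} -> {0,1}. Basis states are indexed 2*x + y for |x>|y>."""
    # Oracle U_f |x>|y> = |x>|y XOR f(x)>, built as a 4x4 permutation matrix.
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
    psi = np.zeros(4)
    psi[1] = 1.0                          # start in |0>|1>
    psi = np.kron(H, H) @ psi             # Hadamard both qubits
    psi = U @ psi                         # one oracle query
    psi = np.kron(H, np.eye(2)) @ psi     # Hadamard the first qubit
    p_first_is_1 = psi[2] ** 2 + psi[3] ** 2  # first qubit reads f(0) XOR f(1)
    return "balanced" if p_first_is_1 > 0.5 else "constant"

assert deutsch(lambda x: 0) == "constant"
assert deutsch(lambda x: x) == "balanced"
```

The measured qubit ends up encoding the global property f(0) ⊕ f(1) after one query; on the view defended above, the work is done not by 'computing both values at once' but by the correlational structure the quantum state space makes available.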
I formulate the interpretation problem of quantum mechanics as the problem of identifying all possible maximal sublattices of quantum propositions that can be taken as simultaneously determinate, subject to certain constraints that allow the representation of quantum probabilities as measures over truth possibilities in the standard sense, and the representation of measurements in terms of the linear dynamics of the theory. The solution to this problem yields a modal interpretation that I show to be a generalized version of Bohm's hidden variable theory. I argue that unless we alter the dynamics of quantum mechanics, or accept a 'for all practical purposes' solution, this generalized Bohmian mechanics is the unique solution to the problem of interpretation.
It is argued that the measurement problem reduces to the problem of modeling quasi-classical systems in a modified quantum mechanics with superselection rules. A measurement theorem is proved, demonstrating, on the basis of a principle for selecting the quantities of a system that are determinate (i.e., have values) in a given state, that after a suitable interaction between a system S and a quasi-classical system M, essentially only the quantity measured in the interaction and the indicator quantity of M are determinate. The theorem justifies interpreting the noncommutative algebra of observables of a quantum mechanical system as an algebra of “beables,” in Bell's sense.
The properties of classical and quantum systems are characterized by different algebraic structures. We know that the properties of a quantum mechanical system form a partial Boolean algebra not embeddable into a Boolean algebra, and so cannot all be co-determinate. We also know that maximal Boolean subalgebras of properties can be (separately) co-determinate. Are there larger subsets of properties that can be co-determinate without contradiction? Following an analysis of Bohr's response to the Einstein-Podolsky-Rosen objection to the complementarity interpretation of quantum mechanics, a principled argument is developed justifying the selection of particular subsets of properties as co-determinate for a quantum system in particular physical contexts. These subsets are generated by sets of maximal Boolean subalgebras, defined in each case by the relation between the quantum state and a measurement (possibly, but not necessarily, the measurement in terms of which we seek to establish whether or not a particular property of the system in question obtains). If we are required to interpret quantum mechanics in this way, then predication for quantum systems is quite unlike the corresponding notion for classical systems.
Quantum theory is a probabilistic theory that embodies notoriously striking correlations, stronger than any that classical theories allow but not as strong as those of hypothetical ‘super-quantum’ theories. This raises the question ‘Why the quantum?’—whether there is a handful of principles that account for the character of quantum probability. We ask what quantum-logical notions correspond to this investigation. This project isn’t meant to compete with the many beautiful results that information-theoretic approaches have yielded but rather aims to complement that work.