A book on the notion of fundamental length, covering issues in the philosophy of mathematics, metaphysics, and the history and philosophy of modern physics, from classical electrodynamics to current theories of quantum gravity. Published in 2014 by Cambridge University Press.
Combining physics, mathematics and computer science, quantum computing and its sister discipline of quantum information have developed in the past few decades from visionary ideas into two of the most fascinating areas of quantum theory. General interest and excitement in quantum computing was initially triggered by Peter Shor (1994), who showed how a quantum algorithm could exponentially “speed up” classical computation and factor large numbers into primes far more efficiently than any known classical algorithm. Shor’s algorithm was soon followed by several other algorithms aimed at solving combinatorial and algebraic problems, and in the years since, the theoretical study of quantum systems serving as computational devices has achieved tremendous progress. Common belief has it that the implementation of Shor’s algorithm on a large-scale quantum computer would have devastating consequences for current cryptography protocols, which rely on the premise that all known classical worst-case algorithms for factoring take time exponential in the length of their input (see, e.g., Preskill 2005). Consequently, experimentalists around the world are engaged in attempts to tackle the technological difficulties that prevent the realisation of a large-scale quantum computer. But regardless of whether these technological problems can be overcome (Unruh 1995; Ekert and Jozsa 1996; Haroche and Raimond 1996), it is noteworthy that no proof exists yet for the general superiority of quantum computers over their classical counterparts.

The philosophical interest in quantum computing is manifold. From a social-historical perspective, quantum computing is a domain where experimentalists find themselves ahead of their fellow theorists. Indeed, quantum mysteries such as entanglement and nonlocality were historically considered a philosophical quibble, until physicists discovered that these mysteries might be harnessed to devise new efficient algorithms. But while the technology for harnessing the power of 50–100 qubits (the basic units of information in a quantum computer) is now within reach (Preskill 2018), only a handful of quantum algorithms exist, and the question of whether these can truly outperform any conceivable classical alternative is still open. From a more philosophical perspective, advances in quantum computing may yield foundational benefits. For example, it may turn out that the technological capabilities that allow us to isolate quantum systems, by shielding them from the effects of decoherence for a period of time long enough to manipulate them, will also allow us to make progress on some fundamental problems in the foundations of quantum theory itself. Indeed, the development and implementation of efficient quantum algorithms may help us better understand the border between classical and quantum physics (Cuffaro 2017, 2018a; cf. Pitowsky 1994, 100), and perhaps even illuminate fundamental concepts such as measurement and causality. Finally, the idea that abstract mathematical concepts such as computability and complexity may not only be translated into physics, but also rewritten by physics, bears directly on the autonomous character of computer science and the status of its theoretical entities, the so-called “computational kinds”. As such, it is also relevant to the long-standing philosophical debate on the relationship between mathematics and the physical world.
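To make the complexity claim above concrete, here is a minimal, purely illustrative Python sketch of the classical “wrapper” of Shor’s algorithm. The function names are my own, and the quantum part, the period-finding subroutine, is replaced by a brute-force classical search, so the sketch runs in exponential time; it only shows the number-theoretic reduction from factoring to period finding that the quantum speed-up exploits.

```python
from math import gcd
from random import randrange

def find_period_classically(a: int, N: int) -> int:
    """Smallest r > 0 with a**r = 1 (mod N). A quantum computer finds r
    efficiently via the quantum Fourier transform; classically this brute-force
    search is the exponentially hard step."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N: int) -> int:
    """Return a non-trivial factor of an odd composite N (assumed not a prime power)."""
    while True:
        a = randrange(2, N)
        d = gcd(a, N)
        if d > 1:
            return d                 # lucky guess: a already shares a factor with N
        r = find_period_classically(a, N)
        if r % 2 == 1:
            continue                 # need an even period
        y = pow(a, r // 2, N)
        if y == N - 1:
            continue                 # a**(r/2) = -1 (mod N): try another base a
        return gcd(y - 1, N)         # guaranteed non-trivial factor of N

if __name__ == "__main__":
    print(shor_factor(15))           # prints 3 or 5, depending on the random base
```

The point of the sketch is that everything except the period-finding step is classically easy; the cryptographic worry mentioned above rests entirely on the quantum algorithm’s ability to perform that one step efficiently.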
We argue that current constructive approaches to the special theory of relativity do not derive the geometrical Minkowski structure from the dynamics but rather assume it. We further argue that in current physics there can be no dynamical derivation of primitive geometrical notions such as length. In doing so, we believe we are continuing an argument initiated by Einstein.
Recent suggestions to supply quantum mechanics (QM) with realistic foundations by reformulating it in light of quantum information theory (QIT) are examined and found wanting, by pointing to a basic conceptual problem that QIT itself ignores, namely the measurement problem. Since one cannot ignore the measurement problem and at the same time pretend to be a realist, as they stand the suggestions to reformulate QM in light of QIT are nothing but instrumentalism in disguise.
I discuss the philosophical implications that the rising new science of quantum computing may have for the philosophy of computer science. While quantum algorithms leave the notion of Turing computability intact, they may re-describe the abstract space of computational complexity theory and hence militate against the autonomous character of some of the concepts and categories of computer science.
A remarkable theorem by Clifton, Bub and Halvorson (2003) (CBH) characterizes quantum theory in terms of information-theoretic principles. According to Bub (2004, 2005) the philosophical significance of the theorem is that quantum theory should be regarded as a "principle" theory about (quantum) information rather than a "constructive" theory about the dynamics of quantum systems. Here we criticize Bub's principle approach, arguing that if the mathematical formalism of quantum mechanics remains intact then there is no escape route from solving the measurement problem by constructive theories. We further propose a (Wigner-type) thought experiment that, we argue, demonstrates that quantum mechanics on the information-theoretic approach is incomplete.
Loop quantum gravity predicts that spatial geometry is fundamentally discrete. Whether this discreteness entails a departure from exact Lorentz symmetry is a matter of dispute that has generated an interesting methodological dilemma. On the one hand, one would like the theory to agree with current experiments, but, so far, tests at the highest energies we can reach show no sign of such a departure. On the other hand, one would like the theory to yield testable predictions, and deformations of exact Lorentz symmetry in certain yet-to-be-tested regimes may have phenomenological consequences. Here I discuss two arguments that exemplify this dilemma, expose their shortcomings, and compare them to other cases from the history of physics that share their symptoms.
A recent attempt to compute a (recursion‐theoretic) noncomputable function using the quantum adiabatic algorithm is criticized and found wanting. Quantum algorithms may outperform classical algorithms in some cases, but so far they retain the classical (recursion‐theoretic) notion of computability. A speculation is then offered as to where the putative power of quantum computers may come from.
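For readers unfamiliar with the algorithm discussed in this and the related abstracts, the following is the standard textbook schematic of the quantum adiabatic algorithm (a generic sketch, not the specific construction criticized in the paper): one interpolates from an easy initial Hamiltonian H_0 to a problem Hamiltonian H_P whose ground state encodes the solution, and the adiabatic theorem bounds the required runtime by the minimal spectral gap.

```latex
% Standard interpolation used by the quantum adiabatic algorithm (textbook form,
% not the specific proposal criticized above):
\[
  H(s) \;=\; (1-s)\,H_0 \;+\; s\,H_P , \qquad s = t/T \in [0,1].
\]
% The adiabatic theorem guarantees that the system tracks the instantaneous
% ground state provided the total runtime T is large compared with the inverse
% square of the minimal spectral gap along the interpolation:
\[
  T \;\gg\; \frac{\max_{s}\lVert \partial_s H(s) \rVert}{g_{\min}^{2}},
  \qquad
  g_{\min} \;=\; \min_{s\in[0,1]} \bigl( E_1(s) - E_0(s) \bigr).
\]
```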
I discuss a rarely mentioned correspondence between Einstein and Swann on the constructive approach to the special theory of relativity, in which Einstein points out that attempts to construct a dynamical explanation of relativistic kinematical effects require postulating a fundamental length scale at the level of the dynamics. I use this correspondence to shed light on several issues under dispute in current philosophy of spacetime that were highlighted recently in Harvey Brown’s monograph Physical Relativity, namely, Einstein’s view on the distinction between principle and constructive theories, and the consequences of pursuing the constructive approach in the context of spacetime theories.
A recent proposal to solve the halting problem with the quantum adiabatic algorithm is criticized and found wanting. Contrary to other physical hypercomputers, where one believes that a physical process “computes” a (recursion-theoretic) non-computable function simply because one believes the physical theory that presumably governs or describes such a process, believing the theory (i.e., quantum mechanics) in the case of the quantum adiabatic “hypercomputer” is tantamount to acknowledging that the hypercomputer cannot perform its task.
Huw Price (1996, 2002, 2003) argues that causal-dynamical theories that aim to explain thermodynamic asymmetry in time are misguided. He points out that in seeking a dynamical factor responsible for the general tendency of entropy to increase, these approaches fail to appreciate the true nature of the problem in the foundations of statistical mechanics (SM). I argue that it is Price who is guilty of misapprehending the issue at stake. When properly understood, causal-dynamical approaches in the foundations of SM offer a solution to a different problem, one that unfortunately receives no attention in Price’s celebrated work.
Among the alternatives to non-relativistic quantum mechanics (NRQM) there are those that give predictions different from those of quantum mechanics in yet-untested circumstances, while remaining compatible with current empirical findings. In order to test these predictions, one must isolate one’s system from environmentally induced decoherence, which, on the standard view of NRQM, is the dynamical mechanism responsible for the ‘apparent’ collapse in open quantum systems. But while recent advances in condensed-matter physics may lead in the near future to experimental setups that will allow one to test the two hypotheses, namely genuine collapse vs. decoherence, and hence make progress toward a solution to the quantum measurement problem, those philosophers and physicists who advocate an information-theoretic approach to the foundations of quantum mechanics are still unwilling to acknowledge the empirical character of the issue at stake. Here I argue that in doing so they are displaying an unwarranted double standard.
The early history of the attempts to unify quantum theory with the general theory of relativity is depicted through the work of the underappreciated Italo-Brazilian physicist Gleb Wataghin, who is responsible for many of the ideas that the quantum gravity community is entertaining today.
It is occasionally claimed that the important work of philosophers, physicists, and mathematicians in the nineteenth and early twentieth centuries made Kant’s critical philosophy of geometry look somewhat unattractive. Indeed, from the wider perspective of the discovery of non-Euclidean geometries, the replacement of Newtonian physics with Einstein’s theories of relativity, and the rise of quantificational logic, Kant’s philosophy seems “quaint at best and silly at worst”. While there is no doubt that Kant’s transcendental project involves his own conceptions of Newtonian physics, Euclidean geometry and Aristotelian logic, the issue at stake is whether the replacement of these conceptions collapses Kant’s philosophy into an unfortunate embarrassment. Thus, in evaluating the debate over the contemporary relevance of Kant’s philosophical project one is faced with the following two questions: (1) Are there any contradictions between the scientific developments of our era and Kant’s philosophy? (2) What is left of the Kantian legacy in light of our modern conceptions of logic, geometry and physics? Within this broad context, this paper aims to evaluate the Kantian project vis-à-vis the discovery and application of non-Euclidean geometries. Many important philosophers have evaluated Kant’s philosophy of geometry throughout the last century, but opinions with regard to the impact of non-Euclidean geometries on it diverge. At the beginning of the century there was a consensus that the Euclidean character of space should be considered a consequence of the Kantian project, i.e., of the metaphysical view of space and of the synthetic a priori character of geometry. The impact of non-Euclidean geometries was then thought to undermine the Kantian project, since it implied, according to positivists such...
Relying on the universality of quantum mechanics and on recent results known as the “threshold theorems,” quantum information scientists deem the question of the feasibility of large-scale, fault-tolerant, and computationally superior quantum computers purely technological. Reconstructing this question in statistical mechanical terms, this article suggests otherwise by questioning the physical significance of the threshold theorems. The skepticism it advances is neither too strong (hence consistent with the universality of quantum mechanics) nor too weak (hence independent of technological contingencies).
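For orientation, the threshold theorems questioned above have, schematically, the following standard form (this is the generic textbook statement, not the article’s statistical mechanical reconstruction): if the error rate per physical gate is below a constant threshold, a quantum circuit can be simulated fault-tolerantly with only polylogarithmic overhead.

```latex
% Schematic form of a fault-tolerance threshold theorem: if the physical error
% rate per gate p lies below a constant threshold p_th, then any ideal circuit
% of L gates can be simulated to accuracy \epsilon with modest overhead.
\[
  p \,<\, p_{\mathrm{th}}
  \;\Longrightarrow\;
  \text{overhead per logical gate} \;=\; O\!\bigl(\mathrm{poly}\bigl(\log (L/\epsilon)\bigr)\bigr).
\]
```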
This article tells the story of Ed Fredkin, a pilot, programmer, engineer, hardware designer and entrepreneur, whose work inside and outside academia has influenced major developments in computer science and in the foundations of theoretical physics for the past fifty years.
In the chapter “The Geometry of Visibles” of his Inquiry into the Human Mind, Thomas Reid constructs a special space, develops a special geometry for that space, and offers a natural model for this geometry. In doing so, Reid “discovers” non-Euclidean geometry sixty years before the mathematicians. This paper examines this “discovery” and the philosophical motivations underlying it. By reviewing Reid’s ideas on visible space and confronting him with Kant and Berkeley, I hope, moreover, to resolve an alleged impasse in Reid’s philosophy concerning the contradictory characteristics of his tangible and visible space.
We present a brief history of decoherence, from its roots in the foundations of classical statistical mechanics to the current spin bath models in condensed matter physics. We analyze the philosophical import of the subject matter for three different foundational problems, and find that, contrary to the received view, decoherence is less instrumental to their solutions than is commonly believed. What makes decoherence more philosophically interesting, we argue, are the methodological issues it draws attention to, and the question of the universality of quantum mechanics.
For many among the scientifically informed public, and even among physicists, Heisenberg's uncertainty principle epitomizes quantum mechanics. Nevertheless, more than 86 years after its inception, there is no consensus over the interpretation, scope, and validity of this principle. The aim of this chapter is to offer one such interpretation, the traces of which may be found already in Heisenberg's letters to Pauli from 1926, and in Dirac's anticipation of Heisenberg's uncertainty relations from 1927, and which stems from the hypothesis of finite nature. Instead of a mere mathematical theorem of quantum theory, or a manifestation of "wave-particle duality", the uncertainty relations turn out to be a result of a more fundamental premise, namely, the inherent limitation on spatial resolution that follows from the bound on physical resources. The implications of this view are far-reaching: it depicts the Hilbert space formalism as a phenomenological, "effective" formalism that approximates an underlying discrete structure; it supports a novel interpretation of probability in statistical physics that sees probabilities as deterministic, dynamical transition probabilities which arise from objective and inherent measurement errors; and it helps to clarify several puzzles in the foundations of statistical physics, such as the status of the "disturbance" view of measurement in quantum theory, or the tension between ontology and epistemology in attempts to describe nature with physical theories whose formalisms include subjective probabilities. Finally, this view also renders obsolete the entire class of interpretations of quantum theory that adhere to "the reality of the wave function".
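For reference, the uncertainty relations the chapter reinterprets are standardly written as follows (Robertson's general form and the position-momentum special case); the chapter's claim is that these should be seen as consequences of a deeper bound on spatial resolution rather than as primitive theorems of the Hilbert space formalism.

```latex
% Textbook form of the uncertainty relations: Robertson's generalization for any
% two observables A, B, and the Heisenberg position-momentum case.
\[
  \sigma_A\,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [\hat{A},\hat{B}] \rangle\bigr|,
  \qquad
  \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}.
\]
```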
In their book The Road to Maxwell's Demon, Hemmo and Shenker re-describe the foundations of statistical mechanics from a purely empiricist perspective. The result is refreshing, as well as intriguing, and it goes against much of the literature on the demon. Their conclusion, however, that Maxwell's demon is consistent with statistical mechanics, still leaves open the question of why such a demon hasn't yet been observed on a macroscopic scale. This essay offers a sketch of what a possible answer could look like.
We propose a new interpretation of objective deterministic chances in statistical physics based on physical computational complexity. This notion applies to a single physical system (be it an experimental set-up in the lab, or a subsystem of the universe), and quantifies (1) the difficulty of realizing one physical state given another, (2) the 'distance' (in terms of physical resources) from one physical state to another, and (3) the size of the set of time-complexity functions that are compatible with the physical resources required to reach one physical state from another.
An analysis of the two routes through which one may disentangle a quantum system from a measuring apparatus, and hence protect the state vector of a single quantum system from being disturbed by the measurement, reveals several loopholes in the argument from protective measurement to the reality of the state vector of a single quantum system.
Hugh Everett III died of a heart attack in July 1982 at the age of 51. Almost 26 years later, a New York Times obituary for his PhD advisor, John Wheeler, mentioned him and Richard Feynman as Wheeler’s most prominent students. Everett’s PhD thesis on the relative state formulation of quantum mechanics, later known as the “Many Worlds Interpretation”, was published (in its edited form) in 1957, and later (in its original, unedited form) in 1973, and since then has given rise to one of the most radical schools of thought in the foundations of quantum theory. Several years ago two conferences held in Oxford and at the Perimeter Institute celebrated 50 years since the first publication of Everett’s thesis. The book Many Worlds? grew out of contributions to these conferences, but, as its editors emphasize, it is more than mere conference proceedings. Instead, an attempt was made to assemble an impressive collection of papers which together illustrate the promise of the many worlds interpretation and the obstacles it faces. Twenty-three papers, divided into six sections, follow an introduction by Simon Saunders, one of Oxford’s fiercest Everettians. The first four sections cover two thorny issues that have been flagged by contemporary opponents of the many worlds interpretation, namely, the problem of ontology and the problem of probability, while the fifth discusses alternatives to Everett such as Bohmian mechanics and information-theoretic approaches to quantum theory. The sixth section seems to be a wild card, hosting several papers unrelated to each other, including one of the most interesting contributions to this volume, on the history of Everett’s thesis and his (some may say all too) short academic career. Each section concludes with transcripts of the discussion session that took place after the talks, thus giving additional emphasis to the points of contention. Apart from general comments on the volume, in what follows I would like to concentrate on a few papers I found especially illuminating. Start with ontology...
Scientific realism is dead, or so many philosophers believe. Its death was announced when philosophers became convinced that one can accept all scientific results without committing oneself to metaphysical existence claims about theoretical entities (Fine 1986, 112). In addition, the inability of self-proclaimed scientific realists, despite recurrent demands, to distinguish themselves from their rival anti-realists (Stein 1989) didn’t exactly help their cause. If realists cannot identify the key feature or features that set them apart from their opponents, then there is really no need to conduct a debate on scientific realism, is there?
Quantum computers are hypothetical quantum information processing (QIP) devices that allow one to store, manipulate, and extract information while harnessing quantum physics to solve various computational problems, and to do so putatively more efficiently than any known classical counterpart. Despite many ‘proofs of concept’ (Aharonov and Ben-Or 1996; Knill and Laflamme 1996; Knill et al. 1996; Knill et al. 1998), the key obstacle in realizing these powerful machines remains their scalability and susceptibility to noise: almost three decades after their conception, experimentalists still struggle to maintain useful quantum coherence in QIP devices with more than a pair of qubits (e.g., Blatt and Wineland 2008). This slow progress has prompted debates on the feasibility of quantum computers, yet the quantum information community has dismissed the skepticism as “ideology” (Aaronson 2004), claiming that the obstacles are merely technological (Kaye et al. 2007, 240). In a recent paper (Hagar 2009) I’ve argued that such skepticism with respect to the feasibility of quantum computers need not be deemed ideological at all, and that the aforementioned ‘proofs of concept’ are physically suspect. Using analogies from the foundations of classical statistical mechanics (SM), I’ve also argued that instead of active error correction, the appropriate framework for debating the feasibility of large-scale, fault-tolerant and computationally superior quantum computers should be the project of error avoidance: rather than trying to constantly ‘cool down’ the QIP device and prevent its thermalization, one should try to locate those regions in the device’s state space which are thermodynamically ‘abnormal’, i.e., those regions which resist thermalization regardless of external noise. This paper is intended as a further contribution to the debate on the feasibility of large-scale, fault-tolerant and computationally superior quantum computers. Relying again on analogies from the foundations of classical SM, it suggests a skeptical conjecture and frames it in the ‘passive’, error avoidance, context...
In quantum computing, where algorithms exist that can solve computational problems more efficiently than any known classical algorithms, the elimination of errors that result from external disturbances or from imperfect gates has become the ...
One of the recurrent problems in the foundations of physics is to explain why we rarely observe certain phenomena that are allowed by our theories and laws. In thermodynamics, for example, the spontaneous approach towards equilibrium is ubiquitous, yet the time-reversal-invariant laws that presumably govern thermal behaviour at the microscopic level equally allow spontaneous departures from equilibrium to occur. Why are the former processes frequently observed while the latter are almost never reported? Another example comes from quantum mechanics, where the formalism, if considered complete and universally applicable, predicts the existence of macroscopic superpositions (monstrous Schrödinger cats), and these are never observed: while electrons and atoms enjoy the cloudiness of waves, macroscopic objects are always localized to definite positions.