There is currently no viable alternative to the Bayesian analysis of scientific inference, yet the available versions of Bayesianism fail to do justice to several aspects of the testing and confirmation of scientific hypotheses. Bayes or Bust? provides the first balanced treatment of the complex set of issues involved in this nagging conundrum in the philosophy of science. Both Bayesians and anti-Bayesians will find a wealth of new insights on topics ranging from Bayes's original paper to contemporary formal learning theory. In a paper published posthumously in 1763, the Reverend Thomas Bayes made a seminal contribution to the understanding of "analogical or inductive reasoning." Building on his insights, modern Bayesians have developed an account of scientific inference that has attracted numerous champions as well as numerous detractors. Earman argues that Bayesianism provides the best hope for a comprehensive and unified account of scientific inference, yet the presently available versions of Bayesianism fail to do justice to several aspects of the testing and confirming of scientific theories and hypotheses. By focusing on the need for a resolution to this impasse, Earman sharpens the issues on which a resolution turns. John Earman is Professor of History and Philosophy of Science at the University of Pittsburgh.
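For orientation, the formal core of the Bayesian account of confirmation discussed above is Bayes's theorem; a minimal statement in standard notation (the symbols H and E are generic placeholders, not drawn from the book):

\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)}, \qquad P(E) > 0,
\]

where H is a hypothesis and E the evidence; on the Bayesian account, E confirms H just in case P(H \mid E) > P(H).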
Determinism is a perennial topic of philosophical discussion. Very little acquaintance with the philosophical literature is needed to reveal the Tower of ...
Spacetime substantivalism leads to a radical form of indeterminism within a very broad class of spacetime theories which include our best spacetime theory, general relativity. Extending an argument from Einstein, we show that spacetime substantivalists are committed to very many more distinct physical states than these theories' equations can determine, even with the most extensive boundary conditions.
It has become something of a dogma in the philosophy of science that modern cosmology has completed Boltzmann's program for explaining the statistical validity of the Second Law of thermodynamics by providing the low-entropy initial state needed to ground the asymmetry in entropic behavior that underwrites our inferences about the past. This dogma is challenged on several grounds. In particular, it is argued that it is likely that the Boltzmann entropy of the initial state of the universe is an ill-defined or severely hobbled concept. It is also argued that even if the entropy of the initial state of the universe had a well-defined, low value, this would not suffice to explain why thermodynamics works as well as it does for the kinds of systems we care about. Because the role of Boltzmann entropy in our inferences to the past has been vastly overrated, the failure of the Boltzmann program does not pose a serious problem for our knowledge of the past. But it does call for a different explanation of why thermodynamics works as well as it does. A suggestion is offered for a different approach.
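For reference, the Boltzmann entropy at issue is standardly defined relative to a partition of phase space into macrostates; in the usual notation (a textbook gloss, not the paper's own formulation), for a system whose microstate is x,

\[
S_B(x) \;=\; k_B \ln \big|\Gamma_{M(x)}\big|,
\]

where \Gamma_{M(x)} is the region of phase space corresponding to the macrostate containing x, |\cdot| its Liouville volume, and k_B Boltzmann's constant. The worry pressed above is that for the initial state of the universe the requisite macrostate partition, and hence this volume, may not be well defined.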
Many have claimed that ceteris paribus laws are a quite legitimate feature of scientific theories, some even going so far as to claim that the laws of all scientific theories currently on offer are merely CP. We argue here that one of the common props of such a thesis, that there are numerous examples of CP laws in physics, is false. Moreover, besides the absence of genuine examples from physics, we suggest that otherwise unproblematic claims are rendered untestable by the mere addition of the CP operator. Thus, "CP all Fs are Gs", when read as a straightforward statement of fact, cannot be the stuff of scientific theory. Rather, we suggest that when "ceteris paribus" appears in scientific works it plays a pragmatic role of pointing to more respectable claims.
Much of the literature on "ceteris paribus" laws is based on a misguided egalitarianism about the sciences. For example, it is commonly held that the special sciences are riddled with ceteris paribus laws; from this many commentators conclude that if the special sciences are not to be accorded a second-class status, it must be ceteris paribus all the way down to fundamental physics. We argue that the (purported) laws of fundamental physics are not hedged by ceteris paribus clauses and provisos. Furthermore, we show that not only is there no persuasive analysis of the truth conditions for ceteris paribus laws, there is not even an acceptable account of how they are to be saved from triviality or how they are to be melded with standard scientific methodology. Our way out of this unsatisfactory situation is to reject the widespread notion that the achievements and the scientific status of the special sciences must be understood in terms of ceteris paribus laws.
This vital study offers a new interpretation of Hume's famous "Of Miracles," which notoriously argues against the possibility of miracles. By situating Hume's popular argument in the context of the 18th-century debate on miracles, Earman shows Hume's argument to be largely unoriginal and chiefly without merit where it is original. Yet Earman constructively conceives how progress can be made on the issues that Hume's essay so provocatively posed about the ability of eyewitness testimony to establish the credibility of marvelous and miraculous events.
Newton's Principia introduced conceptions of space and time that launched one of the most famous and sustained debates in the history of physics, a controversy that involves fundamental concerns in the foundations of physics, metaphysics, and scientific epistemology. This book introduces and clarifies the historical and philosophical development of the clash between Newton's absolute conception of space and Leibniz's relational one. It separates the issues and provides new perspectives on absolute-relational accounts of motion and relational-substantival accounts of the ontology of spacetime. Earman's sustained treatment and imaginative insights raise to a new level the debate on these important issues at the boundary of philosophy and physics. He surveys the history of the controversy from Newton to Einstein, develops the mathematics and physics needed to pose the issues in sharp form, and provides a persuasive assessment of the philosophical problems involved. Most importantly, Earman revitalizes the connection of the debate to contemporary science. He shows, for example, how concerns raised by Leibniz form the core of ongoing debate on the foundations of the general theory of relativity, moving the discussion into a new and vital arena and introducing arguments that will be discussed for years to come. John Earman is Professor of History and Philosophy of Science at the University of Pittsburgh. A Bradford Book.
Despite their importance in modern physics, symmetries and invariances have received surprisingly little attention from philosophers of science, who have largely neglected the subtopic of symmetry breaking. I illustrate how the topic of laws and symmetries brings into fruitful interaction technical issues in physics and mathematics with both methodological issues in philosophy of science, such as the status of laws of physics, and metaphysical issues, such as the nature of objectivity.
The philosophical literature on time and change is fixated on the issue of whether the B-series account of change is adequate or whether real change requires Becoming of either the property-based variety of McTaggart's A-series or the non-property-based form embodied in C. D. Broad's idea of the piling up of successive layers of existence. For present purposes it is assumed that the B-series suffices to ground real change. But then it is noted that modern science in the guise of Einstein's general theory poses a threat to real change by implying that none of the genuine physical magnitudes countenanced by the theory changes its value with time. The aims of this paper are to explain how this seemingly paradoxical conclusion arises and to assess the merits and demerits of possible reactions to it.
In 1894 Pierre Curie announced what has come to be known as Curie's Principle: the asymmetry of effects must be found in their causes. In the same publication Curie discussed a key feature of what later came to be known as spontaneous symmetry breaking: the phenomena generally do not exhibit the symmetries of the laws that govern them. Philosophers have long been interested in the meaning and status of Curie's Principle. Only comparatively recently have they begun to delve into the mysteries of spontaneous symmetry breaking. The present paper aims to advance the discussion of both of these twin topics by tracing their interaction in classical physics, ordinary quantum mechanics, and quantum field theory. The features of spontaneous symmetry breaking that are peculiar to quantum field theory have received scant attention in the philosophical literature. These features are highlighted here, along with an explanation of why Curie's Principle, though valid in quantum field theory, is nearly vacuous in that context.
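A common schematic rendering of Curie's Principle (one standard gloss; the paper's own formulation may differ in detail): if a cause with symmetry group G_{cause} produces an effect with symmetry group G_{effect}, then

\[
G_{\text{cause}} \subseteq G_{\text{effect}},
\]

so every symmetry of the cause is a symmetry of the effect, and contrapositively any asymmetry of the effect must already be present in the cause. Spontaneous symmetry breaking is puzzling against this backdrop because the realized phenomena exhibit less symmetry than the laws that govern them.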
On standard accounts of scientific theorizing, the role of idealizations is to facilitate the analysis of some real-world system by employing a simplified representation of the target system, raising the obvious worry about how reliable knowledge can be obtained from inaccurate descriptions. The idealizations involved in the Aharonov–Bohm (AB) effect do not, it is claimed, fit this paradigm; rather the target system is a fictional system characterized by features that, though physically possible, are not realized in the actual world. The point of studying such a fictional system is to understand the foundations of quantum mechanics and how its predictions depart from those of classical mechanics. The original worry about the use of idealizations is replaced by a new one; namely, how can actual-world experiments serve to confirm the AB effect if it concerns the behavior of a fictional system? Struggle with this issue helps to account for the fact that almost three decades elapsed before a consensus emerged that the predicted AB effect had received solid experimental support. Standard accounts of idealizations tout the role they play in making tractable the analysis of the target system; by contrast, the idealizations involved in the AB effect make its analysis both conceptually and mathematically challenging. The idealizations required for the AB effect are also responsible for the existence of unitarily inequivalent representations of the canonical commutation relations and of the current algebra, representations which an observer confined to the electron's configuration space could invoke to 'explain' AB-type effects without the need to posit a hidden magnetic field. The goal of this paper is to bring to the attention of philosophers of science these and other aspects of the AB effect which are neglected or inadequately treated in the literature.
It is argued that the main problem with "the problem of the direction of time" is to figure out what the problem is or is supposed to be. Towards this end, an attempt is made to disentangle and to classify some of the many issues which have been discussed under the label of 'the direction of time'. Secondly, some technical apparatus is introduced in the hope of producing a sharper formulation of the issues than they have received in the philosophical literature. Finally, some tentative suggestions about the central issues are offered. In particular, it is suggested that entropy and irreversibility are much less crucial to the central issues than most philosophers would have us believe. This suggestion is not made because of any firm conviction of its correctness but rather because it helps to focus the discussion on some basic but long neglected assumptions which underlie traditional approaches.
Focusing on spacetime singularities, Earman engages with a host of foundational issues at the intersection of science and philosophy, ranging from the big bang to the possibility of time travel.
We argue that, contrary to some analyses in the philosophy of science literature, ergodic theory falls short in explaining the success of classical equilibrium statistical mechanics. Our claim is based on the observations that dynamical systems for which statistical mechanics works are most likely not ergodic, and that ergodicity is both too strong and too weak a condition for the required explanation: one needs only ergodic-like behaviour for the finite set of observables that matter, but the behaviour must ensure that the approach to equilibrium for these observables is on the appropriate time-scale.
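For reference, ergodicity in the relevant sense requires that infinite-time averages coincide with phase averages: for a measure-preserving flow \varphi_t on a normalized phase space (\Gamma, \mu) and any integrable observable f (a standard textbook formulation, not the paper's own),

\[
\lim_{T \to \infty} \frac{1}{T}\int_0^T f\big(\varphi_t(x)\big)\,dt \;=\; \int_\Gamma f\, d\mu \quad \text{for } \mu\text{-almost every } x \in \Gamma.
\]

The complaint above is that this condition quantifies over all integrable observables and over infinite times, whereas what statistical mechanics needs is ergodic-like behaviour only for a few macroscopically relevant observables, and on realistic time-scales.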
Schrödinger averred that entanglement is the characteristic trait of quantum mechanics. The first part of this paper is simultaneously an exploration of Schrödinger's claim and an investigation into the distinction between mere entanglement and genuine quantum entanglement. The typical discussion of these matters in the philosophical literature neglects the structure of the algebra of observables, implicitly assuming a tensor product structure of the simple Type I factor algebras used in ordinary Quantum Mechanics (QM). This limitation is overcome by adopting the algebraic approach to quantum physics, which allows a uniform treatment of ordinary QM, relativistic quantum field theory, and quantum statistical mechanics. The algebraic apparatus helps to distinguish several different criteria of quantum entanglement and to prove results about the relation of quantum entanglement to two additional ways of characterizing the classical versus quantum divide, viz. abelian versus non-abelian algebras of observables, and the ability versus inability to interrogate the system without disturbing it. Schrödinger's claim is reassessed in the light of this discussion. The second part of the paper deals with the relativity-to-ambiguity threat: the entanglement of a state on a system algebra is entanglement of the state relative to a decomposition of the system algebra into subsystem algebras; a state may be entangled with respect to one decomposition but not another; hence, unless there is some principled way to choose a decomposition, entanglement is a radically ambiguous notion. The problem is illustrated in terms of a Realist versus Pragmatist debate, the former claiming that the decomposition must correspond to real as opposed to virtual subsystems, while the latter claims that the real versus virtual distinction is bogus and that practical considerations can steer the choice of decomposition. This debate is applied to the fraught problem of measuring entanglement for indistinguishable particles. The paper ends with some remarks about claims in the philosophical literature that entanglement undermines the separability or independence of subsystems while promoting holism.
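The simplest illustration of entanglement relative to a tensor-product decomposition, in ordinary QM rather than the algebraic setting discussed above (an illustrative example, not drawn from the paper): the two-qubit singlet state

\[
|\psi^-\rangle \;=\; \tfrac{1}{\sqrt{2}}\big(|0\rangle_A \otimes |1\rangle_B \;-\; |1\rangle_A \otimes |0\rangle_B\big)
\]

cannot be written as a product state |\phi\rangle_A \otimes |\chi\rangle_B, but that verdict is relative to the split into subsystems A and B; relative to a different factorization of the same Hilbert space the state need not count as entangled, which is the relativity-to-ambiguity threat in miniature.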
We address the question of whether it is possible to operate a time machine by manipulating matter and energy so as to manufacture closed timelike curves. This question has received a great deal of attention in the physics literature, with attempts to prove no-go theorems based on classical general relativity and various hybrid theories serving as steps along the way towards quantum gravity. Despite the effort put into these no-go theorems, there is no widely accepted definition of a time machine. We explain the conundrum that must be faced in providing a satisfactory definition and propose a resolution. Roughly, we require that all extensions of the time machine region contain closed timelike curves; the actions of the time machine operator are then sufficiently "potent" to guarantee that closed timelike curves appear. We then review no-go theorems based on classical general relativity, semi-classical quantum gravity, quantum field theory on curved spacetime, and Euclidean quantum gravity. Our verdict on the question of our title is that no result of sufficient generality to underwrite a confident "yes" has been proven. Our review of the no-go results does, however, highlight several foundational problems at the intersection of general relativity and quantum physics that lend substance to the search for an answer.
Physicists who work on canonical quantum gravity will sometimes remark that the general covariance of general relativity is responsible for many of the thorniest technical and conceptual problems in their field. In particular, it is sometimes alleged that one can trace to this single source a variety of deep puzzles about the nature of time in quantum gravity, deep disagreements surrounding the notion of 'observable' in classical and quantum gravity, and deep questions about the nature of the existence of spacetime in general relativity. Philosophers who think about these things are sometimes skeptical about such claims. We have all learned that Kretschmann was quite correct to urge against Einstein that the "General Theory of Relativity" was no such thing, since any theory could be cast in a generally covariant form, and hence the general covariance of general relativity could not have any physical content, let alone bear the kind of weight that Einstein expected it to. Friedman's assessment is widely accepted: "As Kretschmann first pointed out in 1917, the principle of general covariance has no physical content whatever: it specifies no particular physical theory; rather, it merely expresses our commitment to a certain style of formulating physical theories" (1984, p. 44). Such considerations suggest that general covariance, as a technically crucial but physically contentless feature of general relativity, simply cannot be the source of any significant conceptual or physical problems. Physicists are, of course, conscious of the weight of Kretschmann's points against Einstein. Yet they are considerably more ambivalent than their philosophical colleagues. Consider Kuchař's conclusion at the end of a discussion of this topic.
Although the philosophical literature on the foundations of quantum field theory recognizes the importance of Haag's theorem, it does not provide a clear discussion of the meaning of this theorem. The goal of this paper is to make up for this deficit. In particular, it aims to set out the implications of Haag's theorem for scattering theory, the interaction picture, the use of non-Fock representations in describing interacting fields, and the choice among the plethora of unitarily inequivalent representations of the canonical commutation relations for free and interacting fields.
In this first part of a two-part paper, we describe efforts in the early decades of this century to restrict the extent of violations of the Second Law of thermodynamics that were brought to light by the rise of the kinetic theory and the identification of fluctuation phenomena. We show how these efforts mutated into Szilard's proposal that Maxwell's Demon is exorcised by proper attention to the entropy costs associated with the Demon's memory and information acquisition. In the second part we will argue that the information theoretic exorcisms of the Demon provide largely illusory benefits. Depending on the case, they either return a presupposition that can be had without information theoretic consideration or they postulate a broader connection between information and entropy than can be sustained.
Inflationary cosmology won a large following on the basis of the claim that it solves various problems that beset the standard big bang model. We argue that these problems concern not the empirical adequacy of the standard model but rather the nature of the explanations it offers. Furthermore, inflationary cosmology has not been able to deliver on its proposed solutions without offering models which are increasingly complicated and contrived, which depart more and more from the standard model it was supposed to improve upon, and which sever the connection between cosmology and particle physics that initially made the inflationary paradigm so attractive. Nevertheless, inflationary cosmology remains a promising research program, not least because it offers an explanation of the origin of the density perturbations that seeded the formation of galaxies and other cosmic structures. Tests of this explanation are underway and may settle the issue of whether inflation played an important role in the early universe.
David Albert's Time and Chance (2000) provides a fresh and interesting perspective on the problem of the direction of time. Unfortunately, the book opens with a highly non-standard exposition of time reversal invariance that distorts the subsequent discussion. The present article not only has the remedial goal of setting the record straight about the meaning of time reversal invariance, but it also aims to show how the niceties of this symmetry concept matter to the problem of the direction of time and to related foundational issues in physics.
The standard theory of computation excludes computations whose completion requires an infinite number of steps. Malament-Hogarth spacetimes admit observers whose pasts contain entire future-directed, timelike half-curves of infinite proper length. We investigate the physical properties of these spacetimes and ask whether they and other spacetimes allow the observer to know the outcome of a computation with infinitely many steps.
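In standard notation (not taken verbatim from the abstract), a spacetime (M, g_{ab}) is Malament-Hogarth just in case it contains a future-directed timelike half-curve \gamma and a point p such that

\[
\int_\gamma d\tau \;=\; \infty \qquad \text{and} \qquad \gamma \subset I^-(p),
\]

i.e. \gamma has infinite proper length yet lies entirely within the chronological past of p, so an observer at p could in principle have access to the outcome of an infinite computation carried out along \gamma.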
This is the first part of a two-part article in which we defend the thesis of Humean Supervenience about Laws of Nature (HS). According to this thesis, two possible worlds cannot differ on what is a law of nature unless they also differ on the Humean base. The Humean base is easy to characterize intuitively, but there is no consensus on how, precisely, it should be defined. Here in Part I, we present and motivate a characterization of the Humean base that, we argue, enables HS to capture what is really at stake in the debate, without taking on extraneous commitments.
Although C. D. Broad's notion of Becoming has received a fair amount of attention in the philosophy-of-time literature, there are no serious attempts to show how to replace the standard 'block' spacetime models by models that are more congenial to Broad's idea that the sum total of existence is continuously increased by Becoming or the coming into existence of events. In the Newtonian setting Broad-type models can be constructed in a cheating fashion by starting with a Newtonian block model, carving chips off the block, and assembling the chips in an appropriately structured way. However, attempts to construct Broad-type models in a non-cheating fashion reveal a number of problematic aspects of Becoming that have not received adequate attention in the literature. The paper then turns to an assessment of the problems and prospects of adapting Becoming models to relativistic spacetimes. The results of the assessment differ in both minor and major ways from the ones in the extant literature. Finally, the paper describes how the causal set approach to quantum gravity promises to provide a mechanism for realizing Becoming, though the form of Becoming that emerges may not conform to any of the versions discussed in the philosophical literature.
The constrained Hamiltonian formalism is recommended as a means for getting a grip on the concepts of gauge and gauge transformation. This formalism makes it clear how the gauge concept is relevant to understanding Newtonian and classical relativistic theories as well as the theories of elementary particle physics; it provides an explication of the vague notions of "local" and "global" gauge transformations; it explains how and why a fibre bundle structure emerges for theories which do not wear their bundle structure on their sleeves; it illuminates the connections of the gauge concept to issues of determinism and what counts as a genuine "observable"; and it calls attention to problems which arise in attempting to quantize gauge theories. Some of the limitations and problematic aspects of the formalism are also discussed.
Hume defined 'cause' three times over. The two principal definitions (constant conjunction, felt determination) provide the anchors for the two main strands of the modern empiricist accounts of laws of nature, while the third (the counterfactual definition) may be seen as the inspiration of the non-Humean necessitarian analyses. Corresponding to the felt determination definition is the account of laws that emphasizes human attitudes, beliefs, and actions. Latter day weavers of this strand include Nelson Goodman, A. J. Ayer, and Nicholas Rescher. In Fact, Fiction and Forecast Goodman writes: "I want to emphasize the Humean idea that rather than a sentence being used for prediction because it is a law, it is called a law because it is used for prediction" (1955, p. 62). In "What is a law of nature?", Ayer explains that the difference between 'generalizations of fact' and 'generalizations of law' "lies not so much on the side of facts which make them true, as in the attitude of those who put them forward" (1956, p. 162). And in a similar vein, Rescher maintains that lawfulness is "mind-dependent"; it is not something which is discovered but which is supplied: "Lawfulness is not found in or extracted from the evidence but superadded to it. Lawfulness is a matter of imputation" (1970, p. 107). By contrast, the constant conjunction definition promotes the view that laws are to be analyzed in terms of the de re characteristics of regularities, independently of the attitudes and actions of actual or potential knowers.
In Part I, we presented and motivated a new formulation of Humean Supervenience about Laws of Nature (HS). Here in Part II, we present an epistemological argument in defense of HS, thus formulated. Our contention is that one can combine a modest realism about laws of nature with a proper recognition of the importance of empirical testability in the epistemology of science only if one accepts HS.
Like moths attracted to a bright light, philosophers are drawn to glitz. So in discussing the notions of 'gauge', 'gauge freedom', and 'gauge theories', they have tended to focus on examples such as Yang–Mills theories and on the mathematical apparatus of fibre bundles. But while Yang–Mills theories are crucial to modern elementary particle physics, they are only a special case of a much broader class of gauge theories. And while the fibre bundle apparatus turned out, in retrospect, to be the right formalism to illuminate the structure of Yang–Mills theories, the strength of this apparatus is also its weakness: the fibre bundle formalism is very flexible and general, and, as such, fibre bundles can be seen lurking under, over, and around every bush. What is needed is an explanation of what the relevant bundle structure is and how it arises, especially for theories that are not initially formulated in fibre bundle language. Here I will describe an approach that grows out of the conviction that, at least for theories that can be written in Lagrangian/Hamiltonian form, gauge freedom arises precisely when there are Lagrangian/Hamiltonian constraints of an appropriate character. This conviction is shared, if only tacitly, by that segment of the physics community that works on constrained Hamiltonian systems.
In this second part of our two-part paper we review and analyse attempts since 1950 to use information theoretic notions to exorcise Maxwell's Demon. We argue through a simple dilemma that these attempted exorcisms are ineffective, whether they follow Szilard in seeking a compensating entropy cost in information acquisition or Landauer in seeking that cost in memory erasure. In so far as the Demon is a thermodynamic system already governed by the Second Law, no further supposition about information and entropy is needed to save the Second Law. In so far as the Demon fails to be such a system, no supposition about the entropy cost of information acquisition and processing can save the Second Law from the Demon.
We discuss the relationship between the interpretative problems of quantum gravity and those of general relativity. We argue that classical and quantum theories of gravity resuscitate venerable philosophical questions about the nature of space, time, and change; and that the resolution of some of the difficulties facing physicists working on quantum theories of gravity would appear to require philosophical as well as scientific creativity.
Stephen Hawking has argued that universes containing evaporating black holes can evolve from pure initial states to mixed final ones. Such evolution is non-unitary and so contravenes fundamental quantum principles on which Hawking's analysis was based. It disables the retrodiction of the universe's initial state from its final one, and portends the time-asymmetry of quantum gravity. Small wonder that Hawking's paradox has met with considerable resistance. Here we use a simple result for C*-algebras to offer an argument for pure-to-mixed state evolution in black hole evaporation, and review responses to the Hawking paradox with respect to how effectively they rebut this argument.
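Why pure-to-mixed evolution is non-unitary, in one line of textbook quantum mechanics (an illustrative gloss, not Hawking's own derivation): a pure state has density operator \rho = |\psi\rangle\langle\psi| with \mathrm{Tr}\,\rho^2 = 1, a mixed state has \mathrm{Tr}\,\rho^2 < 1, and unitary evolution \rho \mapsto U\rho U^\dagger preserves this quantity, since

\[
\mathrm{Tr}\big[(U\rho U^\dagger)^2\big] \;=\; \mathrm{Tr}\big[U\rho^2 U^\dagger\big] \;=\; \mathrm{Tr}\,\rho^2.
\]

Hence if black hole evaporation takes pure states to mixed ones, the evolution cannot be implemented by any unitary map on states.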
Time in electromagnetism shares many features with time in other physical theories. But there is one aspect of electromagnetism's relationship with time that has always been controversial, yet has not always attracted the limelight it deserves: the electromagnetic arrow of time. Beginning with a re-analysis of a famous argument between Ritz and Einstein over the origins of the radiation arrow, this chapter frames the debate between modern Einsteinians and neo-Ritzians. It tries to find a clean statement of what the arrow is and then explains how it relates to the cosmological and thermodynamic arrows, representing the most developed and sophisticated attack yet, in either the physics or philosophy literature, on the electromagnetic arrow of time.
The Unruh Effect for Philosophers. John Earman - 2011 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 42 (2): 81-97.
The importance of the Unruh effect lies in the fact that, together with the related Hawking effect, it serves to link the three main branches of modern physics: thermal/statistical physics, relativity theory/gravitation, and quantum physics. However, different researchers can have in mind different phenomena when they speak of "the Unruh effect" in flat spacetime and its generalization to curved spacetimes. Three different approaches are reviewed here. They are shown to yield results that are sometimes concordant and sometimes discordant. The discordance is disconcerting only if one insists on taking literally the definite article in "the Unruh effect." It is argued that the role of linking different branches of physics is better served by taking "the Unruh effect" to designate a family of related phenomena. The relation between the Hawking effect and the generalized Unruh effect for curved spacetimes is briefly discussed.
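The quantitative core of the flat-spacetime Unruh effect, common to the approaches surveyed (the standard formula, not specific to any one of them): an observer with constant proper acceleration a attributes to the Minkowski vacuum a thermal temperature

\[
T_U \;=\; \frac{\hbar\, a}{2\pi c\, k_B},
\]

which is minuscule for laboratory accelerations, roughly 4 \times 10^{-20}\ \mathrm{K} for a = 1g, making direct detection extremely difficult.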
Various senses in which laws of nature are supposed to be "universal" are distinguished. Conditions designed to capture the content of the more important of these senses are proposed and the relations among these conditions are examined. The status of universality requirements is briefly discussed.
A vast amount of ink has been spilled in both the physics and the philosophy literature on the measurement problem in quantum mechanics. Important as it is, this problem is but one aspect of the more general issue of how, if at all, classical properties can emerge from the quantum descriptions of physical systems. In this paper we will study another aspect of the more general issue, the emergence of classical chaos, which has been receiving increasing attention from physicists but which has largely been neglected by philosophers of science.