Achieving understanding of nature is one of the aims of science. In this paper we offer an analysis of the nature of scientific understanding that accords with actual scientific practice and accommodates the historical diversity of conceptions of understanding. Its core idea is a general criterion for the intelligibility of scientific theories that is essentially contextual: which theories conform to this criterion depends on contextual factors, and can change in the course of time. Our analysis provides a general account of how understanding is provided by scientific explanations of diverse types. In this way, it reconciles conflicting views of explanatory understanding, such as the causal-mechanical and the unificationist conceptions.
'Holographic' relations between theories have become a main theme in quantum gravity research. These relations entail that a theory without gravity is equivalent to a gravitational theory with an extra spatial dimension. The idea of holography was first proposed in 1993 by Gerard 't Hooft on the basis of his studies of evaporating black holes. Soon afterwards the holographic 'AdS/CFT' duality was introduced, which has since been heavily studied in the string theory community and beyond. Recently, Erik Verlinde has proposed that even Newton's law of gravitation can be related holographically to the thermodynamics of information on screens. We discuss inter-theoretical relations in these scenarios: what is the status of the holographic relation in them and in what sense is gravity, or spacetime, emergent?
Experimental evidence of the last decades has made the status of "collapses of the wave function" even more shaky than it already was on conceptual grounds: interference effects turn out to be detectable even when collapses are typically expected to occur. Non-collapse interpretations should consequently be taken seriously. In this paper we argue that such interpretations suggest a perspectivalism according to which quantum objects are not characterized by monadic properties, but by relations to other systems. Accordingly, physical systems may possess different properties with respect to different "reference systems". We discuss some of the relevant arguments, and argue that perspectivalism both evades recent arguments that single-world interpretations are inconsistent and eliminates the need for a privileged rest frame in the relativistic case.
According to the modal interpretation, the standard mathematical framework of quantum mechanics specifies the physical magnitudes of a system, which have definite values. Probabilities are assigned to the possible values that these magnitudes may adopt. The interpretation is thus concerned with physical properties rather than with measurement results: it is a realistic interpretation. One of the notable achievements of this interpretation is that it dissolves the notorious measurement problem. The papers collected here, together with the introduction and concluding critical appraisal, explain the various forms of the modal interpretation, survey its achievements, and discuss those problems that have yet to be solved. Audience: Philosophers of science, theoretical physicists, and graduate students in these disciplines.
It is a central aspect of our ordinary concept of time that history unfolds and events come into being. It is only natural to take this seriously. However, it is notoriously difficult to explain further what this `becoming' consists in, or even to show that the notion is consistent at all. In this article I first argue that the idea of a global temporal ordering, involving a succession of cosmic nows, is not indispensable for our concept of time. Our experience does not support the existence of global simultaneity, and arguments from modern physics further support the conclusion that time should not be seen as a succession of cosmic nows. Accordingly, I propose that if we want to make sense of becoming we should attempt to interpret it as something purely local. Second, I address the question of what this local becoming consists in. I maintain that processes of becoming are nothing but the successive happening of events, and that this happening of events consists entirely in the occurring of these events at their own spacetime locations. This leads to a consistent view of becoming, which is applicable even to rather pathological spacetimes.
The symmetrization postulates of quantum mechanics (symmetry for bosons, antisymmetry for fermions) are usually taken to entail that quantum particles of the same kind (e.g., electrons) are all in exactly the same state and therefore indistinguishable in the strongest possible sense. These symmetrization postulates possess a general validity that survives the classical limit, and the conclusion seems therefore unavoidable that even classical particles of the same kind must all be in the same state—in clear conflict with what we know about classical particles. In this article we analyze the origin of this paradox. We shall argue that in the classical limit classical particles emerge, as new entities that do not correspond to the “particle indices” defined in quantum mechanics. Put differently, we show that the quantum mechanical symmetrization postulates do not pertain to particles, as we know them from classical physics, but rather to indices that have a merely formal significance. This conclusion raises the question of whether many discussions in the literature about the status of identical quantum particles have not been misguided.
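The symmetrization postulates at issue can be stated compactly for two particles of the same kind; the following is standard textbook notation, added here for illustration:

```latex
\Psi_{\pm}(x_1, x_2) \;=\; \frac{1}{\sqrt{2}}\,
  \bigl[\psi_a(x_1)\,\psi_b(x_2) \pm \psi_b(x_1)\,\psi_a(x_2)\bigr],
```

with the plus sign for bosons and the minus sign for fermions. Note that the indices 1 and 2 occur completely symmetrically, so that neither index can be associated with one of the one-particle states; this illustrates the sense in which the indices carry merely formal significance.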
According to classical physics _particles_ are basic building blocks of the world. These classical particles are distinguishable objects, individuated by unique combinations of physical properties. By contrast, in quantum mechanics the received view is that particles of the same kind (“identical particles”) are physically indistinguishable from each other and lack identity. This doctrine rests on the quantum mechanical (anti)symmetrization postulates together with the “factorist” assumption that each single particle is represented in exactly one factor space of the tensor product Hilbert space of a many-particle system. Even though standard in theoretical physics and the philosophy of physics, the assumption of factorism and the ensuing indistinguishability of particles are problematic. Particle indistinguishability is irreconcilable with the everyday meaning of “particle”, and also with how this term is used in the practice of physics. Moreover, it is a consequence of the standard view that identical quantum particles remain indistinguishable even in the classical limit, which makes a smooth transition to the classical particle concept impossible. Lubberdink (1998; 2009) and Dieks and Lubberdink (2011) have proposed an alternative conception of quantum particles that does not rely on factorism and avoids these difficulties. We further explain and discuss this alternative framework here. One of its key consequences is that particles in quantum theory are not fundamental but _emergent_; another that once they have emerged, quantum particles are always physically distinguishable and thus possess a physically grounded identity.
Saunders has recently claimed that “identical quantum particles” with an anti-symmetric state (fermions) are weakly discernible objects, just like irreflexively related ordinary objects in situations with perfect symmetry (Black’s spheres, for example). Weakly discernible objects have all their qualitative properties in common but nevertheless differ from each other by virtue of (a generalized version of) Leibniz’s principle, since they stand in relations an entity cannot have to itself. This notion of weak discernibility has been criticized as question begging, but we defend and accept it for classical cases like Black’s spheres. We argue, however, that the quantum mechanical case is different. Here the application of the notion of weak discernibility indeed is question begging and in conflict with standard interpretational ideas. We conclude that the introduction of the conceptual resource of weak discernibility does not change the interpretational status quo in quantum mechanics.
According to what has become a standard history of quantum mechanics, von Neumann in 1932 succeeded in convincing the physics community that he had proved that hidden variables were impossible as a matter of principle. Subsequently, leading proponents of the Copenhagen interpretation emphatically confirmed that von Neumann's proof showed the completeness of quantum mechanics. Then, the story continues, Bell in 1966 finally exposed the proof as seriously and obviously wrong; this rehabilitated hidden variables and made serious foundational research possible. It is often added in recent accounts that von Neumann's error had been spotted almost immediately by Grete Hermann, but that her discovery was of no effect due to the dominant Copenhagen Zeitgeist. We shall attempt to tell a more balanced story. Most importantly, von Neumann did not claim to have shown the impossibility of hidden variables tout court, but argued that hidden-variable theories must possess a structure that deviates fundamentally from that of quantum mechanics. Both Hermann and Bell appear to have missed this point; moreover, both raised unjustified technical objections to the proof. Von Neumann's conclusion was basically that hidden-variables schemes must violate the "quantum principle" that all physical quantities are to be represented by operators in a Hilbert space. According to this conclusion, hidden-variables schemes are possible in principle but necessarily exhibit a certain kind of contextuality. As we shall illustrate, early reactions to Bohm's theory are in agreement with this account. Leading physicists pointed out that Bohm's theory has the strange feature that particle properties do not generally reveal themselves in measurements, in accordance with von Neumann's result. They did not conclude that the "impossible was done" and that von Neumann had been shown wrong.
We generalize the modal interpretation of quantum mechanics so that it may be applied to composite systems represented by arbitrary density operators. We discuss the interpretation these density operators receive and relate this to the discussion about the interpretation of proper and improper mixtures in the standard interpretation.
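For readers unfamiliar with the distinction at issue, the formal contrast between improper and proper mixtures can be stated in one line (a standard definition, not specific to this paper): the improper mixture arises by partial tracing over a subsystem of an entangled pure state, whereas a proper mixture encodes ignorance about which pure state obtains:

```latex
\rho_A \;=\; \operatorname{Tr}_B \,|\Psi\rangle\langle\Psi|_{AB}
\quad\text{(improper)},
\qquad\qquad
\rho \;=\; \sum_i p_i\, |\psi_i\rangle\langle\psi_i|
\quad\text{(proper)}.
```

The two are represented by the same kind of density operator, which is precisely why their interpretation requires separate discussion.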
This book contains selected papers from the First International Conference on the Ontology of Spacetime. Its fourteen chapters address two main questions: first, what is the current status of the substantivalism/relationalism debate, and second, what are the prospects of presentism and becoming within present-day physics and its philosophy? The overall tenor of the four chapters of the book’s first part is that the prospects of spacetime substantivalism are bleak, although different possible positions remain with respect to the ontological status of spacetime. Part II and Part III of the book are devoted to presentism, eternalism, and becoming, from two different perspectives. In the six chapters of Part II it is argued, in different ways, that relativity theory does not have essential consequences for these issues. It certainly is true that the structure of time is different, according to relativity theory, from the one in classical theory. But that does not mean that a decision is forced between presentism and eternalism, or that becoming has proved to be an impossible concept. It may even be asked whether presentism and eternalism really offer different ontological perspectives at all. The writers of the last four chapters, in Part III, disagree. They argue that relativity theory is incompatible with becoming and presentism. Several of them come up with proposals to go beyond relativity, in order to restore the prospects of presentism. · Space and time in present-day physics and philosophy · Relatively low level of technicality, easily accessible · Introduction from scratch of the debates surrounding time · Top authors explaining their positions · Broad spectrum of approaches, coherently represented.
N. Maxwell (1985) has claimed that special relativity and "probabilism" are incompatible; "probabilism" he defines as the doctrine that "the universe is such that, at any instant, there is only one past but many alternative possible futures". Thus defined, the doctrine is evidently prerelativistic as it depends on the notion of a universal instant of the universe. In this note I show, however, that there is a straightforward relativistic generalization, and that therefore Maxwell's conclusion that the special theory of relativity should be amended is unwarranted. I leave open the question whether or not probabilism (or the related doctrine of the flow of time) is true, but argue that the special theory of relativity has no fundamental significance for this question.
In 1991 Larry Laudan and Jarrett Leplin proposed a solution for the problem of empirical equivalence and the empirical underdetermination that is often thought to result from it. In this paper we argue that, even though Laudan and Leplin’s reasoning is essentially correct, their solution should be accurately assessed in order to appreciate its nature and scope. Indeed, Laudan and Leplin’s analysis does not succeed in completely removing the problem or, as they put it, in refuting the thesis of underdetermination as a consequence of empirical equivalence. Instead, what they show is merely that science possesses tools that may eventually lead out of an underdetermination impasse. We apply their argument to a real case of two empirically equivalent theories: Lorentz’s ether theory and Einstein’s special relativity. This example illustrates the validity of Laudan and Leplin’s reasoning, but also shows the importance of the reassessment we argue for.
It has often been remarked that Bohr's writings on the interpretation of quantum mechanics make scant reference to the mathematical formalism of quantum theory; and it has not infrequently been suggested that this is another symptom of the general vagueness, obscurity and perhaps even incoherence of Bohr's ideas. Recent years have seen a reappreciation of Bohr, however. In this article we broadly follow this "rehabilitation program". We offer what we think is a simple and coherent reading of Bohr's statements about the interpretation of quantum mechanics, basing ourselves on primary sources and making use of, and filling lacunas in, recent secondary literature. We argue that Bohr's views on quantum mechanics are more firmly connected to the structure of the quantum formalism than usually acknowledged, even though Bohr's explicit use of this formalism remains on a rather global and qualitative level. In our reading, Bohr's pronouncements on the meaning of quantum mechanics should first of all be seen as responses to concrete physical problems, rather than as expressions of a preconceived philosophical doctrine. In our final section we attempt a more detailed comparison with the formalism and conclude that Bohr's interpretation is not far removed from present-day non-collapse interpretations of quantum mechanics.
Modal interpretations have the ambition to construe quantum mechanics as an objective, man-independent description of physical reality. Their second leading idea is probabilism: quantum mechanics does not completely fix physical reality but yields probabilities. In working out these ideas an important motif is to stay close to the standard formalism of quantum mechanics and to refrain from introducing new structure by hand. In this paper we explain how this programme can be made concrete. In particular, we show that the Born probability rule, and sets of definite-valued observables to which the Born probabilities pertain, can be uniquely defined from the quantum state and Hilbert space structure. We discuss the status of probability in modal interpretations, and to this end we make a comparison with many-worlds alternatives. An overall point that we stress is that the modal ideas define a general framework and research programme rather than one definite and finished interpretation.
We propose a new quantum ontology, in which properties are the fundamental building blocks. In this property ontology physical systems are defined as bundles of type-properties. Not all elements of such bundles are associated with definite case-properties, and this accommodates the Kochen and Specker theorem and contextuality. Moreover, we do not attribute an identity to the type-properties, which gives rise to a novel form of the bundle theory. There are no “particles” in the sense of classical individuals in this ontology, although the behavior of such individuals is mimicked in some circumstances. This picture leads in a natural way to the symmetrization postulates for systems of many “identical particles”.
We study the process of observation (measurement), within the framework of a “perspectival” (“relational,” “relative state”) version of the modal interpretation of quantum mechanics. We show that if we assume certain features of discreteness and determinism in the operation of the measuring device (which could be a part of the observer's nerve system), this gives rise to classical characteristics of the observed properties, in the first place to spatial localization. We investigate to what extent semi-classical behavior of the object system itself (as opposed to the observational system) is needed for the emergence of classicality. Decoherence is an essential element in the mechanism of observation that we assume, but it turns out that in our approach no environment-induced decoherence on the level of the object system is required for the emergence of classical properties.
According to the Doomsday Argument we have to rethink the probabilities we assign to a soon or not so soon extinction of mankind when we realize that we are living now, rather early in the history of mankind. Sleeping Beauty finds herself in a similar predicament: on learning the date of her first awakening, she is asked to re-evaluate the probabilities of her two possible future scenarios. In connection with Doom, I argue that it is wrong to assume that our ordinary probability judgements do not already reflect our place in history: we justify the predictive use we make of the probabilities yielded by science by our knowledge of the fact that we live now, a certain time before the possible occurrence of the events the probabilities refer to. Our degrees of belief should change drastically when we forget the date—importantly, this follows without invoking the “Self Indication Assumption”. Subsequent conditionalization on information about which year it is cancels this probability shift again. The Doomsday Argument is about such probability shifts, but tells us nothing about the concrete values of the probabilities—for these, experience provides the only basis. Essentially the same analysis applies to the Sleeping Beauty problem. I argue that Sleeping Beauty “thirders” should be committed to thinking that the Doomsday Argument is ineffective; whereas “halfers” should agree that doom is imminent—but they are wrong.
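The “thirder” credence in the Sleeping Beauty problem can be obtained by simple frequency counting over awakenings. The following toy simulation is our own illustration of that counting, not part of the paper; the protocol assumed is the standard one (heads: one awakening, tails: two awakenings).

```python
import random

def simulate(trials=100_000, seed=0):
    """Frequency of heads-awakenings among all awakenings in the
    standard Sleeping Beauty protocol: a fair coin is tossed;
    heads -> one awakening (Monday), tails -> two (Monday, Tuesday)."""
    rng = random.Random(seed)
    heads_awakenings = 0
    total_awakenings = 0
    for _ in range(trials):
        heads = rng.random() < 0.5
        # number of awakenings produced by this toss
        total_awakenings += 1 if heads else 2
        if heads:
            heads_awakenings += 1
    return heads_awakenings / total_awakenings

print(simulate())  # close to 1/3
```

Roughly one in three awakenings belongs to a heads run, which is the thirder credence; halfers deny that awakenings, rather than coin tosses, are the right events to count, which is exactly the disagreement the abstract connects to the Doomsday Argument.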
Pekka Lahti is a prominent exponent of the renaissance of foundational studies in quantum mechanics that has taken place during the last few decades. Among other things, he and coworkers have drawn renewed attention to, and have analyzed with fresh mathematical rigor, the threat of inconsistency at the basis of quantum theory: ordinary measurement interactions, described within the mathematical formalism by Schrödinger-type equations of motion, seem to be unable to lead to the occurrence of definite measurement outcomes, whereas the same formalism is interpreted in terms of probabilities of precisely such definite outcomes. Of course, it is essential here to be explicit about how definite measurement results (or definite properties in general) should be represented in the formalism. To this end Lahti et al. have introduced their objectification requirement that says that a system can be taken to possess a definite property if it is certain (in the sense of probability 1) that this property will be found upon measurement. As they have gone on to demonstrate, this requirement entails that in general definite outcomes cannot arise in unitary measuring processes. In this paper we investigate whether it is possible to escape from this deadlock. As we shall argue, there is a way out in which the objectification requirement is fully maintained. The key idea is to adapt the notion of objectivity itself, by introducing relational or perspectival properties. It seems that such a “relational perspective” offers prospects of overcoming some of the long-standing problems in the interpretation of quantum mechanics.
Paul Busch has emphasized on various occasions the importance for physics of going beyond a merely instrumentalist view of quantum mechanics. Even if we cannot be sure that any particular realist interpretation describes the world as it actually is, the investigation of possible realist interpretations helps us to develop new physical ideas and better intuitions about the nature of physical objects at the micro level. In this spirit, Paul Busch himself pioneered the concept of “unsharp quantum reality”, according to which there is an objective non-classical indeterminacy—a lack of sharpness—in the properties of individual quantum systems. We concur with Busch’s motivation for investigating realist interpretations of quantum mechanics and with his willingness to move away from classical intuitions. In this article we try to take some further steps on this road. In particular, we pay attention to a number of prima facie implausible and counter-intuitive aspects of realist interpretations of unitary quantum mechanics. We shall argue that from a realist viewpoint, quantum contextuality naturally leads to “perspectivalism” with respect to properties of spatially extended quantum systems, and that this perspectivalism is important for making relativistic covariance possible.
In his general theory of relativity (GR) Einstein sought to generalize the special-relativistic equivalence of inertial frames to a principle according to which all frames of reference are equivalent. He claimed to have achieved this aim through the general covariance of the equations of GR. There is broad consensus among philosophers of relativity that Einstein was mistaken in this. That equations can be made to look the same in different frames certainly does not imply in general that such frames are physically equivalent. We shall argue, however, that Einstein's position is tenable. The equivalence of arbitrary frames in GR should not be equated with relativity of arbitrary motion, though. There certainly are observable differences between reference frames in GR (differences in the way particles move and fields evolve). The core of our defense of Einstein's position will be to argue that such differences should be seen as fact-like rather than law-like in GR. By contrast, in classical mechanics and in special relativity (SR) the differences between inertial systems and accelerated systems have a law-like status. The fact-like character of the differences between frames in GR justifies regarding them as equivalent in the same sense as inertial frames in SR.
It is argued that the symmetry and anti-symmetry of the wave functions of systems consisting of identical particles have nothing to do with the observational indistinguishability of these particles. Rather, a much stronger conceptual indistinguishability is at the bottom of the symmetry requirements. This can be used to argue further, in analogy to old arguments of De Broglie and Schrödinger, that the reality described by quantum mechanics has a wave-like rather than particle-like structure. The question of whether quantum statistics alone can give rise to empirically observable correlations between results of distant measurements is also discussed.
Textbooks present classical particle and field physics as theories of physical systems situated in Newtonian absolute space. This absolute space has an influence on the evolution of physical processes, and can therefore be seen as a physical system itself; it is substantival. It turns out to be possible, however, to interpret the classical theories in another way. According to this rival interpretation, spatiotemporal position is a property of physical systems, and there is no substantival spacetime. The traditional objection that such a relationist view could not cope with the existence of inertial effects and other manifestations of the causal efficacy of spacetime can be answered successfully. According to the new point of view, the spacetime manifold of classical physics is a purely representational device. It represents possible locations of physical objects or events; but these locations are physical properties inherent in the physical objects or events themselves and having no existence independently of them. In relativistic quantum field theory the physical meaning of the spacetime manifold becomes even less tangible. Not only does the manifold lose its status as a substantival container, but also its function as a representation of spacetime properties possessed by physical systems becomes problematic. 'Space and time' become ordering parameters in the web of properties of physical systems. They seem to regain their traditional meaning only in the non-relativistic limit in which the classical particle concept becomes approximately applicable.
This volume is a serious attempt to open up the subject of European philosophy of science to real thought, and provide the structural basis for the ...
Reductionism, in the sense of the doctrine that theories on different levels of reality should exhibit strict and general relations of deducibility, faces well-known difficulties. Nevertheless, the idea that deeper layers of reality are responsible for what happens at higher levels is well-entrenched in scientific practice. We argue that the intuition behind this idea is adequately captured by the notion of supervenience: the physical state of the fundamental physical layers fixes the states of the higher levels. Supervenience is weaker than traditional reductionism, but it is not a metaphysical doctrine: one can empirically support the existence of a supervenience relation by exhibiting concrete relations between the levels. Much actual scientific research is directed towards finding such inter-level relations. It seems to be quite generally held that the importance of such relations between different levels is that they are explanatory and give understanding: deeper levels provide deeper understanding, and this justifies the search for ever deeper levels. We shall argue, however, that although achieving understanding is an important aim of science, its correct analysis is not in terms of relations between higher and lower levels. Connections with deeper layers of reality do not generally provide for deeper understanding. Accordingly, the motivation for seeking deeper levels of reality does not come from the desire to find deeper understanding of phenomena, but should be seen as a consequence of the goal to formulate ever better, in the sense of more accurate and more-encompassing, empirical theories.
Classical particles of the same kind are distinguishable: they can be labeled by their positions and follow different trajectories. This distinguishability affects the number of ways W a macrostate can be realized on the micro-level, and via S=k ln W this leads to a non-extensive expression for the entropy. This result is generally considered wrong because of its inconsistency with thermodynamics. It is sometimes concluded from this inconsistency, notoriously illustrated by the Gibbs paradox, that identical particles must be treated as indistinguishable after all; and even that quantum mechanics is indispensable for making sense of this. In this article we argue, by contrast, that the classical statistics of distinguishable particles and the resulting non-extensive entropy function are perfectly all-right both from a theoretical and an experimental perspective. We remove the inconsistency with thermodynamics by pointing out that the entropy concept in statistical mechanics is not completely identical to the thermodynamical one. Finally, we observe that even identical quantum particles are in some cases distinguishable; and conclude that quantum mechanics is irrelevant to the Gibbs paradox.
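The non-extensivity referred to above can be made explicit for an ideal gas; the following is the standard textbook computation, reproduced here for illustration. Counting microstates of distinguishable particles yields an entropy containing a term Nk ln V, which fails the extensivity test under doubling of the system:

```latex
S_{\text{dist}}(V,N) \;=\; Nk\ln V + Nk\,f(T)
\quad\Longrightarrow\quad
S_{\text{dist}}(2V,2N) \;=\; 2\,S_{\text{dist}}(V,N) + 2Nk\ln 2.
```

Dividing the microstate count W by N! (using Stirling's approximation, \ln N! \approx N\ln N - N) replaces \ln V by \ln(V/N) and restores extensivity:

```latex
S_{\text{indist}} \;=\; S_{\text{dist}} - k\ln N!
\;\approx\; Nk\ln\frac{V}{N} + Nk\bigl(f(T)+1\bigr).
```

The argument of the abstract is that the first, non-extensive expression is nevertheless legitimate, once the statistical-mechanical and thermodynamical entropy concepts are properly distinguished.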
I argue that there is a natural relationist interpretation of Newtonian and relativistic non-quantum physics. Although relationist, this interpretation does not fall prey to the traditional objections based on the existence of inertial effects.
This book contains a selection of original conference papers covering all major fields in the philosophy of science, organized into themes.
In his general theory of relativity Einstein sought to generalize the special-relativistic equivalence of inertial frames to a principle according to which all frames of reference are equivalent. He claimed to have achieved this aim through the general covariance of the equations of GR. There is broad consensus among philosophers of relativity that Einstein was mistaken in this. That equations can be made to look the same in different frames certainly does not imply in general that such frames are physically equivalent. We shall argue, however, that Einstein's position is tenable. The equivalence of arbitrary frames in GR should not be equated with relativity of arbitrary motion, though. There certainly are observable differences between reference frames in GR. The core of our defense of Einstein's position will be to argue that such differences should be seen as fact-like rather than law-like in GR. By contrast, in classical mechanics and in special relativity the differences between inertial systems and accelerated systems have a law-like status. The fact-like character of the differences between frames in GR justifies regarding them as equivalent in the same sense as inertial frames in SR.
Jim Cushing emphasized that physical theory should tell us an intelligible and objective story about the world, and concluded that the Bohm theory is to be preferred over the Copenhagen interpretation. We argue here, however, that the Bohm theory is only one member of a wider class of interpretations that can be said to fulfill Cushing’s desiderata. We discuss how the pictures provided by these interpretations differ from the classical one. In particular, it seems that a rather drastic form of perspectivalism is needed if accordance with special relativity is to be achieved.
An often repeated account of the genesis of special relativity tells us that relativity theory was to a considerable extent the fruit of an operationalist philosophy of science. Indeed, Einstein’s 1905 paper stresses the importance of rods and clocks for giving concrete physical content to spatial and temporal notions. I argue, however, that it would be a mistake to read too much into this. Einstein’s operationalist remarks should be seen as serving rhetorical purposes rather than as attempts to promulgate a particular philosophical position --- in fact, Einstein never came close to operationalism in any of his philosophical writings. By focussing on what could actually be measured with rods and clocks Einstein cast doubt on the empirical status of a number of pre-relativistic concepts, with the intention to persuade his readers that the applicability of these concepts was not obvious. This rhetorical manoeuvre has not always been rightly appreciated in the philosophy of physics. Thus, the influence of operationalist misinterpretations, according to which associated operations strictly define what a concept means, can still be felt in present-day discussions about the conventionality of simultaneity. The standard story continues by pointing out that Minkowski in 1908 supplanted Einstein’s approach with a realist spacetime account that has no room for a foundational role of rods and clocks: relativity theory became a description of a four-dimensional ‘absolute world’. As it turns out, however, it is not at all clear that Minkowski was proposing a substantivalist position with respect to spacetime. On the contrary, it seems that from a philosophical point of view Minkowski’s general position was not very unlike the one in the back of Einstein’s mind. However, in Minkowski’s formulation of special relativity it becomes more explicit that the content of spatiotemporal concepts relates to considerations about the form of physical laws.
If accepted, this position has important consequences for the discussion about the conventionality of simultaneity.
The surprising aspects of quantum information are due to two distinctly non-classical features of the quantum world: first, different quantum states need not be orthogonal and, second, quantum states may be entangled. Non-orthogonality leads to the blurring of classical distinctions. On the other hand, entanglement leads via non-locality to teleportation and other ``entanglement-assisted'' forms of communication that go beyond what is classically possible. In this article we attempt to understand these new possibilities via an analysis of the significance of entanglement for the basic physical concepts of a ``particle'' and a ``localized physical system''. Classical particles can be individuated on the basis of qualitative differences in their sets of properties. But in entangled states the ``particle labels'' of the quantum formalism usually do not pick out such sets of individuating particle properties. It is sometimes nevertheless possible to think in terms of individual particles, which may be localized; but we argue that in general the structure of quantum mechanics is at odds with such a particle interpretation. This finally leads us to the conclusion that quantum mechanics is best seen as not belonging to the category of space-time theories, in which physical quantities are functions on space-time points. The resulting picture of the quantum world is relevant for our understanding of the way in which quantum theory is non-local, and it sheds light on the novel aspects of quantum information.
The aim of this article is twofold. First, we shall review and analyse the neo-Kantian justification for the application of probabilistic concepts in science that was defended by Hans Reichenbach early in his career, notably in his dissertation of 1916. At first sight this Kantian approach seems to contrast sharply with Reichenbach’s later logical positivist, frequentist viewpoint. But, and this is our second goal, we shall attempt to show that there is an underlying continuity in Reichenbach’s thought: typical features of his early Kantian conceptions can still be recognized in his later work.
In his book Philosophie der Raum-Zeit-Lehre (1928) Reichenbach introduced the concept of universal force. Reichenbach's use of this concept was later severely criticized by Grünbaum. In this article it is argued that although Grünbaum's criticism is correct in an important respect, it misses part of Reichenbach's intentions. An attempt is made to clarify and defend Reichenbach's position, and to show that universal force is a useful notion in the physically important case of gravitation.
In relativistic quantum field theory the notion of a local operation is regarded as basic: each open space-time region is associated with an algebra of observables representing possible measurements performed within this region. It is much more difficult to accommodate the notions of events taking place in such regions or of localized objects. But how can the notion of a local operation be basic in the theory if this same theory is unable to represent localized measuring devices and localized events? After briefly reviewing these difficulties we discuss a strategy for eliminating the tension, namely by interpreting quantum theory in a realist way. To implement this strategy we use the ideas of the modal interpretation of quantum mechanics. We then consider the question of whether the resulting scheme can be made Lorentz invariant.
Saunders has recently claimed that ``identical quantum particles'' with an anti-symmetric state (fermions) are weakly discernible objects, just like irreflexively related ordinary objects in situations with perfect symmetry (Black's spheres, for example). Weakly discernible objects have all their qualitative properties in common but nevertheless differ from each other by virtue of (a generalized version of) Leibniz's principle, since they stand in relations an entity cannot have to itself. This notion of weak discernibility has been criticized as question begging, but we defend and accept it for classical cases like Black's spheres. We argue, however, that the quantum mechanical case is different. Here the application of the notion of weak discernibility is indeed question begging and in conflict with standard interpretational ideas. We conclude that the introduction of the conceptual resource of weak discernibility does not change the interpretational status quo in quantum mechanics.
This volume, the third in this Springer series, contains selected papers from the four workshops organized by the ESF Research Networking Programme "The Philosophy of Science in a European Perspective" (PSE) in 2010: Pluralism in the Foundations of Statistics; Points of Contact between the Philosophy of Physics and the Philosophy of Biology; The Debate on Mathematical Modeling in the Social Sciences; and Historical Debates about Logic, Probability and Statistics. The volume is accordingly divided into four sections, each of them containing papers coming from the workshop focussing on one of these themes. While the programme's core topic for the year 2010 was probability and statistics, the organizers of the workshops embraced the opportunity of building bridges to more or less closely connected issues in general philosophy of science, philosophy of physics and philosophy of the special sciences. However, papers that analyze the concept of probability for various philosophical purposes are clearly a major theme in this volume, as they were in the previous volumes of the same series. This reflects the impressive productivity of probabilistic approaches in the philosophy of science, which form an important part of what has become known as formal epistemology - although, of course, there are non-probabilistic approaches in formal epistemology as well. It is probably fair to say that Europe has been particularly strong in this area of philosophy in recent years.
The theories of pre-quantum physics are standardly seen as representing physical systems and their properties. Quantum mechanics in its standard form is a more problematic case: here, interpretational problems have led to doubts about the tenability of realist views. Thus, QBists and Quantum Pragmatists maintain that quantum mechanics should not be thought of as representing physical systems, but rather as an agent-centered tool for updating beliefs about such systems. It is part and parcel of such views that different agents may have different beliefs and may assign different quantum states. What results is a collection of agent-centered perspectives rather than a unique representation of the physical world. In this paper we argue that the problems identified by QBism and Quantum Pragmatism do not necessitate abandoning the ideal of representing the physical world. We can avail ourselves of the same puzzle-solving strategies as employed by QBists and pragmatists by adopting a perspectival quantum realism. According to this perspectivalism objects may possess different, but equally objective properties with respect to different physically defined perspectives. We discuss two options for such a perspectivalism, a local and a nonlocal one, and apply them to Wigner’s friend and EPR scenarios. Finally, we connect quantum perspectivalism to the recently proposed philosophical position of fragmentalism.
We take another look at Reichenbach’s 1920 conversion to conventionalism, with a special eye to the background of his ‘conventionality of distant simultaneity’ thesis. We argue that elements of Reichenbach’s earlier neo-Kantianism can still be discerned in his later work and, related to this, that his conventionalism should be seen as situated at the level of global theory choice. This is contrary to many of Reichenbach’s own statements, in which he declares that his conventionalism is a consequence of the arbitrariness of coordinative definitions.
This volume, the second in the Springer series Philosophy of Science in a European Perspective, contains selected papers from the workshops organised by the ESF Research Networking Programme PSE (The Philosophy of Science in a European Perspective) in 2009. Five general topics are addressed: 1. Formal Methods in the Philosophy of Science; 2. Philosophy of the Natural and Life Sciences; 3. Philosophy of the Cultural and Social Sciences; 4. Philosophy of the Physical Sciences; 5. History of the Philosophy of Science. This volume is accordingly divided into five sections, each section containing papers coming from the meetings focussing on one of these five themes. However, these sections are not completely independent and detached from each other. For example, an important connecting thread running through a substantial number of papers in this volume is the concept of probability: probability plays a central role in present-day discussions in formal epistemology, in the philosophy of the physical sciences, and in general methodological debates---it is central in discussions concerning explanation, prediction and confirmation. The volume thus also attempts to represent the intellectual exchange between the various fields in the philosophy of science that was central in the ESF workshops.