In this new edition, Arthur Fine looks at Einstein's philosophy of science and develops his own views on realism. A new Afterword discusses the reaction to Fine's own theory. "What really led Einstein . . . to renounce the new quantum order? For those interested in this question, this book is compulsory reading."--Harvey R. Brown, American Journal of Physics "Fine has successfully combined a historical account of Einstein's philosophical views on quantum mechanics and a discussion of some of the philosophical problems associated with the interpretation of quantum theory with a discussion of some of the contemporary questions concerning realism and antirealism. . . . Clear, thoughtful, [and] well-written."--Allan Franklin, Annals of Science "Attempts, from Einstein's published works and unpublished correspondence, to piece together a coherent picture of 'Einstein realism.' Especially illuminating are the letters between Einstein and fellow realist Schrödinger, as the latter was composing his famous 'Schrödinger-Cat' paper."--Nick Herbert, New Scientist "Beautifully clear. . . . Fine's analysis is penetrating, his own results original and important. . . . The book is a splendid combination of new ways to think about quantum mechanics, about realism, and about Einstein's views of both."--Nancy Cartwright, Isis.
The realist programme has degenerated by now to the point where it is quite beyond salvage. A token of this degeneration is that there are altogether too many realisms. It is as though, by splitting into a confusing array of types and kinds, realism has hoped that some one variety might yet escape extinction. I shall survey the debate, and some of these realisms, below. Here I would just point out the obvious: that insofar as the successes of science mount while realism continues to decline, we must conclude that scientific success lends no support to realism. Since it is unlikely to support anti-realism, we have some reason to suspect that the philosophical debate over realism does not concern issues that can be settled by developments in the sciences, no matter how successful science may be. Further, since that success grounds a culture of acceptance for science and its entities, we have reason to believe that the existence of those entities is also not actually the issue that concerns realism. A fortiori, it is not the issue that concerns anti-realism either; nor, I might add, is anti-realism the winner in the philosophical debate that realism has lost.
Faced with realist-resistant sciences and the no-nonsense attitude of the times, realism has moved away from the rather grandiose program that had traditionally been characteristic of its school. The objective of the shift seems to be to protect some doctrine still worthy of the "realist" name. The strategy is to relocate the school to where conditions seem optimal for its defense, and then to insinuate that the case for such a "piecemeal realism" could be made elsewhere too, were there but world enough and time. The burden of this paper is to examine this piecemeal approach and to show why, despite the relocation, it cannot escape the difficulties of its grander cousins. For that purpose I begin with some brief historical reminders, and with a quick review of the state of the argument before realism went to pieces. This will help us see what has been abandoned in realism's flight, and what baggage still remains.
We are often told that quantum phenomena demand radical revisions of our scientific world view and that no physical theory describing well-defined objects, such as particles described by their positions, evolving in a well-defined way, let alone deterministically, can account for such phenomena. The great majority of physicists continue to subscribe to this view, despite the fact that just such a deterministic theory, accounting for all of the phenomena of nonrelativistic quantum mechanics, was proposed by David Bohm more than four decades ago and has arguably been around almost since the inception of quantum mechanics itself. Our purpose in asking colleagues to write the essays for this volume has not been to produce a Festschrift in honor of David Bohm or to gather together a collection of papers simply stating uncritically Bohm's views on quantum mechanics. The central theme around which the essays in this volume are arranged is David Bohm's version of quantum mechanics. It has by now become fairly standard practice to refer to his theory as Bohmian mechanics and to the larger conceptual framework within which this is located as the causal quantum theory program. While one can have reservations about the appropriateness of these specific labels, both elicit distinctive images characteristic of the key concepts of these approaches, and such terminology serves effectively to contrast this class of theories with more standard formulations of quantum theory.
This paper constructs two classes of models for the quantum correlation experiments used to test the Bell-type inequalities: synchronization models and prism models. Both classes employ deterministic hidden variables, satisfy the causal requirements of physical locality, and yield precisely the quantum mechanical statistics. In the synchronization models, the joint probabilities, for each emission, do not factor in the manner of stochastic independence, showing that such factorizability is not required for locality. In the prism models the observables are not random variables over a common space; hence these models throw into question the entire random-variables idiom of the literature. Both classes of models appear to be testable.
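For orientation, here is a minimal toy in the spirit of the prism models (a sketch only, not the models constructed in the paper, which reproduce the exact quantum statistics): each pair carries a deterministic, purely local instruction set that leaves some settings undetected, and the coincidence-selected correlations then exceed the Bell-CHSH bound of 2. The hidden states and setting labels below are hypothetical illustrations.

import random

# Hidden states: for Alice's settings (a1, a2) and Bob's (b1, b2), each state fixes
# an outcome of +1, -1, or None ("no detection"), independently of the distant setting.
HIDDEN_STATES = [
    {"a1": +1, "a2": None, "b1": +1, "b2": None},  # only (a1, b1) coincidences, product +1
    {"a1": +1, "a2": None, "b1": None, "b2": +1},  # only (a1, b2) coincidences, product +1
    {"a1": None, "a2": +1, "b1": +1, "b2": None},  # only (a2, b1) coincidences, product +1
    {"a1": None, "a2": +1, "b1": None, "b2": -1},  # only (a2, b2) coincidences, product -1
]

def correlation(alice_setting, bob_setting, n_pairs=100_000):
    """Coincidence-selected correlation for one pair of settings."""
    product_sum, coincidences = 0, 0
    for _ in range(n_pairs):
        lam = random.choice(HIDDEN_STATES)   # the source emits a pair in hidden state lam
        a, b = lam[alice_setting], lam[bob_setting]
        if a is not None and b is not None:  # keep only jointly detected pairs
            coincidences += 1
            product_sum += a * b
    return product_sum / coincidences

S = (correlation("a1", "b1") + correlation("a1", "b2")
     + correlation("a2", "b1") - correlation("a2", "b2"))
print(f"CHSH value on the detected subensemble: {S:.2f}")  # 4.00, above the local bound of 2

The toy only isolates the logical point about setting-dependent non-detection; the synchronization and prism models of the paper are constructed so that the detected statistics match the quantum mechanical predictions themselves.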
This paper develops lines of criticism directed at two currently popular versions of anti-realism: the Putnam-Rorty-Kuhn version that is centered on an acceptance theory of truth, and the van Fraassen version that is centered on empiricist strictures over warranted beliefs. The paper continues by elaborating and extending a stance, called "the natural ontological attitude", that is neither realist nor anti-realist.
In the May 15, 1935 issue of Physical Review, Albert Einstein co-authored a paper with his two postdoctoral research associates at the Institute for Advanced Study, Boris Podolsky and Nathan Rosen. The article was entitled “Can Quantum Mechanical Description of Physical Reality Be Considered Complete?” (Einstein et al. 1935). Generally referred to as “EPR”, this paper quickly became a centerpiece in the debate over the interpretation of the quantum theory, a debate that continues today. The paper features a striking case where two quantum systems interact in such a way as to link both their spatial coordinates in a certain direction and also their linear momenta (in the same direction). As a result of this “entanglement”, determining either position or momentum for one system would fix (respectively) the position or the momentum of the other. EPR use this case to argue that one cannot maintain both an intuitive condition of local action and the completeness of the quantum description by means of the wave function. This entry describes the argument of that 1935 paper, considers several different versions and reactions, and explores the ongoing significance of the issues they raise.
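Schematically, and up to normalization, the entangled state used in the 1935 paper can be written as

\[
\Psi(x_1, x_2) \;=\; \int_{-\infty}^{\infty} e^{(i/\hbar)\, p\,(x_1 - x_2 + x_0)}\, dp \;\propto\; \delta(x_1 - x_2 + x_0),
\]

so that a measurement of the position x_1 fixes x_2 = x_1 + x_0, while expanding the same wave function in momentum eigenfunctions shows that a measurement of p_1 fixes p_2 = -p_1. (This is a standard textbook rendering of the EPR state, offered here for orientation rather than quoted from the entry.)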
Two principles of locality used in discussions about quantum mechanics are distinguished. The intuitive no-action-at-a-distance requirement is called physical locality. There is also a mathematical requirement of a kind of factorizability, which is referred to as "locality". It is argued in this paper that factorizability is not necessary for physical locality. Ways of producing models that are physically local although not factorizable, concerned with correlations between the behavior of pairs of particles, are suggested. These models can account for all the quantum mechanical single and joint probabilities.
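One standard way to write the distinction (a sketch in hidden-variable notation, not the paper's own formalism): for a hidden state λ, settings A and B, and outcomes a and b, factorizability requires

\[
P_\lambda(a, b \mid A, B) \;=\; P_\lambda(a \mid A)\, P_\lambda(b \mid B),
\]

whereas physical locality, understood as no action at a distance, requires only that each wing's statistics be insensitive to the distant setting,

\[
\sum_b P_\lambda(a, b \mid A, B) \;=\; P_\lambda(a \mid A) \quad \text{for every } B,
\]

and symmetrically for the other wing. The second condition does not entail the first, which is the gap the models described here exploit.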
(Draft copy published as “Science Made Up: Constructivist Sociology of Scientific Knowledge.” In P. Galison and D. Stump (eds.) The Disunity of Science: Boundaries, Contexts, and Power. Stanford: Stanford University Press, 1996, pp. 231-54.).
This paper examines the efficiency problem involved in experimental tests of so-called “local” hidden variables. It separates the phenomenological locality at issue in the Bell case from Einstein's different conception of locality, and shows how phenomenological locality also differs from the factorizability needed to derive the Bell inequalities in the stochastic case. It then pursues the question of whether factorizable, local models (or, equivalently, deterministic ones) exist for the experiments designed to test the Bell inequalities, thus rendering the experimental argument against them incomplete. This leads to an investigation of the so-called “prism models” and to new inequalities for a significant class of such models, inequalities that are testable even at the low efficiencies of the photon correlation experiments.
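The equivalence invoked here between factorizable stochastic models and deterministic ones can be illustrated by a standard construction (a sketch, not the paper's own argument): enlarge the hidden variable with two independent local randomizers,

\[
\lambda' = (\lambda, u, v), \qquad u, v \ \text{independent and uniform on } [0,1],
\]
\[
A(\lambda', x) = \begin{cases} +1, & u \le P_\lambda(+1 \mid x), \\ -1, & \text{otherwise,} \end{cases}
\qquad
B(\lambda', y) = \begin{cases} +1, & v \le P_\lambda(+1 \mid y), \\ -1, & \text{otherwise,} \end{cases}
\]

where x and y are the local settings. Averaging over u and v recovers the factorized probabilities, so a factorizable stochastic model exists just in case a deterministic local one does.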
In the contemporary discussion of hidden variable interpretations of quantum mechanics, much attention has been paid to the “no hidden variable” proof contained in an important paper of Kochen and Specker. It is a little noticed fact that Bell published a proof of the same result the preceding year, in his well-known 1966 article, where it is modestly described as a corollary to Gleason's theorem. We want to bring out the great simplicity of Bell's formulation of this result and to show how it can be extended in certain respects.
This paper addresses the “inefficiency loophole” in the Bell theorem. We examine factorizable stochastic models for the Bell inequalities, where we allow the detection efficiency to depend both on the “hidden” state of the measured system and also its passage through an analyzer. We show that, nevertheless, if the efficiency functions are symmetric between the two wings of the experiment, one can dispense with supplementary assumptions and derive new inequalities that enable the models to be tested even for highly inefficient experiments.
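For orientation, the simpler constant-efficiency case gives a frequently cited benchmark (a sketch of the standard Garg-Mermin-style bound, not the symmetric-efficiency inequalities derived in the paper): if non-detections are discarded and the efficiency η is the same on both wings and independent of the hidden state, local models obey |S| ≤ 4/η - 2, so the quantum maximum of 2√2 is only decisive above a threshold efficiency.

import math

def chsh_bound_local(eta):
    """Postselected CHSH bound for local models with constant, symmetric efficiency eta."""
    return 4.0 / eta - 2.0

quantum_max = 2 * math.sqrt(2)             # Tsirelson bound, about 2.83
eta_critical = 4.0 / (quantum_max + 2.0)   # efficiency at which the two bounds coincide
print(f"critical efficiency: {eta_critical:.3f}")                # about 0.828
print(f"local bound at eta = 0.7: {chsh_bound_local(0.7):.2f}")  # about 3.71, above 2*sqrt(2)

The point of allowing hidden-state-dependent efficiencies, as in the paper, is precisely that inequalities of this simple form no longer apply without further argument.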
What we represent to ourselves behind the appearances exists only in our understanding . . . [having] only the value of memoria technica or formula whose form, because it is arbitrary and irrelevant, varies . . . with the standpoint of our culture.
Recently, W. H. Zurek presented a novel derivation of the Born rule based on a mechanism termed environment-assisted invariance, or “envariance” [W. H. Zurek, Phys. Rev. Lett. 90, 120404 (2003)]. We review this approach and identify fundamental assumptions that have implicitly entered into it, emphasizing issues that any such derivation is likely to face.
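As a minimal numerical illustration of the envariance property itself (a sketch of the symmetry, not of Zurek's derivation or of the paper's analysis of it): for an equal-amplitude entangled state of system and environment, a swap applied to the system alone can be undone by a counter-swap applied to the environment alone, and it is from this symmetry that equal Born-rule probabilities are argued to follow.

import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Equal-amplitude entangled state of system S and environment E: (|0>|e0> + |1>|e1>)/sqrt(2)
psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

swap = np.array([[0.0, 1.0], [1.0, 0.0]])  # exchanges |0> and |1>
identity = np.eye(2)

swapped_on_system = np.kron(swap, identity) @ psi                     # act on S only
undone_via_environment = np.kron(identity, swap) @ swapped_on_system  # counter-swap on E only

print(np.allclose(undone_via_environment, psi))  # True: the swap on S is undone without touching S again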
The aim of this paper is to present and discuss a probabilistic framework that is adequate for the formulation of quantum theory and faithful to its applications. Contrary to claims, which are examined and rebutted, that quantum theory employs a nonclassical probability theory based on a nonclassical "logic," the probabilistic framework set out here is entirely classical and the "logic" used is Boolean. The framework consists of a set of states and a set of quantities that are interrelated in a specified manner. Each state induces a classical probability space on the values of each quantity. The quantities, so considered, become statistical variables (not random variables). Such variables need not have a "joint distribution." For the quantum theoretic application, there is a uniform procedure that defines and determines the existence of such "joint distributions" for statistical variables. A general rule is provided and it is shown to lead to the usual compatibility-commutativity requirements of quantum theory. The paper concludes with a brief discussion of interference and the misunderstandings that are involved in the false move from interference to nonclassical probability.
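A standard toy example of statistical variables that have well-defined pairwise distributions but no joint distribution (an illustration for orientation, not an example taken from the paper): three ±1 variables cannot all be perfectly anticorrelated in pairs, because any joint distribution bounds the sum of the pairwise correlations from below.

from itertools import product

# For any joint distribution over three +/-1 variables A, B, C, the quantity
# E(AB) + E(BC) + E(AC) is an average of ab + bc + ac over joint assignments,
# so it can be no smaller than the minimum over the eight assignments.
lower_bound = min(a*b + b*c + a*c for a, b, c in product((+1, -1), repeat=3))
print(lower_bound)  # -1

# Target pairwise correlations E(AB) = E(BC) = E(AC) = -1 sum to -3 < -1, so no joint
# distribution over (A, B, C) reproduces them, although each pairwise correlation of -1
# is perfectly possible on its own.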
A recent analysis by de Barros and Suppes of experimentally realizable GHZ correlations supports the conclusion that these correlations cannot be explained by introducing local hidden variables. We show, nevertheless, that their analysis does not exclude local hidden variable models in which the inefficiency in the experiment is an effect not only of random errors in the detector equipment, but also a manifestation of a pre-set, hidden property of the particles ("prism models"). Indeed, we present an explicit prism model for the GHZ scenario; that is, a local hidden variable model entirely compatible with recent GHZ experiments.
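The GHZ correlations at issue can be checked directly (a quick numerical sketch of the standard argument that the de Barros-Suppes analysis concerns, not of the prism construction): for the three-particle GHZ state the products XYY, YXY, and YYX each equal -1 with certainty while XXX equals +1, yet any assignment of local values x_i, y_i = ±1 forces the product of the first three to equal the fourth.

import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
# Three-particle GHZ state (|000> + |111>)/sqrt(2)
ghz = (np.kron(np.kron(ket0, ket0), ket0) + np.kron(np.kron(ket1, ket1), ket1)) / np.sqrt(2)

def expectation(op1, op2, op3):
    op = np.kron(np.kron(op1, op2), op3)
    return float(np.real(ghz.conj() @ op @ ghz))

for label, ops in [("XYY", (X, Y, Y)), ("YXY", (Y, X, Y)), ("YYX", (Y, Y, X)), ("XXX", (X, X, X))]:
    print(label, round(expectation(*ops)))  # -1, -1, -1, +1

# With local values, (x1 y2 y3)(y1 x2 y3)(y1 y2 x3) = x1 x2 x3, so the product of the first
# three outcomes must equal the fourth; the quantum values give (-1)(-1)(-1) = -1 against +1.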
This is a comment on Peter Godfrey-Smith’s “Models and Fictions in Science”. The comments explore problems he raises about treating model systems as fictions within a naturalized and deflationary framework.
In the concluding chapter of Exceeding our Grasp Kyle Stanford outlines a positive response to the central issue raised brilliantly by his book, the problem of unconceived alternatives. This response, called "epistemic instrumentalism", relies on a distinction between instrumental and literal belief. We examine this distinction and with it the viability of Stanford's instrumentalism, which may well be another case of exceeding our grasp.
Interpreting Science. Arthur Fine - 1988 - PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association 1988: 3-11.
Using episodes in the history of the interpretation of the psi-function, this paper addresses the question of how the understanding of science sought by philosophy of science relates to the understanding sought by science itself. This leads to a conception of the discipline of philosophy of science as an historical entity. The paper concludes by drawing out the implications of that conception for our role in the humanities, and our relationship to the sciences and to ongoing scientific work.
Two things about Hilary Putnam have not changed throughout his career: some (including Putnam himself) have regarded him as a “realist” and some have seen him as a philosopher who changed his positions (certainly with respect to realism) almost continually. Apparently, what realism meant to him in the 1960s, in the late seventies and eighties, and in the nineties, respectively, are quite different things. Putnam indicates this by changing prefixes: scientific, metaphysical, internal, pragmatic, commonsense, but always realism. Encouraged by Putnam’s own attempts to distinguish his views from one time to another, his work is often regarded as split between an early period of “metaphysical realism” (his characterization) and a later and still continuing period of “internal realism”. Late Putnam is understood to be a view that insists on the primacy of our practices, while the early period is taken to be a view from outside these, a “God’s Eye view”. As Putnam himself stresses (1992b), this way of dividing his work obscures continuities, the most important of which is a continuing attempt to understand what is involved in judging practices of inquiry, like science, as being objectively correct. Thus Putnam’s early and his current work appear to have more in common than the division between “early” and “late” suggests. In fact, Putnam’s earlier writings owe much of their critical force to his adopting the pragmatic perspective of an open-minded participant in practices of empirical inquiry, a stance not explicitly articulated in these writings but rather taken simply as a matter of course. Thus insofar as Putnam’s early writings defend a form of representational realism, they can be regarded as attempts to articulate a realist position at work inside our ordinary practices of making empirical judgments. For this reason, we begin our review of Putnam’s realisms by extracting from the early writings a core of principles that carries over into his current work but underwent significantly different interpretations over time.
"But science in the making, science as an end to be pursued, is as subjective and psychologically conditioned as any other branch of human endeavor-- so much so that the question, What is the purpose and meaning of science? receives quite different answers at different times and from different sorts of people" (Einstein 1934, p. 112).
In defending NOA against some contemporary antirealisms I distinguish two antirealist camps: the epistemology inflaters, who come to their antirealism by filling up inquiry and belief formation with various warrants and principles of justification, and the semantic inflaters, or truthmongers, who come to their antirealism by exchanging truth for some epistemic notion, like ideal rational acceptability. In parity with arguments against the correspondence theory of truth, which I see at the heart of various realisms, I argue against antirealist truthmongering in two ways. One is inductive and hortative. I point to the history of failures of all past attempts at theories of truth, and try to suggest better things for philosophy to do instead. The other way is deconstructive. I examine the attempted explications of truth in the terms set by their own discourses, and try to show that they cannot actually stand on their own there. Lily Knezevich looks at this deconstructive work in her ‘Truthmongering’ and finds it flawed by what I will call ‘Knezevich’s fallacy’.