If a proposition is typically true, given your evidence, then you should believe that proposition; or so I argue here. In particular, in this paper, I propose and defend a principle of rationality---call it the `Typical Principle'---which links rational belief to facts about what is typical. As I show, this principle avoids several problems that other, seemingly similar principles face. And as I show, in many cases, this principle implies the verdicts of the Principal Principle: so ultimately, the Typical Principle may be the more fundamental of the two.
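For orientation, a hedged gloss that is not drawn from the paper itself: Lewis's Principal Principle, to which the abstract compares the Typical Principle, is standardly written as

\[ Cr(A \mid \mathrm{ch}(A) = x \,\&\, E) = x, \]

where Cr is a rational initial credence function, ch(A) is the objective chance of A, and E is any admissible evidence. The claim above is that the Typical Principle recovers such verdicts in many cases.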
This paper is concerned with the causally symmetric version of the familiar de Broglie–Bohm interpretation, this version allowing the spacelike nonlocality and the configuration space ontology of the original model to be avoided via the addition of retrocausality. Two different features of this alternative formulation are considered here. With regard to probabilities, it is shown that the model provides a derivation of the Born rule identical to that in Bohm’s original formulation. This derivation holds just as well for a many-particle, entangled state as for a single particle. With regard to “certainties”, the description of a particle’s spin is examined within the model and it is seen that a statistical description is no longer necessary once final boundary conditions are specified in addition to the usual initial state, with the particle then possessing a definite value for every spin component at intermediate times. These values are consistent with being the components of a single, underlying spin vector. The case of a two-particle entangled spin state is also examined and it is found that, due to the retrocausal aspect, each particle possesses its own definite spin during the entanglement, independent of the other particle. In formulating this picture, it is demonstrated how such a realistic model can preserve Lorentz invariance in the face of Bell’s theorem and avoid the need for a preferred reference frame.
In this paper, I reconstruct the development and the complex character of Zilsel’s conception of scientific laws. This concept functions as a fil rouge (a guiding thread) for understanding Zilsel’s philosophy throughout different times (here, the focus is on his Viennese writings and how they pave the way to the more renowned American ones) and across his many fields of work (from physics to politics). A good decade before Heisenberg’s uncertainty principle would mark the outbreak of indeterminism in quantum physics, Edgar Zilsel started to develop a complex logical-philosophical theory in which statistical and causal laws were given an indeterministic foundation (Zilsel 1916). However, in developing his thoughts on the emergence of regularities from disorder, Zilsel arrives at a profound ambiguity with respect to the ontological or the epistemic nature of laws and order in the world: whether this order is to be conceived of as an empirical finding or as the product of reason would have to remain unclear. This tension between rationalism and empiricism, as well as a tension between a realist and an anti-realist conception of lawfulness, can be identified in both Zilsel’s Viennese and American writings: a tension which touches the core of the “application problem” that would keep haunting Zilsel until his premature death.
This article traces the origins of Kenneth Wilson's conception of effective field theories (EFTs) in the 1960s. I argue that what really made the difference in Wilson's path to his first prototype of EFT were his long-standing pragmatic aspirations and methodological commitments. Wilson's primary interest was to work on mathematically interesting physical problems, and he thought that progress could be made by treating them as if they could be analyzed in principle by a sufficiently powerful computer. The first point explains why he had no qualms about twisting the structure of field theories; the second why he divided the state-space of a toy model field theory into continuous slices by following a standard divide-and-conquer algorithmic strategy instead of working directly with a fully discretized and finite theory. I also show how Wilson's prototype bears the mark of these aspirations and commitments and clear up a few striking ironies along the way.
Changes within observable reality, at its lowest level, seem to occur in accordance with the mathematical theory of probability. It is quite remarkable that nature itself has chosen probability theory to arrange all the changes within the structure of the basic quantum fields. This raises a question about the distribution of properties in space and time. DOI: 10.5281/zenodo.5515861.
A disposition or dispositional property is a capacity, ability, or potential to display or exhibit some outcome. Evolvability refers to a disposition to evolve. This chapter discusses why the dispositional nature of evolvability matters—why philosophical distinctions about dispositions can have scientific implications. To that end, we build a conceptual toolkit with vocabulary from prior philosophical analyses using a different disposition: protein foldability. We then apply this toolkit to address several methodological questions related to evolvability. What entities are the bearers of evolvability? What features causally contribute to the disposition of evolvability? How does evolvability manifest? The various possible answers to these questions available from philosophical distinctions suggest key implications for why the concept of evolvability as a disposition is useful in evolutionary research. These include (1) securing scientific virtues (e.g., explanatory depth and generalization, prediction or retrodiction, and control or manipulation) and (2) fostering interdisciplinary collaboration through the coordination of definitional diversity and different types of inquiry. Together these facilitate concentration on a variety of research questions at different levels of organization and on distinct time scales, all of which should be expected for a complex dispositional property such as evolvability.
This paper develops a Fragmentalist theory of Presentism and shows how it can help to develop an interpretation of quantum mechanics. There are several fragmental interpretations of physics. In the interpretation of this paper, each quantum system forms a fragment, and fragment f1 makes a measurement on fragment f2 if and only if f2 makes a corresponding measurement on f1. The main idea is then that each fragment has its own present (or ‘now’) until a mutual quantum measurement—at which time they come (‘become’) to share the same ‘now’. The theory of time developed here will make use of both McTaggart’s A-series (in the form of future-present-past) and B-series (earlier-times to later-times). An example of an application is that a Bell pair of electrons does not take on definite spin values until measurement because the measuring system and the Bell pair do not share the same present (‘now’) until mutual quantum measurement, i.e. until they come (‘become’) to share the same A-series. Before that point the ‘now’ of the opposing system is not in the reference system’s fragment. Relativistic no-signaling is preserved within each fragment, which will turn out to be sufficient for the general case. Several issues in the foundations of quantum mechanics are canvassed, including Schrödinger’s cat, the Born rule, modifications to Minkowski space that accommodate both the A-series and the B-series, and entropy.
Ian Hacking (born in 1936, Vancouver, British Columbia) is most well-known for his work in the philosophy of the natural and social sciences, but his contributions to philosophy are broad, spanning many areas and traditions. In his detailed case studies of the development of probabilistic and statistical reasoning, Hacking pioneered the naturalistic approach in the philosophy of science. Hacking’s research on social constructionism, transient mental illnesses, and the looping effect of human kinds makes use of historical materials to shed light on how developments in the social, medical, and behavioural sciences have shaped our contemporary conceptions of identity and agency. Hacking’s other contributions to philosophy include his work on the philosophy of mathematics (Hacking, 2014), philosophy of statistics, philosophy of logic, inductive logic (Hacking, 1965, 1979, 2001) and natural kinds (Hacking, 1991, 2007a).
This book seeks to offer original answers to all the major open questions in epistemology—as indicated by the book’s title. These questions and answers arise organically in the course of a validation of the entire corpus of human knowledge. The book explains how we know what we know, and how well we know it. The author presents a positive theory, motivated and directed at every step not by a need to reply to skeptics or subjectivists, but by the need of a rational individual to know the world.

The author draws heavily from Ayn Rand’s theory of concepts as presented in her book, Introduction to Objectivist Epistemology, but departs from her epistemology—especially as explicated by leading exponents of her philosophy—in important ways. Areas of departure include the perception of entities and the role of probability theory in epistemology.

A main idea in the book is to begin the validation of the law of causality by identifying that existence causes our consciousness of it. Another key idea is this: The very fact that we perceive entities is the most basic and most extensive evidence that causal forces and causal interactions are regular.

Applying Ayn Rand’s principle that characteristics are ranges of measurement, the book offers a philosophical validation of “nonparametric predictive inference,” in turn providing an objective basis for prior probabilities in Bayesian analysis.

This book is primarily for a specialized readership—familiar with epistemological ideas and with Ayn Rand’s theory of concepts in particular. The last third of the book uses probability theory and mathematics somewhat beyond basic calculus, but less mathematical explanations are also included to make the general ideas accessible to lay readers.
This article concerns the explanation of the scientific affair. There are philosophical circles whose approaches a philosopher must consider. Postmodern thinkers generally refuse the universality of the rational; they believe that experience cannot reach general knowledge, and they emphasize partial and plural knowledge. Every human being has his own knowledge and interpretation. The world is always becoming. Diversity is an inclusive epistemological principle. Naturally, in such a state, scientific activity is a nonsensical process: the postmodern world is a post-scientific world. Another collection of approaches to science belongs to analytical philosophy. The main part of the analytical context centers on language. Language is the center of the world; we can study the world through language; language is the possibility of knowledge. If we overcome the language games, we can study the world accurately. Generally, at present, it seems that rational thoughts are marginal points in a sea of irrationality; we can probably talk about an anarchistic epistemological situation. Another issue is the role of observation in forming scientific activity. Here the question is whether observation is a technique for gathering information or the final reference of original scientific analysis. Related to this issue, the confrontation of science with the families and categories of objects is the main problem. For more than two thousand years, the totalitarian heritage of Aristotelian logic has been the human being's guideline for categorization. In order to reach an inclusive and applicable result, we must categorize everything; but it is not really known whether the world's method of categorization is the same. The accordance of our conceptual categories with their extensions in the world is one of the main problems that has caused the scientific system to malfunction; it is better to say that it has deactivated science. Moreover, there are many interpretations of each event, based on the conditions of its occurrence. Consider a fever, for example: in diagnosing its cause, the most significant factor is the role of the physician. There is no sense in saying that observation has diagnosed the cause of the fever; it is the observer who plays the determinative role. Another problem is the dispute between realism and instrumentalism. Instrumentalists believe that, in forming a theory, concepts are just tools of scientific research; realists, by contrast, hold that we confront the concepts themselves. This chaos causes the scientific system to collapse. The question, then, is what the solution is. I propose a scheme constructed on one main principle: there is no occurrence in the world unless we can trace the effect of consciousness in that occurrence. In other words, there is no accident in the world. One can object that this is precisely the problem: the effects of consciousness---if there is such a thing---are not traceable in the world. An immediate response is that observation is an inefficient technique here: observation is a technique, not a method of interpretation. Some argue that the empirical approach cannot lead us to an authentic criterion for knowing the world, because there is no criterion for truth, and if there were such a criterion, we could obtain it in several ways. The solution to the first problem is simple.

Any occurrence is a truth, even if we take it to be a partial truth; therefore we can say that we face a multi-faceted world. This also addresses the second problem: if consciousness is essentially the origin of the world and of all occurrences, then we face consciousness as a unique truth, even if its creations contradict each other. If there is no consciousness as creator of the world, and we cannot trace it in all existents, then what should we do? And if there is consciousness, why do its creations contradict each other? Since consciousness is a will, it is able to decide in any situation as it intends; the issue of contradictory intentions of consciousness has a teleological aspect, and we cannot judge its teleological substance at present. If we were to believe that there is no consciousness at all, that would contradict the main practical principle of our lives: every action we take is conjoined with consciousness. We learn many things from many sources. The substance of our instincts and emotions can be questioned. It is true that we cannot fully analyze our instincts and emotions, but we can discuss explicitly the rational elements of the instincts. What are these elements? They are the qualities of their appearance. The quality of an instinct's appearance differs among beings; the strength, depth, and duration of these irrational qualities differ in all beings. What factor or factors cause the differences among these aspects? There is certainly a determinative factor: the same factor that determines the occurrence of any event. Even if we believe in partial knowledge, we must accept the confrontation of occurrences; thus it is not possible to refuse occurrences as the reference unit of knowledge. Despite their differences, every occurrence, even a single one, happens, and we can regard it as the reference unit of knowledge. We can then consider every occurrence as an occurred object. When an occurrence happens, it is a determined object; and wherever there is determination, there is certainly a determiner. Why do we not encounter a multi-determinative consciousness? Because there are only a few aspects of the unique super-consciousness: motion, growth, and the amount of constituent material in every being. Throughout the world, differences in these factors create all differences; thus we do not encounter a plural consciousness. Is the birth of the world accidental? Actually, there is no accident; accident has no meaning in the world. As explained above, the constructing element of the world is the occurrence, and an occurrence can be analyzed. Therefore, we should not speak of accident unless we encounter phenomena we cannot analyze. At present we can look at every subject from different points of view; thus there is no theoretical ground for saying that a certain matter is the consequence of accident. Accident is a popular and inexact concept with no philosophical content. Moreover, defining consciousness by way of its contradiction would cause the collapse of all epistemological axioms; one could equally ask why we must not define other logical principles against their contradictions. Thinkers who deny the role of logic usually simply ignore it: they have not been able to dispense with the logical function in forming an epistemological system, because that is no easy task.

This article also discusses other issues, such as taking restriction, rather than the Cartesian cogito, as a stable and firm foundation of knowledge. In addition to these philosophical points, it raises further issues such as the role of instrumental mechanism in forming the new science and the great alterations in the philosophy and history of Europe. Many issues threaten human civilization, and the only way to deal with such threats is certainly to use scientific facilities. Such scientific possibilities undoubtedly come from a complete and inclusive scientific theory. The present article is an attempt to explain the scientific affair in order to develop human achievement, however small.
The motivation comes from the analogy (equivalence?) of the A-series to ontologically private qualia in Dualism. This leads to the proposal that two quantum systems, no matter how small, mutually observe each other when and only when they come to share the same A-series. McTaggart's A-series and B-series can be varied independently so they cannot be the same temporal variable.
In this thought-provoking book, Richard Healey proposes a new interpretation of quantum theory inspired by pragmatist philosophy. Healey puts forward the interpretation as an alternative to realist quantum theories on the one hand such as Bohmian mechanics, spontaneous collapse theories, and many-worlds interpretations, which are different proposals for describing what the quantum world is like and what the basic laws of physics are, and non-realist interpretations on the other hand such as quantum Bayesianism, which proposes to understand quantum theory as describing agents' subjective epistemic states. The central idea of Healey's proposal is to understand quantum theory as providing not a description of the physical world but a set of authoritative and objectively correct prescriptions about how agents should act. The book provides a detailed development and defense of that idea, and it contains interesting discussions about a wide range of philosophical issues such as representation, probability, explanation, causation, objectivity, meaning, and fundamentality. Healey's project is at the intersection of physics and philosophy. The book is divided into two parts. Part I of the book discusses the foundational questions in quantum theory from the perspective of the prescriptive interpretation. In Part II, Healey discusses the philosophical implications of the view. Both parts are written in a way that is largely accessible to non-specialists. In this brief book review, I will focus on two questions: (1) How does Healey's idea work? (2) What reasons are there to believe in it?
This continues from Part 1. It is shown how an intensional interpretation of physics object languages can be formalised, and how a syntactic compositional time reversal operator can subsequently be defined. This is applied to solve the problems used as examples in Part 1. A proof of a general theorem that such an operator must be definable is sketched. A number of related issues about the interpretation of theories of physics, including classical and quantum mechanics and classical EM theory, are discussed.
The principle of common cause asserts that positive correlations between causally unrelated events ought to be explained through the action of some shared causal factors. Reichenbachian common cause systems are probabilistic structures aimed at accounting for cases where correlations of the aforesaid sort cannot be explained through the action of a single common cause. The existence of Reichenbachian common cause systems of arbitrary finite size for each pair of non-causally correlated events was allegedly demonstrated by Hofer-Szabó and Rédei in 2006. This paper shows that their proof is logically deficient, and we propose an improved proof.
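As a hedged reminder of the standard notions in play (textbook formulations, not quoted from the paper): given a correlation p(A \wedge B) > p(A)\,p(B) between causally unrelated events, a Reichenbachian common cause is an event C satisfying

\[ p(A \wedge B \mid C) = p(A \mid C)\,p(B \mid C), \qquad p(A \wedge B \mid \neg C) = p(A \mid \neg C)\,p(B \mid \neg C), \]
\[ p(A \mid C) > p(A \mid \neg C), \qquad p(B \mid C) > p(B \mid \neg C). \]

A Reichenbachian common cause system of size n replaces the pair \{C, \neg C\} with an n-cell partition \{C_1, \ldots, C_n\}, each cell of which screens off A from B in the above sense.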
Rather than entailing that a particular outcome will occur, many scientific theories only entail that an outcome will occur with a certain probability. Because scientific evidence inevitably falls short of conclusive proof, when choosing between different theories it is standard to make reference to how probable the various options are in light of the evidence. A full understanding of probability in science needs to address both the role of probabilities in theories, or chances, as well as the role of probabilistic judgment in theory choice. In this chapter, the author introduces and distinguishes the two sorts of probability from one another and attempts to offer a satisfactory characterization of how the different uses for probability in science are to be understood. A closing section turns to the question of how views about the chance of some outcome should guide our confidence in that outcome.
We clarify the significance of quasiprobability (QP) in quantum mechanics that is relevant in describing physical quantities associated with a transition process. Our basic quantity is Aharonov’s weak value, from which the QP can be defined up to a certain ambiguity parameterized by a complex number. Unlike the conventional probability, the QP allows us to treat two noncommuting observables consistently, and this is utilized to embed the QP in Bohmian mechanics such that its equivalence to quantum mechanics becomes more transparent. We also show that, with the help of the QP, Bohmian mechanics can be recognized as an ontological model with a certain type of contextuality.
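For context, the standard definition at issue (textbook form, not quoted from this paper): given a preselected state |\psi\rangle and a postselected state |\phi\rangle, Aharonov's weak value of an observable A is

\[ A_w = \frac{\langle \phi | A | \psi \rangle}{\langle \phi | \psi \rangle}, \]

a generally complex number; the quasiprobability the authors build from it is then fixed only up to the complex parameter mentioned above.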
One implication of Bell’s theorem is that there cannot in general be hidden variable models for quantum mechanics that both are noncontextual and retain the structure of a classical probability space. Thus, some hidden variable programs aim to retain noncontextuality at the cost of using a generalization of the Kolmogorov probability axioms. We generalize a theorem of Feintzeig to show that such programs are committed to the existence of a finite null cover for some quantum mechanical experiments, i.e., a finite collection of probability zero events whose disjunction exhausts the space of experimental possibilities.
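To make the commitment explicit (a sketch of the stated notion, as this reader understands it): a finite null cover of a probability space (\Omega, \mathcal{F}, p) is a finite family E_1, \ldots, E_n \in \mathcal{F} with

\[ p(E_i) = 0 \ \text{for each } i, \qquad \bigvee_{i=1}^{n} E_i = \Omega. \]

No classical probability measure admits one, since finite subadditivity would give p(\Omega) \le \sum_i p(E_i) = 0, contradicting p(\Omega) = 1; the result therefore bites only in the generalized, non-Kolmogorovian setting.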
The paper argues that the formulation of quantum mechanics proposed by Ghirardi, Rimini and Weber is a serious candidate for being a fundamental physical theory and explores its ontological commitments from this perspective. In particular, we propose to conceive of spatial superpositions of non-massless microsystems as dispositions or powers, more precisely propensities, to generate spontaneous localizations. We set out five reasons for this view, namely that it provides for a clear sense in which quantum systems in entangled states possess properties even in the absence of definite values; it vindicates objective, single-case probabilities; it yields a clear transition from quantum to classical properties; it enables us to draw a clear distinction between purely mathematical and physical structures; and it grounds the arrow of time in the time-irreversible manifestation of the propensities to localize.
In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|^2 from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999). Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129–3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415–438] to deriving the Born rule for quantum probabilities on the grounds that it courts circularity. Deutsch and Wallace assume that the many worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism—the notion, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401 and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113], that classical information is quantum information that can proliferate in the environment—I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision-theoretic approach to quantum probability.
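For orientation, the rule whose derivation is at issue, in its standard textbook form: for a state expanded in the preferred basis as |\psi\rangle = \sum_k \psi_k |k\rangle, the Born rule assigns outcome k the probability

\[ p_k = |\psi_k|^2 = |\langle k \mid \psi \rangle|^2, \]

and the circularity worry is that the reduced density matrix used in decoherence arguments is obtained via the partial trace, which already presupposes this assignment.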
This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.
Though Béla von Juhos belonged to a Hungarian family, he was born in Vienna and, after his ninth year, lived there for the rest of his life. Though associated with the Vienna Circle, he did not assume a teaching position in Vienna until 1948. The present collection, ably translated by Paul Foulkes and introduced by Gerhard Frey, focuses on the type of epistemological analysis of scientific knowledge that remained Juhos’s abiding concern. By the mid-nineteen-thirties the pristine positivism of the early Vienna Circle had been compromised by the switch from phenomenalism to physicalism and by expanding the early emphasis on logical syntax to include Tarski’s semantics. Juhos refused to accept such changes and remained to the end the purest, though hardly the most original, of the logical positivists. His development is of some interest in showing how the potentiality of the original program could be fulfilled, something that will merely be indicated here.
Through the thought experiment that I suggest, it is possible to demonstrate that Hugh Everett’s quantum interpretation, known as the “many universes” interpretation, is inconsistent with the special theory of relativity.
According to the Everett interpretation, branching structure and ratios of norms of branch amplitudes are the objective correlates of chance events and chances; that is, 'chance' and 'chancing', like 'red' and 'colour', pick out objective features of reality, albeit not what they seemed. Once properly identified, questions about how and in what sense chances can be observed can be treated as straightforward dynamical questions. On that basis, given the unitary dynamics of quantum theory, it follows that relative and never absolute chances can be observed; that only on repetition of a large number of similar trials can relative probabilities be measured; and so on. The epistemology of objective chances can in this way be worked out from the dynamics; its curious features are thus explained. But one aspect of chance set-ups seems to resist this subsuming of chancing to branching: how is it that chance involves uncertainty? And if that is not possible, on Everettian lines, then the whole project is doomed. I argue that in fact there is no difficulty in making sense of uncertainty in the face of branching. Contrary to initial impressions, the unitary formalism is consistent with a well-defined notion of self-locating uncertainty. It is also consistent without one: the mathematics under-determines the metaphysics in these respects.
In the absence of a fundamental theory that precisely predicts values for observable parameters, anthropic reasoning attempts to constrain probability distributions over those parameters in order to facilitate the extraction of testable predictions. The utility of this approach has been vigorously debated of late, particularly in light of theories that claim we live in a multiverse, where parameters may take differing values in regions lying outside our observable horizon. Within this cosmological framework, we investigate the efficacy of top-down anthropic reasoning based on the weak anthropic principle. We argue, contrary to recent claims, that it is not clear one can either dispense with notions of typicality altogether or simply presume typicality when comparing the resulting probability distributions with observations. We show in a concrete, top-down setting related to dark matter, that assumptions about typicality can dramatically affect predictions, thereby providing a guide to how errors in reasoning regarding typicality translate to errors in the assessment of predictive power. We conjecture that this dependence on typicality is an integral feature of anthropic reasoning in broader cosmological contexts, and argue in favour of the explicit inclusion of measures of typicality in schemes invoking anthropic reasoning, with a view to extracting predictions from multiverse scenarios.
We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
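For concreteness, a standard statement of the rule under discussion (textbook form, not quoted from the paper): for a system in state \rho, conditionalizing on the outcome associated with projector P_a assigns to the outcome associated with projector P_b the probability

\[ p(P_b \mid P_a) = \frac{\mathrm{Tr}(P_a \rho P_a P_b)}{\mathrm{Tr}(P_a \rho)}, \]

which reduces to the classical ratio formula p(B \mid A) = p(A \wedge B)/p(A) when all the operators involved commute.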
A non-monotonic theory of probability is put forward and shown to have applicability in the quantum domain. It is obtained simply by replacing Kolmogorov's positivity axiom, which places the lower bound for probabilities at zero, with an axiom that reduces that lower bound to minus one. Kolmogorov's theory of probability is monotonic, meaning that the probability of A is less than or equal to that of B whenever A entails B. The new theory violates monotonicity, as its name suggests; yet, many standard theorems are also theorems of the new theory since Kolmogorov's other axioms are retained. What is of particular interest is that the new theory can accommodate quantum phenomena (photon polarization experiments) while preserving Boolean operations, unlike Kolmogorov's theory. Although non-standard notions of probability have been discussed extensively in the physics literature, they have received very little attention in the philosophical literature. One likely explanation for that difference is that their applicability is typically demonstrated in esoteric settings that involve technical complications. That barrier is effectively removed for non-monotonic probability theory by providing it with a homely setting in the quantum domain. Although the initial steps taken in this paper are quite substantial, there is much else to be done, such as demonstrating the applicability of non-monotonic probability theory to other quantum systems and elaborating the interpretive framework that is provisionally put forward here. Such matters will be developed in other works.
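To make the modification explicit (a sketch based on the abstract's description; details may differ in the paper): Kolmogorov's positivity axiom P(A) \ge 0 is replaced by

\[ P(A) \ge -1, \]

with normalization P(\Omega) = 1 and additivity retained. Monotonicity then fails because, for A entailing B, additivity gives P(B) = P(A) + P(B \setminus A), and the second term may now be negative.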
Recently advocates of the propensity interpretation of fitness have turned critics. To accommodate examples from the population genetics literature they conclude that fitness is better defined broadly as a family of propensities rather than the propensity to contribute descendants to some future generation. We argue that the propensity theorists have misunderstood the deeper ramifications of the examples they cite. These examples demonstrate why there are factors outside of propensities that determine fitness. We go on to argue for the more general thesis that no account of fitness can satisfy the desiderata that have motivated the propensity account.
Although it has become a commonplace to refer to the 'sixth problem' of Hilbert's (1900) Paris lecture as the starting point for modern axiomatized probability theory, his own views on probability have received comparatively little explicit attention. The central aim of this paper is to provide a detailed account of this topic in light of the central observation that the development of Hilbert's project of the axiomatization of physics went hand-in-hand with a redefinition of the status of probability theory and the meaning of probability. Where Hilbert first regarded the theory as a mathematizable physical discipline and later approached it as a 'vague' mathematical application in physics, he eventually understood probability, first, as a feature of human thought and, then, as an implicitly defined concept without a fixed physical interpretation. It thus becomes possible to suggest that Hilbert came to question, from the early 1920s on, the very possibility of achieving the goal of the axiomatization of probability as described in the 'sixth problem' of 1900.
The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related to relative frequency, so the task seems to be to interpret that relation. In this paper, I start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call this the ‘prediction interpretation’ of probability. The consequences of that definition are discussed. The “ladder”-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
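The expectation claim admits a one-line check (a standard calculation, not reproduced from the paper): writing the relative frequency of A in n trials, each with probability P(A), as R_n(A) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}_{A_i}, linearity of expectation gives

\[ \mathbb{E}[R_n(A)] = \frac{1}{n} \sum_{i=1}^{n} P(A) = P(A), \]

so the expected relative frequency coincides with the single-case probability, which is what ties the two structures together.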
Climate change adaptation is largely a local matter, and adaptation planning can benefit from local climate change projections. Such projections are typically generated by accepting climate model outputs in a relatively uncritical way. We argue, based on the IPCC’s treatment of model outputs from the CMIP5 ensemble, that this approach is unwarranted and that subjective expert judgment should play a central role in the provision of local climate change projections intended to support decision-making.
I present a proof of the quantum probability rule from decision-theoretic assumptions, in the context of the Everett interpretation. The basic ideas behind the proof are those presented in Deutsch's recent proof of the probability rule, but the proof is simpler and proceeds from weaker decision-theoretic assumptions. This makes it easier to discuss the conceptual ideas involved in the proof, and to show that they are defensible.
Einstein made several attempts to argue for the incompleteness of quantum mechanics, not all of them using a separation principle. One unpublished example, the box parable, has received increased attention in the recent literature. Though the example is tailor-made for applying a separation principle and Einstein indeed applies one, he begins his discussion without it. An analysis of this first part of the parable naturally leads to an argument for incompleteness not involving a separation principle. I discuss the argument and its systematic import. Though it should be kept in mind that the argument is not the one Einstein intends, I show how it suggests itself and leads to a conflict between QM’s completeness and a physical principle more fundamental than the separation principle, i.e. a principle saying that QM should deliver probabilities for physical systems possessing properties at definite times.
In this review article we present different formal frameworks for the description of generalized probabilities in statistical theories. We discuss the particular cases of probabilities appearing in classical and quantum mechanics, possible generalizations of the approaches of A. N. Kolmogorov and R. T. Cox to non-commutative models, and the approach to generalized probabilities based on convex sets.
I review the role of probability in contemporary physics and the origin of probabilistic time asymmetry, beginning with the pre-quantum case but concentrating on quantum theory. I argue that quantum mechanics radically changes the pre-quantum situation and that the philosophical nature of objective probability in physics, and of probabilistic asymmetry in time, is dependent on the correct resolution of the quantum measurement problem.
We are inundated by scientific and statistical information, but what should we believe? How much should we trust the polls on the latest electoral campaign? When a physician tells us that a diagnosis of cancer is 90% certain or a scientist informs us that recent studies support global warming, what should we conclude? How can we acquire reliable statistical information? Once we have it, how do we evaluate it? Despite the importance of these questions to our lives, many of us have only a vague idea of how to answer them. In this admirably clear and engaging book, Mark Battersby provides a practical guide to thinking critically about scientific and statistical information. The goal of the book is not only to explain how to identify misleading statistical information, but also to give readers the understanding necessary to evaluate and use statistical and statistically based scientific information in their own decision making.
Quantum Bayesianism, or QBism, is a recent development of the epistemic view of quantum states, according to which the state vector represents knowledge about a quantum system, rather than the true state of the system. QBism explicitly adopts the subjective view of probability, wherein probability assignments express an agent’s personal degrees of belief about an event. QBists claim that most if not all conceptual problems of quantum mechanics vanish if we simply take a proper epistemic and probabilistic perspective. Although this judgement is largely subjective and logically consistent, I explain why I do not share it.
Alternative theories to quantum mechanics motivate important fundamental tests of our understanding and descriptions of the smallest physical systems. Here, using spontaneous parametric downconversion as a heralded single-photon source, we place experimental limits on a class of alternative theories, consisting of classical field theories which result in power-dependent normalized correlation functions. In addition, we compare our results with standard quantum mechanical interpretations of our spontaneous parametric downconversion source over an order of magnitude in intensity. Our data match the quantum mechanical expectations, and do not show a statistically significant dependence on power, limiting quantum mechanics alternatives which require power-dependent autocorrelation functions.
We give a few results concerning the notions of causal completability and causal closedness of classical probability spaces. We prove that any classical probability space has a causally closed extension; any finite classical probability space with positive rational probabilities on the atoms of the event algebra can be extended to a causally up-to-three-closed finite space; and any classical probability space can be extended to a space in which all correlations between events that are logically independent modulo measure-zero events have a countably infinite common-cause system. Collectively, these results show that it is surprisingly easy to find Reichenbach-style ‘explanations’ for correlations, underlining doubts as to whether this approach can yield a philosophically relevant account of causality.
In order to figure out why quantum physics needs the complex Hilbert space, many attempts have been made to distinguish the C*-algebras and von Neumann algebras in more general classes of abstractly defined Jordan algebras. One particularly important distinguishing property was identified by Alfsen and Shultz: the existence of a dynamical correspondence. It reproduces the dual role of the selfadjoint operators as observables and generators of dynamical groups in quantum mechanics. In the paper, this concept is extended to another class of nonassociative algebras, arising from recent studies of the quantum logics with a conditional probability calculus and particularly of those that rule out third-order interference. The conditional probability calculus is a mathematical model of the Lüders–von Neumann quantum measurement process, and third-order interference is a property of the conditional probabilities which was discovered by Sorkin and which is ruled out by quantum mechanics. It is then shown that the postulates that a dynamical correspondence exists and that the square of any algebra element is positive still characterize, in the class considered, those algebras that emerge from the selfadjoint parts of C*-algebras equipped with the Jordan product. Within this class, the two postulates thus result in ordinary quantum mechanics using the complex Hilbert space or, vice versa, a genuine generalization of quantum theory must omit at least one of them.
Bertrand’s paradox is a fundamental problem in probability that casts doubt on the applicability of the indifference principle by showing that it may yield contradictory results, depending on the meaning assigned to “randomness”. Jaynes claimed that symmetry requirements solve the paradox by selecting a unique solution to the problem. I show that this is not the case and that every variant obtained from the principle of indifference can also be obtained from Jaynes’ principle of transformation groups. This is because the same symmetries can be mathematically implemented in different ways, depending on the procedure of random selection that one uses. I describe a simple experiment that supports a result from symmetry arguments, but the solution is different from Jaynes’. Jaynes’ method is thus best seen as a tool to obtain probability distributions when the principle of indifference is inconvenient, but it cannot resolve ambiguities inherent in the use of that principle and still depends on explicitly defining the selection procedure.
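To illustrate the ambiguity the abstract describes, here is a minimal simulation sketch (mine, not the author's experiment; all function names are hypothetical). The three classical procedures for selecting a "random chord" of a circle implement the same rotational symmetry yet converge to different values, roughly 1/3, 1/2, and 1/4, for the probability that a chord is longer than the side of the inscribed equilateral triangle:

    import math
    import random

    R = 1.0
    SIDE = math.sqrt(3) * R  # side of the equilateral triangle inscribed in the circle

    def chord_from_endpoints():
        # Procedure 1: two endpoints chosen uniformly on the circle (yields ~1/3).
        a = random.uniform(0, 2 * math.pi)
        b = random.uniform(0, 2 * math.pi)
        return 2 * R * math.sin(abs(a - b) / 2)

    def chord_from_radius():
        # Procedure 2: a point chosen uniformly along a radius; the chord is
        # perpendicular to that radius at the chosen point (yields ~1/2).
        d = random.uniform(0, R)
        return 2 * math.sqrt(R * R - d * d)

    def chord_from_midpoint():
        # Procedure 3: the chord's midpoint chosen uniformly in the disk (yields ~1/4).
        while True:
            x = random.uniform(-R, R)
            y = random.uniform(-R, R)
            if x * x + y * y <= R * R:
                return 2 * math.sqrt(R * R - (x * x + y * y))

    def estimate(sampler, n=200_000):
        # Monte Carlo estimate of P(chord length > SIDE) under the given procedure.
        return sum(sampler() > SIDE for _ in range(n)) / n

    if __name__ == "__main__":
        for name, f in [("endpoints", chord_from_endpoints),
                        ("radius", chord_from_radius),
                        ("midpoint", chord_from_midpoint)]:
            print(f"{name:>9}: P(chord > side) ~ {estimate(f):.3f}")

Each sampler is uniform "in" some variable, and that choice of variable is exactly the selection procedure the abstract argues must be specified before symmetry arguments can fix a unique answer.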