Peer review is a widely accepted instrument for raising the quality of science. Peer review limits the enormous unstructured influx of information and the sheer amount of dubious data, which in its absence would plunge science into chaos. In particular, peer review offers the benefit of eliminating papers that suffer from poor craftsmanship or methodological shortcomings, especially in the experimental sciences. However, we believe that peer review is not always appropriate for the evaluation of controversial hypothetical science. We argue that the process of peer review can be prone to bias towards ideas that affirm the prior convictions of reviewers and against innovation and radical new ideas. Innovative hypotheses are thus highly vulnerable to being “filtered out” or made to accord with conventional wisdom by the peer review process. Consequently, having introduced peer review, the Elsevier journal Medical Hypotheses may be unable to continue its tradition as a radical journal allowing discussion of improbable or unconventional ideas. Hence we conclude by asking the publisher to consider re-introducing the system of editorial review to Medical Hypotheses.
One concern bothering ancient and medieval philosophers is the logical worry discussed in Aristotle's De Interpretatione 9: if future contingent propositions are true, then they are settled in a way that is incompatible with freedom. Another is that if we grant God foreknowledge of future contingent events, then God's foreknowledge will determine those events in a way that precludes freedom. I begin by discussing the standard compatibilist solution to these problems as represented in Boethius's Consolation of Philosophy and then examine theories that allegedly deviate from the Boethian solution. Boethius's solution to these separate problems involves showing that both problems trade on an ambiguity in the scope of the modal operator 'necessarily' present in the articulation of the problem. Once the ambiguity is removed, we see that both disambiguations fail to offer a sound argument against the compatibility of free action with either God's omniscience or future contingent propositions' being true. The only difference between the solutions is that before executing the scope-distinction strategy in the theological problem, Boethius reminds us that God knows future contingents rather than foreknowing them, since God is timeless. The rest of my discussion examines positions that allegedly deviate from the Boethian solution: those held by Peter de Rivo, William Ockham and Plotinus. I argue that, contrary to what is commonly held, Ockham does not in fact deviate from the Boethian solution to the theological problem. Instead of offering a compatibilist position where God's omniscience includes foreknowledge, Ockham denies that God foreknows the future, advocating instead a more sophisticated Boethian position. The other two philosophers, Rivo and Plotinus, do deviate from Boethius, but unfortunately neither position appears philosophically plausible. Rivo's incompatibilist solution to the logical problem is inconsistent with his retention of the Boethian solution to the theological problem and is probably implausible on its own. Plotinus's compatibilist account fails not because it claims that necessity and freedom are compatible, but because the account of moral responsibility Plotinus offers to justify the compatibility fails.
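The scope ambiguity at issue is standardly displayed as a confusion between the necessity of a consequence and the necessity of its consequent; a minimal formalization (a standard rendering, not Boethius's own notation) is:

```latex
% K p  =  "God (fore)knows that p";   \Box  =  "necessarily".
% Wide scope: necessarily, if God knows that p, then p. True, and harmless to freedom.
\[ \Box\,(K p \rightarrow p) \]
% Narrow scope: if God knows that p, then p is necessary. This is what the
% incompatibilist argument needs, but it does not follow from the wide-scope reading.
\[ K p \rightarrow \Box\, p \]
```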
Building on self-professed perspectival approaches to both scientific knowledge and causation, I explore the potentially radical suggestion that perspectivalism can be extended to account for a type of objectivity in science. Motivated by recent claims from quantum foundations that quantum mechanics must admit the possibility of observer-dependent facts, I develop the notion of ‘perspectival objectivity’ and suggest that perspective-dependency is an easier pill to swallow, philosophically speaking, than observer-dependency, allowing for a notion of observer-independence indexed to an agent perspective. Working through the case studies of colour perception and causal perspectivalism, I identify two contexts in which I claim perspectival objectivity is already employed, and make the connection to quantum mechanics through Bohr’s philosophy of quantum theory. I contend that perspectival objectivity can ensure, despite the possibility of perspective-dependent scientific facts, the objectivity of scientific inquiry.
Skow ([2007]), and much more recently Callender ([2017]), argue that time can be distinguished from space due to the special role it plays in our laws of nature: our laws determine the behaviour of physical systems across time, but not across space. In this work we assess the claim that the laws of nature might provide the basis for distinguishing time from space. We find that there is an obvious reason to be sceptical of the argument Skow submits for distinguishing time from space: Skow fails to pay sufficient attention to the relationship between the dynamical laws and the antecedent conditions required to establish a complete solution from the laws. Callender’s more sophisticated arguments in favour of distinguishing time from space by virtue of the laws of nature present a much stronger basis on which to draw the distinction. By developing a radical reading of Callender’s view we propose a novel approach to differentiating time and space that we call temporal perspectivalism. This is the view according to which the difference between time and space is a function of the agentive perspective.
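One way to make the contrast between laws governing behaviour across time but not across space vivid is a simple initial value problem (an illustration of the general point, not an example drawn from Skow or Callender):

```latex
% The heat equation determines u everywhere to the future of a time slice:
\[
  \partial_t u = \partial_x^2 u, \qquad u(x, 0) = f(x)
  \;\Longrightarrow\; u(x, t) \text{ is fixed for all } t > 0 .
\]
% By contrast, data on a single spatial slice, u(0, t) = g(t), does not in
% general determine u(x, t) for x \neq 0; the "sideways" Cauchy problem is
% ill-posed. The laws, together with suitable antecedent conditions, yield
% complete solutions across time in a way they do not across space.
```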
The best case for thinking that quantum mechanics is nonlocal rests on Bell's Theorem, and later results of the same kind. However, the correlations characteristic of Einstein–Podolsky–Rosen (EPR)–Bell (EPRB) experiments also arise in familiar cases elsewhere in quantum mechanics (QM), where the two measurements involved are timelike rather than spacelike separated; and in which the correlations are usually assumed to have a local causal explanation, requiring no action-at-a-distance (AAD). It is interesting to ask how this is possible, in the light of Bell's Theorem. We investigate this question, and present two options. Either (i) the new cases are nonlocal too, in which case AAD is more widespread in QM than has previously been appreciated (and does not depend on entanglement, as usually construed); or (ii) the means of avoiding AAD in the new cases extends in a natural way to EPRB, removing AAD in these cases too. There is a third option, viz., that the new cases are strongly disanalogous to EPRB. But this option requires an argument, so far missing, that the physical world breaks the symmetries which otherwise support the analogy. In the absence of such an argument, the orthodox combination of views—action-at-a-distance in EPRB, but local causality in its timelike analogue—is less well established than it is usually assumed to be. Contents: 1 Introduction; 1.1 Background; 1.2 Outline of the argument; 2 The Experiments; 2.1 Standard EPRB; 2.2 Sideways EPRB; 2.3 Comparing the experiments; 2.4 The need for beables; 3 The Symmetry Considerations; 3.1 The action symmetry; 3.2 Time-symmetry in SEPRB; 4 The Basic Trilemma; 4.1 An intuitive defence of Option III?; 5 Avoiding the Trilemma?; 6 The Classical Objection; 7 Defending Option III; 7.1 The free will argument; 7.2 Independence and consistency; 8 Entanglement and Epistemic Perspective.
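For reference, the CHSH form of Bell's inequality, which bounds the EPRB correlations available to any locally causal model, is (not quoted in the abstract above):

```latex
% CHSH inequality for correlation functions E at detector settings a, a', b, b':
\[
  \bigl| E(a, b) + E(a, b') + E(a', b) - E(a', b') \bigr| \;\le\; 2 ,
\]
% whereas quantum mechanics predicts, for suitably chosen settings on an
% entangled pair, values up to the Tsirelson bound of 2\sqrt{2}.
```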
To demarcate the limits of experimental knowledge, we probe the limits of what might be called an experiment. By appeal to examples of scientific practice from astrophysics and analogue gravity, we demonstrate that the reliability of knowledge regarding certain phenomena gained from an experiment is not circumscribed by the manipulability or accessibility of the target phenomena. Rather, the limits of experimental knowledge are set by the extent to which strategies for what we call ‘inductive triangulation’ are available: that is, the validation of the mode of inductive reasoning involved in the source-target inference via appeal to one or more distinct and independent modes of inductive reasoning. When such strategies are able to partially mitigate reasonable doubt, we can take a theory regarding the phenomena to be well supported by experiment. When such strategies are able to fully mitigate reasonable doubt, we can take a theory regarding the phenomena to be established by experiment. There are good reasons to expect the next generation of analogue experiments to provide genuine knowledge of unmanipulable and inaccessible phenomena such that the relevant theories can be understood as well supported. This article is part of a discussion meeting issue ‘The next generation of analogue gravity experiments’.
Wood and Spekkens argue that any causal model explaining the EPRB correlations and satisfying the no-signalling constraint must also violate the assumption that the model faithfully reproduces the statistical dependences and independences—a so-called ‘fine-tuning’ of the causal parameters. This includes, in particular, retrocausal explanations of the EPRB correlations. I consider this analysis with a view to enumerating the possible responses an advocate of retrocausal explanations might propose. I focus on the response of Näger, who argues that the central ideas of causal explanations can be saved if one accepts the possibility of a stable fine-tuning of the causal parameters. I argue that in light of this view, a violation of faithfulness does not necessarily rule out retrocausal explanations of the EPRB correlations. However, when we consider a plausible retrocausal picture in some detail, it becomes clear that the causal modelling framework is not a natural arena for representing such an account of retrocausality. Contents: 1 Causal Models, Quantum Mechanics, and Faithfulness; 2 Fine-Tuning; 2.1 Fine-tuning in a retrocausal model; 3 Possible Responses; 4 Quantum Causal Models and Retrocausality; 4.1 A more detailed retrocausal account; 4.2 A model of the EPRB probabilities; 4.3 Mapping to a causal model; 5 Conclusion.
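The faithfulness (no fine-tuning) assumption can be made concrete with a toy linear model in which two causal pathways cancel exactly, producing a statistical independence that is not implied by the causal structure. This is a generic illustration of fine-tuning, not a model taken from Wood and Spekkens:

```python
# Hypothetical fine-tuned causal model: X influences Y both directly and via Z,
# with parameters chosen so the two pathways cancel. X and Y then come out
# statistically independent even though X is a cause of Y, which is exactly
# the kind of "extra" independence a faithful model forbids.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

x = rng.normal(size=n)
z = 2.0 * x + rng.normal(size=n)             # path X -> Z
y = 1.0 * x - 0.5 * z + rng.normal(size=n)   # direct X -> Y cancels X -> Z -> Y

print(f"corr(X, Y) = {np.corrcoef(x, y)[0, 1]:+.4f}")  # approximately 0
# Perturbing any single coefficient destroys the independence, which is why
# such parameter cancellations are described as fine-tuned.
```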
This paper addresses the extent to which both Julian Barbour's Machian formulation of general relativity and his interpretation of canonical quantum gravity can be called timeless. We differentiate two types of timelessness in Barbour's (1994a, 1994b and 1999c). We argue that Barbour's metaphysical contention that ours is a timeless world is crucially lacking an account of the essential features of time—an account of what features our world would need to have if it were to count as being one in which there is time. We attempt to provide such an account through considerations of the representation of time both in physical theory and in orthodox metaphysical analyses. We subsequently argue that Barbour's claim of timelessness is dubious with respect to his Machian formulation of general relativity but warranted with respect to his interpretation of canonical quantum gravity. We conclude by discussing the extent to which we should be concerned by the implications of Barbour's view.
A recent series of experiments has demonstrated that a classical fluid mechanical system, constituted by an oil droplet bouncing on a vibrating fluid surface, can be induced to display a number of behaviours previously considered to be distinctly quantum. To explain this correspondence it has been suggested that the fluid mechanical system provides a single-particle classical model of de Broglie’s idiosyncratic ‘double solution’ pilot wave theory of quantum mechanics. In this paper we assess the epistemic function of the bouncing oil droplet experiments in relation to quantum mechanics. We find that the bouncing oil droplets are best conceived as an analogue illustration of quantum phenomena, rather than an analogue simulation, and, furthermore, that their epistemic value should be understood in terms of how-possibly explanation, rather than confirmation. Analogue illustration, unlike analogue simulation, is not a form of ‘material surrogacy’, in which source empirical phenomena in a system of one kind can be understood as ‘standing in for’ target phenomena in a system of another kind. Rather, analogue illustration leverages a correspondence between certain empirical phenomena displayed by a source system and aspects of the ontology of a target system. On the one hand, this limits the potential inferential power of analogue illustrations; on the other, it widens their potential inferential scope. In particular, through analogue illustration we can learn, in the sense of gaining how-possibly understanding, about the putative ontology of a target system via an experiment. As such, the potential scientific value of these extraordinary experiments is undoubtedly significant.
We apply spatialized game theory and multi-agent computational modeling as philosophical tools: (1) for assessing the primary social psychological hypothesis regarding prejudice reduction, and (2) for pursuing a deeper understanding of the basic mechanisms of prejudice reduction.
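A minimal spatialized model of the contact hypothesis (presumably the "primary social psychological hypothesis" at issue) might look like the hypothetical sketch below; it is only indicative of the multi-agent style of modelling described, not the authors' actual model:

```python
# Hypothetical grid model: agents belong to one of two groups and carry a
# prejudice score in [0, 1]; positive contact with out-group neighbours
# gradually lowers that score.
import random

SIZE, STEPS, CONTACT_EFFECT = 20, 50, 0.01
random.seed(0)

# Each cell holds (group, prejudice).
grid = [[(random.choice((0, 1)), random.random()) for _ in range(SIZE)]
        for _ in range(SIZE)]

def neighbours(i, j):
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        yield grid[(i + di) % SIZE][(j + dj) % SIZE]

for _ in range(STEPS):
    new = [[None] * SIZE for _ in range(SIZE)]
    for i in range(SIZE):
        for j in range(SIZE):
            group, prejudice = grid[i][j]
            contacts = sum(1 for g, _ in neighbours(i, j) if g != group)
            new[i][j] = (group, max(0.0, prejudice - CONTACT_EFFECT * contacts))
    grid = new

mean_prejudice = sum(p for row in grid for _, p in row) / SIZE ** 2
print(f"mean prejudice after {STEPS} steps: {mean_prejudice:.3f}")
```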
The principle of common cause asserts that positive correlations between causally unrelated events ought to be explained through the action of some shared causal factors. Reichenbachian common cause systems are probabilistic structures aimed at accounting for cases where correlations of the aforesaid sort cannot be explained through the action of a single common cause. The existence of Reichenbachian common cause systems of arbitrary finite size for each pair of non-causally correlated events was allegedly demonstrated by Hofer-Szabó and Rédei in 2006. This paper shows that their proof is logically deficient and proposes an improved proof.
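For orientation, the standard Reichenbach conditions for a single common cause C of a correlation between events A and B, which a Reichenbachian common cause system generalizes from the partition {C, ¬C} to a finite partition {C_1, ..., C_n}, are:

```latex
% Screening-off and statistical-relevance conditions for a common cause C of
% the correlation P(A \wedge B) > P(A)\,P(B):
\begin{align}
  P(A \wedge B \mid C)       &= P(A \mid C)\,P(B \mid C) \\
  P(A \wedge B \mid \neg C)  &= P(A \mid \neg C)\,P(B \mid \neg C) \\
  P(A \mid C) &> P(A \mid \neg C) \\
  P(B \mid C) &> P(B \mid \neg C)
\end{align}
% Together these conditions entail the correlation they are invoked to explain.
```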
Despite attempts to apply causal modeling techniques to quantum systems, Wood and Spekkens argue that any causal model purporting to explain quantum correlations must be fine-tuned; it must violate the assumption of faithfulness. This paper is an attempt to undermine the reasonableness of the assumption of faithfulness in the quantum context. Employing a symmetry relation between an entangled quantum system and a “sideways” quantum system consisting of a single photon passing sequentially through two polarizers, I argue that Wood and Spekkens’s analysis applies equally to this sideways system. As a result, we must either reject a causal explanation in this single-photon system, or the sideways system must be fine-tuned. If the latter, a violation of faithfulness in the ordinary entangled system may be more tolerable than first thought. Thus, extending the classical “no fine-tuning” principle of parsimony to the quantum realm may be too hasty.
This innovative volume brings together specialists in international relations to tackle a set of difficult questions about what it means to live in a globalized world where the purpose and direction of world politics are no longer clear-cut. What emerges from these essays is a very clear sense that while we may be living in an era that lacks a single, universal purpose, ours is still a world replete with meaning. The authors in this volume stress the need for a pluralistic conception of meaning in a globalized world and demonstrate how increased communication and interaction in transnational spaces work to produce complex tapestries of culture and politics. Meaning and International Relations also makes an original and convincing case for the relevance of hermeneutic approaches to understanding contemporary international relations.
This paper provides a prospectus for a new way of thinking about the wavefunction of the universe: a Ψ-epistemic quantum cosmology. We present a proposal that, if successfully implemented, would resolve the cosmological measurement problem and simultaneously allow us to think sensibly about probability and evolution in quantum cosmology. Our analysis draws upon recent work on the problem of time in quantum gravity and causally symmetric local hidden variable theories. Our conclusion weighs the strengths and weaknesses of the approach and points towards paths for future development.
This thesis is a study of the notion of time in modern physics, consisting of two parts. Part I takes seriously the doctrine that modern physics should be treated as the primary guide to the nature of time. To this end, it offers an analysis of the conceptions of time that emerge in the context of various physical theories and, furthermore, an analysis of the relation between these conceptions of time and the more orthodox philosophical views on the nature of time. In Part II I explore the interpretation of nonrelativistic quantum mechanics in light of the suggestion that an overly Newtonian conception of time might be contributing to some of the difficulties that we face in interpreting the quantum mechanical formalism. In particular, I argue in favour of introducing backwards-in-time causal influences as part of an alternative conception of time that is consistent with the picture of reality that arises in the context of the quantum formalism. Moreover, I demonstrate that this conception of time can already be found in a particular formulation of classical mechanics. One might see that one of the central themes of Part II originates from a failure to heed properly the doctrine of Part I: the study of the nature of time should be guided by modern physics, and thus we should be careful not to insert a preconceived Newtonian conception of time unwittingly into our interpretation of the quantum mechanical formalism. Thus, whereas Part I is intended as a demonstration of methodology with respect to the study of time, Part II in a sense explores a confusion that can be seen as arising in the absence of this methodology.
Based on months of conversations with Onassis and interviews with those who knew him, this biography reveals the complex personality of the man whose business dealings manipulated history and shook governments.
Pearl and Woodward are both well-known advocates of interventionist causation. What is less well known is the interesting relationship between their respective accounts. In this paper we discuss the different perspectives on causation these two accounts present and show that they are two sides of the same coin. Pearl's focus is on leveraging global network constraints to correctly identify local causal relations. The rules by which global causal structures are composed from distinct causal relations are precisely defined by the global constraints. Woodward's focus, however, is on the use of local manipulation to identify single causal relations that then compose into global causal structures. The rules by which this composition takes place emerge as a result of local interventionist constraints. We contend that the complete picture of causality to be found between these two perspectives from the interventionist tradition must recognise both the global constraints of the sort identified by Pearl and the local constraints of the sort identified by Woodward, and the interplay between them: Pearl requires the possibility of local interventions, and Woodward requires a global statistical framework within which to build composite causal structures.
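The complementarity between the two accounts can be illustrated with a small structural causal model: a Woodward-style local intervention, do(X = x), severs the arrow into X in a Pearl-style graph, so the interventional expectation of Y differs from the merely observational one. This is a generic sketch under assumed coefficients, not an example from either author:

```python
# Structural causal model Z -> X, Z -> Y, X -> Y (Z is a common cause of X and Y).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def sample(do_x=None):
    z = rng.normal(size=n)
    x = 0.8 * z + rng.normal(size=n) if do_x is None else np.full(n, float(do_x))
    y = 1.5 * x + 2.0 * z + rng.normal(size=n)
    return x, y

# Observational: conditioning on X = 1 mixes the X -> Y effect with confounding by Z.
x_obs, y_obs = sample()
near_one = np.abs(x_obs - 1.0) < 0.05
print(f"E[Y | X = 1]     ~ {y_obs[near_one].mean():.2f}")   # roughly 2.5

# Interventional: do(X = 1) removes the Z -> X dependence, isolating the local effect.
_, y_do = sample(do_x=1.0)
print(f"E[Y | do(X = 1)] ~ {y_do.mean():.2f}")               # roughly 1.5
```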
The different interpretations of quark mixing involved in weak interaction processes in the Standard Model and the Generation Model are discussed with a view to obtaining a physical understanding of the Cabibbo angle and related quantities. It is proposed that hadrons are composed of mixed-quark states, with the quark mixing parameters being determined by the Cabibbo-Kobayashi-Maskawa matrix elements. In this model, protons and neutrons contain a contribution of about 5% and 10%, respectively, of strange valency quarks.
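Presumably the 5% and 10% figures reflect Cabibbo mixing of the down-type quarks; as a rough consistency check (our arithmetic, not stated in the abstract), with the small third-generation CKM elements neglected:

```latex
% Mixed down-type quark state and the squared Cabibbo mixing element:
\[
  |d'\rangle = \cos\theta_c\,|d\rangle + \sin\theta_c\,|s\rangle ,
  \qquad \sin^2\theta_c \approx 0.05 .
\]
% On a mixed-quark reading, each d' carries a roughly 5% strange component:
% one d' in the proton (uud') gives about 5%, and two in the neutron (ud'd')
% give about 10%, in line with the figures quoted above.
```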
Evaluates trade-offs and uncertainties inherent in achieving sustainable energy, analyzes the major energy technologies, and provides a framework for assessing policy options.
Henry Wellman and colleagues have provided evidence of a robust developmental progression in theory-of-mind (or as we will say, “mindreading”) abilities, using verbal tasks. Understanding diverse desires is said to be easier than understanding diverse beliefs, which is easier than understanding that lack of perceptual access issues in ignorance, which is easier than understanding false belief, which is easier than understanding that people can hide their true emotions. These findings present a challenge to nativists about mindreading, and are said to support a social-constructivist account of mindreading development instead. This article takes up the challenge on behalf of nativism. Our goal is to show that the mindreading-scale findings fail to support constructivism because well-motivated alternative hypotheses have not yet been controlled for and ruled out. These have to do with the pragmatic demands of verbal tasks.
Erratum to: Book Symposium on Peter-Paul Verbeek's Moralizing Technology: Understanding and Designing the Morality of Things (Chicago: University of Chicago Press, 2011). Journal article (erratum), pp. 1–27, DOI 10.1007/s13347-011-0058-z. Authors: Evan Selinger (Dept. of Philosophy, Rochester Institute of Technology, Rochester, NY, USA); Don Ihde (Dept. of Philosophy, Stony Brook University, Stony Brook, NY, USA); Ibo van de Poel (Delft University of Technology, Delft, the Netherlands); Martin Peterson (Eindhoven University of Technology, Eindhoven, the Netherlands); Peter-Paul Verbeek (Dept. of Philosophy, Twente University, Enschede, the Netherlands). Journal: Philosophy & Technology; online ISSN 2210-5441, print ISSN 2210-5433.
He was about five feet eight inches tall, rather thin, and for the last thirty or so years of his life sported a bushy beard and moustache, fashionable for the time. His pleasing low-pitched voice, ideal for conversation, did not carry well to large audiences, and although he was much in demand as a public speaker he rarely spoke from the floor at faculty or professional meetings. As a young man, within the family or with close friends, he was frequently the source and centre of fun, vying with his father in devising practical jokes or in generating lively argument. Like his father he was the victim of his moods, and his own wife and children had much to contend with; typically, he assigned the hour of his evening meal to student consultation, and would refuse to see invited guests if he suddenly felt antisocial. He hated what he called ‘loutish’ informality in dress, and the American way of eating boiled eggs; he loved bright neckties, animals and hill walking. He had no exotic tastes in food, avoided tea and coffee, and drank no alcohol—one of his brothers became an alcoholic, like their father in his younger days. From his early twenties until the end of his life he experienced, and perhaps savoured, a series of physical and mental depressions; remarkably, so did his father, his four brothers, and even more dramatically, his sister.
This article gives two arguments for believing that our society is unknowingly guilty of serious, large-scale wrongdoing. First is an inductive argument: most other societies, in history and in the world today, have been unknowingly guilty of serious wrongdoing, so ours probably is too. Second is a disjunctive argument: there are a large number of distinct ways in which our practices could turn out to be horribly wrong, so even if no particular hypothesized moral mistake strikes us as very likely, the disjunction of all such mistakes should receive significant credence. The article then discusses what our society should do in light of the likelihood that we are doing something seriously wrong: we should regard intellectual progress, of the sort that will allow us to find and correct our moral mistakes as soon as possible, as an urgent moral priority rather than as a mere luxury; and we should also consider it important to save resources and cultivate flexibility, so that when the time comes to change our policies we will be able to do so quickly and smoothly.
Volume 2 covers one of the richest eras for the philosophical study of religion. Covering the period from the 6th century to the Renaissance, this volume shows how Christian, Islamic and Jewish thinkers explicated and defended their religious faith in light of the philosophical traditions they inherited from the ancient Greeks and Romans. The enterprise of 'faith seeking understanding', as it was dubbed by the medievals themselves, emerges as a vibrant encounter between, and a complex synthesis of, the Platonic, Aristotelian and Hellenistic traditions of antiquity on the one hand, and the scholastic and monastic religious schools of the medieval West, on the other.
Objective tests of olfaction are widely available to aid in its assessment. Their clearest role is in the characterization of olfactory changes, either reported by or suspected in a patient. There is a rapidly growing literature concerned with the association of olfactory changes with certain neuropsychiatric conditions, and the use of olfactory testing to supplement conventional assessments in clinical and research practice is evolving. Neural pathways important for olfactory processing overlap extensively with pathways important for cognitive functioning, and especially those important for executive functioning, many of which are concentrated in the frontal lobes. Previous work has identified associations between performance on certain olfactory tests and executive functioning and behavioral measures. More recently, similar associations have also been identified in non-clinical samples, raising new questions as to the utility of olfactory test scores as proxy measures for non-olfactory phenomena. In this systematic review, we sought to identify studies, both clinical and non-clinical, that investigated the associations of olfaction with performance on tasks sensitive to frontal lobe functioning. Our search criteria led to the identification of 70 studies published in English. We examined in detail and tabulated the data from these studies, highlighted each study's key findings, and critically evaluated them. We use the results of this review to reflect on some of the current and future challenges concerning the use of olfactory testing in clinical neuropsychiatric practice and research, and speculate on the potential benefits of administering phonemic fluency tasks in combination with olfactory testing to enhance its predictive value.
I supply an argument for Evans's principle that whatever justifies me in believing that p also justifies me in believing that I believe that p. I show how this principle helps explain how I come to know my own beliefs in a way that normally makes me the best authority on them. Then I show how the principle helps to solve Moore's paradoxes.
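Schematically, Evans's principle and the Moorean sentences it bears on can be written as follows (our formalization, not the author's notation):

```latex
% J p  =  "I have justification to believe that p";   B p  =  "I believe that p".
% Evans's principle:
\[ J p \;\rightarrow\; J (B p) \]
% The omissive and commissive Moorean sentences whose oddity the principle
% helps to explain:
\[ p \wedge \neg B p , \qquad p \wedge B \neg p \]
```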