Using ideas from evolution and postformal stages of hierarchical complexity, a hypothetical scenario, premised on advances in genetic engineering, portrays the development of a new humanoid species, Superions. How would Superions impact and treat current humans? If the Superion scenario came to pass, it would be the ultimate genocidal terrorism: the elimination of an entire species, Homo sapiens. We speculate about defenses Homo sapiens might mount. The task of relating two species (systems) constitutes a postformal, Metasystematic task. Developing a system of discourse to prevent destruction requires postformal Paradigmatic-stage tasks. The implications are twofold: species survival, and evolving sufficiently to survive.
In the 1970s and 1980s, Simon Blackburn published a number of much-discussed works in which he argued that the supervenience of the moral on the natural generates a serious problem for moral realism, a problem which his own brand of moral projectivism can avoid. As we will see below, Blackburn construed moral supervenience in terms of what is known as weak supervenience. Partly in response to Blackburn, a number of philosophers have argued that weak supervenience is too weak to capture the intuitive sense in which the moral supervenes on the natural. Instead, it is argued, we should opt for strong supervenience and, further, hold that strong supervenience completely disables Blackburn's argument against moral realism. This idea – that strong supervenience undermines Blackburn's argument – is very common in recent metaethics, and the purpose of the present paper is to develop a challenge to this near-orthodoxy.
Widowhood is common in old age, can be accompanied by serious health consequences, and is often linked to substantial changes in the social network. Little is known about the impact of social isolation on the development of depressive symptoms over time when widowhood is taken into account. We provide results from follow-up 5 to follow-up 9 of the longitudinal study AgeCoDe and its follow-up study AgeQualiDe. Depression was measured with the GDS-15, and social isolation was assessed using the Lubben Social Network Scale. The groups of married and widowed people in old age were aligned in terms of education through entropy balancing. Linear mixed models were used to examine the frequency of occurrence of depressive symptoms for widowed and married elderly people depending on the risk of social isolation. Our study shows that widowhood alone does not lead to an increased occurrence of depressive symptoms. However, "widowed oldest old" who are also at risk of social isolation have significantly more depressive symptoms than those without such risk. In the group of "married oldest old", women have significantly more depressive symptoms than men, but the isolated and non-isolated do not differ. Especially for people who have lost a spouse, the social network changes significantly, which increases the risk of social isolation. This represents a risk factor for the occurrence of depressive symptoms.
In this article, I discuss how Samuel Stanhope Smith advanced Reidian themes in his moral philosophy and examine their reception by the Presbyterian revivalists Ashbel Green, Samuel Miller, and Archibald Alexander. Smith, seventh president and moral philosophy professor of the College of New Jersey (1779–1812), has received marginal scholarly attention regarding his moral philosophy and rational theology in comparison to his predecessor John Witherspoon. As an early American philosopher who drew on the ideals of the Scottish Enlightenment, including Common Sense philosophy, Smith faced heightened scrutiny from American revivalists regarding the danger his epistemology presented to the institution of religion. The Scottish School of Common Sense was widely praised and applied in nineteenth-century American moral philosophy, but before the more general American acceptance of Common Sense, Smith had already appealed to Reidian themes in his methodology and in his treatment of the external sensations, internal sensations, intellectual powers, and active powers of the human mind. In this paper, I argue that Smith's use of Reidian themes for grooming his students' morality conflicted with the educational expectations of revivalists on Princeton's board of trustees, who demanded more attention to orthodox theology. I identify Smith's notions of causation, liberty, and the moral faculty as primary reasons for this tension over Princeton's educational purpose during the first decade of the nineteenth century.
This new edition of Alexander Miller’s highly readable introduction to contemporary metaethics provides a critical overview of the main arguments and themes in twentieth- and twenty-first-century metaethics. Miller traces the development of contemporary debates in metaethics from their beginnings in the work of G. E. Moore up to the most recent arguments between naturalism and non-naturalism, cognitivism and non-cognitivism. From Moore’s attack on ethical naturalism, A. J. Ayer’s emotivism and Simon Blackburn’s quasi-realism to anti-realist and best opinion accounts of moral truth and the non-reductionist naturalism of the ‘Cornell realists’, this book addresses all the key theories and ideas in this field. As well as revisiting the whole terrain with revised and updated guides to further reading, Miller also introduces major new sections on the revolutionary fictionalism of Richard Joyce and the hermeneutic fictionalism of Mark Kalderon. The new edition will continue to be essential reading for students, teachers and professional philosophers with an interest in contemporary metaethics.
In Eckhart, Heidegger, and the Imperative of Releasement, Ian Alexander Moore investigates Martin Heidegger’s use of releasement. Moore argues that this conceptual development was greatly influenced by Meister Eckhart’s thought. Beyond their shared use of releasement, Moore suggests, Heidegger and Eckhart also employ similar philosophical strategies. The task of Moore’s monograph is to illuminate how releasement functions in Heidegger’s work and to argue that Eckhart was one of Heidegger’s central influences. This review examines Moore’s method for assessing the function of releasement in Heidegger’s and Eckhart’s thought, while noting the distinctive and compelling aspects of this monograph.
In this paper, I argue for three main claims. First, that there are two broad sorts of error theory about a particular region of thought and talk: eliminativist error theories and non-eliminativist error theories. Second, that an error theory about rule-following can only be an eliminativist view of rule-following, and therefore an eliminativist view of meaning and content on a par with Paul Churchland’s prima facie implausible eliminativism about the propositional attitudes. Third, that despite some superficial appearances to the contrary, non-eliminativist error theory does not provide a plausible vehicle for understanding the ‘sceptical solution’ to the sceptical paradox about rule-following developed in Saul Kripke’s Wittgenstein on Rules and Private Language.
The rule-following debate, in its concern with the metaphysics and epistemology of linguistic meaning and mental content, goes to the heart of the most fundamental questions of contemporary philosophy of mind and language. This volume gathers together the most important contributions to the topic, including papers by Simon Blackburn, Paul Boghossian, Graeme Forbes, Warren Goldfarb, Paul Horwich, John McDowell, Colin McGinn, Ruth Millikan, Philip Pettit, George Wilson, and José Zalabardo. This debate has centred on Saul Kripke's reading of the rule-following sections in Wittgenstein and his consequent posing of a "sceptical paradox" that threatens our everyday notions of linguistic meaning and mental content. These essays are attempts to respond to this challenge and represent some of the most important work in contemporary theory of meaning. They examine the notion of meaning; whether it is possible to find a suitable meaning-constituting fact in our previous behaviour or mental histories; objections to, and defences of, dispositional accounts of meaning; the plausibility of non-factualism about meaning; attempts to develop non-reductionist accounts of meaning; and the sources of the normativity which attaches to meaning, such as the linguistic practice of the community or the dispositions of the individual. With an introductory essay and a comprehensive guide to further reading, the book is an excellent resource for courses in philosophy of mind, philosophy of language, Wittgenstein, and metaphysics, as well as for all philosophers, linguists, and cognitive scientists with interests in these areas. Contributors include Simon Blackburn, Paul Boghossian, Graeme Forbes, Warren Goldfarb, Paul Horwich, John McDowell, Colin McGinn, Alexander Miller, Ruth Garrett Millikan, Philip Pettit, George M. Wilson, Crispin Wright, and José L. Zalabardo.
I discuss the role of translatability in philosophical justification. I begin by discussing and defending Thomas Reid’s account of the role that facts about comparative linguistics can play in philosophical justification. Reid believes that common sense offers a reliable but defeasible form of justification. We cannot know by introspection, however, which of our judgments belong to common sense. Judgments of common sense are universal, and so he argues that the strongest evidence that a judgment is a part of common sense is that it is to be found in all languages. For Reid, then, evidence that a certain distinction is to be found in all languages is evidence that the distinction is part of common sense rather than being a common local prejudice. From such a perspective, empirical work in comparative linguistics can play a defeasible justificatory role in philosophical arguments. I contrast Reid’s position with the more radical position of defenders of the Natural Semantic Metalanguage approach, such as Anna Wierzbicka, who argues that only judgments that are translatable into all natural languages are justifiable. I show how such a position is rooted in an implausible view about the nature of concepts, one common among cognitive scientists and linguists, which does not allow for novel concepts.
Common ancestry is a central feature of the theory of evolution, yet it is not clear what “common ancestry” actually means; nor is it clear how it is related to other terms such as “the Tree of Life” and “the last universal common ancestor”. I argue that these terms describe three distinct hypotheses ordered in a logical way: that there is a Tree of Life is a claim about the pattern of evolutionary history, that there is a last universal common ancestor is an ontological claim about the existence of an entity of a specific kind, and that there is universal common ancestry is a claim about a causal pattern in the history of life. With these generalizations in mind, I argue that the existence of a Tree of Life entails a last universal common ancestor, which would entail universal common ancestry, but neither of the converse entailments holds. This allows us to make sense of the debates surrounding the Tree, as well as our lack of knowledge about the last universal common ancestor, while still maintaining the uncontroversial truth of universal common ancestry.
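A minimal sketch of the claimed ordering, using illustrative abbreviations that are not the author's own notation (ToL for the existence of a Tree of Life, LUCA for the existence of a last universal common ancestor, UCA for universal common ancestry):

$$\mathrm{ToL} \Rightarrow \mathrm{LUCA} \Rightarrow \mathrm{UCA}, \qquad \mathrm{UCA} \not\Rightarrow \mathrm{LUCA}, \qquad \mathrm{LUCA} \not\Rightarrow \mathrm{ToL}.$$

On this reading, doubts about the Tree or ignorance about LUCA leave the weakest claim, UCA, untouched.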
This volume provides a survey of contemporary philosophy of language. As well as offering a synoptic view of the key issues, figures, concepts and debates, it contains essays that make new and original contributions to ongoing debate.
Starting with Gottlob Frege's foundational theories of sense and reference, Miller provides a useful introduction to the formal logic used in all subsequent philosophy of language. He communicates a sense of active philosophical debate by confronting the views of the early theorists concerned with building systematic theories - such as Frege, Bertrand Russell, and the logical positivists - with the attacks mounted by sceptics such as W. V. O. Quine, Saul Kripke, and Ludwig Wittgenstein. This leads to important excursions into related areas of metaphysics, philosophy of mind, and cognitive science that present the more recent attempts to save the notions of sense and meaning by philosophers such as Paul Grice, John Searle, Jerry Fodor, Colin McGinn, and Crispin Wright. Miller then returns to the systematic program by examining the formal theories of Donald Davidson, concluding with a chapter surveying the relevance of philosophy of language to the broader metaphysical debates between realists and anti-realists. Miller's clear, engaged, and coherently structured approach makes Philosophy of Language an ideal text for undergraduate courses. The guides to further reading provided in each chapter help the reader pursue interesting topics further and facilitate using the book in conjunction with primary sources.
In chapter 8 of Miller 2003, I argued against the idea that Jackson and Pettit's notion of program explanation might help Sturgeon's non-reductive naturalist version of moral realism respond to the explanatory challenge posed by Harman. In a recent paper in the AJP [Nelson 2006], Mark Nelson has attempted to defend the idea that program explanation might prove useful to Sturgeon in replying to Harman. In this note, I suggest that Nelson's argument fails.
This paper is the product of an interdisciplinary, interreligious dialogue aiming to outline some of the possibilities and rational limits of supernatural religious belief, in the light of a critique of David Hume’s familiar sceptical arguments -- including a rejection of his famous Maxim on miracles -- combined with a range of striking recent empirical research. The Humean nexus leads us to the formulation of a new ‘Common-Core/Diversity Dilemma’, which suggests that the contradictions between different religious belief systems, in conjunction with new understandings of the cognitive forces that shape their common features, persuasively challenge the rationality of most kinds of supernatural belief. In support of this conclusion, we survey empirical research concerning intercessory prayer, religious experience, near-death experience, and various cognitive biases. But we then go on to consider evidence that supernaturalism -- even when rationally unwarranted -- has significant beneficial individual and social effects, despite others that are far less desirable. This prompts the formulation of a ‘Normal/Objective Dilemma’, identifying important trade-offs to be found in the choice between our humanly evolved ‘normal’ outlook on the world, and one that is more rational and ‘objective’. Can we retain the pragmatic benefits of supernatural belief while avoiding irrationality and intergroup conflict? It may well seem that rationality is incompatible with any wilful sacrifice of objectivity. But in a situation of uncertainty, an attractive compromise may be available by moving from the competing factions and mutual contradictions of ‘first-order’ supernaturalism to a more abstract and tolerant ‘second-order’ view, which itself can be given some distinctive intellectual support through the increasingly popular Fine Tuning Argument. We end by proposing a ‘Maxim of the Moon’ to express the undogmatic spirit of this second-order religiosity, providing a cautionary metaphor to counter the pervasive bias endemic to the human condition, and offering a more cooperation- and humility-enhancing understanding of religious diversity in a tense and precarious globalised age.
Emergency departments are challenging research settings, where truly informed consent can be difficult to obtain. A deeper understanding of emergency medical patients’ opinions about research is needed. We conducted a systematic review and meta-summary of quantitative and qualitative studies on which values, attitudes, or beliefs of emergent medical research participants influence research participation. We included studies of adults that investigated opinions toward emergency medicine research participation. We excluded studies focused on the association between demographics or consent document features and participation and those focused on non-emergency research. In August 2011, we searched the following databases: MEDLINE, EMBASE, Google Scholar, Scirus, PsycINFO, AgeLine and Global Health. Titles, abstracts and then full manuscripts were independently evaluated by two reviewers. Disagreements were resolved by consensus and adjudicated by a third author. Studies were evaluated for bias using standardised scores. We report themes associated with participation or refusal. Our initial search produced over 1800 articles. A total of 44 articles were extracted for full-manuscript analysis, and 14 were retained based on our eligibility criteria. Among factors favouring participation, altruism and personal health benefit had the highest frequency. Mistrust of researchers, feeling like a ‘guinea pig’ and risk were leading factors favouring refusal. Many studies noted limitations of informed consent processes in emergent conditions. We conclude that highlighting the benefits to the participant and society, mitigating risk and increasing public trust may increase research participation in emergency medical research. New methods for conducting informed consent in such studies are needed.
This engaging and accessible introduction to the philosophy of language provides an important guide to one of the liveliest and most challenging areas of study in philosophy. Interweaving the historical development of the subject with a thematic overview of the different approaches to meaning, the book provides students with the tools necessary to understand contemporary analytical philosophy. The second edition includes new material on Chomsky, Wittgenstein, and Davidson, as well as new chapters on the causal theory of reference, possible worlds semantics, and semantic externalism.
My initial hope when I first saw Miller’s book was that here at last would be a work which satisfies the long-standing need for a comprehensive introduction to contemporary metaethics which is accessible enough to be employed in advanced undergraduate courses and introductory graduate seminars. This hope was only partially realized, however, as Miller ends up oscillating between clear presentations of extant debates in the recent literature and his own extended attempts to determine where the truth of the matter lies. The result is an interesting book that will likely appeal both to those looking for a classroom text in metaethics and to experts on the relevant issues.
Background: The U.S. Food and Drug Administration traditionally has kept confidential significant amounts of information relevant to the approval or non-approval of specific drugs, devices, and biologics and about the regulatory status of such medical products in FDA’s pipeline. Objective: To develop practical recommendations for FDA to improve its transparency to the public that FDA could implement by rulemaking or other regulatory processes without further congressional authorization. These recommendations would build on the work of FDA’s Transparency Task Force in 2010. Methods: In 2016-2017, we convened a team of academic faculty from Harvard Medical School, Brigham and Women’s Hospital, Yale Medical School, Yale Law School, and Johns Hopkins Bloomberg School of Public Health to develop recommendations through an iterative process of reviewing FDA’s practices, considering the legal and policy constraints on FDA in expanding transparency, and obtaining insights from independent observers of FDA. Results: The team developed 18 specific recommendations for improving FDA’s transparency to the public. FDA could adopt all these recommendations without further congressional action. Funding: The development of the Blueprint for Transparency at the U.S. Food and Drug Administration was funded by the Laura and John Arnold Foundation.
We show 13 stages of the development of tool use and tool making during different eras in the evolution of Homo sapiens. We used the neo-Piagetian Model of Hierarchical Complexity rather than Piaget's. We distinguished the use of existing methods, imitated or learned from others, from performing such tasks on one's own.
This paper explores the prospects for using the notion of a primitive normative attitude in responding to the sceptical argument about meaning developed in chapter 2 of Saul Kripke’s Wittgenstein on Rules and Private Language. It takes as its stalking-horse the response to Kripke’s Wittgenstein developed in a recent series of important works by Hannah Ginsborg. The paper concludes that Ginsborg’s attempted solution fails for a number of reasons: it depends on an inadequate response to Kripke’s Wittgenstein’s ‘finitude’ objection to reductive dispositionalism; it erroneously rejects the idea that a speaker’s understanding of an expression guides her use; it threatens to collapse into either full-blown non-reductionism or reductive dispositionalism; and there is no motivation for accepting it over forms of non-reductionism such as those developed by Barry Stroud and John McDowell.
This paper is concerned with the relationship between the metaphysical doctrine of realism about the external world and semantic realism, as characterised by Michael Dummett. I argue that Dummett's conception of the relationship is flawed, and that Crispin Wright's account of the relationship, although designed to avoid the problems which beset Dummett's, nevertheless fails for similar reasons. I then aim to show that despite the fact that Dummett and Wright both fail to give a plausible account of the relationship between semantic realism and the metaphysical doctrine of realism, the semantic issue and the metaphysical issue are importantly related. I outline the precise sense in which the evaluation of semantic realism is relevant to the evaluation of realism about the external world, a sense overlooked by opponents of Dummett such as Simon Blackburn and Michael Devitt. I finish with some brief remarks on metaphysics, semantics, and the nature of philosophy, and suggest that Dummett's arguments against semantic realism can retain their relevance to metaphysical debate even if we reject Dummett's idea that the theory of meaning is the foundation of all philosophy.
Given that natural selection is so powerful at optimizing complex adaptations, why does it seem unable to eliminate genes (susceptibility alleles) that predispose to common, harmful, heritable mental disorders, such as schizophrenia or bipolar disorder? We assess three leading explanations for this apparent paradox from evolutionary genetic theory: (1) ancestral neutrality (susceptibility alleles were not harmful among ancestors), (2) balancing selection (susceptibility alleles sometimes increased fitness), and (3) polygenic mutation-selection balance (mental disorders reflect the inevitable mutational load on the thousands of genes underlying human behavior). The first two explanations are commonly assumed in psychiatric genetics and Darwinian psychiatry, while mutation-selection has often been discounted. All three models can explain persistent genetic variance in some traits under some conditions, but the first two have serious problems in explaining human mental disorders. Ancestral neutrality fails to explain low mental disorder frequencies and requires implausibly small selection coefficients against mental disorders given the data on the reproductive costs and impairment of mental disorders. Balancing selection (including spatio-temporal variation in selection, heterozygote advantage, antagonistic pleiotropy, and frequency-dependent selection) tends to favor environmentally contingent adaptations (which would show no heritability) or high-frequency alleles (which psychiatric genetics would have already found). Only polygenic mutation-selection balance seems consistent with the data on mental disorder prevalence rates, fitness costs, the likely rarity of susceptibility alleles, and the increased risks of mental disorders with brain trauma, inbreeding, and paternal age. This evolutionary genetic framework for mental disorders has wide-ranging implications for psychology, psychiatry, behavior genetics, molecular genetics, and evolutionary approaches to studying human behavior. Key Words: adaptation; behavior genetics; Darwinian psychiatry; evolution; evolutionary genetics; evolutionary psychology; mental disorders; mutation-selection balance; psychiatric genetics; quantitative trait loci (QTL).
We use the primitive ontology framework of Allori et al. to analyze the quantum information-theoretic interpretation of Bub and Pitowsky. There are interesting parallels between the two approaches, which differentiate them both from the more standard realist interpretations of quantum theory. Where they differ, however, is in terms of their commitments to an underlying ontology on which the manifest image of the world supervenes. Employing the primitive ontology framework in this way makes perspicuous the differences between the quantum information-theoretic interpretation and the various realist interpretations of quantum theory. It also allows us to identify a sense in which the commitments of the quantum information-theoretic interpretation are underspecified. Several possible ways of completing the interpretation are presented, and it is suggested that the most likely strategy would leave the information-theoretic interpretation unable to qualify as a theory, according to the primitive ontology approach.