A key challenge in experimental social science research is incentivising subjects so that they take the tasks presented to them seriously and answer honestly. If subject responses can be evaluated against an objective baseline, a standard way of incentivising participants is to reward them monetarily as a function of their performance. However, the subject area of experimental philosophy is such that this mode of incentivisation is not applicable, as participant responses cannot easily be scored along a true-false spectrum by the experimenters. We claim that experimental philosophers' neglect of, and dismissal of the importance of, incentivisation mechanisms in their surveys and experiments has plausibly led to poorer data quality and worse conclusions overall, potentially threatening the research programme of experimental philosophy in the long run. As a solution, we propose the adoption of the Bayesian Truth Serum, an incentive-compatible mechanism used in economics and marketing, designed to elicit honest responding in subjective data designs by rewarding participant answers that are surprisingly common. We argue that the Bayesian Truth Serum adequately addresses the issue of incentive compatibility in subjective data research designs and that it should be applied to the vast majority of research in experimental philosophy. Further, we provide an empirical application of the method, demonstrating its qualified impact on the distribution of answers on a number of standard experimental philosophy items, and outline guidance for researchers aiming to apply this mechanism in future research by specifying the additional costs and design steps involved.
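The abstract describes the Bayesian Truth Serum only at a high level. For orientation, the sketch below follows the standard BTS scoring rule from the economics and marketing literature, in which each respondent both answers a multiple-choice item and predicts how others will answer; the function name, the alpha weight, and the eps smoothing term are illustrative choices, not details taken from the paper.

```python
import numpy as np

def bts_scores(answers, predictions, alpha=1.0, eps=1e-9):
    """Illustrative Bayesian Truth Serum scoring for one multiple-choice item.

    answers     : (n,) int array, index of the option each respondent chose
    predictions : (n, m) array, each row a respondent's predicted frequency
                  distribution over the m options (rows sum to 1)
    alpha       : weight on the prediction score
    Returns an (n,) array of scores; higher scores go to answers that are
    "surprisingly common" relative to the group's average prediction.
    """
    answers = np.asarray(answers)
    predictions = np.asarray(predictions, dtype=float)
    n, m = predictions.shape

    # Empirical frequency of each option among the actual answers
    x_bar = np.bincount(answers, minlength=m) / n
    # Log of the geometric mean of the predicted frequencies for each option
    log_y_bar = np.log(predictions + eps).mean(axis=0)

    # Information score: log-ratio of actual to geometric-mean predicted
    # frequency for the option the respondent endorsed
    info = np.log(x_bar[answers] + eps) - log_y_bar[answers]
    # Prediction score: how well the respondent's forecast matches the
    # empirical distribution (a negative relative-entropy term)
    pred = (x_bar * (np.log(predictions + eps) - np.log(x_bar + eps))).sum(axis=1)

    return info + alpha * pred
```

On this scoring, an answer earns a high information score when its actual frequency exceeds the geometric mean of the frequencies respondents predicted for it, which is what "surprisingly common" means here; the prediction term rewards accurate forecasts of the group's answer distribution.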
In this paper, we draw attention to the epistemological assumptions of market liberalism and standpoint theory and argue that they have more in common than previously thought. We show that both traditions draw on a similar epistemological bedrock, specifically relating to the fragmentation of knowledge in society and the fact that some of this knowledge cannot easily be shared between agents. We go on to investigate how market liberals and standpoint theorists argue with recourse to these similar foundations, and sometimes diverge, primarily because of normative pre-commitments. One conclusion we draw from these similarities is that market liberals ought to, by their own epistemological lights, be more attentive to various problems raised by feminist standpoint theorists, and feminist standpoint theorists ought to be more open to various claims made by market liberals.
Jean-Luc Nancy discusses his life's work with Pierre-Philippe Jandin. As Nancy looks back on his philosophical texts, he thinks anew about democracy, community, jouissance, love, Christianity, and the arts.
With a title like Luther et la philosophie, one might have expected, from the eighteenth century onward and in the "liberal" circles of the nineteenth, a comprehensive exposition of the Reformer's philosophy. The expression appears, for example, in the analytical tables of the Encyclopédie, under the entry "luthéranisme". Although Philippe Büttgen has, in other work, taken as his object "the confessionalization of philosophy"...
In current debates, many philosophers of science have sympathies for the project of introducing a new approach to the scientific realism debate that forges a middle way between traditional forms of scientific realism and anti-realism. One promising approach is perspectivism. Although different proponents of perspectivism differ in their respective characterizations of perspectivism, the common idea is that scientific knowledge is necessarily partial and incomplete. Perspectivism is a new position in current debates but it does have its forerunners. Figures that are typically mentioned in this context include Dewey, Feyerabend, Leibniz, Kant, Kuhn, and Putnam. Interestingly, to my knowledge, there exists no work that discusses similarities to the phenomenological tradition. This is surprising because here one can find systematically similar ideas and even a very similar terminology. It is startling because early modern physics was noticeably influenced by phenomenological ideas. And it is unfortunate because the analysis of perspectival approaches in the phenomenological tradition can help us to achieve a more nuanced understanding of different forms of perspectivism. The main objective of this paper is to show that in the phenomenological tradition one finds a well-elaborated philosophy of science that shares important similarities with current versions of perspectivism. Engaging with the phenomenological tradition is also of systematic value since it helps us to gain a better understanding of the distinctive claims of perspectivism and to distinguish various grades of perspectivism.
A great mathematician and teacher, and a physicist and philosopher in his own right, bridges the gap between science and the humanities in this exposition of the philosophy of science. He traces the history of science from Aristotle to Einstein to illustrate philosophy's ongoing role in the scientific process. In this volume he explains modern technology's gradual erosion of the rapport between physical theories and philosophical systems, and offers suggestions for restoring the link between these related areas. This book is suitable for undergraduate students and other readers. 1962 ed. Index. 36 figures.
Ariès traces Western man's attitudes toward mortality from the early medieval conception of death as the familiar collective destiny of the human race to the modern tendency, so pronounced in industrial societies, to hide death as if it were an embarrassing family secret.
The concept of the cortical column refers to vertical cell bands with similar response properties, which were initially observed by Vernon Mountcastle's mapping of single cell recordings in the cat somatic cortex. It has subsequently guided over 50 years of neuroscientific research, in which fundamental questions about the modularity of the cortex and basic principles of sensory information processing were empirically investigated. Nevertheless, the status of the column remains controversial today, as skeptical commentators proclaim that the vertical cell bands are a functionally insignificant by-product of ontogenetic development. This paper inquires how the column came to be viewed as an elementary unit of the cortex from Mountcastle's discovery in 1955 until David Hubel and Torsten Wiesel's reception of the Nobel Prize in 1981. I first argue that Mountcastle's vertical electrode recordings served as criteria for applying the column concept to electrophysiological data. In contrast to previous authors, I claim that this move from electrophysiological data to the phenomenon of columnar responses was concept-laden, but not theory-laden. In the second part of the paper, I argue that Mountcastle's criteria provided Hubel and Wiesel with a conceptual outlook, i.e., it allowed them to anticipate columnar patterns in the cat and macaque visual cortex. I argue that in the late 1970s, this outlook only briefly took a form that one could call a 'theory' of the cerebral cortex, before new experimental techniques started to diversify column research. I end by showing how this account of early column research fits into a larger project that follows the conceptual development of the column into the present.
This work has been selected by scholars as being culturally important and is part of the knowledge base of civilization as we know it. This work is in the public domain in the United States of America, and possibly other nations. Within the United States, you may freely copy and distribute this work, as no entity (individual or corporate) has a copyright on the body of the work. Scholars believe, and we concur, that this work is important enough to be preserved, reproduced, and made generally available to the public. To ensure a quality reading experience, this work has been proofread and republished using a format that seamlessly blends the original graphical elements with text in an easy-to-read typeface. We appreciate your support of the preservation process, and thank you for being an important part of keeping this knowledge alive and relevant.
Economists are accustomed to distinguishing between a positive and a normative component of their work, a distinction that is peculiar to their field, having no exact counterpart in the other social sciences. The distinction has substantially changed over time, and the different ways of understanding it today are reflective of its history. Our objective is to trace the origins and initial forms of the distinction, from the English classical political economy of the first half of the 19th century to the emergence of welfare economics in the first half of the 20th century. This sequential account will also serve to identify the main representative positions along with the arguments used to support them, and it thus prepares the ground for a discussion that will be less historical and more strictly conceptual.
This book aims to make the pragmatist intellectual framework accessible to organization and management scholars. It presents some fundamental concepts of Pragmatism, their potential application to the study of organizations and the resulting theoretical, methodological, and practical issues.
This paper proposes an analysis of the discursive dynamics of high-impact concepts in the humanities. These are concepts whose formation and development have a lasting and wide-ranging effect on research and on our understanding of discursive reality in general. The notion of a conceptual practice, based on a normative conception of practice, is introduced; on this view, practices are identified according to the way their respective performances are held mutually accountable. This normative conception of practices is then combined with recent work from philosophy of science that characterizes concepts in terms of conceptual capacities that are productive, open-ended, and applicable beyond the original context in which they were developed. It is shown that the formation of concepts can be identified by changes in how practitioners hold the exercise of their conceptual capacities accountable when producing knowledge about a phenomenon. In a manner similar to the use of operational definitions in scientific practices, such concepts can also be used to intervene in various discourses within or outside the conceptual practice. Using the formation of the concepts "mechanism" and "performative" as examples, the paper shows how high-impact concepts reconfigure what is at issue and at stake in conceptual practices. As philosophy and other humanities disciplines are its domain of interest, it is a contribution to the methodology of the humanities.
Upon learning that John C. Harsanyi was awarded the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel, in 1994, for his pioneering work in game theory, few economists probably questioned the appropriateness of that choice. The Budapest-born social scientist had already been recognized as a first-rank contributor to non-cooperative game theory for some time. However, as many readers of this journal will be aware, Harsanyi first contributed to welfare economics, not game theory. More importantly, he was philosophically minded and accordingly has been "acknowledged as the most influential philosopher in economics". This is of some significance since, before Harsanyi became acquainted with economics around 1950, his main interest was philosophy and, to a lesser extent, sociology and psychology. Rather than an economist with philosophical leanings, Harsanyi was actually a philosopher turned economist.
Philippe Rochat's FINITUDE is a rumination on time and self-consciousness. It is built around the premise that finitude and separation form the human self-conscious reality of time. It argues that we need to reclaim time from current theories in physics that tend to debunk time as an illusion, or state that time simply does not exist. This thought-provoking book considers how, from a human psychological and existential standpoint, time is very real. It examines how we make sense of such reality in human development and in comparison to other living creatures. The book explores how we represent time and live with it. It tries to capture the essence of time in our self-conscious mind. If we opt to live for as long as possible, knowing that it is going to end, how should we exist? FINITUDE contemplates this most serious psychological question. It considers the developmental origins of human subjectivity, the foundations of our sense of being alive, and the explicit awareness of existing in finite time. It deals with how we live and represent our finite time, how we construe and archive in memory the events of our life, how we project ourselves into the future, and how we are all constrained to knowingly exist in finite time. Offering an overarching understanding of concepts, above and beyond the methodological details, this book will be essential reading for all advanced students and researchers interested in the psychology of time and the development of the self.
Aristotle's ethics rests on the thesis that goods can be understood as the ends of striving. The present study aims to help us better understand this thesis. It examines the presuppositions and consequences of the teleological conception of the good, and the commonplace of an Aristotelian "ethics of striving" is re-examined. The starting point is a close reading of the opening chapters of the Nicomachean Ethics. Here it becomes clear that Aristotle is more critical of a teleological conception of goods than is usually assumed. The equation of goods and ends does provide access to the determination of happiness, but it is not a definition of the good. Rather, Aristotle assumes that goods conceived as ends differ in relevant respects. But how can one do justice to this diversity without giving up the identification of goods with ends? The study shows that essential parts of the Nicomachean Ethics can be related to precisely this question, among them the "ergon argument" and the introduction of the virtuous person as the "measure" of what is truly good. In this way an answer is given to several interpretive problems that have long shaped debates about this work, and a different view opens up of the project Aristotle pursues in his ethics.
This paper argues that besides mechanistic explanations, there is a kind of explanation that relies upon "topological" properties of systems in order to derive the explanandum as a consequence, and which does not consider mechanisms or causal processes. I first investigate topological explanations in the case of ecological research on the stability of ecosystems. Then I contrast them with mechanistic explanations, thereby distinguishing the kind of realization they involve from the realization relations entailed by mechanistic explanations, and explain how both kinds of explanations may be articulated in practice. The second section, expanding on the case of ecological stability, considers the phenomenon of robustness at all levels of the biological hierarchy in order to show that topological explanations are indeed pervasive there. Reasons are suggested for this, and "neutral network" explanations are singled out as a form of topological explanation that spans many levels. Finally, I appeal to the distinction between explanatory regimes to cast light on a controversy in philosophy of biology, the issue of contingency in evolution, which is shown to essentially involve issues about realization.
What do the hammer, the paintbrush, and the violin have in common? Tools and instruments mediate between the human body and matter. These objects thus share a genuine commonality, and yet it is precisely in the difference between the two terms that the Western distinction between craft activities and artistic, musical, or scientific ones is grounded. The contributions to the eighth volume of the Hamburger Forschungen zur Kunstgeschichte examine tools and instruments from an art-historical perspective and in interdisciplinary dialogue. Attention is directed equally to the techniques of their handling, their discursive treatment in criticism and theory, and their depiction in images. With contributions by: Gotlind Birkle, Martine Clouzot, Philippe Cordez, Gottfried Korff, Matthias Krüger, François Lamy, Katja Müller-Helle, Ulrich Pfisterer, Albrecht Pohlmann, François Poplin, Julia Ann Saviello, Monika Wagner.
What is the relation between ethical reflection and moral behavior? Does professional reflection on ethical issues positively impact moral behavior? To address these questions, Schwitzgebel and Rust empirically investigated whether philosophy professors who engage with ethics on a professional basis behave morally better or, at least, more consistently with their expressed values than do non-ethicist professors. Findings from their original US-based sample indicated that neither is the case, suggesting that there is no positive influence of ethical reflection on moral action. In the study at hand, we attempted to cross-validate this pattern of results in the German-speaking countries and surveyed 417 professors using a replication-extension research design. Our results indicate a successful replication of the original effect that ethicists do not behave morally better than other academics across the vast majority of normative issues. Yet, unlike the original study, we found mixed results...
Motivated by results of Bagaria, Magidor and Väänänen, we study characterizations of large cardinal properties through reflection principles for classes of structures. More specifically, we aim to...
Today, one out of every six children suffers from some form of neurodevelopmental abnormality. The causes are mostly unknown. Some environmental chemicals are known to cause brain damage and many more are suspected of it, but few have been tested for such effects. Philippe Grandjean provides an authoritative and engaging analysis of how environmental hazards can damage brain development and what we can do about it. The brain's development is uniquely sensitive to toxic chemicals, and even small deficits may negatively impact our academic achievements, economic success, risk of delinquency, and quality of life. Chemicals such as mercury, polychlorinated biphenyls, arsenic, and certain pesticides pose an insidious threat to the development of the next generation's brains. When chemicals in the environment affect the development of a child's brain, he or she is at risk for mental retardation, cerebral palsy, autism, ADHD, and a range of learning disabilities and other deficits that will remain for a lifetime. We can halt chemical brain drain and protect the next generation, however, and Grandjean tells us how. First, we need to control all of the 200 industrial chemicals that have already been proven to affect brain functions in adults, as their effects on the developing brain are likely even worse. We must also push for routine testing for brain toxicity, stricter regulation of chemical emissions, and more required disclosure on the part of industries that unleash hazardous chemicals into products and the environment. Decisions can still be made to protect the brains of future generations. "In his crisply written, deeply documented book, Dr. Philippe Grandjean, renowned physician and public health specialist, describes the exquisite vulnerability of the developing human brain to toxic chemicals in the environment, a vulnerability that he ascribes to the brain's almost unimaginable complexity. Today, nearly one in 6 children is born with a neurodevelopmental disorder - a birth defect of the brain. One in 8 has attention deficit disorder. One in 68 is diagnosed with autism spectrum disorder. These rates are far higher than those of a generation ago, and, although they are less publicized, the problems are more prevalent than those caused by thalidomide in the 1960s. The increases are far too rapid to be genetic. They cannot be explained by better diagnosis. How then could they have come to be? Dr. Grandjean has a diagnosis -- the thousands of toxic chemicals that have been released to the environment in the past 40 years with no testing for toxicity. David P. Rall, former Director of the US National Institute of Environmental Health Sciences, once stated that 'If thalidomide had caused a ten-point loss of IQ rather than obvious birth defects of the limbs, it would probably still be on the market'. This is the core message of Dr. Grandjean's 'must read' book." - Philip J. Landrigan, Dean for Global Health, Ethel H. Wise Professor and Chairman and Director, Children's Environmental Health Center, Mount Sinai School of Medicine.
Philippe Pinel (1745–1826) is often said to be the father of modern clinical psychiatry. He is most famous for being a committed pioneer and advocate of humanitarian methods in the treatment of the mentally ill, and for the development of a mode of psychological therapy known as moral treatment. Pinel also made important contributions to nosology and the diagnosis and treatment of mental disorder, especially the psychopathology of affectivity, stressing the role of the passions in mental disorder. Pinel also conducted what may be considered one of the first large-scale clinical trials in psychiatry and was also arguably the first to introduce the new statistical methods of the time to that domain.
William Alston’s argument against the deontological conception of epistemic justification is a classic—and much debated—piece of contemporary epistemology. At the heart of Alston’s argument, however, lies a very simple mistake which, surprisingly, appears to have gone unnoticed in the vast literature now devoted to the argument. After having shown why some of the standard responses to Alston’s argument don’t work, we elucidate the mistake and offer a hypothesis as to why it has escaped attention.
In 1968, Michel Foucault agreed to a series of interviews with critic Claude Bonnefoy, which were to be published in book form. Bonnefoy wanted a dialogue with Foucault about his relationship to writing rather than about the content of his books. The project was abandoned, but a transcript of the initial interview survived and is now being published for the first time in English. In this brief and lively exchange, Foucault reflects on how he approached the written word throughout his life, from his school days to his discovery of the pleasure of writing. Wide ranging, characteristically insightful, and unexpectedly autobiographical, the discussion is revelatory of Foucault's intellectual development, his aims as a writer, his clinical methodology, and his interest in other authors, including Raymond Roussel and Antonin Artaud. Foucault discloses, in ways he never had previously, details about his home life, his family history, and the profound sense of obligation he feels to the act of writing. In his Introduction, Philippe Artières investigates Foucault's engagement in various forms of oral discourse—lectures, speeches, debates, press conferences, and interviews—and their place in his work. _Speech Begins after Death_ shows Foucault adopting a new language, an innovative autobiographical communication that is neither conversation nor monologue, and is one of his most personal statements about his life and writing.
This paper argues that in some explanations mathematics plays an explanatory rather than a representational role, and that this feature unifies many types of non-causal or non-mechanistic explanations that some philosophers of science have recently been exploring under various names. After showing how mathematics can play either a representational or an explanatory role by considering two alternative explanations of the same biological pattern—"Bergmann's rule"—I offer an example of an explanation where the bulk of the explanatory job is done by a mathematical theorem, and where the mechanisms involved in the target systems are not explanatorily relevant. Then I account for the way mathematical properties may function in an explanatory way within an explanation by arguing that some mathematical propositions, involving variables that do not directly refer to features of the target system, constitute constraints with which a whole class of systems must comply, provided they are describable by a mathematical object covered by those propositions. According to such a "constraint account", those mathematical facts directly entail the explanandum, as a consequence of such constraints. I call those explanations "structural", because here properties of mathematical structures account for the explanandum; various kinds of mathematical structures thereby define various types of structural explanations.
How could the initial, drastic decisions to implement "lockdowns" to control the spread of COVID-19 infections be justifiable, when they were made on the basis of such uncertain evidence? We defend the imposition of lockdowns in some countries by, first, looking at the evidence that undergirded the decision, focusing on the UK; second, arguing that this provided sufficient grounds to restrict liberty given the circumstances; and third, defending the use of poorly-empirically-constrained epidemiological models as tools that can legitimately guide public policy.
Recent interest in phenomena of simulation, pretense, and play has given rise to new philosophical debates on the basic structure of human action and action planning. Some philosophers sought to transform Hume's desire-belief-action model by sophisticating its basic structure. For example, they introduced "hypothetical world boxes" or imaginary "i-desires" and "i-beliefs" into the standard model, in order to account for the representational and motivational structures of imaginary scripts. Others used phenomena of behavior driven by imagination to attempt a more fundamental critique of the Humean tradition. This article aims to show how the pragmatist tradition could be used as a resource in reframing current debates on imagination, pretense, and simulation in the cognitive sciences. This will help determine the role of imagination in intelligent human deliberation.