The Monographs produced by the International Agency for Research on Cancer (IARC) apply rigorous procedures for the scientific review and evaluation of carcinogenic hazards by independent experts. The Preamble to the IARC Monographs, which outlines these procedures, was updated in 2019, following recommendations of a 2018 expert Advisory Group. This article presents the key features of the updated Preamble, a major milestone that will enable IARC to take advantage of recent scientific and procedural advances made during the 12 years since the last Preamble amendments. The updated Preamble formalizes important developments already being pioneered in the Monographs Programme. These developments were taken forward in a clarified and strengthened process for identifying, reviewing, evaluating and integrating evidence to identify causes of human cancer. The advancements adopted include strengthening of systematic review methodologies; greater emphasis on mechanistic evidence, based on key characteristics of carcinogens; greater consideration of quality and informativeness in the critical evaluation of epidemiological studies, including their exposure assessment methods; improved harmonization of evaluation criteria for the different evidence streams; and a single-step process of integrating evidence on cancer in humans, cancer in experimental animals and mechanisms for reaching overall evaluations. In all, the updated Preamble underpins a stronger and more transparent method for the identification of carcinogenic hazards, the essential first step in cancer prevention.
Religious believers understand the meaning of their lives in the light of the way in which they are related to God. Life is significant because it is lived in the presence of God, and ultimate bliss consists in being in the right relation with God. Through sin, however, our relationship with God has been drastically disrupted. The fundamental religious issue which we all have to face, therefore, is how this relationship can be restored. How can we attain ultimate bliss by being reconciled with God? Basically, this is the issue with which the doctrine of atonement has to deal: The English word ‘atonement’ is derived from the words ‘at-one-ment’, to make two parties at one, to reconcile two parties one to another. It means essentially reconciliation… In current usage, the phrase ‘to atone for’ means the undertaking of a course of action designed to undo the consequences of a wrong act with a view to the restoration of the relationship broken by the wrong act.
Vincent Descombes is a French philosopher. He has taught at the University of Montréal, Johns Hopkins University, and Emory University. Presently, he is director of studies at the École des Hautes Études en Sciences Sociales, Paris, and regular visiting professor at the University of Chicago in the Department of Romance. Descombes’s main areas of research are in the philosophy of mind, philosophy of language and philosophy of literature. The following interview covers various aspects of his research in the philosophy of mind and language: semantic anti-realism, phenomenology, the content of mental states, description and transparency, the linguistic turn, metaphysics and linguistic analysis, fictional names and animal intentionality.
This paper offers a critical reconsideration of the traditional doctrine that responsibility for a crime requires a voluntary act. I defend three general propositions: first, that orthodox Anglo-American criminal theory fails to explain adequately why criminal responsibility requires an act; second, that when it comes to the just definition of crimes, the act requirement is at best a rough generalization rather than a substantive limiting principle; and third, that the intuition underlying the so-called “act requirement” is better explained by what I call the “practical-agency condition,” according to which punishment in a specific instance is unjust unless the crime charged was caused or constituted by the agent's conduct qua practically rational agent. The practical-agency condition is defended as a reconstruction of what is worth retaining in Anglo-American criminal law's traditional notion of an “act requirement.”
In his Institutes 2.2.5 Calvin declares that he ‘willingly accepts’ the distinction between freedom from necessity, from sin and from misery originally developed by St Bernard. It is remarkable that a determinist like Calvin seems here to accept a libertarian view of human freedom. In this paper I set out Bernard's doctrine of the three kinds of freedom and show that all its basic elements can in fact be found in Calvin's argument in chapters 2 and 3 of Part II of the Institutes. Towards the end of chapter 3, however, Calvin's doctrine of ‘perseverance’ makes him revert to a deterministic view of the divine-human relationship. I show that the considerations which prompt Calvin to this can be adequately met on the basis of Bernard's libertarian concept of human freedom.
It is now a problem more or less universally acknowledged that religion, even in an ostensibly secular age, must be in need of good commentary. The underlying problem is: what would constitute good commentary at this point? It is not as if religion has just appeared on the horizon of the secular intellectual. Even if we restrict our purview to nonreligious, nontheological discourse, there is a long tradition of critical appraisals and histories of religious phenomena, dating from the ancient Greeks. The field receives an intellectual boost of sorts in the late eighteenth, the nineteenth, and the early twentieth centuries, as the religion of the theologians and prophetic reformers gives way to anthropological and sociological disciplines, the better to be scientifically understood and codified. This upsurge in the secular accounting for religious belief is often explained as the result of the Enlightenment—that is, materialist explanations of nature, textual authority, and psychology eventually turn religion into a natural function of human will, a series of authorial inventions, and a psychological manifestation of deeper impulses, from love, to class-based self-narcotizing illusion, to fear of the loss of paternal care. Max Weber proposed the most intriguing and far-reaching hypothesis about how the Enlightenment superseded religion in the West: Protestant reform within Christianity itself—beginning with Luther and Calvin—designed to produce a purer and far less magical, mystical, hierarchic, and corrupt system of belief, had the unintended consequence of laying the psychological foundations for ascetic capitalism, and hence the seemingly inevitable decline of religion in favor of worldly pursuits.
In his paper Hampus Lyttkens tries to explore the relation between religious experience and the concept of transcendence. Lyttkens limits his enquiry to religious experience in the sense of ‘specific and extraordinary psychic experiences’ which are interpreted as experiences of a transcendent God. By ‘transcendence’ Lyttkens means more than ‘objective reference’. The object of religious experiences in the above sense is not only claimed to transcend the experience itself, in the sense in which the external world is claimed to transcend our perception of it. It is also claimed to be transcendent with respect to the spatio-temporal world as such, by existing ‘beyond space and time’ in some sense or other, or in the sense put forward by Karl Heim, as existing in an extra dimension beside the four dimensions of space and time.
The French composer Hector Berlioz reacted as follows to the critics of his opera La Damnation de Faust: ‘I have already recounted how I … wrote the march on the Hungarian theme of Rákóczy in the course of one night. The passionate reception that this march received in Pest made me decide to include it in my Faust, and in doing so I took the liberty to use Hungary as the setting for the opening of the action, and had my hero, deep in reflection, see a Hungarian army passing across the plain. A German critic considered it most remarkable that I should portray Faust in such a manner. I do not see why I should not have done that, and I would, without hesitation, have let him travel to any place whatsoever if it had been to the benefit of the music I was writing. I had not set myself the task of blindly following Goethe's framework, and a character like Faust can be portrayed as making the most outlandish journeys without doing harm in any way to the credibility of his person. When other German critics … attacked me even more strongly for the departures that my libretto made from the text and the structure of Goethe's Faust, … I wondered why these critics had not reproached me in any way for the libretto of my symphony, Roméo et Juliette, which shows only a slight resemblance to Shakespeare's immortal tragedy! Obviously because Shakespeare was not a German. Chauvinists! Fetishists! Imbeciles!’
[Müller, Vincent C. (ed.), (2016), Fundamental issues of artificial intelligence (Synthese Library, 377; Berlin: Springer). 570 pp.] -- This volume offers a look at the fundamental issues of present and future AI, especially from cognitive science, computer science, neuroscience and philosophy. This work examines the conditions for artificial intelligence, how these relate to the conditions for intelligence in humans and other natural agents, as well as ethical and societal problems that artificial intelligence raises or will raise. The key issues this volume investigates include the relation of AI and cognitive science, ethics of AI and robotics, brain emulation and simulation, hybrid systems and cyborgs, intelligence and intelligence testing, interactive systems, multi-agent systems, and superintelligence. Based on the 2nd conference on “Theory and Philosophy of Artificial Intelligence” held in Oxford, the volume includes contributions from prominent researchers within the field from around the world.
Mainstream and Formal Epistemology provides the first easily accessible, yet erudite and original, analysis of the meeting point between mainstream and formal theories of knowledge. These two strands of thinking have traditionally proceeded in isolation from one another, but in this book, Vincent F. Hendricks brings them together for a systematic comparative treatment. He demonstrates how mainstream and formal epistemology may significantly benefit from one another, paving the way for a new unifying program of 'plethoric' epistemology. His book will both define and further the debate between philosophers from two very different sides of the epistemological spectrum.
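A minimal sketch of the possible-worlds machinery at the heart of formal epistemology may help make the contrast with mainstream epistemology concrete. This toy model is not taken from Hendricks's book; the worlds, relation and valuation below are invented. It encodes the standard clause: an agent knows p at a world just in case p is true at every world the agent cannot distinguish from it.

```python
# Toy possible-worlds model of knowledge (invented example, not from the book).
worlds = {"w1", "w2", "w3"}
indist = {                      # worlds the agent cannot tell apart
    "w1": {"w1", "w2"},
    "w2": {"w1", "w2"},
    "w3": {"w3"},
}
p = {"w1", "w2"}                # worlds where proposition p is true

def knows(w, prop):
    """The agent knows prop at w iff prop holds at every world indistinguishable from w."""
    return indist[w] <= prop    # set inclusion

print(knows("w1", p))  # True: p holds throughout {w1, w2}
print(knows("w3", p))  # False: p fails at w3 itself
```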
Artificial intelligence (AI) and robotics are digital technologies that will have significant impact on the development of humanity in the near future. They have raised fundamental questions about what we should do with these systems, what the systems themselves should do, what risks they involve, and how we can control them. - After the Introduction to the field (§1), the main themes (§2) of this article are: Ethical issues that arise with AI systems as objects, i.e., tools made and used by humans. This includes issues of privacy (§2.1) and manipulation (§2.2), opacity (§2.3) and bias (§2.4), human-robot interaction (§2.5), employment (§2.6), and the effects of autonomy (§2.7). Then AI systems as subjects, i.e., ethics for the AI systems themselves in machine ethics (§2.8) and artificial moral agency (§2.9). Finally, the problem of a possible future AI superintelligence leading to a “singularity” (§2.10). We close with a remark on the vision of AI (§3). - For each section within these themes, we provide a general explanation of the ethical issues, outline existing positions and arguments, then analyse how these play out with current technologies and finally, what policy consequences may be drawn.
There is, in some quarters, concern about high-level machine intelligence and superintelligent AI coming up in a few decades, bringing with it significant risks for humanity. In other quarters, these issues are ignored or considered science fiction. We wanted to clarify what the distribution of opinions actually is, what probability the best experts currently assign to high-level machine intelligence coming up within a particular time-frame, which risks they see with that development, and how fast they see these developing. We thus designed a brief questionnaire and distributed it to four groups of experts in 2012/2013. The median estimate of respondents was for a one in two chance that high-level machine intelligence will be developed around 2040-2050, rising to a nine in ten chance by 2075. Experts expect that systems will move on to superintelligence in less than 30 years thereafter. They estimate the chance is about one in three that this development turns out to be ‘bad’ or ‘extremely bad’ for humanity.
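A minimal sketch of the kind of aggregation such a survey reports (the numbers below are invented placeholders, not the study's data): each respondent states the year by which they assign a 50% and a 90% probability to high-level machine intelligence, plus a probability of a bad outcome, and the reported figures are medians across respondents.

```python
import statistics

# Invented placeholder responses (NOT the study's data): per-respondent years
# for a 50% and a 90% chance of high-level machine intelligence, and the
# probability each respondent assigns to a 'bad' or 'extremely bad' outcome.
p50_years = [2040, 2045, 2050, 2042, 2060, 2048]
p90_years = [2070, 2075, 2080, 2072, 2100, 2078]
p_bad     = [0.30, 0.35, 0.25, 0.40, 0.33, 0.30]

print("median 50% year:", statistics.median(p50_years))
print("median 90% year:", statistics.median(p90_years))
print("median P(bad):  ", statistics.median(p_bad))
```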
The Canadian-American biologist Edmund Vincent Cowdry played an important role in the birth and development of the science of aging, gerontology. In particular, he contributed to the growth of gerontology as a multidisciplinary scientific field in the United States during the 1930s and 1940s. With the support of the Josiah Macy, Jr. Foundation, he organized the first scientific conference on aging at Woods Hole, Massachusetts, where scientists from various fields gathered to discuss aging as a scientific research topic. He also edited Problems of Ageing (1939), the first handbook on the current state of aging research, to which specialists from diverse disciplines contributed. The authors of this book eventually formed the Gerontological Society in 1945 as a multidisciplinary scientific organization, and some of its members, under Cowdry's leadership, formed the International Association of Gerontology in 1950. This article historically traces this development by focusing on Cowdry's ideas and activities. I argue that the social and economic turmoil during the Great Depression, along with Cowdry's training and experience as a biologist – cytologist in particular – and as a textbook editor, became an important basis of his efforts to construct gerontology in this direction.
Theories of quantum gravity generically presuppose or predict that the reality underlying the relativistic spacetimes they describe is significantly non-spatiotemporal. On pain of empirical incoherence, approaches to quantum gravity must establish how relativistic spacetime emerges from their non-spatiotemporal structures. We argue that in order to secure this emergence, it is sufficient to recover only those features of relativistic spacetimes that are functionally relevant to producing empirical evidence. In order to complete this task, an account must be given of how the more fundamental structures instantiate these functional roles. We illustrate the general idea in the context of causal set theory and loop quantum gravity, two prominent approaches to quantum gravity.
This open access book looks at how a democracy can devolve into a post-factual state. The media is being flooded by populist narratives, fake news, conspiracy theories and make-believe. Misinformation is turning into a challenge for all of us, whether politicians, journalists, or citizens. In the age of information, attention is a prime asset and may be converted into money, power, and influence – sometimes at the cost of facts. The point is to obtain exposure on the air and in print media, and to generate traffic on social media platforms. With information in abundance and attention scarce, the competition is ever fiercer, with truth all too often becoming the first victim. Reality Lost: Markets of Attention, Misinformation and Manipulation is an analysis by philosophers Vincent F. Hendricks and Mads Vestergaard of the nuts and bolts of the information market, the attention economy and the media ecosystem which may pave the way to postfactual democracy. Here misleading narratives become the basis for political opinion formation, debate, and legislation. To curb this development and the threat it poses to democratic deliberation, political self-determination and freedom, it is necessary that we first grasp the mechanisms and structural conditions that cause it.
This is a critical introduction to modern French philosophy, commissioned from one of the liveliest contemporary practitioners and intended for an English-speaking readership. The dominant 'Anglo-Saxon' reaction to philosophical developments in France has for some decades been one of suspicion, occasionally tempered by curiosity but more often hardening into dismissive rejection. But there are signs now of a more sympathetic interest and an increasing readiness to admit and explore shared concerns, even if these are still expressed in a very different idiom and intellectual context. Vincent Descombes offers here a personal guide to the main movements and figures of the last forty-five years. He traces over this period the evolution of thought from a generation preoccupied with the 'three H's' - Hegel, Husserl and Heidegger - to a generation influenced since about 1960 by the 'three masters of suspicion' - Marx, Nietzsche and Freud. In this framework he deals in turn with the thought of Sartre, Merleau-Ponty, the early structuralists, Foucault, Althusser, Serres, Derrida, and finally Deleuze and Lyotard. The 'internal' intellectual history of the period is related to its institutional setting and the wider cultural and political context which has given French philosophy so much of its distinctive character.
In this paper it is argued that existing ‘self-representational’ theories of phenomenal consciousness do not adequately address the problem of higher-order misrepresentation. Drawing a page from the phenomenal concepts literature, a novel self-representational account is introduced that does. This is the quotational theory of phenomenal consciousness, according to which the higher-order component of a conscious state is constituted by the quotational component of a quotational phenomenal concept. According to the quotational theory of consciousness, phenomenal concepts help to account for the very nature of phenomenally conscious states. Thus, the paper integrates two largely distinct explanatory projects in the field of consciousness studies: (i) the project of explaining how we think about our phenomenally conscious states, and (ii) the project of explaining what phenomenally conscious states are in the first place.
[This is the short version of: Müller, Vincent C. and Bostrom, Nick (forthcoming 2016), ‘Future progress in artificial intelligence: A survey of expert opinion’, in Vincent C. Müller (ed.), Fundamental Issues of Artificial Intelligence (Synthese Library 377; Berlin: Springer).] - - - In some quarters, there is intense concern about high-level machine intelligence and superintelligent AI coming up in a few decades, bringing with it significant risks for humanity; in other quarters, these issues are ignored or considered science fiction. We wanted to clarify what the distribution of opinions actually is, what probability the best experts currently assign to high-level machine intelligence coming up within a particular time-frame, which risks they see with that development and how fast they see these developing. We thus designed a brief questionnaire and distributed it to four groups of experts. Overall, the results show an agreement among experts that AI systems will probably reach overall human ability around 2040-2050 and move on to superintelligence in less than 30 years thereafter. The experts say the probability is about one in three that this development turns out to be ‘bad’ or ‘extremely bad’ for humanity.
A basic intuition we have regarding the nature of time is that the future is open whereas the past is fixed. For example, whereas we think that there are things we can do to affect how the future will unfold, we think that there are not things we can do to affect how the past unfolded. However, although this intuition is largely shared, it is not a straightforward matter to determine the nature of the asymmetry it reflects. So, in this paper, I survey various philosophical ways of characterizing the asymmetry between the ‘open future’ and the ‘fixed past’ in order to account for our intuition. In particular, I ask whether the asymmetry is to be characterized in semantic, epistemic, metaphysical or ontological terms. I conclude that, although many of these characterizations may contribute to a global understanding of the phenomenon, an ontological characterization of the asymmetry is to be preferred, since it is superior to the alternatives in explanatory power, intelligibility, and in how it coheres with interesting senses of openness.
[Müller, Vincent C. (ed.), (2013), Philosophy and theory of artificial intelligence (SAPERE, 5; Berlin: Springer). 429 pp.] --- Can we make machines that think and act like humans or other natural intelligent agents? The answer to this question depends on how we see ourselves and how we see the machines in question. Classical AI and cognitive science had claimed that cognition is computation, and can thus be reproduced on other computing machines, possibly surpassing the abilities of human intelligence. This consensus has now come under threat and the agenda for the philosophy and theory of AI must be set anew, re-defining the relation between AI and cognitive science. We can re-claim the original vision of general AI from the technical AI disciplines; we can reject classical cognitive science and replace it with a new theory (e.g. embodied); or we can try to find new ways to approach AI, for example from neuroscience or from systems theory. To do this, we must go back to the basic questions on computing, cognition and ethics for AI. The 30 papers in this volume provide cutting-edge work from leading researchers that define where we stand and where we should go from here.
Some authors have recently suggested that it is time to consider rights for robots. These suggestions are based on the claim that the question of robot rights should not depend on a standard set of conditions for ‘moral status’; instead, the question is to be framed in a new way, by rejecting the is/ought distinction, making a relational turn, or assuming a methodological behaviourism. We try to clarify these suggestions and to show their highly problematic consequences. While we find the suggestions ultimately unmotivated, the discussion shows that our epistemic condition with respect to the moral status of others does raise problems, and that the human tendency to empathise with things that do not have moral status should be taken seriously—we suggest that it produces a “derived moral status”. Finally, it turns out that there is typically no individual in real AI that could even be said to be the bearer of moral status. Overall, there is no reason to think that robot rights are an issue now.
Will future lethal autonomous weapon systems (LAWS), or ‘killer robots’, be a threat to humanity? The European Parliament has called for a moratorium or ban of LAWS; the ‘Contracting Parties to the Geneva Convention at the United Nations’ are presently discussing such a ban, which is supported by the great majority of writers and campaigners on the issue. However, the main arguments in favour of a ban are unsound. LAWS do not support extrajudicial killings, and they do not take responsibility away from humans; in fact they increase the ability to hold humans accountable for war crimes. Using LAWS in war would probably reduce human suffering overall. Finally, the availability of LAWS would probably not increase the probability of war or other lethal conflict—especially as compared to extant remote-controlled weapons. The widespread fear of killer robots is unfounded: they are probably good news.
The procedures of canonical quantization of the gravitational field apparently lead to entities for which any interpretation in terms of spatio-temporal localization or spatio-temporal extension seems difficult. This fact is the main ground for the suggestion, often found in the physics literature on canonical quantum gravity, that spacetime may not be fundamental in some sense. This paper aims to investigate this radical suggestion from an ontologically serious point of view in the cases of two standard forms of canonical quantum gravity, quantum geometrodynamics and loop quantum gravity. We start by discussing the physical features of the quantum wave functional of quantum geometrodynamics and of the spin networks of loop quantum gravity that motivate the view according to which spacetime is not fundamental. We then point out that, by contrast, for any known ontologically serious understanding of quantum entanglement, the commitment to spacetime seems indispensable. Against this background, we then critically discuss the idea that spacetime may emerge from more fundamental entities. As a consequence, we finally suggest that the emergence of classical spacetime in canonical quantum gravity faces a dilemma: either spacetime ontologically emerges from more fundamental non-spatio-temporal entities, or it already belongs to the fundamental quantum gravitational level and the emergence of the classical picture is merely a matter of levels of description. On the first horn of the dilemma, it is unclear how to make sense of concrete physical entities that are not in spacetime and of the notion of ontological emergence that is involved. The second horn runs into the difficulties raised by the physics of canonical quantum gravity.
In presenting a selection of twenty-eight texts in translation with introductory essays, Vincent L. Wimbush and his co-authors have produced the first book on asceticism that does full justice to the varieties of ascetic behavior in the Greco-Roman world. The texts, representative of different religious cults, philosophical schools, and geographical locations, are organized by literary genre into five parts that give a fascinating overview of the ascetic tradition.
Radical ontic structural realism (ROSR) asserts an ontological commitment to ‘free-standing’ physical structures understood solely in terms of fundamental relations, without any recourse to relata that stand in these relations. Bain ([2013], pp. 1621–35) has recently defended ROSR against the common charge of incoherence by arguing that a reformulation of fundamental physical theories in category-theoretic terms (rather than the usual set-theoretic ones) offers a coherent and precise articulation of the commitments accepted by ROSR. In this essay, we argue that category theory does not offer a more hospitable environment to ROSR than set theory. We also show that the application of category-theoretic tools to topological quantum field theory and to algebraic generalizations of general relativity does not warrant the claim that these theories describe ‘object-free’ structures. We conclude that category theory offers little if any comfort to ROSR. Contents: 1. Introduction: Ridding Structures of Objects; 2. The Set-theoretic Peril for Radical Ontic Structural Realism; 3. Bain’s Categorial Strategy to Save Radical Ontic Structural Realism; 4. Throwing out the Relations with the Relata; 5. Categorial and Set-theoretical Structures; 6. Radical Suggestions from Topological Quantum Field Theory?; 7. Sheaves of Einstein Algebras as Radical Structures?; 8. Conclusions.
At first glance twentieth-century philosophy of science seems virtually to ignore chemistry. However, this paper argues that a focus on chemistry helped shape French philosophical reflections about the aims and foundations of scientific methods. Despite patent philosophical disagreements between Duhem, Meyerson, Metzger and Bachelard, it is possible to identify the continuity of a tradition that is rooted in their common interest in chemistry. Two distinctive features of the French tradition originated in this attention to what was going on in chemistry. First, French philosophers of science, in stark contrast with analytic philosophers, considered the history of science as the necessary basis for understanding how the human intellect or the scientific spirit tries to grasp the world. This constant reference to historical data was prompted by a fierce controversy about the chemical revolution, which brought the issue of the nature of scientific changes centre stage. A second striking—albeit largely unnoticed—feature of the French tradition is that matter theories are a favourite subject with which to characterize the ways of science. Duhem, Meyerson, Metzger and Bachelard developed most of their views about the methods and aims of science through a discussion of matter theories. Just as the concern with history was prompted by a controversy between chemists, the focus on matter was triggered by a scientific controversy about atomism in the late nineteenth century. Keywords: France; epistemology; chemistry; revolution; atomism; realism.
We consider to what extent the fundamental question of spacetime singularities is relevant for the philosophical debate about the nature of spacetime. After reviewing some basic aspects of spacetime singularities within general relativity, we argue that the well-known difficulty of localizing them in a meaningful way may challenge the received metaphysical view of spacetime as a set of points possessing some intrinsic properties together with some spatiotemporal relations. Considering the algebraic formulation of general relativity, we argue that the spacetime singularities highlight the philosophically misleading dependence on the standard geometric representation of spacetime.
Black Natural Law offers a new way of understanding the African American political tradition, and it argues that this tradition has collapsed into incoherence. Vincent William Lloyd revives Black politics by telling stories of its central figures in a way that exhibits the connections between their religious, philosophical, and political ideas.
In prior work, we have argued that spacetime functionalism provides tools for clarifying the conceptual difficulties specifically linked to the emergence of spacetime in certain approaches to quantum gravity. We argue in this article that spacetime functionalism in quantum gravity is radically different from other functionalist approaches that have been suggested in quantum mechanics and general relativity: in contrast to these latter cases, it does not compete with purely interpretative alternatives, but is rather intertwined with the physical theorizing itself at the level of quantum gravity. Spacetime functionalism allows one to articulate a coherent realist perspective in the context of quantum gravity, and to relate it to a straightforward realist understanding of general relativity.
The philosophy of AI has seen some changes, in particular: (1) AI has moved away from cognitive science, and (2) the long-term risks of AI now appear to be a worthy concern. In this context, the classical central concerns – such as the relation of cognition and computation, embodiment, intelligence & rationality, and information – will regain urgency.
The contribution of the body to cognition and control in natural and artificial agents is increasingly described as “off-loading computation from the brain to the body”, where the body is said to perform “morphological computation”. Our investigation of four characteristic cases of morphological computation in animals and robots shows that the ‘off-loading’ perspective is misleading. Actually, the contribution of body morphology to cognition and control is rarely computational, in any useful sense of the word. We thus distinguish (1) morphology that facilitates control, (2) morphology that facilitates perception, and the rare cases of (3) morphological computation proper, such as ‘reservoir computing’, where the body is actually used for computation. This result contributes to the understanding of the relation between embodiment and computation: the question for robot design and cognitive science is not whether computation is offloaded to the body, but to what extent the body facilitates cognition and control – how it contributes to the overall ‘orchestration’ of intelligent behaviour.
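To make the notion of ‘reservoir computing’ concrete, here is a minimal echo state network sketch (an illustrative toy, not the authors' models; all sizes, parameters and the task are invented). It shows the defining feature the abstract relies on: the recurrent ‘reservoir’ is fixed and untrained, and only a linear readout is fitted, so the dynamics of the substrate (in morphological computation, the body) do the computational heavy lifting.

```python
import numpy as np

# Minimal echo state network (illustrative toy, not the authors' setup).
rng = np.random.default_rng(0)
n_in, n_res = 1, 100

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))      # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))        # fixed random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 ('echo state' property)

def run_reservoir(u):
    """Drive the fixed reservoir with input sequence u; collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Invented toy task: predict the next value of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)

# Only the linear readout is trained (ridge regression); the reservoir is not.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("readout MSE:", np.mean((X @ W_out - y) ** 2))
```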
" Façonner le monde atome par atome " : tel est l'objectif incroyablement ambitieux affiché par les promoteurs américains de la " National Nanoinitiative ", lancée en 1999. Un projet global de " convergence des sciences ", visant à " initier une nouvelle Renaissance, incorporant une conception holiste de la technologie fondée sur [..] une analyse causale du monde physique, unifiée depuis l'échelle nano jusqu'à l'échelle planétaire. " Ce projet démiurgique est aujourd'hui au coeur de ce qu'on appelle la " (...) technoscience ", étendard pour certains, repoussoir pour d'autres. En précisant dans ce livre la signification de ce concept, pour sortir enfin du sempiternel conflit entre technophiles et technophobes, son auteur propose d'abord une sorte d'archéologie du terme " technoscience ". Loin d'être un simple renversement de hiérarchie entre science et technique, il s'agit d'un changement de régime de la connaissance scientifique, ayant désormais intégré la logique entrepreneuriale du monde des affaires et mobilisant des moyens considérables. Surtout, Bernadette Bensaude-Vincent montre que le brouillage de la frontière entre science et technique n'est que la manifestation d'un tremblement plus général, marqué par l'effacement progressif des distinctions traditionnelles : nature/artifice, inerte/vivant, matière/esprit, homme/machine, etc. Alors que nos sociétés sont silencieusement reconfigurées par les nanotechnologies, Internet, le génie génétique ou les OGM, ce livre montre l'importance de faire enfin pleinement entrer les questions de choix technologiques et scientifiques dans la sphère du politique et dans l'arène publique. Car la technoscience est un processus historique qui engage la nature en la refaçonnant et qui implique la société dans son ensemble. (shrink)
Special Issue “Risks of artificial general intelligence”, Journal of Experimental and Theoretical Artificial Intelligence, 26/3 (2014), ed. Vincent C. Müller. http://www.tandfonline.com/toc/teta20/26/3# Contents:
- Risks of general artificial intelligence, Vincent C. Müller, pages 297-301
- Autonomous technology and the greater human good, Steve Omohundro, pages 303-315
- The errors, insights and lessons of famous AI predictions – and what they mean for the future, Stuart Armstrong, Kaj Sotala & Seán S. Ó hÉigeartaigh, pages 317-342
- The path to more general artificial intelligence, Ted Goertzel, pages 343-354
- Limitations and risks of machine ethics, Miles Brundage, pages 355-372
- Utility function security in artificially intelligent agents, Roman V. Yampolskiy, pages 373-389
- GOLEM: towards an AGI meta-architecture enabling both goal preservation and radical self-improvement, Ben Goertzel, pages 391-403
- Universal empathy and ethical bias for artificial general intelligence, Alexey Potapov & Sergey Rodionov, pages 405-416
- Bounding the impact of AGI, András Kornai, pages 417-438
- Ethics of brain emulations, Anders Sandberg, pages 439-457
In recent years, Charles Sanders Peirce has emerged, in the eyes of philosophers both in America and abroad, as one of America’s major philosophical thinkers. His work has forced us back to philosophical reflection about those basic issues that inevitably confront us as human beings, especially in an age of science. Peirce’s concern for experience, for what is actually encountered, means that his philosophy, even in its most technical aspects, forms a reflective commentary on actual life and on the world in which it is lived. In Charles S. Peirce: On Norms and Ideals, Potter argues that Peirce’s doctrine of the normative sciences is essential to his pragmatism. No part of Peirce’s philosophy is bolder than his attempt to establish esthetics, ethics, and logic as the three normative sciences and to argue for the priority of esthetics among the trio. Logic, Potter notes, is normative because it governs thought and aims at truth; ethics is normative because it analyzes the ends to which thought should be directed; esthetics is normative and fundamental because it considers what it means to be an end or something good in itself. This study shows that Peirce took seriously the trinity of normative sciences and demonstrates that these categories apply both to the conduct of man and to the workings of the cosmos. Professor Potter combines sympathetic and informed exposition with straightforward criticism, and he deals in a sensible manner with the gaps and inconsistencies in Peirce’s thought. His study shows that Peirce was above all a cosmological and ontological thinker, one who combined science both as a method and as result with a conception of reasonable actions to form a comprehensive theory of reality. Peirce’s pragmatism, although it has to do with “action and the achievement of results”, is not a glorification of action but rather a theory of the dynamic nature of things in which the “ideal” dimension of reality – laws, nature of things, tendencies, and ends – has genuine power for directing the cosmic order, including man, toward reasonable goals.