Matthias Vogel challenges the belief, dominant in contemporary philosophy, that reason is determined solely by our discursive, linguistic abilities as communicative beings. In his view, the medium of language is not the only force of reason. Music, art, and other nonlinguistic forms of communication and understanding are also significant. Introducing an expansive theory of mind that accounts for highly sophisticated, penetrative media, Vogel advances a novel conception of rationality while freeing philosophy from its exclusive attachment to linguistics. Vogel's media of reason treats all kinds of understanding and thought, propositional and nonpropositional, as important to the processes and production of knowledge and thinking. By developing an account of rationality grounded in a new conception of media, he raises the profile of the prelinguistic and nonlinguistic dimensions of rationality and advances the Enlightenment project, buffering it against the postmodern critique that the movement fails to appreciate aesthetic experience. Guided by the work of Jürgen Habermas, Donald Davidson, and a range of media theorists, including Marshall McLuhan, Vogel rebuilds, if he does not remake, the relationship among various forms of media -- books, movies, newspapers, the Internet, and television -- while offering an original and exciting contribution to media theory.
Between 1819 and 1832 Friedrich Schleiermacher gave lectures on the life of Jesus at the University of Berlin. The following article includes two partial editions, which document the introductory parts of the lectures from 1819/20 and 1829/30. Both are based on manuscripts written by Schleiermacher’s listeners. These two partial editions should be a useful addition to the new critical edition of Schleiermacher’s Vorlesungen über das Leben Jesu published in 2018 by Walter Jaeschke, especially for exploring the development of Schleiermacher’s conceptual considerations.
To study (un)conscious perception and test hypotheses about consciousness, researchers need procedures for determining whether subjects consciously perceive stimuli or not. This article is an introduction to a family of procedures called ‘confidence-based procedures’, which consist in interpreting metacognitive indicators as indicators of consciousness. I assess the validity and accuracy of these procedures, and answer a series of common objections to their use in consciousness research. I conclude that confidence-based procedures are valid for assessing consciousness, and, in most cases, accurate enough for our practical and scientific purposes.
Until the eighteenth century, Latin was the uncontested language of academic discourse, including theology. Regardless of their denominational affiliation, scholars all across Europe made use of Latin in both their publications and lectures. Then, due to the influence of various strands of post-Kantian philosophy, a change took place, at least in the German-speaking area. With recourse to classical German philosophy, many Catholic systematic theologians switched to their mother tongue and adopted the newly coined terms in order to express the same faith. In reaction to this transformative work, the neo-scholastic movement came into existence. Its adherents stressed the Church’s tradition and, especially, its indebtedness to medieval thought. From the mid-nineteenth century onwards, partly supported by the Magisterium, various attempts were made to re-introduce Latin into dogmatics. This project was unsuccessful, however, because of changes to the Catholic world ushered in by the Second Vatican Council and also because of developments in German educational policy, which served to lower the status of Latin in schools.
A moral value is transparent just in case an agent with average mental capacities can feasibly come to know whether some entity does, or does not, possess that value. In this paper, I consider whether legitimacy—that is, the property of exercises of political power to be permissible—is transparent. Implicit in much theorising about legitimacy is the idea that it is. I will offer two counter-arguments. First, injustice can defeat legitimacy, and injustice can be intransparent. Second, legitimacy can play a critical function in our practical thought, which sometimes requires intransparency.
Background: Large-scale genetic data sets are frequently shared with other research groups and even released on the Internet to allow for secondary analysis. Study participants are usually not informed about such data sharing because data sets are assumed to be anonymous after stripping off personal identifiers.

Discussion: The assumption of anonymity of genetic data sets, however, is tenuous because genetic data are intrinsically self-identifying. Two types of re-identification are possible: the "Netflix" type and the "profiling" type. The "Netflix" type needs another small genetic data set, usually with fewer than 100 SNPs but including a personal identifier. This second data set might originate from another clinical examination, a study of leftover samples, or forensic testing. When merged with the primary, unidentified set, it will re-identify all samples of that individual. Even with no second data set at hand, a "profiling" strategy can be developed to extract as much information as possible from a sample collection. Starting with the identification of ethnic subgroups along with predictions of body characteristics and diseases, the asthma kids case is used as a real-life example to illustrate that approach.

Summary: Depending on the degree of supplemental information, there is a good chance that at least a few individuals can be identified from an anonymized data set. Any re-identification, however, may potentially harm study participants because it will release individual genetic disease risks to the public.
I develop a theory of counterfactuals about relative computability, i.e. counterfactuals such as 'If the validity problem were algorithmically decidable, then the halting problem would also be algorithmically decidable,' which is true, and 'If the validity problem were algorithmically decidable, then arithmetical truth would also be algorithmically decidable,' which is false. These counterfactuals are counterpossibles, i.e. they have metaphysically impossible antecedents. They thus pose a challenge to the orthodoxy about counterfactuals, which would treat them as uniformly true. What’s more, I argue that these counterpossibles don’t just appear in the periphery of relative computability theory but instead they play an ineliminable role in the development of the theory. Finally, I present and discuss a model theory for these counterfactuals that is a straightforward extension of the familiar comparative similarity models.
Theodor Adorno is a widely studied figure, but most often with regard to his work on cultural theory, philosophy and aesthetics. The Sociology of Theodor Adorno provides the first thorough English-language account of Adorno's sociological thinking. Matthias Benzer reads Adorno's sociology through six major themes: the problem of conceptualising capitalist society; empirical research; theoretical analysis; social critique; the sociological text; and the question of the non-social. Benzer explains the methodological and theoretical ideas informing Adorno's reflections on sociology and illustrates Adorno's approach to examining social life, including astrology, sexual taboos and racial prejudice. Benzer clarifies Adorno's sociology in relation to his work in other disciplines and the inspiration his sociology took from social thinkers such as Marx, Weber, Durkheim, Kracauer and Benjamin. The book raises critical questions about the viability of Adorno's sociological mode of procedure and its potential contributions and challenges to current debates in social science.
Economics has developed into one of the most specialised social sciences. Yet at the same time, it shares its subject matter with other social sciences and humanities, and its method of analysis has developed in close correspondence with the natural and life sciences. This book offers an up-to-date assessment of economics in relation to other disciplines.

This edited collection explores fields as diverse as mathematics, physics, biology, medicine, sociology, architecture, and literature, drawing from selected contributions to the 2005 Annual Conference of the European Society for the History of Economic Thought (ESHET). There is currently much discussion at the leading edges of modern economics about openness to other disciplines, such as psychology and sociology. But what we see here is that economics has drawn on (as well as contributed to) other disciplines throughout its history. In this sense, in spite of the increasing specialisation within all disciplines, economics has always been an open discipline, and the chapters in this volume provide a vivid illustration of this.

Open Economics is a testament to the intellectual vibrancy of historical research in economics. It presents the reader with a historical introduction to the disciplinary context of economics that is the first of its kind, and will appeal to practising economists and students of the discipline alike, as well as to anybody interested in economics and its position in the scientific and social scientific landscape.

Table of Contents

Introduction: Economics in relation to other disciplines (Richard Arena, Sheila Dow and Matthias Klaes)

Part I. Economics in relation to the humanities and social sciences
1. The social science of economics (Brian J. Loasby)
2. Economics and literature (Bruna Ingrao)
3. Happiness: what Kahneman could have learnt from Pietro Verri (Pier Luigi Porta)

Part II. Economics in relation to the life and natural sciences
4. Newtonian physics, experimental moral philosophy and the shaping of political economy (Sergio Cremaschi)
5. Evolutionary biology and economic behaviour: re-visiting Veblen's instinct of workmanship (Mark Harrison)
6. Medicine and economics in pre-classical economics (Alain Clément and Ludovic Desmedt)

Part III. Economics and mathematics
7. Mathematics as the role model for neoclassical economics (Nicola Giocoli)
8. The role of econometric method in economic analysis: a reassessment of the Keynes-Tinbergen debate, 1938-43 (Giovanna Garrone and Roberto Marchionatti)

Part IV. Economics and architecture
9. Economics and architecture (Maurice Lagueux)
10. Economic policies and urban development in Latin America (Michele Alacevich and Andrea Costa)

Part V. Economics and geography
11. ‘Space’ in economic thought (Giovanna Vertova)
12. Economics, geography and colonialism in the writings of William Petty (Hugh Goodacre)

Part VI. Economics and sociology
13. Economics and sociology: Gustav Schmoller and Werner Sombart on social differentiation (Joachim Zweynert)
14. Is Homo Oeconomicus a 'bad guy'? (Isabelle This Saint-Jean)
Whether the prefrontal cortex is part of the neural substrates of consciousness is currently debated. Against prefrontal theories of consciousness, many have argued that neural activity in the prefrontal cortex does not correlate with consciousness but with subjective reports. We defend prefrontal theories of consciousness against this argument. We argue that the requirement for reports is not a satisfactory explanation of the difference in neural activity between conscious and unconscious trials, and that prefrontal theories of consciousness come out of this debate unscathed.
Opponents of consciousness in fish argue that fish do not feel pain because they do not have a neocortex, which is a necessary condition for feeling pain. A common counter-argument appeals to the multiple realizability of pain: while a neocortex might be necessary for feeling pain in humans, pain might be realized differently in fish. This paper argues, first, that it is impossible to find a criterion allowing us to demarcate between plausible and implausible cases of multiple realization of pain without running into a circular argument. Second, opponents of consciousness in fish cannot be provided with reasons to believe in the multiple realizability of pain. I conclude that the debate on the existence of pain in fish cannot be settled by relying on the multiple realization argument.
Traditionally, the manufacturer/operator of a machine is held (morally and legally) responsible for the consequences of its operation. Autonomous, learning machines, based on neural networks, genetic algorithms and agent architectures, create a new situation, in which the manufacturer/operator of the machine is in principle no longer capable of predicting the machine's future behaviour, and thus cannot be held morally responsible or liable for it. Society must decide between no longer using this kind of machine (which is not a realistic option) and facing a responsibility gap that cannot be bridged by traditional concepts of responsibility ascription.
The core feature that distinguishes moods from emotions is that moods, in contrast to emotions, are diffuse and global. This article outlines a dispositional theory of moods (DTM) that accounts for this and other features of mood experience. DTM holds that moods are temporary dispositions to have or to generate particular kinds of emotion-relevant appraisals. Furthermore, DTM assumes that the cognitions and appraisals one is disposed to have in a given mood partly constitute the experience of mood. This article outlines a number of implications of DTM (e.g., regarding the noncognitive causation and rationality of moods) and summarizes empirical results supporting the theory.
Phillips argues that blindsight is due to response criterion artefacts under degraded conscious vision. His view provides alternative explanations for some studies, but may not work well when one considers several key findings in conjunction. Empirically, not all criterion effects are decidedly non-perceptual. Awareness is not completely abolished for some stimuli, in some patients. But in other cases, it was clearly impaired relative to the corresponding visual sensitivity. This relative dissociation is what makes blindsight so important and interesting.
In this study, Matthias Neuber examines the relationship between the concept of realism and the logical empiricism of the Vienna Circle, one of the dominant currents in German-language theoretical philosophy of the early twentieth century. This question has so far been treated only marginally in historical research on philosophy. That is all the more surprising given that the more recent realism debate in the philosophy of science was decisively shaped by logical empiricism. The author goes a step further still: in this volume he defends the thesis that there were currents within logical empiricism itself that are compatible with scientific realism. He thereby takes a position opposed to the mainstream interpretation of the twentieth-century realism debate in the philosophy of science, which understands scientific realism as a counter-programme to logical empiricism. With this study in the history of philosophy, Neuber delivers nothing less than a reassessment of the relationship between realism and logical empiricism. The work is addressed in particular to researchers, but also to advanced students, in the history of the philosophy of science.
According to William Alston, we lack voluntary control over our propositional attitudes because we cannot believe intentionally, and we cannot believe intentionally because our will is not causally connected to belief formation. Against Alston, I argue that we can believe intentionally because our will is causally connected to belief formation. My defense of this claim is based on examples in which agents have reasons for and against believing p, deliberate on what attitude to take towards p, and subsequently acquire an attitude A towards p because they have decided to take attitude A. From the possibility of intentional belief, two conclusions follow. First, the kind of control we have over our propositional attitudes is direct; it is possible for us to believe at will. Second, the question of whether what we believe is under our control ultimately depends on whether our will itself is under our control. It is, therefore, a question of the metaphysics of free will.
Eleven pairs of newly commissioned essays face off on opposite sides of fundamental problems in current theories of knowledge. Brings together fresh debates on eleven of the most controversial issues in epistemology. Questions addressed include: Is knowledge contextual? Can skepticism be refuted? Can beliefs be justified through coherence alone? Is justified belief responsible belief? Lively debate format sharply defines the issues, and paves the way for further discussion. Will serve as an accessible introduction to the major topics in contemporary epistemology, whilst also capturing the imagination of professional philosophers.
As for most measurement procedures in the course of their development, measures of consciousness face the problem of coordination, i.e., the problem of knowing whether a measurement procedure actually measures what it is intended to measure. I focus on the case of the Perceptual Awareness Scale to illustrate how ignoring this problem leads to ambiguous interpretations of subjective reports in consciousness science. In turn, I show that empirical results based on this measurement procedure might be systematically misinterpreted.
Epistemic deontology is the view that the concept of epistemic justification is deontological: a justified belief is, by definition, an epistemically permissible belief. I defend this view against the argument from doxastic involuntarism, according to which our doxastic attitudes are not under our voluntary control, and thus are not proper objects for deontological evaluation. I argue that, in order to assess this argument, we must distinguish between a compatibilist and a libertarian construal of the concept of voluntary control. If we endorse a compatibilist construal, it turns out that we enjoy voluntary control over our doxastic attitudes after all. If, on the other hand, we endorse a libertarian construal, the result is that, for our doxastic attitudes to be suitable objects of deontological evaluation, they need not be under our voluntary control.
Making good decisions in extremely complex and difficult processes and situations has always been both a key task and a challenge in the clinic, and has led to a large number of clinical, legal and ethical routines, protocols and reflections intended to guarantee fair, participatory and up-to-date pathways for clinical decision-making. Nevertheless, the complexity of processes and physical phenomena, time and economic constraints, and not least further endeavours and achievements in medicine and healthcare continuously raise the need to evaluate and improve clinical decision-making. This article scrutinises whether and how clinical decision-making processes are challenged by the rise of so-called artificial intelligence-driven decision support systems (AI-DSS). In a first step, this article analyses how the rise of AI-DSS will affect and transform the modes of interaction between different agents in the clinic. In a second step, we point out how these changing modes of interaction also imply shifts in the conditions of trustworthiness, epistemic challenges regarding transparency, the underlying normative concepts of agency and its embedding into concrete contexts of deployment and, finally, the consequences for ascriptions of responsibility. Third, we draw first conclusions for further steps towards a ‘meaningful human control’ of clinical AI-DSS.
Simulations are used in very different contexts and for very different purposes. An emerging development is the possibility of using simulations to obtain a more or less representative reproduction of organs or even entire persons. Such simulations are framed and discussed using the term ‘digital twin’. This paper unpacks and scrutinises the current use of such digital twins in medicine and the ideas embedded in this practice. First, the paper maps the different types of digital twins. Special focus is placed on the concrete challenges inherent in the interactions between persons and their digital twins. Second, the paper addresses the questions of how far a digital twin can represent a person and what the consequences of this may be. Against the background of these two analytical steps, the paper defines initial conditions under which digital twins can take on an ethically justifiable form of representation.
The conflict over the classic problem of philosophical anthropology, i.e., what man actually is, is not only a conflict about which X determines something to be human. It also requires clarification of the manner in which something is determined to be human by the X in question. Since there are different options for the latter, the classic anthropological conflict concerns not only definitions of being human, but also models of being human. The present paper investigates four such models: the addition model, the interior model, the privation model, and the transformation model. While the first will serve as a baseline for comparison, the three other models will, in order to escape the danger of making too formal an argument, be discussed exemplarily, i.e., by focusing in each case on a certain proponent of the respective model. These proponents are Martin Heidegger for the interior model, Arnold Gehlen for the privation model, and Helmuth Plessner for the transformation model.
Existing proposals concerning the ontology of quantum mechanics either involve speculation that goes beyond the scientific evidence or abandon realism about large parts of QM. This paper proposes a way out of this dilemma, by showing that QM as it is formulated in standard textbooks allows for a much more substantive ontological commitment than is usually acknowledged. For this purpose, I defend a non-fundamentalist approach to ontology, which is then applied to various aspects of QM. In particular, I will defend realism about spin, which has been viewed as a particularly hard case for the ontology of QM.
Consciousness scientists have not reached consensus on two of the most central questions in their field: first, on whether consciousness overflows reportability; second, on the physical basis of consciousness. I review the scientific literature of the 19th century to provide evidence that disagreement on these questions has been a feature of the scientific study of consciousness for a long time. Based on this historical review, I hypothesize that a unifying explanation of disagreement on these questions, up to this day, is that scientific theories of consciousness are underdetermined by the evidence, namely, that they can be preserved “come what may” in front of (seemingly) disconfirming evidence. Consciousness scientists may have to find a way of solving the persistent underdetermination of theories of consciousness to make further progress.
Defined narrowly, epistemology is the study of knowledge and justified belief. As the study of knowledge, epistemology is concerned with the following questions: What are the necessary and sufficient conditions of knowledge? What are its sources? What is its structure, and what are its limits? As the study of justified belief, epistemology aims to answer questions such as: How are we to understand the concept of justification? What makes justified beliefs justified? Is justification internal or external to one's own mind? Understood more broadly, epistemology is about issues having to do with the creation and dissemination of knowledge in particular areas of inquiry. This article will provide a systematic overview of the problems that the questions above raise and focus in some depth on issues relating to the structure and the limits of knowledge and justification.
In this paper, I argue that the rejection of doxastic voluntarism is not as straightforward as its opponents take it to be. I begin with a critical examination of William Alston's defense of involuntarism and then focus on the question of whether belief is intentional.
The paper explains in what sense the GRW matter density theory is a primitive ontology theory of quantum mechanics and why, thus conceived, the standard objections against the GRW formalism do not apply to GRWm. We consider the different options for conceiving the quantum state in GRWm and argue that dispositionalism is the most attractive one.
In this paper, I examine Alston's arguments for doxastic involuntarism. Alston fails to distinguish (i) between volitional and executional lack of control, and (ii) between compatibilist and libertarian control. As a result, he fails to notice that, if one endorses a compatibilist notion of voluntary control, the outcome is a straightforward and compelling case for doxastic voluntarism. Advocates of involuntarism have recently argued that the compatibilist case for doxastic voluntarism can be blocked by pointing out that belief is never intentional. In response to this strategy, I distinguish between two types of intentionality and argue that belief is no less intentional than action is.
I argue that scientific realism, insofar as it is only committed to those scientific posits of which we have causal knowledge, is immune to Kyle Stanford’s argument from unconceived alternatives. This causal strategy is shown not to repeat the shortcomings of previous realist responses to Stanford’s argument. Furthermore, I show that the notion of causal knowledge underlying it can be made sufficiently precise by means of conceptual tools recently introduced into the debate on scientific realism. Finally, I apply this strategy to the case of Jean Perrin’s experimental work on the atomic hypothesis, disputing Stanford’s claim that the problem of unconceived alternatives invalidates a realist interpretation of this historical episode.

1 Stanford’s Argument from Unconceived Alternatives
2 Previous Attempts to Undermine the Problem of Unconceived Alternatives
2.1 The plausibility of unconceived alternatives
2.2 The distinctness of unconceived alternatives
2.3 The induction from past to present
3 Causal Knowledge as a Criterion for the Realist
3.1 How Chakravartty’s proposal differs from earlier causal strategies
3.2 Causal realism and the detection/auxiliary distinction
4 Causal Realism, Unconceived Alternatives, and the Atomic Hypothesis
4.1 Perrin and the philosophers: some initial observations
4.2 Roush and Stanford on Perrin
4.3 From Brownian motion to the reality of atoms
4.4 What we know about atoms
5 Conclusion
Advance directives (ADs) are assumed to reflect the patients’ preferences, even if these are not clearly expressed. Research into whether this assumption is correct has been lacking. This study explores to what extent ADs reflect the true wishes of the signatories.
Consciousness is scientifically challenging to study because of its subjective aspect. This leads researchers to rely on report-based experimental paradigms in order to discover neural correlates of consciousness (NCCs). I argue that the reliance on reports has biased the search for NCCs, thus creating what I call 'methodological artefacts'. This paper has three main goals: first, describe the measurement problem in consciousness science and argue that this problem led to the emergence of methodological artefacts. Second, provide a critical assessment of the NCCs put forward by the global neuronal workspace theory. Third, provide the means of dissociating genuine NCCs from methodological artefacts.
Conditional structures lie at the heart of the sciences, humanities, and everyday reasoning. It is hence not surprising that conditional logics – logics specifically designed to account for natural language conditionals – are an active and interdisciplinary area of research. The present book gives a formal and a philosophical account of indicative and counterfactual conditionals in terms of Chellas-Segerberg semantics. For that purpose a range of topics is discussed, such as Bennett’s arguments against truth value based semantics for indicative conditionals.
Is perceptual processing in dedicated sensory areas sufficient for conscious perception? Localists say ‘Yes—given some background conditions.’ Prefrontalists say ‘No: conscious perceptual experience requires the involvement of prefrontal structures.’ I review the evidence for prefrontalism. I start by presenting correlational evidence. In doing so, I answer the ‘report argument’, according to which the apparent involvement of the prefrontal cortex in consciousness stems from the requirement for reports. I then review causal evidence for prefrontalism and answer the ‘lesion argument’, which purports to show that prefrontalism is wrong because lesions to the prefrontal cortex do not abolish consciousness. I conclude that multiple sources of evidence converge toward the view that the prefrontal cortex plays a significant role in consciousness.
When discussing the philosophical question of the relation between mind and nature, dualistic approaches are often contrasted with scientistic approaches. However, mind can be situated in nature in a non-scientistic manner and outside of nature in a non-dualistic manner. John McDowell represents the first approach, as he connects mind to our second nature. In his attempt to specify the categorial relation between first and second nature, McDowell finds himself in a dilemma which cannot be solved within his framework. The second approach is represented by Nicolai Hartmann, for whom mind does not belong to nature, but to the real world. Hartmann’s ontology of layers is able to avoid McDowell’s dilemma, and it makes the unity of the real world intelligible.