The mere exposure effect is the increase in positive affect that results from repeated exposure to previously novel stimuli. We sought to determine whether judgments other than affective preference could reliably produce a mere exposure effect for two-dimensional random shapes. In two experiments, we found that brighter and darker judgments did not differentiate target from distracter shapes, liking judgments led to target selection greater than chance, and disliking judgments led to distracter selection greater than chance. These results for brighter, darker, and liking judgments were obtained regardless of whether shape recognition was greater (Experiment 1) or not greater (Experiment 2) than chance. Effects of prior exposure to novel shapes were reliably observed only for affective judgment tasks. These results are inconsistent with general predictions made by the nonspecific activation hypothesis, but not with the affective primacy or perceptual fluency hypotheses, which were discussed in terms of cognitive neuroscience research.
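To make concrete what "greater than chance" amounts to in a two-alternative forced choice between target and distracter, here is a minimal sketch of a binomial test against an assumed chance rate of 0.5; the counts and the use of scipy are illustrative assumptions, not details taken from the experiments above.

```python
# Minimal sketch: testing whether target shapes were chosen more often than
# chance in a two-alternative forced choice. The counts and the chance level
# of 0.5 are hypothetical, not the paper's data.
from scipy.stats import binomtest

target_choices = 70   # hypothetical number of trials on which the target was picked
total_trials = 100

result = binomtest(target_choices, total_trials, p=0.5, alternative="greater")
print(result.pvalue)  # a small p-value indicates selection above chance
```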
Writing about the intellectual development of a philosopher is a delicate business. My own endeavor to reinterpret the influence of Hegel on Dewey troubles some scholars because, they believe, I make Dewey seem less original.[1] But if, like Dewey, we overcome Cartesian dualism, placing the development of the self firmly within a complex matrix of social processes, we are forced to reexamine, without necessarily surrendering, the notion of individual originality, or what Neil Gross calls "discourse[s] of creative genius."[2] To use a mundane example, I can recall several conversations with Dewey scholars about his dislike for his home state of Vermont, all of which revolved around personal reasons he may…
The following is a transcript of the interview I (Yasuko Kitano) conducted with Neil Levy (The Centre for Applied Philosophy and Public Ethics, CAPPE) on 23 July 2009, while he was in Tokyo to give a series of lectures on neuroethics at The University of Tokyo Center for Philosophy. I edited his words for publication with his approval.
Neil E. Williams develops a systematic metaphysics centred on the idea of powers, as a rival to neo-Humeanism, the dominant systematic metaphysics in philosophy today. Williams takes powers to be inherently causal properties and uses them as the foundation of his explanations of causation, persistence, laws, and modality.
There is good evidence that many people harbour attitudes that conflict with those they endorse. In the language of social psychology, they seem to have implicit attitudes that conflict with their explicit beliefs. A great deal of attention has been paid to the question whether agents like this are responsible for actions caused by their implicit attitudes, but much less to the question whether they can rightly be described as racist in virtue of harbouring them. In this paper, I attempt to answer this question using three different standards, provided by the three dominant kinds of accounts of racism. I argue that on none of these accounts should agents like this be described as racists. However, it would be misleading to say, without qualification, that they are not racists. On none of these accounts are agents like this entirely off the hook.
This work gives an extended presentation of the treatment of variable-binding operators adumbrated in [3:1993d]. Illustrative examples include elementary languages with quantifiers and lambda-equipped categorial languages. Some remarks are also offered to illustrate the philosophical import of the resulting picture. Particularly, a certain conception of logic emerges from the account: the view that logics are true theories in the model-theoretic sense, i.e. the result of selecting a certain class of models as the only “admissible” interpretation structures (for a given language).
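As orientation only, and not claimed to be the paper's own treatment, one familiar way of handling variable-binding operators in a lambda-equipped language is to read each binder as an operator applied to a lambda abstract, so that lambda does all the binding:

```latex
% One familiar treatment of variable binding, offered as orientation only,
% not as the paper's own account: a quantifier is read as an operator \Pi
% applied to a lambda abstract, so the lambda does the binding.
\[
  \forall x\,\varphi \;\;\rightsquigarrow\;\; \Pi(\lambda x.\,\varphi)
\]
```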
On his death in 2007, Richard Rorty was heralded by the New York Times as "one of the world's most influential contemporary thinkers." Controversial on the left and the right for his critiques of objectivity and political radicalism, Rorty experienced a renown denied to all but a handful of living philosophers. In this masterly biography, Neil Gross explores the path of Rorty's thought over the decades in order to trace the intellectual and professional journey that led him to that prominence. The child of a pair of leftist writers who worried that their precocious son "wasn't rebellious enough," Rorty enrolled at the University of Chicago at the age of fifteen. There he came under the tutelage of polymath Richard McKeon, whose catholic approach to philosophical systems would profoundly influence Rorty's own thought. Doctoral work at Yale led to Rorty's landing a job at Princeton, where his colleagues were primarily analytic philosophers. With a series of publications in the 1960s, Rorty quickly established himself as a strong thinker in that tradition. By the late 1970s, however, Rorty had eschewed the idea of objective truth altogether, urging philosophers to take a "relaxed attitude" toward the question of logical rigor. Drawing on the pragmatism of John Dewey, he argued that philosophers should instead open themselves up to multiple methods of thought and sources of knowledge, an approach that would culminate in the publication of Philosophy and the Mirror of Nature, one of the most seminal and controversial philosophical works of our time. In clear and compelling fashion, Gross sets that surprising shift in Rorty's thought in the context of his life and social experiences, revealing the many disparate influences that contribute to the making of knowledge. As much a book about the growth of ideas as it is a biography of a philosopher, Richard Rorty will provide readers with a fresh understanding of both the man and the course of twentieth-century thought.
Since Bruno Latour's discussion of a Sakhalin island map used by La Pérouse as part of a global network of "immutable mobiles," the commensurability of European and non-European knowledge has become an important issue for historians of science. But recent studies have challenged these dichotomous categories as reductive and inadequate for understanding the fluid nature of identities, their relational origins, and their historically constituted character. Itineraries of knowledge transfer, traced in the wake of objects and individuals, offer a powerful heuristic alternative, bypassing artificial epistemological divides and avoiding the limited scale of national or monolingual frames. Approaches that place undue emphasis either on the omnipotence of the imperial center or the centrality of the colonial periphery see only half the picture. Instead, practices of knowledge collection, codification, elaboration, and dissemination (in European, indigenous, and mixed or hybrid contexts) can be better understood by following their moveable parts, with a keen sensitivity toward non-normative epistemologies and more profound temporal frameworks.
The present work aims "to provide the Latinist/Romance scholar with Greek evidence for a range of Vulgar Latin phenomena"…; "whoever pursues a primarily Hellenist interest … will find an answer to the question of which 'unusual' spellings the author takes to reflect processes in the history of the Latin language." The author investigates Greek-Latin language contact in the imperial period and the early and middle Byzantine era, doing so on the basis of the Latin loanwords in Greek from antiquity down to the Modern Greek dialects, rather than, as is usual, the reverse influence. "Loanwords are an excellent source for language history, for the receiving as well as for the donor language", … "witnesses to a historical process, namely language contact and hence cultural contact;" … they "allow phenomena of acculturation to be shown". That Byzantine and Modern Greek contain a wealth of Latin loanwords, to be explained by the historical role of Byzantium as successor to the Roman Empire but also by the presence of Romance speakers in south-eastern Europe to this day, occupied research more intensively at the end of the 19th century and in the first decades of the 20th, and in her "Introduction" the author engages in detail with the results of her predecessors. Whether these had a classical, papyrological, epigraphic, Romance-philological, or Byzantine and Modern Greek background, the topic, as the author rightly notes, has lain practically dormant since the 1930s, after the publications of Zilliacus and Dölger's extensive review, apart from the work of the two Kahanes, in which, admittedly, "phonological and morphological matters are left out of consideration". Thus, for Romance philology, except for J. Kramer and his school, "Balkan Latin" remains a fixed quantity: to it are traditionally reckoned Romanian with its dialects, the Latin loan vocabulary in Albanian, Dalmatian, and precisely the Latin component in Byzantine and Modern Greek.
Philosophical tradition and conspiracy theorists converge in suggesting that ordinary people ought to do their own research, rather than accept the word of others. In this paper, I argue that it's no accident that conspiracy theorists value lay research on expert topics: such research is likely to undermine knowledge, via its effects on truth and justification. Accepting expert testimony is a far more reliable route to truth. Nevertheless, lay research has a range of benefits; in particular, it is likely to lead to greater understanding, even when it does not lead to knowledge. I argue that we can reap most of the genuine benefits of lay research while minimizing the risks by engaging in exploratory, rather than truth-directed, inquiry. To engage in exploratory inquiry is to engage dogmatically, expecting to be unable to confirm the expert view or to disconfirm rivals.
For more than 20 years, research has proven the beneficial effect of natural frequencies when it comes to solving Bayesian reasoning tasks (Gigerenzer & Hoffrage, 1995). In a recent meta-analysis, McDowell & Jacobs (2017) showed that presenting a task in natural frequency format increases performance rates to 24% compared to only 4% when the same task is presented in probability format. Nevertheless, on average three quarters of participants in their meta-analysis failed to obtain the correct solution for such a task in frequency format. In this paper, we present an empirical study on what participants typically do wrong when confronted with natural frequencies. We found that many of them did not actually use natural frequencies for their calculations, but translated them back into complicated probabilities instead. This switch from the intuitive presentation format to a less intuitive calculation format will be discussed within the framework of psychological theories (e.g., the Einstellung effect).
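To make the contrast between the two formats concrete, here is a minimal sketch of one and the same Bayesian task computed once in probability format and once in natural frequency format. The disease-screening numbers are the classic illustrative values often used in this literature, not the specific items from the study reported above.

```python
# Illustrative Bayesian task (hypothetical numbers, not taken from the paper):
# a condition with 1% prevalence, a test with 80% sensitivity and a
# 9.6% false-positive rate. What is P(condition | positive test)?

# Probability format: apply Bayes' theorem directly.
prevalence = 0.01
sensitivity = 0.80
false_positive_rate = 0.096

p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
posterior_prob = sensitivity * prevalence / p_positive

# Natural frequency format: imagine 1000 people and simply count cases.
population = 1000
sick = round(prevalence * population)                                    # 10 people
sick_and_positive = round(sensitivity * sick)                            # 8 people
healthy_and_positive = round(false_positive_rate * (population - sick))  # 95 people
posterior_freq = sick_and_positive / (sick_and_positive + healthy_and_positive)

print(f"probability format:       {posterior_prob:.3f}")   # ~0.078
print(f"natural frequency format: {posterior_freq:.3f}")   # 8 out of 103, ~0.078
```

In the frequency version the answer can be read off as "8 out of 103 people who test positive are sick", which is the kind of simplification the natural frequency format is meant to provide; translating the counts back into conditional probabilities forfeits exactly this advantage.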
The Taming of the True poses a broad challenge to realist views of meaning and truth that have been prominent in recent philosophy. Neil Tennant argues compellingly that every truth is knowable, and that an effective logical system can be based on this principle. He lays the foundations for global semantic anti-realism and extends its consequences from the philosophy of mathematics and logic to the theory of meaning, metaphysics, and epistemology.
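The anti-realist principle at issue, that every truth is knowable, is standardly rendered in schematic form as below, where K is a knowledge operator and the diamond expresses possibility; this is offered as the standard gloss, not as Tennant's own formalization.

```latex
% Knowability principle, schematic form: if phi is true, it is possible to know phi.
\[
  \varphi \;\rightarrow\; \Diamond K\varphi
\]
```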
Neil Levy defends no-platforming people who espouse dangerous or unacceptable views. I reject his notion of higher-order evidence as authoritarian and dogmatic. I argue that no-platforming frustrates the growth of knowledge.
What is the opposite of freedom? In _Freedom as Marronage_, Neil Roberts answers this question with definitive force: slavery, and from there he unveils powerful new insights on the human condition as it has been understood between these poles. Crucial to his investigation is the concept of marronage, a form of slave escape that was an important aspect of Caribbean and Latin American slave systems. Examining this overlooked phenomenon, one of action from slavery and toward freedom, he deepens our understanding of freedom itself and the origin of our political ideals. Roberts examines the liminal and transitional space of slave escape in order to develop a theory of freedom as marronage, which contends that freedom is fundamentally located within this space: that it is a form of perpetual flight. He engages a stunning variety of writers, including Hannah Arendt, W. E. B. Du Bois, Angela Davis, Frederick Douglass, Samuel Taylor Coleridge, and the Rastafari, among others, to develop a compelling lens through which to interpret the quandaries of slavery, freedom, and politics that still confront us today. The result is a sophisticated, interdisciplinary work that unsettles the ways we think about freedom by always casting it in the light of its critical opposite.
Given the tremendous proliferation of student plagiarism involving the Internet, the purpose of this study is to determine which theory of ethical reasoning students invoke when defending their transgressions: deontology, utilitarianism, rational self-interest, Machiavellianism, cultural relativism, or situational ethics. Understanding which theory of ethical reasoning students employ is critical, as preemptive steps can be taken by faculty to counteract this reasoning and prevent plagiarism. Additionally, it has been demonstrated that unethical behavior in school can lead to unethical behavior in business; therefore, correcting unethical behavior in school can have a positive impact on organizational ethics. To meet this objective, a content analysis was conducted on the written records of students formally charged with plagiarizing at a large West Coast university. Each case was classified according to the primary ethical reasoning that the student used to justify plagiarism. Results indicate that students predominantly invoke deontology, situational ethics, and Machiavellianism. Based on these findings, specific recommendations are offered to curb plagiarism.
Informed consent is a central topic in contemporary biomedical ethics. Yet attempts to set defensible and feasible standards for consenting have led to persistent difficulties. In Rethinking Informed Consent in Bioethics, Neil Manson and Onora O'Neill set debates about informed consent in medicine and research in a fresh light. They show why informed consent cannot be fully specific or fully explicit, and why more specific consent is not always ethically better. They argue that consent needs distinctive communicative transactions, by which other obligations, prohibitions, and rights can be waived or set aside in controlled and specific ways. Their book offers a coherent, wide-ranging and practical account of the role of consent in biomedicine which will be valuable to readers working in a range of areas in bioethics, medicine and law.
Neil Gascoigne provides the first comprehensive introduction to Richard Rorty's work. He demonstrates to the general reader and to the student of philosophy alike how the radical views on truth, objectivity and rationality expressed in Rorty's widely read essays on contemporary culture and politics derive from his earliest work in the philosophy of mind and language. He avoids the partisanship that characterizes much discussion of Rorty's work whilst providing a critical account of some of the dominant concerns of contemporary thought. Beginning with Rorty's early work on concept-change in the philosophy of mind, the book traces his increasing hostility to the idea that philosophy is cognitively privileged with respect to other disciplines. After the publication of Philosophy and the Mirror of Nature, this led to a new emphasis on preserving the moral and political inheritance of the Enlightenment by detaching it from the traditional search for rational foundations. This emerging project led Rorty to champion 'ironic' thinkers like Foucault and Derrida, and to his attempt to update the liberalism of J. S. Mill by offering a non-universalistic account of the individual's need to balance their own private interests against their commitments to others. By returning him to his philosophical roots, Gascoigne shows why Rorty's pragmatism is of continuing relevance to anyone interested in ongoing debates about the nature and limits of philosophy, and the implications these debates have for our understanding of what role the intellectual might play in contemporary life. This book serves as both an excellent introduction to Rorty's work and an innovative critique which contributes to ongoing debates in the field.
Suppose that one thinks that certain symmetries of a theory reveal “surplus structure”. What would a formalism without that surplus structure look like? The conventional answer is that it would be a reduced theory: a theory which traffics only in structures invariant under the relevant symmetry. In this paper, I argue that there is a neglected alternative: one can work with a sophisticated version of the theory, in which the symmetries act as isomorphisms.
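A standard illustration, not drawn from the abstract itself, is electromagnetism: the gauge transformation leaves the field strength, and hence the dynamics, untouched, so a reduced theory works with the field strength alone, while a sophisticated theory keeps the potential but treats gauge-related models as isomorphic.

```latex
% Electromagnetic gauge symmetry: the potential shifts, the field strength does not.
\[
  A_\mu \;\longmapsto\; A_\mu + \partial_\mu \lambda,
  \qquad
  F_{\mu\nu} \;=\; \partial_\mu A_\nu - \partial_\nu A_\mu \;\longmapsto\; F_{\mu\nu}.
\]
```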
In Elements of Legislation, Neil Duxbury examines the history of English law through the lens of legal philosophy in an effort to draw out the differences between judge-made and enacted law and to explain what courts do with the laws that legislatures enact. He presents a series of rigorously researched and carefully rehearsed arguments concerning the law-making functions of legislatures and courts, the concepts of legislative supremacy and judicial review, the nature of legislative intent and the core principles of statutory interpretation.
Neil Tennant presents an original logical system with unusual philosophical, proof-theoretic, metalogical, computational, and revision-theoretic virtues. Core Logic is the first system that ensures both relevance and adequacy for the formalization of all mathematical and scientific reasoning.
This Element explores what it means for two theories in physics to be equivalent, and what lessons can be drawn about their structure as a result. It does so through a twofold approach. On the one hand, it provides a synoptic overview of the logical tools that have been employed in recent philosophy of physics to explore these topics: definition, translation, Ramsey sentences, and category theory. On the other, it provides a detailed case study of how these ideas may be applied to understand the dynamical and spatiotemporal structure of Newtonian mechanics - in particular, in light of the symmetries of Newtonian theory. In so doing, it brings together a great deal of exciting recent work in the literature, and is sure to be a valuable companion for all those interested in these topics.
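For readers who want a concrete instance of the symmetries in question, a Galilean boost is the standard example; it is offered here as an illustration, not as a detail the Element itself spells out.

```latex
% Galilean boost by a constant velocity v: positions shift linearly in time,
% and Newton's second law keeps the same form in the boosted frame.
\[
  x \;\longmapsto\; x' = x + vt, \qquad t' = t,
  \qquad
  m\,\ddot{x}' = m\,\ddot{x} = F .
\]
```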
In this paper, we make the case that a person who is considering or has already made a decision that appears seriously harmful to that person should in some cases be judged incapable of making that...
This study investigated the effects of the 6 Minutes Journal, a commercial diary combining several positive psychology interventions, including gratitude, goal-setting, and self-affirmation exercises, on several mental health outcome measures. In a randomized controlled trial, university students were randomly assigned to one of two groups: a 6MT intervention group and a wait-list control group. Participants in the intervention group were instructed to follow the instructions of the 6MT for 4 weeks. Participants in both groups completed measures of perceived stress, positive and negative affect, self-efficacy and resilience at baseline and after 2 and 4 weeks. We used path analyses with autoregressive and cross-lagged effects to test our hypotheses about the effects of the 6MT. Participants in the intervention group reported decreased levels of perceived stress and negative affect, as well as increased levels of resilience and self-efficacy, compared to the control group. Positive affect was not statistically significantly influenced. The data showed statistically significant increases in self-efficacy and resilience only after 4 weeks, suggesting that changing these constructs needs more time. The 6-minute diary does not appear to make individuals fundamentally more positive. However, the intervention may have a protective function against negative influences on well-being.
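As a rough sketch of what an autoregressive and cross-lagged path looks like in practice, the following estimates a single such path by ordinary least squares on simulated data. All variable names, column names, and numbers are hypothetical, and the study itself fitted full path models across three measurement points rather than one regression.

```python
# Minimal sketch (not the authors' analysis code) of one autoregressive /
# cross-lagged path, estimated by OLS on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
group = rng.integers(0, 2, n)                  # 0 = wait-list control, 1 = 6MT
stress_t1 = rng.normal(0, 1, n)
self_efficacy_t1 = rng.normal(0, 1, n)
# Simulated follow-up stress: autoregressive + cross-lagged + intervention effect.
stress_t2 = (0.6 * stress_t1 - 0.2 * self_efficacy_t1
             - 0.3 * group + rng.normal(0, 1, n))

df = pd.DataFrame(dict(group=group, stress_t1=stress_t1,
                       self_efficacy_t1=self_efficacy_t1, stress_t2=stress_t2))

# stress_t1 carries the autoregressive effect, self_efficacy_t1 the
# cross-lagged effect, and group the intervention effect on later stress.
model = smf.ols("stress_t2 ~ stress_t1 + self_efficacy_t1 + group", data=df).fit()
print(model.summary())
```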
Cognitive Science is a single-source undergraduate text that broadly surveys the theories and empirical results of cognitive science within a consistent computational perspective. In addition to covering the individual contributions of psychology, philosophy, linguistics, and artificial intelligence to cognitive science, the book has been revised to introduce the connectionist approach as well as the classical symbolic approach and adds a new chapter on cognitively related advances in neuroscience. Cognitive science is a rapidly evolving field that is characterized by considerable contention among different views and approaches. Cognitive Science presents these in a relatively neutral manner. It covers many new orientations, theories, and findings, embedding them in an integrated computational perspective and establishing a sense of continuity and contrast with more traditional work in cognitive science. The text assumes no prerequisite knowledge, introducing all topics in a uniform, accessible style. Many topics, such as natural language processing and vision, however, are developed in considerable depth, which allows the book to be used with more advanced undergraduates or even in beginning graduate settings. A Bradford Book.