When making consumption choices, people often fail to meet their own standards of both ethics and frugality. People also generally tend to demand more of others than they do of themselves. But little is known about how these different types of hypocrisy interact, particularly in relation to attitudes toward ethical consumption. In three experiments, we integrate research on anchoring and hypocrisy within the context of ethical consumption. We find a default expectation that people should spend less on consumer items than they actually do. This default position can be inverted by making the ethical context of consumption salient, whereby the expectation is then that people should spend more on consumer items than they actually do. Experiments 2 and 3 show that a moderate price anchor for ethical consumption is sufficient to shift expected standards for other people, but a higher price anchor is required to shift expected standards in personal behaviour. We discuss the countervailing roles of frugality and ethical consumption in understanding hypocrisy and ethical decision-making.
A symposium, ‘Rethinking Repetition in a Digital Age’, was held at the Centre for Research in the Arts, Social Sciences and Humanities at the University of Cambridge on 12 June 2019, at which Geoff Stead, a leading mobile tech designer, was a keynote speaker. The focus of the Cambridge event was on how the potentials of digital technologies—whose harms have received widespread attention—could be redirected for the social good. For Stead, this is precisely what Babbel are doing in their approach to commercial digital language learning. Stead spoke to the idea of reversing our personal relationships to mechanical affordances, and of finding empowerment in understanding their designed logics. The transcript of the interview below, conducted in October 2021, revisits some of the main points he raised at that event.
Michela Bella & Matteo Santarelli – What did you know about Pragmatism when you started? Where did you start as a student? Charlene Haddock Seigfried – I came to pragmatism by way of existentialism. During the late sixties, I took my first graduate class at the University of Southern California – an introduction to empiricism – which I didn’t like at all, and I also attended a lecture on existentialism, which intrigued me. But I was always interested in social (...) and political issues and I was m... (shrink)
I critically discuss two claims which Hannah Ginsborg makes on behalf of her account of meaning in terms of ‘primitive normativity’: first, that it avoids the sceptical regress articulated by Kripke's Wittgenstein; second, that it makes sense of the thought—central to Kripke's Wittgenstein—that ‘meaning is normative’, in a way which shows this thought not only to be immune from recent criticisms but also to undermine reductively naturalistic theories of content. In the course of the discussion, I consider and attempt to shed light on a number of issues: the structure of the sceptical regress; the content of the thought that ‘meaning is normative’, and its force against reductive theories; the connection between meaning and justification; and the notion of ‘primitive normativity’.
Inspired by the writings of J. M. Hinton (1967a, 1967b, 1973), but ushered into the mainstream by Paul Snowdon (1980–1, 1990–1), John McDowell (1982, 1986), and M. G. F. Martin (2002, 2004, 2006), disjunctivism is currently discussed, advocated, and opposed in the philosophy of perception, the theory of knowledge, the theory of practical reason, and the philosophy of action. But what is disjunctivism?
Recent epistemology has reflected a growing interest in issues about the value of knowledge and the values informing epistemic appraisal. Is knowledge more valuable than merely true belief or even justified true belief? Is truth the central value informing epistemic appraisal or do other values enter the picture? Epistemic Value is a collection of previously unpublished articles on such issues by leading philosophers in the field. It will stimulate discussion of the nature of knowledge and of directions that might be taken by the theory of knowledge. The contributors are Jason Baehr, Michael Brady, Berit Brogaard, Michael DePaul, Pascal Engel, Catherine Elgin, Alvin Goldman, John Greco, Stephen Grimm, Ward Jones, Martin Kusch, Jonathan Kvanvig, Michael Lynch, Erik Olsson, Wayne Riggs and Matthew Weiner.
This paper argues against the view that trolley cases are of little or no relevance to the ethics of automated vehicles. Four arguments for this view are outlined and rejected: the Not Going to Happen Argument, the Moral Difference Argument, the Impossible Deliberation Argument and the Wrong Question Argument. In making clear where these arguments go wrong, a positive account is developed of how trolley cases can inform the ethics of automated vehicles.
Derided and disregarded by many of his contemporaries, Michel Foucault is now regarded as probably the most influential thinker of the twentieth century; his work is studied across the humanities and social sciences. Reading Foucault, however, can be a challenge, as can writing about him, but in Understanding Foucault, the authors offer an entertaining and informative introduction to his thinking. They cover all the issues Foucault dealt with, including power, knowledge, subjectivity and sexuality, and discuss the development of his analysis throughout his work.
1. Introduction – Geoff Cockfield, Ann Firth and John Laurent
2. The Role of Thumos in Adam Smith’s System – Lisa Hill
3. Adam Smith’s Treatment of the Greeks in The Theory of Moral Sentiments: The Case of Aristotle – Richard Temple-Smith
4. Adam Smith, Religion and the Scottish Enlightenment – Pete Clarke
5. The ‘New View’ of Adam Smith and the Development of his Views Over Time – James E. Alvey
6. The Moon Before the Dawn: A Seventeenth-Century Precursor of Smith’s The Theory of Moral Sentiments – Jack Barbalet
7. Adam Smith’s Moral Philosophy as Ethical Self-formation – Ann Firth
8. Science and its Applications in The Theory of Moral Sentiments – David Thorpe
9. Adam Smith, Charles Darwin and the Moral Sense – John Laurent and Geoff Cockfield
Traditional approaches tend to regard figuration (and by extension, deference in general) as an essentially marked or playful use of language, which is associated with a pronounced stylistic effect. For linguistic purposes, however, there is no reason for assigning a special place to deferred uses that are stylistically notable — the sorts of usages that people sometimes qualify with a phrase like "figuratively speaking." There is no important linguistic difference between using redcoat to refer to a British soldier and using suit to refer to a corporate executive (as in "A couple of suits stopped by to talk about the new products"). What creates the stylistic effect of the latter is not the mechanism that generates it, but the marked background assumptions that license it — here, the playful presupposition that...
The phenomenon of systematic polysemy offers a fruitful domain for examining the theoretical differences between lexicological and lexicographic approaches to description. We consider here the process that provides for systematic conversion of count to mass nouns in English (a chicken → chicken, an oak → oak, etc.). From the point of view of lexicology, we argue, standard syntactic and pragmatic tests suggest the phenomenon should be described by means of a single unindividuated transfer function that does not distinguish between interpretations (rabbit = "meat" vs. "fur"). From the point of view of lexicography, however, these pragmatically determined "sense precisions" are made part of explicit description via the inclusion of semantic "licenses," a mechanism distinct from lexical rules.
Contextualism in epistemology has traditionally been understood as the view that “know” functions semantically like an indexical term, encoding different contents in contexts with different epistemic standards. But the indexical hypothesis about “know” faces a range of objections. This article explores an alternative version of contextualism on which “know” is a semantically stable term, and the truth-conditional variability in knowledge claims is a matter of pragmatic enrichment. The central idea is that in contexts with stringent epistemic standards, knowledge claims are narrowed: “know” is used in such contexts to make assertions about particularly demanding types of knowledge. The resulting picture captures all of the intuitive data that motivate contextualism while sidestepping the controversial linguistic thesis at its heart. After developing the view, the article shows in detail how it avoids one influential linguistic objection to traditional contextualism concerning indirect speech reports, and then answers an objection concerning the unavailability of certain types of clarification speeches.
Alvin Plantinga has argued that evolutionary naturalism (the idea that God does not tinker with evolution) undermines its own rationality. Natural selection is concerned with survival and reproduction, and false beliefs conjoined with complementary motivational drives could serve the same aims as true beliefs. Thus, argues Plantinga, if we believe we evolved naturally, we should not think our beliefs are, on average, likely to be true, including our beliefs in evolution and naturalism. I argue herein that our cognitive faculties are less reliable than we often take them to be, that it is theism which has difficulty explaining the nature of our cognition, that much of our knowledge is not passed through biological evolution but learned and transferred through culture, and that the unreliability of our cognition helps explain the usefulness of science.
Allshorn, Geoff – I believe the year in which I was born to be a very important year, perhaps not surprisingly, but particularly because of other events which would ultimately become significant in my own life.
Teaching Secondary Science: Theory and Practice provides a dynamic approach to preparing preservice science teachers for practice. Divided into two parts - theory and practice - the text allows students to first become confident in the theory of teaching science before showing how this theory can be applied to practice through ideas for implementation, such as sample lesson plans. These examples span a variety of age levels and subject areas, allowing preservice teachers to adapt each exercise to suit their needs when they enter the classroom. Each chapter is supported by pedagogical features, including learning objectives, reflections, scenarios, key terms, questions, research topics and further readings. Written by leading science education researchers from universities across Australia, Teaching Secondary Science is a practical resource that will continue to inspire preservice teachers as they move from study into the classroom. This book includes a single-use twelve-month subscription to Cambridge Dynamic Science.
Suppose a driverless car encounters a scenario where harm to at least one person is unavoidable and a choice about how to distribute harms between different persons is required. How should the driverless car be programmed to behave in this situation? I call this the moral design problem. Santoni de Sio defends a legal-philosophical approach to this problem, which aims to bring us to a consensus on the moral design problem despite our disagreements about which moral principles provide the correct account of justified harm. He then articulates an answer to the moral design problem based on the legal doctrine of necessity. In this paper, I argue that Santoni de Sio’s answer to the moral design problem does not achieve the aim of the legal-philosophical approach. This is because his answer relies on moral principles which, at least, utilitarians have reason to reject. I then articulate an alternative reading of the doctrine of necessity, and construct a partial answer to the moral design problem based on this. I argue that utilitarians, contractualists and deontologists can agree on this partial answer, even if they disagree about which moral principles offer the correct account of justified harm.
In his excellent essay, ‘Nudges in a post-truth world’, Neil Levy argues that ‘nudges to reason’, or nudges which aim to make us more receptive to evidence, are morally permissible. A strong argument against the moral permissibility of nudging is that nudges fail to respect the autonomy of the individuals affected by them. Levy argues that nudges to reason do respect individual autonomy, such that the standard autonomy objection fails against nudges to reason. In this paper, I argue that Levy fails to show that nudges to reason respect individual autonomy.
This book will be of interest to any person, whether an interested party, student, or scholar of the Roman Empire. It highlights the way in which we should consider ancient figures—be they good or bad.
The product/process distinction with regard to “argument” has a longstanding history and foundational role in argumentation theory. I shall argue that, regardless of one’s chosen ontology of arguments, arguments are not the product of some process of arguing. Hence, appeal to the distinction is distorting the very organizational foundations of argumentation theory and should be abandoned.
Transitional justice scholars are increasingly concerned with measuring the impact of transitional justice (TJ) initiatives. Scholars often assume that TJ mechanisms must be properly designed and ordered to achieve lasting effect, but the impact of TJ timing and sequencing has attracted relatively little theoretical or empirical attention. Focusing on Latin America, this article explores variation within the region as to when TJ occurs and the order in which mechanisms are implemented. We utilize qualitative comparative analysis to assess the impact of TJ timing and sequencing on democratic development. We find little evidence for path dependency owing to the chronological order of mechanisms. We do find, however, that amnesties and trials approach a sufficient condition for democratic consolidation in Latin America; trials, however, come closest to being a necessary condition for successful democratic consolidation.
Even if our justified beliefs are closed under known entailment, there may still be instances of transmission failure. Transmission failure occurs when P entails Q, but a subject cannot acquire a justified belief that Q by deducing it from P. Paradigm cases of transmission failure involve inferences from mundane beliefs (e.g., that the wall in front of you is red) to the denials of skeptical hypotheses relative to those beliefs (e.g., that the wall in front of you is not white and lit by red lights). According to the Bayesian explanation, transmission failure occurs when (i) the subject’s belief that P is based on E, and (ii) P(Q|E) ≤ P(Q). Some paradigm cases of transmission failure, however, do not satisfy these conditions. No modifications of the Bayesian explanation are capable of accommodating such cases, so the explanation must be rejected as inadequate. Alternative explanations employing simple subjunctive conditionals are fully capable of capturing all of the paradigm cases, as well as those missed by the Bayesian explanation.
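To make condition (ii) concrete, here is a minimal sketch in Python of the red-wall case under an invented toy model (the priors, predicates and numbers are illustrative assumptions, not the paper's): the wall's looking red supports its being red, yet slightly lowers the probability that it is not white and lit by red lights.

```python
# A toy model of the red-wall case (illustrative assumptions only).
# Worlds are (wall colour, lighting) pairs; colour and lighting are
# independent, with made-up priors.
from itertools import product

P_RED_WALL = 0.9    # prior probability the wall is red
P_RED_LIGHT = 0.01  # prior probability the wall is lit by red lights

def prior(colour, light):
    p = P_RED_WALL if colour == "red" else 1 - P_RED_WALL
    q = P_RED_LIGHT if light == "red-light" else 1 - P_RED_LIGHT
    return p * q

def E(colour, light):  # evidence E: the wall looks red
    return colour == "red" or light == "red-light"

def P(colour, light):  # P: the wall is red
    return colour == "red"

def Q(colour, light):  # Q: not (a white wall lit by red lights)
    return not (colour == "white" and light == "red-light")

worlds = list(product(["red", "white"], ["normal", "red-light"]))

def pr(event, given=lambda c, l: True):
    num = sum(prior(c, l) for c, l in worlds if event(c, l) and given(c, l))
    den = sum(prior(c, l) for c, l in worlds if given(c, l))
    return num / den

print(f"P(P) = {pr(P):.4f}   P(P|E) = {pr(P, E):.4f}")  # E raises P
print(f"P(Q) = {pr(Q):.4f}   P(Q|E) = {pr(Q, E):.4f}")  # E lowers Q
```

With these made-up numbers, P(P|E) ≈ 0.9989 > P(P) = 0.9, while P(Q|E) ≈ 0.9989 < P(Q) = 0.999: the evidence supports the mundane belief but not the denial of the skeptical hypothesis, which is exactly the structure condition (ii) describes.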
Collective intelligence is much talked about but remains very underdeveloped as a field. There are small pockets in computer science and psychology and fragments in other fields, ranging from economics to biology. New networks and social media also provide a rich source of emerging evidence. However, there are surprisingly few usable theories, and many of the fashionable claims have not stood up to scrutiny. The field of analysis should be how intelligence is organised at large scale—in organisations, cities, nations and networks. The paper sets out some of the potential theoretical building blocks, suggests an experimental and research agenda, shows how it could be analysed within an organisation or business sector and points to the possible intellectual barriers to progress.
Arendt famously pointed out that only citizenship actually confers rights in the modern world. To be a citizen is to be one who has the ‘right to have rights’. Arendt’s analysis emerges out of her recognition that there is a contradiction between this way of conferring rights as tied to the nation-state system and the more philosophical and ethical conceptions of the ‘rights of man’ and notions of ‘human rights’ like those championed by thinkers such as Immanuel Kant, who understands rights as belonging universally to all humans as a result of facts having to do with what it means to be human. Étienne Balibar, in his recent work, adds to this by pointing out that there is a contradictory movement between this universalizing tendency in philosophical thought and the production of the citizen-subject out of the exclusionary acts of law and force. In this article, I put Balibar’s work in dialogue with the contemporary moment where we are witnessing the re-emergence of a nativist right populism. I use Balibar to help distinguish between three modes of political existence that we find today. Two of these three are more or less well understood. They are the non-citizen, who has no – or almost no – rights in a given nation-state, and the citizen, who enjoys the full benefit of the rights a given nation-state has to give. The third category is what I term the ‘nominal citizen’. This last category is somewhere in between full citizenship and non-citizenship. Individuals in this last category have rights in name but are largely unable to exercise them. Understanding this last category can, among other things, help us at least partially make sense of the return of right populism and also help us see the ways in which the modern category of citizenship, with its contradictions as elaborated by Balibar, can provide a means for resistance.
Following the American indie cinema boom of the 1990s and the creation of "specialty" divisions by several Hollywood studios, many predicted an end to both the indie sector's viability and the making of films with ambitions beyond the commercial mainstream. Yet, as Geoff King demonstrates, plenty of distinct indie productions continue to thrive, even in the face of difficult economic circumstances. Recasting the term "indie" to denote a particular form of independent feature production that has risen to prominence in the twenty-first century, King identifies and discusses the new opportunities available to indie filmmakers. These new options and techniques include low-cost digital video and a range of Internet and social-media ventures providing funding, distribution, promotion, and sales. He also covers the ultra-low-budget "mumblecore" movement; the social realism of such filmmakers as Kelly Reichardt and Ramin Bahrani; the "digital desktop" aesthetics of Jonathan Caouette's Tarnation and Arin Crumley and Susan Buice's Four Eyed Monsters; and the effect of certain dominant discourses, such as the articulation of notions of "true" indie film and its opposition to what some see as the quirky contrivances of crossover hits such as Little Miss Sunshine and Juno. King ultimately locates a strong vein of continuity in indie practice, both industrially and in the textual qualities that define individual features.
The use of machine learning systems for decision-support in healthcare may exacerbate health inequalities. However, recent work suggests that algorithms trained on sufficiently diverse datasets could in principle combat health inequalities. One concern about these algorithms is that their performance for patients in traditionally disadvantaged groups exceeds their performance for patients in traditionally advantaged groups. This renders the algorithmic decisions unfair relative to the standard fairness metrics in machine learning. In this paper, we defend the permissible use of affirmative algorithms; that is, algorithms trained on diverse datasets that perform better for traditionally disadvantaged groups. Whilst such algorithmic decisions may be unfair, the fairness of algorithmic decisions is not the appropriate locus of moral evaluation. What matters is the fairness of final decisions, such as diagnoses, resulting from collaboration between clinicians and algorithms. We argue that affirmative algorithms can permissibly be deployed provided the resultant final decisions are fair.
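As a gloss on what "standard fairness metrics" flag here, the following is a minimal sketch (not from the paper; all data and names are invented) of one such metric, an equal-accuracy check across two patient groups. An "affirmative algorithm" in the authors' sense is one whose gap favours the traditionally disadvantaged group.

```python
# A minimal equal-accuracy fairness check (illustrative data only).

def accuracy(preds, labels):
    """Fraction of predictions that match the true labels."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Hypothetical model outputs for an advantaged group (A) and a
# traditionally disadvantaged group (B).
preds_a, labels_a = [1, 0, 1, 1, 0, 1], [1, 0, 1, 0, 0, 1]
preds_b, labels_b = [1, 0, 0, 1, 1, 0], [1, 0, 0, 1, 1, 0]

acc_a = accuracy(preds_a, labels_a)  # ~0.83
acc_b = accuracy(preds_b, labels_b)  # 1.00

# An equal-accuracy metric flags any non-zero gap as unfair, whichever
# group it favours -- the feature of such metrics the paper pushes against.
print(f"accuracy A: {acc_a:.2f}, accuracy B: {acc_b:.2f}, gap: {acc_b - acc_a:+.2f}")
```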
The Thousands of Problems for Theorem Provers (TPTP) World is a well-established infrastructure that supports research, development and deployment of automated theorem proving systems. This paper provides an overview of the logic languages of the TPTP World, from classical first-order form (FOF), through typed FOF, up to typed higher-order form, and beyond to non-classical forms. The logic languages are described in a non-technical way and are illustrated with examples using the TPTP language.
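For a rough flavour of the simplest of these languages, here is a classical first-order (FOF) example in TPTP syntax. The syntax follows the public TPTP documentation and is my own illustration, not an example taken from this paper: each annotated formula has the form fof(name, role, formula), with ! for universal quantification and => for implication.

```
fof(all_humans_mortal, axiom,
    ! [X] : ( human(X) => mortal(X) ) ).

fof(socrates_human, axiom,
    human(socrates) ).

fof(socrates_mortal, conjecture,
    mortal(socrates) ).
```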
Whether it seems that you know something depends in part upon practical factors. When the stakes are low, it can seem to you that you know that p, but when the stakes go up it'll seem to you that you don't. The apparent sensitivity of knowledge to stakes presents a serious challenge to epistemologists who endorse a stable semantics for knowledge attributions and reject the idea that whether you know something depends on how much is at stake. After arguing that (...) previous attempts to meet this challenge fall short, I offer a new solution: the unassertability account. The account starts with the observation that high stakes subjects aren't in an epistemic position to assert. We generally presuppose that knowing is sufficient for epistemically proper assertion, but this presupposition only stands up to scrutiny if we draw a distinction between two notions of epistemic propriety, and we shouldn't expect ordinary speakers to draw it. A subject in a high stakes situation who fails to draw the distinction will be led by the sufficiency claim to treat anything she isn't in a position to assert as something she isn't in a position to know. The sensitivity of epistemically proper assertion to practical factors explains the merely apparent sensitivity of knowledge to stakes. (shrink)
Suppose that an autonomous vehicle encounters a situation where (i) imposing a risk of harm on at least one person is unavoidable; and (ii) a choice about how to allocate risks of harm between different persons is required. What does morality require in these cases? Derek Leben defends a Rawlsian answer to this question. I argue that we have reason to reject Leben’s answer.
This book provides an integrated and philosophically grounded framework which enables a coherent approach to organizations and organizational ethics from the perspectives of practitioners in the workplace, of managers in organizations, and of organizations themselves.
Some expressions of English, like the demonstratives ‘this’ and ‘that’, are referentially promiscuous: distinct free occurrences of them in the same sentence can differ in content relative to the same context. One lesson of referentially promiscuous expressions is that basic logical properties like validity and logical truth obtain or fail to obtain only relative to a context. This approach to logic can be developed in just as rigorous a manner as David Kaplan’s classic logic of demonstratives. The result is a logic that applies to arguments in English containing multiple occurrences of referentially promiscuous expressions.
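As a schematic illustration of context-relative logical truth (my own gloss under assumed notation, not the paper's formalism), index the distinct free occurrences of 'that' and let d_c(i) be the object a context c assigns to the i-th occurrence:

```latex
% Assumed notation: that_i is the i-th free occurrence of `that';
% d_c(i) is the object context c assigns to that occurrence.
\[
  \mathrm{that}_1 = \mathrm{that}_2 \ \text{ is true at } c
  \quad\Longleftrightarrow\quad
  d_c(1) = d_c(2).
\]
% The sentence is true at contexts that assign both occurrences the same
% object and false at contexts that do not, so whether it counts as
% logically true can only be settled relative to a context.
```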