What is the Moral Problem? NORMATIVE ETHICS VS. META-ETHICS It is a common fact of everyday life that we appraise each other's behaviour and attitudes from ...
This book explores a question central to philosophy--namely, what does it take for a belief to be justified or rational? According to a widespread view, whether one has justification for believing a proposition is determined by how probable that proposition is, given one's evidence. In this book this view is rejected and replaced with another: in order for one to have justification for believing a proposition, one's evidence must normically support it--roughly, one's evidence must make the falsity of that proposition abnormal in the sense of calling for special, independent explanation. This conception of justification bears upon a range of topics in epistemology and beyond. Ultimately, this way of looking at justification guides us to a new, unfamiliar picture of how we should respond to our evidence and manage our own fallibility. This picture is developed here.
There is something puzzling about statistical evidence. One place this manifests is in the law, where courts are reluctant to base affirmative verdicts on evidence that is purely statistical, in spite of the fact that it is perfectly capable of meeting the standards of proof enshrined in legal doctrine. After surveying some proposed explanations for this, I shall outline a new approach – one that makes use of a notion of normalcy that is distinct from the idea of statistical frequency. The puzzle is not, however, merely a legal one. Our unwillingness to base beliefs on statistical evidence is by no means limited to the courtroom, and is at odds with almost every general principle that epistemologists have proposed as to how we ought to manage our beliefs.
According to a captivating picture, epistemic justification is essentially a matter of epistemic or evidential likelihood. While certain problems for this view are well known, it is motivated by a very natural thought—if justification can fall short of epistemic certainty, then what else could it possibly be? In this paper I shall develop an alternative way of thinking about epistemic justification. On this conception, the difference between justification and likelihood turns out to be akin to the more widely recognised difference between ceteris paribus laws and brute statistical generalisations. I go on to discuss, in light of this suggestion, issues such as classical and lottery-driven scepticism as well as the lottery and preface paradoxes.
Is it right to convict a person of a crime on the basis of purely statistical evidence? Many who have considered this question agree that it is not, posing a direct challenge to legal probabilism – the claim that the criminal standard of proof should be understood in terms of a high probability threshold. Some defenders of legal probabilism have, however, held their ground: Schoeman (1987) argues that there are no clear epistemic or moral problems with convictions based on purely statistical evidence, and speculates that our aversion to such convictions may be nothing more than an irrational bias. More recently, Hedden and Colyvan (2019, section VI) describe our reluctance to convict on the basis of purely statistical evidence as an ‘intuition’, but suggest that there may be no ‘in principle’ problem with such convictions (see also Papineau, forthcoming, section 6). In this paper, I argue that there is, in some cases, an in principle problem with a conviction based upon statistical evidence alone – namely, it commits us to a precedent which, if consistently followed through, could lead to the deliberate conviction of an innocent person. I conclude with some reflections on the idea that the criminal justice system should strive to maximise the accuracy of its verdicts – and the related idea that we should each strive to maximise the accuracy of our beliefs.
We ordinarily suppose that there is a difference between having and failing to exercise a rational capacity on the one hand, and lacking a rational capacity altogether on the other. This is crucial for our allocations of responsibility. Someone who has but fails to exercise a capacity is responsible for their failure to exercise their capacity, whereas someone who lacks a capacity altogether is not. However, as Gary Watson pointed out in his seminal essay ’Skepticism about Weakness of Will’, the idea of an unexercised capacity is much more difficult to make sense of than it initially appears. The aim of ’Rational Capacities’ is to provide the needed explication of this idea.
The idea that there is such an analytic connection will hardly come as news. It amounts to no more and no less than an endorsement of the claim that all reasons are 'internal', as opposed to 'external', to use Bernard Williams's terms (Williams 1980). Or, to put things in the way Christine Korsgaard favours, it amounts to an endorsement of the 'internalism requirement' on reasons (Korsgaard 1986). But how exactly is the internalism requirement to be understood? What does it tell us about the nature of reasons? And wherein lies its appeal? My aim in this paper is to answer these questions.
Granted that desire is always present in the genesis of human action, is it something on the presence of which the agent always reflects? I may act on a belief without coming to recognize that I have the belief. Can I act on a desire without recognizing that I have the desire? In particular, can the desire have a motivational presence in my decision making, figuring in the background, as it were, without appearing in the content of my deliberation, in the foreground? We argue, perhaps unsurprisingly, that yes, desire can figure in the background without figuring in the foreground: we call this the strict background view of desire. But we then show, and this is where the surprise comes, that the strict background view of desire has significant implications for contemporary moral philosophy.
Martin Ferguson Smith's work on Lucretius is both well known and highly regarded. However, his 1969 translation of _De Rerum Natura_--long out of print--is virtually unknown. Readers will share our excitement in the discovery of this accurate and fluent prose rendering. For this edition, Professor Smith provides a revised translation, new Introduction, headnotes and bibliography.
One of the most intriguing claims in Sven Rosenkranz’s Justification as Ignorance is that Timothy Williamson’s celebrated anti-luminosity argument can be resisted when it comes to the condition ~ K ~ KP—the condition that one is in no position to know that one is in no position to know P. In this paper, I critically assess this claim.
A ‘lottery belief’ is a belief that a particular ticket has lost a large, fair lottery, based on nothing more than the odds against it winning. The lottery paradox brings out a tension between the idea that lottery beliefs are justified and the idea that one can always justifiably believe the deductive consequences of things that one justifiably believes – what is sometimes called the principle of closure. Many philosophers have treated the lottery paradox as an argument against the second idea – but I make a case here that it is the first idea that should be given up. As I shall show, there are a number of independent arguments for denying that lottery beliefs are justified.
Theories of epistemic justification are commonly assessed by exploring their predictions about particular hypothetical cases – predictions as to whether justification is present or absent in this or that case. With a few exceptions, it is much less common for theories of epistemic justification to be assessed by exploring their predictions about logical principles. The exceptions are a handful of ‘closure’ principles, which have received a lot of attention, and which certain theories of justification are well known to invalidate. But these closure principles are only a small sample of the logical principles that we might consider. In this paper, I will outline four further logical principles that plausibly hold for justification and two which plausibly do not. While my primary aim is just to put these principles forward, I will use them to evaluate some different approaches to justification and (tentatively) conclude that a ‘normic’ theory of justification best captures its logic.
People ordinarily suppose that there are certain things they ought to believe and certain things they ought not to believe. In supposing this to be so, they make corresponding assumptions about their belief-forming capacities. They assume that they are generally responsive to what they think they ought to believe in the things they actually come to believe. In much the same sense, people ordinarily suppose that there are certain things they ought to desire and do and they make corresponding assumptions about their capacities to form desires and act on them. We chart these assumptions and argue that they entail that people are responsible and free on two fronts: they are free and responsible believers and free and responsible desirers.
Although clinical ethics consultation is a high-stakes endeavor with an increasing prominence in health care systems, progress in developing standards for quality is challenging. In this article, we describe the results of a pilot project utilizing portfolios as an evaluation tool. We found that this approach is feasible and resulted in a reasonably wide distribution of scores among the 23 submitted portfolios that we evaluated. We discuss limitations and implications of these results, and suggest that this is a significant step on the pathway to an eventual certification process for clinical ethics consultants.
Michael Smith has written a series of seminal essays about the nature of belief and desire, the status of normative judgment, and the relevance of the views we take on both these topics to the accounts we give of our nature as free and responsible agents. This long awaited collection comprises some of the most influential of Smith's essays. Among the topics covered are: the Humean theory of motivating reasons, the nature of normative reasons, Williams and Korsgaard on internal and external reasons, the nature of self-control, weakness of will, compulsion, freedom, responsibility, the analysis of our rational capacities, moral realism, the dispositional theory of value, the supervenience of the normative on the non-normative, the error theory, rationalist treatments of moral judgment, the practicality requirement on moral judgment, and non-cognitivism. This collection will be of interest to students in philosophy and psychology.
Constitutivism is the view that we can derive a substantive account of normative reasons for action—perhaps a Kantian account, perhaps a hedonistic account, perhaps a desire-fulfillment account, this is up for grabs—from abstract premises about the nature of action and agency. Constitutivists are thus bound together by their conviction that such a derivation is possible, not by their agreement about which substantive reasons can be derived, and not by agreement about the features of action and agency that permit the derivation. In the final section of the penultimate chapter of Reasons, a chapter devoted to discussing the merits of constitutivism, Eric Wiland has this to say: Constitutivism is ambitious. It attempts to extract an account of reasons for action from reflection on the bare ideas of action and agency. So we shouldn’t be surprised if doubts remain. Those who claim to extract reasons out of agency might remind us of those who claim to pull rabbits out of hats. You suspect that there must be a trick.
In this paper, we present the results of two surveys that investigate subjects’ judgments about what can be known or justifiably believed about lottery outcomes on the basis of statistical evidence, testimonial evidence, and “mixed” evidence, while considering possible anchoring and priming effects. We discuss these results in light of seven distinct hypotheses that capture various claims made by philosophers about lay people’s lottery judgments. We conclude by summarizing the main findings, pointing to future research, and comparing our findings to recent studies by Turri and Friedman.
According to the principle of Conjunction Closure, if one has justification for believing each of a set of propositions, one has justification for believing their conjunction. The lottery and preface paradoxes can both be seen as posing challenges for Closure, but leave open familiar strategies for preserving the principle. While this is all relatively well-trodden ground, a new Closure-challenging paradox has recently emerged, in two somewhat different forms, due to Marvin Backes (2019a) and Francesco Praolini (2019). This paradox synthesises elements of the lottery and the preface and is designed to close off the familiar Closure-preserving strategies. By appealing to a normic theory of justification, I will defend Closure in the face of this new paradox. Along the way I will draw more general conclusions about justification, normalcy and defeat, which bear upon what Backes (2019b) has dubbed the ‘easy defeat’ problem for the normic theory.
Our understanding of subjunctive conditionals has been greatly enhanced through the use of possible world semantics and, more precisely, by the idea that they involve variably strict quantification over possible worlds. I propose to extend this treatment to ceteris paribus conditionals – that is, conditionals that incorporate a ceteris paribus or ‘other things being equal’ clause. Although such conditionals are commonly invoked in scientific theorising, they traditionally arouse suspicion and apprehensiveness amongst philosophers. By treating ceteris paribus conditionals as a species of variably strict conditional I hope to shed new light upon their content and their logic.
The notion of risk plays a central role in economics, finance, health, psychology, law and elsewhere, and is prevalent in managing challenges and resources in day-to-day life. In recent work, Duncan Pritchard (2015, 2016) has argued against the orthodox probabilistic conception of risk on which the risk of a hypothetical scenario is determined by how probable it is, and in favour of a modal conception on which the risk of a hypothetical scenario is determined by how modally close it is. In this article, we use Pritchard’s discussion as a springboard for a more wide-ranging discussion of the notion of risk. We introduce three different conceptions of risk: the standard probabilistic conception, Pritchard’s modal conception, and a normalcy conception that is new (though it has some precursors in the psychological literature on risk perception). Ultimately, we argue that the modal conception is ill-suited to the roles that a notion of risk is required to play and explore the prospects for a form of pluralism about risk, embracing both the probabilistic and the normalcy conceptions.
In this paper I draw attention to a peculiar epistemic feature exhibited by certain deductively valid inferences. Certain deductively valid inferences are unable to enhance the reliability of one's belief that the conclusion is true—in a sense that will be fully explained. As I shall show, this feature is demonstrably present in certain philosophically significant inferences—such as G. E. Moore's notorious 'proof' of the existence of the external world. I suggest that this peculiar epistemic feature might be correlated with the much discussed phenomenon that Crispin Wright and Martin Davies have called 'transmission failure'—the apparent failure, on the part of some deductively valid inferences, to transmit one's justification for believing the premises.
In this paper I respond to Marcello Di Bello’s criticisms of the ‘normic account’ of the criminal standard of proof. In so doing, I further elaborate on what the normic account predicts about certain significant legal categories of evidence, including DNA and fingerprint evidence and eyewitness identifications.
Many epistemologists have responded to the lottery paradox by proposing formal rules according to which high probability defeasibly warrants acceptance. Douven and Williamson present an ingenious argument purporting to show that such rules invariably trivialise, in that they reduce to the claim that a probability of 1 warrants acceptance. Douven and Williamson’s argument does, however, rest upon significant assumptions – amongst them a relatively strong structural assumption to the effect that the underlying probability space is both finite and uniform. In this paper, I will show that something very like Douven and Williamson’s argument can in fact survive with much weaker structural assumptions – and, in particular, can apply to infinite probability spaces.
In ‘The normative role of knowledge’ (2012), Declan Smithies defends a ‘JK-rule’ for belief: One has justification to believe that P iff one has justification to believe that one is in a position to know that P. Similar claims have been defended by others (Huemer 2007; Reynolds, forthcoming). In this paper, I shall argue that the JK-rule is false. The standard and familiar way of arguing against putative rules for belief or assertion is, of course, to describe putative counterexamples. My argument, though, won’t be like this – indeed I doubt that there are any intuitively compelling counterexamples to the JK-rule. Nevertheless, the claim that there are counterexamples to the JK-rule can, I think, be given something approaching a formal proof. My primary aim here is to sketch this proof. I will briefly consider some broader implications for how we ought to think about the epistemic standards governing belief and assertion.
Evaluative judgements have both belief-like and desire-like features. While cognitivists think that they can easily explain the belief-like features, and have trouble explaining the desire-like features, non-cognitivists think the reverse. I argue that the belief-like features of evaluative judgement are quite complex, and that these complexities crucially affect the way in which an agent's values explain her actions, and hence the desire-like features. While one form of cognitivism can, it turns out that non-cognitivism cannot, accommodate all of these complexities. The upshot is that that form of cognitivism can explain both features of evaluative judgements, and that non-cognitivism can explain neither.
Despite continuing controversies regarding the vital status of both brain-dead donors and individuals who undergo donation after circulatory death (DCD), respecting the dead donor rule (DDR) remains the standard moral framework for organ procurement. The DDR increases organ supply without jeopardizing trust in transplantation systems, reassuring society that donors will not experience harm during organ procurement. While the assumption that individuals cannot be harmed once they are dead is reasonable in the case of brain-dead protocols, we argue that the DDR is not an acceptable strategy to protect donors from harm in DCD protocols. We propose a threefold alternative to justify organ procurement practices: (1) ensuring that donors are sufficiently protected from harm; (2) ensuring that they are respected through informed consent; and (3) ensuring that society is fully informed of the inherently debatable nature of any criterion to declare death.
This article was conceived as a sequel to “The Humean Theory of Motivation.” The paper addresses various challenges to the standard account of the explanation of intentional action in terms of desire and means-end belief, challenges that didn’t occur to me when I wrote “The Humean Theory of Motivation.” I begin by suggesting that the attraction of the standard account lies in the way in which it allows us to unify a vast array of otherwise diverse types of action explanation. I go on to consider a range of other challenges to the standard account of the explanation of action: Rosalind Hursthouse’s challenge based on the possibility of what she calls “arational” actions (Hursthouse 1991); Michael Stocker’s challenge based on the idea that some explanations of action are nonteleological (Stocker 1981); Mark Platts’s challenge based on the idea that our evaluative beliefs can sometimes explain our actions all by themselves (Platts 1981); a voluntarist challenge based on the possibility of explaining actions by the exercise of self-control; and a challenge from Jonathan Dancy based on the idea that reasons can themselves sometimes explain actions all by themselves (Dancy 1994).
Say that two goals are normatively coincident just in case one cannot aim for one goal without automatically aiming for the other. While knowledge and justification are distinct epistemic goals, with distinct achievement conditions, this paper begins from the suggestion that they are nevertheless normatively coincident—aiming for knowledge and aiming for justification are one and the same activity. A number of surprising consequences follow from this—both specific consequences about how we can ascribe knowledge and justification in lottery cases and more general consequences about the nature of justification and the relationship between justification and evidential probability. Many of these consequences turn out to be at variance with conventional, prevailing views.
Alexander Miller objects to the argument for moral judgement internalism that I provide in _The Moral Problem_. Miller's objection suggests a misunderstanding of the argument. In this reply I take the opportunity to restate the argument in slightly different terms, and to explain why Miller's objection betrays a misunderstanding.
Oxford Handbooks offer authoritative and up-to-date surveys of original research in a particular subject area. Specially commissioned essays from leading figures in the discipline give critical examinations of the progress and direction of debates. Oxford Handbooks provide scholars and graduate students with compelling new perspectives upon a wide range of subjects in the humanities and social sciences. The Oxford Handbook of Contemporary Philosophy is the definitive guide to what's going on in this lively and fascinating subject. Jackson and Smith, themselves two of the world's most eminent philosophers, have assembled more than thirty distinguished scholars to contribute incisive and up-to-date critical surveys of the principal areas of research. The coverage is broad, with sections devoted to moral philosophy, social and political philosophy, philosophy of mind and action, philosophy of language, metaphysics, epistemology, and philosophy of the sciences. This Handbook will be a rich source of insight and stimulation for philosophers, students of philosophy, and for people working in other disciplines of the humanities, social sciences, and sciences, who are interested in the state of philosophy today.
Tom Stoppard’s “Rosencrantz and Guildenstern Are Dead” opens with a puzzling scene in which the title characters are betting on coin throws and observe a seemingly astonishing run of 92 heads in a row. Guildenstern grows uneasy and proposes a number of unsettling explanations for what is occurring. Then, in a sudden change of heart, he appears to suggest that there is nothing surprising about what they are witnessing, and nothing that needs any explanation. He says ‘…each individual coin spun individually is as likely to come down heads as tails and therefore should cause no surprise each time it does.’ In this article I argue that Guildenstern is right – there is nothing surprising about throwing 92 heads in a row. I go on to consider the relationship between surprise, probability and belief.
The standard of proof applied in civil trials is the preponderance of evidence, often said to be met when a proposition is shown to be more than 50% likely to be true. A number of theorists have argued that this 50%+ standard is too weak – there are circumstances in which a court should find that the defendant is not liable, even though the evidence presented makes it more than 50% likely that the plaintiff’s claim is true. In this paper, I will recapitulate the familiar arguments for this thesis, before defending a more radical one: The 50%+ standard is also too strong – there are circumstances in which a court should find that a defendant is liable, even though the evidence presented makes it less than 50% likely that the plaintiff’s claim is true. I will argue that the latter thesis follows naturally from the former once we accept that the parties in a civil trial are to be treated equally. I will conclude by sketching an alternative interpretation of the civil standard of proof.
My concern in this paper is with the claim that knowledge is a mental state – a claim that Williamson places front and centre in Knowledge and Its Limits. While I am not by any means convinced that the claim is false, I do think it carries certain costs that have not been widely appreciated. One source of resistance to this claim derives from internalism about the mental – the view, roughly speaking, that one’s mental states are determined by one’s internal physical state. In order to know that something is the case it is not, in general, enough for one’s internal physical state to be a certain way – the wider world must also be a certain way. If we accept that knowledge is a mental state, we must give up internalism. One might think that this is no cost, since much recent work in the philosophy of mind has, in any case, converged on the view that internalism is false. This thought, though, is too quick. As I will argue here, the claim that knowledge is a mental state would take us to a view much further from internalism than anything philosophers of mind have converged upon.
Some contemporary theories treat phenomena like weakness of will, compulsion and wantonness as practical failures but not as failures of rationality: say, as failures of autonomy. Other current theories (the majority) see the phenomena as failures of rationality but not as distinctively practical failures. They depict them as always involving a theoretical deficiency: a sort of ignorance, error, inattention or illogic. They represent them as failures which are on a par with breakdowns of theoretical reason; the failures may not have exact theoretical analogues, exact analogues in the breakdown of belief, but they are of essentially the same, cognitive kind. Our approach gives us quite a different view of things. The pathologies which we identify in our taxonomy are distinctively rational failures and distinctively practical failures; they are failures of pure practical reason.