What is structural rationality? Specifically, what is the distinctive feature of structural requirements of rationality? Some philosophers have argued, roughly, that the distinctive feature of structural requirements is coherence. But what does coherence mean, exactly? Or, at least, what do structuralists about rationality have in mind when they claim that structural rationality is coherence? This issue matters for making progress in various active debates concerning rationality. In this paper, I analyze three strategies for figuring out what coherence means in the debates on structural rationality. I argue that these strategies face problems.
A conjunction of two hypotheses may provide a better explanation than either one of them individually, even if each already provides a good explanation on its own. An appropriate measure of explanatory power should reflect this, but none of the measures discussed in the literature do so because they only consider how much an explanatory hypothesis reduces our surprise at the evidence – which is problematic. This chapter introduces and defends a class of coherentist measures of explanatory power, and shows that one particular such measure that combines different intuitions underlying the epistemological notion of coherence overcomes the aforementioned problems of surprise reduction measures.
Coherence considerations play an important role in science and in everyday reasoning. However, it is unclear what exactly is meant by coherence of information and why we prefer more coherent information over less coherent information. To answer these questions, we first explore how to explicate the dazzling notion of “coherence” and how to measure the coherence of an information set. To do so, we critique prima facie plausible proposals that incorporate normative principles such as “Agreement” or “Dependence” and then argue that the coherence of an information set is best understood as an indicator of the truth of the set under certain conditions. Using computer simulations, we then show that a new probabilistic measure of coherence that combines aspects of the two principles above, but without strictly satisfying either principle, performs particularly well in this regard.
In the early twentieth century, Uchiyama Gudō, Seno’o Girō, Lin Qiuwu, and others advocated a Buddhism that was radical in two respects. Firstly, they adopted a more or less naturalist stance with respect to Buddhist doctrine and related matters, rejecting karma or other supernatural beliefs. And secondly, they held political and economic views that were radically anti-hegemonic, anti-capitalist, and revolutionary. Taking the idea of such a “radical Buddhism” seriously, A Buddha Land in This World: Philosophy, Utopia, and Radical Buddhism asks whether it is possible to develop a philosophy that is simultaneously naturalist, anti-capitalist, Buddhist, and consistent. Rather than a study of radical Buddhism, then, this book is an attempt to radicalize it. The foundations of this “radicalized radical Buddhism” are provided by a realist interpretation of Yogācāra, elucidated and elaborated with some help from thinkers in the broader Tiantai/Tendai tradition and American philosophers Donald Davidson and W.V.O. Quine. A key implication of this foundation is that only this world and only this life are real, from which it follows that if Buddhism aims to alleviate suffering, it has to do so in this world and in this life. Twentieth-century radical Buddhists (as well as some engaged Buddhists) came to a similar conclusion, often expressed in their aim to realize “a Buddha land in this world.” Building on this foundation, but also on Mahāyāna moral philosophy, this book argues for an ethics and social philosophy based on a definition of evil as that which is or should be expected to cause death or suffering. On that ground, capitalism should indeed be rejected, but utopianism must be treated with caution as well, which raises questions about what it means – from a radicalized radical Buddhist perspective – to aim for a Buddha land in this world.
Vindicating the claim that agents ought to be consistent has proved to be a difficult task. Recently, some have argued that we can use accuracy-dominance arguments to vindicate the normativity of such requirements. But what do these arguments prove, exactly? In this paper, I argue that we can make a distinction between two theses on the normativity of consistency: the view that one ought to be consistent and the view that one ought to avoid being inconsistent. I argue that accuracy-dominance arguments for consistency support the latter view, but not necessarily the former. I also argue that the distinction between these two theses matters in the debate on the normativity of epistemic rationality. Specifically, the distinction suggests that there are interesting alternatives to vindicating the strong claim that one ought to be consistent.
Epistemically immodest agents take their own epistemic standards to be among the most truth-conducive ones available to them. Many philosophers have argued that immodesty is epistemically required of agents, notably because being modest entails a problematic kind of incoherence or self-distrust. In this paper, I argue that modesty is epistemically permitted in some social contexts. I focus on social contexts where agents with limited cognitive capacities cooperate with each other (like juries).
The book is devoted to the problem of justification from the classical internalist standpoint in epistemology. It re-examines the traditional problems of the internalist framework in the broader context of the developments in the field over the last fifty years. The central problems of the study concern the conditions of justification understood as the outcome of the epistemic evaluation of beliefs by means of evidence and reasons. Among them are the problem of the nature of evidence and its place in the procedure of epistemic evaluation, the problems of good and sufficient reasons, and the problem of the regress of reasons.
Reliabilism is an intuitive and attractive view about epistemic justification. However, it has many well-known problems. I offer a novel condition on reliabilist theories of justification. This method coherence condition requires that a method be appropriately tested by appeal to a subject’s other belief-forming methods. Adding this condition to reliabilism provides a solution to epistemic circularity worries, including the bootstrapping problem.
In a series of papers, Adam Leite has developed a novel view of justification tied to being able to responsibly justify a belief. Leite touts his view as faithful to our ordinary practice of justifying beliefs, providing a novel response to an epistemological problem of the infinite regress, and resolving the “persistent interlocutor” problem. Though I find elements of Leite’s view of being able to justify a belief promising, I hold that there are several problems afflicting the overall picture of justification. In this paper, I argue that despite its ambitions, Leite’s view fails to solve the persistent interlocutor problem and does not avoid a vicious regress.
For many epistemologists, and for many philosophers more broadly, it is axiomatic that rationality requires you to take the doxastic attitudes that your evidence supports. Yet there is also another current in our talk about rationality. On this usage, rationality is a matter of the right kind of coherence between one's mental attitudes. Surprisingly little work in epistemology is explicitly devoted to answering the question of how these two currents of talk are related. But many implicitly assume that evidence-responsiveness guarantees coherence, so that the rational impermissibility of incoherence will just fall out of the putative requirement to take the attitudes that one's evidence supports, and so that coherence requirements do not need to be theorized in their own right, apart from evidential reasons. In this paper, I argue that this is a mistake, since coherence and evidence-responsiveness can in fact come into conflict. More specifically, I argue that in cases of misleading higher-order evidence, there can be a conflict between believing what one's evidence supports and satisfying a requirement that I call “inter-level coherence”. This illustrates why coherence requirements and evidential reasons must be separated and theorized separately.
Phenomenal conservatism as developed by some philosophers faces a previously unnoticed problem. The problem stems from the fact that, as some develop the view, phenomenal conservatism holds that seemings alone justify—sensations have no justificatory impact. Given this, phenomenal conservatism faces a problem analogous to the isolation objection to coherentism. As foundationalists, supporters of phenomenal conservatism will want to allow that the isolation objection is effective against coherentism, and yet claim that a similar objection is not effective against their view. Unfortunately, it appears that on most understandings of the nature of seemings phenomenal conservatism can only avoid its version of the isolation objection by sacrificing its internalist character.
In the Transcendental Dialectic (KrV, A 599–600/B 627–628), Kant presents the argument of the hundred talers as a concrete example of his general claim against conceiving existence as a real predicate. According to Kant, the content of concepts can be completely determined as merely possible content; in the existential judgment, the subject then relates the completely determined content of his internal thoughts with perception: it is only through perception that the subject knows the content of his concepts as real things of the world. Thus, although in his epistemology conceptual activity plays a crucial role in perceptual activity, Kant still offers an empiricist account of empirical knowledge. Hegel criticizes Kant's theory of perception by distinguishing representation (Vorstellen) from comprehension (Begreifen) and by developing on that basis a more complex theory of the relation between concept, existence, and empirical knowledge. From the standpoint of representation, concept and existence, being-determined and being-real exclude each other; in comprehension, on the contrary, representation and perception are no longer unilateral forms of knowing: for the mind that comprehends what it perceives, the existence of what it perceives is – when abstractly considered as such – only a collateral product of the way it knows the real world. According to Hegel, comprehension develops the internal necessity of the contents of knowledge and, by doing so, grasps the individual objects that constitute our world.
Several authors suggest that understanding and epistemic coherence are tightly connected. Using an account of understanding that makes no appeal to coherence, I explain away the intuitions that motivate this position. I then show that the leading coherentist epistemologies only place plausible constraints on understanding insofar as they replicate my own account’s requirements. I conclude that understanding is only superficially coherent.
Coherence is the property of propositions hanging or fitting together. Intuitively, adding a proposition to a set of propositions should be compatible with either increasing or decreasing the set’s degree of coherence. In this paper we show that probabilistic coherence measures based on relative overlap are in conflict with this intuitive verdict. More precisely, we prove that according to the naive overlap measure it is impossible to increase a set’s degree of coherence by adding propositions and that according to the refined overlap measure no set’s degree of coherence exceeds the degree of coherence of its maximally coherent subset. We also show that this result carries over to all other subset-sensitive refinements of the naive overlap measure. As both results stand in sharp contrast to elementary coherence intuitions, we conclude that extant relative overlap measures of coherence are inadequate.
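Since the abstract above turns on the formal behavior of relative overlap measures, a minimal sketch may help readers unfamiliar with them. The naive overlap measure scores a pair of propositions by the ratio of the probability of their conjunction to the probability of their disjunction; the toy distribution below is hypothetical and purely illustrative, not taken from the paper:

```python
# A minimal sketch of the naive relative-overlap measure of coherence,
# computed over a hypothetical toy probability space: worlds are pairs
# of truth values for two atomic propositions A and B.
P = {
    (True, True): 0.3,    # A and B both true
    (True, False): 0.2,   # A true, B false
    (False, True): 0.1,   # A false, B true
    (False, False): 0.4,  # both false
}

def prob(event):
    """Probability of the set of worlds where `event` holds."""
    return sum(p for world, p in P.items() if event(world))

A = lambda w: w[0]
B = lambda w: w[1]

# Naive overlap measure: P(A & B) / P(A or B) -- the share of the
# propositions' combined probability mass on which they agree.
overlap = prob(lambda w: A(w) and B(w)) / prob(lambda w: A(w) or B(w))
print(overlap)  # 0.5
```

The paper's result, in these terms, is that enlarging a set of propositions can never increase this ratio, contrary to the intuition that added propositions should be able to raise coherence.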
In “What price coherence?”, Klein and Warfield put forward a simple argument that triggered an extensive debate on the epistemic virtues of coherence. As is well-known, this debate yielded far-reaching impossibility results to the effect that coherence is not conducive to truth, even if construed in a ceteris paribus sense. A large part of the present paper is devoted to a re-evaluation of these results. As is argued, all explications of truth-conduciveness leave out an important aspect: while it might not be the case that coherence is truth-conducive, it might be conducive to verisimilitude or epistemic utility. Unfortunately, it is shown that the answer to both of these questions must again be in the negative. Furthermore, we shift the focus from sets of beliefs to particular beliefs: as is shown, neither is any of the extant probabilistic measures of coherence truth-conducive on the level of particular beliefs, nor does weakening these measures to quasi-orderings establish the link between coherence and truth for a substantial number of measures. All in all, the results in this paper cast serious doubt on the approach of establishing a link between coherence and truth. Finally, recent arguments that shift the focus from the relationship between coherence and truth to the one between coherence and confirmation are assessed.
The proposition that Tweety is a bird coheres better with the proposition that Tweety has wings than with the proposition that Tweety cannot fly. This relationship of contrastive coherence is the focus of the present paper. Based on recent work in formal epistemology, we consider various possibilities for modeling this relationship by means of probability theory. In a second step we consider different applications of these models. Among other things, we offer a coherentist interpretation of the conjunction fallacy.
Once upon a time, coherentism was the dominant response to the regress problem in epistemology, but in recent decades the view has fallen into disrepute: now almost everyone is a foundationalist (with a few infinitists sprinkled here and there). In this paper, I sketch a new way of thinking about coherentism, and show how it avoids many of the problems often thought fatal for the view, including the isolation objection, worries over circularity, and concerns that the concept of coherence is too vague or metaphorical for serious theoretical use. The key to my approach is to take a familiar tool from discussions of the regress problem, namely directed graphs depicting the support relations between beliefs, and to use that tool in a more sophisticated manner than it is standardly employed.
A common objection to coherence theories of justification comes from belief revision processes: in a system of knowledge, perceptual beliefs seem to bear more importance than other members of the coherent set do. They are more stable in the face of confronting evidence, and may be preserved despite their degrading effect on the coherence properties of the system. This appears to be inconsistent with coherentism, according to which beliefs cannot possess independent credibility. In order to abide by the coherence theory, one must explain the stability of perceptual beliefs in belief revision in a manner that does not rely on foundationalist premises. A suggestion about the personal justification of perceptual beliefs in terms of coherence is presented in the paper to explain their stability in belief revision processes. The coherence of perceptual beliefs and a network account of knowledge are advocated in order to avoid weak foundationalism and to provide a new perspective on the normative problems of epistemic justification.
Plantinga argues that cases involving ‘fixed’ beliefs refute the coherentist thesis that a belief’s belonging to a coherent set of beliefs suffices for its having justification (warrant). According to Plantinga, a belief cannot be justified if there is a ‘lack of fit’ between it and its subject’s experiences. I defend coherentism by showing that if Plantinga means to claim that any ‘lack of fit’ destroys justification, his argument is obviously false. If he means to claim that significant ‘lack of fit’ destroys justification, his argument suffers a critical lack of support. Either way, Plantinga’s argument fails and coherentism emerges unscathed.
One of the integral parts of Bayesian coherentism is the view that the relation of ‘being no less coherent than’ is fully determined by the probabilistic features of the sets of propositions to be ordered. In the last one and a half decades, a variety of probabilistic measures of coherence have been put forward. However, there is large disagreement as to which of these measures best captures the pre-theoretic notion of coherence. This paper contributes to the debate on coherence measures by considering three classes of adequacy constraints. Various independence and dependence relations between the members of each class will be taken into account in order to reveal the ‘grammar’ of probabilistic coherence measures. Afterwards, existing proposals are examined with respect to this list of desiderata. Given that for purely mathematical reasons there can be no measure that satisfies all constraints, the grammar allows the coherentist to articulate an informed pluralist stance as regards probabilistic measures of coherence.
The debate on probabilistic measures of coherence has focused on evaluating sets of consistent propositions. In this paper we draw attention to the largely neglected question of whether such measures concur with intuitions on test cases involving inconsistent propositions and whether they satisfy general adequacy constraints on coherence and inconsistency. While it turns out that, for the vast majority of measures in their original shape, this question must be answered in the negative, we show that it is possible to adapt many of them in order to improve their performance.
As one of the first modern philosophers, Georg Simmel systematically developed a “relativistic world view” (Simmel 2004, VI). In this paper I attempt to examine Simmel’s relativistic answer to the question of truth. I trace his main arguments regarding the concept of truth and present his justification of epistemic relativism. In doing so, I also want to show that some of Simmel’s claims are surprisingly timely. Simmel’s relativistic concept of truth is supported by an evolutionary argument. The first part of this paper outlines that pragmatic foundation of his epistemology. The second part of the paper shows that Simmel develops what today would be called a coherence theory of truth. He presents his coherentist view that every belief is true only in relation to another one primarily as a theory of epistemic justification. The third part turns to Simmel’s original way of dealing with the (in)famous self-refutation charge against relativism.
In Word and Object, Quine explicitly proposes a definition of the notion of stimulus meaning in behaviorist terms. This characterization is driven by his naturalism, which consists in making philosophy a part of empirical science. It is nevertheless generally accepted that science aims to describe phenomena, whereas, since the linguistic turn, philosophy concentrates on language. Under these conditions, it is legitimate to try to determine the logical status that should be assigned to the notion of stimulus meaning in Quine's work. To this end, we present three quite distinct linguistic procedures found in the literature. We then sketch the theoretical framework within which the notion of stimulus meaning is situated, before proposing a rational reconstruction of it in neurophysiological terms. With these considerations in place, we show that Quine's characterization of this notion constitutes not a definition but an explication.
Many philosophers think that requirements of rationality are “wide-scope”. That is to say: they are requirements to satisfy some material conditional, such that one counts as satisfying the requirement iff one either makes the conditional’s antecedent false or makes its consequent true. These contrast with narrow-scope requirements, where the requirement takes scope only over the consequent of the conditional. Many of the philosophers who have preferred wide-scope requirements to narrow-scope requirements have also endorsed a corresponding semantic claim, namely that ordinary talk about rationality, despite appearances to the contrary, expresses wide-scope claims. In doing so, they seek to avoid attributing massive error to ordinary speakers. However, it is becoming increasingly clear that the wide-scope semantics inadequately captures the meaning of ordinary talk about rationality. It seems, then, that we are left with a dilemma: either give up the view that requirements of rationality are wide-scope, or accept an implausible semantics for ordinary talk about rationality, or attribute massive error to speakers. In this paper, I argue that this dilemma is only apparent, since we can appeal to a standard kind of contextualist semantics for modals to explain why narrow-scope talk comes out true in virtue of the wide-scope requirements. My view, then, combines wide-scoping about the explanatorily fundamental requirements of rationality with a contextualist variant of a narrow-scope semantics. I argue that this view gives us the best of both worlds, as well as solving related puzzles and challenges for the extant views in the literature.
In this paper we show that the coherence measures of Olsson (J Philos 94:246–272, 2002), Shogenji (Log Anal 59:338–345, 1999), and Fitelson (Log Anal 63:194–199, 2003) satisfy the two most important adequacy requirements for the purpose of assessing theories. Following Hempel (Synthese 12:439–469, 1960), Levi (Gambling with truth, New York, A. A. Knopf, 1967), and recently Huber (Synthese 161:89–118, 2008) we require, as minimal or necessary conditions, that adequate assessment functions favor true theories over false theories and true and informative theories over true but uninformative theories. We then demonstrate that the coherence measures of Olsson, Shogenji, and Fitelson satisfy these minimal conditions if we confront the hypotheses with a separating sequence of observational statements. In the concluding remarks we set out the philosophical relevance, and limitations, of the formal results. Inter alia, we discuss the problematic implications of our precondition that competing hypotheses must be confronted with a separating sequence of observational statements, which also leads us to discuss theory assessment in the context of scientific antirealism.
A brief account is proposed of epistemological models that try to unfold the intertheoretic context of theory change. It is argued that all of them have a host of drawbacks, the most salient being the lack of an adequate description of the process by which research traditions interact. An epistemological model of mature theory change that eliminates this drawback is contemplated and illustrated.
I argue that coherence is truth-conducive in that coherence implies an increase in the probability of truth. Central to my argument is a certain principle for transitivity in probabilistic support. I then address a question concerning the truth-conduciveness of coherence as it relates to (something else I argue for) the truth-conduciveness of consistency, and consider how the truth-conduciveness of coherence bears on coherentist theories of justification.
Recently there have been several attempts in formal epistemology to develop an adequate probabilistic measure of coherence. There is much to recommend probabilistic measures of coherence. They are quantitative and render formally precise a notion—coherence—notorious for its elusiveness. Further, some of them do very well, intuitively, on a variety of test cases. Siebel, however, argues that there can be no adequate probabilistic measure of coherence. Take some set of propositions A, some probabilistic measure of coherence, and a probability distribution such that all the probabilities on which A’s degree of coherence depends (according to the measure in question) are defined. Then, the argument goes, the degree to which A is coherent depends solely on the details of the distribution in question and not at all on the explanatory relations, if any, standing between the propositions in A. This is problematic, the argument continues, because, first, explanation matters for coherence, and, second, explanation cannot be adequately captured solely in terms of probability. We argue that Siebel’s argument falls short.
The debate on probabilistic measures of coherence has been flourishing for about 15 years now. It was initiated by papers published around the turn of the millennium, and many different proposals have since been put forward. This contribution is partly devoted to a reassessment of extant coherence measures. Focusing on a small number of reasonable adequacy constraints, I show that (i) there can be no coherence measure that satisfies all constraints, and that (ii) subsets of these adequacy constraints motivate two different classes of coherence measures. These classes do not coincide with the common distinction between coherence as mutual support and coherence as relative set-theoretic overlap. Finally, I put forward arguments to the effect that for each such class of coherence measures there is an outstanding measure that outperforms all other extant proposals. One of these measures has recently been put forward in the literature, while the other one is based on a novel probabilistic measure of confirmation.
Striving for a probabilistic explication of coherence, scholars have proposed a distinction between agreement and striking agreement. In this paper I argue that only the former should be considered a genuine concept of coherence. In a second step, the relation between coherence and reliability is assessed. I show that it is possible to concur with common intuitions regarding the impact of coherence on reliability in various types of witness scenarios by means of an agreement measure of coherence. Highlighting the need to separate the impact of coherence and specificity on reliability, it is finally shown that a recently proposed vindication of the Shogenji measure qua measure of coherence vanishes.
Coherentism maintains that coherent beliefs are more likely to be true than incoherent beliefs, and that coherent evidence provides more confirmation of a hypothesis when the evidence is made coherent by the explanation provided by that hypothesis. Although probabilistic models of credence ought to be well-suited to justifying such claims, negative results from Bayesian epistemology have suggested otherwise. In this essay we argue that the connection between coherence and confirmation should be understood as a relation mediated by the causal relationships among the evidence and a hypothesis, and we offer a framework for doing so by fitting together probabilistic models of coherence, confirmation, and causation. We show that the causal structure among the evidence and hypothesis is sometimes enough to determine whether the coherence of the evidence boosts confirmation of the hypothesis, makes no difference to it, or even reduces it. We also show that, ceteris paribus, it is not the coherence of the evidence that boosts confirmation, but rather the ratio of the coherence of the evidence to the coherence of the evidence conditional on a hypothesis.
I develop a probabilistic account of coherence, and argue that at least in certain respects it is preferable to (at least some of) the main extant probabilistic accounts of coherence: (i) Igor Douven and Wouter Meijs’s account, (ii) Branden Fitelson’s account, (iii) Erik Olsson’s account, and (iv) Tomoji Shogenji’s account. Further, I relate the account to an important, but little discussed, problem for standard varieties of coherentism, viz., the “Problem of Justified Inconsistent Beliefs”.
The most pressing difficulty coherentism faces is, I believe, the problem of justified inconsistent beliefs. In a nutshell, there are cases in which our beliefs appear to be both fully rational and justified, and yet the contents of the beliefs are inconsistent, often knowingly so. This fact contradicts the seemingly obvious idea that a minimal requirement for coherence is logical consistency. Here, I present a solution to one version of this problem.
Some recent work in formal epistemology shows that “witness agreement” by itself implies neither an increase in the probability of truth nor a high probability of truth—the witnesses need to have some “individual credibility.” It can seem that, from this formal epistemological result, it follows that coherentist justification (i.e., doxastic coherence) is not truth-conducive. I argue that this does not follow. Central to my argument is the thesis that, though coherentists deny that there can be noninferential justification, coherentists do not deny that there can be individual credibility.
A measure of coherence is said to be reliability conducive if and only if a higher degree of coherence (as measured) results in a higher likelihood that the witnesses are reliable. Recently, it has been proved that several coherence measures proposed in the literature are reliability conducive in a restricted scenario (Olsson and Schubert 2007, Synthese 157:297–308). My aim is to investigate which coherence measures turn out to be reliability conducive in the more general scenario where it is any finite number of witnesses that give equivalent reports. It is shown that only the so-called Shogenji measure is reliability conducive in this scenario. I take that to be an argument for the Shogenji measure being a fruitful explication of coherence.
A measure of coherence is said to be reliability conducive if and only if a higher degree of coherence (as measured) of a set of testimonies implies a higher probability that the witnesses are reliable. Recently, it has been proved that the Shogenji measure of coherence is reliability conducive in restricted scenarios (e.g., Olsson and Schubert, Synthese, 157:297–308, 2007). In this article, I investigate whether the Shogenji measure, or any other coherence measure, is reliability conducive in general. An impossibility theorem is proved to the effect that this is not the case. I conclude that coherence is not reliability conducive.
Let us by ‘first-order beliefs’ mean beliefs about the world, such as the belief that it will rain tomorrow, and by ‘second-order beliefs’ let us mean beliefs about the reliability of first-order belief-forming processes. In formal epistemology, coherence has been studied, with much ingenuity and precision, for sets of first-order beliefs. However, to the best of our knowledge, sets including second-order beliefs have not yet received serious attention in that literature. In informal epistemology, by contrast, sets of the latter kind play an important role in some respectable coherence theories of knowledge and justification. In this paper, we extend the formal treatment of coherence to second-order beliefs. Our main conclusion is that while extending the framework to second-order beliefs sheds doubt on the generality of the notorious impossibility results for coherentism, another problem crops up that might be no less damaging to the coherentist project: facts of coherence turn out to be epistemically accessible only to agents who have a good deal of insight into matters external to their own belief states.
M. Rosengren developed doxology as an ‘other’ take on epistemology: a teaching about how we actually create the knowledge we need. He has chosen to call his epistemic stance doxological in order to emphasise that all knowledge is doxic knowledge, thus turning the seminal Platonic distinction between doxa (beliefs, opinions) and episteme (objective, eternal knowledge) upside down.
If a subject’s belief system is inconsistent, does it follow that the subject’s beliefs (all of them) are unjustified? It seems not. But coherentist theories of justification (at least some of them) imply otherwise, and so, it seems, are open to counterexample. This is the “Problem of Justified Inconsistent Beliefs”. I examine two main versions of the Problem of Justified Inconsistent Beliefs and argue that coherentists can give at least a promising line of response to each of them.
Focused correlation compares the degree of association within an evidence set to the degree of association in that evidence set given that some hypothesis is true. A difference between the confirmation lent to a hypothesis by one evidence set and the confirmation lent to that hypothesis by another evidence set is robustly tracked by a difference in focused correlations of those evidence sets on that hypothesis, provided that all the individual pieces of evidence are equally, positively relevant to that hypothesis. However, that result depends on a very strong equal relevance condition on individual pieces of evidence. In this essay, we prove tracking results for focused correlation analogous to Wheeler and Scheines’s results but for cases involving unequal relevance. Our result is robust as well, and we retain conditions for bidirectional tracking between incremental confirmation measures and focused correlation.
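To make the quantity concrete, here is a minimal sketch (my own illustration, not code from the paper) of the two-witness form of focused correlation, assuming the standard definition as the ratio of the conditional to the unconditional association among the evidence:

```python
# Hedged sketch of focused correlation for two pieces of evidence,
# assuming For_H(E1, E2) = cor(E1, E2 | H) / cor(E1, E2), where
# cor(E1, E2) = P(E1 & E2) / (P(E1) * P(E2)).

def cor(p_e1, p_e2, p_e1e2):
    """Association ratio: joint probability over product of marginals."""
    return p_e1e2 / (p_e1 * p_e2)

def focused_correlation(uncond, given_h):
    """Each argument is a triple (P(E1), P(E2), P(E1 & E2)):
    `uncond` unconditionally, `given_h` conditional on the hypothesis H."""
    return cor(*given_h) / cor(*uncond)

# Made-up numbers: the association among the evidence increases once H is
# assumed, so the focused correlation exceeds 1.
print(focused_correlation((0.5, 0.5, 0.3), (0.6, 0.6, 0.5)))
```

The probabilities here are placeholders; in a real application they would come from a joint distribution over the evidence and the hypothesis.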
A measure of coherence is said to be reliability conducive if and only if a higher degree of coherence (as measured) among testimonies implies a higher probability that the witnesses are reliable. Recently, it has been proved that several coherence measures proposed in the literature are reliability conducive in scenarios of equivalent testimonies (Olsson and Schubert 2007; Schubert, to appear). My aim is to investigate which coherence measures turn out to be reliability conducive in the more general scenario where the testimonies do not have to be equivalent. It is shown that four measures are reliability conducive in the present scenario, all of which are ordinally equivalent to the Shogenji measure. I take that to be an argument for the Shogenji measure being a fruitful explication of coherence.
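As a concrete illustration (my own sketch, not drawn from the abstracts above), the Shogenji measure compares the probability of the conjunction of the reported propositions with the product of their marginal probabilities; values above 1 indicate positive coherence:

```python
# Shogenji coherence measure, computed from an explicit joint
# probability table: C_S(E1,...,En) = P(E1 & ... & En) / (P(E1)*...*P(En)).

def shogenji(joint, events):
    """joint: dict mapping a tuple of truth values (one entry per
    variable) to its probability; events: indices of the variables
    whose coherence is being measured."""
    # probability of the conjunction: all listed variables true
    conj = sum(p for world, p in joint.items()
               if all(world[i] for i in events))
    # product of the marginal probabilities
    marg = 1.0
    for i in events:
        marg *= sum(p for world, p in joint.items() if world[i])
    return conj / marg

# Two perfectly correlated testimonies: coherence above 1.
joint = {(True, True): 0.4, (False, False): 0.6}
print(shogenji(joint, [0, 1]))  # 0.4 / (0.4 * 0.4) = 2.5
```

For probabilistically independent testimonies the measure equals exactly 1, which is what makes it a natural baseline for "no coherence".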
Some philosophers, most notably Hempel and Salmon, have tried to reduce explanation to probability by proposing analyses of explanation in probabilistic terms. Hempel claims, roughly, that a hypothesis H explains a datum D if and only if the conditional probability P(D|H) is close to 1. It is well known that such an account fails in cases where H is irrelevant for D. Even though it is highly likely that Tom will not become pregnant, given that he regularly takes his wife’s birth control pills, the latter does not explain the former. Neither does an idea work which is in the proximity of Salmon’s, namely, that H explains D if and only if P(D|H) > P(D). Suppose Susan swallows a pound of arsenic in order to commit suicide. Shortly after, however, she dies because she is run over by a bus. The probability of dying, given that one ingests a pound of arsenic, is usually higher than the prior probability of dying. Nonetheless, it is not the arsenic but the collision with the bus which explains Susan’s death. The aforementioned objections are directed against ….
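The arsenic case can be put in toy numbers (purely illustrative, invented here rather than taken from the paper) to show why mere statistical relevance is not sufficient for explanation:

```python
# H = "Susan ingests a pound of arsenic", D = "Susan dies shortly after".
# Both probabilities are made up for illustration.
p_d = 0.001          # prior probability of D
p_d_given_h = 0.95   # assumed value of P(D | H)

# Salmon-style statistical-relevance condition: P(D|H) > P(D)
relevant = p_d_given_h > p_d
print(relevant)  # True: the condition holds, yet it is the bus
                 # collision, not the arsenic, that explains D
```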
Can a perceptual experience justify (epistemically) a belief? More generally, can a nonbelief justify a belief? Coherentists answer in the negative: Only a belief can justify a belief. A perceptual experience can cause a belief but cannot justify a belief. Coherentists eschew all noninferential justification—justification independent of evidential support from beliefs—and, with it, the idea that justification has a foundation. Instead, justification is holistic in structure. Beliefs are justified together, not in isolation, as members of a coherent belief system. The main question of the paper is whether coherentism is consistent. I set out an apparent inconsistency in coherentism and then give a resolution to that apparent inconsistency.