Edmund Gettier’s three-page article is generally regarded as a classic of epistemology. I argue that Gettier cases depend upon three false assumptions and are irrelevant to the theory of knowledge. I suggest that we follow Karl Popper in abandoning subject-centred epistemologies in favour of theories of objective knowledge.
I distinguish arguments from arguing and explain some important logical features of arguments. I then explain how philosophers have been misled, apparently by Euclid, into giving seriously mistaken accounts of arguing, and give a few examples. I then offer a seven-step guide on how to argue. After that, I conclude.
A process of intelligence gathering begins when a user enters a query into the system. Several objects can match the result of a query with different degrees of relevance. Most systems estimate a numeric value indicating how well each object matches the query and rank objects according to this value. Much research has focused on practices of intelligence gathering. In knowledge engineering, knowledge gathering consists in finding knowledge in structured and unstructured sources and representing it in a way that facilitates inference. DOI: 10.13140/RG.2.2.32191.15527.
In this chapter, we defend an explanationist version of proper functionalism. After explaining proper functionalism’s initial appeal, we note two major objections to proper functionalism: creatures with no design plan who appear to have knowledge (Swampman) and creatures with malfunctions that increase reliability. We then note how proper functionalism needs to be clarified because there are cases of what we call warrant-compatible malfunction. We then formulate our own view: explanationist proper functionalism, which explains the warrant-compatible malfunction cases and helps to block the above objections. We also advance a positive argument for explanationist proper functionalism.
According to pragmatic encroachment, whether an epistemic attitude towards p has some positive epistemic status (e.g., whether a belief is epistemically rational or justified, or whether it amounts to knowledge) partially depends on practical factors such as the costs of being wrong or the practical goals of the agent. Pragmatic encroachment comes in many varieties. This survey article provides an overview of different kinds of pragmatic encroachment. It focuses on three dimensions along which kinds of pragmatic encroachment differ: the type of epistemic status affected by practical factors, the type of practical factors affecting the epistemic status, and the type of normative considerations encroaching on the epistemic status.
Recent research has identified a tension between the Safety principle that knowledge is belief without risk of error, and the Closure principle that knowledge is preserved by competent deduction. Timothy Williamson reconciles Safety and Closure by proposing that when an agent deduces a conclusion from some premises, the agent’s method for believing the conclusion includes their method for believing each premise. We argue that this theory is untenable because it implies problematically easy epistemic access to one’s methods. Several possible solutions are explored and rejected.
The threshold problem is the task of adequately answering the question: “Where does the threshold lie between knowledge and lack thereof?” I start this paper by articulating two conditions for solving it. The first is that the threshold be neither too high nor too low; the second is that the threshold accommodate the significance of knowledge. In addition to explaining these conditions, I also argue that it is plausible that they can be met. Next, I argue that many popular accounts of knowledge cannot meet them. In particular, I lay out a number of problems that standard accounts of knowledge face in trying to meet these conditions. Finally, near the end of this paper, I argue that there is one sort of account that seems to evade these problems. This sort of account, which is called a cluster account of knowledge, says that knowledge is to be accounted for in terms of truth, belief and a cluster of epistemic properties and also that knowledge doesn’t require having all members of the cluster, but merely some subset.
Despite recent controversies surrounding the principle that knowledge entails truth (KT), this paper aims to prove that the principle is true. It offers a proof of (KT) in the following sense. It advances a deductively valid argument for (KT), whose premises are, by most lights, obviously true. Moreover, each premise is buttressed by at least two supporting arguments. And finally, all premises and supporting arguments can be rationally accepted by people who don’t already accept (KT).
Recently, a number of cases have been proposed which seem to show that – contrary to widely held opinion – a subject can inferentially come to know some proposition p from an inference which relies on an essential false belief q. The standard response to these cases is to insist that there is really an additional true belief in the vicinity, making the false belief inessential. I present a new kind of case suggesting that a subject can inferentially come to know a proposition from an essential false belief where no truth in the vicinity seems to be present.
I argue that knowledge can be seen as a quality standard that governs our sharing and storing of information. This standard satisfies certain functional needs, namely it allows us to share and store trustworthy information more easily. I argue that this makes knowledge a social kind, similar in important ways to other social kinds like money. This provides us with a way of talking about knowledge without limiting ourselves to the concept of knowledge. I also outline three ways in which this view of knowledge can shed light on familiar epistemological problems: it can explain why knowledge is the norm of assertion, it can help us carve out the harm associated with testimonial injustice, and it can provide us with a clear analysis of the dangers associated with spreading misinformation.
Methods have been a controversial element in theories of knowledge for the last 40 years. Recent developments in theories of justification, concerning the identification and individuation of belief-forming processes, can shed new light on methods, solving some longstanding problems in the theory of knowledge. We needn’t and shouldn’t shy away from methods; rather, methods, construed as psychological processes of belief-formation, need to play a central role in any credible theory of knowledge.
In this paper, I offer some reasons for thinking that knowledge is a social phenomenon. My argument is based on Helen Longino’s work on scientific knowledge, in particular her 2002 book The Fate of Knowledge. Longino’s basic idea is that a scientific hypothesis or theory is justified when it emerges (relatively) unscathed from social interactions between scientists. If we accept – as Longino and many others do – that knowledge requires justification, it follows that scientific knowledge is a social phenomenon. I argue that much the same goes for other forms of empirical knowledge, such as historical knowledge, various forms of collective knowledge and our knowledge of the ‘social world’. But does it go for empirical knowledge in general? I offer some – admittedly far from conclusive – reasons for thinking that it might. I also argue that, even if it doesn’t, we still need to distinguish between two kinds of knowledge, one of which is constitutively social and the other of which isn’t. Scientific knowledge is a paradigm example of the former and simple perceptual knowledge is a paradigm example of the latter. The end result is a form of pluralism about knowledge.
The developing body of empirical work on the "Gettier effect" indicates that, in general, the presence of a Gettier-type structure in a case makes participants less likely to attribute knowledge in that case. But is that a sufficient reason to diverge from a JTB theory of knowledge? I argue that considerations of good model selection, and worries about noise and overfitting, should lead us to consider that a live, open question. The Gettier effect may be so transient, and so sensitive to other, epistemologically inappropriate factors, that it raises the question of whether it ought to be counted as something to include in our theories, or as a piece of noise to be excluded from them.
This research intends to show a Kantian influence in Transcendental Thomism, particularly in Rahner, Lonergan and Von Balthasar. What is meant by a Kantian influence is a certain attitude regarding the problem of the universals, an attitude which is radically different from St. Thomas’. In my previous work (The Radical Difference between Aquinas and Kant: Human Understanding and the Agent Intellect in Aquinas [Chillum: IVE Press, 2021]), the radical difference between St. Thomas and Kant was shown. In this present research, what is argued is that Rahner, Lonergan and Von Balthasar follow Kant, not St. Thomas, with regard to the analysis of human understanding. From each Transcendental Thomist author mentioned above, I have taken one sample text, one most significant work where each author’s epistemology can be explored. Thus, I have selected Rahner’s Spirit in the World, Lonergan’s Verbum articles and Von Balthasar’s Theologic I.
Modal epistemologies aim to explicate the necessary link between belief and truth that constitutes knowledge. This strain of epistemological theorizing is typically externalist; hence, it does not require that the agent know or understand the nature of the knowledge-constituting link. A central concern of modal epistemology is to articulate conditions on knowing such that no merely lucky true belief counts as knowledge. In the effort to eliminate luck, epistemic principles are often cast modally, requiring that an agent’s belief is true not only in the actual world but also in relevant possible worlds, indicating that the link between truth and belief is more than an actual world lucky coincidence. (Note, then, that this entry is not about the epistemology of modals—statements involving modal operators such as “necessarily,” “possibly,” and the like—but about the use of modal principles in characterizing the nature of knowledge in general. For purposes of disambiguation and in deference to the other topic, perhaps “modalized epistemologies” would be a better term, though it is not typically used here.) Modal epistemologies typically have antiskeptical consequences, but the strengths of the antiskeptical results vary significantly, especially between the two best-known modal principles, sensitivity and safety.
Kant’s theory of cognition aimed to explain the possibility of scientific knowledge. Aesthetics and life science were not considered by Kant in the context of cognition. By contrast, Cassirer set himself the philosophical task of extending Kant’s theory of cognition to all forms of culture, including pre-scientific knowledge and aesthetics. The present study demonstrates how Cassirer explained the possibility of different objective forms, which he called symbolic, by employing and transforming Kant’s theory of cognition. To this end, Cassirer took the following steps: he modified the definitions of a priori synthesis (the act of understanding) and pure intuition (the forms of space and time), the main building blocks of Kant’s account of cognition; indicated the necessary correlation of intuition and synthesis; and characterized a priori synthesis and intuition as notions which include contradictory meanings. Cassirer called this contradiction “twofold oppositions”, a characteristic of a priori synthesis. The first argument of the article is that the possibility of various synthetic acts is rooted in the nature of a priori synthesis, which carries together two different meanings: the act of uniting elements and the initial unity. One synthetic act forms the world of nature and is connected to scientific space and time; the other is the product of immediate perceptional space and time, from which the world of myth and aesthetics appears. Thus, Cassirer expanded the scope of “pure” synthesis. The second argument is that Cassirer specified a priori synthesis and pure intuition as functional concepts. The functional concept belongs to the model of the concept as relation that Cassirer had elaborated. It includes moments that are separated and united simultaneously. This definition of the concept breaks the rules of consistency. The concept as relation justifies the contradictory characteristics of a priori synthesis and pure intuition, which include both the combination of moments in a synthesizing act and the initial unity of intuition.
This book is the first volume featuring the work of American women philosophers in the first half of the twentieth century. It provides selected papers authored by Mary Whiton Calkins, Grace Andrus de Laguna, Grace Neal Dolson, Marjorie Glicksman Grene, Marjorie Silliman Harris, Thelma Zemo Lavine, Marie Collins Swabey, Ellen Bliss Talbot, Dorothy Walsh and Margaret Floy Washburn. The book also provides the historical and philosophical background to their work. The papers focus on the nature of philosophy, knowledge, the philosophy of science, the mind-matter nexus, the nature of time, and the question of freedom and the individual. The material is suitable for scholars, researchers and advanced philosophy students interested in (history of) philosophy; theories of knowledge; philosophy of science; mind, and reality.
Without regularity the world would be nothing but chaos. It is the regularity in our behaviour and in our representations that allows us to anticipate what may or may not happen. The source of this regularity is not, however, easy to establish without postulating entities such as “laws of nature” or “moral maxims”, in other words, “abstract rules” whose source, form and implementation seem to escape us. In this essay we renounce the quest for these elusive abstract rules that would exist independently of us. Everything general, from physical laws to ethical principles, would fall under normative statements emanating from a more or less diffuse authority: normative statements that manage to take root in our societies by imposing a way of seeing the world and of acting in it. The remaining regularities, starting with those shaped by our linguistic knowledge, would be the result of spontaneous processes unfolding either at the level of the individual (individual rules) or at the level of the collective (conventions).
Depending on whether we are somewhat tolerant of nearby error-possibilities or not, the safety condition on knowledge is open to a strong reading and a weak reading. In this paper, it is argued that induction and conjunction introduction constitute two horns of a dilemma for the safety account of knowledge. If we opt for the strong reading, then the safety account fails to account for inductive knowledge. In contrast, if we opt for the weak reading, then the safety account fails to accommodate knowledge obtained via the method of conjunction introduction.
Modalists think that knowledge requires forming your belief in a “modally stable” way: using a method that wouldn't easily go wrong, or using a method that wouldn't have given you this belief had it been false. Recent Modalist projects from Justin Clarke-Doane and Dan Baras defend a principle they call “Modal Security,” roughly: if evidence undermines your belief, then it must give you a reason to doubt the safety or sensitivity of your belief. Another recent Modalist project from Carlotta Pavese and Bob Beddor defends “Modal Virtue Epistemology”: knowledge is a belief that is maximally modally robust across “normal” worlds. We'll offer new objections to these recent Modalist projects. We will then argue for a rival view, Explanationism: knowing something is believing it because it's true. We will show how Explanationism offers a better account of undermining defeaters than Modalism, and a better account of knowledge.
This essay presents a unified account of safety, sensitivity, and severe testing. S’s belief is safe iff, roughly, S could not easily have falsely believed p, and S’s belief is sensitive iff were p false S would not believe p. These two conditions are typically viewed as rivals but, we argue, they instead play symbiotic roles. Safety and sensitivity are both valuable epistemic conditions, and the relevant alternatives framework provides the scaffolding for their mutually supportive roles. The relevant alternatives condition holds that a belief is warranted only if the evidence rules out relevant error possibilities. The safety condition helps distinguish relevant from irrelevant possibilities. The sensitivity condition captures ‘ruling out’. Safety, sensitivity, and the relevant alternatives condition are typically presented as conditions on warranted belief or knowledge. But these properties, once generalised, help characterise other epistemic phenomena, including warranted inference, legal verdicts, scientific claims, reaching conclusions, addressing questions, warranted assertion, and the epistemic force of corroborating evidence. We introduce and explain Mayo’s severe testing account of statistical inference. A hypothesis is severely tested to the extent it passes tests that probably would have found errors, were they present. We argue Mayo’s account is fruitfully understood using the resulting relevant alternatives framework. Recasting Mayo’s condition using the conceptual framework of contemporary epistemology helps forge fruitful connections between two research areas—philosophy of statistics and the analysis of knowledge—not currently in sufficient dialogue. The resulting union benefits both research areas.
Many defend the thesis that when someone knows p, they couldn’t easily have been wrong about p. But the notion of easy possibility in play is relatively undertheorized. One structural idea in the literature, the principle of Counterfactual Closure (CC), connects easy possibility with counterfactuals: if it easily could have happened that p, and if p were the case, then q would be the case, it follows that it easily could have happened that q. We first argue that while CC is false, there is a true restriction of it to cases involving counterfactual dependence on a coin flip. The failure of CC falsifies a model where the easy possibilities are counterfactually similar to actuality. Next, we show that extant normality models, where the easy possibilities are the sufficiently normal ones, are incompatible with the restricted CC thesis involving coin flips. Next, we develop a new kind of normality theory that can accommodate the restricted version of CC. This new theory introduces a principle of Counterfactual Contamination, which says roughly that any world is fairly abnormal if at that world very abnormal events counterfactually depend on a coin flip. Finally, we explain why coin flips and other related events have a special status. A central take-home lesson is that the correct principle in the vicinity of Safety is importantly normality-theoretic rather than (as it is usually conceived) similarity-theoretic.
Several anti-debunkers have argued that evolutionary explanations of our moral beliefs fail to meet a necessary condition on undermining defeat called modal security. They conclude that evolution, therefore, does not debunk our moral beliefs. This article shows that modal security is false if knowledge is virtuous achievement. New information can undermine a given belief without providing reason to doubt that that belief is sensitive or safe. This leads to a novel conception of undermining defeat, and it shows that successful debunking of moral realism is possible.
I discuss three understandings of the idea of “Knowledge First Epistemology”, i.e. Timothy Williamson’s suggestion that we should take knowledge as a starting point, rather than trying to analyze it. Some have taken this to be a suggestion about the role of the concept of knowledge, but Williamson also seems to be concerned with intuition-based metaphysics. As an alternative, I develop the idea that knowledge may be a social kind that can be understood through a functional analysis in the tradition of Edward Craig.
This essay uses a puzzle about assertion and time to explore the pragmatics, semantics, and epistemology of future discourse. The puzzle concerns cases in which a subject is in a position to say, at an initial time t, that it will be that ϕ, but is not in a position to say, at a later time t′, that it is or was that ϕ, despite not losing or gaining any relevant evidence between t and t′. We consider a number of approaches to the puzzle and defend the view that subjects in these cases lose knowledge simply by moving through time.
It has been argued that an advantage of the safety account over the sensitivity account is that the safety account preserves epistemic closure, while the sensitivity account implies epistemic closure failure. However, the argument fails to take the method-relativity of the modal conditions on knowledge, viz., sensitivity and safety, into account. In this paper, I argue that the sensitivity account and the safety account are on a par with respect to epistemic closure once the method-relativity of the modal conditions is taken into account. Therefore, epistemic closure is no longer an arbiter in the debate.
The safety condition is supposed to be a necessary condition on knowledge which helps to eliminate epistemic luck. It has been argued that the condition should be globalized to a set of propositions, rather than just the target proposition believed, in order to account for why not all beliefs in necessary truths are safe. A remaining issue is which propositions are relevant when evaluating whether the target belief is safe or not. In the literature, solutions have been proposed to determine the relevance of propositions. This paper examines a case of luckily true belief—thus a case of ignorance—and a case of knowledge. It argues that no solution in the literature offers a correct verdict in either case. Therefore, the strategy to globalize safety remains unsatisfactory.
In recent decades, the problem of post-truth has emerged. Values such as fairness, objectivity, and critical dialogue have become more difficult to achieve. Various characteristics are associated with this, such as the emergence of new technologies and a new era in political relations with the rise of fundamentalism and populism. Moreover, references to postmodernism are commonplace in the bibliography on the subject. Considering this, the article’s main objective is to philosophically analyze the purported theoretical foundation of post-truth: postmodernism. From the methodological point of view, this theoretical study takes the interpretive approach as a reference. Interpretive hermeneutical criticism has been combined with a documentary analysis of the main works that address this problem. The article explains the main characteristics of the concept, considering the current and notorious interpretation, and then interprets the position that criticizes postmodernism as the theoretical basis of the post-truth era. It concludes that the relationship between post-truth and its theoretical foundation has a dogmatic and contradictory character, since it confronts subjectivist relativism with the dogma of a realist metaphysics.
We often claim to know what might be—or probably is—the case. Modal knowledge along these lines creates a puzzle for information-sensitive semantics for epistemic modals. This paper develops a solution. We start with the idea that knowledge requires safe belief: a belief amounts to knowledge only if it could not easily have been held falsely. We then develop an interpretation of the modal operator in safety that allows it to non-trivially embed information-sensitive contents. The resulting theory avoids various paradoxes that arise from other accounts of modal knowledge. It also delivers plausible predictions about modal Gettier cases.
Two major arguments have been advanced for the claim that there is a transmission failure in G. E. Moore’s famous proof of an external world. The first argument, due to Crispin Wright, is based on an epistemological doctrine now known as “conservatism.” Proponents of the second argument, like Nicholas Silins, invoke probabilistic considerations, most important among them Bayes’ theorem. The aim of this essay is to defend Moore’s proof against these two arguments. It is shown, first, that Wright’s argument founders because one of its premises, viz., conservatism, invites skepticism and must therefore be rejected. Then the probabilistic argument is challenged, yet not because its formal part is dubious, but rather on the grounds that it incorporates as an implicit premise an unconvincing philosophical claim. Finally, the most promising objection to dogmatism – understood here as the negation of conservatism – is repudiated.
This book studies the philosophy of Clarence I. Lewis (1883-1964), an author little known in the history of thought. In his own time, however, he was one of the most popular philosophers in the United States. Several ideas of his celebrated students (Quine, Goodman, Sellars) are already present in his writings. When the neopositivists arrived in America, they found in him not only an advocate of pragmatism but also an interlocutor interested in their topics and versed in their logical methodology. The study offers a detailed analysis of his epistemological position. His understanding of the given, in the pre-epistemological sense, as immediate experience (against the myth of the given), his functionalist conception of the a priori (pragmatically necessary, but without privileged ontological status), and his conception of knowledge as activity (interpretation as interaction, against the copy theory), fallible yet capable of securing the truth necessary for practical life, make for a fascinating subject of study today.
“Noverim me, noverim te.” – Saint Augustine, Confessions, 10.1.1 (397-400 AD).

What would and will an urban commons look like that is slowly and incrementally being re-socialized? How would that affect urban planning “now” and in times of crisis? How do we prepare for the likelihood of rolling similar crises with an eye on returning the urban commons to citizens?

There is the old adage that under capitalism, risk is always socialized and profit is always privatized. We are seeing it now, under the COVID-19 crisis. The huge bailouts launched by governments are symptomatic of the crisis in political economy, just as they were post-2008. “Too big to fail” has sponsored monsters that refuse to back off without threatening the collapse of the entire system. Francisco Goya’s “The sleep of Reason produces monsters …” comes to mind.

Physical and immaterial culture, in our current Western civilisation, are intimately linked. Yet the focus for urban design is generally on the material or physical side, with the immaterial left to its own devices. Increasingly, urban design measures are merely ameliorative and aesthetic, with the larger share shaped by a political economy dictated by market ideology or “politicalology.” What transpires, nonetheless, is an immaterial commons that constitutes a public or private intellectual commons – often a mix of the two; but, in the case of domination by market ideology, the privatization of “general intellect” proceeds by abject appropriation. In such a technocratically driven model, subjective states become increasingly important. As Indian architect Balkrishna Vithaldas Doshi once said, “Smart cities are smart people.”

How might these two otherwise contiguous and synchronous systems be brought back into a properly civic-minded rapport with or without crisis-driven change? Are there alternate models for the urban commons? What measures might be put in place in advance, or as provisional intercessions?
This book provides readers with an introduction to epistemology within the Buddhist intellectual tradition. It is designed to be accessible to those whose primary background is in the “Western” tradition of philosophy and who have little or no previous exposure to Buddhist philosophical writings. The book examines many of the most important topics in the field of epistemology, topics that are central both to contemporary discussions of epistemology and to the classical Buddhist tradition of epistemology in India and Tibet. Among the topics discussed are Buddhist accounts of: the nature of knowledge episodes, the defining conditions of perceptual knowledge and of inferential knowledge, the status of testimonial knowledge, and skeptical criticisms of the entire project of epistemology. The book seeks to put the field of Buddhist epistemology in conversation with contemporary debates in philosophy. It shows that many of the arguments and debates occurring within classical Buddhist epistemological treatises coincide with the arguments and disagreements found in contemporary epistemology. The book shows, for example, how Buddhist epistemologists developed an anti-luck epistemology—one that is linked to a sensitivity requirement for knowledge. Likewise, the book explores the question of how the study of Buddhist epistemology can be of relevance to contemporary debates about the value of contributions from experimental epistemology, and to broader debates concerning the use of philosophical intuitions about knowledge.
Kant and Aquinas: who can doubt they are different? And yet there are some who equate Aquinas and Kant on doctrines in which they are actually opposed; some attribute to St. Thomas Aquinas approaches that are Kantian and by no means Thomistic. They make these mistakes by misinterpreting or misusing Aquinas’ texts. This paper intends to clarify the radical difference between the approaches of Aquinas and Kant to human knowledge. In my view, we need first of all to understand the problem of the universals, which is the basic problem of human understanding and, in a sense, the only problem of philosophy. Second, we need to understand the stance of both philosophers on this same problem. Only then will we be able to see what the difference between Aquinas and Kant is, and how radical it is.
Recently, John Turri (2015b) has argued, contra the orthodoxy amongst epistemologists, that reliability is not a necessary condition for knowledge. From this result, Turri (2015a, 2016a, 2017, 2019) defends a new account of knowledge, called abilism, that allows for unreliable knowledge. I argue that Turri's arguments fail to establish that unreliable knowledge is possible, and that his account of knowledge is false because reliability must be a necessary condition for knowledge.
Neil Levy defends no-platforming people who espouse dangerous or unacceptable views. I reject his notion of higher-order evidence as authoritarian and dogmatic. I argue that no-platforming frustrates the growth of knowledge.
Many theorists hold that outright verdicts based on bare statistical evidence are unwarranted. Bare statistical evidence may support high credence, on these views, but does not support outright belief or legal verdicts of culpability. The vignettes that constitute the lottery paradox and the proof paradox are marshalled to support this claim. Some theorists argue, furthermore, that examples of profiling also indicate that bare statistical evidence is insufficient for warranting outright verdicts. I examine Pritchard's and Buchak's treatments of these three kinds of case. Pritchard argues that his safety condition explains the insufficiency of bare statistical evidence for outright verdicts in each of the three cases, while Buchak argues that her treatment of the distinction between credence and belief explains this. In these discussions the three kinds of cases – lottery, proof paradox, and profiling – are treated alike. The cases are taken to exhibit the same epistemic features. I identify significant overlooked epistemic differences amongst these three cases; these differences cast doubt on Pritchard's explanation of the insufficiency of bare statistical evidence for outright verdicts. Finally, I raise the question of whether we should aim for a unified explanation of the three paradoxes.
The epistemology of risk examines how risks bear on epistemic properties. A common framework for examining the epistemology of risk holds that strength of evidential support is best modelled as numerical probability given the available evidence. In this essay I develop and motivate a rival ‘relevant alternatives’ framework for theorising about the epistemology of risk. I describe three loci for thinking about the epistemology of risk. The first locus concerns the consequences of relying on a belief for action, where those consequences are significant if the belief is false. The second locus concerns whether beliefs themselves—regardless of action—can be risky, costly, or harmful. The third locus concerns epistemic risks we confront as social epistemic agents. I aim to motivate the relevant alternatives framework as a fruitful approach to the epistemology of risk. I first articulate a ‘relevant alternatives’ model of the relationship between stakes, evidence, and action. I then employ the relevant alternatives framework to undermine the motivation for moral encroachment. Finally, I argue that the relevant alternatives framework illuminates epistemic phenomena such as gaslighting, conspiracy theories, and crying wolf, and I draw on the framework to diagnose the undue skepticism endemic to rape accusations.
In this paper we give reasons to think that reflective epistemic subjects cannot possess mere animal knowledge. To do so we bring together literature on defeat and higher-order evidence with literature on the distinction between animal knowledge and reflective knowledge. We then defend our argument from a series of possible objections.
The problem of synthetic judgements touches on the question of whether philosophy can make independent statements about reality in the first place. For Kant, synthetic judgements a priori formulate the conditions of the possibility of objectively valid knowledge. Despite the in-principle fallibility of its statements, modern science aims for objective knowledge. This gives the topic of the synthetic a priori continuing currency. This paper aims to show that a modernized version of transcendental philosophy, if it is to be feasible at all, must “bid farewell” to the concept of being “free of empiricism”, or the “purity” of the a priori. Approaches to this end can already be found in Kant’s reflections on non-pure synthetic knowledge. Moreover, the a priori validity of knowledge does not exclude the possibility that it can be discovered empirically. In keeping with Kant, Fries and Nelson anticipated this separation (usually first attributed to Reichenbach) between the contexts of validity and discovery of knowledge and pointed out that the a priori could be discovered empirically, but never proven. There are currently still good reasons why transcendental philosophical concepts are of fundamental importance for modern science, although it must not be overlooked that even within the framework of a modernized transcendental philosophy, several unsolved problems remain or are raised: for example, the irredeemability of the a priori’s claims to universal validity and necessity, and the problem of a clear demarcation between the phenomenal and noumenal worlds. Moreover, the “beautiful structure” of the Kantian system, which constituted its persuasive power, is lost.
In this paper, we propose a general constraint on theories of knowledge that we call ‘normalism’. Normalism is a view about the epistemic threshold that separates knowledge from mere true belief; its basic claim is that one knows only if one has at least a normal amount of epistemic support for one’s belief. We argue that something like normalism is required to do full justice to the normative role of knowledge in many key everyday practices, such as assertion, inquiry, and testimony. The view of normality we employ to flesh out this claim is inspired by experimental work on the folk notion of normality, which suggests that folk judgments of what is ‘normal’ are based on both statistical averages and normative ideals within the relevant target domain. Adopting this notion of normality to set the threshold for knowledge results in a view on which knowledge is routinely available on an everyday basis without being a merely trivial achievement. We explore several interesting consequences of this view, including the implication that the threshold for knowing may change as, e.g., the ease of availability of information in an epistemic community changes over time. The result is a ‘shifty’ view of knowledge which nonetheless retains more stability than standard contextualist or pragmatic encroachment approaches.
The purpose of this paper is to provide a new solution to the radical sceptical paradox. A sceptical paradox purports to reveal an inconsistency among our fundamental epistemological commitments, each of which is seemingly plausible. Typically, sceptics employ an intuitively appealing epistemic principle (e.g., the closure principle, the underdetermination principle) to derive the sceptical conclusion. This paper will reveal a dilemma intrinsic to the sceptical paradox, which I refer to as the self-hollowing problem of radical scepticism. That is, on the one hand, if the sceptical conclusion turns out to be true, then the epistemic principle employed by sceptics would lose its foundation of plausibility; on the other hand, if the sceptical conclusion does not follow, then the sceptical problem would not arise. In either case, the so-called sceptical paradox cannot be a genuine paradox. This new solution has three theoretical merits: it is undercutting, less theory-laden, and widely applicable.
Several authors have recently argued against intellectualism, the view that one’s epistemic position with respect to p depends exclusively on one’s truth-relevant factors with respect to p. In this paper, I first examine two prominent arguments for the anti-intellectualist position and find both of them wanting. More precisely, I argue that these arguments, by themselves, are underdetermined between intellectualism and anti-intellectualism. I then bring out the intuitive plausibility of intellectualism by examining the ordinary conversational pattern of challenging a claim.
Mainstream philosophy has seen a recent flowering of discussions of intellectualism, which revisit Gilbert Ryle’s famous distinction between ‘knowing how’ and ‘knowing that’ and challenge his argument that the former cannot be reduced to the latter. These debates so far appear not to have engaged with pragmatist philosophy in any substantial way, which is curious, as the relation between theory and practice is one of pragmatism’s main themes. Accordingly, this paper examines the contemporary debate in the light of Charles Peirce’s habit-based epistemology. We argue both that knowing-that can be understood as a particularly sophisticated form of knowing-how, and that all bodily competencies—if sufficiently deliberately developed—can be analysed as instantiating propositional structure broadly conceived. In this way, intellectualism and anti-intellectualism are seen to be not opposed, and both true, although Peirce’s original naturalistic account of propositional structure does lead him to reject what we shall call ‘linguistic intellectualism’.
A presentation of Kant's idea of the enlightenment process that was underway at the time. I try to be as objective as is needed to give a thorough explanation of the main subject of this process. Kant explains the main idea of enlightenment and illustrates it with examples that were descriptive and understandable for that period.