Starting with Kant’s undeveloped proposal of a “negative science,” the author describes how philosophy may be developed and strengthened by means of a systematic approach that seeks to identify and eliminate a widespread but seldom recognized form of systemic and propagating conceptual error. The paper builds upon the author’s book, CRITIQUE OF IMPURE REASON: HORIZONS OF POSSIBILITY AND MEANING (Studies in Theory and Behavior, 2021). The author’s purpose is twofold: first, to enable us to recognize the boundaries of what is referentially forbidden—the limits beyond which reference becomes meaningless—and second, to avoid falling victim to a certain broad class of conceptual confusions that lie at the heart of many major philosophical problems. By realizing these objectives, the boundaries of possible meaning are determined.
Abstract. Traditional epistemology of knowledge and belief can be succinctly characterized as JTB-epistemology, i.e., it is characterized by the thesis that knowledge is justified true belief. Since Gettier’s trail-blazing paper of 1963 this account has come under heavy attack. The aim of this paper is to study the Gettier problem and related issues in the framework of topological epistemic logic. It is shown that in the framework of topological epistemic logic Gettier situations necessarily occur for most topological models of knowledge and belief. On the other hand, there exists a special class of topological models (based on so-called nodec spaces) for which traditional JTB-epistemology is valid. Further, it is shown that for each topological model of Stalnaker’s combined logic KB of knowledge and belief a canonical JTB-model (its JTB-doppelganger) can be constructed that shares many structural properties with the original model but is free of Gettier situations. The topological model and its JTB-doppelganger share the same justified belief operator and have very similar knowledge operators. Seen from a somewhat different perspective, the JTB-account of epistemology amounts to a simplification of a more general epistemological account of knowledge and belief that assumes that these two concepts may differ in some cases. The JTB-account of knowledge and belief assumes that the epistemic agent’s cognitive powers are rather large; as a result, Gettier cases do not occur in JTB-epistemology. Finally, it is shown that for all topological models of Stalnaker’s KB-logic, Gettier situations are topologically characterized as nowhere dense situations. This entails that Gettier situations are epistemologically invisible in the sense that they can neither be known nor believed with justification with respect to the knowledge operator and the belief operator of the models involved.
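The topological notions at work in this abstract can be illustrated on a tiny finite space: in topological epistemic logic, knowledge is standardly interpreted as the interior operator, and a set is nowhere dense when the interior of its closure is empty. The following minimal sketch uses a toy space and toy sets of our own choosing, not anything from the paper itself:

```python
# Knowledge as topological interior, on a tiny illustrative finite space
# (the space and the evaluated sets are our own toy choices, not the paper's).
X = frozenset({1, 2, 3})
# A topology on X: contains the empty set and X, closed under union/intersection.
OPENS = [frozenset(), frozenset({1}), frozenset({1, 2}), X]

def interior(a):
    """Union of all open sets contained in a: the 'knowable core' of a."""
    result = frozenset()
    for o in OPENS:
        if o <= a:
            result |= o
    return result

def closure(a):
    return X - interior(X - a)

def nowhere_dense(a):
    """A set is nowhere dense iff the interior of its closure is empty."""
    return interior(closure(a)) == frozenset()

print(interior(frozenset({1, 3})))    # the knowable core of {1, 3} is {1}
print(nowhere_dense(frozenset({3})))  # {3} is nowhere dense: 'invisible'
```

In this toy model the set {3} is nowhere dense, so, in the spirit of the abstract’s final claim, no world has it inside an open neighborhood and it can never fall under the interior (knowledge) operator.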
_Framing effects_ concern the having of different attitudes towards logically or necessarily equivalent contents. Framing is of crucial importance for cognitive science, behavioral economics, decision theory, and the social sciences at large. We model a typical kind of framing, grounded in (i) the structural distinction between beliefs activated in working memory and beliefs left inactive in long-term memory, and (ii) the topic- or subject matter-sensitivity of belief: a feature of propositional attitudes which is attracting growing research attention. We introduce a class of models featuring (i) and (ii) to represent, and reason about, agents whose belief states can be subject to framing effects. We axiomatize a logic which we prove to be sound and complete with respect to the class.
The concept of hypodox is dual to the concept of paradox. Whereas a paradox is incompatibly overdetermined, a hypodox is underdetermined. Indeed, many particular paradoxes have dual hypodoxes. So, naively, the dual of Russell’s Paradox is whether the set of all sets that are members of themselves is self-membered. The dual of the Liar Paradox is the Truth-teller, and a hypodoxical dual of the Heterological paradox is whether ‘autological’ is autological. I provide some analysis of the duality and search for modal hypodoxes as duals to known paradoxes. I also try to continue this search by mapping the relations between some paradoxes and hypodoxes in duality squares. This investigation increases the known extension of the concept of hypodox, and the duality relation between paradoxes and hypodoxes assists in finding some of these.
This short paper offers a skeptical solution to Åqvist's paradox of epistemic obligation. The solution is based on the contention that in SDL/KDT logics the externalist features of knowledge, about which we cannot have obligations, are obscured.
I explore the motivation and logical consequences of the idea that we have some (limited) ability to know contingent facts about the future, even in the presence of the assumption that the future is objectively unsettled or indeterminate. I start by formally characterizing skepticism about the future. This analysis nudges the anti-skeptic towards the idea that if some propositions about the future are objectively indeterminate, then it may be indeterminate whether a suitably positioned agent knows them. Philosophical Perspectives, Volume 35, Issue 1, Pages 50-69, December 2021.
We define a notion of the intelligence level of an idealized mechanical knowing agent. This is motivated by efforts within artificial intelligence research to define real-number intelligence levels of complicated intelligent systems. Our agents are more idealized, which allows us to define a much simpler measure of intelligence level for them. In short, we define the intelligence level of a mechanical knowing agent to be the supremum of the computable ordinals that have codes the agent knows to be codes of computable ordinals. We prove that if one agent knows certain things about another agent, then the former necessarily has a higher intelligence level than the latter. This allows our intelligence notion to serve as a stepping stone to obtain results which, by themselves, are not stated in terms of our intelligence notion (results of potential interest even to readers totally skeptical that our notion correctly captures intelligence). As an application, we argue that these results comprise evidence against the possibility of intelligence explosion (that is, the notion that sufficiently intelligent machines will eventually be capable of designing even more intelligent machines, which can then design even more intelligent machines, and so on).
We study the structure of families of theories in the language of arithmetic extended to allow these families to refer to one another and to themselves. If a theory contains schemata expressing its own truth and expressing a specific Turing index for itself, and contains some other mild axioms, then that theory is untrue. We exhibit some families of true self-referential theories that barely avoid this forbidden pattern.
History-based models suggest a process-based approach to epistemic and temporal reasoning. In this work, we introduce preferences to history-based models. Motivated by game-theoretical observations, we discuss how preferences can be dynamically updated in history-based models. We then consider arrow update logic and event calculus, and give history-based models for these logics. This allows us to relate dynamic logics of history-based models to a broader framework.
Half a century later, a Dretskean stance on epistemic closure remains a minority view. Why? Mainly because critics have successfully poked holes in the epistemologies on which closure fails. However, none of the familiar pro-closure moves works against the counterexamples on display here. It is argued that these counterexamples pose the following dilemma: either accept that epistemic closure principles are false, and steal the thunder from those who attack classical logic on the basis of similarly problematic cases—specifically, relevance logicians and like-minded philosophers—or stick with closure and surrender to relevantist claims of failure in truth-preservation aimed at classical rules of inference. Classicist closure advocates find the promise of a way out of the dilemma in the works of Roy Sorensen and John Hawthorne. The paper argues against their pro-closure move and renews Robert Audi’s call for a theory of closure-failure.
The verb ‘to know’ can be used both in ascriptions of propositional knowledge and ascriptions of knowledge of acquaintance. In the formal epistemology literature, the former use of ‘know’ has attracted considerable attention, while the latter is typically regarded as derivative. This attitude may be unsatisfactory for those philosophers who, like Russell, are not willing to think of knowledge of acquaintance as a subsidiary or dependent kind of knowledge. In this paper we outline a logic of knowledge of acquaintance in which ascriptions like ‘Mary knows Smith’ are regarded as formally interesting in their own right, remaining neutral on their relation to ascriptions of propositional knowledge. The resulting logical framework, which is based on Hintikka’s modal approach to epistemic logic, provides a fresh perspective on various issues and notions at play in the philosophical debate on acquaintance.
In the current debate there are two epistemological approaches to the definition of ignorance: the Standard View and the New View. The former defines ignorance simply as not knowing, while the latter defines it as the absence of true belief. One of the main differences between these two positions lies in rejecting (Standard View) or in accepting (New View) the factivity of ignorance, i.e., if an agent is ignorant of φ, then φ is true. In the present article, we first provide a criticism of the Standard View in favour of the New View. Second, we propose a formal setting to represent the notion of factive ignorance.
In this paper I investigate an alternative to imprecise probabilism. Imprecise probabilism is a popular revision of orthodox Bayesianism: while the orthodox Bayesian claims that a rational agent’s belief-state can be represented by a single credence function, the imprecise probabilist claims instead that a rational agent’s belief-state can be represented by a set of such functions. The alternative that I put forward in this paper is to claim that the expression ‘credence’ is vague, and then apply the theory of supervaluationism to sentences containing this expression. This gives us a viable alternative to imprecise probabilism, and I end by comparing the two accounts. I show that supervaluationism has a simpler way of handling sentences relating the belief-states of two different people, or of the same person at two different times; that both accounts may have the resources to develop plausible decision theories; and finally that the supervaluationist can accommodate higher-order vagueness in a way that is not available to the imprecise probabilist.
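The supervaluationist machinery the abstract appeals to can be sketched in a few lines: a sentence about an agent’s credence is supertrue when it holds under every admissible precisification of the vague term ‘credence’, and indeterminate when precisifications disagree. The credence functions and threshold below are illustrative assumptions, not the paper’s examples:

```python
# Supervaluationism applied to 'credence', sketched: each admissible
# precisification is a single sharp credence function; a sentence is
# supertrue iff it holds on all of them. (The numbers are illustrative.)
admissible = [
    {"rain": 0.6, "sun": 0.4},
    {"rain": 0.7, "sun": 0.3},
    {"rain": 0.55, "sun": 0.45},
]

def determinately(pred):
    """Supertrue: the sentence holds on every admissible precisification."""
    return all(pred(c) for c in admissible)

def indeterminate(pred):
    """Vague: true on some precisifications, false on others."""
    return any(pred(c) for c in admissible) and not determinately(pred)

print(determinately(lambda c: c["rain"] > 0.5))   # holds on all three
print(indeterminate(lambda c: c["rain"] > 0.65))  # holds on only one
```

This mirrors how the imprecise probabilist’s set of credence functions reappears, on the supervaluationist reading, as the set of admissible sharpenings of a vague expression.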
Dogmatism is the view that perceptual experience provides immediate defeasible justification for certain beliefs. The bootstrapping problem for dogmatism is that it sanctions a certain defective form of reasoning that concludes in the belief that one's perceptual faculties are reliable. This paper argues that the only way for the dogmatist to avoid the bootstrapping problem is to claim that epistemic justification fails to have a structural property known as cut. This allows the dogmatist to admit that each step in the defective reasoning considered on its own is acceptable, but when stitched together, these pieces of reasoning are unacceptable (§2). The fact that this is the only plausible solution to the bootstrapping problem is, in one way, bad news. This is because it adds another member to a family of recently uncovered results that show dogmatism is incompatible with certain connections between epistemic justification and probabilities (§3). But I try to make the best of it on the dogmatist’s behalf. I show that within a certain kind of foundationalist framework, we can make good on this idea that epistemic justification fails to satisfy cut in a way needed to solve the bootstrapping problem (§4–5).
This paper reviews the central points and presents some recent developments of the epistemic approach to paraconsistency in terms of the preservation of evidence. Two formal systems are surveyed, the basic logic of evidence (BLE) and the logic of evidence and truth (LET_J), designed to deal, respectively, with evidence and with evidence and truth. While BLE is equivalent to Nelson’s logic N4, it has been conceived for a different purpose. Adequate valuation semantics that provide decidability are given for both BLE and LET_J. The meanings of the connectives of BLE and LET_J, from the point of view of preservation of evidence, are explained with the aid of an inferential semantics. A formalization of the notion of evidence for BLE as proposed by M. Fitting is also reviewed here. As a novel result, the paper shows that LET_J is semantically characterized through the so-called Fidel structures. Some opportunities for further research are also discussed.
The paper analyzes dynamic epistemic logic from a topological perspective. The main contribution consists of a framework in which dynamic epistemic logic satisfies the requirements for being a topological dynamical system, thus interfacing discrete dynamic logics with continuous mappings of dynamical systems. The setting is based on a notion of logical convergence, shown to be equivalent to convergence in the Stone topology. A flexible, parametrized family of metrics inducing the latter is presented and used as an analytical aid. We show that maps induced by action model transformations are continuous with respect to the Stone topology, and present results on the recurrent behavior of said maps.
In this paper, we discuss Hintikka’s interrogative approach to inquiry with a focus on bracketing. First, we dispute the use of bracketing in the interrogative model of inquiry, arguing that bracketing provides an indispensable component of an inquiry. Then, we suggest a formal system based on strategy logic and the logic of paradox to describe the epistemic aspects of an inquiry, and obtain a naturally paraconsistent system. We then apply our framework to some cases to illustrate its use.
A dynamic epistemic logic is presented in which a single agent can reason about his knowledge stages before and after announcements. The logic is generated by reinterpreting multi-agent private announcements in a single-agent environment. It is shown that a knowability principle is valid for such a logic: any initially true ϕ can be known after a certain number of announcements.
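The basic update mechanism underlying announcement logics can be sketched concretely. The abstract’s system reinterprets private announcements, but the core idea is shared with standard public announcement logic: announcing a fact discards the worlds where it fails, which can turn an initially true but unknown proposition into a known one. The two-world model below is an illustrative toy, not the paper’s construction:

```python
# Single-agent announcement update, sketched on a tiny Kripke model.
# Both worlds are epistemically indistinguishable to the agent (one S5
# equivalence class), so only announcements can create knowledge.
worlds = {"w1": {"p": True}, "w2": {"p": False}}

def knows(model, fact):
    """The agent knows `fact` iff it holds in every world she considers possible."""
    return all(v[fact] for v in model.values())

def announce(model, fact):
    """Announcement of `fact`: discard the worlds where it fails."""
    return {w: v for w, v in model.items() if v[fact]}

print(knows(worlds, "p"))             # False: w2 is still considered possible
updated = announce(worlds, "p")
print(knows(updated, "p"))            # True: p is known after the announcement
```

If p is true at the actual world w1, this run exhibits the knowability pattern the abstract describes: an initially true but unknown ϕ becomes known after an announcement.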
The article studies knowledge in multiagent systems where data available to the agents may have small errors. To reason about such uncertain knowledge, a formal semantics is introduced in which indistinguishability relations, commonly used in the semantics for epistemic logic S5, are replaced with metrics to capture how much two epistemic worlds are different from an agent’s point of view. The main result is a logical system sound and complete with respect to the proposed semantics.
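One natural way to picture replacing indistinguishability relations with metrics is a threshold reading: an agent whose data may be off by up to some error bound cannot rule out any world within that distance of the actual one. The sketch below is our illustrative assumption of such a semantics, not necessarily the paper’s exact truth clause, and the worlds and distances are invented:

```python
# Illustrative metric semantics for knowledge under measurement error.
# Worlds are points on a line; with error bound `eps`, every world within
# distance eps of the actual one remains epistemically possible.
# (The threshold reading is an assumption for illustration.)
WORLDS = {"u": (0.0, {"p": True}), "v": (0.05, {"p": True}), "w": (1.0, {"p": False})}

def knows(actual, fact, eps):
    """`fact` is known at `actual` iff it holds at every world within eps."""
    x0, _ = WORLDS[actual]
    return all(val[fact] for x, val in WORLDS.values() if abs(x - x0) <= eps)

print(knows("u", "p", 0.1))  # small error: only u and v are possible; both satisfy p
print(knows("u", "p", 2.0))  # large error: w is also possible and falsifies p
```

Setting eps to infinity recovers the degenerate S5-style case where all worlds are mutually indistinguishable, which shows how the metric picture refines the relational one.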
This work proposes an understanding of deductive, default and abductive reasoning as different instances of the same phenomenon: epistemic dynamics. It discusses the main intuitions behind each of these reasoning processes, and suggests how they can be understood as different epistemic actions that modify an agent’s knowledge and/or beliefs in different ways, making the discussion formal with the use of the dynamic epistemic logic framework. The ideas in this paper put the studied processes under the same umbrella, thus highlighting their relationship and allowing a better understanding of how they interact together.
This article provides a brief overview of several formal frameworks concerning the relation between knowledge on the one hand and obligation on the other. We discuss the paradox of the knower, knowledge-based obligation, knowingly doing, deontic dynamic epistemology, descriptive obligations, and responsibilities as dynamic epistemology.
In this paper we present a brief overview of logic-based belief change, a research area concerned with the question of how a rational agent ought to change its mind in the face of new, possibly conflicting, information. Our intention is to provide the reader with a basic introduction to the work done in this area over the past 30 years. In doing so we hope to sketch the main historical results, provide appropriate pointers to further references, and discuss some current developments. We trust that this will spur on the interested reader to learn more about the topic, and perhaps to join us in the further development of this exciting field of research.
Multiple contraction (simultaneous contraction by several sentences) and iterated contraction are investigated in the framework of specified meet contraction (s.m.c.) that is extended for this purpose. Multiple contraction is axiomatized, and so is finitely multiple contraction (contraction by a finite set of sentences). Two ways to reduce finitely multiple contraction to contraction by single sentences are introduced. The reduced operations are axiomatically characterized and their properties are investigated. Furthermore, it is shown how iterated contraction can be reduced to single-step, single-sentence contraction. However, in this framework the outcome of iterated contraction depends unavoidably on the order in which the inputs are received. This order-dependence makes it impossible to treat two inputs on an equal footing. Therefore it is often preferable to perform changes involving several pieces of information as multiple rather than iterated change.
We look at two fundamental logical processes, often intertwined in planning and problem solving: inference and update. Inference is an internal process with which we uncover what is implicit in the information we already have. Update, on the other hand, is produced by external communication, usually in the form of announcements and in general in the form of observations, giving us information that might not have been available (even implicitly) before. Both processes have received attention from the logic community, usually separately. In this work, we develop a logical language that allows us to describe them together. We present syntax, semantics and a complete axiom system; we discuss similarities and differences with other approaches and mention how the work can be extended.
When a belief set is contracted, only some beliefs are eligible for removal. By introducing eligibility for removal as a new semantic primitive for contraction and combining it with epistemic entrenchment, we get a contraction operator with a number of interesting properties. By placing some minimal constraint upon eligibility we get an explicit contraction recipe that exactly characterises the so-called interpolation thesis, a thesis that states upper and lower bounds for the amount of information to be given up in contraction. As a result we drop the controversial property of recovery. By placing additional constraints on eligibility we get representation theorems for a number of contraction operators of varying strength. In addition, it is shown that recovery contraction is a special case that we get if eligibility is explicitly constructed in terms of logical relevance.
An account of belief revision is developed which takes account of the cognitive capabilities of human epistemic agents. We begin with an agent's commitment sets, i.e., sets comprised of those sentences which she is both epistemically committed to accepting and which she should be able to cognitively grasp in a manner sufficient for praiseworthy belief revision. Whether an agent is epistemically committed to accepting a sentence depends on those epistemic standards of her epistemic communities which apply in her situation. These standards determine which information is to count as intersubjectively evident in her situation. She must take account of this information if she is to be epistemically responsible. The extent to which it should motivate revision of her commitment sets depends on several factors. First, it depends on the extent to which it coheres with her commitment sets. A sentence coheres with a set if members of that set can serve as premisses in an argument which enables her to infer that sentence on the basis of those premisses, and thus enables her to improve the explanatory and inferential integration of those sets. Second, it depends on the extent to which the members of the commitment set with which it coheres are entrenched in her conception of the world. Entrenchment is understood in terms of the role which the sentences play in the maintenance and improvement of her conception of the world. A sentence's contribution to our conception of the world is determined by considering the contextual effects of that sentence. Contextual effects are those sentences which can be obtained by means of elimination rules when a sentence is integrated with a commitment set, but which could not be obtained from the commitment set alone or from the sentence alone. An elimination rule analyzes or explicates the content of sentences to which it is applied. After addressing various objections to the positions outlined above, we end up with an account of the contribution which the acceptance of a sentence would make to maintaining and improving our conception of the world.
I argue for the moderate probabilist view that probability theory plays much the same role in epistemology as does logic, and so is as indispensable to epistemology as is logic; but probability theory by itself does not constitute a theory of rational degree of belief, just as deductive logic does not by itself constitute a theory of rational belief. I defend a version of Ramsey's view that degrees of belief, which are defined using the notion of mathematical expectation, must obey the laws of probability, on pain of committing an implicit logical error, namely preferring one thing over itself under two different but logically equivalent descriptions. This fact is what licenses speaking of probability theory as an extension of formal logic. I also discuss how the notions of coherence and consistency are related to principles of epistemic rationality. I argue that coherence is best understood as an epistemic ideal, i.e., a quality that the opinion of an ideally rational being who makes no logical errors in forming its preferences would have. For us humans, who are not ideally rational, ideals of reason such as coherence and consistency are things towards which we should strive even though they are not things we can attain. They have substantive normative force for how we manage our opinion, however, since rationality only requires that we approximate epistemic ideals such as coherence as much as is possible for us. To help make sense of the notion of approximation to coherence, I develop a generalized model of degree of belief and give a precise formal explication of the notion of increasing coherence within this framework. I argue that logical improvements of the sort that move a person from lesser to greater coherence cannot account for all epistemically interesting notions, such as justification or warrant.
In particular, I argue that there is little reason to think that the notion of scientific confirmation can be explicated wholly in terms of subjective probabilistic relations between hypotheses and evidence, even if probability theory is liberalized to allow for incoherent degrees of belief.
This paper reorganizes and further develops the theory of partial meet contraction which was introduced in a classic paper by Alchourrón, Gärdenfors, and Makinson. Our purpose is threefold. First, we put the theory in a broader perspective by decomposing it into two layers which can respectively be treated by the general theory of choice and preference and by elementary model theory. Second, we reprove the two main representation theorems of AGM and present two more representation results for the finite case that "lie between" the former, thereby partially answering an open question of AGM. Our method of proof is uniform insofar as it uses only one form of "revealed preference", and it explains where and why the finiteness assumption is needed. Third, as an application, we explore the logic characterizing theory contractions in the finite case which are governed by the structure of simple and prioritized belief bases.
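The partial meet construction the paper reorganizes can be made concrete on a toy case. AGM contraction officially operates on logically closed theories; in the sketch below a three-element finite base stands in, formulas are represented by their sets of satisfying valuations over two atoms, and the base and selection function are illustrative choices of ours, not the paper's examples:

```python
# Partial meet contraction on a toy finite belief base.
# A formula is its set of satisfying valuations over atoms p, q;
# a set of beliefs entails f iff every valuation satisfying all of
# them also satisfies f. (Base and selection are illustrative.)
from itertools import combinations

VALS = [(p, q) for p in (0, 1) for q in (0, 1)]
P      = frozenset(v for v in VALS if v[0])               # p
Q      = frozenset(v for v in VALS if v[1])               # q
P_TO_Q = frozenset(v for v in VALS if not v[0] or v[1])   # p -> q

def entails(beliefs, f):
    ms = set(VALS)
    for b in beliefs:
        ms &= b
    return ms <= f

def remainders(base, f):
    """Maximal subsets of the base that fail to entail f."""
    subs = [frozenset(s) for n in range(len(base) + 1)
            for s in combinations(base, n)]
    no_ent = [s for s in subs if not entails(s, f)]
    return [s for s in no_ent if not any(s < t for t in no_ent)]

def partial_meet(base, f, select):
    """Contract by f: intersect the remainders picked by the selection function."""
    out = frozenset(base)
    for r in select(remainders(base, f)):
        out &= r
    return out

base = [P, Q, P_TO_Q]
print(len(remainders(base, Q)))  # two maximal q-free subsets: {p} and {p -> q}
# Full meet (select every remainder) keeps only their intersection:
print(partial_meet(base, Q, lambda rs: rs) == frozenset())
```

The run illustrates the familiar spectrum: full meet contraction (selecting all remainders) discards both p and p -> q, while a more discriminating selection function would retain one of them, which is exactly the room for "choice and preference" the paper's first layer isolates.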
One of the important issues in research on knowledge-based computer systems is the development of methods for reasoning about knowledge. In the present paper a semantics for knowledge operators is introduced. The underlying logic is developed with epistemic operators relative to indiscernibility. Facts about knowledge expressible in the logic are discussed, in particular common knowledge and joint knowledge of a group of agents. Some paradoxes of epistemic logic are shown to be eliminated in the given system. A formal logical analysis of reasoning about knowledge is a subject of investigation both in logic and computer science, and several epistemic systems have been proposed to formalize the operator ‘an agent knows’. In the present paper we propose a formalization based on a semantic treatment of knowledge within the framework of rough set theory. The inspiration for the underlying epistemic logic came from the analysis of knowledge transfer in distributed systems developed in Orlowska and Sanders and from the author’s earlier work on indiscernibility and relative accessibility semantics.
Logicians generally employ coherence and consistency as synonyms naming the absence of contradictions in a group of sentences, propositions, or beliefs, where a contradiction is the conjunction of a proposition and its negation. In metaphysical terms, logical incoherence or contradiction is the impossible instantiation of a property and some other, incompatible property, as in "the circle was square." Epistemically, a contradiction is an irrational belief in both a proposition and its denial.
In this paper we compare central elements of Dialogue Logic and Belief Revision theory. Dialogue Logic of the Hamblin/Mackenzie style, or Formal Dialectic, contains three main features. First, there is a rule governed interaction between dialogue participants—the minimal case being two participants. Second, each participant has a commitment store which changes as the dialogue progresses. Third, the changes in the commitment store are governed by rules for additions and withdrawals of material. Withdrawal of material is one major source of difficulty in proposing rules for commitment store change. The classic Belief Revision theory is the AGM theory. AGM theory is a theory about ideal rational believers who change their sets of beliefs by either expansion or contraction. Contraction is a major source of difficulty in belief revision theory. We claim that the commitment stores of dialogue logic include, in a sense, the belief sets of belief revision theory. Further, withdrawal and contraction are essentially the same process. We consider various kinds of withdrawal and contraction, and show how approaches to these processes illuminate certain of the formal fallacies. (shrink)