Just started a new book. The aim is to establish a science of knowledge in the same way that we have a science of physics or a science of materials. This might appear an overly ambitious, possibly arrogant, objective, but bear with me. On the day I begin to write it (June 7th, 2020), I think I am in possession of a few things that will help me achieve this objective. Again, bear with me. My aim is well reflected in the title I chose (just now) for this book: Knowledge & Logic: Towards a science of knowledge. Its most important feature is that I shall take logic to be to knowledge science as calculus is to physics or to materials science. I do not intend to reclaim knowledge from the bosom of philosophy, in which, under the name of epistemology, its erudite discussion has hardly progressed since Plato first defined it as true belief with logos. With only a few adjustments, it will actually provide me with the right, science-bound start. More recently, knowledge has been reclaimed by the field of BA, a reclamation that has opened Pandora's box: among the evils, perhaps at the head of the list, is an overly lay, essentially naive, notion of knowledge. But the very idea that one can have something like “knowledge (management) software” puts us on the right track.
Georg Cantor was the genuine discoverer of the Mathematical Infinity, and whatever he claimed, suggested, or even surmised should be taken seriously -- albeit not necessarily at face value. For alongside his ordinal construction, exquisite in its beauty, and his fundamental powerset description of the continuum, Cantor also left us his obsessive presumption that the universe of sets should be subject to laws similar to those governing the set of natural numbers, including the universal principles of cardinal comparability and well-ordering -- and implying an ordinal re-creation of the continuum. During the last hundred years, mainstream set-theoretical research -- all adjustments due to Kurt Gödel's revolutionary insights and discoveries notwithstanding -- has compliantly centered its efforts on ad hoc axiomatizations of Cantor's intuitive transfinite design. We demonstrate here that the ontological and epistemic sustainability of this design has been irremediably compromised by the underlying peremptory, Reductionist mindset of the nineteenth century's ideology of science.
Homo deceptus is a book that brings together new ideas on language, consciousness and physics into a comprehensive theory that unifies science and philosophy in a different kind of Theory of Everything. The subject of how we are to make sense of the world is addressed in a structured and ordered manner, which starts with a recognition that scientific truths are constructed within a linguistic framework. The author argues that an epistemic foundation of natural language must be understood before laying claim to any notion of reality. This foundation begins with Ludwig Wittgenstein’s Tractatus Logico-Philosophicus and the relationship of language to formal logic. Ultimately, we arrive at an answer to the question of why people believe the things they do. This is effectively a modification of Alfred Tarski’s semantic theory of truth. The second major issue addressed is the ‘dreaded’ Hard Problem of Consciousness, as first stated by David Chalmers in 1995. The solution is found in the unification of consciousness, information theory and notions of physicalism. The physical world is shown to be an isomorphic representation of phenomenological conscious experience. New concepts in understanding how language operates help to explain why this relationship has been so difficult to appreciate. The inclusion of concepts from information theory shows how a digital mechanics resolves heretofore conflicting theories in physics, cognitive science and linguistics. Scientific orthodoxy is supported, but viewed in a different light. Mainstream science is not challenged, but findings are interpreted in a manner that unifies consciousness without contradiction. Digital mechanics and formal systems of logic play central roles in combining language, consciousness and the physical world into a unified theory where all can be understood within a single consistent framework.
According to Aristotle, if a universal proposition (for example, “All men are white”) is true, its contrary proposition (“All men are not white”) must be false; and, likewise, if a universal proposition (“All men are white”) is true, its contradictory proposition (“Not all men are white”) must be false. I agree with what Aristotle wrote about universal propositions, but there are universal propositions which have no contrary proposition and no contradictory proposition. The proposition X “All the propositions that contradict this proposition are true” has neither a contrary nor a contradictory proposition. In fact, FEX “All the propositions that contradict this proposition are not true” has a different subject: the subject of the proposition X is constituted by all the propositions that contradict the proposition X; by contrast, the subject of the proposition FEX is constituted by all the propositions that contradict the proposition FEX. And FOX “Not all the propositions that contradict this proposition are true” likewise has a different subject: the subject of the proposition X is constituted by the propositions that contradict the proposition X; by contrast, the subject of the proposition FOX is constituted by the propositions that contradict the proposition FOX. According to Aristotle, a singular proposition (in his example, “Socrates is white”) which is true must have a negative proposition (“Socrates is not white”) which is false. I agree with Aristotle, but there are singular propositions which do not have the corresponding negative proposition.
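Aristotle's claims about contraries and contradictories can be checked mechanically for ordinary, non-self-referential universal propositions. The sketch below is purely illustrative: the finite domain and the predicate `white` are hypothetical stand-ins, not anything from the paper. Note that the mechanical test presupposes exactly what the abstract questions for the proposition X: that a proposition and its would-be opposites share a subject.

```python
# Illustrative check of Aristotle's claims on a small finite domain.
# A: "All men are white"      -> every x satisfies white(x)
# E: "All men are not white"  -> no x satisfies white(x)   (A's contrary)
# O: "Not all men are white"  -> some x fails white(x)     (A's contradictory)

def a_prop(men, white): return all(white(x) for x in men)
def e_prop(men, white): return all(not white(x) for x in men)
def o_prop(men, white): return not all(white(x) for x in men)

men = ["m1", "m2", "m3"]      # hypothetical non-empty domain
white = lambda x: True        # a scenario in which A is true

assert a_prop(men, white)     # A is true ...
assert not e_prop(men, white) # ... so its contrary E is false,
assert not o_prop(men, white) # ... and its contradictory O is false too
```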
The proposition (or, rather, the pseudo-proposition) L “This same statement is not true” does not have a negative proposition, because the proposition FNL “This same statement is true” has a different subject from that of L: the subject of the proposition L is the proposition L itself; by contrast, the subject of the proposition FNL is the proposition FNL. L and FNL cannot have the same subject. By contrast, the proposition M “This mountain is entirely in Swiss territory” and the proposition NM “This mountain is not entirely in Swiss territory” can have the same subject (for example, the Eiger): in case the subject of the proposition M and the subject of the proposition NM is the same, M and NM are opposite propositions, and NM is the negative proposition of M. The paper then continues by analyzing some variants of the Liar paradox: L1 “The statement L1 is not true”; the so-called Liar cycle; and the so-called Yablo's paradox.
The concept of truth has many aims but only one source. The article describes the primary concept of truth, here called the synthetic concept of truth, according to which truth is the objective result of the synthesis of us and nature in the process of rational cognition. It is shown how various aspects of the concept of truth -- the logical, scientific, and mathematical aspects -- arise from the synthetic concept of truth. It is also shown how the paradoxes of truth arise.
A textbook for students of mathematical logic. Part 1. Total formalization is possible! Formal theories. First-order languages. Axioms of constructive and classical logic. Proving formulas in propositional and predicate logic. Glivenko's theorem and constructive embedding. Axiom independence. Interpretations, models and completeness theorems. Normal forms. The tableaux method. The resolution method. Herbrand's theorem.
Throughout this paper, we try to show how and why our mathematical framework seems inappropriate for solving problems in the Theory of Computation. More exactly, the concept of turning back in time in paradoxes causes inconsistency in the modeling of the concept of time in some semantic situations. In the first chapter, by introducing a version of the “Unexpected Hanging Paradox”, we first attempt to open a new explanation for some paradoxes. In the second step, by applying this paradox, it is demonstrated that any formalized system for the Theory of Computation based on classical logic and the Turing model of computation leads us to a contradiction. We conclude that our mathematical framework is inappropriate for the Theory of Computation. Furthermore, the result provides a reason why many problems in complexity theory resist solution. (This work was completed on 2017-05-02, posted on viXra on 2017-05-14, and presented at UNILOG 2018, Vichy.)
I give an account of proof terms for derivations in a sequent calculus for classical propositional logic. The term for a derivation δ of a sequent Σ≻Δ encodes how the premises Σ and conclusions Δ are related in δ. This encoding is many-to-one in the sense that different derivations can have the same proof term, since different derivations may be different ways of representing the same underlying connection between premises and conclusions. However, not all proof terms for a sequent Σ≻Δ are the same. There may be different ways to connect those premises and conclusions.

Proof terms can be simplified in a process corresponding to the elimination of cut inferences in sequent derivations. However, unlike cut elimination in the sequent calculus, each proof term has a unique normal form (from which all cuts have been eliminated), and it is straightforward to show that term reduction is strongly normalising -- every reduction process terminates in that unique normal form. Furthermore, proof terms are invariants for sequent derivations in a strong sense: two derivations δ1 and δ2 have the same proof term if and only if some permutation of derivation steps sends δ1 to δ2 (given a relatively natural class of permutations of derivations in the sequent calculus). Since not every derivation of a sequent can be permuted into every other derivation of that sequent, proof terms provide a non-trivial account of the identity of proofs, independent of the syntactic representation of those proofs.
Epistemic modals have peculiar logical features that are challenging to account for in a broadly classical framework. For instance, while a sentence of the form ‘p, but it might be that not p’ appears to be a contradiction, ‘might not p’ does not entail ‘not p’, as it would in classical logic. Likewise, the classical laws of distributivity and disjunctive syllogism fail for epistemic modals. Existing attempts to account for these facts generally either under- or over-correct. Some theories predict that ‘p and might not p’, a so-called epistemic contradiction, is a contradiction only in an etiolated sense, under a notion of entailment that does not allow substitution of logical equivalents; these theories underpredict the infelicity of embedded epistemic contradictions. Other theories savage classical logic, eliminating not just rules that intuitively fail, like distributivity and disjunctive syllogism, but also rules like non-contradiction, excluded middle, De Morgan's laws, and disjunction introduction, which intuitively remain valid for epistemic modals. In this paper, we aim for a middle ground, developing a semantics and logic for epistemic modals that makes epistemic contradictions genuine contradictions and invalidates distributivity and disjunctive syllogism, but that otherwise preserves the classical laws that intuitively remain valid. We start with an algebraic semantics, based on ortholattices instead of Boolean algebras, and then propose a more concrete possibility semantics, based on partial possibilities related by compatibility. Both semantics yield the same consequence relation, which we axiomatize. Then we show how to extend our semantics to explain parallel phenomena involving probabilities and conditionals. The goal throughout is to retain what is desirable about classical logic while accounting for the non-classicality of epistemic vocabulary.
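The failure of distributivity in ortholattices can already be seen in a six-element example. The sketch below hand-codes the "Chinese lantern" ortholattice MO2 (a standard textbook example chosen here for illustration, not the semantics developed in the paper) and verifies that a ∧ (b ∨ b′) differs from (a ∧ b) ∨ (a ∧ b′).

```python
# MO2: bottom "0", top "1", and four pairwise-incomparable atoms a, a', b, b'
# (orthocomplements pair a with a' and b with b'). It is an ortholattice
# but not distributive.
ELEMS = ["0", "a", "a'", "b", "b'", "1"]

def leq(x, y):
    # Order relation of MO2: only 0 and 1 are comparable to the atoms.
    return x == y or x == "0" or y == "1"

def meet(x, y):
    # Greatest lower bound: the lower bound above all other lower bounds.
    lower = [z for z in ELEMS if leq(z, x) and leq(z, y)]
    return next(z for z in lower if all(leq(w, z) for w in lower))

def join(x, y):
    # Least upper bound: the upper bound below all other upper bounds.
    upper = [z for z in ELEMS if leq(x, z) and leq(y, z)]
    return next(z for z in upper if all(leq(z, w) for w in upper))

lhs = meet("a", join("b", "b'"))             # a ∧ (b ∨ b') = a ∧ 1 = a
rhs = join(meet("a", "b"), meet("a", "b'"))  # (a ∧ b) ∨ (a ∧ b') = 0 ∨ 0 = 0
assert lhs == "a" and rhs == "0"             # distributivity fails
```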
CAT4 is proposed as a general method for representing information, enabling a powerful programming method for large-scale information systems. It enables generalised machine learning, software automation and novel AI capabilities. This is Part 3 of a five-part introduction. The focus here is on explaining the semantic model for CAT4. Points in CAT4 graphs represent facts. We introduce all the formal (data) elements used in the classic semantic model: sense or intension (1st and 2nd joins), reference (3rd join), functions (4th join), time and truth (logical fields), and symbolic content (name/value fields). Concepts are introduced through examples alternating with theoretical discussion. Some concepts are assumed from Parts 1 and 2, but key ideas are re-introduced. The purpose is to explain the CAT4 interpretation, and why the data structure and CAT4 axioms have been chosen: to make the semantic model consistent and complete. We start with methods to translate information from database tables into graph DBs and into CAT4. We conclude with a method for translating natural language into CAT4, and with a comparison of the system to an advanced semantic logic, the hyper-intensional logic TIL, which also aims to translate NL into a logical calculus. The CAT4 Natural Language Translator is discussed in further detail in Part 4, where we introduce functions more formally. Part 5 discusses software design considerations.
We can classify the (truth-theoretic) paradoxes according to their degrees of paradoxicality. Roughly speaking, two paradoxes have the same degree of paradoxicality if they lead to a contradiction under the same conditions, and one paradox has a (non-strictly) lower degree of paradoxicality than another if, whenever the former leads to a contradiction under a condition, the latter does so under the same condition. In this paper, we outline some results and questions around the degrees of paradoxicality and summarize recent progress.
Is logic empirical? Is logic to be found in the world? Or is logic rather a convention, a product of conventions, part of the many rules that regulate the language game? Answers fall in either camp. We like the linguistic answer. In this paper, we want to analyze how a linguistic community would tackle the problem of developing a logic and show how the linguistic conventions adopted by the community determine the properties of the local logic. We then show how to move from a notion of logic that varies from community to community to a notion of logic that is in a sense universal. The framework is conventional up to a point: we have sentences, atomic and composite, the connectives are interpreted, values are computed, and the value of a composite sentence is a function of the values of its subsentences. Less conventional are the use of a plurality of truth values and the sharp distinction we draw between sentences and statements, in the spirit of the distinction between proposition and judgment that one may find in proof theory. The linguistic community will face many choices. Which are the good ones, and which are to be avoided? Are there, in some sense, optimal choices? These are the kinds of issues we address. Where do we end up? With some kind of universal bivalent logic, ironically enough. We start from an arbitrarily large number of truth values, atomic sentences and connectives, construct a generic many-valued logic, recover more or less the usual results and issues, and in the end it all comes down to a positive bivalent logic with two connectives, ‘and’ and ‘or’, as if logic were nothing more than a mere accounting of possibilities.
Section 1 reviews Strawson’s logic of presuppositions. Strawson’s justification is critiqued and a new justification proposed. Section 2 extends the logic of presuppositions to cases where the subject class is necessarily empty, such as (x)((Px & ~Px) → Qx). The strong similarity of the resulting logic to Richard Diaz’s truth-relevant logic is pointed out. Section 3 further extends the logic of presuppositions to sentences with many variables, and a certain valuation is proposed. It is noted that, given this valuation, Gödel’s sentence becomes neither true nor false. The similarity of this outcome to Goldstein and Gaifman’s solution of the Liar paradox, which is discussed in Section 4, is emphasized. Section 5 returns to the definition of meaningfulness; the meaninglessness of certain sentences with empty subjects and of the Liar sentence is discussed. The objective of this paper is to show how all of the above-mentioned concepts are interrelated.
When are two formal theories of broadly logical concepts, such as truth, equivalent? The paper investigates a case study involving two well-known variants of Kripke-Feferman truth. The first, KF+CONS, features a consistent but partial truth predicate; the second, KF+COMP, an inconsistent but complete one. It is well known that the two truth predicates are dual to each other. We show that this duality reveals a much stricter correspondence between the two theories: they are intertranslatable. Intertranslatability, under natural assumptions, coincides with definitional equivalence, and is arguably the strictest notion of theoretical equivalence different from logical equivalence. The case of KF+CONS and KF+COMP raises a puzzle: the two theories can be proved to be strictly related, yet they appear to embody remarkably different conceptions of truth. We discuss the significance of the result for the broader debate on formal criteria of conceptual reducibility for theories of truth.
The sentence G ↔ ¬(F ⊢ G) and its negation G ↔ ¬(F ⊢ ¬G) are shown to meet the conventional definition of incompleteness: Incomplete(T) ↔ ∃φ ((T ⊬ φ) ∧ (T ⊬ ¬φ)). They meet the conventional definition of incompleteness because neither the sentence nor its negation is provable in F (or any other formal system).
We can simply define Gödel's 1931 incompleteness away by redefining the meaning of the standard definition of incompleteness: a theory T is incomplete if and only if there is some sentence φ such that (T ⊬ φ) and (T ⊬ ¬φ). This definition construes the existence of self-contradictory expressions in a formal system as proof that the formal system is incomplete, because self-contradictory expressions are neither provable nor disprovable in that system. Since self-contradictory expressions are neither provable nor disprovable only because they are self-contradictory, we could define them as unsound instead of defining the formal system as incomplete.
The conventional notion of a formal system is adapted to conform to the sound deductive inference model operating on finite strings. Finite strings stipulated to have the semantic property of Boolean true provide the sound deductive premises. Truth-preserving finite string transformation rules provide the valid deductive inference. Conclusions of sound arguments are derived by applying truth-preserving finite string transformations to true premises.
Could the intersection of [formal proofs of mathematical logic] and [sound deductive inference] specify formal systems having [deductively sound formal proofs of mathematical logic]? All that we have to do to obtain [deductively sound formal proofs of mathematical logic] is to select the subset of conventional [formal proofs of mathematical logic] having true premises.
The conventional notion of a formal system is adapted to conform to the sound deductive inference model operating on finite strings. Finite strings stipulated to have the semantic value of Boolean true provide the sound deductive premises. Truth-preserving finite string transformation rules provide the valid deductive inference. Sound deductive conclusions are the result of these finite string transformation rules.
By extending the notion of a Well-Formed Formula to include syntactically formalized rules for rejecting semantically incorrect expressions, we recognize and reject expressions that have the semantic error of pathological self-reference (Olcott 2004). The foundation of this system is the notion of a BaseFact, which anchors the semantic notions of True and False. Whenever a formal proof from the BaseFacts of language L to a closed WFF X or ~X of language L does not exist, X is decided to be semantically incorrect.
Minimal Type Theory (MTT) is based on type theory in that it is agnostic about the level of Predicate Logic and expressly disallows the evaluation of incompatible types. It is called Minimal because it has the fewest possible number of fundamental types, and all of its syntax is expressed entirely as the connections in a directed acyclic graph.
Some, such as Dean (2014), suggest that Montague's paradox requires the necessitation rule, and that the use of the rule in such a context is contentious. Here, however, I show that the paradox arises independently of the necessitation rule. A derivation of the paradox is given in the modal system T without deploying necessitation; a necessitation-free derivation is also formulated in a weaker system that is not even complete with respect to truth-functional validities.
Jc Beall is known for defending modest dialetheism: the view that there are dialetheia, but only in the form of “spandrels” arising from otherwise reasonable semantic terminology (e.g., the Liar paradox). Beall also regards his view as modest in partaking of a deflationary view of truth, on which ‘true’ is a device of disquotational inference that expresses no “substantive property.” Beall supports deflationism by an appeal to Ockham's razor; however, the premise that ‘true’ is always disquotational is found dubious. Nonetheless, we can craft an ultra-modest dialetheism which says merely that at least one utterance of ‘This sentence is not true’ uses ‘true’ as a disquotational device, and which maintains neutrality on whether it expresses a substantive property. This is shown sufficient for the existence of at least one dialetheia in first-degree entailment (a non-explosive logic that also assigns the Liar a truth-value gap). The limited scope of the ultra-modest view will be disappointing to formal semanticists hoping to capture the behavior of ‘true’ throughout the language. But its modest basis gives dialetheism the best hope for wider acceptance in the discipline.
If a semantically open language has no constraints on self-reference, one can prove an absurdity. The argument exploits a self-referential function symbol where the expressed function ends up being intensional in virtue of self-reference. The prohibition on intensional functions thus entails that self-reference cannot be unconstrained, even in a language that is free of semantic terms. However, since intensional functions are already excluded in classical logic, there are no drastic revisionary implications here. Still, the argument reveals a new sort of intensional context, one which does not depend on a propositional attitude verb, a modal operator, an idiom like ‘so called’, etc. Moreover, since classical logicians do not seem aware of the potential danger with self-reference, a word of warning is in order.
Why do we use epistemic modals like 'might'? According to Factualism, the function of 'might' is to exchange information about states of affairs in the modal universe. As an alternative to Factualism, this paper offers a game-theoretic rationale for epistemic possibility operators in a Bayesian setting. The background picture is one whereby communication facilitates coordination, but coordination could fail if there is too much uncertainty, since the players' ability to share a belief is undermined. However, 'might' and related expressions can be used to reveal one's uncertainty, and to exploit this to coordinate despite the lack of a common epistemic ground. The final result is a way to articulate a non-Factualist view of epistemic possibility modals that builds on their standard semantics.
By separating the general concept of truth into syntactic truth and semantic truth, this article proposes a new theory of truth to explain several paradoxes, such as the Liar paradox, the Card paradox, and Curry’s paradox. By revealing the relationship between syntactic/semantic truth and being-nothing-becoming, the core concepts of dialectical logic, it is able to formalize dialectical logic. It also provides a logical basis for complexity theory by transferring all reasoning into a directed (cyclic or acyclic) graph, which accounts for both paradoxical and paradox-free reasoning. The structural difference between cyclic and acyclic graphs is the key to understanding paradoxes. By explaining the immanent connection between logic, paradox, and cellular automata, it also presents dialectical logic as ontology.
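The graph criterion can be illustrated with a toy dependency graph: each sentence points to the sentences its truth value depends on, and a directed cycle signals potential paradox. The sketch below is my own toy example under that reading, not the paper's formal construction; the sentences and graph shapes are hypothetical.

```python
# Toy reference graphs: sentence -> sentences whose truth values it depends on.
liar_graph = {"L": ["L"]}               # L: "L is not true" (self-loop, cyclic)
grounded_graph = {"A": ["B"], "B": []}  # A: "B is true"; B: "snow is white"

def has_cycle(graph):
    # Depth-first search with a recursion stack to detect directed cycles.
    visiting, done = set(), set()
    def dfs(node):
        if node in visiting:
            return True          # reached a node still on the current path
        if node in done:
            return False
        visiting.add(node)
        if any(dfs(n) for n in graph.get(node, [])):
            return True
        visiting.discard(node)
        done.add(node)
        return False
    return any(dfs(n) for n in graph)

assert has_cycle(liar_graph)          # cyclic: reasoning loops back on itself
assert not has_cycle(grounded_graph)  # acyclic: paradox-free reasoning
```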
This paper introduces a new method of interpreting complex relation terms in a second-order quantified modal language. We develop a completely general second-order modal language with two kinds of complex terms: one kind for denoting individuals and one kind for denoting n-place relations. Several issues arise in connection with previous, algebraic methods for interpreting the relation terms. The new method of interpreting these terms described here addresses those issues while establishing an interesting connection between the λ and ε calculi. The resulting semantics provides a precise understanding of the theory of relations.
I present a solution to the epistemological or characterisation problem of induction. In Part I, Bayesian Confirmation Theory (BCT) is discussed as a good contender for such a solution, but one with a fundamental explanatory gap (along with other well-discussed problems): useful assigned probabilities like priors require substantive degrees of belief about the world. I assert that one does not have such substantive information about the world. Consequently, an explanation is needed for how one can be licensed to act as if one has substantive information about the world when one does not. I sketch the outlines of a solution in Part I, showing how it differs from others, with full details to follow in subsequent parts. The solution is pragmatic in sentiment (though it differs in specifics from the arguments of, for example, William James); the conceptions we use to guide our actions are, and should be, at least partly determined by preferences. This is cashed out in a reformulation of decision theory motivated by a non-reductive formulation of hypotheses and logic. A distinction emerges between initial assumptions, which can be non-dogmatic, and effective assumptions, which can simultaneously be substantive. An explanation is provided for the plausibility arguments used to explain assigned probabilities in BCT.

In subsequent parts, logic is constructed from principles independent of language and mind. In particular, propositions are defined to not have form. Probabilities are logical and uniquely determined by assumptions. The problems considered fatal to logical probabilities -- Goodman's ‘grue’ problem and the uniqueness-of-priors problem -- are dissolved due to the particular formulation of logic used. Other problems, such as the zero-prior problem, are also solved.

A universal theory of (non-linguistic) meaning is developed. Problems with counterfactual conditionals are solved by developing concepts of abstractions and corresponding pictures that make up hypotheses.
Spaces of hypotheses, and the version of Bayes' theorem that utilises them, emerge from first principles.

Theoretical virtues for hypotheses emerge from the theory. Explanatory force is explicated. The significance of effective assumptions is partly determined by combinatoric factors relating to the structure of hypotheses. I conjecture that this is the origin of simplicity.
In this multi-disciplinary investigation we show how an evidence-based perspective of quantification -- in terms of algorithmic verifiability and algorithmic computability -- admits evidence-based definitions of well-definedness and effective computability, which yield two unarguably constructive interpretations of the first-order Peano Arithmetic PA -- over the structure N of the natural numbers -- that are complementary, not contradictory. The first yields the weak, standard interpretation of PA over N, which is well-defined with respect to assignments of algorithmically verifiable Tarskian truth values to the formulas of PA under the interpretation. The second yields a strong, finitary interpretation of PA over N, which is well-defined with respect to assignments of algorithmically computable Tarskian truth values to the formulas of PA under the interpretation. We situate our investigation within a broad analysis of quantification vis-à-vis: * Hilbert's epsilon-calculus * Gödel's omega-consistency * The Law of the Excluded Middle * Hilbert's omega-Rule * An Algorithmic omega-Rule * Gentzen's Rule of Infinite Induction * Rosser's Rule C * Markov's Principle * The Church-Turing Thesis * Aristotle's particularisation * Wittgenstein's perspective of constructive mathematics * An evidence-based perspective of quantification. By showing how these are formally inter-related, we highlight the fragility of both the persisting, theistic, classical/Platonic interpretation of quantification grounded in Hilbert's epsilon-calculus, and the persisting, atheistic, constructive/Intuitionistic interpretation of quantification rooted in Brouwer's belief that the Law of the Excluded Middle is non-finitary. We then consider some consequences for mathematics, mathematics education, philosophy, and the natural sciences of an agnostic, evidence-based, finitary interpretation of quantification that challenges classical paradigms in all these disciplines.
Description Logics (DLs) are a family of formal knowledge representation languages and the best-known formalisms in semantics-based systems. The central focus of this research is on the logical-terminological characterisation and analysis of possibilistic and probabilistic descriptions of events in DLs. Based on a logical characterisation of the concept of ‘being’, this paper conceptualises events within DLs world descriptions. Accordingly, it deals with the concepts of ‘possibility of events’ and ‘probability of events’. The main goal of this research is to investigate how the notions of the possible and the probable can, terminologically and logically, be interpreted and analysed within DLs world descriptions. In other words, the research investigates, logically and terminologically, how possibilistic and probabilistic descriptions of events can be expressed in DLs.
How can logic represent indexical expressions such as ‘I’, ‘here’, and ‘now’? How should it not represent them? I examine these two questions on the basis of Kaplan's Logic of Demonstratives (LD) and its unpopular prohibition of monstrous operators. Despite some defects of formulation, I argue that this prohibition is guided by a powerful vision of the logical relations of validity among sentences containing indexicals, one which challenges the traditional conception of logical consequence as truth preservation and highlights the fundamental role of semantic information in the construction of formal systems conceived as models of natural languages.
In this paper, we present two variants of Peirce’s Triadic Logic within a language containing only conjunction, disjunction, and negation. The peculiarity of our systems is that conjunction and disjunction are interpreted by means of Peirce’s mysterious binary operations Ψ and Φ from his ‘Logical Notebook’. We show that semantic conditions that can be extracted from the definitions of Ψ and Φ agree (in some sense) with the traditional view on the semantic conditions of conjunction and disjunction. Thus, we support the conjecture that Peirce’s special interest in these operations is due to the fact that he interpreted them as conjunction and disjunction, respectively. We also show that one of our systems may serve as a suitable base for an interesting implicative expansion, namely the connexive three-valued logic by Cooper. Sound and complete natural deduction calculi are presented for all systems examined in this paper.
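The "traditional view" of three-valued conjunction and disjunction that the abstract refers to is usually rendered as minimum and maximum over an ordered value set. The sketch below illustrates only that traditional reading over an assumed value set {F, L, T}; it does not reproduce Peirce's actual tables for Ψ and Φ, which the paper examines.

```python
# Illustrative min/max semantics for three-valued conjunction/disjunction.
# Values are ordered F < L < T, with L the intermediate ("limit") value.
# This is the traditional Kleene/Lukasiewicz-style reading, not Peirce's
# own definitions of Psi and Phi.
ORDER = {'F': 0, 'L': 1, 'T': 2}

def conj(x, y):
    # conjunction as the minimum of the two values
    return min(x, y, key=ORDER.get)

def disj(x, y):
    # disjunction as the maximum of the two values
    return max(x, y, key=ORDER.get)

print(conj('T', 'L'))  # L
print(disj('F', 'L'))  # L
```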
We prove that the existence of a measurable cardinal is equivalent to the existence of a normal space whose modal logic coincides with the modal logic of the Kripke frame isomorphic to the powerset of a two-element set.
This paper is a short introduction to Carnap’s writings on semantics with an emphasis on the transition from the syntactic period to the semantic one. I claim that one of Carnap’s main aims was to investigate the possibility of the symmetry between the syntactic and the semantic methods of approaching philosophical problems, both in logic and in the philosophy of science. This ideal of methodological symmetry could be described as an attempt to obtain categorical logical systems, i.e., systems that allow only the intended semantical interpretation.
We propose in this paper a family of algebraic models of ZFC based on the three-valued paraconsistent logic LPT0, a linguistic variant of da Costa and D’Ottaviano’s logic J3. The semantics is given by twist structures defined over complete Boolean algebras. The Boolean-valued models of ZFC are adapted to twist-valued models of an expansion of ZFC by adding a paraconsistent negation. This allows for inconsistent sets w satisfying ‘not (w = w)’, where ‘not’ stands for the paraconsistent negation. Finally, our framework is adapted to provide a class of twist-valued models generalizing Löwe and Tarafder’s model based on logic (PS 3,∗), showing that they are paraconsistent models of ZFC. The present approach offers more options for investigating independence results in paraconsistent set theory.
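The general twist-structure idea can be illustrated over the two-element Boolean algebra: a value is a pair of Boolean components, read as evidence for and evidence against, and the paraconsistent negation simply swaps the components. This is a sketch of the standard Nelson-style twist construction, offered only to convey the mechanism; the LPT0 twist structures of the paper differ in detail.

```python
# Twist-structure sketch over the two-element Boolean algebra {False, True}.
# A value is a pair (p, n): p = evidence for, n = evidence against.
# Standard Nelson-style operations; illustrative only, not LPT0's own.

def neg(v):
    # paraconsistent negation: swap the two components
    p, n = v
    return (n, p)

def conj(v, w):
    return (v[0] and w[0], v[1] or w[1])

def disj(v, w):
    return (v[0] or w[0], v[1] and w[1])

# A "glutty" value: a pair with evidence both for and against is a fixed
# point of negation, so v and neg(v) can hold together without collapse.
both = (True, True)
assert neg(both) == both
```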
One of the most desirable properties of a logical system is that it be algebraizable, in the sense that an algebraic counterpart of its deductive machinery can be found. Since the inception of da Costa's paraconsistent calculi, an algebraic equivalent for such systems has been sought. It is known that these systems are non-self-extensional (i.e., they do not satisfy the replacement property). Moreover, they are not algebraizable in the sense of Blok-Pigozzi. The same negative results hold for several systems of the hierarchy of paraconsistent logics known as Logics of Formal Inconsistency (LFIs). Because of this, these logics can only be characterized by semantics of a non-deterministic kind. This paper offers a solution to two open problems in the domain of paraconsistency, in particular connected to the algebraization of LFIs, by obtaining several LFIs weaker than C1, each of which is algebraizable in the standard Lindenbaum-Tarski sense by a suitable variety of Boolean algebras extended with operators. This means that such LFIs satisfy the replacement property. The weakest LFI satisfying replacement presented here is called RmbC, which is obtained from the basic LFI called mbC. Some axiomatic extensions of RmbC are also studied, and in addition a neighborhood semantics is defined for such systems. It is shown that RmbC can be defined within the minimal bimodal non-normal logic E+E, the fusion of the non-normal modal logic E with itself. Finally, the framework is extended to first-order languages. RQmbC, the quantified extension of RmbC, is shown to be sound and complete with respect to BALFI semantics.
Tarski’s Convention T—presenting his notion of adequate definition of truth (sic)—contains two conditions: alpha and beta. Alpha requires that all instances of a certain T Schema be provable. Beta requires in effect the provability of ‘every truth is a sentence’. Beta formally recognizes the fact, repeatedly emphasized by Tarski, that sentences (devoid of free variable occurrences)—as opposed to pre-sentences (having free occurrences of variables)—exhaust the range of significance of ‘is true’. In Tarski’s preferred usage, it is part of the meaning of ‘true’ that attribution of being true to a given thing presupposes that the thing is a sentence. Beta’s importance is further highlighted by the fact that alpha can be satisfied using the recursively definable concept of being satisfied by every infinite sequence, which Tarski explicitly rejects. Moreover, in Definition 23, the famous truth-definition, Tarski supplements “being satisfied by every infinite sequence” by adding the condition “being a sentence”. Even where truth is undefinable and treated by Tarski axiomatically, he adds as an explicit axiom a sentence to the effect that every truth is a sentence. Surprisingly, the sentence just before the presentation of Convention T seems to imply that alpha alone might be sufficient. Even more surprising is the sentence just after Convention T saying beta “is not essential”. Why include a condition if it is not essential? Tarski says nothing about this dissonance. Considering the broader context, the Polish original, the German translation from which the English was derived, and other sources, we attempt to determine what Tarski might have intended by the two troubling sentences which, as they stand, are contrary to the spirit, if not the letter, of several other passages in Tarski’s corpus.
In this paper we present two new approaches for dealing with semantic paradoxes and soritical predicates based on fuzzy logic. We show that both of them have conceptual advantages over the more traditional Łukasiewicz approach, and that the second one even avoids standard proofs of ω-inconsistency.
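The traditional Łukasiewicz approach the paper compares against interprets formulas in the real interval [0, 1]. A minimal sketch of its standard connectives follows; this illustrates only the well-known Łukasiewicz definitions, not the paper's two new approaches.

```python
# Standard Lukasiewicz fuzzy connectives on [0, 1] (illustrative sketch).

def l_neg(a):
    return 1.0 - a

def l_conj(a, b):
    # strong conjunction: the Lukasiewicz t-norm
    return max(0.0, a + b - 1.0)

def l_impl(a, b):
    # the residuated implication of the t-norm
    return min(1.0, 1.0 - a + b)

# A Liar-style sentence equivalent to its own negation can consistently get
# value 0.5, since 0.5 is the fixed point of l_neg.
assert l_neg(0.5) == 0.5
```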
This paper explores the analysis of ability, where ability is to be understood in the epistemic sense—in contrast to what might be called a causal sense. There are plenty of cases where an agent is able to perform an action that guarantees a given result even though she does not know which of her actions guarantees that result. Such an agent possesses the causal ability but lacks the epistemic ability. The standard analysis of such epistemic abilities relies on the notion of action types—as opposed to action tokens—and then posits that an agent has the epistemic ability to do something if and only if there is an action type available to her that she knows guarantees it. We show that these action types are not needed: we present a formalism without action types that can simulate analyses of epistemic ability that rely on action types. Our formalism is a standard epistemic extension of the theory of “seeing to it that”, which arose from a modal tradition in the logic of action.
As the final component of a chain of reasoning intended to take us all the way to logical nihilism, Russell (2018) presents the atomic sentence ‘prem’, which is supposed to be true when featuring as the premise of an argument and false when featuring as the conclusion of an argument. Such a sentence requires a non-reflexive logic, and an endnote by Russell (2018) could easily leave the reader with the impression that going non-reflexive suffices for logical nihilism. This paper shows how one can obtain non-reflexive logics in which ‘prem’ behaves as stipulated by Russell (2018) but which nonetheless have valid inferences, such as modus tollens and modus ponens, that support uniform substitution of any formula for propositional variables.
Hyperlogic is a hyperintensional system designed to regiment metalogical claims (e.g., "Intuitionistic logic is correct" or "The law of excluded middle holds") into the object language, including within embedded environments such as attitude reports and counterfactuals. This paper is the first of a two-part series exploring the logic of hyperlogic. This part presents a minimal logic of hyperlogic and proves its completeness. It consists of two interdefined axiomatic systems: one for classical consequence (truth preservation under a classical interpretation of the connectives) and one for "universal" consequence (truth preservation under any interpretation). The sequel to this paper explores stronger logics that are sound and complete over various restricted classes of models as well as languages with hyperintensional operators.
I make a point concerning the construction ‘A or B or both’ in English, to the effect that if the connective ‘or’ is understood exclusively across the board then this familiar construction cannot convey the intended inclusive sense of disjunction. If we take ‘or’ inclusively, ‘A or B or both’ has the function of emphasizing that the disjunction is inclusive; taking ‘or’ exclusively, it does nothing.
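One way to make the claim concrete is with a truth table, under the assumption (mine, not spelled out in the abstract) that taking a three-disjunct 'or' exclusively across the board means "exactly one disjunct holds". On that reading, when A and B are both true, all three disjuncts of 'A or B or both' hold, so the construction comes out false at precisely the case an inclusive disjunction is meant to include.

```python
# Truth-table check: exclusive reading of 'A or B or both' vs inclusive 'A or B'.
# Assumption of this sketch: an exclusive three-way 'or' means exactly one
# of the three disjuncts holds.
from itertools import product

def exclusive_or3(p, q, r):
    return [p, q, r].count(True) == 1

for a, b in product([True, False], repeat=2):
    excl = exclusive_or3(a, b, a and b)   # 'A or B or both', 'or' exclusive
    incl = a or b                          # intended inclusive disjunction
    print(a, b, excl, incl)

# The two readings diverge exactly at A = B = True.
assert exclusive_or3(True, True, True) is False
```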
In this paper, we provide a semantics for a range of positive substructural logics, including both logics with and logics without modal connectives. The semantics is novel insofar as it is meant to explicitly capture the computational flavor of these logics, and to do so in a way that builds in both nondeterministic and nonconcurrent computational processes.
The logics BN4 and E4 can be considered as the 4-valued logics of the relevant conditional and (relevant) entailment, respectively. The logic BN4 was developed by Brady in 1982 and the logic E4 by Robles and Méndez in 2016. The aim of this paper is to investigate the implicative variants (of both systems) which contain Routley and Meyer’s logic B and endow them with a Belnap-Dunn type bivalent semantics.
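The "Belnap-Dunn type bivalent semantics" mentioned above can be glossed, in its familiar relational form, by assigning each formula a subset of {t, f}: told true, told false, both, or neither. The sketch below illustrates only this standard four-valued reading, not the specific BN4/E4 matrices the paper studies.

```python
# Belnap-Dunn relational semantics sketch: values are subsets of {'t', 'f'}.
# T = told true only, F = told false only, B = both, N = neither.
# Illustrative of the general four-valued idea, not the BN4/E4 tables.
T, B, N, F = {'t'}, {'t', 'f'}, set(), {'f'}

def neg(v):
    # negation swaps the truth markers
    return {{'t': 'f', 'f': 't'}[x] for x in v}

def conj(v, w):
    out = set()
    if 't' in v and 't' in w:
        out.add('t')
    if 'f' in v or 'f' in w:
        out.add('f')
    return out

assert neg(B) == B       # 'both' is a fixed point of negation
assert conj(T, N) == N   # true-only and neither yield neither
```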
G3-style sequent calculi for the logics in the cube of non-normal modal logics and for their deontic extensions are studied. For each calculus we prove that weakening and contraction are height-preserving admissible, and we give a syntactic proof of the admissibility of cut. This implies that the subformula property holds and that derivability can be decided by a terminating proof search whose complexity is in Pspace. These calculi are shown to be equivalent to the axiomatic ones and, therefore, they are sound and complete with respect to neighbourhood semantics. Finally, a Maehara-style proof of Craig’s interpolation theorem for most of the logics considered is given.
A notion of strictly primitive recursive realizability was introduced by Damnjanovic in 1994. It is a kind of constructive semantics for arithmetical sentences, using primitive recursive functions. It is of interest to study the corresponding predicate logic. Park argued in 2003 that the predicate logic of strictly primitive recursive realizability is not arithmetical. Park’s argument is essentially based on a claim of Damnjanovic that intuitionistic logic is sound with respect to strictly primitive recursive realizability, but that claim was disproved by the author of this article in 2006. The aim of this paper is to present a correct proof of Park’s result.
Lewis’s counterpart theory (LCT for short), motivated by his modal realism, made its appearance within a year of Chisholm’s modal paradox. We are not modal realists, but we argue that a satisfactory resolution of the paradox calls for a counterpart-theoretic (CT-)semantics. We make our case by showing that the Chandler–Salmon strategy of denying the S4 axiom [◊◊ψ → ◊ψ] is inadequate to resolve the paradox; we take on Salmon’s attempts to defend that strategy against objections from Lewis and Williamson. We then consider three substantially different CT-approaches: Lewis’s LCT, Forbes’s FCT (including his fuzzy version), and Ramachandran’s RCT. We argue that the best approach is a mish-mash of FCT and RCT.
Minimalism about truth is one of the main contenders for our best theory of truth, but minimalists face the charge of being unable to properly state their theory. Donald Davidson incisively pointed out that minimalists must generalize over occurrences of the same expression placed in two different contexts, which is futile. In order to meet the challenge, Paul Horwich argues that one can nevertheless characterize the axioms of the minimalist theory. Sten Lindström and Tim Button have independently argued that Horwich’s attempt to formulate minimalism remains unsuccessful. We show how to properly state Horwich’s axioms by appealing to propositional functions that are given by definite descriptions. Both Lindström and Button discuss proposals similar to ours and conclude that they are unsuccessful. Our new suggestion avoids these objections.
This paper gives a semantics for schematic logic, proving soundness and completeness. The argument for soundness is carried out in ontologically innocent fashion, relying only on the existence of formulae which are actually written down in the course of a derivation in the logic. This makes the logic available to a nominalist, even a nominalist who does not wish to rely on modal notions, and who accepts the possibility that the universe may in fact be finite.