Quantification, Negation, and Focus: Challenges at the Conceptual-Intentional Semantic Interface Tista Bagchi National Institute of Science, Technology, and Development Studies (NISTADS) and the University of Delhi Since the proposal of Logical Form (LF) was put forward by Robert May in his 1977 MIT doctoral dissertation and subsequently adopted into the overall architecture of language as conceived under Government-Binding Theory (Chomsky 1981), there has been a steady research effort to determine the nature of LF in light of structurally diverse languages around the world, which has ultimately contributed to the reinterpretation of LF as a Conceptual-Intentional (C-I) interface level between the computational syntactic component of the faculty of language and one or more interpretive faculties of the human mind. While this has opened up further possibilities of research in phenomena such as quantifier scope and scope interactions between negation, quantification, and focus, it has also given rise to some real challenges for linguistic theory. These include: (i) the split between lexical meaning – a matter supposedly belonging to the phase-wise selection of lexical arrays – and issues of semantic interpretation that arise purely from binding and scope phenomena (Mukherji 2010); (ii) partly related to this, the level at which theta-role assignment can be argued to take place, an issue I take up in Bagchi (2007); and (iii) how supposedly “pure” scopal phenomena relating to quantifiers, negation, and emphasizing expressions such as only and even (comparable to, e.g., Urdu/Hindi hii and bhii, Bangla –i and –o) also have dimensions of both focus and discourse reference.
While recognizing all of these challenges, this talk aims to highlight challenge (iii) in particular, both in relation to past scholarship and in view of the rich prospects for research on the languages of South Asia concerning the semantics of quantification, negation, and focus. The past scholarship I seek to relate this issue to arose when, parallel to (and largely independently of) the research on LF, Barwise and Cooper were developing their influential view of noun phrases as generalized quantifiers, culminating in their key 1981 article “Generalized Quantifiers and Natural Language,” while, independently, McCawley, in his 1981 book Everything that Linguists Have Always Wanted to Know about Logic, established through argumentation that all noun phrases semantically behave like generalized quantified expressions (a view he elaborated further in the second, revised 1994 edition of the book). I seek to demonstrate, on the basis of limited data analysis from selected languages of South Asia, that our current understanding of quantification, negation, and focus under the Minimalist view owes something significant to these two major, but now largely marginalized, works of scholarship, and that the way forward requires adopting the more formal-semantic approach taken by them and by later works such as Denis Bouchard’s (1995) The Semantics of Syntax, Mats Rooth’s work on focus (e.g., Rooth 1996, “Focus,” in Shalom Lappin’s Handbook of Contemporary Semantic Theory), Heim and Kratzer’s Semantics in Generative Grammar (1998), and Yoad Winter’s (2002) Linguistic Inquiry article on semantic number, to cite just a few instances.
This paper discusses the logical possibility of testing inconsistent empirical theories. The main challenge for an affirmative answer is to avoid the result that the inconsistent consequences of a theory both corroborate and falsify it. I answer affirmatively by showing that we can define a class of empirical sentences whose truth would force us to abandon such an inconsistent theory: the class of its potential rejecters. Despite this, I show that the observational contradictions implied by a theory could only be verified (provided we make some assumptions), but not rejected. From this it follows that, although inconsistent theories are rejectable, they cannot be rejected qua inconsistent.
The term 'covert mixed quotation' describes cases in which linguistic material is interpreted in the manner of mixed quotation — that is, used in addition to being mentioned — despite the superficial absence of any commonly recognized conventional devices indicating quotation. After developing a novel theory of mixed quotation, I show that positing covert mixed quotation allows us to give simple and unified treatments of a number of puzzling semantic phenomena, including the projective behavior of conventional implicature items embedded in indirect speech reports and propositional attitude ascriptions, so-called 'c-monsters,' metalinguistic negation, metalinguistic negotiation, and 'in a sense' constructions.
Intensional evidence is any reason to accept a proposition that is not the truth value of the accepted proposition or, if it is a complex proposition, the truth values of its propositional contents. Extensional evidence is non-intensional evidence. Someone can accept a complex proposition but deny its logical consequences when her acceptance is based on intensional evidence, while the logical consequences of the proposition presuppose the acceptance of extensional evidence; e.g., she can refuse a logical consequence of a proposition she accepts because she does not know what the truth values of its propositional contents are. This tension motivates counterexamples to the negation of conditionals, the propositional analysis of conditionals, hypothetical syllogism, contraposition, and or-to-if. It is argued that these counterexamples are non-starters because they rely on a mix of intensionally based premises and extensionally based conclusions. Instead, a genuine counterexample to classical argumentative forms should present circumstances where an intuitively true and extensionally based premise leads to an intuitively false conclusion that is also extensionally based. The other point is that evidentiary concerns about intensionally based beliefs should be constrained by the truth conditions of propositions presented by classical logic, which are nothing more than requirements of coherence in distributions of truth values. It is argued that this restriction also dissolves some known puzzles such as conditional stand-offs, the Adams pair, the opt-out property, and the burglar's puzzle.
We develop a novel solution to the negation version of the Frege-Geach problem by taking up recent insights from the bilateral programme in logic. Bilateralists derive the meaning of negation from a primitive *B-type* inconsistency involving the attitudes of assent and dissent. Some may demand an explanation of this inconsistency in simpler terms, but we argue that bilateralism's assumptions are no less explanatory than those of *A-type* semantics, which require only a single primitive attitude but must stipulate inconsistency elsewhere. Based on these insights, we develop a version of B-type expressivism called *inferential expressivism*. This is a novel semantic framework that characterises meanings by inferential roles defining which *attitudes* one can *infer* from the use of terms. We apply this framework to normative vocabulary, thereby solving the Frege-Geach problem generally and comprehensively. Our account moreover includes a semantics for epistemic modals, thereby also explaining normative terms embedded under epistemic modals.
I argue that rejection is a speech act that cannot be reduced to assertion. Adapting an argument by Huw Price, I conclude that rejection is best conceived of as the speech act used to register that some other speech act is (or would be) violating a rule of the conversation game. This can be naturally understood as registering *norm violations* where speech acts are characterised by their essential norms. However, I argue that rejection itself is not to be characterised by a norm. Instead, registering violations is a necessary condition for understanding the normative framework in the first place. The core observation is that the concept of an 'illegal move' is intelligible, so a speech act can be (say) an assertion despite violating the essential norm of asserting. Rejection has the function of pointing out that a move is illegal. But registering rule violations is a precondition of playing games with rules (it is part of the concept 'game'), not itself a rule in a game. A similar special role of rejection (that it is not explicable in the terms provided by a framework, but is needed to conceptualise those terms) likely occurs in other frameworks as well, e.g. those characterising speech acts by commitments or by their effect on a common ground.
We give a proof-theoretic as well as a semantic characterization of a logic in the signature with conjunction, disjunction, negation, and the universal and existential quantifiers that we suggest has a certain fundamental status. We present a Fitch-style natural deduction system for the logic that contains only the introduction and elimination rules for the logical constants. From this starting point, if one adds the rule that Fitch called Reiteration, one obtains a proof system for intuitionistic logic in the given signature; if instead of adding Reiteration, one adds the rule of Reductio ad Absurdum, one obtains a proof system for orthologic; by adding both Reiteration and Reductio, one obtains a proof system for classical logic. Arguably neither Reiteration nor Reductio is as intimately related to the meaning of the connectives as the introduction and elimination rules are, so the base logic we identify serves as a more fundamental starting point and common ground between proponents of intuitionistic logic, orthologic, and classical logic. The algebraic semantics for the logic we motivate proof-theoretically is based on bounded lattices equipped with what has been called a weak pseudocomplementation. We show that such lattice expansions are representable using a set together with a reflexive binary relation satisfying a simple first-order condition, which yields an elegant relational semantics for the logic. This builds on our previous study of representations of lattices with negations, which we extend and specialize for several types of negation in addition to weak pseudocomplementation; in an appendix, we further extend this representation to lattices with implications. Finally, we discuss adding to our logic a conditional obeying only introduction and elimination rules, interpreted as a modality using a family of accessibility relations.
There are essentially two ways to develop the Peircean idea that future contingents are all false. One is to provide a quantificational semantics for "will," as is usually done. The other is to define a quantificational postsemantics based on a linear semantics for "will." As we will suggest, the second option, although less conventional, is more plausible than the first in some crucial respects. The postsemantic approach overcomes three major troubles that have been raised in connection with Peirceanism: the apparent scopelessness of "will" with respect to negation, the failure of Future Excluded Middle, and the so-called zero credence problem.
This paper investigates a particular philosophical puzzle via an examination of its status in the writings of Wittgenstein. The puzzle concerns negation and can take on three interrelated guises. The first puzzle is how not-p can so much as negate p at all – for if p is not the case, then nothing corresponds to p. The second puzzle is how not-p can so much as negate p at all when not-p rejects p not as false but as unintelligible – for if p is unintelligible, then p is nothing but scratches and sounds and does not seem apt for negation. And the third puzzle is how “not” could be anything but hopelessly equivocal if it sometimes (per the first puzzle) requires, and sometimes (per the second puzzle) precludes the intelligibility of p. The paper investigates these three puzzles, their respective structures, and their relations to each other. The second puzzle is expounded as the centre of gravity, and in countering two objections to the threefold puzzle, a special predicament is brought out with regard to the second puzzle’s concern with unipolar propositions – propositions that do not admit of an intelligible negation. The text concludes by indicating the first steps that could potentially lead us out of the threefold puzzle.
Deontic necessity modals (e.g. 'have to', 'ought to', 'must', 'need to', 'should', etc.) seem to vary in how they interact with negation. According to some accounts, what forces modals like 'ought' and 'should' to outscope negation is their polarity sensitivity -- modals that scope over negation do so because they are positive polarity items. But there is a conflict between this account and a widely assumed theory of if-clauses, namely the restrictor analysis. In particular, the conflict arises for constructions containing a bound pronoun in the if-clause. This note spells out the core conflict.
In this paper, we study three representations of lattices by means of a set with a binary relation of compatibility in the tradition of Ploščica. The standard representations of complete ortholattices and complete perfect Heyting algebras drop out as special cases of the first representation, while the second covers arbitrary complete lattices, as well as complete lattices equipped with a negation we call a protocomplementation. The third topological representation is a variant of that of Craig, Haviar, and Priestley. We then extend each of the three representations to lattices with a multiplicative unary modality; the representing structures, like so-called graph-based frames, add a second relation of accessibility interacting with compatibility. The three representations generalize possibility semantics for classical modal logics to non-classical modal logics, motivated by a recent application of modal orthologic to natural language semantics.
This paper formulates a bilateral account of harmony that is an alternative to one proposed by Francez. It builds on an account of harmony for unilateral logic proposed by Kürbis and the observation that reading the rules for the connectives of bilateral logic bottom up gives the grounds and consequences of formulas with the opposite speech act. I formulate a process I call 'inversion' which allows the determination of assertive elimination rules from assertive introduction rules, and rejective elimination rules from rejective introduction rules, and conversely. It corresponds to Francez's notion of vertical harmony. I also formulate a process I call 'conversion', which allows the determination of rejective introduction rules from assertive elimination rules and conversely, and the determination of assertive introduction rules from rejective elimination rules and conversely. It corresponds to Francez's notion of horizontal harmony. The account has a number of features that distinguish it from Francez's.
The ability to express negation in language may have been the result of an adaptive process. However, there are different accounts of adaptation in linguistics, and more than one of them may describe the case of negation. In this paper, I distinguish different versions of the claim that negation is adaptive and defend a proposal, based on recent work by Steinert-Threlkeld (2016) and Incurvati and Sbardolini (2021), on which negation is an indirect adaptation.
In one of their papers, Michael De and Hitoshi Omori observed that the notion of classical negation is not uniquely determined in the context of so-called Belnap-Dunn logic: in fact, there are 16 unary operations that qualify as classical negation. These varieties are due to the different falsity conditions one may assume for classical negation. The aim of this paper is to observe that there is an interesting way to make sense of classical negation independently of falsity conditions. We discuss two equivalent semantics, and offer a Hilbert-style system that is sound and complete with respect to the semantics.
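The combinatorics behind the count of 16 can be sketched directly. Below is a toy Python enumeration, assuming the usual presentation of Belnap-Dunn values as subsets of {t, f} and a fixed truth condition for classical negation (not-A is true iff A is not true), while the falsity condition is left entirely free; the precise conditions De and Omori work with may differ in detail.

```python
from itertools import product

# Belnap-Dunn values as subsets of {'t', 'f'}:
# T = {'t'}, F = {'f'}, B = {'t', 'f'}, N = {}
VALUES = [frozenset('t'), frozenset('f'), frozenset('tf'), frozenset()]

negations = []
# The truth condition is fixed; the falsity condition is a free
# choice per input value, giving 2**4 = 16 candidate operations.
for falsity in product([True, False], repeat=4):
    op = {}
    for v, make_false in zip(VALUES, falsity):
        out = set()
        if 't' not in v:       # truth condition (fixed): ¬A true iff A not true
            out.add('t')
        if make_false:         # falsity condition (free)
            out.add('f')
        op[v] = frozenset(out)
    negations.append(op)

print(len(negations))  # 16 candidate "classical negations"
```

Since the sixteen falsity tuples all differ, the sixteen resulting unary operations are pairwise distinct.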
From antiquity through the twentieth century, philosophers have hypothesized that, intuitively, it is harder to know negations than to know affirmations. This paper provides direct evidence for that hypothesis. In a series of studies, I found that people naturally view negations as harder to know than affirmations. Participants read simple scenarios and made judgments about truth, probability, belief, and knowledge. Participants were more likely to attribute knowledge of an outcome when it was framed affirmatively than when it was framed negatively. They did this even though the affirmative and negative framings were logically equivalent. The asymmetry was unique to knowledge attributions: it did not occur when participants rated truth, probability, or belief. These findings reveal new effects of negation on people's judgments and reasoning and can inform philosophical theorizing about the ordinary concept of knowledge.
In this article, I provide Urquhart-style semilattice semantics for three connexive logics in an implication-negation language (I call these “pure theories of connexive implication”). The systems semantically characterized include the implication-negation fragment of a connexive logic of Wansing, a relevant connexive logic recently developed proof-theoretically by Francez, and an intermediate system that is novel to this article. Simple proofs of soundness and completeness are given, and the semantics is used to establish various facts about the systems (e.g., that two of the systems have the variable-sharing property). I emphasize the intuitive content of the semantics and discuss how natural informational considerations underlie each of the examined systems.
Logical realism is a view about the metaphysical status of logic. Common to most, if not all, of the views captured by the label ‘logical realism’ is the claim that logical facts are mind- and language-independent. But that does not tell us anything about the nature of logical facts or about our epistemic access to them. The goal of this paper is to outline and systematize the different ways logical realism could be entertained and to examine some of the challenges these views face. It will be suggested that logical realism is best understood as a metaphysical view about the logical structure of the world, but this raises an important question: does logical realism collapse into standard metaphysical realism? It will be argued that this result can be accommodated, even if it cannot be altogether avoided.
In this article I consider some recent objections raised against the syntactic treatment of negation in English multiclausal structures, in particular what has been called NEG-raising. I argue that the objections based on pronominalisation and ellipsis presented in the recent literature do pose a problem for syntactic accounts of the mechanisms of so-called NOT-transportation that rely on a rule of leftwards movement, as is customary in generative grammar. However, there is an alternative syntactic treatment that assumes that negation originates as a higher predicate and is subject to a rule of lowering. I show that a syntactic theory of NOT-transportation is tenable and accounts for the problematic data if NEG-raising is replaced, in the analysis of the cases considered here, by a rule of NEG-lowering.
This is a commentary on MM McCabe's "First Chop your logos... Socrates and the sophists on language, logic, and development". In her paper MM analyses Plato's Euthydemos, in which Plato tackles the problem of falsity in a way that takes into account the speaker and complements the Sophist's discussion of what is said. The dialogue looks as if it is merely a demonstration of the silly consequences of eristic combat. And so it is. But a main point of MM's paper is that there is serious philosophy in the Euthydemos, too. MM argues that to counter the sophist brothers Euthydemos and Dionysodoros, Socrates points out that there are different aspects to the verb 'to say' that run in parallel to the different aspects of the verb 'to learn'. So just as there is continuity rather than ambiguity between 'to learn' and 'to understand', so there is continuity between the different aspects of saying. Thus Socrates puts forward a teleological account of both learning and meaning. Following up on some of MM's thoughts, I argue that the sophists subscribe, despite appearances, to a theory of meaning that respects serious and widely accepted philosophical theses on meaning. Forthcoming in the Australasian Philosophical Review, in a volume curated by Fiona Leigh, with Hugh Benson and Tim Clarke also on the committee, alongside commentaries by Nicholas Denyer, Russell E. Jones, and Ravi Sharma.
A general framework for translating various logical systems is presented, including a set of partial unary operators of affirmation and negation. Despite its usual reading, affirmation is not redundant in any domain of values, whenever it does not behave like a full mapping. After depicting the process of partial functions, a number of logics are translated through a variety of affirmations and a unique pair of negations. This relies upon two preconditions: a deconstruction of truth-values as ordered and structured objects, in contrast to their mainstream presentation as simple objects; and a redefinition of the Principle of Bivalence as a set of four independent properties, such that its definition does not equate with normality.
It is known that many relevant logics can be conservatively extended by the truth constant known as the Ackermann constant. It is also known that many relevant logics can be conservatively extended by Boolean negation. This essay, however, shows that a range of relevant logics with the Ackermann constant cannot be conservatively extended by a Boolean negation.
Many relevant logics can be conservatively extended by Boolean negation. Mares showed, however, that E is a notable exception. Mares' proof is by and large a rather involved model-theoretic one. This paper presents a much easier proof-theoretic proof which not only covers E but also generalizes to relevant logics with a primitive modal operator added. It is shown that logics ranging from even very weak relevant logics augmented by a weak K-ish modal operator up to the strong relevant logic R with an S5 modal operator all fail to be conservatively extended by Boolean negation. The proof, therefore, also covers Meyer and Mares' proof that NR—R with a primitive S4-modality added—also fails to be conservatively extended by Boolean negation.
Many relevant logics are conservatively extended by Boolean negation. Not all, however. This paper shows an acute form of non-conservativeness, namely that the Boolean-free fragment of the Boolean extension of a relevant logic need not always satisfy the variable-sharing property. Indeed, it is shown that such an extension can yield classical logic. For a vast range of relevant logics, however, it is shown that the variable-sharing property, restricted to the Boolean-free fragment, still holds for the Boolean-extended logic.
This paper defines provably non-trivial theories that characterize Frege’s notion of a set, taking into account that the notion is inconsistent. By choosing an adaptive underlying logic, consistent sets behave classically notwithstanding the presence of inconsistent sets. Some of the theories have a full-blown presumably consistent set theory T as a subtheory, provided T is indeed consistent. An unexpected feature is the presence of classical negation within the language.
Proof-theoretic methods are developed for subsystems of Johansson’s logic obtained by extending the positive fragment of intuitionistic logic with weak negations. These methods are exploited to establish properties of the logical systems. In particular, cut-free complete sequent calculi are introduced and used to provide a proof of the fact that the systems satisfy the Craig interpolation property. Alternative versions of the calculi are later obtained by means of an appropriate loop-checking history mechanism. Termination of the new calculi is proved, and used to conclude that the considered logical systems are PSPACE-complete.
Many think that expressivists have a special problem with negation. I disagree. For if there is a problem with negation, I argue, it is a problem shared by those who accept some plausible claims about the nature of intentionality. Whether there is any special problem for expressivists turns, I will argue, on whether facts about what truth-conditions beliefs have can explain facts about basic inferential relations among those beliefs. And I will suggest that the answer to this last question is, on most plausible attempts at solving the problem of intentionality, ‘no’.
The recent literature abounds with accounts of the semantics and pragmatics of so-called predicates of personal taste, i.e. predicates whose application is, in some sense or other, a subjective matter. Relativism and contextualism are the major types of theories. One crucial difference between these theories concerns how we should assess previous taste claims. Relativism predicts that we should assess them in the light of the taste standard governing the context of assessment. Contextualism predicts that we should assess them in the light of the taste standard governing the context of use. We show in a range of experiments that neither prediction is correct. People have no clear preferences either way and which taste standard they choose in evaluating a previous taste claim crucially depends on whether they start out with a favorable attitude towards the object in question and then come to have an unfavorable attitude or vice versa. We suggest an account of the data in terms of what we call hybrid relativism.
In the context of modal logics one standardly considers two modal operators: possibility (◇) and necessity (□) [see for example Chellas]. If classical negation is present, these operators can be treated as inter-definable. However, the negative modalities ¬◇ and ¬□ are also considered in the literature [see for example Béziau; Došen 1984; Gödel, Collected Works, vol. 1, 1986, p. 300; Lewis and Langford]. Both of them can be treated as negations. Béziau defined a logic on the basis of a modal logic and proposed it as a solution of so-called Jaśkowski’s problem [see also Jaśkowski]. The only negation considered in the language of this logic is ‘it is not necessary’. It turns out that this logic and the modal logic on which it is based are inter-definable. This initial correspondence result has been generalised to the case of normal logics; in particular, soundness-completeness results were obtained [see Marcos 2005; Mruczek-Nasieniewska and Nasieniewski 2005]. Mruczek-Nasieniewska and Nasieniewski proved that there is a correspondence between such logics and regular extensions of the smallest deontic logic. To obtain this result both negative modalities were used. This result was strengthened in Mruczek-Nasieniewska and Nasieniewski (2017), since on the basis of classical positive logic it is enough to use ‘it is not necessary’ alone to equivalently express both positive modalities and negation. Here we strengthen the results of Mruczek-Nasieniewska and Nasieniewski by showing the correspondence for the smallest regular logic. In particular, we give a syntactic formulation of a logic that corresponds to the smallest regular logic. As a result we characterise all logics that arise from regular logics, and via respective translations we obtain a characterisation of a class of logics corresponding to certain quasi-regular logics, of which the logic just mentioned is the smallest element. Moreover, if a given quasi-regular logic is characterised by some class of models, the same class can be used to semantically characterise the logic obtained by our translation.
Various philosophers have long been attracted to the doctrine that future contingent propositions systematically fail to be true—what is sometimes called the doctrine of the open future. However, open futurists have always struggled to articulate how their view interacts with standard principles of classical logic—most notably, with the Law of Excluded Middle (LEM). For consider the following two claims: Trump will be impeached tomorrow; Trump will not be impeached tomorrow. According to the kind of open futurist at issue, both of these claims may well fail to be true. According to many, however, the disjunction of these claims can be represented as p ∨ ~p—that is, as an instance of LEM. In this essay, I wish to defend the view that the disjunction of these claims cannot be represented as an instance of p ∨ ~p. And this is for the following reason: the latter claim is not, in fact, the strict negation of the former. More particularly, there is an important semantic distinction between the strict negation of the first claim [~Will p] and the latter claim [Will ~p]. However, the viability of this approach has been denied by Thomason, and more recently by MacFarlane and by Cariani and Santorio, the latter of whom call the denial of the given semantic distinction “scopelessness”. According to these authors, that is, will is “scopeless” with respect to negation; whereas there is perhaps a syntactic distinction between ‘~Will p’ and ‘Will ~p’, there is no corresponding semantic distinction. And if this is so, the approach in question fails. In this paper, then, I criticize the claim that will is “scopeless” with respect to negation. I argue that will is a so-called “neg-raising” predicate—and that, in this light, we can see that the requisite scope distinctions aren’t missing, but are simply being masked. The result: an under-appreciated solution to the problem of future contingents that sees ‘Will p’ and ‘Will ~p’ as contraries, not contradictories.
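The contrariety claim can be illustrated with a toy open-future model. The sketch below is a hypothetical illustration, not the paper's own formalism: it gives 'will' a settledness reading on which 'Will p' is true only if p holds on every history still open at the moment of evaluation.

```python
# Toy open-future model: two histories are still open, and the
# contingent event happens on only one of them.
histories = [
    {'impeached'},   # a history where the event occurs tomorrow
    set(),           # a history where it does not
]

def will(prop):
    """'Will p': p holds on every open history (settledness reading)."""
    return all(prop in h for h in histories)

def will_not(prop):
    """'Will ~p': p fails on every open history."""
    return all(prop not in h for h in histories)

# 'Will p' and 'Will ~p' are both false: contraries, not contradictories.
print(will('impeached'))       # False
print(will_not('impeached'))   # False
print(not will('impeached'))   # True: the strict negation ~Will p still holds
```

On this reading the scope distinction is plainly semantic: ~Will p is true in the model while Will ~p is false.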
In this article, I present a semantically natural conservative extension of Urquhart’s positive semilattice logic with a sort of constructive negation. A subscripted sequent calculus is given for this logic and proofs of its soundness and completeness are sketched. It is shown that the logic lacks the finite model property. I discuss certain questions Urquhart has raised concerning the decision problem for the positive semilattice logic in the context of this logic and pose some problems for further research.
We argue that, if taken seriously, Kripke's view that a language for science can dispense with a negation operator is to be rejected. Part of the argument is a proof that positive logic, i.e., classical propositional logic without negation, is not categorical.
We present an inferentialist account of the epistemic modal operator might. Our starting point is the bilateralist programme. A bilateralist explains the operator 'not' in terms of the speech act of rejection; we explain the operator might in terms of weak assertion, a speech act whose existence we argue for on the basis of linguistic evidence. We show that our account of might provides a solution to certain well-known puzzles about the semantics of modal vocabulary whilst retaining classical logic. This demonstrates that an inferentialist approach to meaning can be successfully extended beyond the core logical constants.
The problem of negative truth is the problem of how, if everything in the world is positive, we can speak truly about the world using negative propositions. A prominent solution is to explain negation in terms of a primitive notion of metaphysical incompatibility. I argue that if this account is correct, then minimal logic is the correct logic. The negation of a proposition A is characterised as the minimal incompatible of A composed of it and the logical constant ¬. A rule-based account of the meanings of logical constants that appeals to the notion of incompatibility in the introduction rule for negation ensures the existence and uniqueness of the negation of every proposition. But it endows the negation operator with no more formal properties than those it has in minimal logic.
This book argues that the meaning of negation, perhaps the most important logical constant, cannot be defined within the framework of the most comprehensive theory of proof-theoretic semantics, as formulated in the influential work of Michael Dummett and Dag Prawitz. Nils Kürbis examines three approaches that have attempted to solve the problem - defining negation in terms of metaphysical incompatibility; treating negation as an undefinable primitive; and defining negation in terms of a speech act of denial - and concludes that they cannot adequately do so. He argues that whereas proof-theoretic semantics usually only appeals to a notion of truth, it also needs to appeal to a notion of falsity, and proposes a system of natural deduction in which both are incorporated. Offering new perspectives on negation, denial and falsity, his book will be important for readers working on logic, metaphysics and the philosophy of language.
There is a relatively recent trend of treating negation as a modal operator. One reason is that doing so provides a uniform semantics for the negations of a wide variety of logics and arguably speaks to a longstanding challenge Quine put to non-classical logics. One might be tempted to draw the conclusion that negation is a modal operator, a claim Francesco Berto (2015, 761–793) defends at length in a recent paper. According to one such modal account, the negation of a sentence is true at a world x just in case all the worlds at which the sentence is true are incompatible with x. Incompatibility is taken to be the key notion in the account, and what minimal properties a negation has comes down to which minimal conditions incompatibility satisfies. Our aims in this paper are twofold. First, we wish to point out problems for the modal account that make us question its tenability on a fundamental level. Second, in its place we propose an alternative, non-modal, account of negation as a contradictory-forming operator that we argue is superior to, and more natural than, the modal account.
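The modal account described in this abstract admits a standard possible-worlds statement; the following rendering is a sketch in conventional notation (with ⊥ for the incompatibility relation), not the paper's own presentation:

```latex
\[
x \Vdash \neg A \quad \Longleftrightarrow \quad \forall y \, \bigl( y \Vdash A \;\rightarrow\; x \perp y \bigr)
\]
```

On this clause, the formal properties negation inherits are fixed by the conditions imposed on ⊥ (e.g. symmetry), which is precisely the point at issue in the paper.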
This paper is intended to show that, at least in a considerably wide class of cases, indicative conditionals are adequately formalized as strict conditionals. The first part of the paper outlines three arguments that support the strict conditional view, that is, three reasons for thinking that an indicative conditional is true just in case it is impossible that its antecedent is true and its consequent is false. The second part of the paper develops the strict conditional view and defends it from some foreseeable objections.
In this paper, we will motivate the application of specific rules of inference from the propositional calculus to natural language sentences. Specifically, we will analyse De Morgan’s laws, which pertain to the interaction of two central topics in syntactic research: negation and coordination. We will argue that the applicability of De Morgan’s laws to natural language structures can be derived from independently motivated operations of grammar and principles restricting the application of these operations. This has direct empirical consequences for the hypothesised relations between natural language and logic.
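On the propositional-calculus side, De Morgan's laws can be checked mechanically by enumerating truth values; the following is a generic illustration of the laws themselves, not the paper's own formal apparatus:

```python
from itertools import product

def equivalent(f, g):
    """Check that two binary Boolean formulas agree on every valuation."""
    return all(f(p, q) == g(p, q) for p, q in product([False, True], repeat=2))

# De Morgan's laws: not (p and q) == (not p) or (not q)
#                   not (p or q)  == (not p) and (not q)
law1 = equivalent(lambda p, q: not (p and q), lambda p, q: (not p) or (not q))
law2 = equivalent(lambda p, q: not (p or q), lambda p, q: (not p) and (not q))

print(law1, law2)  # True True
```

Whether natural-language "not" and "and/or" license the same equivalences is exactly the empirical question the paper addresses.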
Molnar argues that the problem of truthmakers for negative truths arises because we tend to accept four metaphysical principles that entail that all negative truths have positive truthmakers. This conclusion, however, already follows from only three of Molnar's metaphysical principles. One purpose of this note is to set the record straight. I provide an alternative reading of two of Molnar's principles on which they are all needed to derive the desired conclusion. Furthermore, according to Molnar, the four principles may be inconsistent. By themselves, however, they are not. The other purpose of this note is to propose some plausible further principles that, when added to the four metaphysical theses, entail a contradiction.
This paper considers whether incompatibilism, the view that negation is to be explained in terms of a primitive notion of incompatibility, and Fregeanism, the view that arithmetical truths are analytic according to Frege’s definition of that term in §3 of Foundations of Arithmetic, can both be upheld simultaneously. Both views are attractive in their own right, in particular for a certain empiricist mind-set. They promise to account for two philosophically puzzling phenomena: the problem of negative truth and the problem of epistemic access to numbers. For an incompatibilist, proofs of numerical non-identities must appeal to primitive incompatibilities. I argue that no analytic primitive incompatibilities are forthcoming. Hence incompatibilists cannot be Fregeans.
Semantic paradoxes like the liar are notorious challenges to truth theories. A paradox can be phrased with minimal resources and minimal assumptions. It is not surprising, then, that the liar is also a challenge to minimalism about truth. Horwich (1990) deals swiftly with the paradox, after discriminating among strategies for avoiding it without compromising minimalism. He dismisses the denial of classical logic, the denial that the concept of truth can coherently be applied to propositions, and the denial that the liar sentence expresses a proposition, but he endorses the denial that the liar is an acceptable instance of the equivalence schema (E). This paper has two main parts. It first shows that Horwich’s preferred denial is also problematic. As Simmons (1999), Beall and Armour-Garb (2003), and Asay (2015) argued, the solution is ad hoc, faces a possible loss of expressibility, and is ultimately unstable. Second, the paper explores a different combination of possibilities for minimalism: treating the truth-predicate as context-dependent, rejecting the notion that the liar expresses a proposition, and reinterpreting negation in some contexts as metalinguistic denial. The paper argues that these are preferable options, but signposts possible dangers ahead.
Routley-Meyer Ternary Relational Semantics for Intuitionistic-type Negations examines how to introduce intuitionistic-type negations into RM-semantics. RM-semantics is highly malleable and capable of modeling families of logics which are very different from each other. This semantics was introduced in the early 1970s, and was devised for interpreting relevance logics. In RM-semantics, negation is interpreted by means of the Routley operator, which has been almost exclusively used for modeling De Morgan negations. This book provides research on particular features of intuitionistic-type negations in RM-semantics, while also defining the basic systems and many of their extensions by using models with or without a set of designated points.
Bilateralism is a theory of meaning according to which assertion and denial are independent speech acts. Bilateralism also proposes two coordination principles for assertion and denial. I argue that if assertion and denial are independent speech acts, they cannot be coordinated by the bilateralist principles.
Linguistic evidence supports the claim that certain, weak rejections are less specific than assertions. On the basis of this evidence, it has been argued that rejected sentences cannot be premisses and conclusions in inferences. We give examples of inferences with weakly rejected sentences as premisses and conclusions. We then propose a logic of weak rejection which accounts for the relevant phenomena and is motivated by principles of coherence in dialogue. We give a semantics for which this logic is sound and complete, show that it axiomatizes the modal logic KD45 and prove that it still derives classical logic on its asserted fragment. Finally, we defend previous logics of strong rejection as being about the linguistically preferred interpretations of weak rejections.
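For reference, KD45 is the normal modal logic obtained by adding the axioms D, 4, and 5 to the basic system K; the schemata below are standard textbook material, not the paper's own axiomatization:

```latex
\begin{align*}
\mathrm{K} &: \Box(p \rightarrow q) \rightarrow (\Box p \rightarrow \Box q)\\
\mathrm{D} &: \Box p \rightarrow \Diamond p\\
\mathrm{4} &: \Box p \rightarrow \Box\Box p\\
\mathrm{5} &: \Diamond p \rightarrow \Box\Diamond p
\end{align*}
```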
There is widespread agreement that while on a Dummettian theory of meaning the justified logic is intuitionist, as its constants are governed by harmonious rules of inference, the situation is reversed on Huw Price's bilateralist account, where meanings are specified in terms of the primitive speech acts of assertion and denial. In bilateral logics, the rules for classical negation are in harmony. However, as it is possible to construct an intuitionist bilateral logic with harmonious rules, there is no formal argument against intuitionism from the bilateralist perspective. Price gives an informal argument for classical negation based on a pragmatic notion of belief, characterised in terms of the differences beliefs make to speakers' actions. The main part of this paper puts Price's argument under close scrutiny by regimenting it and isolating principles Price is committed to. It is shown that Price should draw a distinction between A making a difference and ¬A making a difference. According to Price, if A makes a difference to us, we treat it as decidable. This material allows the intuitionist to block Price's argument. Abandoning classical logic also brings advantages, as within intuitionist logic there is a precise meaning to what it might mean to treat A as decidable: it is to assume A ∨ ¬A.
This short paper has two loosely connected parts. In the first part, I discuss the difference between classical and intuitionist logic in relation to the different roles hypotheses play in each logic. Harmony is normally understood as a relation between two ways of manipulating formulas in systems of natural deduction: their introduction and elimination. I argue, however, that there is at least a third way of manipulating formulas, namely the discharge of assumptions, and that the difference between classical and intuitionist logic can be characterised as a difference in the conditions under which discharge is allowed. Harmony, as ordinarily understood, has nothing to say about discharge. This raises the question whether the notion of harmony can be suitably extended. This requires there to be a suitable fourth way of manipulating formulas that discharge can stand in harmony to. The question is whether there is such a notion: what might it be that stands to discharge of formulas as introduction stands to elimination? One that immediately comes to mind is the making of assumptions. I leave it as an open question for further research whether the notion of harmony can be fruitfully extended in the way suggested here. In the second part, I discuss bilateralism, which proposes a wholesale revision of what it is that is assumed and manipulated by rules of inference in deductions: rules apply to speech acts – assertions and denials – rather than propositions. I point out two problems for bilateralism. First, bilateralists cannot, contrary to what they claim to be able to do, draw a distinction between the truth and assertibility of a proposition. Secondly, it is not clear what it means to assume an expression such as '+ A' that is supposed to stand for an assertion. Worse than that, it is plausible that making an assumption is a particular speech act, as argued by Dummett (Frege: Philosophy of Language, p.309ff).
Bilateralists accept that speech acts cannot be embedded in other speech acts. But then it is meaningless to assume + A or − A.