In this paper we investigate a semantics for first-order logic originally proposed by R. van Rooij to account for the idea that vague predicates are tolerant, that is, for the principle that if x is P, then y should be P whenever y is similar enough to x. The semantics, which makes use of indifference relations to model similarity, rests on the interaction of three notions of truth: the classical notion, and two dual notions simultaneously defined in terms of it, which we call tolerant truth and strict truth. We characterize the space of consequence relations definable in terms of those and discuss the kind of solution this gives to the sorites paradox. We discuss some applications of the framework to the pragmatics and psycholinguistics of vague predicates, in particular regarding judgments about borderline cases.
This paper presents and defends a way to add a transparent truth predicate to classical logic, such that A and T⟨A⟩ are everywhere intersubstitutable, where all T-biconditionals hold, and where truth can be made compositional. A key feature of our framework, called STTT (for Strict-Tolerant Transparent Truth), is that it supports a non-transitive relation of consequence. At the same time, it can be seen that the only failures of transitivity STTT allows for arise in paradoxical cases.
The appropriateness, or acceptability, of a conditional does not just ‘go with’ the corresponding conditional probability. A condition of dependence is required as well. In this paper a particular notion of dependence is proposed. It is shown that under both a forward causal and a backward evidential reading of the conditional, this appropriateness condition reduces to conditional probability under some natural circumstances. Because this is in particular the case for the so-called diagnostic reading of the conditional, this analysis might help to explain some of Douven and Verbrugge’s empirical observations.
Cimpian et al. observed that we accept generic statements of the form ‘Gs are f’ on relatively weak evidence, but that if we are unfamiliar with group G and we learn a generic statement about it, we still treat it inferentially in a much stronger way: all Gs are f. This paper makes use of notions like ‘representativeness’, ‘contingency’ and ‘relative difference’ from psychology to provide a uniform semantics of generics that explains why people accept generics based on weak evidence. The spirit of the approach has much in common with Leslie’s cognition-based ideas about generics, but the semantics will be grounded on a strengthening of Cohen’s relative readings of generic sentences. In contrast to Leslie and Cohen, we propose a uniform semantic analysis of generics. The basic intuition is that a generic of the form ‘Gs are f’ is true because f is typical for G, which means that f is valuably associated with G. We will make use of Kahneman and Tversky’s Heuristics and Biases approach, according to which people tend to confuse questions about probability with questions about representativeness, to explain pragmatically why people treat many generic statements inferentially in a much stronger way.
In this paper an approach to the exhaustive interpretation of answers is developed. It builds on a proposal brought forward by Groenendijk and Stokhof (1984). We will use the close connection between their approach and McCarthy's (1980, 1986) predicate circumscription and describe exhaustive interpretation as an instance of interpretation in minimal models, well-known from work on counterfactuals (see for instance Lewis (1973)). It is shown that by combining this approach with independent developments in semantics/pragmatics one can overcome certain limitations of Groenendijk and Stokhof's (1984) proposal. In the last part of the paper we will provide a Gricean motivation for exhaustive interpretation building on work of Schulz (to appear) and van Rooij and Schulz (2004).
In terms of Groenendijk and Stokhof's (1984) formalization of exhaustive interpretation, many conversational implicatures can be accounted for. In this paper we justify and generalize this approach. Our justification proceeds by relating their account, via Halpern and Moses's (1984) non-monotonic theory of only knowing, to the Gricean maxim of Quality and the first sub-maxim of Quantity. The approach of Groenendijk and Stokhof (1984) is generalized such that it can also account for implicatures that are triggered in subclauses not entailed by the whole complex sentence.
Recent experiments have shown that naive speakers find borderline contradictions involving vague predicates acceptable. In Cobreros et al. we proposed a pragmatic explanation of the acceptability of borderline contradictions, building on a three-valued semantics. In a reply, Alxatib et al. show, however, that the pragmatic account predicts the wrong interpretations for some examples involving disjunction, and propose as a remedy a semantic analysis instead, based on fuzzy logic. In this paper we provide an explicit global pragmatic interpretation rule, based on a somewhat richer semantics, and show that with its help the problem can be overcome in pragmatics after all. Furthermore, we use this pragmatic interpretation rule to define a new consequence relation and discuss some of its properties.
To determine what the speaker in a cooperative dialog meant by his assertion, on top of what he explicitly said, it is crucial that we assume that the assertion he gave was optimal. In determining optimal assertions we assume that dialogs are embedded in decision problems (van Rooij 2003) and use backward induction for calculating them (Benz 2006). In this paper, we show that in terms of our framework we can account for several types of implicatures in a uniform way, suggesting that there is no need for an independent linguistic theory of generalized implicatures. In the final section, we show how we can embed our theory in the framework of signaling games, and how it relates to other game-theoretic analyses of implicatures.
In a recent paper, Barrio, Tajer and Rosenblatt establish a correspondence between metainferences holding in the strict-tolerant logic of transparent truth ST+ and inferences holding in the logic of paradox LP+. They argue that LP+ is ST+’s external logic and they question whether ST+’s solution to the semantic paradoxes is fundamentally different from LP+’s. Here we establish that by parity of reasoning, ST+ can be related to LP+’s dual logic K3+. We clarify the distinction between internal and external logic and argue that while ST+’s nonclassicality can be granted, its self-dual character does not tie it to LP+ more closely than to K3+.
The felicity, or acceptability, of IS generics, i.e. generic sentences with indefinite singulars, is considerably more restricted than that of BP generics, generics with bare plurals. The goal of this paper is to account for the limited felicity of IS generics compared to BP generics, on the one hand, while preserving the close similarity between the two types of generics, on the other. We do so by proposing a causal analysis of IS generics, and by showing that this corresponds closely with a probabilistic analysis of BP generics.
In this article we discuss the notion of a linguistic universal, and possible sources of such invariant properties of natural languages. In the first part, we explore the conceptual issues that arise. In the second part of the paper, we focus on the explanatory potential of horizontal evolution. We particularly focus on two case studies, concerning Zipf's Law and universal properties of color terms, respectively. We show how computer simulations can be employed to study the large-scale, emergent consequences of psychologically motivated assumptions about the workings of horizontal language transmission.
Arguments based on Leibniz's Law seem to show that there is no room for either indefinite or contingent identity. The arguments seem to prove too much, but their conclusion is hard to resist if we want to keep Leibniz's Law. We present a novel approach to this issue, based on an appropriate modification of the notion of logical consequence.
Objects have dispositions. Dispositions are normally analyzed by providing a meaning to disposition ascriptions like ‘This piece of salt is soluble’. Philosophers like Carnap, Goodman, Quine, Lewis and many others have proposed analyses of such disposition ascriptions. In this paper we will argue with Quine that the proper analysis of ascriptions of the form ‘x is disposed to m when C’, where ‘x’ denotes an object, ‘m’ a manifestation, and ‘C’ a condition, goes like this: ‘x is of natural kind k’, and the generic ‘ks are m’ is true. For the analysis of the generic, we propose an analysis in terms of causal powers: ‘ks have the causal power to m’. The latter, in turn, is analyzed in a very precise way, making use of Pearl’s probabilistic graphical causal models. We will show how this natural kind-analysis improves on standard conditional analyses of dispositions by avoiding the standard counterexamples, and that it gives rise to precise observable criteria under which the disposition ascription is true.
In this paper I will give a modal two-dimensional analysis of presupposition and modal subordination. I will think of presupposition as a non-veridical propositional attitude. This allows me to evaluate what is presupposed and what is asserted at different dimensions without getting into the binding problem. What is presupposed will be represented by an accessibility relation between possible worlds. The major part of the paper consists of a proposal to account for the dependence of the interpretation of modal expressions, i.e. modal subordination, in terms of an accessibility relation as well. Moreover, I show how such an analysis can be extended from the propositional to the predicate logical level.
We give derivations of two formal models of Gricean Quantity implicature and strong exhaustivity in bidirectional optimality theory and in a signalling games framework. We show that, under a unifying model based on signalling games, these interpretative strategies are game-theoretic equilibria when the speaker is known to be respectively minimally and maximally expert in the matter at hand. That is, in this framework the optimal strategy for communication depends on the degree of knowledge the speaker is known to have concerning the question she is answering. In addition, and most importantly, we give a game-theoretic characterisation of the interpretation rule Grice (formalising Quantity implicature), showing that under natural conditions this interpretation rule occurs in the unique equilibrium play of the signalling game.
A much discussed topic in the theory of choice is how a preference order among options can be derived from the assumption that the notion of ‘choice’ is primitive. Assuming a choice function that selects elements from each finite set of options, Arrow (Economica 26: 121–127, 1959) already showed how we can generate a weak ordering by putting constraints on the behavior of such a function such that it reflects utility maximization. Arrow proposed that rational agents can be modeled by such choice functions. Arrow's standard model of rationality has been criticized in economics and gave rise to approaches of bounded rationality. Two standard assumptions of rationality will be given up in this paper: first, the idea that agents are utility optimizers (Simon); second, the idea that the relation of ‘indifference’ gives rise to an equivalence relation. To account for the latter, Luce (Econometrica 24: 178–191, 1956) introduced semi-orders. Extending some ideas of Van Benthem (Pac Philos Q 63: 193–203, 1982), we will show how to derive semi-orders (and so-called interval orders) based on the idea that agents are utility satisficers rather than utility optimizers.
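The threshold idea behind semi-orders can be sketched in a few lines of code. This is our own illustration of Luce's classic construction, not the paper's derivation; the utility table and the threshold value are hypothetical: strict preference holds only when the utility difference exceeds a discrimination threshold, and the resulting indifference relation then fails to be transitive.

```python
# Luce-style semi-order from a utility function u and a fixed
# discrimination threshold eps:  x P y  iff  u(x) > u(y) + eps.
# (Illustrative sketch; names and values are hypothetical.)

EPS = 1.0  # hypothetical just-noticeable difference

def prefers(u, x, y, eps=EPS):
    """Strict preference: x is noticeably better than y."""
    return u[x] > u[y] + eps

def indifferent(u, x, y, eps=EPS):
    """Indifference: neither option is noticeably better."""
    return not prefers(u, x, y, eps) and not prefers(u, y, x, eps)

u = {'a': 0.0, 'b': 0.6, 'c': 1.2}  # toy utilities
# a ~ b and b ~ c (differences below eps), yet c is preferred to a:
print(indifferent(u, 'a', 'b'))  # → True
print(indifferent(u, 'b', 'c'))  # → True
print(prefers(u, 'c', 'a'))      # → True
```

The last three lines show why indifference is not an equivalence relation here: it is reflexive and symmetric but not transitive, which is exactly the feature semi-orders are designed to capture.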
The principle of tolerance characteristic of vague predicates is sometimes presented as a soft rule, namely as a default which we can use in ordinary reasoning, but which requires care in order to avoid paradoxes. We focus on two ways in which the tolerance principle can be modeled in that spirit, using special consequence relations. The first approach relates tolerant reasoning to nontransitive reasoning; the second relates tolerant reasoning to nonmonotonic reasoning. We compare the two approaches and examine three specific consequence relations in relation to those, which we call: strict-to-tolerant entailment, pragmatic-to-tolerant entailment, and pragmatic-to-pragmatic entailment. The first two are nontransitive, whereas the latter two are nonmonotonic.
This paper combines a survey of existing literature in game-theoretic pragmatics with new models that fill some voids in that literature. We start with an overview of signaling games with a conflict of interest between sender and receiver, and show that the literature on such games can be classified into models with direct, costly, noisy and imprecise signals. We then argue that this same subdivision can be used to classify signaling games with common interests, where we fill some voids in the literature. For each of the signaling games treated, we show how equilibrium-refinement arguments and evolutionary arguments can be interpreted in the light of pragmatic inference.
In this paper we seek to account for scalar implicatures and Horn's division of pragmatic labor in game-theoretical terms by making use mainly of refinements of the standard solution concept of signaling games. Scalar implicatures are accounted for in terms of Farrell's (1993) notion of a ‘neologism-proof’ equilibrium together with Grice's maxim of Quality. Horn's division of pragmatic labor is accounted for in terms of Cho and Kreps' (1987) notion of ‘equilibrium domination’ and their ‘Intuitive Criterion’.
Many generic sentences express stable inductive generalizations. Stable inductive generalizations are typically true for a causal reason. In this paper we investigate to what extent this is also the case for the generalizations expressed by generic sentences. More in particular, we discuss the possibility that many generic sentences of the form ‘ks have feature e’ are true because kind k has the causal power to ‘produce’ feature e. We will argue that such an analysis is quite close to a probabilistically based analysis of generic sentences according to which ‘relatively many’ ks have feature e, and that, in fact, this latter type of analysis can be ‘grounded’ in terms of causal powers. We will argue, moreover, that the causal power analysis is sometimes preferable to a correlation-based analysis, because it takes into account the causal structure that gives rise to the probabilistic data.
This volume is a collection of papers presented at the colloquium, and it testifies to the growing importance of game theory as a tool that can capture concepts ...
One of the traditional pragmatic approaches to vagueness suggests that there needs to be a significant gap between individuals or objects that can be described using a vague adjective like tall and those that cannot. In contrast, intuitively, an explicit comparative like taller does not require fulfillment of the gap requirement. Our starting point for this paper is the consideration that people cannot make precise measures under time pressure and their ability to discriminate approximate heights obeys Weber’s law. We formulate and experimentally test three hypotheses relating to the difference between positive and comparative forms of the vague adjectives, the gap requirement, and Weber’s law. In two experiments, participants judged appropriateness of usage of positive and comparative forms of vague adjectives in a sentence-picture verification task. Consequently, we review formal analyses of vagueness using weak orders and semi-orders and suggest adjustments based on the experimental results and properties of Weber’s law.
The principle of stability now says that if a sentence ϕ is true/false in a model M, then ϕ has to stay true/false if M is made more precise. Formally, let M′ = ⟨D′, I′⟩ be a refinement of M = ⟨D, I⟩. Then it has to be the case that for all ϕ: (i) if VM(ϕ) = 1, then VM′(ϕ) = 1; (ii) if VM(ϕ) = 0, then VM′(ϕ) = 0.
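For atomic sentences, the stability principle can be illustrated with a small sketch. This is our own hypothetical encoding of partial models as dictionaries mapping atoms to 1, 0, or None ("unsettled"), not the text's formal definition of ⟨D, I⟩ models:

```python
# Stability for atoms under refinement (illustrative sketch):
# M' refines M iff M' settles everything M settles, in the same way,
# which is exactly what stability demands at the atomic level.

def is_refinement(M_prime, M):
    """True iff M_prime preserves every settled truth value of M."""
    return all(M_prime.get(p) == v for p, v in M.items() if v is not None)

M  = {'Tall(ann)': 1, 'Tall(bob)': None}  # bob is a borderline case
M1 = {'Tall(ann)': 1, 'Tall(bob)': 0}     # settles bob, keeps ann: stable
M2 = {'Tall(ann)': 0, 'Tall(bob)': 0}     # flips ann: violates stability

print(is_refinement(M1, M))  # → True
print(is_refinement(M2, M))  # → False
```

The point of the sketch is the direction of the principle: a refinement may settle what was unsettled, but may never revise what was already true or false.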
In this paper, Universal any and Negative Polarity Item any are uniformly analyzed as ‘counterfactual’ donkey sentences (in disguise). Their difference in meaning is reduced here to the distinction between strong and weak readings of donkey sentences. It is shown that this explains the universal and existential character of Universal- and NPI-any, respectively, and the positive and negative contexts in which they are licensed. Our uniform analysis extends to the use of any in command and permission sentences. It predicts that whereas the use of any in permission sentences is licensed and gives rise to a universal reading, it is not licensed in command sentences.
We investigate to what extent it is possible to determine a reasonable default pragmatic value of complex sentences in a compositional manner and, when combined with a Boolean semantics, to see under which conditions it gives rise to reasonable predictions. We discuss several notions of pragmatic value, or relevance, and compare their behavior over complex sentences. Although the goal-oriented notions of relevance give rise to the same ordering relations between propositions, the conditions under which they behave ‘compositionally’ vary significantly.
According to Adams, the acceptability of an indicative conditional goes with the conditional probability of the consequent given the antecedent. However, some conditionals seem to be inappropriate, although their corresponding conditional probability is high. These are cases with a missing link between antecedent and consequent. Other conditionals are appropriate even though the conditional probability is low. Finally, we have the so-called biscuit conditionals. In this paper we will generalize analyses of Douven and others to account for the appropriateness of conditionals in terms of evidential support. Our generalization involves making use of Value, or intensity. We will show how this generalization helps to account for biscuit conditionals and conditional threats and promises. Finally, a link is established between this analysis of conditionals and an analysis of generic sentences.
Definition 1. A strict partial order is a structure ⟨X, P⟩, with P a binary relation on X that is irreflexive (IR) and transitive (TR): (IR) ∀x: ¬P(x, x). (TR) ∀x, y, z: (P(x, y) ∧ P(y, z)) → P(x, z).
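On finite structures, Definition 1 can be checked mechanically. The following brute-force test is an illustrative helper of our own, not part of the text; it represents P as a set of ordered pairs over the domain X:

```python
from itertools import product

# Check the two defining conditions of a strict partial order:
# (IR) no pair (x, x) is in P; (TR) whenever (x, y) and (y, z) are
# in P, so is (x, z).  (Illustrative helper, brute force.)

def is_strict_partial_order(X, P):
    irreflexive = all((x, x) not in P for x in X)
    transitive = all((x, z) in P
                     for (x, y), (y2, z) in product(P, P) if y == y2)
    return irreflexive and transitive

X = {1, 2, 3}
print(is_strict_partial_order(X, {(1, 2), (2, 3), (1, 3)}))  # the usual <; → True
print(is_strict_partial_order(X, {(1, 2), (2, 3)}))          # missing (1, 3); → False
```

Dropping a single pair required by transitivity, as in the second call, is enough to falsify (TR), while any pair of the form (x, x) falsifies (IR).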