The paper first introduces a cube of opposition that associates the traditional square of opposition with the dual square obtained by Piaget’s reciprocation. It is then pointed out that Blanché’s extension of the square-of-opposition structure into a conceptual hexagonal structure always relies on an abstract tripartition. Considering quadripartitions leads to organizing the 16 binary connectives into a regular tetrahedron. Lastly, the cube of opposition, once interpreted in modal terms, is shown to account for a recent generalization of formal concept analysis, where noticeable hexagons are also laid bare. This generalization of formal concept analysis is motivated by a parallel with bipolar possibility theory. The latter, albeit graded, is indeed based on four graded set functions that can be organized in a similar structure.
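For readers unfamiliar with the underlying structure, the traditional square of opposition can be recalled in its usual quantified form (a standard textbook rendering, not taken from the paper; the existential-import assumption is made so that subalternation holds):
\[
\begin{array}{ll}
\mathbf{A}: \forall x\,(S(x) \rightarrow P(x)) & \mathbf{E}: \forall x\,(S(x) \rightarrow \neg P(x)) \\
\mathbf{I}: \exists x\,(S(x) \wedge P(x)) & \mathbf{O}: \exists x\,(S(x) \wedge \neg P(x))
\end{array}
\]
A and O (respectively E and I) are contradictories, A and E are contraries, I and O are subcontraries, and A entails I (respectively E entails O) by subalternation.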
This paper presents and discusses several methods for reasoning from inconsistent knowledge bases. A so-called argued consequence relation, taking into account the existence of consistent arguments in favour of a conclusion and the absence of consistent arguments in favour of its contrary, is particularly investigated. Flat knowledge bases, i.e., bases without any priority between their elements, are studied under different inconsistency-tolerant consequence relations, namely the so-called argumentative, free, universal, existential, cardinality-based, and paraconsistent consequence relations. The syntax-sensitivity of these consequence relations is studied. A companion paper is devoted to the case where priorities exist between the pieces of information in the knowledge base.
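As an illustration of the argued consequence relation described above, here is a minimal brute-force sketch over a toy propositional language; formulas are encoded as Python predicates over truth assignments, and all function names are ours rather than the paper's:

    from itertools import product, combinations

    def models(formulas, atoms):
        # All truth assignments (as dicts) satisfying every formula in the list.
        out = []
        for vals in product([True, False], repeat=len(atoms)):
            m = dict(zip(atoms, vals))
            if all(f(m) for f in formulas):
                out.append(m)
        return out

    def consistent(subset, atoms):
        return bool(models(list(subset), atoms))

    def entails(subset, phi, atoms):
        return all(phi(m) for m in models(list(subset), atoms))

    def argued_consequence(base, phi, atoms):
        # phi is argued iff some consistent subset of the base entails it,
        # and no consistent subset entails its negation.
        subsets = [s for r in range(1, len(base) + 1)
                   for s in combinations(base, r)
                   if consistent(s, atoms)]
        pro = any(entails(s, phi, atoms) for s in subsets)
        con = any(entails(s, lambda m: not phi(m), atoms) for s in subsets)
        return pro and not con

    # Inconsistent flat base {p, not p, q}: q is argued, p is not.
    atoms = ["p", "q"]
    base = [lambda m: m["p"], lambda m: not m["p"], lambda m: m["q"]]
    print(argued_consequence(base, lambda m: m["q"], atoms))  # True
    print(argued_consequence(base, lambda m: m["p"], atoms))  # False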
This paper first investigates logical characterizations of different structures of opposition that extend the square of opposition in one way or another. Blanché’s hexagon of opposition is based on three disjoint sets. There are at least two meaningful cubes of opposition, proposed respectively by two of the authors and by Moretti, and pioneered by philosophers such as J. N. Keynes and W. E. Johnson for the former, and H. Reichenbach for the latter. These cubes exhibit four and six squares of opposition respectively. We clarify the differences between these two cubes and discuss their gradual extensions, as well as that of the hexagon when vertices are no longer two-valued. The second part of the paper is dedicated to the use of these structures of opposition for discussing the comparison of two items. Comparing two items usually involves a set of relevant attributes whose values are compared, and may be expressed in terms of different modalities such as identity, similarity, difference, opposition, and analogy. Recently, J.-Y. Béziau has proposed an “analogical hexagon” that organizes the relations linking these modalities. Elementary comparisons may be a matter of degree, and attributes may not have the same importance. The paper studies in which ways the structure of the hexagon may be preserved in such gradual extensions. As another illustration of the graded hexagon, we start with the hexagon of equality and inequality due to R. Blanché and extend it with fuzzy equality and fuzzy inequality. Besides, the cube induced by a tetra-partition can account for the comparison of two items in terms of preference, reversed preference, indifference, and non-comparability, even if these notions are a matter of degree. The other cube, which organizes the relations between the different weighted qualitative aggregation modes, is more relevant for the attribute-based comparison of items in terms of similarity.
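As a reminder of the tripartition-based construction mentioned above, Blanché’s hexagon can be sketched as follows (a standard presentation, with notation of our own choosing): given three mutually exclusive and exhaustive alternatives \(u\), \(v\), \(w\), the six vertices are
\[
u,\; v,\; w \quad (\text{pairwise contraries}), \qquad u \vee v,\; u \vee w,\; v \vee w \quad (\text{pairwise subcontraries}),
\]
with each alternative contradictory to the disjunction of the other two (e.g., \(u\) and \(v \vee w\)).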
The starting point of this work is the gap between two distinct traditions in information engineering: knowledge representation and data-driven modelling. The first tradition emphasizes logic as a tool for representing beliefs held by an agent. The second tradition claims that the main source of knowledge is observed data, and generally does not use logic as a modelling tool. However, the emergence of fuzzy logic has blurred the boundaries between these two traditions by putting forward fuzzy rules as a Janus-faced tool that may represent knowledge as well as approximate non-linear functions representing data. This paper lays bare the logical foundations of data-driven reasoning, whereby a set of formulas is understood as a set of observed facts rather than a set of beliefs. Several representation frameworks are considered from this point of view: classical logic, possibility theory, belief functions, epistemic logic, and fuzzy rule-based systems. Mamdani's fuzzy rules are recovered as belonging to the data-driven view. In possibility theory, a third set-function, different from possibility and necessity, plays a key role in the data-driven view and corresponds to a particular modality in epistemic logic. A bi-modal logic system is presented which handles both beliefs and observations, and for which a completeness theorem is given. Lastly, our results may shed new light on deontic logic and allow for a distinction between explicit and implicit permission that standard deontic modal logics do not often emphasize.
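The "third set-function" alluded to above is presumably the guaranteed possibility measure of possibility theory. For a possibility distribution \(\pi\) on a set of states \(\Omega\), the three functions read (standard definitions, recalled here for convenience):
\[
\Pi(A) = \max_{\omega \in A} \pi(\omega), \qquad N(A) = 1 - \Pi(\overline{A}) = \min_{\omega \notin A} \bigl(1 - \pi(\omega)\bigr), \qquad \Delta(A) = \min_{\omega \in A} \pi(\omega).
\]
While \(N(A)\) expresses the certainty of \(A\) under a belief-oriented reading, \(\Delta(A)\) expresses that every state where \(A\) holds is guaranteed to be possible, which matches the data-driven reading of a set of observed facts.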
Given a 4-tuple of Boolean variables (a, b, c, d), logical proportions are modeled by a pair of equivalences relating similarity indicators (\(a \wedge b\) and \(\overline{a} \wedge \overline{b}\)) or dissimilarity indicators (\(a \wedge \overline{b}\) and \(\overline{a} \wedge b\)) pertaining to the pair (a, b) to the ones associated with the pair (c, d). There are 120 semantically distinct logical proportions. One of them models the analogical proportion, which corresponds to a statement of the form “a is to b as c is to d”. The paper inventories the whole set of logical proportions by dividing it into five subfamilies according to what they express, and then identifies the proportions that satisfy noticeable properties such as full identity (the pair of equivalences defining the proportion holds true for the 4-tuple (a, a, a, a)), symmetry (if the proportion holds for (a, b, c, d), it also holds for (c, d, a, b)), or code independency (if the proportion holds for (a, b, c, d), it also holds for their negations \((\overline{a}, \overline{b}, \overline{c}, \overline{d})\)). It appears that only four proportions (including the analogical proportion) are homogeneous in the sense that they use only one type of indicator (either similarity or dissimilarity) in their definition. Due to their specific patterns, they have a particular cognitive appeal, and as such are studied in greater detail. Finally, the paper provides a discussion of the other existing works on analogical proportions.
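A minimal sketch of the analogical proportion under its usual definition as a pair of equivalences on the dissimilarity indicators, enumerating the Boolean 4-tuples that make it true (the function name is ours):

    from itertools import product

    def analogy(a, b, c, d):
        # "a is to b as c is to d": the dissimilarity indicators of (a, b)
        # coincide with those of (c, d).
        return (bool(a and not b) == bool(c and not d)) and \
               (bool(not a and b) == bool(not c and d))

    valid = [t for t in product([0, 1], repeat=4) if analogy(*t)]
    print(valid)  # the 6 valid patterns: 0000, 1111, 0011, 1100, 0101, 1010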
This paper investigates simple syntactic methods for revising prioritized belief bases that are semantically meaningful in the frameworks of possibility theory and of Spohn's ordinal conditional functions. Here, revising a prioritized belief base amounts to conditioning a distribution function on interpretations. The input information leading to the revision of a knowledge base can be sure or uncertain. Different types of scales for priorities are allowed: finite vs. infinite, numerical vs. ordinal. Syntactic revision is envisaged here as a process which transforms a prioritized belief base into a new prioritized belief base, and thus allows a subsequent iteration.
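For the sure-input, ordinal (min-based) case, the conditioning alluded to above typically takes the following form (a standard possibilistic conditioning rule, recalled here as background rather than quoted from the paper): for an input \(A\) with \(\Pi(A) > 0\),
\[
\pi(\omega \mid A) \;=\;
\begin{cases}
1 & \text{if } \omega \models A \text{ and } \pi(\omega) = \Pi(A),\\
\pi(\omega) & \text{if } \omega \models A \text{ and } \pi(\omega) < \Pi(A),\\
0 & \text{if } \omega \not\models A.
\end{cases}
\]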
The aim of this paper is to propose a formal approach to reasoning about desires, understood as logical propositions which we would be pleased to make true, also acknowledging the fact that desire is a matter of degree. It is first shown that, at the static level, desires should satisfy certain principles that differ from those to which beliefs obey. In this sense, from a static perspective, the logic of desires is different from the logic of beliefs. While the accumulation of beliefs tends to reduce the set of remaining possible worlds they point at, the accumulation of desires tends to increase the set of states of affairs tentatively considered as satisfactory. Indeed, beliefs are expected to be closed under conjunctions, while, in the positive view of desires developed here, one can argue that endorsing a disjunction of two propositions as a desire means to desire each of the disjuncts. However, desiring a proposition and its negation at the same time is not usually regarded as rational, since it does not make much sense to desire one thing and its contrary at the same time. Thus, when a new desire is added to the set of desires of an agent, a revision process may be necessary. Just as belief revision relies on an epistemic entrenchment relation, desire revision is based on a hedonic entrenchment relation satisfying other properties, due to the different natures of belief and desire. While epistemic entrenchment relations are known to be qualitative necessity relations, hedonic relations obeying a set of reasonable postulates correspond to another set-function in possibility theory, called guaranteed possibility, which drives well-behaved desire revision operations. Then the general framework of possibilistic logic provides a syntactic setting for encoding desire change. The paper also insists that desires should be carefully distinguished from goals.
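In possibility-theoretic terms, the contrast drawn above between beliefs and desires can be written as follows (notation of our own choosing; \(N\) is a necessity measure, and \(\Delta\) a guaranteed possibility measure over a distribution \(\delta\) grading how satisfactory each state is):
\[
N(\varphi \wedge \psi) = \min\bigl(N(\varphi), N(\psi)\bigr) \;\; \text{(beliefs)}, \qquad
\Delta(\varphi \vee \psi) = \min\bigl(\Delta(\varphi), \Delta(\psi)\bigr) \;\; \text{(desires)},
\]
where \(\Delta(\varphi) = \min_{\omega \models \varphi} \delta(\omega)\). So desiring a disjunction amounts to desiring each disjunct, which is the closure property mentioned above.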
This short paper about fuzzy set-based approximate reasoning first emphasizes the three main semantics for fuzzy sets: similarity, preference, and uncertainty. The difference between truth-functional many-valued logics of vague or gradual propositions and not fully compositional calculi such as possibilistic logic or similarity logics is stressed. Then, the potential of fuzzy set-based reasoning methods is briefly outlined for various kinds of approximate reasoning: deductive reasoning about flexible constraints, reasoning under uncertainty and inconsistency, hypothetical reasoning, exception-tolerant plausible reasoning using generic knowledge, interpolative reasoning, and abductive reasoning. Open problems are listed in the conclusion.
The paper presents a ‘multiple agent’ logic where formulas are pairs made of a proposition and a subset of agents. Such a formula is intended to mean ‘all the agents in the subset believe that the proposition is true’. The formal similarity of such formulas with those of possibilistic logic, where propositions are associated with certainty levels, is emphasised. However, the subsets of agents are organised in a Boolean lattice, while certainty levels belong to a totally ordered scale. The semantics of a set of ‘multiple agent’ logic formulas is expressed by a mapping which associates a subset of agents with each interpretation. Soundness and completeness results are established. Then a joint extension of the multiple agent logic and possibilistic logic is outlined. In this extended logic, propositions are associated with both sets of agents and certainty levels. A formula then expresses that ‘all the agents in the associated set believe that the proposition is true at least at some level’. The semantics is then given in terms of fuzzy sets of agents that find an interpretation more or less possible. A specific feature of possibilistic logic is that the inconsistency of a knowledge base is a matter of degree. The proposed setting enables us to distinguish between the global consistency of a set of agents and their individual consistency. In particular, given a set of multiple agent possibilistic formulas, one can compute the subset of agents that are individually consistent to some degree.
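A sketch of the induced semantics, assuming a natural notation that is ours rather than the paper's (a formula is written \((a, A)\) for a proposition \(a\) and a subset of agents \(A\), and \(\mathit{All}\) denotes the whole set of agents): a base \(\mathcal{B}\) induces the mapping
\[
\pi_{\mathcal{B}}(\omega) \;=\; \mathit{All} \;\setminus\; \bigcup \{\, A \mid (a, A) \in \mathcal{B},\ \omega \not\models a \,\},
\]
read as the set of agents for whom the interpretation \(\omega\) is not ruled out. An agent is then individually consistent when at least one interpretation is not ruled out for her, while global consistency requires an interpretation that no agent rules out.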
Possibilistic logic and modal logic are knowledge representation frameworks sharing some common features, such as the duality between possibility and necessity and the decomposability of necessity for conjunctions, as well as some obvious differences, since possibility theory is graded. At the semantic level, possibilistic logic relies on possibility distributions and modal logic on accessibility relations. In the last 30 years, there have been a number of attempts at bridging the two frameworks in one way or another. In this paper, we compare the relational semantics of epistemic logics with the simpler possibilistic semantics of a fragment of such logics that only uses modal formulas of depth 1. This minimal epistemic logic handles both all-or-nothing beliefs and explicitly ignored facts. We also contrast epistemic logic with the S5-based rough set logic. Finally, this paper presents extensions of generalized possibilistic logic with objective and non-nested multimodal formulas, in the style of the modal logics KD45 and S5.
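The simpler possibilistic semantics for depth-1 modal formulas mentioned above can be sketched as follows (a standard minimal-epistemic-logic style reading, with notation ours): an epistemic state is a non-empty set \(E\) of interpretations, and
\[
E \models \Box\varphi \iff E \subseteq [\varphi] \;\; (\text{i.e., } N(\varphi) = 1), \qquad
E \models \Diamond\varphi \iff E \cap [\varphi] \neq \emptyset \;\; (\text{i.e., } \Pi(\varphi) = 1),
\]
where \([\varphi]\) denotes the set of interpretations satisfying \(\varphi\).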
Fuzzy Sets, Logics and Reasoning about Knowledge reports recent results concerning the genuinely logical aspects of fuzzy sets in relation to algebraic considerations, knowledge representation, and commonsense reasoning. It takes a state-of-the-art look at multiple-valued and fuzzy set-based logics from an artificial intelligence perspective. The papers, all of which are written by leading contributors in their respective fields, are grouped into four sections. The first section presents a panorama of many-valued logics in connection with fuzzy sets. The second explores algebraic foundations, with an emphasis on MV-algebras. The third is devoted to approximate reasoning methods and similarity-based reasoning. The fourth explores connections between fuzzy knowledge representation, especially possibilistic logic, and prioritized knowledge bases. Readership: scholars and graduate students in logic, algebra, knowledge representation, and formal aspects of artificial intelligence.
This article aims to achieve two goals: to show that probability is not the only way of dealing with uncertainty; and to provide evidence that logic-based methods can well support reasoning with uncertainty. For the latter claim, two paradigmatic examples are presented: logic programming with Kleene semantics, for modelling reasoning from the information in a discourse to an interpretation of the state of affairs of the intended model, and a neural-symbolic implementation of input/output logic, for dealing with uncertainty in dynamic normative contexts.
On the occasion of the fiftieth anniversary of the founding article “Fuzzy sets” by L. A. Zadeh, published in 1965, we briefly outline the beginnings of fuzzy set research in France, which took place some ten years later, pointing out the pioneering role of Arnold Kaufmann and a few others in this emergence. Moreover, we also point out that the French counterpart of the name “fuzzy set” had appeared some 15 years before Zadeh’s paper, in a paper written in French by the very person who also invented triangular norms in the 1940s.
Comparative thinking plays a key role in our appraisal of reality. Comparing two objects or situations A and B, described in terms of Boolean features, may involve four basic similarity or dissimilarity indicators referring to what A and B have in common, or to what is particular to A or particular to B. These four indicators are naturally organized into a cube of opposition, which includes two classical squares of opposition, as well as other noticeable squares. From the knowledge of one situation A, it is possible to recover the description of another one, B, provided that we have enough information about the comparison between A and B. Then comparison indicators between A and B can be equated with comparison indicators between two other situations C and D. A conjunction of two such comparisons between the pairs (A, B) and (C, D) gives birth to what is called a logical proportion. Among the 120 existing logical proportions, 8 are of particular interest since they are independent of the encoding used for representing the situations. Four of them have remarkable properties of homogeneity, and include the analogical proportion “A is to B as C is to D”, while the four others express heterogeneity by stating that “there is an intruder among A, B, C and D, which is not X”. Homogeneous and heterogeneous logical proportions are of interest in classification, anomaly detection tasks, and IQ test solving.
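A minimal sketch of the four comparison indicators for Boolean feature vectors, and of how the description of B is recovered from A once the differences are known (function names are ours):

    def indicators(A, B):
        # Per attribute: what A and B share positively, share negatively,
        # what is particular to A, and what is particular to B.
        both    = [a and b for a, b in zip(A, B)]
        neither = [not a and not b for a, b in zip(A, B)]
        only_A  = [a and not b for a, b in zip(A, B)]
        only_B  = [not a and b for a, b in zip(A, B)]
        return both, neither, only_A, only_B

    def recover_B(A, only_A, only_B):
        # B is recoverable from A once we know where they differ.
        return [(a and not oa) or ob for a, oa, ob in zip(A, only_A, only_B)]

    A = [True, True, False, False]
    B = [True, False, True, False]
    both, neither, only_A, only_B = indicators(A, B)
    assert recover_B(A, only_A, only_B) == B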
The problem of merging information from multiple sources is central in many information processing areas such as database integration problems, multiple criteria decision making, expert opinion pooling, etc. Recently, several approaches have been proposed to merge classical propositional bases, or sets of (non-prioritized) goals. These approaches are in general semantically defined. As in belief revision, they use priorities, generally based on Dalal's distance, for merging the classical bases and return a new classical base as a result. An immediate consequence of the generation of a classical base is the impossibility of iterating the fusion process in a coherent way w.r.t. priorities, since the underlying ordering is lost. This paper presents a general approach for fusing prioritized bases, both semantically and syntactically, when priorities are represented in the possibilistic logic framework. Different classes of merging operators are considered, including conjunctive, disjunctive, reinforcement, and adaptive operators. We show that the approaches which have been recently proposed for merging classical propositional bases can be embedded in this setting. The result is then a prioritized base, and hence the process can be coherently iterated. Moreover, we also provide a syntactic counterpart for the fusion of classical bases.
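At the semantic level, the conjunctive, disjunctive and reinforcement classes mentioned above typically combine the possibility distributions \(\pi_1, \ldots, \pi_n\) of the sources pointwise, e.g. (standard representatives of each class, recalled here rather than quoted from the paper):
\[
\pi_\wedge(\omega) = \min_i \pi_i(\omega), \qquad \pi_\vee(\omega) = \max_i \pi_i(\omega), \qquad \pi_\times(\omega) = \prod_i \pi_i(\omega),
\]
where the product illustrates a reinforcement effect: interpretations disfavoured by several sources become even less possible than under any single source.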