with the meaning function [[·]] appearing on both sides. (1) is commonly construed as a prescription for computing the meaning of a based on the parts of a and their mode of combination. As equality is symmetric, however, we can also read (1) from right to left, as a constraint on the meaning [[b]] of a term b that brings in the wider context where b may occur, in accordance with what Dag Westerståhl has recently described as “one version of Frege’s famous Context Principle”.
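The equation labelled (1) is not reproduced in this excerpt; a plausible reconstruction of its general shape, in the standard compositional format (the notation below is supplied for illustration and is not quoted from the source), is
$$[[\alpha(t_1, \ldots, t_n)]] = r_\alpha([[t_1]], \ldots, [[t_n]]) \qquad (1)$$
Read left to right, it computes the meaning of the complex term $\alpha(t_1, \ldots, t_n)$ from the meanings of its parts; read right to left, it constrains the meaning of each part $t_i$ via the complex terms in which $t_i$ may occur.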
This volume examines the notion of an analytic proof as a natural deduction, suggesting that the proof's value may be understood as its normal form--a concept with significant implications for proof-theoretic semantics.
Quantification is a topic which brings together linguistics, logic, and philosophy. Quantifiers are the essential tools with which, in language or logic, we refer to the quantity of things or amount of stuff. In English they include such expressions as no, some, all, both, many. Peters and Westerståhl present the definitive interdisciplinary exploration of how they work - their syntax, semantics, and inferential role.
In the 12th century the "Book of the Soul" by the philosopher Avicenna was translated from Arabic into Latin. It had an immense success among scholastic writers and deeply influenced the structure and content of many psychological works of the Middle Ages. The reception of Avicenna's book is the story of cultural contact at an impressively high intellectual level. The present volume investigates this successful reception using two approaches. The first is chronological, tracing the stages by which Avicenna's work was accepted and adapted by Latin scholars. The second is doctrinal, analyzing the fortunes of key doctrines. The sense of the original Arabic text of Avicenna is kept in mind throughout, and the degree to which his Latin interpreters succeeded in conveying it is evaluated.
Ethics in business and economics is often attacked for being too superficial. By elaborating the conclusions of two such critics, of business ethics and welfare economics respectively, this article draws attention to the ethics behind these apparently well-intended but not always convincing constructions, with the help of the fundamental ethics of Emmanuel Levinas. To Levinas, responsibility is more basic than language, and thus also more basic than all social constructions. Co-operation relations in organizations, markets and value networks are generated from personal relations and personal responsibilities. It is not sufficient to integrate ethics into an impersonal, rational system, whether in business organizations or in the world economy. Ethics has its source not in rationality, but in the personal.
Levinas did not present any new ethical theories; he did not even give any normative recommendations. But his phenomenological investigations help us to understand how the idea of ethics emerges and how we try to cope with it. The purpose of this paper is to suggest some implications of a reading of Levinas for how ethical challenges are handled within a management perspective. The paper claims that management, both in theory and in practice, is necessarily egocentric and thus ethically biased. Therefore, ethics, defined as the idea that it is possible to do otherwise than pursue one's own interest, must have its source outside the scope of management. Levinas identifies this source as the call for responsibility from the Other, beyond the reach of managerial control. He also suggests how this idea of ethics is put into practice, especially in economic life: the existence of money makes it both possible and necessary to compare unique and incomparable others. Management tools may be used in the service of always more justice, just as they may be used for the opposite purpose. Instead of trying to draw a normative conclusion from Levinas, we may distinguish between two ways of experiencing reality: one through language and knowledge, covering both descriptive and normative knowledge, including business ethics; the other through the particular events of encounters, from which this knowledge is continuously questioned through the call for responsibility, and which may thereby provide possibilities for continuous improvement towards always more justice.
Dag Prawitz’s theory of grounds proposes a fresh approach to valid inferences. Its main aim is to clarify the nature of, and the reasons for, their epistemic power. The notion of ground is taken to denote what one is in possession of when in a state of evidence, and valid inferences are described in terms of operations that make us pass from grounds we already have to new grounds. Thanks to a rigorously developed proof-as-chains conception, the ground-theoretic framework permits Prawitz to overcome some conceptual difficulties of his earlier proof-theoretic explanation. The two accounts nonetheless share, though from different points of view, an issue concerning the recognizability of relevant operational properties.
This volume is dedicated to Prof. Dag Prawitz and his outstanding contributions to philosophical and mathematical logic. Prawitz's eminent contributions to structural proof theory, or general proof theory, as he calls it, and to inference-based meaning theories have been extremely influential in the development of modern proof theory and anti-realistic semantics. In particular, Prawitz is, alongside Gerhard Gentzen, who introduced natural deduction in his doctoral thesis published in 1934, the main author on natural deduction. The book opens with an introductory paper that surveys Prawitz's numerous contributions to proof theory and proof-theoretic semantics and puts his work into a somewhat broader perspective, both historically and systematically. Chapters either offer in-depth studies of certain aspects of Dag Prawitz's work or address open research problems concerned with core issues in structural proof theory, and they range from philosophical essays to papers of a mathematical nature. Investigations into the necessity of thought and the theory of grounds and computational justifications, as well as an examination of Prawitz's conception of the validity of inferences in the light of three “dogmas of proof-theoretic semantics”, are included. More formal papers deal with the constructive behaviour of fragments of classical logic and fragments of the modal logic S4, among other topics. In addition, there are chapters about inversion principles, normalization of proofs, the notion of proof-theoretic harmony, and other areas of a more mathematical persuasion. Dag Prawitz also contributes a chapter in which he explains his current views on the epistemic dimension of proofs and addresses the question why some inferences succeed in conferring evidence on their conclusions when applied to premises for which one already possesses evidence.
According to a main idea of Gentzen, the meanings of the logical constants are reflected by the introduction rules in his system of natural deduction. This idea is here understood as saying roughly that a closed argument ending with an introduction is valid provided that its immediate subarguments are valid, and that other closed arguments are justified to the extent that they can be brought to introduction form. One main part of the paper is devoted to the exact development of this notion. Another main part of the paper is concerned with a modification of this notion as it occurs in Michael Dummett’s book The Logical Basis of Metaphysics. The two notions are compared and there is a discussion of how they fare as a foundation for a theory of meaning. It is noted that Dummett’s notion has a simpler structure, but it is argued that it is less appropriate as a foundation for a theory of meaning, because the possession of a valid argument for a sentence in Dummett’s sense is not enough to be warranted in asserting the sentence.
This article addresses some Late Scholastic accounts (Suárez, Abra de Raçonis, Gamaches, Ysambert, Arriaga) of the “problem of transduction” in angels, as a possible source for the genesis of early modern occasionalism, particularly La Forge’s and Cordemoy’s. Indeed, if the “problem of transduction” is a structural issue for all Aristotelian gnoseology, the impossibility of interaction between immaterial and material substance concerns, more generally, all spiritual substances, raising the issue of the principle of “transduction” already at the level of angels, entirely immaterial creatures incapable of any relationship with bodies. In order to answer this peculiar version of the “problem of transduction”, the Late Scholastics elaborate doctrines that are direct antecedents of occasionalist ones, and which likely influenced early modern occasionalism directly. To pursue this investigative path, I will interweave three complementary argumentative lines. First, I will focus on Late Scholastic angelology, to point out how the need to solve the “problem of transduction” in this peculiar context pushes the sixteenth- and seventeenth-century Schools towards a proto-occasionalist conception of angelic innatism. Secondly, I will identify in the debate on angelic locutio one of the key places for the elaboration of a model that sees God as the only efficient cause and the only guarantor of the mind's communications and contents. Thirdly, I will stress how the Late Scholastic angelological debates could be an important source for early modern dualism and seventeenth-century occasionalism, notably for La Forge and Cordemoy.
This volume is the product of the Proceedings of the 9th International Congress of Logic, Methodology and Philosophy of Science and contains the text of most of ...
Reviewed Works: Gaisi Takeuti, Proof Theory; Georg Kreisel, Proof Theory: Some Personal Recollections; Wolfram Pohlers, Contributions of the Schütte School in Munich to Proof Theory; Stephen G. Simpson, Subsystems of $\mathbf{Z}_2$ and Reverse Mathematics; Solomon Feferman, Proof Theory: A Personal Report.
The traditional picture of logic takes it for granted that "valid arguments have a fundamental epistemic significance", but neither model theory nor traditional proof theory dealing with formal systems has been able to give an account of this significance. Since valid arguments as usually understood do not in general have any epistemic significance, the problem is to explain how and why we can nevertheless sometimes use them to acquire knowledge. It is suggested that we should distinguish between arguments and acts of inference, and that we have to reconsider the latter notion to arrive at the desired explanation. More precisely, the notions should be developed so that the following relationship holds: one comes into possession of a ground for a conclusion by inferring it from premisses for which one already has grounds, provided that the inference in question is valid. The paper proposes explications of the concepts of ground and deductively valid inference such that this relationship holds as a conceptual truth. Logical validity of inference is seen as a special case of deductive validity, but does not add anything as far as epistemic significance is concerned—it resides already in the deductively valid inferences.
We study the logical and computational properties of basic theorems of uncountable mathematics, including the Cousin and Lindelöf lemmas, published in 1895 and 1903 respectively. Historically, these lemmas were among the first formulations of open-cover compactness and the Lindelöf property, respectively. These notions are of great conceptual importance: the former is commonly viewed as a way of treating uncountable sets like e.g. $[0,1]$ as “almost finite”, while the latter allows one to treat uncountable sets like e.g. $\mathbb{R}$ as “almost countable”. This reduction of the uncountable to the finite/countable turns out to have a considerable logical and computational cost: we show that the aforementioned lemmas, and many related theorems, are extremely hard to prove, while the associated sub-covers are extremely hard to compute. Indeed, in terms of the standard scale, a proof of these lemmas requires at least the full extent of second-order arithmetic, a system originating from Hilbert–Bernays’ Grundlagen der Mathematik. This observation has far-reaching implications for the Grundlagen’s spiritual successor, the program of Reverse Mathematics, and the associated Gödel hierarchy. We also show that the Cousin lemma is essential for the development of the gauge integral, a generalization of the Lebesgue and improper Riemann integrals that also uniquely provides a direct formalization of Feynman’s path integral.
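For orientation, one standard modern formulation of the Cousin lemma (the notation here is supplied for illustration and is not quoted from the abstract) is: for every gauge $\delta : [0,1] \to \mathbb{R}^{+}$ there are finitely many points $x_1, \ldots, x_k \in [0,1]$ such that
$$[0,1] \subseteq \bigcup_{i \le k} \bigl(x_i - \delta(x_i),\, x_i + \delta(x_i)\bigr).$$
The gauge integral mentioned at the end is defined via tagged partitions that are fine relative to such a gauge $\delta$, which is why the lemma is central to its development.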
This article points out problems in current dynamic treatments of anaphora and provides a new account that solves these by grafting Muskens' Compositional Discourse Representation Theory onto a partial theory of types. Partiality is exploited to keep track of which discourse referents have been introduced in the text (thus avoiding the overwrite problem) and to account for cases of anaphoric failure. Another key assumption is that the set of discourse referents is well-ordered, so that we can keep track of the order in which they have been introduced, allowing a semantic characterization of anaphoric accessibility across stretches of discourse. Unlike other dynamic approaches, the system defines semantic values for unresolved anaphors. This leads to a clear separation of monotonic and non-monotonic content (in this case anaphoric resolution) and arguably provides a sound basis for a non-monotonic theory of anaphoric resolution.
We investigate the connections between computability theory and Nonstandard Analysis. In particular, we investigate the following two topics and show that they are intimately related. A basic property of Cantor space $2^{\mathbb{N}}$ is Heine–Borel compactness: for any open covering of $2^{\mathbb{N}}$, there is a finite subcovering. A natural question is: how hard is it to compute such a finite subcovering? We make this precise by analysing the complexity of so-called fan functionals that, given any $G : 2^{\mathbb{N}} \to \mathbb{N}$, output a finite sequence $\langle f_0, \ldots, f_n \rangle$ in $2^{\mathbb{N}}$ such that the neighbourhoods defined from $\overline{f_i}(G(f_i))$ for $i \le n$ form a covering of $2^{\mathbb{N}}$. A basic property of Cantor space in Nonstandard Analysis is Abraham Robinson’s nonstandard compactness, i.e., that every binary sequence is “infinitely close” to a standard binary sequence. We analyse the strength of this nonstandard compactness property of Cantor space, compared to the other axioms of Nonstandard Analysis and usual mathematics. Our study of the former topic yields exotic objects in computability theory, while the latter leads to surprising results in Reverse Mathematics. We stress that the two topics are highly intertwined, i.e., our study is holistic in nature in that results in computability theory yield results in Nonstandard Analysis and vice versa.
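Spelled out, the defining property of such a fan functional $\Theta$ can be stated as follows (the symbols $\Theta$, $[\sigma]$ and $\overline{f}(k)$ are supplied here for illustration and are not quoted from the paper): writing $\overline{f}(k)$ for the length-$k$ initial segment of $f \in 2^{\mathbb{N}}$ and $[\sigma]$ for the set of sequences extending a finite binary string $\sigma$,
$$\Theta(G) = \langle f_0, \ldots, f_n \rangle \quad \text{where} \quad 2^{\mathbb{N}} \subseteq \bigcup_{i \le n} \bigl[\,\overline{f_i}(G(f_i))\,\bigr] \qquad \text{for every } G : 2^{\mathbb{N}} \to \mathbb{N}.$$
In words: the finitely many basic neighbourhoods determined by the initial segments $\overline{f_i}(G(f_i))$ already cover all of Cantor space.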
This is the first part of a two-part article on semantic compositionality, that is, the principle that the meaning of a complex expression is determined by the meanings of its parts and the way they are put together. Here we provide a brief historical background, a formal framework for syntax and semantics, precise definitions, and a survey of variants of compositionality. Stronger and weaker forms are distinguished, as well as generalized forms that cover extra-linguistic context dependence as well as linguistic context dependence. In the second article, we survey arguments for and arguments against the claim that natural languages are compositional, and consider some problem cases. It will be referred to as Part II.
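As a rough illustration of the contrast between the basic principle and its generalized, context-dependent forms (the notation is supplied here and is not taken from the article), plain compositionality requires the meaning function to satisfy
$$[[\alpha(t_1, \ldots, t_n)]] = r_\alpha([[t_1]], \ldots, [[t_n]]),$$
whereas a context-generalized form relativizes meanings to a context parameter $c$ and allows the semantic operation to consult that context:
$$[[\alpha(t_1, \ldots, t_n)]]^{c} = r_\alpha(c, [[t_1]]^{c}, \ldots, [[t_n]]^{c}).$$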
I see the question what it is that makes an inference valid and thereby gives a proof its epistemic power as the most fundamental problem of general proof theory. It has been surprisingly neglected in logic and philosophy of mathematics with two exceptions: Gentzen’s remarks about what justifies the rules of his system of natural deduction and proposals in the intuitionistic tradition about what a proof is. They are reviewed in the paper and I discuss to what extent they succeed in answering what a proof is. Gentzen’s ideas are shown to give rise to a new notion of valid argument. At the end of the paper I summarize and briefly discuss an approach to the problem that I have proposed earlier.
We may try to explain proofs as chains of valid inference, but the concept of validity needed in such an explanation cannot be the traditional one. For an inference to be legitimate in a proof it must have sufficient epistemic power, so that the proof really justifies its final conclusion. However, the epistemic concepts used to account for this power are in their turn usually explained in terms of the concept of proof. To get out of this circle we may consider an idea within intuitionism about what it is to justify the assertion of a proposition. It depends on Heyting’s view of the meaning of a proposition, but does not presuppose the concept of inference or of proof as chains of inferences. I discuss this idea and what is required in order to use it for an adequate notion of valid inference.
The standard relation of logical consequence allows for non-standard interpretations of logical constants, as was shown early on by Carnap. But then how can we learn the interpretations of logical constants, if not from the rules which govern their use? Answers in the literature have mostly consisted in devising clever rule formats going beyond the familiar "what follows from what". A more conservative answer is possible. We may be able to learn the correct interpretations from the standard rules, because the space of possible interpretations is a priori restricted by universal semantic principles. We show that this is indeed the case. The principles are familiar from modern formal semantics: compositionality, supplemented, for quantifiers, with topic-neutrality.
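In generalized quantifier theory, topic-neutrality is standardly cashed out as permutation invariance (the formulation below is supplied as an illustration, not quoted from the paper): a type $\langle 1 \rangle$ quantifier interpretation $Q$ is topic-neutral if, for every universe $M$, every $A \subseteq M$, and every permutation $\pi$ of $M$,
$$Q_M(A) \iff Q_M(\pi[A]).$$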
Albertus Magnus favours the Aristotelian definition of the soul as the first actuality or perfection of a natural body having life potentially. But he interprets Aristotle's vocabulary in such a way that it becomes compatible with the separability of the soul from the body. The term “perfectio” is understood as referring to the soul's activity only, not to its essence. The term “forma” is avoided as inadequate for defining the soul's essence. The soul is understood as a substance which exists independently of its actions and its body. The article shows that Albertus' terminological decisions continue a tradition reaching from the Greek commentators, and John Philoponos in particular, to Avicenna. Albertus' position on another important issue is also influenced by Arabic sources. His defense of the unity of the soul's vegetative, animal and rational parts rests on arguments from Avicenna and Averroes. It is shown that Averroes' position on the problem is not clear-cut: he advocates the unity thesis, but also teaches the plurality of the generic and individual forms in man. This double stance is visible in the Latin reception of Averroes' works, and also in Albertus, who presents Averroes both as a supporter and an opponent of the plurality thesis.
The paper has three parts. First, a survey and analysis is given of the structure of individual rights in the recent EU Directive on data protection. It is argued that at the core of this structure is an unexplicated notion of what the data subject can 'reasonably expect' concerning the further processing of information about him or herself. In the second part of the paper it is argued that theories of privacy popular among philosophers are not able to shed much light on the issues treated in the Directive, which are, arguably, among the central problems pertaining to the protection of individual rights in the information society. In the third part of the paper, some suggestions are made for a richer philosophical theory of data protection and privacy. It is argued that this account is better suited to the task of characterizing the central issues raised by the Directive.