Structural proof theory is a branch of logic that studies the general structure and properties of logical and mathematical proofs. This book is both a concise introduction to the central results and methods of structural proof theory, and a work of research that will be of interest to specialists. It is designed to be used by students of philosophy, mathematics and computer science. The book contains a wealth of results on proof-theoretical systems, including extensions of such systems from logic to mathematics, and on the connection between the two main forms of structural proof theory: natural deduction and sequent calculus. The authors emphasize the computational content of logical results. A special feature of the volume is a computerized system for developing proofs interactively, downloadable from the web and regularly updated.
Contents: Prologue: Hilbert's Last Problem; 1. Introduction; Part I. Proof Systems Based on Natural Deduction: 2. Rules of proof: natural deduction; 3. Axiomatic systems; 4. Order and lattice theory; 5. Theories with existence axioms; Part II. Proof Systems Based on Sequent Calculus: 6. Rules of proof: sequent calculus; 7. Linear order; Part III. Proof Systems for Geometric Theories: 8. Geometric theories; 9. Classical and intuitionistic axiomatics; 10. Proof analysis in elementary geometry; Part IV. Proof Systems for Nonclassical Logics: 11. Modal logic; 12. Quantified modal logic, provability logic, and so on; Bibliography; Index of names; Index of subjects.
The structure of derivations in natural deduction is analyzed through isomorphism with a suitable sequent calculus, with twelve hidden convertibilities revealed in usual natural deduction. A general formulation of conjunction and implication elimination rules is given, analogous to disjunction elimination. Normalization through permutative conversions now applies in all cases. Derivations in normal form have all major premisses of elimination rules as assumptions. Conversion in any order terminates. Through the condition that in a cut-free derivation of the sequent Γ⇒C, no inactive weakening or contraction formulas remain in Γ, a correspondence with the formal derivability relation of natural deduction is obtained: All formulas of Γ become open assumptions in natural deduction, through an inductively defined translation. Weakenings are interpreted as vacuous discharges, and contractions as multiple discharges. In the other direction, non-normal derivations translate into derivations with cuts having the cut formula principal either in both premisses or in the right premiss only.
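As an illustration of the general formulation described above, the elimination rules for conjunction and implication can be written in the style of disjunction elimination, with an arbitrary consequence C and discharged assumptions shown in brackets (a standard schematic rendering; the notation here is not taken from the abstract itself):

```latex
% General elimination rules with an arbitrary consequence C.
% Bracketed formulas are assumptions discharged by the rule.
\[
\frac{A \land B \qquad
      \begin{array}{c} [A,\, B] \\ \vdots \\ C \end{array}}
     {C}\; \land E
\qquad\qquad
\frac{A \supset B \qquad A \qquad
      \begin{array}{c} [B] \\ \vdots \\ C \end{array}}
     {C}\; \supset\! E
\]
```

The standard rules are recovered as the special cases in which C is chosen to be A (or B) for conjunction, and B for implication.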
Gentzen's systems of natural deduction and sequent calculus were byproducts in his program of proving the consistency of arithmetic and analysis. It is suggested that the central component in his results on logical calculi was the use of a tree form for derivations. It allows the composition of derivations and the permutation of the order of application of rules, with a full control over the structure of derivations as a result. Recently found documents shed new light on the discovery of these calculi. In particular, Gentzen set up five different forms of natural calculi and gave a detailed proof of normalization for intuitionistic natural deduction. An early handwritten manuscript of his thesis shows that a direct translation from natural deduction to the axiomatic logic of Hilbert and Ackermann was, in addition to the influence of Paul Hertz, the second component in the discovery of sequent calculus. A system intermediate between the sequent calculus LI and axiomatic logic, denoted LIG in unpublished sources, is implicit in Gentzen's published thesis of 1934–35. The calculus has half rules, half “groundsequents,” and does not allow full cut elimination. Nevertheless, a translation from LI to LIG in the published thesis gives a subformula property for a complete class of derivations in LIG. After the thesis, Gentzen continued to work on variants of sequent calculi for ten more years, in the hope of finding a consistency proof for arithmetic within an intuitionistic calculus.
A normalization procedure is given for classical natural deduction with the standard rule of indirect proof applied to arbitrary formulas. For normal derivability and the subformula property, it is sufficient to permute down instances of indirect proof whenever they have been used for concluding a major premiss of an elimination rule. The result applies even to natural deduction for classical modal logic.
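The rule of indirect proof referred to here can be displayed schematically as follows (my notation, not the abstract's):

```latex
% Rule of indirect proof: conclude A by deriving a contradiction
% from the temporary assumption ~A, which the rule discharges.
\[
\frac{\begin{array}{c} [\neg A] \\ \vdots \\ \bot \end{array}}
     {A}\; \mathit{DN}
\]
```

The normalization procedure permutes an instance of this rule downward whenever its conclusion A stands as the major premiss of an elimination rule.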
The way from derivations in tree form to the linearly written derivations of natural deduction, introduced by Jaśkowski and often used in textbooks, is a straightforward root-first translation. The other direction, instead, is tricky, because of the partially ordered assumption formulas in a tree that can get closed by the end of a derivation. An algorithm is defined that operates alternately from the leaves and the root of a derivation and solves the problem.
A sequent calculus is given in which the management of weakening and contraction is organized as in natural deduction. The latter has no explicit weakening or contraction, but vacuous and multiple discharges in rules that discharge assumptions. A comparison to natural deduction is given through translation of derivations between the two systems. It is proved that if a cut formula is never principal in a derivation leading to the right premiss of cut, it is a subformula of the conclusion. Therefore it is sufficient to eliminate those cuts that correspond to detour and permutation conversions in natural deduction.
Gentzen's original proof of the Hauptsatz used a rule of multicut in the case that the right premiss of cut was derived by contraction. Cut elimination is here proved without multicut, by suitably transforming the derivation of the premiss of the contraction.
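For orientation, the ordinary cut rule and Gentzen's multicut can be displayed side by side (a schematic rendering; D^n stands for n ≥ 1 occurrences of the cut formula D):

```latex
% Ordinary cut versus multicut: multicut removes n >= 1 occurrences
% of the cut formula D from the right premiss in a single step, which
% handles the case in which that premiss was derived by contraction.
\[
\frac{\Gamma \Rightarrow D \qquad D, \Delta \Rightarrow C}
     {\Gamma, \Delta \Rightarrow C}\; \mathit{Cut}
\qquad\qquad
\frac{\Gamma \Rightarrow D \qquad D^{\,n}, \Delta \Rightarrow C}
     {\Gamma, \Delta \Rightarrow C}\; \mathit{Multicut}
\]
```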
This paper discusses different interpretations of probability in relation to determinism. It is argued that both objective and subjective views on probability can be compatible with deterministic as well as indeterministic situations. The possibility of a conceptual independence between probability and determinism is argued to hold on a general level. The subsequent philosophical analysis of recent advances in classical statistical mechanics (ergodic theory) is of independent interest, but also adds weight to the claim that it is possible to justify an objective interpretation of probabilities in a theory having as a basis the paradigmatically deterministic theory of classical mechanics.
A proof-theoretical analysis of elementary theories of order relations is effected through the formulation of order axioms as mathematical rules added to contraction-free sequent calculus. Among the results obtained are proof-theoretical formulations of conservativity theorems corresponding to Szpilrajn’s theorem on the extension of a partial order into a linear one. Decidability of the theories of partial and linear order for quantifier-free sequents is shown by giving terminating methods of proof-search.
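A typical example of an order axiom turned into a mathematical rule is transitivity of the order relation, added as a rule acting on the antecedent of sequents (a schematic sketch of the rule scheme; the precise formulation in the paper may differ):

```latex
% Transitivity, a<=b & b<=c -> a<=c, as a left rule: the conclusion
% atom a<=c is added to the premiss, while the atoms a<=b and b<=c
% are repeated, which keeps contraction admissible in the
% contraction-free calculus.
\[
\frac{a \leq c,\; a \leq b,\; b \leq c,\; \Gamma \Rightarrow \Delta}
     {a \leq b,\; b \leq c,\; \Gamma \Rightarrow \Delta}\; \mathit{Trans}
\]
```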
Gentzen's “Untersuchungen” [1] gave a translation from natural deduction to sequent calculus with the property that normal derivations may translate into derivations with cuts. Prawitz in [8] gave a translation that instead produced cut-free derivations. It is shown that by writing all elimination rules in the manner of disjunction elimination, with an arbitrary consequence, an isomorphic translation between normal derivations and cut-free derivations is achieved. The standard elimination rules do not permit a full normal form, which explains the cuts in Gentzen's translation. Likewise, it is shown that Prawitz' translation contains an implicit process of cut elimination.
Recently Ekman gave a derivation in natural deduction such that it either contains a substantial redundant part or else is not normal. It is shown that this problem is caused by a non-normality inherent in the usual modus ponens rule.
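The non-normality disappears when modus ponens is replaced by its general form, written in the manner of disjunction elimination with an arbitrary consequence C (a standard schematic rendering):

```latex
% Usual modus ponens and its general form. The special case in which
% C is B, with the one-step derivation of B from [B], gives back the
% usual rule.
\[
\frac{A \supset B \qquad A}{B}
\qquad\qquad
\frac{A \supset B \qquad A \qquad
      \begin{array}{c} [B] \\ \vdots \\ C \end{array}}
     {C}
\]
```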
Elementary geometry can be axiomatized constructively by taking as primitive the concepts of the apartness of a point from a line and the convergence of two lines, instead of incidence and parallelism as in the classical axiomatizations. I first give the axioms of a general plane geometry of apartness and convergence. Constructive projective geometry is obtained by adding the principle that any two distinct lines converge, and affine geometry by adding a parallel line construction, etc. Constructive axiomatization allows solutions to geometric problems to be effected as computer programs. I present a formalization of the axiomatization in type theory. This formalization works directly as a computer implementation of geometry.
The Löwenheim-Skolem theorem was published in Skolem's long paper of 1920, with the first section dedicated to the theorem. The second section of the paper contains a proof-theoretical analysis of derivations in lattice theory. The main result, otherwise believed to have been established in the late 1980s, was a polynomial-time decision algorithm for these derivations. Skolem did not develop any notation for the representation of derivations, which makes the proofs of his results hard to follow. Such a formal notation is given here by which these proofs become transparent. A third section of Skolem's paper gives an analysis for derivations in plane projective geometry. To clear a gap in Skolem's result, a new conservativity property is shown for projective geometry, to the effect that a proper use of the axiom that gives the uniqueness of connecting lines and intersection points requires a conclusion with proper cases (logically, a disjunction in a positive part) to be proved. The forgotten parts of Skolem's first paper on the Löwenheim-Skolem theorem are the perhaps earliest combinatorial analyses of formal mathematical proofs, and at least the earliest analyses with profound results.
De Finetti's representation theorem is a special case of the ergodic decomposition of stationary probability measures. The problems of the interpretation of probabilities centred around de Finetti's theorem are extended to this more general situation. The ergodic decomposition theorem has a physical background in the ergodic theory of dynamical systems. Thereby the interpretations of probabilities in the cases of de Finetti's theorem and its generalization and in ergodic theory are systematically connected to each other.
The standard rule of necessitation in systems of natural deduction for the modal logic S4 concludes □A from A whenever all assumptions A depends on are modal formulas. This condition prevents the composability and normalization of derivations, and therefore modifications of the rule have been suggested. It is shown that both properties hold if, instead of changing the rule of necessitation, all elimination rules are formulated in the manner of disjunction elimination, i.e. with an arbitrary consequence.
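The standard rule in question can be displayed as follows (a schematic rendering with its side condition spelled out):

```latex
% Standard necessitation for S4: the side condition requires every
% open assumption that the premiss A depends on to be a boxed formula.
\[
\frac{A}{\Box A}\; \Box I
\quad
\text{provided all open assumptions of the derivation of } A
\text{ are of the form } \Box B
\]
```

It is this global side condition on open assumptions that blocks the free composition of derivations; the remedy described in the abstract leaves the rule intact and instead generalizes the elimination rules.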
Attention is drawn to the fact that what is alternatively known as Dummett logic, Gödel logic, or Gödel-Dummett logic, was actually introduced by Skolem already in 1913. A related work of 1919 introduces implicative lattices, or Heyting algebras in today's terminology.
The axioms of projective and affine plane geometry are turned into rules of proof by which formal derivations are constructed. The rules act only on atomic formulas. It is shown that proof search for the derivability of atomic cases from atomic assumptions by these rules terminates. This decision method is based on the central result of the combinatorial analysis of derivations by the geometric rules: The geometric objects that occur in derivations by the rules can be restricted to those known from the assumptions and cases. This “subterm property” is proved by suitably permuting the order of application of the geometric rules. As an example of the decision method, it is shown that there cannot exist a derivation of Euclid’s fifth postulate if the rule that corresponds to the uniqueness of the parallel line construction is taken away from the system of plane affine geometry.
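As a schematic illustration of a geometric axiom turned into a rule on atomic formulas, the uniqueness of the line connecting two distinct points can be rendered in sequent style as follows (my notation; the predicate In(a,l) for incidence and the exact rule format are illustrative, not taken from the paper):

```latex
% Uniqueness of the connecting line as a rule on atoms: if distinct
% points a, b are incident with both l and m, the atom l = m may be
% added to the antecedent; the premiss repeats the rule atoms.
\[
\frac{l = m,\; a \neq b,\; \mathit{In}(a,l),\; \mathit{In}(b,l),\;
      \mathit{In}(a,m),\; \mathit{In}(b,m),\; \Gamma \Rightarrow \Delta}
     {a \neq b,\; \mathit{In}(a,l),\; \mathit{In}(b,l),\;
      \mathit{In}(a,m),\; \mathit{In}(b,m),\; \Gamma \Rightarrow \Delta}
\]
```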
This chapter focuses on the development of Gerhard Gentzen's structural proof theory and its connections with intuitionism. The latter is important in proof theory for several reasons. First, the methods of Hilbert's old proof theory were limited to the “finitistic” ones. These methods proved to be insufficient, and they were extended by infinitistic principles that were still intuitionistically meaningful. It is a general tendency in proof theory to try to use weak principles. A second reason for the importance of intuitionism for proof theory is that the proof-theoretical study of intuitionistic logic has become a prominent part of logic, with applications in computer science.
Bruno de Finetti's earliest works on the foundations of probability are reviewed. These include the notion of exchangeability and the theory of random processes with independent increments. The latter theory relates to de Finetti's ideas for a probabilistic science more generally. Different aspects of his work are united by his foundational programme for a theory of subjective probabilities.
What seem to be Kurt Gödel’s first notes on logic, an exercise notebook of 84 pages, contain formal proofs in higher-order arithmetic and set theory. The choice of these topics is clearly suggested by their inclusion in Hilbert and Ackermann’s logic book of 1928, the Grundzüge der theoretischen Logik. Such proofs are notoriously hard to construct within axiomatic logic. Gödel adopts, without further ado, a linear system of natural deduction for the full language of higher-order logic, with formal derivations close to one hundred steps in length and up to four nested temporary assumptions with their scope indicated by vertical intermittent lines.
Recently discovered documents have shown how Gentzen had arrived at the final form of natural deduction, namely by trying out a great number of alternative formulations. What led him to natural deduction in the first place, other than the general idea of studying “mathematical inference as it appears in practice,” is not indicated anywhere in his publications or preserved manuscripts. It is suggested that formal work in axiomatic logic lies behind the birth of Gentzen’s natural deduction, rather than any single decisive influence in the work of others. Various axiomatizations are explored in turn, from the classical axioms of Hilbert and Ackermann to the intuitionistic axiomatization of Heyting.
Mathematical astronomy, as the most developed branch of the ancient exact sciences, has been widely discussed, especially with respect to epistemological issues, e.g. astronomy as a prime example of the distinction between instrumentalist and realist understandings of theories. In contrast, the methodology of ancient astronomy itself has received little attention. Following the work of Jaakko Hintikka and Unto Remes, Aristarchus' method of determining the distance of the Sun is sketched and Ptolemy's solar model is discussed in detail.