Do numbers, sets, and so forth, exist? What do mathematical statements mean? Are they literally true or false, or do they lack truth values altogether? Addressing questions that have attracted lively debate in recent years, Stewart Shapiro contends that standard realist and antirealist accounts of mathematics are both problematic. As Benacerraf first noted, we are confronted with the following powerful dilemma. The desired continuity between mathematical and, say, scientific language suggests realism, but realism in this context suggests seemingly intractable epistemic problems. As a way out of this dilemma, Shapiro articulates a structuralist approach. On this view, the subject matter of arithmetic, for example, is not a fixed domain of numbers independent of each other, but rather is the natural number structure, the pattern common to any system of objects that has an initial object and successor relation satisfying the induction principle. Using this framework, realism in mathematics can be preserved without troublesome epistemic consequences. Shapiro concludes by showing how a structuralist approach can be applied to wider philosophical questions such as the nature of an "object" and the Quinean nature of ontological commitment. Clear, compelling, and tautly argued, Shapiro's work, noteworthy both in its attempt to develop a full-length structuralist approach to mathematics and to trace its emergence in the history of mathematics, will be of deep interest to both philosophers and mathematicians.
The central contention of this book is that second-order logic has a central role to play in laying the foundations of mathematics. In order to develop the argument fully, the author presents a detailed description of higher-order logic, including a comprehensive discussion of its semantics. He goes on to demonstrate the prevalence of second-order concepts in mathematics and the extent to which mathematical ideas can be formulated in higher-order logic. He also shows how first-order languages are often insufficient to codify many concepts in contemporary mathematics, and thus that both first- and higher-order logics are needed to fully reflect current work. Throughout, the emphasis is on discussing the associated philosophical and historical issues and the implications they have for foundational studies. For the most part, the author assumes little more than a familiarity with logic comparable to that provided in a beginning graduate course which includes the incompleteness of arithmetic and the Löwenheim-Skolem theorems. All those concerned with the foundations of mathematics will find this a thought-provoking discussion of some of the central issues in the field today.
Logical pluralism is the view that different logics are equally appropriate, or equally correct. Logical relativism is a pluralism according to which validity and logical consequence are relative to something. Stewart Shapiro explores various such views. He argues that the question of meaning shift is itself context-sensitive and interest-relative.
Moving beyond both realist and anti-realist accounts of mathematics, Shapiro articulates a "structuralist" approach, arguing that the subject matter of a mathematical theory is not a fixed domain of numbers that exist independent of each other, but rather is the natural number structure, the pattern common to any system of objects that has an initial object and successor relation satisfying the induction principle.
This unique book by Stewart Shapiro looks at a range of philosophical issues and positions concerning mathematics in four comprehensive sections. Part I describes questions and issues about mathematics that have motivated philosophers since the beginning of intellectual history. Part II is an historical survey, discussing the role of mathematics in the thought of such philosophers as Plato, Aristotle, Kant, and Mill. Part III covers the three major positions held throughout the twentieth century: the idea that mathematics is logic (logicism), the view that the essence of mathematics is the rule-governed manipulation of characters (formalism), and a revisionist philosophy that focuses on the mental activity of mathematics (intuitionism). Finally, Part IV brings the reader up-to-date with a look at contemporary developments within the discipline. This sweeping introductory guide to the philosophy of mathematics makes these fascinating concepts accessible to those with little background in either mathematics or philosophy.
Stewart Shapiro's ambition in Vagueness in Context is to develop a comprehensive account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary according to their context: a person can be tall with respect to male accountants and not tall (even short) with respect to professional basketball players. The key feature of Shapiro's account is that the extensions of vague terms also vary in the course of conversations and that, in some cases, a competent speaker can go either way without sinning against the meaning of the words or the non-linguistic facts. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak; but vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other.
Stewart Shapiro's aim in Vagueness in Context is to develop both a philosophical and a formal, model-theoretic account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary with such contextual factors as the comparison class and paradigm cases. A person can be tall with respect to male accountants and not tall with respect to professional basketball players. The main feature of Shapiro's account is that the extensions of vague terms also vary in the course of a conversation, even after the external contextual features, such as the comparison class, are fixed. A central thesis is that in some cases, a competent speaker of the language can go either way in the borderline area of a vague predicate without sinning against the meaning of the words and the non-linguistic facts. Shapiro calls this open texture, borrowing the term from Friedrich Waismann. The formal model theory has a similar structure to the supervaluationist approach, employing the notion of a sharpening of a base interpretation. In line with the philosophical account, however, the notion of super-truth does not play a central role in the development of validity. The ultimate goal of the technical aspects of the work is to delimit a plausible notion of logical consequence, and to explore what happens with the sorites paradox. Later chapters deal with what passes for higher-order vagueness - vagueness in the notions of 'determinacy' and 'borderline' - and with vague singular terms, or objects. In each case, the philosophical picture is developed by extending and modifying the original account. This is followed by modifications to the model theory and the central meta-theorems. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak. But vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other. Vagueness is also due to the kinds of beings we are. There is no need to blame the phenomenon on any one of those aspects.
The notion of potential infinity dominated in mathematical thinking about infinity from Aristotle until Cantor. The coherence and philosophical importance of the notion are defended. Particular attention is paid to the question of whether potential infinity is compatible with classical logic or requires a weaker logic, perhaps intuitionistic.
Some authors have claimed that ante rem structuralism has problems with structures that have indiscernible places. In response, I argue that there is no requirement that mathematical objects be individuated in a non-trivial way. Metaphysical principles and intuitions to the contrary do not stand up to ordinary mathematical practice, which presupposes an identity relation that, in a sense, cannot be defined. In complex analysis, the two square roots of –1 are indiscernible: anything true of one of them is true of the other. I suggest that i functions like a parameter in natural deduction systems. I gave an early version of this paper at a workshop on structuralism in mathematics and science, held in the Autumn of 2006, at Bristol University. Thanks to the organizers, particularly Hannes Leitgeb, James Ladyman, and Øystein Linnebo, to my commentator Richard Pettigrew, and to the audience there. The paper also benefited considerably from a preliminary session at the Arché Research Centre at the University of St Andrews. I am indebted to my colleagues Craige Roberts, for help with the linguistics literature, and Ben Caplan and Gabriel Uzquiano, for help with the metaphysics. Thanks also to Hannes Leitgeb and Jeffrey Ketland for reading an earlier version of the manuscript and making helpful suggestions. I also benefited from conversations with Richard Heck, John Mayberry, Kevin Scharp, and Jason Stanley.
This Oxford Handbook covers the current state of the art in the philosophy of maths and logic in a comprehensive and accessible manner, giving the reader an overview of the major problems, positions, and battle lines. The 26 newly-commissioned chapters are by established experts in the field and contain both exposition and criticism as well as substantial development of their own positions. Select major positions are represented by two chapters - one supportive and one critical. The book includes a comprehensive bibliography.
According to Ole Hjortland, Timothy Williamson, Graham Priest, and others, anti-exceptionalism about logic is the view that logic “isn’t special”, but is continuous with the sciences. Logic is revisable, and its truths are neither analytic nor a priori. And logical theories are revised on the same grounds as scientific theories are. What isn’t special, we argue, is anti-exceptionalism about logic. Anti-exceptionalists disagree with one another regarding what logic and, indeed, anti-exceptionalism are, and they are at odds with naturalist philosophers of logic, who may have seemed like natural allies. Moreover, those internal battles concern well-trodden philosophical issues, and there is no hint as to how they are to be resolved on broadly scientific grounds. We close by looking at three of the founders of logic who may have seemed like obvious enemies of anti-exceptionalism—Aristotle, Frege, and Carnap—and conclude that none of their positions is clearly at odds with at least some of the main themes of anti-exceptionalism. We submit that, at least at present, anti-exceptionalism is too vague or underspecified to characterize a coherent conception of logic, one that stands opposed to more traditional approaches.
This chapter provides broad coverage of the notion of logical consequence, exploring its modal, semantic, and epistemic aspects. It develops the contrast between a proof-theoretic notion of consequence, in terms of deduction, and a model-theoretic approach, in terms of truth-conditions. The main purpose is to relate the formal, technical work in logic to the philosophical concepts that underlie reasoning.
The subject of this paper is the philosophical problem of accounting for the relationship between mathematics and non-mathematical reality. The first section, devoted to the importance of the problem, suggests that many of the reasons for engaging in philosophy at all make an account of the relationship between mathematics and reality a priority, not only in philosophy of mathematics and philosophy of science, but also in general epistemology/metaphysics. This is followed by a (rather brief) survey of the major, traditional philosophies of mathematics indicating how each is prepared to deal with the present problem. It is shown that (the standard formulations of) some views seem to deny outright that there is a relationship between mathematics and any non-mathematical reality; such philosophies are clearly unacceptable. Other views leave the relationship rather mysterious and, thus, are incomplete at best. The final, more speculative section provides the direction of a positive account. A structuralist philosophy of mathematics is outlined and it is proposed that mathematics applies to reality through the discovery of mathematical structures underlying the non-mathematical universe.
There is a parallel between the debate between Gottlob Frege and David Hilbert at the turn of the twentieth century and at least some aspects of the current controversy over whether category theory provides the proper framework for structuralism in the philosophy of mathematics. The main issue, I think, concerns the place and interpretation of meta-mathematics in an algebraic or structuralist approach to mathematics. Can meta-mathematics itself be understood in algebraic or structural terms? Or is it an exception to the slogan that mathematics is the science of structure?
We are logical pluralists who hold that the right logic is dependent on the domain of investigation; different logics for different mathematical theories. The purpose of this article is to explore the ramifications for our pluralism concerning normativity. Is there any normative role for logic, once we give up its universality? We discuss Florian Steinberger’s “Frege and Carnap on the Normativity of Logic” as a source for possible types of normativity, and then turn to our own proposal, which postulates that various logics are constitutive for thought within particular practices, but none are constitutive for thought as such.
We examine George Boolos's proposed abstraction principle for extensions based on the limitation-of-size conception, New V, from several perspectives. Crispin Wright once suggested that New V could serve as part of a neo-logicist development of real analysis. We show that it fails both of the conservativeness criteria for abstraction principles that Wright proposes. Thus, we support Boolos against Wright. We also show that, when combined with the axioms for Boolos's iterative notion of set, New V yields a system equivalent to full Zermelo-Fraenkel set theory with a principle of global choice. This advances Boolos's longstanding interest in the foundations of set theory.
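For readers who want the principle in front of them, New V is standardly rendered as the abstraction principle below, where Big(F) abbreviates "F is equinumerous with the whole domain"; this is the familiar textbook formulation, not a quotation from the paper:

\[
\forall F\,\forall G\,\bigl[\mathrm{Ext}(F) = \mathrm{Ext}(G) \;\equiv\; \bigl((\mathrm{Big}(F) \wedge \mathrm{Big}(G)) \vee \forall x\,(Fx \equiv Gx)\bigr)\bigr]
\]

Identifying the extensions of all Big (too-large) concepts is what blocks the Russell-style reasoning that refutes unrestricted Basic Law V.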
Number words seemingly function both as adjectives attributing cardinality properties to collections, as in Frege’s ‘Jupiter has four moons’, and as names referring to numbers, as in Frege’s ‘The number of Jupiter’s moons is four’. This leads to what Thomas Hofweber calls Frege’s Other Puzzle: How can number words function as modifiers and as singular terms if neither adjectives nor names can serve multiple semantic functions? Whereas most philosophers deny that one of these uses is genuine, we instead argue that number words, like many related expressions, are polymorphic, having multiple uses whose meanings are systematically related via type shifting.
This paper uses neo-Fregean-style abstraction principles to develop the integers from the natural numbers (assuming Hume’s principle), the rational numbers from the integers, and the real numbers from the rationals. The first two are first-order abstractions that treat pairs of numbers: (DIF) INT(a,b)=INT(c,d) ≡ (a+d)=(b+c). (QUOT) Q(m,n)=Q(p,q) ≡ (n=0 & q=0) ∨ (n≠0 & q≠0 & m⋅q=n⋅p). The development of the real numbers is an adaptation of the Dedekind program involving “cuts” of rational numbers. Let P be a property (of rational numbers) and r a rational number. Say that r is an upper bound of P, written P≤r, if for any rational number s, if Ps then either s<r or s=r. In other words, P≤r if r is greater than or equal to any rational number that P applies to. Consider the Cut Abstraction Principle: (CP) ∀P∀Q(C(P)=C(Q) ≡ ∀r(P≤r ≡ Q≤r)). In other words, the cut of P is identical to the cut of Q if and only if P and Q share all of their upper bounds. The axioms of second-order real analysis can be derived from (CP), just as the axioms of second-order Peano arithmetic can be derived from Hume’s principle. The paper raises some of the philosophical issues connected with the neo-Fregean program, using the above abstraction principles as case studies.
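Typeset for readability, the three abstraction principles quoted in the abstract are:

\[
\textbf{(DIF)}\quad \mathrm{INT}(a,b) = \mathrm{INT}(c,d) \;\equiv\; a + d = b + c
\]
\[
\textbf{(QUOT)}\quad \mathrm{Q}(m,n) = \mathrm{Q}(p,q) \;\equiv\; (n = 0 \wedge q = 0) \vee (n \neq 0 \wedge q \neq 0 \wedge m \cdot q = n \cdot p)
\]
\[
\textbf{(CP)}\quad \forall P\,\forall Q\,\bigl(\mathrm{C}(P) = \mathrm{C}(Q) \;\equiv\; \forall r\,(P \leq r \equiv Q \leq r)\bigr)
\]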
According to ante rem structuralism a branch of mathematics, such as arithmetic, is about a structure, or structures, that exist independent of the mathematician, and independent of any systems that exemplify the structure. A structure is a universal of sorts: structure is to exemplified system as property is to object. So ante rem structuralism is a form of ante rem realism concerning universals. Since the appearance of my Philosophy of Mathematics: Structure and Ontology, a number of criticisms of the idea of ante rem structures have appeared. Some argue that it is impossible to give identity conditions for places in homogeneous ante rem structures, invoking a version of the identity of indiscernibles. Others raise issues concerning the identity and distinctness of places in different structures, such as the natural number 2 and the real number 2. The purpose of this paper is to take the measure of these objections, and to further articulate ante rem structuralism to take them into account.
At the beginning of Die Grundlagen der Arithmetik [1884], Frege observes that “it is in the nature of mathematics to prefer proof, where proof is possible”. This, of course, is true, but thinkers differ on why it is that mathematicians prefer proof. And what of propositions for which no proof is possible? What of axioms? This talk explores various notions of self-evidence, and the role they play in various foundational systems, notably those of Frege and Zermelo. I argue that both programs are undermined at a crucial point, namely when self-evidence is supported by holistic and even pragmatic considerations.
Typically, a logic consists of a formal or informal language together with a deductive system and/or a model-theoretic semantics. The language is, or corresponds to, a part of a natural language like English or Greek. The deductive system is to capture, codify, or simply record which inferences are correct for the given language, and the semantics is to capture, codify, or record the meanings, or truth-conditions, or possible truth conditions, for at least part of the language.
This paper discusses the neo-logicist approach to the foundations of mathematics by highlighting an issue that arises from looking at the Bad Company objection from an epistemological perspective. For the most part, our issue is independent of the details of any resolution of the Bad Company objection and, as we will show, it concerns other foundational approaches in the philosophy of mathematics. In the first two sections, we give a brief overview of the "Scottish" neo-logicist school, present a generic form of the Bad Company objection and introduce an epistemic issue connected to this general problem that will be the focus of the rest of the paper. In the third section, we present an alternative approach within philosophy of mathematics, a view that emerges from Hilbert's Grundlagen der Geometrie (Leipzig: Teubner, 1899; trans. E. Townsend as Foundations of Geometry, La Salle, Illinois: Open Court, 1959). We will argue that Bad Company-style worries, and our concomitant epistemic issue, also affect this conception and other foundationalist approaches. In the following sections, we then offer various ways to address our epistemic concern, arguing, in the end, that none resolves the issue. The final section offers our own resolution which, however, runs against the foundationalist spirit of the Scottish neo-logicist program.
Hellman and Shapiro explore the development of the idea of the continuous, from the Aristotelian view that a true continuum cannot be composed of points to the now standard, entirely punctiform frameworks for analysis and geometry. They then investigate the underlying metaphysical issues concerning the nature of space or space-time.
This edited collection covers Friedrich Waismann's most influential contributions to twentieth-century philosophy of language: his concepts of open texture and language strata, his early criticism of verificationism and the analytic-synthetic distinction, as well as their significance for experimental and legal philosophy. In addition, Waismann's original papers in ethics, metaphysics, epistemology and the philosophy of mathematics are here evaluated. They introduce Waismann's theory of action along with his groundbreaking work on fiction, proper names and Kafka's Trial. Waismann is known as the voice of Ludwig Wittgenstein in the Vienna Circle. At the same time we find in his works a determined critic of logical positivism and ordinary language philosophy, who anticipated much later developments in the analytic tradition and devised his very own vision for its future.
The neo-logicist argues that standard mathematics can be derived by purely logical means from abstraction principles—such as Hume's Principle—which are held to be 'epistemically innocent'. We show that the second-order axiom of comprehension applied to non-instantiated properties and the standard first-order existential instantiation and universal elimination principles are essential for the derivation of key results, specifically a theorem of infinity, but have not been shown to be epistemically innocent. We conclude that the epistemic innocence of mathematics has not been established by the neo-logicist.
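For reference, Hume's Principle, the abstraction principle at issue, is standardly written as follows, where F ≈ G abbreviates "there is a one-to-one correspondence between the Fs and the Gs"; this is the standard formulation, not a quotation from the paper:

\[
\forall F\,\forall G\,\bigl[\#F = \#G \;\equiv\; F \approx G\bigr]
\]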
There is an interesting logical/semantic issue with some mathematical languages and theories. In the language of (pure) complex analysis, the two square roots of –1 are indiscernible: anything true of one of them is true of the other. How, then, does the singular term 'i' manage to pick out a unique object? This is perhaps the most prominent example of the phenomenon, but there are some others. The issue is related to matters concerning the use of definite descriptions and singular pronouns, such as donkey anaphora and the problem of indistinguishable participants. Taking a cue from some work in linguistics and the philosophy of language, I suggest that i functions like a parameter in natural deduction systems. This may require some rethinking of the role of singular terms, at least in mathematical languages.
In this paper, we outline and critically evaluate Thomas Hofweber’s solution to a semantic puzzle he calls Frege’s Other Puzzle. After sketching the Puzzle and two traditional responses to it—the Substantival Strategy and the Adjectival Strategy—we outline Hofweber’s proposed version of Adjectivalism. We argue that two key components—the syntactic and semantic components—of Hofweber’s analysis both suffer from serious empirical difficulties. Ultimately, this suggests that an altogether different solution to Frege’s Other Puzzle is required.
Graham Priest's In Contradiction (Dordrecht: Martinus Nijhoff Publishers, 1987, chapter 3) contains an argument concerning the intuitive, or ‘naïve’ notion of (arithmetic) proof, or provability. He argues that the intuitively provable arithmetic sentences constitute a recursively enumerable set, which has a Gödel sentence which is itself intuitively provable. The incompleteness theorem does not apply, since the set of provable arithmetic sentences is not consistent. The purpose of this article is to sharpen Priest's argument, avoiding reference to informal notions, consensus, or Church's thesis. We add Priest's dialetheic semantics to ordinary Peano arithmetic PA, to produce a recursively axiomatized formal system PA★ that contains its own truth predicate. Whether one is a dialetheist or not, PA★ is a legitimate, rigorously defined formal system, and one can explore its proof-theoretic properties. The system is inconsistent (but presumably non-trivial), and it proves its own Gödel sentence as well as its own soundness. Although this much is perhaps welcome to the dialetheist, it has some untoward consequences. There are purely arithmetic (indeed, Π0) sentences that are both provable and refutable in PA★. So if the dialetheist maintains that PA★ is sound, then he must hold that there are true contradictions in the most elementary language of arithmetic. Moreover, the thorough dialetheist must hold that there is a number g which both is and is not the code of a derivation of the indicated Gödel sentence of PA★. For the thorough dialetheist, it follows that ordinary PA and even Robinson arithmetic are themselves inconsistent theories. I argue that this is a bitter pill for the dialetheist to swallow.
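As a reminder of the fixed point being exploited, the Gödel sentence G of a recursively axiomatized theory such as PA★ satisfies, provably in the theory, a biconditional of roughly the following shape; this is the standard diagonal-lemma schema, not Shapiro's own notation:

\[
G \;\equiv\; \neg\,\mathrm{Prov}_{\mathrm{PA}^{\star}}(\ulcorner G \urcorner)
\]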
The purpose of this paper is to assess the prospects for a neo-logicist development of set theory based on a restriction of Frege's Basic Law V, which we call (RV): ∀P∀Q[Ext(P) = Ext(Q) ≡ ((BAD(P) & BAD(Q)) ∨ ∀x(Px ≡ Qx))]. BAD is taken as a primitive property of properties. We explore the features it must have for (RV) to sanction the various strong axioms of Zermelo–Fraenkel set theory. The primary interpretation is where ‘BAD’ is Dummett's ‘indefinitely extensible’. 1 Background: what and why? 2 Framework. 3 GOOD candidates, indefinite extensibility. 4 The framework of (RV) alone, or almost alone. 5 The axioms. 6 Brief closing.
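Displayed, the restricted principle (on the reconstruction given above) is:

\[
\textbf{(RV)}\quad \forall P\,\forall Q\,\bigl[\mathrm{Ext}(P) = \mathrm{Ext}(Q) \;\equiv\; \bigl((\mathrm{BAD}(P) \wedge \mathrm{BAD}(Q)) \vee \forall x\,(Px \equiv Qx)\bigr)\bigr]
\]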
One prominent criticism of the abstractionist program is the so-called Bad Company objection. The complaint is that abstraction principles cannot in general be a legitimate way to introduce mathematical theories, since some of them are inconsistent. The most notorious example, of course, is Frege’s Basic Law V. A common response to the objection suggests that an abstraction principle can be used to legitimately introduce a mathematical theory precisely when it is stable: when it can be made true on all sufficiently large domains. In this paper, we raise a worry for this response to the Bad Company objection. We argue, perhaps surprisingly, that it requires very strong assumptions about the range of the second-order quantifiers; assumptions that the abstractionist should reject.
We develop a point-free construction of the classical one-dimensional continuum, with an interval structure based on mereology and either a weak set theory or logic of plural quantification. In some respects this realizes ideas going back to Aristotle, although, unlike Aristotle, we make free use of classical "actual infinity". Also, in contrast to intuitionistic, Bishop, and smooth infinitesimal analysis, we follow classical analysis in allowing partitioning of our "gunky line" into mutually exclusive and exhaustive disjoint parts, thereby demonstrating the independence of "indecomposability" from a non-punctiform conception. It is surprising that such simple axioms as ours already imply the Archimedean property and that they determine an isomorphism with the Dedekind-Cantor structure of R as a complete, separable, ordered field. We also present some simple topological models of our system, establishing consistency relative to classical analysis. Finally, after describing how to nominalize our theory, we close with comparisons with earlier efforts related to our own.
§1. Overview. Philosophers and mathematicians have drawn lots of conclusions from Gödel's incompleteness theorems, and related results from mathematical logic. Languages, minds, and machines figure prominently in the discussion. Gödel's theorems surely tell us something about these important matters. But what? A descriptive title for this paper would be “Gödel, Lucas, Penrose, Turing, Feferman, Dummett, mechanism, optimism, reflection, and indefinite extensibility”. Adding “God and the Devil” would probably be redundant. Despite the breath-taking, whirlwind tour, I have the modest aim of forging connections between different parts of this literature and clearing up some confusions, together with the less modest aim of not introducing any more confusions. I propose to focus on three spheres within the literature on incompleteness. The first, and primary, one concerns arguments that Gödel's theorem refutes the mechanistic thesis that the human mind is, or can be accurately modeled as, a digital computer or a Turing machine. The most famous instance is the much reprinted J. R. Lucas [18]. To summarize, suppose that a mechanist provides plans for a machine, M, and claims that the output of M consists of all and only the arithmetic truths that a human, or the totality of human mathematicians, will ever or can ever know. We assume that the output of M is consistent.