Hilary Putnam once suggested that “the actual existence of sets as ‘intangible objects’ suffers… from a generalization of a problem first pointed out by Paul Benacerraf… are sets a kind of function or are functions a sort of set?” Sadly, he did not elaborate; my aim, here, is to do so on his behalf. There are well-known methods for treating sets as functions and functions as sets. But these do not raise any obvious philosophical or foundational puzzles. For that, we first need to provide a full-fledged function theory. I supply such a theory: it axiomatizes the iterative notion of function in exactly the same sense that ZF axiomatizes the iterative notion of set. Indeed, this function theory is synonymous with ZF. It might seem that set theory and function theory present us with rival foundations for mathematics, since they postulate different ontologies. But appearances are deceptive: set theory and function theory are the very same foundation.
The motivating question of this paper is: ‘How are our beliefs in the theorems of mathematics justified?’ This is distinguished from the question ‘How are our mathematical beliefs reliably true?’ We examine an influential answer, outlined by Russell, championed by Gödel, and developed by those searching for new axioms to settle undecidables, that our mathematical beliefs are justified by intuitions, as our scientific beliefs are justified by observations. On this view, axioms are analogous to laws of nature. They are postulated to best systematize the data to be explained. We argue that there is a decisive difference between the cases. There is agreement on the data to be systematized in the scientific case that has no analog in the mathematical one. There is virtual consensus on observations but conspicuous dispute over intuitions. In this respect, mathematics more closely resembles paradigmatic philosophy. We conclude by distinguishing two ideas that have long been associated -- realism (the idea that there is an independent reality) and objectivity (the idea that in a disagreement, only one of us can be right). We argue that, while realism is true of mathematics and philosophy, these domains fail to be fully objective. One upshot of the discussion is a kind of pragmatism. Factual questions in mathematics, modality, logic, and evaluative areas go proxy for non-factual practical ones.
Context: Consistency of mathematical constructions in numerical analysis, and the application of computerized proofs, in light of the occurrence of numerical chaos in simple systems. Purpose: To show that a computer in general, and a numerical analysis in particular, can add its own peculiarities to the subject under study. Hence the need for thorough theoretical studies of chaos in numerical simulation, and hence a questioning of what, e.g., a numerical disproof of a theorem in physics or a prediction in numerical economics could mean. Method: A simple algebraic model system is subjected to a deeper structure of underlying variables. With an algorithm simulating the steps in taking a limit of second-order difference quotients, the error terms are studied against the background of their algebraic expression. Results: With the algorithm applied to a simple quadratic polynomial system, we found unstably amplified round-off errors. The possibility of numerical chaos is already known, but not in so simple a system as the one used in our paper. The amplification of the errors implies that it is not possible, with computer means, to constructively show that the algebra and the numerical analysis will ‘in the long run’ converge to each other and the error term will vanish. The algebraic vanishing of the error term cannot be demonstrated with the use of the computer because the round-off errors are amplified. In philosophical terms, the amplification of the round-off error is equivalent to the continuum hypothesis. This means that the requirement of (numerical) construction of mathematical objects is no safeguard against inference-only conclusions about the qualities of (numerical) mathematical objects. Unstably amplified round-off errors are the same type of problem as the ordering in size of transfinite cardinal numbers. The difference is that the former problem is created within the requirements of constructive mathematics.
This can be seen as the reward for working in a numerically constructive way.
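The unstable amplification described in this abstract can be illustrated with a minimal numerical sketch (my own illustration, not the paper's algorithm): for a quadratic polynomial, the second-order difference quotient is algebraically exact, yet as the step size h shrinks, round-off errors in the numerator are divided by h^2 and eventually dominate the exact value.

```python
# Minimal sketch (an assumption-laden illustration, not the paper's code):
# second-order difference quotients for a quadratic polynomial, where the
# algebraic answer is exactly 2, but round-off error is amplified as h -> 0.

def f(x):
    """A simple quadratic polynomial; algebraically f''(x) = 2 everywhere."""
    return x * x

def second_difference(x, h):
    """Second-order difference quotient approximating f''(x)."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)

def max_error(x=1.0, exponents=range(1, 11)):
    """Largest deviation from the exact value 2 over shrinking steps h = 10^-k."""
    return max(abs(second_difference(x, 10.0 ** -k) - 2.0) for k in exponents)

if __name__ == "__main__":
    # For moderate h the quotient is near 2; for tiny h the numerator's
    # cancellation error, divided by h*h, swamps the algebraic value.
    for k in range(1, 11):
        h = 10.0 ** -k
        print(f"h = 1e-{k}:  error = {abs(second_difference(1.0, h) - 2.0):.3e}")
```

The point of the sketch matches the abstract's claim: no finite computation shows the error term vanishing, because shrinking h past a threshold makes the computed value diverge from, rather than converge to, the algebraic limit.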
I discuss Benacerraf's epistemological challenge for realism about areas like mathematics, metalogic, and modality, and describe the pluralist response to it. I explain why normative pluralism is peculiarly unsatisfactory, and use this explanation to formulate a radicalization of Moore's Open Question Argument. According to the argument, the facts -- even the normative facts -- fail to settle the practical questions at the center of our normative lives. One lesson is that the concepts of realism and objectivity, which are widely identified, are actually in tension.
Gideon Rosen, Brian Leiter, and Catarina Dutilh Novaes raise deep questions about the arguments in Morality and Mathematics (M&M). Their objections bear on practical deliberation, the formulation of mathematical pluralism, the problem of universals, the argument from moral disagreement, moral ‘perception’, the contingency of our mathematical practices, and the purpose of proof. In this response, I address their objections, and the broader issues that they raise.
Logical monism is the view that there is ‘One True Logic’. This is the default position, against which pluralists react. If there were not ‘One True Logic’, it is hard to see how there could be one true theory of anything. A theory is closed under a logic! But what is logical monism? In this article, I consider semantic, logical, modal, scientific, and metaphysical proposals. I argue that, on no ‘factualist’ analysis (according to which ‘there is One True Logic’ expresses a factual claim, rather than an attitude like approval), does the doctrine have both metaphysical and methodological import. Metaphysically, logics abound. Methodologically, what to infer from what is not settled by the facts, even the normative ones. I conclude that the only interesting sense in which there could be One True Logic is noncognitive. The same may be true of monism about normative areas, like moral, epistemic, and prudential ones, generally.
In a paper titled “The Unreasonable Effectiveness of Mathematics”, published 20 years after Wigner’s seminal paper, the mathematician Richard W. Hamming discussed what he took to be Wigner’s problem of Unreasonable Effectiveness and offered some partial explanations for this phenomenon. Whether Hamming succeeds in his explanations as answers to Wigner’s puzzle has been addressed by other scholars in recent years. I, on the other hand, raise a more fundamental question: does Hamming succeed in raising the same question as Wigner? The answer is no. My goal is to show that Hamming’s reading misses Wigner’s highly original formulation of the problem. Through a close and contextual reading of Wigner’s work, as I will show, we are led in new directions in addressing and solving the applicability problem.
This paper argues that, insofar as we doubt the bivalence of the Continuum Hypothesis or the truth of the Axiom of Choice, we should also doubt the consistency of third-order arithmetic, both the classical and intuitionistic versions. Underlying this argument is the following philosophical view. Mathematical belief springs from certain intuitions, each of which can be either accepted or doubted in its entirety, but not half-accepted. Therefore, our beliefs about reality, bivalence, choice and consistency should all be aligned.
The article approaches the epistemological question of the concept of time from an anthropological psychology perspective. The differentiation between immanent perceptions and existence beyond immanent perception is the earliest conceptualization of time found so far in the traces of human civilizations. The research differentiates psychological time from time in modern physics and astronomy, taking as the basic hypothesis of inquiries into the concept of time the question: is the physical unit of time an ontological existence of things, or an inter-subjective concept? The research adopts transcendental philosophy in questioning unconsciousness in the sociology of knowledge, with a dissection between psychological time and physical units of time. With further questioning into the physics of time, and the nonphysical nature of time, I second the primordial graviton background's pragmatic approach, despite the falsifiability of the Big Bang theory and inflation.
Rudolf Carnap’s principle of tolerance states that there is no need to justify the adoption of a logic by philosophical means. Carnap uses the freedom provided by this principle in his philosophy of mathematics: he wants to capture the idea that mathematical truth is a matter of linguistic rules by relying on a strong metalanguage with infinitary inference rules. In this paper, I give a new interpretation of an argument by E. W. Beth, which shows that the principle of tolerance does not suffice to remove all obstacles to the employment of infinitary rules.
Wittgenstein's paradoxical theses that unproved propositions are meaningless, proofs form new concepts and rules, and contradictions are of limited concern, led to a variety of interpretations, most of them centered on rule-following skepticism. We argue, with the help of C. S. Peirce's distinction between corollarial and theorematic proofs, that his intuitions are better explained by resistance to what we call conceptual omniscience, treating meaning as fixed content specified in advance. We interpret the distinction in the context of modern epistemic logic and semantic information theory, and show how removing conceptual omniscience helps resolve Wittgenstein's paradoxes and explain the puzzle of deduction, its ability to generate new knowledge and meaning.
The paper introduces and utilizes a few new concepts: “nonstandard Peano arithmetic”, “complementary Peano arithmetic”, “Hilbert arithmetic”. They identify the foundations of both mathematics and physics, demonstrating the equivalence of the newly introduced Hilbert arithmetic and the separable complex Hilbert space of quantum mechanics, which in turn underlies physics and all the world. That new ground, both mathematical and physical, can be recognized as information, complemented and generalized by quantum information. A few fundamental mathematical problems of the present, such as Fermat’s last theorem, the four-color theorem as well as its newly formulated generalization as a “four-letter theorem”, Poincaré’s conjecture, and “P vs NP”, are considered over again, from and within the newly founded conceptual reference frame of information, as illustrations. Simple or crucially simplifying solutions and proofs are demonstrated. The link between the consistent completeness of the system mathematics-physics on the ground of information and all the great mathematical problems of the present (rather than only the enumerated ones) is suggested.
This paper investigates the determinacy of mathematics. We begin by clarifying how we are understanding the notion of determinacy before turning to the questions of whether and how famous independence results bear on issues of determinacy in mathematics. From there, we pose a metasemantic challenge for those who believe that mathematical language is determinate, motivate two important constraints on attempts to meet our challenge, and then use these constraints to develop an argument against determinacy and discuss a particularly popular approach to resolving indeterminacy, before offering some brief closing reflections. We believe our discussion poses a serious challenge for most philosophical theories of mathematics, since it puts considerable pressure on all views that accept a non-trivial amount of determinacy for even basic arithmetic.
This paper discusses the relevance of supertask computation for the determinacy of arithmetic. Recent work in the philosophy of physics has made plausible the possibility of supertask computers, capable of running through infinitely many individual computations in a finite time. A natural thought is that, if supertask computers are possible, this implies that arithmetical truth is determinate. In this paper we argue, via a careful analysis of putative arguments from supertask computations to determinacy, that this natural thought is mistaken: supertasks are of no help in explaining arithmetical determinacy.
Putnam’s most famous contribution to mathematical logic was his role in investigating Hilbert’s Tenth Problem; Putnam is the ‘P’ in the MRDP Theorem. This volume, though, focusses mostly on Putnam’s work on the philosophy of logic and mathematics. It is a somewhat bumpy ride. Of the twelve papers, two scarcely mention Putnam. Three others focus primarily on Putnam’s ‘Mathematics without foundations’ (1967), but with no interplay between them. The remaining seven papers apparently tackle unrelated themes. Some of this disjointedness would doubtless have been addressed, if Putnam had been able to compose his replies to these papers; sadly, he died before this was possible. In this review, I do my best to tease out some connections between the papers; and there are some really interesting connections to be made.
Callard (2007) argues that it is metaphysically possible that a mathematical object, although abstract, causally affects the brain. I raise the following objections. First, a successful defence of mathematical realism requires not merely the metaphysical possibility but rather the actuality that a mathematical object affects the brain. Second, mathematical realists need to confront a set of three pertinent issues: why a mathematical object does not affect other concrete objects and other mathematical objects, what counts as a mathematical object, and how we can have knowledge about an unchanging object.
It has been recently debated whether there exists a so-called “easy road” to nominalism. In this essay, I attempt to fill a lacuna in the debate by making a connection with the literature on infinite and infinitesimal idealization in science through an example from mathematical physics that has been largely ignored by philosophers. Specifically, by appealing to John Norton’s distinction between idealization and approximation, I argue that the phenomenon of fractional quantum statistics bears negatively on Mary Leng’s proposed path to easy road nominalism, thereby partially defending Mark Colyvan’s claim that there is no easy road to nominalism.
Ludwig Wittgenstein himself regarded his reflections on mathematics as his most significant contribution to philosophy. He initially intended to devote a central part of his Philosophische Untersuchungen to the topic. Indeed, hardly anywhere else in Wittgenstein's work does it become as clear how radical the consequences of his thinking actually are. Presumably for this reason, Wittgenstein's remarks on mathematics have provoked the greatest resistance of all his writings: Gödel himself described Wittgenstein's remarks on the Gödel incompleteness theorems as nonsense, and Alan Turing objected that, because of Wittgenstein's apparently tolerant attitude toward contradictions, bridges built with the help of mathematical calculations in Wittgenstein's sense might collapse. The contributions to this volume explain Wittgenstein's central considerations on mathematics, clear up widespread misunderstandings, and critically analyze Wittgenstein's significance for the traditional philosophy of mathematics. They also pursue the question of how far Wittgenstein's remarks on the philosophy of mathematics lead beyond his Philosophische Untersuchungen.
Mathematical realism asserts that mathematical objects exist in the abstract world, and that a mathematical sentence is true or false, depending on whether the abstract world is as the mathematical sentence says it is. I raise two objections against mathematical realism. First, the abstract world is queer in that it allows for contradictory states of affairs. Second, mathematical realism does not have a theoretical resource to explain why a sentence about a tricle is true or false. A tricle is an object that changes its shape from a triangle to a circle, and then back to a triangle, every second.
I defend a new position in philosophy of mathematics that I call mathematical inferentialism. It holds that a mathematical sentence can perform the function of facilitating deductive inferences from some concrete sentences to other concrete sentences, that a mathematical sentence is true if and only if all of its concrete consequences are true, that the abstract world does not exist, and that we acquire mathematical knowledge by confirming concrete sentences. Mathematical inferentialism has several advantages over mathematical realism and fictionalism.
This paper investigates the question of how we manage to single out the natural number structure as the intended interpretation of our arithmetical language. Horsten submits that the reference of our arithmetical vocabulary is determined by our knowledge of some principles of arithmetic on the one hand, and by our computational abilities on the other. We argue against such a view and we submit an alternative answer. We single out the structure of natural numbers through our intuition of the absolute notion of finiteness.
Mathematical realists have long invoked the categoricity of axiomatizations of arithmetic and analysis to explain how we manage to fix the intended meaning of their respective vocabulary. Can this strategy be extended to set theory? Although traditional wisdom recommends a negative answer to this question, Vann McGee (1997) has offered a proof that purports to show otherwise. I argue that one of the two key assumptions on which the proof rests deprives McGee's result of the significance he and the realist want to attribute to it. I consider two strategies to deal with the problem --- one of which is outlined by McGee himself (2000) --- and argue that both of them fail. I end with some remarks on the prospects for mathematical realism in the light of my discussion.
Indispensablists argue that when our belief system conflicts with our experiences, we can negate a mathematical belief, but we do not, because if we did, we would have to make an excessive revision of our belief system. Thus, we retain a mathematical belief not because we have good evidence for it but because it is convenient to do so. I call this view ‘mathematical convenientism’. I argue that mathematical convenientism commits the consequential fallacy and that it demolishes the Quine-Putnam indispensability argument and Baker’s enhanced indispensability argument.
How do axioms, or first principles, in ethics compare to those in mathematics? In this companion piece to G.C. Field's 1931 "On the Role of Definition in Ethics", I argue that there are similarities between the cases. However, these are premised on an assumption which can be questioned, and which highlights the peculiarity of normative inquiry.
Gödel’s philosophical conceptions bear striking similarities to Cantor’s. Although there is no conclusive evidence that Gödel deliberately used or adhered to Cantor’s views, one can successfully reconstruct and see his “Cantorianism” at work in many parts of his thought. In this paper, I aim to describe the most prominent conceptual intersections between Cantor’s and Gödel’s thought, particularly on such matters as the nature and existence of mathematical entities (sets), concepts, Platonism, the Absolute Infinite, the progress and inexhaustibility of mathematics.
Many recent writers in the philosophy of mathematics have put great weight on the relative categoricity of the traditional axiomatizations of our foundational theories of arithmetic and set theory. Another great enterprise in contemporary philosophy of mathematics has been Wright's and Hale's project of founding mathematics on abstraction principles. In earlier work, it was noted that one traditional abstraction principle, namely Hume's Principle, had a certain relative categoricity property, which here we term natural relative categoricity. In this paper, we show that most other abstraction principles are not naturally relatively categorical, so that there is in fact a large amount of incompatibility between these two recent trends in contemporary philosophy of mathematics. To better understand the precise demands of relative categoricity in the context of abstraction principles, we compare and contrast these constraints to stability-like acceptability criteria on abstraction principles, the Tarski-Sher logicality requirements on abstraction principles studied by Antonelli and Fine, and supervaluational ideas coming out of Hodes' work.
Conventionalism about mathematics claims that mathematical truths are true by linguistic convention. This is often spelled out by appealing to facts concerning rules of inference and formal systems, but this leads to a problem: since the incompleteness theorems, we have known that syntactic notions can be expressed using arithmetical sentences. There is serious prima facie tension here: how can mathematics be a matter of convention and syntax a matter of fact, given the arithmetization of syntax? This challenge has been pressed in the literature by Hilary Putnam and Peter Koellner. In this paper I sketch a conventionalist theory of mathematics, show that this conventionalist theory can meet the challenge just raised, and clarify the type of mathematical pluralism endorsed by the conventionalist by introducing the notion of a semantic counterpart. The paper’s aim is an improved understanding of conventionalism, pluralism, and the relationship between them.
I develop a non-representationalist account of mathematical thought, on which the point of mathematical theorizing is to provide us with the conceptual capacity to structure and articulate information about the physical world in an epistemically useful way. On my view, accepting a mathematical theory is not a matter of having a belief about some subject matter; it is rather a matter of structuring logical space, in a sense to be made precise. This provides an elegant account of the cognitive utility of mathematics. Further, it makes explicit how the brand of non-representationalism I develop is compatible with there being substantive rationality constraints on our mathematical theorizing.
Sometimes we give truth-conditions for sentences of a discourse in other terms. According to Agustín Rayo, when doing so it is sometimes legitimate to use the terms of that very discourse, so long as the terms do not occur in the truth-conditions themselves. I argue that giving truth-conditions in this "outscoping" way prevents one from answering "discourse threat" (for example, the threat of indeterminacy).
I contend that mathematical domains are freestanding institutional entities that, at least typically, are introduced to serve representational functions. In this paper, I outline an account of institutional reality and a supporting metaontological perspective that clarify the content of this thesis. I also argue that a philosophy of mathematics that has this thesis as its central tenet can account for the objectivity, necessity, and atemporality of mathematics.
Tennenbaum's Theorem yields an elegant characterisation of the standard model of arithmetic. Several authors have recently claimed that this result has important philosophical consequences: in particular, it offers us a way of responding to model-theoretic worries about how we manage to grasp the standard model. We disagree. If there ever was such a problem about how we come to grasp the standard model, then Tennenbaum's Theorem does not help. We show this by examining a parallel argument, from a simpler model-theoretic result.
It is often alleged that, unlike typical axioms of mathematics, the Continuum Hypothesis (CH) is indeterminate. This position is normally defended on the ground that the CH is undecidable in a way that typical axioms are not. Call this kind of undecidability “absolute undecidability”. In this paper, I seek to understand what absolute undecidability could be such that one might hope to establish that (a) CH is absolutely undecidable, (b) typical axioms are not absolutely undecidable, and (c) if a mathematical hypothesis is absolutely undecidable, then it is indeterminate. I shall argue that on no understanding of absolute undecidability could one hope to establish all of (a)–(c). However, I will identify one understanding of absolute undecidability on which one might hope to establish both (a) and (c) to the exclusion of (b). This suggests that a new style of mathematical antirealism deserves attention—one that does not depend on familiar epistemological or ontological concerns. The key idea behind this view is that typical mathematical hypotheses are indeterminate because they are relevantly similar to CH.
There is a long tradition comparing moral knowledge to mathematical knowledge. In this paper, I discuss apparent similarities and differences between knowledge in the two areas, realistically conceived. I argue that many of these are only apparent, while others are less philosophically significant than might be thought. The picture that emerges is surprising. There are definitely differences between epistemological arguments in the two areas. However, these differences, if anything, increase the plausibility of moral realism as compared to mathematical realism. It is hard to see how one might argue, on epistemological grounds, for moral antirealism while maintaining commitment to mathematical realism. But it may be possible to do the opposite.
This paper sketches an answer to the question how we, in our arithmetical practice, succeed in singling out the natural-number structure as our intended interpretation. It is argued that we bring this about by a combination of what we assert about the natural-number structure on the one hand, and our computational capacities on the other hand.
The foundations of probability deal with the problem of modelling reasoning in the face of uncertainty by a mathematical calculus, usually the standard probability calculus. The three dominating schools in the foundations of probability interpret probabilities as limiting long-run frequencies conceived as an objective property of series of repeatable experiments, or as rational betting rates for an individual to bet on the unknown outcome of experiments depending on the individual’s prior assessments updated by evidence, or as rational betting rates to bet on the unknown outcome of experiments depending on evidence only, but not on subjective assessments. Apart from the interpretation of probability, frequentism and Bayesianism in particular also differ with respect to the advocated methodology for inference. Frequentists use tests, estimators, and confidence intervals. Bayesians usually start with a prior distribution and use the posterior distribution, which is obtained by conditioning on the evidence, in order to carry out inferences. The prior distribution either models the individual’s personal prior probability assessments or, in objective Bayesianism, is chosen according to some rules in order to allow the evidence to determine the posterior. All three approaches are riddled with difficulties. Frequentism is often accused of circularity, because the assumption of independent identically distributed outcomes is needed in order to connect observations to frequentist probabilities, but ‘iid’ is itself defined probabilistically. Subjective Bayesianism is attacked for being too ….
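The methodological contrast this abstract describes can be made concrete with a toy coin-tossing sketch (my own illustration, not drawn from the text): the frequentist reports the observed relative frequency together with a normal-approximation confidence interval, while the Bayesian updates a Beta prior on the data and reports the posterior.

```python
import math

# Toy illustration (an assumption of this sketch, not the abstract's example):
# estimating a coin's bias p from n tosses with k heads, frequentist vs Bayesian.

def frequentist_estimate(k, n, z=1.96):
    """Relative frequency plus a normal-approximation 95% confidence interval."""
    p_hat = k / n
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat, (p_hat - half_width, p_hat + half_width)

def bayesian_posterior(k, n, alpha=1.0, beta=1.0):
    """Conjugate Beta(alpha, beta) prior updated on k heads in n tosses.

    The posterior is Beta(alpha + k, beta + n - k); we return its mean.
    """
    a_post, b_post = alpha + k, beta + (n - k)
    return a_post / (a_post + b_post)

if __name__ == "__main__":
    k, n = 7, 10
    print(frequentist_estimate(k, n))  # point estimate 0.7 plus an interval
    print(bayesian_posterior(k, n))    # posterior mean (7+1)/(10+2), about 0.667
```

The Bayesian answer visibly depends on the prior (here a flat Beta(1, 1)), which is exactly the locus of the subjective/objective dispute mentioned above, while the frequentist answer depends on the iid modelling assumption the circularity objection targets.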
Julian Cole argues that mathematical domains are the products of social construction. This view has an initial appeal in that it seems to salvage much that is good about traditional platonistic realism without taking on the ontological baggage. However, it also has problems. After a brief sketch of social constructivist theories and Cole’s philosophy of mathematics, I evaluate the arguments in favor of social constructivism. I also discuss two substantial problems with the theory. I argue that unless and until social constructivists can address the two concerns, we have reason to be skeptical about social constructivism in the philosophy of mathematics.
Kurt Gödel made many affirmations of robust realism but also showed serious engagement with the idealist tradition, especially with Leibniz, Kant, and Husserl. The root of this apparently paradoxical attitude is his conviction of the power of reason. The paper explores the question of how Gödel read Kant. His argument that relativity theory supports the idea of the ideality of time is discussed critically, in particular attempting to explain the assertion that science can go beyond the appearances and ‘approach the things’. Leibniz and post-Kantian idealism are discussed more briefly, the latter as documented in the correspondence with Gotthard Günther.
Second-order axiomatizations of certain important mathematical theories—such as arithmetic and real analysis—can be shown to be categorical. Categoricity implies semantic completeness, and semantic completeness in turn implies determinacy of truth-value. Second-order axiomatizations are thus appealing to realists as they sometimes seem to offer support for the realist thesis that mathematical statements have determinate truth-values. The status of second-order logic is a controversial issue, however. Worries about ontological commitment have been influential in the debate. Recently, Vann McGee has argued that one can get some of the technical advantages of second-order axiomatizations—categoricity, in particular—while walking free of worries about ontological commitment. In so arguing he appeals to the notion of an open-ended schema—a schema that holds no matter how the language of the relevant theory is extended. Contra McGee, we argue that second-order quantification and open-ended schemas are on a par when it comes to ontological commitment.
Recent years have seen a growing acknowledgement within the mathematical community that mathematics is cognitively/socially constructed. Yet to anyone doing mathematics, it seems totally objective. The sensation in pursuing mathematical research is of discovering prior (eternal) truths about an external (abstract) world. Although the community can and does decide which topics to pursue and which axioms to adopt, neither an individual mathematician nor the entire community can choose whether a particular mathematical statement is true or false, based on the given axioms. Moreover, all the evidence suggests that all practitioners work with the same ontology. (My number 7 is exactly the same as yours.) How can we reconcile the notion that people construct mathematics, with this apparent choice-free, predetermined objectivity? I believe the answer is to be found by examining what mathematical thinking is (as a mental activity) and the way the human brain acquired the capacity for mathematical thinking.
The purpose of this paper is to apply Crispin Wright’s criteria and various axes of objectivity to mathematics. I test the criteria and the objectivity of mathematics against each other. Along the way, various issues concerning general logic and epistemology are encountered.
Kurt Gödel is almost as famous—one might say “notorious”—for his extreme platonist views as he is famous for his mathematical theorems. Moreover his platonism is not a myth; it is well-documented in his writings. Here are two platonist declarations about set theory, the first from his paper about Bertrand Russell and the second from the revised version of his paper on the Continuum Hypothesis. Classes and concepts may, however, also be conceived as real objects, namely classes as “pluralities of things” or as structures consisting of a plurality of things, and concepts as the properties and relations of things existing independently of our definitions and constructions. It seems to me that the assumption of such objects is quite as legitimate as the assumption of physical bodies and there is quite as much reason to believe in their existence. But, despite their remoteness from sense experience, we do have something like a perception also of the objects of set theory, as is seen from the fact that the axioms force themselves upon us as being true. I don't see any reason why we should have less confidence in this kind of perception, i.e., in mathematical intuition, than in sense perception. The first statement is a platonist declaration of a fairly standard sort concerning set theory. What is unusual in it is the inclusion of concepts among the objects of mathematics. This I will explain below. The second statement expresses what looks like a rather wild thesis.
I examine various claims to the effect that Cantor's Continuum Hypothesis and other problems of higher set theory are ill-posed questions. The analysis takes into account the viability of the underlying philosophical views and recent mathematical developments.
In his book Wittgenstein on the Foundations of Mathematics, Crispin Wright notes that remarkably little has been done to provide an unpictorial, substantial account of what mathematical platonism comes to. Wright proposes to investigate whether there is not some more substantial doctrine than the familiar images underpinning the platonist view. He begins with the suggestion that the essential platonist claim is that mathematical truth is objective. Although he does not demarcate them as such, Wright proposes several different tests for objectivity. The paper finds problems with each of these tests.