I exploit parallel considerations in the philosophy of mind and metaethics to argue that the reasoning employed in an important argument for panpsychism overgeneralizes to support an analogous position in metaethics: panmoralism. Next, I raise a number of problems for panmoralism and thereby build a case for taking the metaethical parallel to be a reductio ad absurdum of the argument for panpsychism. Finally, I contrast panmoralism with a position recently defended by Einar Duenger Bohn and argue that the two suffer from similar problems. I conclude by drawing some general lessons for panpsychism.
Logical paradoxes – like the Liar, Russell's, and the Sorites – are notorious. But in Paradoxes and Inconsistent Mathematics, it is argued that they are only the noisiest of many. Contradictions arise in the everyday, from the smallest points to the widest boundaries. In this book, Zach Weber uses “dialetheic paraconsistency” – a formal framework where some contradictions can be true without absurdity – as the basis for developing this idea rigorously, from mathematical foundations up. In doing so, Weber directly addresses a longstanding open question: how much standard mathematics can paraconsistency capture? The guiding focus is on a more basic question, of why there are paradoxes. Details underscore a simple philosophical claim: that paradoxes are found in the ordinary, and that is what makes them so extraordinary.
Should we believe our controversial philosophical views? Recently, several authors have argued from broadly conciliationist premises that we should not. If they are right, we philosophers face a dilemma: If we believe our views, we are irrational. If we do not, we are not sincere in holding them. This paper offers a way out, proposing an attitude we can rationally take toward our views that can support sincerity of the appropriate sort. We should arrive at our views via a certain sort of ‘insulated’ reasoning – that is, reasoning that involves setting aside certain higher-order worries, such as those provided by disagreement – when we investigate philosophical questions.
Prevailing opinion—defended by Jason Brennan and others—is that voting to change the outcome is irrational, since although the payoffs of tipping an election can be quite large, the probability of doing so is extraordinarily small. This paper argues that prevailing opinion is incorrect. Voting is shown to be rational so long as two conditions are satisfied: First, the average social benefit of electing the better candidate must be at least twice as great as the individual cost of voting, and second, the chance of casting the decisive vote must be at least 1/N, where N stands for the number of citizens. It is argued that both of these conditions are often true in the real world.
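To make the arithmetic behind these two conditions explicit, here is a simple expected-value sketch (an illustrative reconstruction under an other-regarding framing, not necessarily the paper's own derivation). Write p for the probability of casting the decisive vote, N for the number of citizens, b for the average social benefit per citizen of electing the better candidate, and c for the individual cost of voting. If the voter counts the full social benefit, then
\[
\mathbb{E}[\text{net benefit of voting}] \;=\; p \cdot N b - c \;\ge\; \tfrac{1}{N} \cdot N b - c \;=\; b - c \;\ge\; 2c - c \;=\; c \;>\; 0,
\]
so under the two stated conditions the expected benefit of voting exceeds its cost.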
This paper is about how to aggregate outside opinion. If two experts are on one side of an issue, while three experts are on the other side, what should a non-expert believe? Certainly, the non-expert should take into account more than just the numbers. But which other factors are relevant, and why? According to the view developed here, one important factor is whether the experts should have been expected, in advance, to reach the same conclusion. When the agreement of two (or of twenty) thinkers can be predicted with certainty in advance, their shared belief is worth only as much as one of their beliefs would be worth alone. This expectational model of belief dependence can be applied whether we think in terms of credences or in terms of all-or-nothing beliefs.
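A toy Bayesian illustration of this limiting case (mine, not the paper's own formalism): let E1 and E2 be the experts' reports bearing on hypothesis H, and suppose that, given the first report, the second was guaranteed to agree whether or not H is true, i.e. P(E2 | E1, H) = P(E2 | E1, ¬H) = 1. Then
\[
\frac{P(E_1, E_2 \mid H)}{P(E_1, E_2 \mid \lnot H)} \;=\; \frac{P(E_2 \mid E_1, H)\,P(E_1 \mid H)}{P(E_2 \mid E_1, \lnot H)\,P(E_1 \mid \lnot H)} \;=\; \frac{P(E_1 \mid H)}{P(E_1 \mid \lnot H)},
\]
so the pair of reports carries exactly the evidential force of the first report alone.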
Ng (1995: 255–285, https://doi.org/10.1007/bf00852469) models the evolutionary dynamics underlying the existence of suffering and enjoyment and concludes that there is likely to be more suffering than enjoyment in nature. In this paper, we find an error in Ng’s model that, when fixed, negates the original conclusion. Instead, the model offers only ambiguity as to whether suffering or enjoyment predominates in nature. We illustrate the dynamics around suffering and enjoyment with the most plausible parameters. In our illustration, we find surprising results: the rate of failure to reproduce can improve or worsen average welfare depending on other characteristics of a species. Our illustration suggests that for organisms with more intense conscious experiences, the balance of enjoyment and suffering may lean more toward suffering. We offer some suggestions for empirical study of wild animal welfare. We conclude by noting that recent writings on wild animal welfare should be revised based on this correction to have a somewhat less pessimistic view of nature.
There is a well-known moral quandary concerning how to account for the rightness or wrongness of acts that clearly contribute to some morally significant outcome – but which each seem too small, individually, to make any meaningful difference. One consequentialist-friendly response to this problem is to deny that there could ever be a case of this type. This paper pursues this general strategy, but in an unusual way. Existing arguments for the consequentialist-friendly position are sorites-style arguments. Such arguments imagine varying a subject’s predicament bit by bit until it is clear that a relevant difference has been achieved. The arguments offered in this paper are structurally different, and do not rely on any sorites series. For this reason, they are not vulnerable to objections that have been leveled against the sorites-style arguments.
Although arguments for and against competing theories of vagueness often appeal to claims about the use of vague predicates by ordinary speakers, such claims are rarely tested. An exception is Bonini et al. (1999), who report empirical results on the use of vague predicates by Italian speakers, and take the results to count in favor of epistemicism. Yet several methodological difficulties mar their experiments; we outline these problems and devise revised experiments that do not show the same results. We then describe three additional empirical studies that investigate further claims in the literature on vagueness: the hypothesis that speakers confuse ‘P’ with ‘definitely P’, the relative persuasiveness of different formulations of the inductive premise of the Sorites, and the interaction of vague predicates with three different forms of negation.
Methods available for the axiomatization of arbitrary finite-valued logics can be applied to obtain sound and complete intelim rules for all truth-functional connectives of classical logic including the Sheffer stroke and Peirce’s arrow. The restriction to a single conclusion in standard systems of natural deduction requires the introduction of additional rules to make the resulting systems complete; these rules are nevertheless still simple and correspond straightforwardly to the classical absurdity rule. Omitting these rules results in systems for intuitionistic versions of the connectives in question.
This paper begins an axiomatic development of naive set theory in a paraconsistent logic. Results divide into two sorts. There is classical recapture, where the main theorems of ordinal and Peano arithmetic are proved, showing that naive set theory can provide a foundation for standard mathematics. Then there are major extensions, including proofs of the famous paradoxes and the axiom of choice (in the form of the well-ordering principle). At the end I indicate how later developments of cardinal numbers will lead to Cantor’s theorem, the existence of large cardinals, and a counterexample to the continuum hypothesis.
This paper develops a (nontrivial) theory of cardinal numbers from a naive set comprehension principle, in a suitable paraconsistent logic. To underwrite cardinal arithmetic, the axiom of choice is proved. A new proof of Cantor’s theorem is provided, as well as a method for demonstrating the existence of large cardinals by way of a reflection theorem.
In this book, Zach Kelehear offers readers a new perspective on an important, dynamic, and sometimes daunting issue: managing successful school-based leadership. The author uses an arts-based approach to weave together notions of research-based leadership skills for successful school-based management with standards of professional competence as represented by the Interstate School Leaders Licensure Consortium Standards for School Leaders.
Paraconsistent logic makes it possible to study inconsistent theories in a coherent way. From its modern start in the mid-20th century, paraconsistency was intended for use in mathematics, providing a rigorous framework for describing abstract objects and structures where some contradictions are allowed, without collapse into incoherence. Over the past decades, this initiative has evolved into an area of non-classical mathematics known as inconsistent or paraconsistent mathematics. This Element provides a selective introductory survey of this research program, distinguishing between ‘moderate’ and ‘radical’ approaches. The emphasis is on philosophical issues and future challenges.
A textbook for modal and other intensional logics based on the Open Logic Project. It covers normal modal logics, relational semantics, axiomatic and tableaux proof systems, intuitionistic logic, and counterfactual conditionals.
Do truth tables—the ordinary sort that we use in teaching and explaining basic propositional logic—require an assumption of consistency for their construction? In this essay we show that truth tables can be built in a consistency-independent paraconsistent setting, without any appeal to classical logic. This is evidence for a more general claim—that when we write down the orthodox semantic clauses for a logic, whatever logic we presuppose in the background will be the logic that appears in the foreground. Rather than any one logic being privileged, then, on this count partisans across the logical spectrum are in relatively similar dialectical positions.
This essay examines the anti-producing human body in its limit case of public self-induced starvation, as figured in Franz Kafka's short story ‘A Hunger Artist’ and Steve McQueen's film Hunger. Both works represent the fasting body as hollowed out, a resistance to capitalist-spectator capture that spatialises itself as a smoothing, a relative reconfiguration of parts to whole through the evacuation of flows. In both works the human body becomes a local body without organs, paradoxically disarticulated from the more complex assemblages that constitute it while recording potential circuits of disturbance or resonance predicated upon the porousness of bodily boundaries.
In the 1920s, Ackermann and von Neumann, in pursuit of Hilbert's programme, were working on consistency proofs for arithmetical systems. One proposed method of giving such proofs is Hilbert's epsilon-substitution method. There was, however, a second approach which was not reflected in the publications of the Hilbert school in the 1920s, and which is a direct precursor of Hilbert's first epsilon theorem and a certain "general consistency result" due to Bernays. An analysis of the form of this so-called "failed proof" sheds further light on an interpretation of Hilbert's programme as an instrumentalist enterprise with the aim of showing that whenever a "real" proposition can be proved by "ideal" means, it can also be proved by "real", finitary means.
A benefit of randomized experiments is that covariate distributions of treatment and control groups are balanced on average, resulting in simple unbiased estimators for treatment effects. However, it is possible that a particular randomization yields covariate imbalances that researchers want to address in the analysis stage through adjustment or other methods. Here we present a randomization test that conditions on covariate balance by only considering treatment assignments that are similar to the observed one in terms of covariate balance. Previous conditional randomization tests have only allowed for categorical covariates, while our randomization test allows for any type of covariate. Through extensive simulation studies, we find that our conditional randomization test is more powerful than unconditional randomization tests and other conditional tests. Furthermore, we find that our conditional randomization test is valid both unconditionally across levels of covariate balance and conditionally on particular levels of covariate balance. Meanwhile, unconditional randomization tests are valid in the former sense but not the latter. Finally, we find that our conditional randomization test is similar to a randomization test that uses a model-adjusted test statistic.
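To give the flavor of the procedure, here is a minimal Python sketch of a conditional randomization test of the kind described; the difference-in-means statistic, the Mahalanobis-distance balance measure, and the tolerance are illustrative choices of mine, not necessarily the authors'.

    import numpy as np

    def mahalanobis_balance(X, z):
        # Mahalanobis distance between treated and control covariate means.
        diff = X[z == 1].mean(axis=0) - X[z == 0].mean(axis=0)
        cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
        return float(diff @ cov_inv @ diff)

    def conditional_randomization_test(y, z_obs, X, n_draws=10000, tol=1.0, seed=0):
        # Randomization test that conditions on covariate balance: only
        # assignments whose balance is within tol times the observed
        # balance contribute to the reference distribution.
        rng = np.random.default_rng(seed)
        n, n_treated = len(z_obs), int(z_obs.sum())
        obs_balance = mahalanobis_balance(X, z_obs)
        obs_stat = y[z_obs == 1].mean() - y[z_obs == 0].mean()

        ref_stats = [obs_stat]  # the observed assignment satisfies the condition
        for _ in range(n_draws):
            z = np.zeros(n, dtype=int)
            z[rng.choice(n, size=n_treated, replace=False)] = 1
            if mahalanobis_balance(X, z) <= tol * obs_balance:
                ref_stats.append(y[z == 1].mean() - y[z == 0].mean())

        ref_stats = np.asarray(ref_stats)
        return float(np.mean(np.abs(ref_stats) >= abs(obs_stat)))

A rejection-sampling loop like this makes the conditioning idea explicit, though in practice one would sample the constrained assignments more efficiently.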
This paper considers a generalisation of the sorites paradox, in which only topological notions are employed. We argue that by increasing the level of abstraction in this way, we see the sorites paradox in a new, more revealing light—a light that forces attention on cut-off points of vague predicates. The generalised sorites paradox presented here also gives rise to a new, more tractable definition of vagueness.
Some of the most important developments of symbolic logic took place in the 1920s. Foremost among them are the distinction between syntax and semantics and the formulation of questions of completeness and decidability of logical systems. David Hilbert and his students played a very important part in these developments. Their contributions can be traced to unpublished lecture notes and other manuscripts by Hilbert and Bernays dating to the period 1917-1923. The aim of this paper is to describe these results, focussing primarily on propositional logic, and to put them in their historical context. It is argued that truth-value semantics, syntactic ("Post-") and semantic completeness, decidability, and other results were first obtained by Hilbert and Bernays in 1918, and that Bernays's role in their discovery and the subsequent development of mathematical logic is much greater than has so far been acknowledged.
The naive set theory problem is to begin with a full comprehension axiom, and to find a logic strong enough to prove theorems, but weak enough not to prove everything. This paper considers the sub-problem of expressing extensional identity and the subset relation in paraconsistent, relevant solutions, in light of a recent proposal from Beall, Brady, Hazen, Priest and Restall. The main result is that the proposal, in the context of an independently motivated formalization of naive set theory, leads to triviality.
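For orientation, the naive comprehension scheme in question is the usual one: for every formula φ(x) there is a set whose members are exactly the φs,
\[
\exists y\, \forall x\, (x \in y \leftrightarrow \varphi(x)),
\]
which in classical logic yields Russell's paradox (take φ(x) to be x ∉ x) and with it triviality; the paraconsistent, relevant approaches discussed here keep the scheme and weaken the logic instead.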
Conciliationism faces a challenge that has not been satisfactorily addressed. There are clear cases of epistemically significant merely possible disagreement, but there are also clear cases where merely possible disagreement is epistemically irrelevant. Conciliationists have not yet accounted for this asymmetry. In this paper, we propose that the asymmetry can be explained by positing a selection constraint on all cases of peer disagreement—whether actual or merely possible. If a peer’s opinion was not selected in accordance with the proposed constraint, then it lacks epistemic significance. This allows us to distinguish the epistemically significant cases of merely possible disagreement from the insignificant ones.
The period from 1900 to 1935 was particularly fruitful and important for the development of logic and logical metatheory. This survey is organized along eight "itineraries" concentrating on historically and conceptually linked strands in this development. Itinerary I deals with the evolution of conceptions of axiomatics. Itinerary II centers on the logical work of Bertrand Russell. Itinerary III presents the development of set theory from Zermelo onward. Itinerary IV discusses the contributions of the algebra of logic tradition, in particular, Löwenheim and Skolem. Itinerary V surveys the work in logic connected to the Hilbert school, and Itinerary VI deals specifically with consistency proofs and metamathematics, including the incompleteness theorems. Itinerary VII traces the development of intuitionistic and many-valued logics. Itinerary VIII surveys the development of semantical notions from the early work on axiomatics up to Tarski's work on truth.
After a brief flirtation with logicism around 1917, David Hilbert proposed his own program in the foundations of mathematics in 1920 and developed it, in concert with collaborators such as Paul Bernays and Wilhelm Ackermann, throughout the 1920s. The two technical pillars of the project were the development of axiomatic systems for ever stronger and more comprehensive areas of mathematics, and finitistic proofs of consistency of these systems. Early advances in these areas were made by Hilbert (and Bernays) in a series of lecture courses at the University of Göttingen between 1917 and 1923, and notably in Ackermann's dissertation of 1924. The main innovation was the invention of the ε-calculus, on which Hilbert's axiom systems were based, and the development of the ε-substitution method as a basis for consistency proofs. The paper traces the development of the "simultaneous development of logic and mathematics" through the ε-notation and provides an analysis of Ackermann's consistency proofs for primitive recursive arithmetic and for the first comprehensive mathematical system, the latter using the substitution method. It is striking that these proofs use transfinite induction not dissimilar to that used in Gentzen's later consistency proof as well as non-primitive recursive definitions, and that these methods were accepted as finitistic at the time.
Hilbert's ε-calculus is based on an extension of the language of predicate logic by a term-forming operator εx. Two fundamental results about the ε-calculus, the first and second epsilon theorem, play a rôle similar to that which the cut-elimination theorem plays in sequent calculus. In particular, Herbrand's Theorem is a consequence of the epsilon theorems. The paper investigates the epsilon theorems and the complexity of the elimination procedure underlying their proof, as well as the length of Herbrand disjunctions of existential theorems obtained by this elimination procedure.
This note motivates a logic for a theory that can express its own notion of logical consequence—a ‘syntactically closed’ theory of naive validity. The main issue for such a logic is Curry’s paradox, which is averted by the failure of contraction. The logic features two related, but different, implication connectives. A Hilbert system is proposed that is complete and non-trivial.
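For background, the Curry reasoning that such a logic must block runs as follows (a standard presentation, not specific to this note): given a sentence C equivalent to C → p for an arbitrary p, the left-to-right direction gives C → (C → p); contraction turns this into C → p; the right-to-left direction then yields C, and modus ponens yields p. In symbols,
\[
C \to (C \to p) \;\;\Rightarrow_{\text{contraction}}\;\; C \to p \;\;\Rightarrow\;\; C \;\;\Rightarrow\;\; p,
\]
so rejecting contraction blocks the derivation at its second step.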
In the early 1920s, the German mathematician David Hilbert (1862–1943) put forward a new proposal for the foundation of classical mathematics which has come to be known as Hilbert's Program. It calls for a formalization of all of mathematics in axiomatic form, together with a proof that this axiomatization of mathematics is consistent. The consistency proof itself was to be carried out using only what Hilbert called “finitary” methods. The special epistemological character of finitary reasoning then yields the required justification of classical mathematics. Although Hilbert proposed his program in this form only in 1921, various facets of it are rooted in foundational work of his going back until around 1900, when he first pointed out the necessity of giving a direct consistency proof of analysis. Work on the program progressed significantly in the 1920s with contributions from logicians such as Paul Bernays, Wilhelm Ackermann, John von Neumann, and Jacques Herbrand. It was also a great influence on Kurt Gödel, whose work on the incompleteness theorems was motivated by Hilbert's Program. Gödel's work is generally taken to show that Hilbert's Program cannot be carried out. It has nevertheless continued to be an influential position in the philosophy of mathematics, and, starting with the work of Gerhard Gentzen in the 1930s, work on so-called Relativized Hilbert Programs has been central to the development of proof theory.
Mereotopology is a theory of connected parts. The existence of boundaries, as parts of everyday objects, is basic to any such theory; but in classical mereotopology, there is a problem: if boundaries exist, then either distinct entities cannot be in contact, or else space is not topologically connected. In this paper we urge that this problem can be met with a paraconsistent mereotopology, and sketch the details of one such approach. The resulting theory focuses attention on the role of empty parts, in delivering a balanced and bounded metaphysics of naive space.
Abstraction and idealization are the two notions that are most often discussed in the context of assumptions employed in the process of model building. These notions are also routinely used in philosophical debates such as that on the mechanistic account of explanation. Indeed, an objection to the mechanistic account has recently been formulated precisely on these grounds: mechanists cannot account for the common practice of idealizing difference-making factors in models in molecular biology. In this paper I revisit the debate and I argue that the objection does not stand up to scrutiny. This is because it is riddled with a number of conceptual inconsistencies. By attempting to resolve the tensions, I also draw several general lessons regarding the difficulties of applying abstraction and idealization in scientific practice. Finally, I argue that more care is needed only when speaking of abstraction and idealization in a context in which these concepts play an important role in an argument, such as that on mechanistic explanation.
The proof theory of many-valued systems has not been investigated to an extent comparable to the work done on axiomatizability of many-valued logics. Proof theory requires appropriate formalisms, such as sequent calculus, natural deduction, and tableaux for classical (and intuitionistic) logic. One particular method for systematically obtaining calculi for all finite-valued logics was invented independently by several researchers, with slight variations in design and presentation. The main aim of this report is to develop the proof theory of finite-valued first order logics in a general way, and to present some of the more important results in this area. Systems covered are the resolution calculus, sequent calculus, tableaux, and natural deduction. This report is actually a template, from which all results can be specialized to particular logics.
The Curry-Howard isomorphism is a proof-theoretic result that establishes a connection between derivations in natural deduction and terms in typed lambda calculus. It is an important result in its own right, but it also underlies the development of type systems for programming languages. This fact suggests that the result may also be important for a philosophy of code.
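A standard one-line illustration of the correspondence (textbook material, not drawn from the abstract): the natural-deduction proof of A → (B → A) that assumes A, assumes B, and then discharges both assumptions corresponds to the typed term
\[
\lambda x^{A}.\, \lambda y^{B}.\, x \;:\; A \to (B \to A),
\]
i.e. the K combinator, and normalizing the proof corresponds to β-reducing the term.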
On some accounts of vagueness, predicates like “is a heap” are tolerant. That is, their correct application tolerates sufficiently small changes in the objects to which they are applied. Of course, such views face the sorites paradox, and various solutions have been proposed. One proposed solution involves banning repeated appeals to tolerance, while affirming tolerance in any individual case. In effect, this solution rejects the reasoning of the sorites argument. This paper discusses a thorny problem afflicting this approach to vagueness. In particular, it is shown that, on the foregoing view, whether an object is a heap will sometimes depend on factors extrinsic to that object, such as whether its components came from other heaps. More generally, the paper raises the issue of how to count heaps in a tolerance-friendly framework.
First-order Gödel logics are a family of finite- or infinite-valued logics where the sets of truth values V are closed subsets of [0,1] containing both 0 and 1. Different such sets V in general determine different Gödel logics GV (sets of those formulas which evaluate to 1 in every interpretation into V). It is shown that GV is axiomatizable iff V is finite, V is uncountable with 0 isolated in V, or every neighborhood of 0 in V is uncountable. Complete axiomatizations for each of these cases are given. The r.e. prenex, negation-free, and existential fragments of all first-order Gödel logics are also characterized.
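For context, the connectives of Gödel logics are interpreted by the standard Gödel truth functions on V (standard definitions, supplied here for readability): conjunction and disjunction are minimum and maximum, and
\[
v(A \to B) =
\begin{cases}
1 & \text{if } v(A) \le v(B),\\
v(B) & \text{otherwise,}
\end{cases}
\]
with the universal and existential quantifiers evaluated as infima and suprema over V; a formula belongs to GV when it takes value 1 under every such interpretation.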
Textbook on Gödel’s incompleteness theorems and computability theory, based on the Open Logic Project. Covers recursive function theory, arithmetization of syntax, the first and second incompleteness theorems, models of arithmetic, second-order logic, and the lambda calculus.
An introductory textbook on metalogic. It covers naive set theory, first-order logic, sequent calculus and natural deduction, the completeness, compactness, and Löwenheim-Skolem theorems, Turing machines, and the undecidability of the halting problem and of first-order logic. The audience is undergraduate students with some background in formal logic.
Roger White (2015) sketches an ingenious new solution to the problem of induction. He argues from the principle of indifference for the conclusion that the world is more likely to be induction-friendly than induction-unfriendly. But there is reason to be skeptical about the proposed indifference-based vindication of induction. It can be shown that, in the crucial test cases White concentrates on, the assumption of indifference renders induction no more accurate than random guessing. After discussing this result, the paper explains why the indifference-based argument seemed so compelling, despite ultimately being unsound.
Hilbert’s program was an ambitious and wide-ranging project in the philosophy and foundations of mathematics. In order to “dispose of the foundational questions in mathematics once and for all,” Hilbert proposed a two-pronged approach in 1921: first, classical mathematics should be formalized in axiomatic systems; second, using only restricted, “finitary” means, one should give proofs of the consistency of these axiomatic systems. Although Gödel’s incompleteness theorems show that the program as originally conceived cannot be carried out, it had many partial successes, and generated important advances in logical theory and metatheory, both at the time and since. The article discusses the historical background and development of Hilbert’s program, its philosophical underpinnings and consequences, and its subsequent development and influences since the 1930s.
Standard reasoning about Kripke semantics for modal logic is almost always based on a background framework of classical logic. Can proofs for familiar definability theorems be carried out using a nonclassical substructural logic as the metatheory? This article presents a semantics for positive substructural modal logic and studies the connection between frame conditions and formulas, via definability theorems. The novelty is that all the proofs are carried out with a noncontractive logic in the background. This sheds light on which modal principles are invariant under changes of metalogic, and provides evidence for the general viability of nonclassical mathematics.
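A familiar example of the kind of frame correspondence at stake (standard modal-logic background, not a result specific to this article): the T axiom
\[
\Box p \to p
\]
is valid on a Kripke frame exactly when the accessibility relation is reflexive; the question here is which such correspondence arguments still go through when the background metatheory is noncontractive.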
The Church-Turing Thesis is widely regarded as true, because of evidence that there is only one genuine notion of computation. By contrast, there are nowadays many different formal logics, and different corresponding foundational frameworks. Which ones can deliver a theory of computability? This question sets up a difficult challenge: the meanings of basic mathematical terms are not stable across frameworks. While it is easy to compare what different frameworks say, it is not so easy to compare what they mean. We argue for some minimal conditions that must be met if two frameworks are to be compared; if frameworks are radical enough, comparison becomes hopeless. Our aim is to clarify the dialectical situation in this burgeoning area of research, shedding light on the nature of non-classical logic and the notion of computation alike.
This paper begins an analysis of the real line using an inconsistency-tolerant (paraconsistent) logic. We show that basic field and compactness properties hold, by way of novel proofs that make no use of consistency-reliant inferences; some techniques from constructive analysis are used instead. While no inconsistencies are found in the algebraic operations on the real number field, prospects for other non-trivializing contradictions are left open.
The epsilon operator is a term-forming operator which replaces quantifiers in ordinary predicate logic. The application of this undervalued formalism has been hampered by the absence of well-behaved proof systems on the one hand, and accessible presentations of its theory on the other. One significant early result for the original axiomatic proof system for the epsilon-calculus is the first epsilon theorem, for which a proof is sketched. The system itself is discussed, also relative to possible semantic interpretations. The problems facing the development of proof-theoretically well-behaved systems are outlined.
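For readers new to the formalism, the operator is governed by the critical axiom and lets the quantifiers be defined away (the standard definitions of the epsilon-calculus):
\[
A(t) \to A(\varepsilon x\, A(x)), \qquad \exists x\, A(x) \;:\equiv\; A(\varepsilon x\, A(x)), \qquad \forall x\, A(x) \;:\equiv\; A(\varepsilon x\, \lnot A(x)).
\]
Intuitively, εx A(x) names a witness of A if there is one; the first epsilon theorem says, roughly, that a quantifier-free formula derivable with the help of epsilon terms and critical axioms is already derivable without them.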