In this essay Kelvin Beckett argues that Richard Peters's major work on education, Ethics and Education, belongs on a short list of important texts we can all share. He argues this not because of the place it has in the history of philosophy of education, as important as that is, but because of the contribution it can still make to the future of the discipline. The limitations of Peters's analysis of the concept of education in his chapter on “Criteria of Education” are well known. In the chapter on “Education as Initiation,” however, Peters offered a synthetic sketch of education that, Beckett argues, points us toward a more comprehensive definition of education, one which, he maintains, can be accepted by all philosophers, regardless of the tradition they work in.
In this article, I argue that Paulo Freire's liberatory conception of education is interesting, challenging, even transforming because central to it are important aspects of education which other philosophers marginalise. I also argue that Freire's critics are right when they claim that he paid insufficient attention to another important aspect of education. Finally, I argue for a conception of education which takes account of the strengths and at the same time overcomes the limitations of Freire's liberatory conception.
John Dewey adopted a child-centered point of view to illuminate aspects of education he believed teacher-centered educators were neglecting, but he did so self-consciously and self-critically, because he also believed that ‘a new order of conceptions leading to new modes of practice’ was needed. Dewey introduced his new conceptions in The Child and the Curriculum and later and more fully in Democracy and Education. Teachers at his Laboratory School in Chicago developed the new modes of practice. In this article, I explore Dewey's new conception of education and compare it with the apparently opposed views of R. S. Peters and Paulo Freire. In doing so, I show that, despite their criticisms of Dewey, whether explicit or implicit, these influential philosophers, representing quite different traditions in philosophy of education, were in substantial agreement with him. I also show that, despite our own differences, as important as they are, the conception of education Dewey, Peters, and Freire developed can provide us with the foundation we need to understand the changing teacher–learner relationship and the purposes their shared activities serve, as we watch teachers and learners at work in a rapidly changing society, now on a global scale, and in classrooms which are themselves changing, driven largely by new technologies.
After a brief account of the problem of higher-order vagueness, and its seeming intractability, I explore what comes of the issue on a linguistic, contextualist account of vagueness. On the view in question, predicates like ‘borderline red’ and ‘determinately red’ are, or at least can be, vague, but they are different in kind from ‘red’. In particular, ‘borderline red’ and ‘determinately red’ are not colour predicates. These predicates have linguistic components, and invoke notions like ‘competent user of the language’. On my view, so-called ‘higher-order vagueness’ is actually ordinary, first-order vagueness in different predicates. I explore the possibility that, nevertheless, a pernicious regress ensues.
In this article I shall concern myself with the question ‘Is some type of justification required in order for belief in God to be rational?’ Many philosophers and theologians in the past would have responded affirmatively to this question. However, in our own day, there are those who maintain that natural theology in any form is not necessary. This is because of the rise of a different understanding of the nature of religious belief. Unlike what most people in the past thought, religious belief is not in any sense arrived at or inferred on the basis of other known propositions. On the contrary, belief in God is taken to be as basic as a person's belief in the existence of himself, of the chair in which he is sitting, or the past. The old view that there must be a justification of religious belief, whether known or unknown, is held to be mistaken. One of the most outspoken advocates of this view is Alvin Plantinga. According to Plantinga the mature theist ought not to accept belief in God as a conclusion from other things he believes. Rather, he should accept it as basic, as a part of the bedrock of his noetic structure. ‘The mature theist commits himself to belief in God; this means that he accepts belief in God as basic.’
We defend the many-worlds interpretation (MWI) of quantum mechanics against the objection that it cannot explain why measurement outcomes are predicted by the Born probability rule. We understand quantum probabilities in terms of an observer's self-location probabilities. We formulate a probability postulate for the MWI: the probability of self-location in a world with a given set of outcomes is the absolute square of that world's amplitude. We provide a proof of this postulate, which assumes the quantum formalism and two principles concerning symmetry and locality. We also show how a structurally similar proof of the Born rule is available for collapse theories. We conclude by comparing our account to the recent account offered by Sebens and Carroll.
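To fix notation, here is a minimal formal statement of the postulate as described above (the symbols are ours, not necessarily the authors'). If the post-measurement universal state is a superposition of orthonormal world-states, the postulate reads
$$|\Psi\rangle = \sum_i \alpha_i\,|w_i\rangle \quad\Longrightarrow\quad P(\text{self-location in } w_i) = |\alpha_i|^2,$$
where normalization $\langle\Psi|\Psi\rangle = 1$ guarantees that the self-location probabilities $|\alpha_i|^2$ sum to one, in agreement with the Born rule.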
The primary quantum mechanical equation of motion entails that measurements typically do not have determinate outcomes, but result in superpositions of all possible outcomes. Dynamical collapse theories (e.g. GRW) supplement this equation with a stochastic Gaussian collapse function, intended to collapse the superposition of outcomes into one outcome. But the Gaussian collapses are imperfect in a way that leaves the superpositions intact. This is the tails problem. There are several ways of making this problem more precise. But many authors dismiss the problem without considering the more severe formulations. Here I distinguish four distinct tails problems. The first (bare tails problem) and second (structured tails problem) exist in the literature. I argue that while the first is a pseudo-problem, the second has not been adequately addressed. The third (multiverse tails problem) reformulates the second to account for recently discovered dynamical consequences of collapse. Finally, the fourth (tails problem dilemma) shows that solving the third by replacing the Gaussian with a non-Gaussian collapse function introduces new conflict with relativity theory.
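For readers new to the formalism, the standard one-dimensional GRW localization operator (textbook material, not quoted from this paper) shows why tails persist. A collapse "hit" centered at position $x$ multiplies the wave function by a Gaussian and renormalizes:
$$\psi(q) \;\mapsto\; \frac{L_x\,\psi(q)}{\lVert L_x\,\psi\rVert}, \qquad L_x = (\pi r_c^2)^{-1/4}\exp\!\left(-\frac{(q-x)^2}{2 r_c^2}\right),$$
with $r_c$ the localization width. Since a Gaussian is nowhere zero, terms of a superposition far from $x$ are strongly suppressed but never annihilated; those surviving remnants are the tails.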
Graham Priest's In Contradiction (Dordrecht: Martinus Nijhoff Publishers, 1987, chapter 3) contains an argument concerning the intuitive, or ‘naïve’ notion of (arithmetic) proof, or provability. He argues that the intuitively provable arithmetic sentences constitute a recursively enumerable set, which has a Gödel sentence which is itself intuitively provable. The incompleteness theorem does not apply, since the set of provable arithmetic sentences is not consistent. The purpose of this article is to sharpen Priest's argument, avoiding reference to informal notions, consensus, or Church's thesis. We add Priest's dialetheic semantics to ordinary Peano arithmetic PA, to produce a recursively axiomatized formal system PA★ that contains its own truth predicate. Whether one is a dialetheist or not, PA★ is a legitimate, rigorously defined formal system, and one can explore its proof‐theoretic properties. The system is inconsistent (but presumably non‐trivial), and it proves its own Gödel sentence as well as its own soundness. Although this much is perhaps welcome to the dialetheist, it has some untoward consequences. There are purely arithmetic (indeed, Π0) sentences that are both provable and refutable in PA★. So if the dialetheist maintains that PA★ is sound, then he must hold that there are true contradictions in the most elementary language of arithmetic. Moreover, the thorough dialetheist must hold that there is a number g which both is and is not the code of a derivation of the indicated Gödel sentence of PA★. For the thorough dialetheist, it follows that ordinary PA and even Robinson arithmetic are themselves inconsistent theories. I argue that this is a bitter pill for the dialetheist to swallow.
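Schematically, the diagonal construction behind this is the standard Gödelian one (reconstructed here, not quoted from the article): diagonalization yields a sentence $G$ with
$$\mathrm{PA}^{\star} \vdash G \leftrightarrow \neg\,\mathrm{Prov}_{\mathrm{PA}^{\star}}(\ulcorner G\urcorner).$$
If $\mathrm{PA}^{\star}$ proves $G$, as the abstract reports, then by representability of its proof relation it also proves $\mathrm{Prov}_{\mathrm{PA}^{\star}}(\ulcorner G\urcorner)$, and hence, via the biconditional, $\neg G$; so $G$ is both provable and refutable, which is exactly the inconsistency described above.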
Guthrie contends that religion can best be understood as systematic anthropomorphism - the attribution of human characteristics to nonhuman things and events. Religion, he says, consists of seeing the world as humanlike. He offers a fascinating array of examples to show how this strategy pervades secular life and how it characterizes religious experience.
In this paper, I assume that if we have libertarian freedom, it is located in the power to choose and its exercise. Given this assumption, I then further assume a version of the Principle of Alternative Possibilities (PAP) which states that an agent is morally responsible for his choice only if he could have chosen otherwise. With these assumptions in place, I examine three recent attempts to construct Frankfurt‐style counterexamples to PAP. I argue that all fail to undermine the intuitive plausibility of PAP.
The growing range of methods for statistical model selection is inspiring new debates about how to handle the potential for conflicting results when different methods are applied to the same data. While many factors enter into choosing a model selection method, we focus on the implications of disagreements among scientists about whether, and in what sense, the true probability distribution is included in the candidate set of models. While this question can be addressed empirically, the data often provide inconclusive results in practice. In such cases, we argue that differences in prior metaphysical views about the local adequacy of the models can produce underdetermination of results, even for the same data and candidate models. As a result, data alone are sometimes insufficient to settle rational beliefs about nature.
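A concrete illustration of such conflicts (a standard example of ours, not drawn from the paper): the Akaike and Bayesian information criteria score the same fitted model differently,
$$\mathrm{AIC} = 2k - 2\ln\hat{L}, \qquad \mathrm{BIC} = k\ln n - 2\ln\hat{L},$$
where $k$ is the number of free parameters, $n$ the sample size, and $\hat{L}$ the maximized likelihood. Because $\ln n > 2$ once $n > e^2 \approx 7.4$, BIC penalizes complexity more heavily than AIC, so the two criteria can rank the same candidate models differently on the same data.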
Integrated information theory (IIT) is a theory of consciousness that was originally formulated, and is standardly still expressed, in terms of controversial interpretations of its own ontological and epistemological basis. These form the orthodox interpretation of IIT. The orthodox epistemological interpretation is the axiomatic method, whereby IIT is ultimately derived from, justified by, and beholden to a set of phenomenological axioms. The orthodox ontological interpretation is panpsychism, according to which consciousness is fundamental, intrinsic, and pervasive. In this paper it is argued that both components of the orthodox interpretation should be rejected. But IIT should not be rejected since an interpretation-neutral formulation is available. After explaining the neutral formulation, more plausible non-axiomatic epistemologies are defended. The neutral formulation is then shown to be consistent with various contemporary physicalist ontologies of consciousness, including the phenomenal concept strategy, representationalism, and even illusionism. Along the way, instructive connections between interpretations of IIT and interpretations of quantum mechanics are noted.
Does consciousness collapse the quantum wave function? This idea was taken seriously by John von Neumann and Eugene Wigner but is now widely dismissed. We develop the idea by combining a mathematical theory of consciousness (integrated information theory) with an account of quantum collapse dynamics (continuous spontaneous localization). Simple versions of the theory are falsified by the quantum Zeno effect, but more complex versions remain compatible with empirical evidence. In principle, versions of the theory can be tested by experiments with quantum computers. The upshot is not that consciousness-collapse interpretations are clearly correct, but that there is a research program here worth exploring.
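One schematic way such a combination might look (an illustrative sketch under our own simplifying assumptions, not the authors' actual equations): let the continuous spontaneous localization collapse rate for a system depend on its integrated information $\Phi$,
$$\lambda_{\mathrm{eff}} = \lambda_0\, f(\Phi), \qquad f \text{ increasing},\; f(0) = 1,$$
so that superpositions involving high-$\Phi$ (conscious) systems collapse much faster than at the baseline rate $\lambda_0$. The quantum Zeno worry mentioned above arises because a sufficiently aggressive mechanism of this kind could continually project, and so freeze, the dynamics of conscious systems; that is one way the simple versions get falsified.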
The many worlds interpretation of quantum mechanics (MWI) states that the world we live in is just one among many parallel worlds. It is widely believed that because of this commitment to parallel worlds, the MWI violates common sense. Some go so far as to reject the MWI on this basis. This is despite its myriad advantages for physics (e.g. consistency with relativity theory, mathematical simplicity, realism, determinism, etc.). Here, we make the case that common sense in fact favors the MWI. We argue that causal explanations are commonsensical only when they are local causal explanations. We present several quantum mechanical experiments that seem to exhibit nonlocal “action at a distance”. Under the assumption that only one world exists, these experiments seem immune to local causal explanation. However, we show that the MWI, by taking all worlds together, can provide local causal explanations of the experiments. The MWI therefore restores common sense to physical explanation.
We introduce a new type of pluralism about biological function that, in contrast to existing accounts, demonstrates a practical integration among the term's different meanings. In particular, we show how to generalize Sandra Mitchell's notion of integrative pluralism to circumstances where multiple epistemic tools of the same type are jointly necessary to solve scientific problems. We argue that the multiple definitions of biological function operate jointly in this way based on how biologists explain the evolution of protein function. To clarify how our account relates to existing views, we introduce a general typology for monist and pluralist accounts along with standardized criteria for judging which is best supported by evidence.
Do numbers, sets, and so forth, exist? What do mathematical statements mean? Are they literally true or false, or do they lack truth values altogether? Addressing questions that have attracted lively debate in recent years, Stewart Shapiro contends that standard realist and antirealist accounts of mathematics are both problematic. As Benacerraf first noted, we are confronted with the following powerful dilemma. The desired continuity between mathematical and, say, scientific language suggests realism, but realism in this context suggests seemingly intractable epistemic problems. As a way out of this dilemma, Shapiro articulates a structuralist approach. On this view, the subject matter of arithmetic, for example, is not a fixed domain of numbers independent of each other, but rather is the natural number structure, the pattern common to any system of objects that has an initial object and successor relation satisfying the induction principle. Using this framework, realism in mathematics can be preserved without troublesome epistemic consequences. Shapiro concludes by showing how a structuralist approach can be applied to wider philosophical questions such as the nature of an "object" and the Quinean nature of ontological commitment. Clear, compelling, and tautly argued, Shapiro's work, noteworthy both in its attempt to develop a full-length structuralist approach to mathematics and in its tracing of structuralism's emergence in the history of mathematics, will be of deep interest to both philosophers and mathematicians.
You might be surprised to learn that China, home of the much-derided one-child policy, has a higher birth rate than Italy, home of the Vatican. This suggests Chinese families are quietly defying their political leaders and Italian families are quietly defying their religious ones. But the overall global picture is one of rapid population growth.
Logical pluralism is the view that different logics are equally appropriate, or equally correct. Logical relativism is a pluralism according to which validity and logical consequence are relative to something. Stewart Shapiro explores various such views. He argues that the question of meaning shift is itself context-sensitive and interest-relative.
Stewart Shapiro's ambition in Vagueness in Context is to develop a comprehensive account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary according to their context: a person can be tall with respect to male accountants and not tall (even short) with respect to professional basketball players. The key feature of Shapiro's account is that the extensions of vague terms also vary in the course of conversations and that, in some cases, a competent speaker can go either way without sinning against the meaning of the words or the non-linguistic facts. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak; but vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other.
Stewart Shapiro's aim in Vagueness in Context is to develop both a philosophical and a formal, model-theoretic account of the meaning, function, and logic of vague terms in an idealized version of a natural language like English. It is a commonplace that the extensions of vague terms vary with such contextual factors as the comparison class and paradigm cases. A person can be tall with respect to male accountants and not tall with respect to professional basketball players. The main feature of Shapiro's account is that the extensions of vague terms also vary in the course of a conversation, even after the external contextual features, such as the comparison class, are fixed. A central thesis is that in some cases, a competent speaker of the language can go either way in the borderline area of a vague predicate without sinning against the meaning of the words and the non-linguistic facts. Shapiro calls this open texture, borrowing the term from Friedrich Waismann. The formal model theory has a similar structure to the supervaluationist approach, employing the notion of a sharpening of a base interpretation. In line with the philosophical account, however, the notion of super-truth does not play a central role in the development of validity. The ultimate goal of the technical aspects of the work is to delimit a plausible notion of logical consequence, and to explore what happens with the sorites paradox. Later chapters deal with what passes for higher-order vagueness - vagueness in the notions of 'determinacy' and 'borderline' - and with vague singular terms, or objects. In each case, the philosophical picture is developed by extending and modifying the original account. This is followed by modifications to the model theory and the central meta-theorems. As Shapiro sees it, vagueness is a linguistic phenomenon, due to the kinds of languages that humans speak. But vagueness is also due to the world we find ourselves in, as we try to communicate features of it to each other. Vagueness is also due to the kinds of beings we are. There is no need to blame the phenomenon on any one of those aspects.
Chimpanzee language studies have generated much heated controversy, as Roger Fouts can attest from firsthand experience. Perhaps this is because language is usually considered to be what truly distinguishes humans from apes. If chimps can indeed be taught the rudiments of language, then the difference between them and us is not as great as we might have thought. It is a matter of degree rather than kind, a continuity, and our species is not so special after all. The advantage of this continuity thesis, as Fouts has emphasized, is that it conforms to the general tenets of evolutionary theory, and fits well with the evidence from paleontology and genetics that suggests that apes and humans are close cousins. It also ...
The ‘managed-metabolism’ hypothesis suggests that a ‘cooperation barrier’ must be overcome if self-producing chemical organizations are to undergo the transition from non-life to life. This dynamical barrier prevents un-managed autocatalytic networks of molecular species from individuating into complex, cooperative organizations. The barrier arises because molecular species that could otherwise make significant cooperative contributions to the success of an organization will often not be supported within the organization, and because side reactions and other ‘free-riding’ processes will undermine cooperation. As a result, the barrier seriously impedes the emergence of individuality, complex functionality and the transition to life. This barrier is analogous to the cooperation barrier that also impedes the emergence of complex cooperation at all levels of living organization. As has been shown at other levels of organization, the barrier can be overcome comprehensively by appropriate ‘management’. Management implements a system of evolvable constraints that can overcome the cooperation barrier by ensuring that beneficial co-operators are supported within the organization and by suppressing free-riders. In this way, management can control and manipulate the chemical processes of a collectively autocatalytic organization, producing novel processes that serve the interests of the organization as a whole and that could not arise and persist in an un-managed chemical organization. Management self-organizes because it is able to capture some of the benefits that are produced when its interventions promote cooperation, thereby enhancing productivity. Selection will therefore favour the emergence of managers that take over and manage chemical organizations so as to overcome the cooperation barrier. The managed-metabolism hypothesis demonstrates that if management is to overcome the cooperation barrier comprehensively, its interventions must be digitally coded. In this way, the hypothesis accounts for the two-tiered structure of all living cells in which a digitally-coded genetic apparatus manages an analogically-informed metabolism.
Aristotle is the most influential philosopher of practice, and Knight's new book explores the continuing importance of Aristotelian philosophy. First, it examines the theoretical bases of what Aristotle said about ethical, political and productive activity. It then traces ideas of practice through such figures as St Paul, Luther, Hegel, Heidegger and recent Aristotelian philosophers, and evaluates Alasdair MacIntyre's contribution. Knight argues that, whereas Aristotle's own thought legitimated oppression, MacIntyre's revision of Aristotelianism separates ethical excellence from social elitism and justifies resistance. With MacIntyre, Aristotelianism becomes revolutionary. MacIntyre's case for the Thomistic Aristotelian tradition originates in his attempt to elaborate a Marxist ethics informed by analytic philosophy. He analyses social practices in teleological terms, opposing them to capitalist institutions and arguing for the cooperative defence of our moral agency. In condensing these ideas, Knight advances a theoretical argument for the reformation of Aristotelianism and an ethical argument for social change.
This unique book by Stewart Shapiro looks at a range of philosophical issues and positions concerning mathematics in four comprehensive sections. Part I describes questions and issues about mathematics that have motivated philosophers since the beginning of intellectual history. Part II is an historical survey, discussing the role of mathematics in the thought of such philosophers as Plato, Aristotle, Kant, and Mill. Part III covers the three major positions held throughout the twentieth century: the idea that mathematics is logic (logicism), the view that the essence of mathematics is the rule-governed manipulation of characters (formalism), and a revisionist philosophy that focuses on the mental activity of mathematics (intuitionism). Finally, Part IV brings the reader up-to-date with a look at contemporary developments within the discipline. This sweeping introductory guide to the philosophy of mathematics makes these fascinating concepts accessible to those with little background in either mathematics or philosophy.
It is time to escape the constraints of the Systematics Wars narrative and pursue new questions that are better positioned to establish the relevance of the field in this time period to broader issues in the history of biology and history of science. To date, the underlying assumptions of the Systematics Wars narrative have led historians to prioritize theory over practice and the conflicts of a few leading theorists over the less-polarized interactions of systematists at large. We show how shifting to a practice-oriented view of methodology, centered on the trajectory of mathematization in systematics, demonstrates problems with the common view that one camp straightforwardly “won” over the other. In particular, we critique David Hull’s historical account in Science as a Process by demonstrating exactly the sort of intermediate level of positive sharing between phenetic and cladistic theories that undermines their mutually exclusive individuality as conceptual systems over time. It is misleading, or at least inadequate, to treat them simply as holistically opposed theories that can only interact by competition to the death. Looking to the future, we suggest that the concept of workflow provides an important new perspective on the history of mathematization and computerization in biology after World War II.
The purpose of this book is to explain Quantum Bayesianism (‘QBism’) to “people without easy access to mathematical formulas and equations” (4-5). QBism is an interpretation of quantum mechanics that “doesn’t meddle with the technical aspects of the theory [but instead] reinterprets the fundamental terms of the theory and gives them new meaning” (3). The most important motivation for QBism, enthusiastically stated on the book’s cover, is that QBism provides “a way past quantum theory’s paradoxes and puzzles” such that much of the weirdness associated with quantum theory “dissolves under the lens of QBism”.
The dominant response to this problem of the criterion focuses on the alleged requirement that we need to know a belief source is reliable in order for us to acquire knowledge by that source. Let us call this requirement “the KR principle”.
What are the prospects for a monistic view of biological individuality given the multiple epistemic roles the concept must satisfy? In this paper, I examine the epistemic adequacy of two recent accounts based on the capacity to undergo natural selection, one by Ellen Clarke and the other by Peter Godfrey-Smith. Clarke's position reflects a strong monism, in that she aims to characterize individuality in purely functional terms and refrains from privileging any specific material properties as important in their own right. I argue that Clarke's functionalism impairs the epistemic adequacy of her account compared to a middle-ground position taken by Godfrey-Smith. In comparing Clarke's and Godfrey-Smith's accounts, two pathways to pluralism about biological individuality emerge. The first develops from the contrast between functionalist and materialist approaches, and the second from an underlying temporal structure involved in using evolutionary processes to define individuality.
The central contention of this book is that second-order logic has a central role to play in laying the foundations of mathematics. In order to develop the argument fully, the author presents a detailed description of higher-order logic, including a comprehensive discussion of its semantics. He goes on to demonstrate the prevalence of second-order concepts in mathematics and the extent to which mathematical ideas can be formulated in higher-order logic. He also shows how first-order languages are often insufficient to codify many concepts in contemporary mathematics, and thus that both first- and higher-order logics are needed to fully reflect current work. Throughout, the emphasis is on discussing the associated philosophical and historical issues and the implications they have for foundational studies. For the most part, the author assumes little more than a familiarity with logic comparable to that provided in a beginning graduate course which includes the incompleteness of arithmetic and the Löwenheim-Skolem theorems. All those concerned with the foundations of mathematics will find this a thought-provoking discussion of some of the central issues in the field today.
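The classic illustration of the expressive gain at stake (standard textbook material, not a quotation from the book) is the second-order induction axiom:
$$\forall X\,\bigl[\,X(0) \wedge \forall n\,(X(n) \rightarrow X(s(n))) \;\rightarrow\; \forall n\,X(n)\,\bigr]$$
Quantifying over all subsets $X$ of the domain, rather than over the countably many formulas of the first-order induction schema, is what lets second-order Peano arithmetic characterize the natural numbers up to isomorphism; by the Löwenheim-Skolem theorems, no first-order axiomatization can achieve this categoricity.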
In the early twentieth century an apparently obscure philosophical debate took place between F. H. Bradley and Bertrand Russell. The historical outcome was momentous: the demise of the movement known as British Idealism, and its eventual replacement by the various forms of analytic philosophy. Since then, a conception of this debate and its rights and wrongs has become entrenched in English-language philosophy. Stewart Candlish examines afresh the events of this formative period in twentieth-century thought and comes to some surprising conclusions.
The principle of mass additivity states that the mass of a composite object is the sum of the masses of its elementary components. Mass additivity is true in Newtonian mechanics but false in special relativity. Physicists have explained why mass additivity is true in Newtonian mechanics by reducing it to Newton’s microphysical laws. This reductive explanation does not fit well with deducibility theories of reductive explanation such as the modern Nagelian theory of reduction, and the a priori entailment theory of reduction that is prominent in the philosophy of mind. Nonetheless, I argue that a reconstruction of the explanation that incorporates distinctively philosophical concepts in fact fits both theories. I discuss the implications of this result for both theories and for the reductive explanation of consciousness.
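For concreteness (a standard special-relativistic fact, supplied for illustration rather than taken from the paper): the invariant mass of a composite system with component energies $E_i$ and momenta $\mathbf{p}_i$ is
$$M \;=\; \frac{1}{c^2}\sqrt{\Bigl(\sum_i E_i\Bigr)^{2} - c^{2}\,\Bigl\lVert \sum_i \mathbf{p}_i \Bigr\rVert^{2}},$$
which reduces to $\sum_i m_i$ only when the components are mutually at rest and non-interacting. Internal kinetic or binding energy makes $M$ differ from the sum of the component masses, which is why additivity fails in special relativity yet holds in the Newtonian limit.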
Criticism of big data has focused on showing that more is not necessarily better, in the sense that data may lose their value when taken out of context and aggregated together. The next step is to incorporate an awareness of pitfalls for aggregation into the design of data infrastructure and institutions. A common strategy minimizes aggregation errors by increasing the precision of our conventions for identifying and classifying data. As a counterpoint, we argue that there are pragmatic trade-offs between precision and ambiguity that are key to designing effective solutions for generating big data about biodiversity. We focus on the importance of theory-dependence as a source of ambiguity in taxonomic nomenclature and hence a persistent challenge for implementing a single, long-term solution to storing and accessing meaningful sets of biological specimens. We argue that ambiguity does have a positive role to play in scientific progress as a tool for efficiently symbolizing multiple aspects of taxa and mediating between conflicting hypotheses about their nature. Pursuing a deeper understanding of the trade-offs and synthesis of precision and ambiguity as virtues of scientific language and communication systems then offers a productive next step for realizing sound, big biodiversity data services.
Contemporary biology has inherited two key assumptions from the Modern Synthesis about the nature of population lineages: sexual reproduction is the exemplar for how individuals in population lineages inherit traits from their parents, and random mating is the exemplar for reproductive interaction. While these assumptions have been extremely fruitful for a number of fields, such as population genetics and phylogenetics, they are increasingly unviable for studying the full diversity and evolution of life. I introduce the “mixture” account of population lineages that escapes these assumptions by dissolving the Modern Synthesis’s sharp line separating reproduction and development and characterizing reproductive integration in population lineages by the ephemerality of isolated subgroups rather than random mating. The mixture account provides a single criterion for reproductive integration that accommodates both sexual and asexual reproduction, unifying their treatment under Kevin de Queiroz’s generalized lineage concept of species. The account also provides a new basis for empirically assessing the effect of random mating as an idealization on the empirical adequacy of population genetic models.
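The random-mating idealization in question is the one underwriting the textbook Hardy-Weinberg result (standard population-genetics background, assumed here rather than drawn from the paper): with allele frequencies $p + q = 1$, a single round of random mating yields genotype frequencies
$$AA : Aa : aa \;=\; p^2 : 2pq : q^2,$$
and these proportions persist thereafter. Persistent isolated subgroups of the kind the mixture account tracks show up empirically as departures from these proportions, for example a deficit of heterozygotes, which is one way the idealization's adequacy can be assessed against data.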