In this paper, I develop an argument for the thesis that ‘maximality is extrinsic’, on which a whole physical object is not a whole of its kind in virtue of its intrinsic properties. Theodore Sider has a number of arguments that depend on his own simple argument that maximality is extrinsic. However, Peter van Inwagen has an argument in defence of his Duplication Principle that, I will argue, can be extended to show that Sider's simple argument fails. Yet van Inwagen's argument fails against a more complex, sophisticated argument that maximality is extrinsic. I use van Inwagen's own commitments to various forms of causation and metaphysical possibility to argue that maximality is indeed extrinsic, although not for the mundane reasons that Sider suggests. I then argue that moral properties are extrinsic properties. Two physically identical things can have different moral properties in a physical world. This argument is a counterexample to a classical ethical supervenience idea (often attributed to G.E. Moore) that if there is identity of physical properties in a physical world, then there is identity in moral properties as well. I argue that moral value is ‘border sensitive’ and extrinsic for Kantians, utilitarians, and Aristotelians.
David Lewis insists that restrictivist composition must be motivated by and occur due to some intuitive desiderata for a relation R among parts that compose wholes, and that a restrictivist’s relation R must be vague. Peter van Inwagen agrees. In this paper, I argue that restrictivists need not use such examples of relation R as a criterion for composition, and that any restrictivist should reject a number of related mereological theses. This paper critiques Lewis and van Inwagen (and others) on their respective mereological metaphysics, and offers a Golden Mean between their two opposite extremes. I argue for a novel account of mereology I call Modal Mereology that is an alternative to Classical Mereology. A modal mereologist can be a universalist about the possible composition of wholes from parts and a restrictivist about the actual composition of wholes from parts. I argue that puzzles facing Modal Mereology (e.g., puzzles concerning Cambridge changes and the Problem of the Many, and how to demarcate the actual from the possible) are also faced in similar forms by classical universalists. On my account, restricted composition is rather motivated by and occurs due to a possible whole’s instantiating an actual type. Universalists commonly believe in such types and defend their existence from objections and puzzles. The Modal Mereological restrictivist can similarly defend the existence of such types (adding that such types are the only wholes) from similar objections and puzzles.
Many philosophers believe that a moral theory, given all the relevant facts, should be able to determine what is morally right and wrong. It is commonly argued that Aristotle’s ethical theory suffers from a fatal flaw: it places responsibility for determining right and wrong with the virtuous agent who has phronesis rather than with the theory itself. It is also commonly argued that Immanuel Kant’s ethical theory does provide a concept of right that is capable of determining right and wrong in specific cases. I argue, however, that Kant never gives a determinate moral theory of right. Rather, I argue that Kant’s moral theory is similar in many ways to that of Aristotle, in that it still holds that a moral agent with phronesis, rather than the theory, determines what is right. Kant’s practical philosophy was not so much meant to tell us right and wrong as to prevent bad moral theory from corrupting our moral common sense, and it is our moral common sense that determines right and wrong naturally.
Aristotle’s best human life is attained through theoretical contemplation, and Confucius’ is attained through practical cultivation of the social self. However, I argue that in the best human life for both Confucius and Aristotle, a form of theoretical contemplation must occur and can only occur with an ethical commitment to community life. Confucius, like Aristotle, sees that the best contemplation comes after later-life, greater-learning and is central to ethical and community life. Aristotle, like Confucius, sees the best contemplation as presupposing full ethical commitment to community life. So, I argue for the theses that: on Aristotle’s view, the best human contemplation requires one be fully morally good; on Confucius’ view, to be fully morally good requires the best human contemplation; being fully morally good for both requires commitment to the good of others and the community.
In this book, Sean Noah Walsh applies Herbert Marcuse’s observations on counterrevolution to recent developments in education politics. Seemingly disparate issues such as the exercise of state power to reorganize curricula, the derision of intellectuals, the permeation of consumerism into the collegiate experience, and the expansion of online teaching belong to the same strategy, in which the faculties of dissent are neutralized before they can develop and dissent is established as the paramount political obscenity.
Model theory is an important area of mathematical logic which has deep philosophical roots, many philosophical applications, and great philosophical interest in itself. The aim of this book is to introduce, organise, survey, and develop these connections between philosophy and model theory, for the benefit of philosophers and logicians alike.
This paper sets out a predicative response to the Russell-Myhill paradox of propositions within the framework of Church’s intensional logic. A predicative response places restrictions on the full comprehension schema, which asserts that every formula determines a higher-order entity. In addition to motivating the restriction on the comprehension schema from intuitions about the stability of reference, this paper contains a consistency proof for the predicative response to the Russell-Myhill paradox. The models used to establish this consistency also model other axioms of Church’s intensional logic that have been criticized by Parsons and Klement: this, it turns out, is due to resources which also permit an interpretation of a fragment of Gallin’s intensional logic. Finally, the relation between the predicative response to the Russell-Myhill paradox of propositions and the Russell paradox of sets is discussed, and it is shown that the predicative conception of set induced by this predicative intensional logic allows one to respond to the Wehmeier problem of many non-extensions.
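For orientation, the comprehension schema at issue takes roughly the following standard higher-order form (this is the textbook formulation, not necessarily the exact statement used within Church's intensional logic):

\[ \exists X\, \forall y\, \big( Xy \leftrightarrow \varphi(y) \big), \quad \text{where } X \text{ does not occur free in } \varphi. \]

The full (impredicative) schema allows φ to contain bound higher-order variables, so that the entity X may be defined by quantifying over a domain to which it itself belongs; a predicative restriction of the sort sketched above disallows such bound higher-order variables, so that X is specified only in terms of entities whose reference is already fixed.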
A crucial part of the contemporary interest in logicism in the philosophy of mathematics resides in its idea that arithmetical knowledge may be based on logical knowledge. Here an implementation of this idea is considered that holds that knowledge of arithmetical principles may be based on two things: (i) knowledge of logical principles and (ii) knowledge that the arithmetical principles are representable in the logical principles. The notions of representation considered here are related to theory-based and structure-based notions of representation from contemporary mathematical logic. It is argued that the theory-based versions of such logicism are either too liberal (the plethora problem) or are committed to intuitively incorrect closure conditions (the consistency problem). Structure-based versions must, on the other hand, respond to a charge of begging the question (the circularity problem) or explain how one may have a knowledge of structure in advance of a knowledge of axioms (the signature problem). This discussion is significant because it gives us a better idea of what a notion of representation must look like if it is to aid in realizing some of the traditional epistemic aims of logicism in the philosophy of mathematics.
This article surveys recent literature by Parsons, McGee, Shapiro and others on the significance of categoricity arguments in the philosophy of mathematics. After discussing whether categoricity arguments are sufficient to secure reference to mathematical structures up to isomorphism, we assess what exactly is achieved by recent ‘internal’ renditions of the famous categoricity arguments for arithmetic and set theory.
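As background for these categoricity arguments, the standard definitions run as follows (notation here is generic rather than drawn from any one of the surveyed papers): a theory T is categorical when

\[ \mathcal{M} \models T \ \text{ and } \ \mathcal{N} \models T \ \implies\ \mathcal{M} \cong \mathcal{N}, \]

that is, when any two of its models are isomorphic. Dedekind's classical theorem is that second-order Peano arithmetic is categorical in this sense under the full semantics, whereas no first-order theory with an infinite model can be, by the Löwenheim-Skolem theorems; the 'internal' renditions discussed in the survey aim, roughly, to recover something of this result without presupposing an external set-theoretic standpoint.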
This paper presents a systematic study of the prehistory of the traditional subsystems of second-order arithmetic that feature prominently in the reverse mathematics program of Friedman and Simpson. We look in particular at: (i) the long arc from Poincaré to Feferman as concerns arithmetic definability and provability, (ii) the interplay between finitism and the formalization of analysis in the lecture notes and publications of Hilbert and Bernays, (iii) the uncertainty as to the constructive status of principles equivalent to Weak König's Lemma, and (iv) the large-scale intellectual backdrop to arithmetical transfinite recursion in descriptive set theory and its effectivization by Borel, Lusin, Addison, and others.
Frege's Grundgesetze was one of the 19th century forerunners to contemporary set theory which was plagued by the Russell paradox. In recent years, it has been shown that subsystems of the Grundgesetze formed by restricting the comprehension schema are consistent. One aim of this paper is to ascertain how much set theory can be developed within these consistent fragments of the Grundgesetze, and our main theorem shows that there is a model of a fragment of the Grundgesetze which defines a model of all the axioms of Zermelo-Fraenkel set theory with the exception of the power set axiom. The proof of this result appeals to Gödel's constructible universe of sets, which Gödel famously used to show the relative consistency of the continuum hypothesis. More specifically, our proofs appeal to Kripke and Platek's idea of the projectum within the constructible universe as well as to a weak version of uniformization (which does not involve knowledge of Jensen's fine structure theory). The axioms of the Grundgesetze are examples of abstraction principles, and the other primary aim of this paper is to articulate a sufficient condition for the consistency of abstraction principles with limited amounts of comprehension. As an application, we resolve an analogue of the joint consistency problem in the predicative setting.
This paper presents new constructions of models of Hume's Principle and Basic Law V with restricted amounts of comprehension. The techniques used in these constructions are drawn from hyperarithmetic theory and the model theory of fields, and formalizing these techniques within various subsystems of second-order Peano arithmetic allows one to put upper and lower bounds on the interpretability strength of these theories and hence to compare these theories to the canonical subsystems of second-order arithmetic. The main results of this paper are: (i) there is a consistent extension of the hyperarithmetic fragment of Basic Law V which interprets the hyperarithmetic fragment of second-order Peano arithmetic, and (ii) the hyperarithmetic fragment of Hume's Principle does not interpret the hyperarithmetic fragment of second-order Peano arithmetic, so that in this specific sense there is no predicative version of Frege's Theorem.
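For reference, the two abstraction principles studied here have the following standard forms, where #F denotes the number of the F's, ∂F the extension of F, and F ≈ G abbreviates the second-order claim that there is a bijection between the F's and the G's (the hyperarithmetic fragments then restrict the comprehension schema that accompanies these principles):

\[ \text{(HP)} \quad \#F = \#G \leftrightarrow F \approx G \]
\[ \text{(BLV)} \quad \partial F = \partial G \leftrightarrow \forall x\, (Fx \leftrightarrow Gx) \]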
Many recent writers in the philosophy of mathematics have put great weight on the relative categoricity of the traditional axiomatizations of our foundational theories of arithmetic and set theory. Another great enterprise in contemporary philosophy of mathematics has been Wright's and Hale's project of founding mathematics on abstraction principles. In earlier work, it was noted that one traditional abstraction principle, namely Hume's Principle, had a certain relative categoricity property, which here we term natural relative categoricity. In this paper, we show that most other abstraction principles are not naturally relatively categorical, so that there is in fact a large amount of incompatibility between these two recent trends in contemporary philosophy of mathematics. To better understand the precise demands of relative categoricity in the context of abstraction principles, we compare and contrast these constraints to stability-like acceptability criteria on abstraction principles, the Tarski-Sher logicality requirements on abstraction principles studied by Antonelli and Fine, and supervaluational ideas coming out of Hodes' work.
This paper addresses how states improve their responsiveness to violence against women in developing countries with little political will and few resources to do so. One key to engendering justice and improving responsiveness is building specialized institutions within the state that facilitate the implementation of laws addressing violence against women. Why and how do states engage in institution-building to protect marginalized populations in these contexts? I propose that developing countries are more likely to create and maintain specialized institutions when domestic and international political and legal frameworks make the state more vulnerable to women’s demands, and when civil society coordinates with the state and/or international organizations to take advantage of this political opportunity. This coordination brings necessary pressure and resources that would be difficult, if not impossible, to deliver otherwise. This inter-institutional coordination is necessary for building and maintaining new state institutions and programs that help to monitor the implementation of laws, develop public policies, provide services for victims, and improve responsiveness of the justice system. This fills an important lacuna in the literature, which focuses on women’s state institutions as an important catalyst for responsiveness to violence against women, but does not explain how these institutions are initially constructed.
The Denjoy integral is an integral that extends the Lebesgue integral and can integrate any derivative. In this paper, it is shown that the graph of the indefinite Denjoy integral f ↦ ∫_a^x f is a coanalytic non-Borel relation on the product space M[a,b]×C[a,b], where M[a,b] is the Polish space of real-valued measurable functions on [a,b] and where C[a,b] is the Polish space of real-valued continuous functions on [a,b]. Using the same methods, it is also shown that the class of indefinite Denjoy integrals, called ACG∗[a,b], is a coanalytic but not Borel subclass of the space C[a,b], thus answering a question posed by Dougherty and Kechris. Some basic model theory of the associated spaces of integrable functions is also studied. Here the main result is that, when viewed as an ℝ[X]-module with the indeterminate X being interpreted as the indefinite integral, the space of continuous functions on the interval [a,b] is elementarily equivalent to the Lebesgue-integrable and Denjoy-integrable functions on this interval, that each is stable but not superstable, and that they all have a common decidable theory when viewed as ℚ[X]-modules.
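The pointclass terminology here can be unpacked with the standard definitions (not specific to this paper): a subset of a Polish space is analytic (Σ¹₁) if it is a continuous image of a Borel set, and coanalytic (Π¹₁) if its complement is analytic, and by Suslin's theorem

\[ \text{Borel} = \mathbf{\Delta}^1_1 = \mathbf{\Sigma}^1_1 \cap \mathbf{\Pi}^1_1 , \]

so a coanalytic set fails to be Borel exactly when it is not also analytic.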
A semantics for quantified modal logic is presented that is based on Kleene's notion of realizability. This semantics generalizes Flagg's 1985 construction of a model of a modal version of Church's Thesis and first-order arithmetic. While the bulk of the paper is devoted to developing the details of the semantics, to illustrate the scope of this approach, we show that the construction produces (i) a model of a modal version of Church's Thesis and a variant of a modal set theory due to Goodman and Scedrov, (ii) a model of a modal version of Troelstra's generalized continuity principle together with a fragment of second-order arithmetic, and (iii) a model based on Scott's graph model (for the untyped lambda calculus) which witnesses the failure of the stability of non-identity.
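To give a flavour of the underlying notion, the core clauses of Kleene's original 1945 number realizability, which the present semantics generalizes to the quantified modal setting, run as follows, with {e}(n) the e-th partial computable function applied to n and (e)_0, (e)_1 the components of e under a computable pairing:

\[ e \Vdash A \wedge B \iff (e)_0 \Vdash A \ \text{ and } \ (e)_1 \Vdash B \]
\[ e \Vdash A \rightarrow B \iff \text{for every } a \Vdash A,\ \{e\}(a) \text{ is defined and } \{e\}(a) \Vdash B \]
\[ e \Vdash \forall x\, A(x) \iff \text{for every } n,\ \{e\}(n) \text{ is defined and } \{e\}(n) \Vdash A(\bar{n}) \]
\[ e \Vdash \exists x\, A(x) \iff (e)_1 \Vdash A\big(\overline{(e)_0}\big) \]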
Frege's theorem says that second-order Peano arithmetic is interpretable in Hume's Principle and full impredicative comprehension. Hume's Principle is one example of an abstraction principle, while another paradigmatic example is Basic Law V from Frege's Grundgesetze. In this paper we study the strength of abstraction principles in the presence of predicative restrictions on the comprehension schema, and in particular we study a predicative Fregean theory which contains all the abstraction principles whose underlying equivalence relations can be proven to be equivalence relations in a weak background second-order logic. We show that this predicative Fregean theory interprets second-order Peano arithmetic.
Advocacy and scholarship addressing sex trafficking as a human rights issue have become a transnational effort, but there has been less attention to sub-national efficacy. Through analyzing progressive justice system responses to domestic violence in Duluth, Minnesota, that have been adopted worldwide, this paper demonstrates how to effectively apply these local advances in order to address sex trafficking on a global scale. This paper makes a theoretical contribution to understanding the intersections between domestic abuse and sex trafficking. A key empirical finding is that a coordinated community response is crucial for advancing domestic abuse training, monitoring, and legislation—and this coordination can also be productively utilized for improving responsiveness to victims of sex trafficking across a diverse range of socio-legal and economic contexts.
The topic of this paper is our knowledge of the natural numbers, and in particular, our knowledge of the basic axioms for the natural numbers, namely the Peano axioms. The thesis defended in this paper is that knowledge of these axioms may be gained by recourse to judgements of probability. While considerations of probability have come to the forefront in recent epistemology, it seems safe to say that the thesis defended here is heterodox from the vantage point of traditional philosophy of mathematics. So this paper focuses on providing a preliminary defense of this thesis by responding to several objections. Some of these objections are from the classical literature, such as Frege's concern about indiscernibility and circularity, while others are more recent, such as Baker's concern about the unreliability of small samplings in the setting of arithmetic. Another family of objections suggests that we simply do not have access to probability assignments in the setting of arithmetic, either due to issues related to the ω-rule or to the non-computability and non-continuity of probability assignments. Articulating these objections and the responses to them involves developing some non-trivial results on probability assignments, such as a forcing argument to establish the existence of continuous probability assignments that may be computably approximated. In the concluding section, two problems for future work are discussed: developing the source of arithmetical confirmation and responding to the probabilistic liar.
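For readers unfamiliar with it, the ω-rule mentioned here is the infinitary rule of inference that licenses the passage from all numerical instances of an arithmetical formula to its universal generalization:

\[ \frac{\varphi(\overline{0}),\ \varphi(\overline{1}),\ \varphi(\overline{2}),\ \ldots}{\forall x\, \varphi(x)} \]

Since applying it requires surveying infinitely many premises, it is a natural focal point for the worry that agents like us lack access to the relevant probability assignments.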
Two puzzles with regard to the Kritik der reinen Vernunft are incongruent counterparts and causality. In De mundi sensibilis atque intelligibilis forma et principiis, Kant indicates that the experience of things like left and right hands, so-called incongruent counterparts, involves certain pure intuitions, and hence constitutes one line of evidence for the claim that the concept of space itself is a pure intuition. In KrV, Kant again argues that the concept of space itself is a pure intuition, but does not cite the experience of incongruent counterparts as evidence for this claim. Since there is ostensibly nothing in KrV which tells against the existence of the experiences of incongruent counterparts, the natural question is: “Why, in KrV, does Kant not cite the experience of incongruent counterparts as evidence for the claim that the concept of space is a pure intuition?” The problem with causality is as follows. One of the most basic claims of KrV is that empirical experience is structured by non-empirical concepts, such as substantiality and causality. In a portion of KrV entitled the Transcendental Deduction, Kant gives section 20 the heading “All sensible intuitions stand under the categories, as conditions under which alone their manifold can come together in one consciousness”. Since categories are those concepts which structure empirical experience, section 20 has demonstrated that all sensible intuitions are subject to substantiality and causality. However, there is a later portion of KrV entitled the Second Analogy with the heading “Principle of temporal sequence according to the law of causality: All alterations occur in accordance with the law of the connection of cause and effect”. Thus, the natural question is: “What does the Second Analogy demonstrate about causality which the Transcendental Deduction did not already demonstrate about causality?”
It is commonly believed that impartial utilitarian moral theories make significant demands that we help the global poor, and that the partial virtue ethics of Mencius and Aristotle do not. The ethical partiality found in these virtue ethicists has been criticized, and some have suggested that the partialistic virtue ethics of Mencius and Aristotle are parochial (i.e., overly narrow in their scope of concern). I believe, however, that the ethics of Mencius and Aristotle are both more cosmopolitan than many presume and also very demanding. In this paper, I argue that the ethical requirements to help the poor and starving are very demanding for the quintessentially virtuous person in Mencius and Aristotle. The ethical demands to help even the global poor are demanding for Mencius’ jun-zi (君子 chün-tzu / junzi) and Aristotle’s megalopsuchos. I argue that both the jun-zi and megalopsuchos have a wide scope of concern for the suffering of poor people. I argue that the relevant virtues of the jun-zi and megalopsuchos are also achievable for many people. The moral views of Mencius and Aristotle come with strong demands for many of us to work harder to alleviate global poverty.
This book critically examines Leo Strauss's claim that the philosophers of antiquity, especially Plato, wrote esoterically, hiding the highest truths exclusively between the lines.
The aim of this article is to address the recently renewed debate pertaining to esotericism, secret messages encoded within writings from antiquity, especially in the writings of Plato. The question of esotericism has assumed a prominent role within debates concerning the history of political thought. Ever since Leo Strauss offered his suspicion that there were secrets ‘buried in the writings of the rhetoricians of antiquity’, the idea that philosophers deliberately concealed their true beliefs in a way that few could detect has been fiercely debated. More recently, the research of J.B. Kennedy has made international headlines for discovering a musical pattern embedded within Platonic writings, a pattern that Kennedy insists is evidence of Plato’s Pythagorean allegiance. The theses proffered by Strauss and Kennedy are empty doctrines of esotericism, or empty esotericisms. These doctrines insinuate the presence of secret or coded writing within Platonic dialogues but reveal no actual secret. These theses of esotericism falsely represent Plato as hyper-cryptic. Without actually providing substantive content, these notions of esotericism compel the reader to merely negate the exoteric writings of Plato, which in effect renders his already heterodox writings commonplace and orthodox.
This article improves two existing theorems of interest to neologicist philosophers of mathematics. The first is a classification theorem due to Fine for equivalence relations between concepts definable in a well-behaved second-order logic. The improved theorem states that if an equivalence relation E is defined without nonlogical vocabulary, then the bicardinal slice of any equivalence class—those equinumerous elements of the equivalence class with equinumerous complements—can have one of only three profiles. The improvements to Fine’s theorem allow for an analysis of the well-behaved models had by an abstraction principle, and this in turn leads to an improvement of Walsh and Ebels-Duggan’s relative categoricity theorem.
Objects appear to fall into different sorts, each with their own criteria for identity. This raises the question of whether sorts overlap. Abstractionists about numbers—those who think natural numbers are objects characterized by abstraction principles—face an acute version of this problem. Many abstraction principles appear to characterize the natural numbers. If each abstraction principle determines its own sort, then there is no single subject-matter of arithmetic—there are too many numbers. That is, unless objects can belong to more than one sort. But if there are multi-sorted objects, there should be cross-sortal identity principles for identifying objects across sorts. The going cross-sortal identity principle, ECIA2, solves the problem of too many numbers. But, I argue, it does so at a high cost. I therefore propose a novel cross-sortal identity principle, based on embeddings of the induced models of abstracts developed by Walsh. The new criterion matches ECIA2’s success, but offers interestingly different answers to the more controversial identifications made by ECIA2.
Emmanuel Levinas has been Professor of Philosophy at the Sorbonne and the director of the Ecole Normale Israelite Orientale. Through such works as "Totality and Infinity" and "Otherwise than Being", he has exerted a profound influence on twentieth-century continental philosophy, providing inspiration for Derrida, Lyotard, Blanchot and Irigaray. "The Levinas Reader" collects, often for the first time in English, essays by Levinas encompassing every aspect of his thought: the early phenomenological studies written under the guidance and inspiration of Husserl and Heidegger; the fully developed ethical critique of such totalizing philosophies; the pioneering texts on the moral dimension to aesthetics; the rich and subtle readings of the Talmud which are an exemplary model of an ethical, transcendental philosophy at work; the admirable meditations on current political issues. Sean Hand's introduction gives a complete overview of Levinas's work and situates each chapter within his general contribution to phenomenology, aesthetics, religion, politics and, above all, ethics. Each essay has been prefaced with a brief introduction presenting the basic issues and the necessary background, and suggesting ways to study the text further.
In this book, Sylvia Walsh focuses on the writings of this later period and locates the key to Kierkegaard's understanding of Christianity in the "inverse dialectic" that is involved in "living Christianly."
This is the story of a seductive idea and its sobering consequences. The twentieth century brought a new cultural confidence in the social powers of invention – but also saw the advance of consumerism, world wars, globalisation and human-generated climate change. Techno-Fixers traces how passive optimism and active manipulations were linked to our growing trust in technological innovation. It pursues the evolving idea through engineering hubris, radical utopian movements, science fiction fanzines, policy-maker soundbites, corporate marketing, and consumer culture. It explores how evangelists of technological fixes have proselytised their faith, and critically examines the examples and products of their followers. The new technological confidence mixed together beliefs that were simultaneously compelling and unsettling. As motor vehicles, electricity supplies and radio became part of modern life in the early decades of the century, it was hard to argue against the transformative effects and inevitability of such transitions. Like it or not, social consequences seemed to come inexorably with the Machine Age, the Space Age and the Information Age. This deterministic vision implied an ever more technological future with unavoidable social consequences. For many, innovative technologies promised appealing new lifestyles and powers. But for a narrower band of proponents – the first generation of technological fixers – wise engineering invention was touted as a guaranteed route to positive human benefits and societal progress. Socially-engaged engineers and designers argued that such improvements could be directed, hastened and amplified. These engineering adventurers maintained that modern societies could be guided only by rational designers. They contended that adroit technological solutions could solve contemporary problems better than any traditional method, including economic initiatives, citizen education, political ideology, lifestyle changes, legal frameworks and moral guidance. Their seductive claims were tamed by more mainstream American enthusiasts, and eventually boiled down to the concept of the technological fix. Their shared confidence infused policy-makers, broadcasters and science popularisers. Trust in technological fixes shaped a new generation of managers and law-makers, engineers and educators, futurists and citizens – and continues to drive a new generation of techno-fixers today. This cultural wave, with its promoters and detractors, has championed the promise and voiced the concerns that we have inherited, still unresolved. This book tracks the hubristic influencers and weighs up the confidences and concerns associated with them: the dramatic potential for novel technologies to work alongside longer human traditions to meet our enduring ambitions – or to reshape society for the worse.
We have a much better understanding of physics than we do of consciousness. I consider ways in which intrinsically mental aspects of fundamental ontology might induce modifications of the known laws of physics, or whether they could be relevant to accounting for consciousness if no such modifications exist. I suggest that our current knowledge of physics should make us skeptical of hypothetical modifications of the known rules, and that without such modifications it’s hard to imagine how intrinsically mental aspects could play a useful explanatory role. Draft version of a paper submitted to Journal of Consciousness Studies, special issue responding to Philip Goff’s Galileo’s Error: Foundations for a New Science of Consciousness.
Arendt Contra Sociology re-assesses the relationship between Hannah Arendt's work and the theoretical foundations of sociology, bringing her insights to bear on key themes within contemporary theoretical sociology. Departing from the view of Arendt as a political theorist who sought to rescue politics from society, and political theory from the social sciences, this book re-examines her distinctions between labour, fabrication and action as a theory of the fundamental ontology of human societies, revisiting her criticism of the tendency of many sociological paradigms to conflate the activity of fabrication with that of action.
_Living Poetically_ is the first book to focus primarily on Kierkegaard's existential aesthetics as opposed to traditional aesthetic features of his writings such as the use of pseudonyms, literary techniques and figures, and literary criticism. _Living Poetically_ traces the development of the concept of the poetic in Kierkegaard's writings as that concept is worked out in an ethical-religious perspective in contrast to the aesthetics of early German romanticism and Hegelian idealism. Sylvia Walsh seeks to elucidate what it means, in Kierkegaard's view, to be an authentic poet in the form of a poetic writer and to clarify his own role as a Christian poet and writer as he understood it. Walsh shows that, in spite of strong criticisms made of the poetic in some of his writings, Kierkegaard maintained a fundamentally positive understanding of the poetic as an essential ingredient in ethical and religious forms of life. Walsh thus reclaims Kierkegaard as a poetic thinker and writer from those who would interpret him as an ironic practitioner of an aestheticism devoid of and detached from the ethical-religious as well as from those who view him as rejecting the poetic and aesthetic on ethical or religious grounds. Viewing contemporary postmodern feminism and deconstruction as advocating a romantic mode of living poetically, Walsh concludes with a feminist reading of Kierkegaard that affirms both individuality and relatedness, commonalities and differences between the self and others, men and women, for the fashioning of an authentic mode of living poetically in the present age.
Denis Walsh has written a striking new defense in this journal of the statisticalist (i.e., noncausalist) position regarding the forces of evolution. I defend the causalist view against his new objections. I argue that the heart of the issue lies in the nature of nonadditive causation. Detailed consideration of that turns out to defuse Walsh’s ‘description‐dependence’ critique of causalism. Nevertheless, the critique does suggest a basis for reconciliation between the two competing views.
I discuss "Poetic Naturalism" -- there is only one world, the natural world, but there are many ways of talking about it -- both as a general concept, and how it accounts for our actual world. I talk about emergence, fundamental physics, entropy and complexity, the origins of life and consciousness, and moral constructivism.