This remarkable collection explores the legacy of Wittgenstein's work in contemporary American philosophy. The contributors (including several celebrated philosophers) take a variety of approaches to Wittgenstein; they discuss such topics as rule-following, realism about mathematics, the method of the Tractatus, the relation between style and content in Wittgenstein's writing, and his distinction between sense and nonsense. Wittgenstein is also discussed in relation to subsequent philosophers such as Quine and Kripke.
Modern ethical perspectives toward the environment often emphasize the connection of humans to a broader biotic community. The full intimacy of this connectedness, however, is only now being revealed as scientific findings in developmental biology and genetics provide new insights into the importance of environmental interaction for the development of organisms. These insights are reshaping our understanding of how organism-environment interaction contributes to both consistency and variation in organism development, and leading to a new perspective whereby an “organism” is not solely viewed as the adaptive product of evolutionary selection to an external environment over generations, but as continuously being constructed through systems of interactions that link an organism’s characteristics developmentally to the physical and social influences it experiences during life. This newfound emphasis on “interaction” leads to an interdependency whereby any change to an “environment” impacts the interacting “organism,” and an alteration to the “organism” eventually affects its “environment.” The causal reciprocity embedded within this organism-environment interdependency holds implications for our moral obligations to environments, given their compulsory role in shaping all organisms including ourselves.
In the preceding article, Buchner and Wippich used a guessing-corrected multinomial process-dissociation analysis to test whether a gender bias in fame judgments reported by Banaji and Greenwald was unconscious. In their two experiments, Buchner and Wippich found no evidence for unconscious mediation of this gender bias. Their conclusion can be questioned on three grounds: (a) the gender difference in familiarity of previously seen names that Buchner and Wippich modeled was different from the gender difference in criterion for fame judgments reported by Banaji and Greenwald; (b) the assumptions of Buchner and Wippich's multinomial model excluded processes that are plausibly involved in the fame judgment task; and (c) the constructs of Buchner and Wippich's model that corresponded most closely to Banaji and Greenwald's gender-bias interpretation were formulated so as to preclude the possibility of modeling that interpretation. Perhaps a more complex multinomial model can capture the Banaji and Greenwald interpretation.
This article improves two existing theorems of interest to neologicist philosophers of mathematics. The first is a classification theorem due to Fine for equivalence relations between concepts definable in a well-behaved second-order logic. The improved theorem states that if an equivalence relation E is defined without nonlogical vocabulary, then the bicardinal slice of any equivalence class—those equinumerous elements of the equivalence class with equinumerous complements—can have one of only three profiles. The improvements to Fine’s theorem allow for an analysis of the well-behaved models had by an abstraction principle, and this in turn leads to an improvement of Walsh and Ebels-Duggan’s relative categoricity theorem.
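To pin down the key notion, the bicardinal slice can be written out as follows; the notation is ours, chosen for illustration, and may differ from the article's own.

% Bicardinal slice of the equivalence class [F]_E: those members of [F]_E
% equinumerous with F whose complements are equinumerous with F's complement.
\[
  \mathrm{Slice}_E(F) \;=\; \bigl\{\, G \in [F]_E \;:\; G \approx F \ \wedge\ \overline{G} \approx \overline{F} \,\bigr\}
\]

The improved classification theorem summarized above then says that when E is defined without nonlogical vocabulary, each such slice exhibits one of only three profiles.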
Neo-Fregeans have been troubled by the Nuisance Principle (NP), an abstraction principle that is consistent but not jointly satisfiable with the favored abstraction principle HP (Hume's Principle). We show that logically this situation persists if one looks at joint consistency rather than joint satisfiability: under a modest assumption about infinite concepts, NP is also jointly inconsistent with HP.
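For reference, here are the two principles as standardly stated in the neo-Fregean literature; this is a sketch, the paper's own formulations may differ, and \triangle below denotes the symmetric difference of concepts.

% Hume's Principle: numbers are identical iff the concepts are equinumerous.
\[
  \#F = \#G \;\leftrightarrow\; F \approx G \tag{HP}
\]
% The Nuisance Principle: nuisances are identical iff the concepts
% differ on at most finitely many objects.
\[
  \nu(F) = \nu(G) \;\leftrightarrow\; F \mathbin{\triangle} G \ \text{is finite} \tag{NP}
\]

On the usual reading, HP is satisfiable only on (Dedekind-)infinite domains while NP is satisfiable only on finite ones, which is why the two are jointly unsatisfiable; the result summarized above strengthens this to joint inconsistency under the stated assumption about infinite concepts.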
The injective version of Cantor’s theorem appears in full second-order logic as the inconsistency of the abstraction principle, Frege’s Basic Law V (BLV), an inconsistency easily shown using Russell’s paradox. This incompatibility is akin to others—most notably that of a (Dedekind) infinite universe with the Nuisance Principle (NP) discussed by neo-Fregean philosophers of mathematics. This paper uses the Burali–Forti paradox to demonstrate this incompatibility, and another closely related one, without appeal to principles related to the axiom of choice—a result hitherto unestablished. It discusses the general interest of this result, its interest to neo-Fregean philosophy of mathematics, and the potential significance of the Burali–Fortian method of proof.
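For orientation, the Russell-paradox route to BLV's inconsistency mentioned above runs as follows; this is the standard sketch, not the paper's Burali–Forti argument.

% Basic Law V: extensions are identical iff the concepts are coextensive.
\[
  \varepsilon(F) = \varepsilon(G) \;\leftrightarrow\; \forall x\,(Fx \leftrightarrow Gx)
\]
% Define the Russell concept R and its extension r:
\[
  Rx \;:\leftrightarrow\; \exists F\,\bigl(x = \varepsilon(F) \wedge \neg Fx\bigr), \qquad r := \varepsilon(R)
\]

If Rr holds, then some F has r = \varepsilon(F) and \neg Fr; since \varepsilon(F) = \varepsilon(R), BLV makes F and R coextensive, so \neg Rr. If \neg Rr holds, then R itself witnesses the existential clause, so Rr. Either way we have a contradiction, refuting BLV.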
Benacerraf’s 1965 multiple-reductions argument depends on what I call ‘deferential logicism’: his necessary condition for number-set identity is most plausible against a background Quineanism that allows autonomy of the natural number concept. Steinhart’s ‘folkist’ sufficient condition on number-set identity, by contrast, puts that autonomy at the center — but fails for not taking the folk perspective seriously enough. Learning from both sides, we explore new conditions on number-set identity, elaborating a suggestion from Wright.
Objects appear to fall into different sorts, each with their own criteria for identity. This raises the question of whether sorts overlap. Abstractionists about numbers—those who think natural numbers are objects characterized by abstraction principles—face an acute version of this problem. Many abstraction principles appear to characterize the natural numbers. If each abstraction principle determines its own sort, then there is no single subject-matter of arithmetic—there are too many numbers. That is, unless objects can belong to more than one sort. But if there are multi-sorted objects, there should be cross-sortal identity principles for identifying objects across sorts. The going cross-sortal identity principle, ECIA2, solves the problem of too many numbers. But, I argue, it does so at a high cost. I therefore propose a novel cross-sortal identity principle, based on embeddings of the induced models of abstracts developed by Walsh. The new criterion matches ECIA2’s success, but offers interestingly different answers to the more controversial identifications made by ECIA2.
A widespread perception is that carnivores are limited by the amount of prey that can be captured rather than by the nutritional quality of that prey, and thus have no need to regulate macronutrient balance. Contrary to this view, recent laboratory studies show macronutrient-specific food selection by both invertebrate and vertebrate predators, and in some cases associated performance benefits as well. The question thus arises of whether wild predators might likewise feed selectively according to the macronutrient content of prey. Here we review laboratory studies demonstrating the regulation of macronutrient intake by invertebrate and vertebrate predators, and address the question of whether this is likely to also occur in the wild. We conclude that it is highly likely that wild predators select prey or selectively feed on body parts according to their macronutrient composition, a possibility that could have significant implications for ecological and foraging theory, as well as applied wildlife conservation and management.
We present an annotated bibliography of peer-reviewed scientific research highlighting the human health, animal welfare, and environmental risks associated with genetic modification. Risks associated with the expression of the transgenic material include concerns over resistance and non-target effects of crops expressing Bt toxins, consequences of herbicide use associated with genetically modified herbicide-tolerant plants, and transfer of gene expression from genetically modified crops through vertical and horizontal gene transfer. These risks are not connected to the technique of genetic modification as such, but would be present for any conventionally produced crops with the same heritable traits. In contrast, other risks are a direct consequence of the method used in gene manipulation. These come about because of the unstable nature of the transgene and the vectors used to insert it, and because of unpredictable interactions between the transgene and the host genome. The debate over the release of genetically modified organisms is not merely a scientific one; it encompasses economics, law, ethics, and policy. Any discussion on these levels does, however, need to be informed by sound science. We hope that the scientific references provided here will provide a useful starting point for further debate.
Arthur C. Danto is the Johnsonian Professor Emeritus of Philosophy at Columbia University and the most influential philosopher of art of the last half-century. As an art critic for the Nation and frequent contributor to other widely read outlets such as the New York Review of Books, Danto has also become one of the most respected public intellectuals of his generation. He is the author of some two dozen important books, along with hundreds of articles and reviews that have been at the center of both controversy and discussion. In this volume Danto offers his intellectual autobiography and responds to essays by 27 of the keenest critics of his thought from the worlds of philosophy and the arts.
This paper outlines a first attempt to model the special constraints that arise in language processing in conversation, and to explore the implications such functional considerations may have for language typology and language change. In particular, we focus on processing pressures imposed by conversational turn-taking and their consequences for the cultural evolution of the structural properties of language. We present an agent-based model of cultural evolution where agents take turns at talk in conversation. When the start of planning for the next turn is constrained by the position of the verb, the stable distribution of dominant word orders across languages evolves to match the actual distribution reasonably well. We suggest that the interface of cognition and interaction should be a more central part of the story of language evolution.
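To make the modeling setup concrete, here is a minimal agent-based sketch in Python of the kind of dynamic described above: iterated adoption of word orders under a verb-position-dependent turn-planning pressure. Every name and parameter here (VERB_POSITION, weight, beta, mutation) is our illustrative assumption; the authors' model involves richer turn-taking constraints and is not reproduced.

import math
import random
from collections import Counter

# Verb position (0 = clause-initial) in the six basic word orders.
VERB_POSITION = {"VSO": 0, "VOS": 0, "SVO": 1, "OVS": 1, "SOV": 2, "OSV": 2}

def weight(order, beta=0.5):
    """Turn-taking pressure: the later the verb, the later a listener can
    begin planning the next turn, so later-verb orders get lower weight.
    The exponential penalty is an illustrative stand-in."""
    return math.exp(-beta * VERB_POSITION[order])

def simulate(n_agents=200, n_generations=300, mutation=0.02, seed=1):
    rng = random.Random(seed)
    population = [rng.choice(list(VERB_POSITION)) for _ in range(n_agents)]
    for _ in range(n_generations):
        weights = [weight(o) for o in population]
        next_gen = []
        for _ in range(n_agents):
            if rng.random() < mutation:
                # Occasional innovation keeps all orders in play.
                next_gen.append(rng.choice(list(VERB_POSITION)))
            else:
                # Learners preferentially adopt orders whose turns are
                # easier to respond to.
                next_gen.append(rng.choices(population, weights=weights)[0])
        population = next_gen
    return Counter(population)

if __name__ == "__main__":
    print(simulate())

Under this single pressure the sketch simply drifts toward verb-early orders; the paper's point is that with realistic turn-taking constraints the evolved distribution instead approximates the actual typological distribution, which a one-factor toy like this is not expected to reproduce.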
A crisis continues to brew within the pharmaceutical research and development (R&D) enterprise: productivity continues declining as costs rise, despite ongoing, often dramatic scientific and technical advances. To reverse this trend, we offer various suggestions for both the expansion and broader adoption of modeling and simulation (M&S) methods. We suggest strategies and scenarios intended to enable new M&S use cases that directly engage R&D knowledge generation and build actionable mechanistic insight, thereby opening the door to enhanced productivity. What M&S requirements must be satisfied to access and open the door, and begin reversing the productivity decline? Can current methods and tools fulfill the requirements, or are new methods necessary? We draw on the relevant, recent literature to provide and explore answers. In so doing, we identify essential, key roles for agent-based and other methods. We assemble a list of requirements necessary for M&S to meet the diverse needs distilled from a collection of research, review, and opinion articles. We argue that to realize its full potential, M&S should be actualized within a larger information technology framework—a dynamic knowledge repository—wherein models of various types execute, evolve, and increase in accuracy over time. We offer some details of the issues that must be addressed for such a repository to accrue the capabilities needed to reverse the productivity decline.
Sean Dyde (Unit for the History and Philosophy of Science, University of Sydney), “Gathering the fragments of an enigma,” Metascience. DOI: 10.1007/s11016-010-9493-1.
It is not uncommon for advertisers to present required product disclaimers quickly at the end of advertisements. We show that fast disclaimers greatly reduce consumer comprehension of product risks and benefits, creating implications for social responsibility. In addition, across two studies, we found that disclaimer speed and brand familiarity interact to predict brand trust and purchase intention, and that brand trust mediated the interactive effect of brand familiarity and disclaimer speed on purchase intention. Our results indicate that fast disclaimers actually reduce brand trust and purchase intention for unfamiliar brands, suggesting that there are both economic and social responsibility reasons to use less rapid disclaimers for unfamiliar brands. Conversely, disclaimer speed had no negative effects on brand trust and purchase intention for highly familiar brands, presenting ethical tensions between economic interests (e.g., an efficient use of advertisement time) and social responsibility. We discuss the implications of our framework for advertising ethics, for corporate social performance, and for corporate social responsibility.
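For readers who want the statistical structure spelled out, the moderated mediation reported above can be set up in the standard regression form; this is our generic formulation, not necessarily the authors' exact specification.

% First-stage moderated mediation: disclaimer speed affects brand trust,
% moderated by brand familiarity; trust then carries the effect to intention.
\begin{align*}
  \text{Trust} &= \beta_0 + \beta_1\,\text{Speed} + \beta_2\,\text{Fam} + \beta_3\,(\text{Speed} \times \text{Fam}) + \epsilon_1 \\
  \text{Intention} &= \gamma_0 + \gamma_1\,\text{Trust} + \gamma_2\,\text{Speed} + \gamma_3\,\text{Fam} + \gamma_4\,(\text{Speed} \times \text{Fam}) + \epsilon_2
\end{align*}

The conditional indirect effect of speed on intention through trust at familiarity level W is (\beta_1 + \beta_3 W)\,\gamma_1; the findings above correspond to this quantity being negative for unfamiliar brands and near zero for highly familiar ones.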
Canada, as one of the three Allied nations collaborating on atomic energy development during the Second World War, had an early start in applying its new knowledge and defining a new profession. Owing to postwar secrecy and distinct national aims for the field, nuclear engineering was shaped uniquely by the Canadian context. Alone among the postwar powers, Canadian exploration of atomic energy eschewed military applications; the occupation emerged within a governmental monopoly; the intellectual content of the discipline was influenced by its early practitioners, administrators, scarce resources, and university niches; and a self-recognized profession coalesced later than did its American and British counterparts. This paper argues that the history of the emergence of Canadian nuclear engineers exemplifies unusually strong shaping of technical expertise by political and cultural context.
Critical care is in an emerging crisis of conflict between what individuals expect and the economic burden society and government are prepared to provide. The goal of critical care support is to prevent suffering and premature death by intensive therapy of reversible illnesses within a reasonable timeframe. Recently, it has become apparent that early support in an intensive care environment can improve patient outcomes. However, life support technology has advanced, allowing physicians to prolong life (and postpone death) in circumstances that were not possible in the recent past. This has been recognized not only by the medical community, but also by society at large. One corollary may be that expectations for recovery from critical illness have also become extremely high. In addition, greater numbers of patients are dying in intensive care units after having received prolonged life-sustaining therapy. Herein lies the emerging crisis – critical care therapy must be available in a timely fashion for those who require it urgently, yet its provision is largely dependent on a finite availability of both capital and human resources. Physicians are often placed in a troubling conflict of interest by pressures to use health resources prudently while also promoting equitable and timely access to critical care therapy. In this commentary, these issues are broadly discussed from the perspective of the individual clinician as well as that of society as a whole. The intent is to generate dialogue on the dynamic between individual clinicians navigating the complexities of how and when to use critical care support in the context of end-of-life issues, the increasing demands placed on finite critical care capacity, and the reasonable expectations of society.
I begin by examining a recent debate between John McDowell and Christopher Peacocke over whether the content of perceptual experience is non-conceptual. Although I am sympathetic to Peacocke’s claim that perceptual content is non-conceptual, I suggest a number of ways in which his arguments fail to make that case. This failure stems from an over-emphasis on the "fine-grainedness" of perceptual content - a feature that is relatively unimportant to its non-conceptual structure. I go on to describe two other features of perceptual experience that are more likely to be relevant to the claim that perceptual content is non-conceptual. These features are 1) the dependence of a perceived object on the perceptual context in which it is perceived and 2) the dependence of a perceived property on the object it is perceived to be a property of.
There you are at the opera house. The soprano has just hit her high note – a glass-shattering high C that fills the hall – and she holds it. She holds it. She holds it. She holds it. She holds it. She holds the note for such a long time that after a while a funny thing happens: you no longer seem only to hear it, the note as it is currently sounding, that glass-shattering high C that is loud and high and pure. In addition, you also seem to hear something more. It is difficult to express precisely what this extra feature is. One is tempted to say, however, that the note now sounds like it has been going on for a very long time. Perhaps it even sounds like a note that has been going on for too long. In any event, what you hear no longer seems to be limited to the pitch, timbre, loudness, and other strictly audible qualities of the note. You seem in addition to experience, even to hear, something about its temporal extent.
Pest control operations and experimentation on sentient animals such as the brushtail possum can cause unnecessary and avoidable suffering in the animal subjects. Minimizing animal suffering is an animal welfare goal and can be used as a guide in the design and execution of animal experimentation and pest control operations. The public has little sympathy for the possum, which can cause widespread environmental damage, but does believe that control should be as painless as possible. Trapping and poisoning provide only short-term solutions to the possum problem and often involve methods that cause suffering. Intrusive experiments connected with these methods of control and published in the last 6 years are reviewed. Many of the experiments do not attain the welfare standards required by members of the public.
Humanitarian organisations often work alongside those responsible for serious wrongdoing. In these circumstances, accusations of moral complicity are sometimes levelled at decision makers. These accusations can carry a strong if unfocused moral charge and are frequently the source of significant moral unease. In this paper, we explore the meaning and usefulness of complicity and its relation to moral accountability. We also examine the impact of concerns about complicity on the motivation of humanitarian staff and the risk that complicity may lead to a retreat into moral narcissism. Moral narcissism is the possibility that where humanitarian actors inadvertently become implicated in wrongdoing, they may focus more on their image as self-consciously good actors than on the interests of potential beneficiaries. Moral narcissism can be triggered where accusations of complicity are made and can skew decision making. We look at three interventions by Médecins Sans Frontières that gave rise to questions of complicity, and we question the concept's decision-guiding usefulness. Drawing on recent thought, we suggest that complicity can helpfully draw attention to the presence of moral conflict and to the way International Non-Governmental Organisations (INGOs) can be drawn into unintentional wrongdoing. We acknowledge the moral challenge that complicity presents to humanitarian staff but argue that complicity does not help INGOs make tough decisions in morally compromising situations as to whether they should continue with an intervention or pull out.