Robustness has long been recognized as an important parameter for evaluating game-theoretic results, but talk of ‘robustness’ generally remains vague. What we offer here is a graphic measure for a particular kind of robustness (‘matrix robustness’), using a three-dimensional display of the universe of 2 × 2 game theory. In such a measure, specific games appear as specific volumes (Prisoner’s Dilemma, Stag Hunt, etc.), allowing a graphic image of the extent of particular game-theoretic effects in terms of those games. The measure also allows for an easy comparison between different effects in terms of matrix robustness. Here we use the measure to compare the robustness of Tit for Tat’s well-known success in spatialized games (Axelrod, R. (1984). The evolution of cooperation. New York: Basic Books; Grim, P. et al. (1998). The philosophical computer: Exploratory essays in philosophical computer modeling. Cambridge, Mass.: MIT Press) with the robustness of a recent game-theoretic model of the contact hypothesis regarding prejudice reduction (Grim et al. 2005. Public Affairs Quarterly, 19, 95–125).
We apply spatialized game theory and multi-agent computational modeling as philosophical tools: (1) for assessing the primary social psychological hypothesis regarding prejudice reduction, and (2) for pursuing a deeper understanding of the basic mechanisms of prejudice reduction.
Our aims are to set forth a multiprinciple system for selecting among clinical trials competing for limited space in an immunotherapy production facility that supplies products under investigation by scientific investigators; defend this system by appealing to justice principles; and illustrate our proposal by showing how it might be implemented. Our overarching aim is to assist manufacturers of immunotherapeutic products and other potentially breakthrough experimental therapies with the ethical task of prioritizing requests from scientific investigators when production capacity is limited.
Algebraic/topological descriptions of living processes are indispensable to the understanding of both biological and cognitive functions. This paper presents a fundamental algebraic description of living/cognitive processes and exposes its inherent ambiguity. Since ambiguity is forbidden to computation, no computational description can lend insight into inherently ambiguous processes. The impredicativity of these models is not a flaw but is, rather, their strength: it enables us to reason with ambiguous mathematical representations of ambiguous natural processes. The noncomputability of these structures means that computerized simulacra of them are uninformative of their key properties. This leads to the question of how we should reason about them. That question is answered in this paper by presenting an example of such reasoning: the demonstration of a topological strategy for understanding how the fundamental structure can form itself from within itself.
‘The problem with simulations is that they are doomed to succeed.’ So runs a common criticism of simulations—that they can be used to ‘prove’ anything and are thus of little or no scientific value. While this particular objection represents a minority view, especially among those who work with simulations in a scientific context, it raises a difficult question: what standards should we use to differentiate a simulation that fails from one that succeeds? In this paper we build on a structural analysis of simulation developed in previous work to provide an evaluative account of the variety of ways in which simulations do fail. We expand the structural analysis in terms of the relationship between a simulation and its real-world target, emphasizing the important role both of aspects intended to correspond to reality and of those specifically intended not to correspond. The result is an outline both of the ways in which simulations can fail and of the scientific importance of those various forms of failure.
In response to growing concern in the 1980s about the quality of public education across the United States, a tremendous amount of energy was expended by organizations such as the Holmes Group and the Carnegie Forum to organize professional development schools or “partner schools” for teacher education. On the surface, the concept of partnering is simple; in practice, however, it is costly, complex, and difficult. In _Schooling, Democracy, and the Quest for Wisdom_, Robert V. Bullough, Jr. and John R. Rosenberg examine the concept of partnering through various lenses, addressing what they take to be the major issues that need to be, but rarely are, discussed by the thousands of educators in the U.S. who are involved and invested in university–public school partnerships. Ultimately, they assert that the conversation around partnering needs re-centering, refreshing, and re-theorizing.
This book provides an introduction to postphenomenology, an emerging school of thought in the philosophy of technology and science and technology studies, which addresses the relationships users develop with the devices they use.
In the 1978 volume of Process Studies, Nancy Frankenberry published an article called “The Empirical Dimension of Religious Experience” that I thought was so good that I wrote her a short fan letter about it.1 She responded by saying that she was flattered by my praise because I was a model for her younger generation. For the first time in my life I felt old. And I wasn’t yet forty. But here I am, still fully employed, presenting a long fan letter at her retirement. Nice irony! I want to begin laying out some of the themes and accomplishments of her distinguished career as a model philosopher of religion by discussing her early book Religion and Radical Empiricism.2 The context in which she wrote...
Is simulation some new kind of science? We argue that simulation instead fits smoothly into existing scientific practice, but does so in several importantly different ways. Simulations in general, and computer simulations in particular, ought to be understood as techniques which, like many scientific techniques, can be employed in the service of various and diverse epistemic goals. We focus our attention on the ways in which simulations can function as (i) explanatory and (ii) predictive tools. We argue that a wide variety of simulations, both computational and physical, are best conceived in terms of a set of common features: initial or input conditions, a mechanism or set of rules, and a set of results or output conditions. Studying simulations in these terms yields a new understanding of their character as well as a body of normative recommendations for the care and feeding of scientific simulations.
During the mid-nineteenth century the annual tuberculosis mortality in the penitentiaries at Auburn, N.Y., Boston, and Philadelphia exceeded 10 percent of the inmate population. At the beginning of the sanatorium era, 80 percent of prison deaths were attributed to TB. As the mountain air was “commonly known” to be healthful, the first prison sanatorium was opened in the mountains near Dannemora, N.Y., in 1904. It served to isolate contagious prison inmates until the advent of effective chemotherapy for the disease in the 1950s. Early antibiotic therapy for TB was such a great success that the public health aspects of TB in prisons remained dormant for the next 40 years. In 1991, a correctional officer from Auburn Correctional Facility in Auburn, New York, died as a result of multidrug-resistant TB. He had been posted to care for hospitalized patients, from whom he acquired his disease. This death, and the transmission of TB infection to health care workers in the same hospital, brought the nature and extent of modern inmate medical care into finer focus.
A central question in philosophical and sociological accounts of technology is how the agency of technologies should be conceived, that is, how to understand their constitutive roles in the actions performed by assemblages of humans and artifacts. To address this question, I build on the suggestion that a helpful perspective can be gained by amalgamating “actor-network theory” and “postphenomenological” accounts. The idea is that only a combined account can confront both the nuances of human experiential relationships with technology in which postphenomenology specializes, and also the chains of interactions between numerous technologies and humans that actor-network theory can address. To perform this amalgamation, however, several technical adjustments to these theories are required. The central change I develop here is to the postphenomenological notion of “multistability,” i.e., the claim that a technology can be used for multiple purposes through different contexts. I expand the postphenomenological framework through the development of a method called “variational cross-examination,” which involves critically contrasting the various stabilities of a multistable technology for the purpose of exploring how a particular stability has come to dominate. As a guiding example, I explore the case of the everyday public bench. The agency of this “mundane artifact,” as actor-network theorist Bruno Latour would call it, cannot be accounted for by either postphenomenology or actor-network theory alone.
The emerging school of thought called “postphenomenology” offers a distinct understanding of the ways that people experience technology usage. This perspective combines insights from the philosophical tradition of phenomenology with commitments to the anti-essentialism and nonfoundationalism of American pragmatism. One of postphenomenology’s central positions is that technologies always remain “multistable,” i.e., subject to different uses and meanings. But I suggest that as this perspective matures, philosophical problems are emerging around the notion of multistability, what I call “the problem of invariance” and “the problem of grounding.” These problems point out things that remain unclear within the postphenomenological framework, such as how it handles structural claims regarding a technology’s various stabilities, and how it grounds its claims. How can postphenomenology make structural claims about technology and yet remain anti-essentializing? And on what epistemological basis does it ground its claims about human-technology relations? The paper concludes with a series of prescriptions that, if followed, enable postphenomenology to make edifying claims about technology, all while avoiding the problems of invariance and grounding, and maintaining its commitments to anti-essentialism and nonfoundationalism.
A discussion is emerging within the contemporary philosophy of technology over issues of discrimination through design. My suggestion is that a productive way to approach this topic is through a combination of insights from the postphenomenological and critical constructivist perspectives. In particular, I recommend that we build on the postphenomenological notion of “multistability” and conceive of instances of discrimination through design as a kind of discriminatory “stability,” one possible instantiation of a device that could be usefully contrasted with others. Through the adoption of ideas from critical constructivism and postphenomenology, it is possible to draw out some of the features of discriminatory stabilities, including how systems of bias can go unnoticed, especially by those not targeted by them. These ideas could be of use in the identification of ways that unjust systematic biases become set within dominant culture, designed into technologies, sedimented within individual bodily-perceptual habits, and even constructed into prevailing senses of reason. As a practical contribution to this ongoing discussion, I identify a distinction that can be made between two broad categories of discrimination via technology: 1. that occurring along what could be called “an axis of difference,” and 2. that occurring along “an axis of usage.” In the former, discriminatory effects occur as different users are advantaged and disadvantaged by a device, even as they use it for similar purposes. In the latter, discriminatory effects occur as the particular usage of a technology preferred by a vulnerable group is shut down through design choices. Although the various emerging discussions on technology and discrimination each tend to gravitate toward analysis along one of these axes, it will of course be important to keep our eyes on the variety of ways that biases are faced by the vulnerable.
Providing the most thorough coverage available in one volume, this comprehensive, broadly based collection offers a wide variety of selections in four major genres, and also includes a section on film. Each of the five sections contains a detailed critical introduction to each form, brief biographies of the authors, and a clear, concise editorial apparatus. Updated and revised throughout, the new Fourth Edition adds essays by Margaret Mead, Russell Baker, Joan Didion, Annie Dillard, and Alice Walker; fiction by Nathaniel Hawthorne, Ursula K. Le Guin, Anton Chekhov, James Joyce, Katherine Mansfield, F. Scott Fitzgerald, William Faulkner, Alice Walker, Louise Erdrich, Donald Barthelme, and James McPherson; poems by John Donne, Robert Browning, Walt Whitman, Edwin Arlington Robinson, e.e. cummings, Langston Hughes, W.H. Auden, Philip Levine, and Louise Glück; and plays by August Wilson, Marsha Norman, Wendy Wasserstein, and Václav Havel. The chapter devoted to film examines the relation of film to literature and gives the complete screenplay for Citizen Kane plus close analysis of a scene from the film. With its innovative structure, comprehensive coverage, and insightful and stimulating presentation of all kinds of literature, this is an anthology readers will turn to again and again.
Book Symposium on Don Ihde’s Expanding Hermeneutics: Visualism in Science. Philosophy & Technology, pp. 1–22. DOI: 10.1007/s13347-011-0060-5. Authors: Jan Kyrre Berg Olsen Friis (University of Copenhagen, Denmark); Larry A. Hickman (The Center for Dewey Studies, Southern Illinois University Carbondale, USA); Robert Rosenberger (School of Public Policy, Georgia Institute of Technology, USA); Robert C. Scharff (University of New Hampshire, USA); Don Ihde (Stony Brook University, USA). Online ISSN 2210-5441; Print ISSN 2210-5433.
How should we understand postphenomenological methodology? Postphenomenology is a research perspective which builds on phenomenological and pragmatist philosophy to explore human–technology relations, but one with open methodological questions. Here, I offer some thoughts on the epistemological processes that should be at work in this research. In particular, I am concerned with postphenomenological research on technological “multistability,” i.e., a device’s ever-present capacity to be used for a variety of purposes, and to always be meaningful in multiple ways. I develop a methodology called “variational cross-examination,” which entails the critical contrast of a device’s various stabilities. As a set of instructive examples, I draw on my own line of research on the politics of public spaces, and especially the critique of anti-homeless design.
Contemporary scientific research and public policy are not in agreement over what should be done to address the dangers that result from the drop in driving performance that occurs as a driver talks on a cellular phone. One response to this threat to traffic safety has been the banning, in a number of countries and some states in the USA, of handheld cell phone use while driving. However, research shows that the use of hands-free phones (such as headsets and dashboard-mounted speakers) is also accompanied by a drop in performance, leading some to recommend regulation of both kinds of mobile phones. In what follows, I draw out the accounts of the driving impairment associated with phone use implicit in research and policy and develop an alternative account grounded in philosophical considerations. Building on work in a school of thought called postphenomenology, I review and expand concepts useful for articulating human bodily and perceptual relations to technology. By applying these ideas to the case of driving while talking on the phone, I offer an account of the drop in driving performance which focuses on the embodied relationships users develop with the car and the phone, and I consider implications for research and policy.
The experience of computer use can be productively articulated with concepts developed in the phenomenological tradition of philosophy. Building on the insights of classical phenomenologists, Ihde has advanced a sophisticated view of the ways humans relate to technology. I review and expand on his notions of “technological mediation,” “embodiment,” and “multistability,” and apply them to the experience of the computer interface. In particular, I explore the experience of using a computer that fails to work properly. A revealing example is the experience of a user who suddenly and unexpectedly encounters a slowly loading webpage while using the Internet. This phenomenological framework provides an account of the ways a suddenly failing technology changes our relationships to the device, to the world, and to ourselves, and it also suggests how this experience can be usefully reconceptualized.
Biology is seen not merely as a privileged oppressor of women but as a co-victim of masculinist social assumptions. We see feminist critique as one of the normative controls that any scientist must perform whenever analyzing data, and we seek to demonstrate what has happened when this control has not been utilized. Narratives of fertilization and sex determination traditionally have been modeled on the cultural patterns of male/female interaction, leading to gender associations being placed on cells and their components. We also find that when gender biases are controlled, new perceptions of these intracellular and extracellular relationships emerge.
Thinkers from a variety of fields analyze the roles of imaging technologies in science and consider their implications for many issues, from our conception of selfhood to the authority of science. In what follows, I encourage scholars to develop an applied philosophy of imaging, that is, to collect these analyses of scientific imaging and to reflect on how they can be made useful for ongoing scientific work. As an example of this effort, I review concepts developed in Don Ihde’s phenomenology of technology and refigure them for use in the analysis of scientific practice. These concepts are useful for drawing out the details of the interpretive frameworks scientists bring to laboratory images. Next, I apply these ideas to a contemporary debate in neurobiology over the interpretation of images of neurons which have been frozen at the moment of transmitter release. This reveals directions for further thought for the study of neurotransmission.
Within this submission the authors share their experiences as a blended research team with Aboriginal community and mainstream academic researchers. The team has collaborated since 2004 on several externally funded research projects. Initially, the team engaged in research through mainstream methodologies. In the process, the community co-researchers and participants were silenced through mainstream cultural practices that were unfamiliar and meaningless in Wikwemikong culture. More recently, the team has employed a community-conceived de-colonizing methodology, developed from within Wikwemikong Unceded Indian Reserve. The authors highlight their initial cultural missteps, followed by the more recently utilized culturally relevant approaches. It is proposed that what might be ethically sound research with mainstream participants and among mainstream researchers can silence and subvert practices among those from marginalized groups/cultures. Provisional suggestions are offered for researchers interested in co-researching in Aboriginal communities.