We offer a sceptical examination of a thesis recently advanced in a monograph published by Princeton University Press, entitled Greek Buddha: Pyrrho's Encounter with Early Buddhism in Central Asia. In this dense and probing work, Christopher I. Beckwith, a professor of Central Eurasian studies at Indiana University, Bloomington, argues that Pyrrho of Elis adopted a form of early Buddhism during his years in Bactria and Gandhāra, and that early Pyrrhonism must be understood as a sect of early Buddhism. In making his case Beckwith claims that virtually all scholars of Greek, Indian, and Chinese philosophy have been operating under flawed assumptions and with flawed methodologies, and so have failed to notice obvious and undeniable correspondences between the philosophical views of the Buddha and of Pyrrho. In this study we take Beckwith's proposal and challenge seriously, and we examine his textual basis and techniques of translation, his methods of examining passages, his construal of problems, and his reconstruction of arguments. We find that his presuppositions are contentious and doubtful, that his own methods are deeply flawed, and that he draws unreasonable conclusions. Although the result of our study is almost entirely negative, we think it illustrates some important general points about the methodology of comparative philosophy.
This study explores some of the connections between the presentation of religious ideas and the use of concessive clauses and sentences in Pāli Buddhist literature. Special emphasis is placed on the linguistic construction kiñcāpi... atha kho.... Although this is widely understood to be a concessive and correlative construction and is often translated in ways that adequately reproduce the meaning of the Pāli, the kiñcāpi... atha kho... construction is nonetheless sometimes misrepresented. Surprisingly, such misrepresentations are especially prevalent in an ever-growing body of work related to one Pāli text in particular, the Tevijja Sutta. This has helped to obscure the extent to which the sutta is a response to developments in Brahmanical theology external to the text itself. This study examines this unwelcome situation and proposes a remedy.
Every day, new warnings emerge about artificial intelligence rebelling against us. All the while, a more immediate dilemma flies under the radar. Have forces been unleashed that are thrusting humanity down an ill-advised path, one that's increasingly making us behave like simple machines? In this wide-reaching, interdisciplinary book, Brett Frischmann and Evan Selinger examine what's happening to our lives as society embraces big data, predictive analytics, and smart environments. They explain how the goal of designing programmable worlds goes hand in hand with engineering predictable and programmable people. Detailing new frameworks, provocative case studies, and mind-blowing thought experiments, Frischmann and Selinger reveal hidden connections between fitness trackers, electronic contracts, social media platforms, robotic companions, fake news, autonomous cars, and more. This powerful analysis should be read by anyone interested in understanding exactly how technology threatens the future of our society, and what we can do now to build something better.
Jakob von Uexküll's theories of life -- Biography and historical background -- Nature's conformity with plan -- Umweltforschung -- Biosemiotics -- Concluding remarks -- Marking a path into the environments of animals -- The essential approach to the organism -- Heidegger and the biologists -- Paths to the world -- Disruptive behavior : Heidegger and the captivated animal -- The worldless stone -- The poor animal -- For example, three bees and a lark -- Animal morphology -- A shocking wealth -- A fine line in the rupture of time -- An affected body -- The theme of the animal melody : Merleau-Ponty and the umwelt -- The structure of behavior -- A pure wake, a quiet force -- A leaf of being -- Interanimality -- The-animal-stalks-at-five-o'clock : Deleuze's affection for Uexküll -- Problematic organisms -- Uexküll's ethology of affects -- The body without organs, the embryonic egg, and prebiotic soup -- Nature's refrain sung across milieus and territories -- The animal stalks.
In this essay I explore the need for transforming the Christian theological symbols of the Trinity, Incarnation, and Redemption, which arose in the context of neo-Platonic metaphysics, in light of late modern, especially Peircean, metaphysics and categories. I engage and attempt to complement the proposal by Andrew Robinson and Christopher Southgate (in this issue of Zygon) with insights from the Peircean-inspired philosophical theology of Robert Neville. I argue that their proposal can be strengthened by acknowledging the way in which theological symbols themselves have a transformative (pragmatic) effect as they are “taken” in context and “break” on the Infinite.
Biologists frequently draw on ideas and terminology from engineering. Evolutionary systems biology—with its circuits, switches, and signal processing—is no exception. In parallel with the frequent links drawn between biology and engineering, there is ongoing criticism of this cross-fertilization, on the grounds that over-simplistic metaphors from engineering are likely to mislead us because engineering is fundamentally different from biology. In this article, we clarify and reconfigure the link between biology and engineering, presenting it in a more favorable light. We do so by, first, arguing that critics operate with a narrow and incorrect notion of how engineering actually works, and of what the reliance on ideas from engineering entails. Second, we diagnose and defuse one significant source of concern about appeals to engineering, namely that they are inherently and problematically metaphorical. We suggest that there is plenty of fertile ground left for a continued, healthy relationship between engineering and biology.
Since the advent of the women's movement, women have made unprecedented gains in almost every field, from politics to the professions. Paradoxically, doctors and mental health professionals have also seen a staggering increase in the numbers of young women suffering from an epidemic of depression, eating disorders, and other physical and psychological problems. In The Cost of Competence, authors Brett Silverstein and Deborah Perlick argue that rather than simply labeling individual women as, say, anorexic or depressed, it is time to look harder at the widespread prejudices within our society and child-rearing practices that lead thousands of young women to equate thinness with competence and success, and femininity with failure. They argue that continuing to treat depression, anxiety, anorexia and bulimia as separate disorders in young women can, in many cases, be a misguided approach since they are really part of a single syndrome. Furthermore, their fascinating research into the lives of forty prominent women from Elizabeth I to Eleanor Roosevelt shows that these symptoms have been disrupting the lives of bright, ambitious women not for decades, but for centuries. Drawing on all the latest findings, rare historical research, cross-cultural comparisons, and their own study of over 2,000 contemporary women attending high schools and colleges, the authors present powerful new evidence to support the existence of a syndrome they call anxious somatic depression. Their investigation shows that the first symptoms usually surface in adolescence, most often in young women who aspire to excel academically and professionally. Many of the affected women grew up feeling that their parents valued sons over daughters. They identified intellectually with their successful fathers, not with their traditional homemaker mothers. Disordered eating is one way of rejecting the feminine bodies they perceive as barriers to achievement and recognition.
Silverstein and Perlick uncover medical descriptions matching their diagnosis in Hippocratic texts from the fourth century B.C., in anthropological studies of Africa, Asia, and Latin America, and in case studies of many noted psychologists and psychiatrists, including the "hysteric" patients Freud used to develop his theories on psychoanalysis. They have also discovered that statistics on disordered eating, depression, and a host of other symptoms soared in eras in which women's opportunities grew--particularly the 1920s, when record numbers of women entered college and the workforce, the boyish silhouette of the flapper became the feminine ideal, and anorexia became epidemic, and again from the 1970s to the present day. The authors show that identifying this devastating syndrome is a first step toward its prevention and cure. The Cost of Competence presents an urgent message to parents, educators, policymakers, and the medical community on the crucial importance of providing young women with equal opportunity, and equal respect.
'the whole work is remarkably fresh, vivid and attractively written ... psychologists will be grateful that a work of this kind has been done ... by one who has the scholarship, science, and philosophical training that are requisite for the task' - Mind This renowned three-volume collection records chronologically the steps by which psychology developed from the time of the early Greek thinkers and the first writings on the nature of the mind, through to the 1920s and such modern preoccupations as criminal and animal psychology. It is only in relatively recent times that psychology has been considered an empirical science independent of philosophy. Brett's account is thus concerned with the broadest definition of psychology, taking in such philosophical aspects as the relation of mind and body, thought processes, etc. For each period he gives an account of the state of the sciences which influenced psychology, the state of psychology itself, the influence psychology had on other areas, and the applications of psychological theories. Examining a huge range of figures, he describes their attitudes on fundamental questions and their contribution to the progress of the subject, as well as the history of the different methods of inquiry. The thinkers he discusses range from Aristotle, Democritus, Socrates, Plato, and Xenocrates to Proclus, the Arabian teachers, Magnus, Duns Scotus, and Ockham; from Galileo, Descartes, Gassendi, and Cudworth to Locke, Berkeley, Condillac, and Kant; from Reid, Stewart, Herbart, and Schopenhauer to Bain, Spencer, Mill, and Darwin. Surprisingly clear and easy to read, Brett's account succeeds in illuminating the nature of psychology as well as its history. It remains a classic overview of the subject from its broad roots in philosophy through to the independent empirical science of the modern era.
--a scarce work, rarely found as a complete set
--a classic work - all historians of psychology and philosophy should have A History of Psychology.
F. LeRon Shults explores Deleuze's fascination with theological themes and shows how his entire corpus can be understood as a creative atheist machine that liberates thinking, acting and feeling.
Hilary Greaves and David Wallace argue that conditionalization maximizes expected accuracy and so is a rational requirement, but their argument presupposes a particular picture of the bridge between rationality and accuracy: the Best-Plan-to-Follow picture. And theorists such as Miriam Schoenfield and Robert Steel argue that it's possible to motivate an alternative picture—the Best-Plan-to-Make picture—that does not vindicate conditionalization. I show that these theorists are mistaken: it turns out that, if an update procedure maximizes expected accuracy on the Best-Plan-to-Follow picture, it's guaranteed to maximize expected accuracy on the Best-Plan-to-Make picture as well, in which case moving from the former to the latter can't help us avoid the conclusion that conditionalization is a rational requirement. If there's a problem with Greaves and Wallace’s argument, it must lie elsewhere.
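The expected-accuracy claim at issue can be illustrated with a toy computation. This is my own sketch, not Greaves and Wallace's formal argument: the worlds, prior, partition, and the choice of the Brier score are all invented for illustration. It checks that, by the prior's own lights, conditionalizing on the evidence scores better in expectation than a "steadfast" rule that ignores it.

```python
# Toy illustration (not the authors' formalism): compare the prior-expected
# Brier accuracy of conditionalization with a steadfast rule.
worlds = ["w1", "w2", "w3"]
prior = {"w1": 0.5, "w2": 0.3, "w3": 0.2}
partition = [{"w1", "w2"}, {"w3"}]  # the agent learns which cell is actual

def brier_accuracy(credence, actual):
    # Negative Brier score: higher is more accurate.
    return -sum((credence[w] - (1.0 if w == actual else 0.0)) ** 2 for w in worlds)

def conditionalize(cell):
    total = sum(prior[w] for w in cell)
    return {w: (prior[w] / total if w in cell else 0.0) for w in worlds}

def steadfast(cell):
    # Ignore the evidence entirely.
    return dict(prior)

def expected_accuracy(rule):
    # Expectation taken with respect to the prior, summing over
    # each evidence cell and each world within it.
    total = 0.0
    for cell in partition:
        creds = rule(cell)
        for w in cell:
            total += prior[w] * brier_accuracy(creds, w)
    return total

print(expected_accuracy(conditionalize) > expected_accuracy(steadfast))  # True
```

With these (invented) numbers, conditionalization's expected accuracy is -0.375 against the steadfast rule's -0.62; the general result the abstract discusses is that this comparison favours conditionalization for any strictly proper scoring rule.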
This paper describes a pattern of explanation prevalent in the biological sciences that I call a ‘lineage explanation’. The aim of these explanations is to make plausible certain trajectories of change through phenotypic space. They do this by laying out a series of stages, where each stage shows how some mechanism worked, and the differences between adjacent stages demonstrate how one mechanism, through minor modifications, could be changed into another. These explanations are important, for though it is widely accepted that there is an ‘incremental constraint’ on evolutionary change, in an important class of cases it is difficult to see how to satisfy this constraint. I show that lineage explanations answer important questions about evolutionary change, but do so by demonstrating differences between individuals rather than invoking population processes, such as natural selection.
Introduction
Turning a ‘Scale’ into a ‘Plume’
Lineage Explanations in Biology
3.1 The evolution of eyes
3.2 The evolution of feathers
The Two Dimensions of a Lineage Explanation
4.1 The production dimension
4.2 The continuity dimension
4.3 The dual role of the parts
Constraining the Explanations
Operational and Generative Lineages
Explaining Change Without Populations
Conclusion
The thesis that agents should calibrate their beliefs in the face of higher-order evidence—i.e., should adjust their first-order beliefs in response to evidence suggesting that the reasoning underlying those beliefs is faulty—is sometimes thought to be in tension with Bayesian approaches to belief update: in order to obey Bayesian norms, it's claimed, agents must remain steadfast in the face of higher-order evidence. But I argue that this claim is incorrect. In particular, I motivate a minimal constraint on a reasonable treatment of the evolution of self-locating beliefs over time and show that calibrationism is compatible with any generalized Bayesian approach that respects this constraint. I then use this result to argue that remaining steadfast isn't the response to higher-order evidence that maximizes expected accuracy.
A Benacerraf–Field challenge is an argument intended to show that common realist theories of a given domain are untenable: such theories make it impossible to explain how we’ve arrived at the truth in that domain, and insofar as a theory makes our reliability in a domain inexplicable, we must either reject that theory or give up the relevant beliefs. But there’s no consensus about what would count here as a satisfactory explanation of our reliability. It’s sometimes suggested that giving such an explanation would involve showing that our beliefs meet some modal condition, but realists have claimed that this sort of modal interpretation of the challenge deprives it of any force: since the facts in question are metaphysically necessary and so obtain in all possible worlds, it’s trivially easy, even given realism, to show that our beliefs have the relevant modal features. Here I show that this claim is mistaken—what motivates a modal interpretation of the challenge in the first place also motivates an understanding of the relevant features in terms of epistemic possibilities rather than metaphysical possibilities, and there are indeed epistemically possible worlds where the facts in question don’t obtain.
Sensitivity has sometimes been thought to be a highly epistemologically significant property, serving as a proxy for a kind of responsiveness to the facts that ensures that the truth of our beliefs isn’t just a lucky coincidence. But it's an imperfect proxy: there are various well-known cases in which sensitivity-based anti-luck conditions return the wrong verdicts. And as a result of these failures, contemporary theorists often dismiss such conditions out of hand. I show here, though, that a sensitivity-based understanding of epistemic luck can be developed that respects what was attractive about sensitivity-based approaches in the first place but that's immune to these failures.
A cross-section of the writings of Dominique Lestel, Vinciane Despret and Roberto Marchesini is presented here in translation across three special issues on philosophical ethology. These thinkers, relatively unknown in anglophone scholarship, offer important contributions to contemporary debates in posthumanism and animal studies. Particularly in so far as they scrutinise our often awkward attempts to understand the behaviour of animals in labs and fields – to know what animal bodies can do – they share in the rethinking of interspecies forms of life, as domains of both empirical knowledge and zoo-political performance, and thereby take important steps towards a new philosophical ethology.
This review of Wimsatt’s book Re-engineering Philosophy for Limited Beings focuses on analysing his use of robustness, a central theme in the book. I outline a family of three distinct conceptions of robustness that appear in the book, and look at the different roles they play. I briefly examine what underwrites robustness, and suggest that further work is needed to clarify both the structure of robustness and the relation between its various conceptions.
"In 2003 the Getty Museum, which holds a collection of about 240 Weston prints, hosted a colloquium on the photographer. This volume in the In Focus series records remarks by the author, Brett Abbott, along with those of six other participants: William Clift, Amy Conger, David Featherstone, Weston Naef, David Travis, and Jennifer Watts. Context for their conversation is provided by the author's introduction, plate texts, and chronology. Approximately fifty of Weston's images demonstrate why his work continues to resonate with a contemporary public and serves as a model for a host of photographers active today."--BOOK JACKET.
Truth by convention, once thought to be the foundation of a uniquely promising approach to explaining our access to the truth in nonempirical domains, is nowadays widely considered an absurdity. Its fall from grace has been due largely to the influence of an argument that can be sketched as follows: our linguistic conventions have the power to make it the case that a sentence expresses a particular proposition, but they can’t by themselves generate truth; whether a given proposition is true—and so whether the sentence that expresses it is true—is a matter of what the world is like, which means it isn’t a matter of convention alone. The consensus is that this argument is decisive against truth by convention. Strikingly, though, it has rarely been formulated with much precision. Here I provide a new rendering of the argument, one that reveals its structure and makes transparent just what assumptions it requires, and then I assess conventionalists’ prospects for resisting each of those assumptions. I conclude that the consensus is mistaken: contrary to what is almost universally thought, there remains a promising way forward for the conventionalist project. Along the way, I clarify conventionalists’ commitments by thinking about what truth by convention would need to be like in order for conventionalism to do the epistemological work it’s intended to do.
When epistemologists talk about knowledge, the discussions traditionally include only a small class of other epistemic notions: belief, justification, probability, truth. In this paper, we propose that epistemologists should include an additional epistemic notion into the mix, namely the notion of assuming or taking for granted.
This article explores some of the ways in which the conceptual apparatus of A Thousand Plateaus, and especially its machinic metaphysics, can be connected to recent developments in computer modelling and social simulation, which provide new tools for thinking that are becoming increasingly popular among philosophers and social scientists. Conversely, the successful deployment of these tools provides warrant for the flat ontology articulated in A Thousand Plateaus and therefore contributes to the ‘reversal of Platonism’ for which Deleuze had called in his earlier works, such as Logic of Sense. The first major section offers a brief exposition of some key concepts in A Thousand Plateaus in order to set the stage for the second and third major sections, which argue that the fabrication of a metaphysics of immanence can be accelerated by connecting its conceptual apparatus more explicitly to insights derived from philosophical analyses of computational modelling and simulation and the social scientific use of ‘assemblage theory’. The article concludes with a summary of the argument and a brief consideration of some of the potential ethical and political implications of this interdisciplinary engagement.
The literature contains evidence from some studies of asymmetric patterns of choice cycles in the direction consistent with regret theory, and evidence from other studies of asymmetries in the opposite direction. This article reports an experiment showing that both patterns occur within the same sample of respondents operating in the same experimental environment. We discuss the implications for modelling behaviour in such environments.
I use some recent formal work on measuring causation to explore a suggestion by James Woodward: that the notion of causal specificity can clarify the distinction in biology between permissive and instructive causes. This distinction arises when a complex developmental process, such as the formation of an entire body part, can be triggered by a simple switch, such as the presence of a particular protein. In such cases, the protein is said to merely induce or "permit" the developmental process, whilst the causal "instructions" for guiding that process are already prefigured within the cells. I construct a novel model that expresses in a simple and tractable way the relevant causal structure of biological development and then use a measure of causal specificity to analyse the model. I show that the permissive-instructive distinction cannot be captured by simply contrasting the specificity of two causes as Woodward proposes, and instead introduce an alternative, hierarchical approach to analysing the interaction between two causes. The resulting analysis highlights the importance of focusing on gene regulation, rather than just the coding regions, when analysing the distinctive causal power of genes.
Emotion Review, Ahead of Print. A growing cadre of influential scholars has converged on a circumscribed definition of empathy as restricted only to feeling the same emotion that one perceives another is feeling. We argue that this restrictive isomorphic matching (RIM) definition is deeply problematic because it deviates dramatically from traditional conceptualizations of empathy and unmoors the construct from generations of scientific research and clinical practice; insistence on an isomorphic form undercuts much of the functional value of empathy from multiple perspectives of analysis; and combining the opposing concepts of isomorphic matching and self-other awareness implicitly requires motivational content, causing the RIM definition to implicitly require the kind of non-matching emotional content that it explicitly seeks to exclude.
Recent work by Brian Skyrms offers a very general way to think about how information flows and evolves in biological networks—from the way monkeys in a troop communicate to the way cells in a body coordinate their actions. A central feature of his account is a way to formally measure the quantity of information contained in the signals in these networks. In this article, we argue there is a tension between how Skyrms talks of signalling networks and his formal measure of information. Although Skyrms refers to both how information flows through networks and that signals carry information, we show that his formal measure only captures the latter. We then suggest that to capture the notion of flow in signalling networks, we need to treat them as causal networks. This provides the formal tools to define a measure that does capture flow, and we do so by drawing on recent work defining causal specificity. Finally, we suggest that this new measure is crucial if we wish to explain how evolution creates information. For signals to play a role in explaining their own origins and stability, they can’t just carry information about acts; they must be difference-makers for acts.
1 Signalling, Evolution, and Information
2 Skyrms’s Measure of Information
3 Carrying Information versus Information Flow
3.1 Example 1
3.2 Example 2
3.3 Example 3
4 Signalling Networks Are Causal Networks
4.1 Causal specificity
4.2 Formalizing causal specificity
5 Information Flow as Causal Control
5.1 Example 1
5.2 Examples 2 and 3
5.3 Average control implicitly ‘holds fixed’ other pathways
6 How Does Evolution Create Information?
7 Conclusion
Appendix
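A rough sense of the kind of measure at issue can be given with a toy computation. This is my own illustrative sketch, not the authors' measure or Skyrms's: one common way to formalize causal specificity is the mutual information between interventions on a cause (here, a signal) and its effect (here, an act). The channel probabilities and the distribution over interventions below are invented for the example.

```python
# Toy sketch of causal specificity as mutual information between
# interventions on a signal and the resulting acts.
import math

# P(act | do(signal)): an invented signal-to-act channel
channel = {
    "s1": {"a1": 0.9, "a2": 0.1},
    "s2": {"a1": 0.2, "a2": 0.8},
}
intervention = {"s1": 0.5, "s2": 0.5}  # uniform distribution over interventions

def mutual_information(channel, intervention):
    # I(do(S); A) in bits, computed from the interventional distributions.
    acts = {a for dist in channel.values() for a in dist}
    p_act = {a: sum(intervention[s] * channel[s].get(a, 0.0) for s in channel)
             for a in acts}
    mi = 0.0
    for s in channel:
        for a, p in channel[s].items():
            if p > 0:
                mi += intervention[s] * p * math.log2(p / p_act[a])
    return mi

print(round(mutual_information(channel, intervention), 3))
```

The more finely interventions on the signal discriminate between downstream acts, the larger this quantity; a signal that made no difference to the acts would score zero, which is one way of cashing out the abstract's point that explanatory signals must be difference-makers for acts.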
Why can I not appropriately utter ‘It must be raining’ while standing outside in the rain, even though every world consistent with my knowledge is one in which it is raining? The common response to this problem is to hold that epistemic must, in addition to quantifying over epistemic possibilities, carries some additional evidential information concerning the source of one's evidence. I argue that this is a mistake: epistemic modals are mere quantifiers over epistemic possibilities. My central claim is that the seeming anomaly of the data above arises from a mistaken conception of what a possibility is. Instead of conceiving of possibilities as possible worlds, I argue that we should conceive of possibilities as answers to open questions.
Like Laland et al., I think Mayr’s distinction is problematic, but I identify a further problem with it. I argue that Mayr’s distinction is a false dichotomy, and obscures an important question about evolutionary change. I show how this question, once revealed, sheds light on some debates in evo-devo that Laland et al.’s analysis cannot, and suggest that it provides a different view about how future integration between biological disciplines might proceed.
Comparing engineering to evolution typically involves adaptationist thinking, where well-designed artifacts are likened to well-adapted organisms, and the process of evolution is likened to the process of design. A quite different comparison is made when biologists focus on evolvability instead of adaptationism. Here, the idea is that complex integrated systems, whether evolved or engineered, share universal principles that affect the way they change over time. This shift from adaptationism to evolvability is a significant move, for, as I argue, we can make sense of these universal principles without making any adaptationist claims. Furthermore, evolvability highlights important aspects of engineering that are ignored in the adaptationist debates. I introduce some novel engineering examples that incorporate these key neglected aspects, and use these examples to challenge some commonly cited contrasts between engineering and evolution, and to highlight some novel resemblances that have gone unnoticed.
Which rules should guide our reasoning? Human reasoners often use reasoning shortcuts, called heuristics, which function well in some contexts but lack the universality of reasoning rules like deductive implication or inference to the best explanation. Does it follow that human reasoning is hopelessly irrational? I argue: no. Heuristic reasoning often represents human reasoners reaching a local rational maximum, reasoning more accurately than if they try to implement more “ideal” rules of reasoning. I argue this is a genuine rational achievement. Our ideal rational advisors would advise us to reason with heuristic rules, not more complicated ideal rules. I argue we do not need a radical new account of epistemic norms to make sense of the success of heuristic reasoning.
One recent topic of debate in Bayesian epistemology has been the question of whether imprecise credences can be rational. I argue that one account of imprecise credences, the orthodox treatment as defended by James M. Joyce, is untenable. Despite Joyce’s claims to the contrary, a puzzle introduced by Roger White shows that the orthodox account, when paired with Bas C. van Fraassen’s Reflection Principle, can lead to inconsistent beliefs. Proponents of imprecise credences, then, must either provide a compelling reason to reject Reflection or admit that the rational credences in White’s case are precise.
Understanding how cooperation evolves is central to explaining some core features of our biological world. Many important evolutionary events, such as the arrival of multicellularity or the origins of eusociality, are cooperative ventures between formerly solitary individuals. Explanations of the evolution of cooperation have primarily involved showing how cooperation can be maintained in the face of free-riding individuals whose success gradually undermines cooperation. In this paper I argue that there is a second, distinct, and less well explored, problem of cooperation that I call the generation of benefit. Focusing on how benefit is generated within a group poses a different problem: how is it that individuals in a group can (at least in principle) do better than those who remain solitary? I present several different ways that benefit may be generated, each with different implications for how cooperation might be initiated, how it might further evolve, and how it might interact with different ways of maintaining cooperation. I argue that in some cases of cooperation, the most important underlying “problem” of cooperation may be how to generate benefit, rather than how to reduce conflict or prevent free-riding.
Recent work by Brian Skyrms offers a very general way to think about how information flows and evolves in biological networks — from the way monkeys in a troop communicate, to the way cells in a body coordinate their actions. A central feature of his account is a way to formally measure the quantity of information contained in the signals in these networks. In this paper, we argue there is a tension between how Skyrms talks of signalling networks and his formal measure of information. Although Skyrms refers to both how information flows through networks and that signals carry information, we show that his formal measure only captures the latter. We then suggest that to capture the notion of flow in signalling networks, we need to treat them as causal networks. This provides the formal tools to define a measure that does capture flow, and we do so by drawing on recent work defining causal specificity. Finally, we suggest that this new measure is crucial if we wish to explain how evolution creates information. For signals to play a role in explaining their own origins and stability, they can’t just carry information about acts: they must be difference-makers for acts.
It is commonly held that the context with respect to which an indexical is interpreted is determined independently of the interpretation of the indexical. This view, which I call Context Realism, has explanatory significance: it is because the context is what it is that an indexical refers to what it does. In this paper, I provide an argument against Context Realism. I then develop an alternative that I call Context Constructivism, according to which indexicals are defined not in terms of features of utterance situations, but rather in terms of roles that objects could play.
Readers of reports on ethical failures by four-star general officers must wonder, “Don’t they have staffs to ensure that the general follows ethics rules?” The Department of Defense publishes robust ethics guidance in several documents; however, a staff’s best efforts to implement this guidance may fail to make an impression on a senior leader who is susceptible to the “Bathsheba syndrome,” an allusion to the biblical account where the prophet Nathan rebuked King David for his moral failings. This paper proposes a methodology to enable senior headquarters staffs to play the role of Nathan in supporting ethical behaviors by high-level officers. It examines the mechanisms that embed ethical behavior within members of those staffs in carrying out their three principal roles of advising, scheduling, and transporting the four-star officer. The authors offer a framework based on an ethical infrastructure of organizational climate that focuses the staff’s daily efforts to mitigate risk across seven ethical “danger areas” that threaten ethical failures by senior officers.
Recent work on the evolution of signaling systems provides a novel way of thinking about genetic information, where information is passed between genes in a regulatory network. I use examples from evolutionary developmental biology to show how information can be created in these networks and how it can be reused to produce rapid phenotypic change.
In 2003, the concept of precarity emerged as the central organizing platform for a series of social struggles that would spread across the space of Europe. Four years later, almost as suddenly as the precarity movement appeared, so it would enter into crisis. To understand precarity as a political concept it is necessary to go beyond economistic approaches that see social conditions as determined by the mode of production. Such a move requires us to see Fordism as exception and precarity as the norm. The political concept and practice of translation enables us to frame the precarity of creative labour in a broader historical and geographical perspective, shedding light on its contestation and relation to the concept of the common. Our interest is in the potential for novel forms of connection, subjectivization and political organization. Such processes of translation are themselves inherently precarious, transborder undertakings.
Hugh Everett III proposed that a quantum measurement can be treated as an interaction that correlates microscopic and macroscopic systems—particularly when the experimenter herself is included among those macroscopic systems. It has been difficult, however, to determine precisely what this proposal amounts to. Almost without exception, commentators have held that there are ambiguities in Everett’s theory of measurement that result from significant—even embarrassing—omissions. In the present paper, we resist the conclusion that Everett’s proposal is incomplete, and we develop a close reading that accounts for apparent oversights. We begin by taking a look at how Everett set up his project—his method and his criterion of success. Illuminating parallels are found between Everett’s method and then-contemporary thought regarding inter-theoretic reduction. Also, from unpublished papers and correspondence, we are able to piece together how Everett judged the success of his theory of measurement, which completes our account of his intended contribution to the resolution of the quantum measurement problem.