This article outlines a training activity that can enable both business and governmental professionals to translate the principles in a code of ethics into a specific list of company-related behaviors ranging from highly ethical to highly unethical. It also explores how this list can become a concrete model to follow in making ethical decisions. The article begins with a discussion of what will improve ethical decision making in business and government. This leads us to explore the factors that can most easily lead to improvement, namely a comprehensive code of conduct and employee training. From there we look at the Critical Incident Technique as a training strategy that has the potential for identifying the behaviors that distinguish truly outstanding conduct from merely going by the book, and that can be used to encourage more independent thinking and to set expectations for future decisions. If employees are given the skills and examples that will enable them to make better decisions, they can apply them to any situation.
During the evolution of business ethics as a profession, the fields it draws from have identified separate knowledge and skills they believe define business ethics; however, there is little agreement among these fields. This means the strengths of each are seldom combined to guide ethical decision making in business and industry, which leaves business ethicists looking less effective, and perhaps less professional, than their counterparts in medicine and law. It also means that those who have been thrust into the role of guiding business ethics – or those who have added the title of ethics consultant after their name without having the preparation to do so – have no standards to look to. This article examines one of the touchstones of mature professions: performance standards by which members of the profession can measure themselves and the profession can self-monitor. Further, it suggests that business ethics has not yet addressed one of the standards by which all professions are measured, that of service.
This article describes some potential uses of Bayesian estimation for time-series and panel data models by incorporating information from prior probabilities in addition to observed data. Drawing on econometrics and other literatures, we illustrate the use of informative “shrinkage” or “small variance” priors while extending prior work on the general cross-lagged panel model. Using a panel dataset of national income and subjective well-being (SWB), we describe three key benefits of these priors. First, they shrink parameter estimates toward zero or, for time-varying parameters, toward each other, which lends additional support for an income → SWB effect that is not supported under maximum likelihood (ML). This is useful because, second, these priors increase model parsimony and the stability of estimates, and thus improve out-of-sample predictions and interpretability, which means estimated effects should also be more trustworthy than under ML. Third, these priors allow the estimation of models that are otherwise under-identified under ML, permitting higher-order lagged effects and time-varying parameters that are impossible to estimate using observed data alone. In conclusion we note some of the responsibilities that come with the use of priors which, departing from typical commentaries on their scientific applications, we describe as involving reflection on how best to apply modeling tools to address matters of worldly concern.
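A minimal sketch of the kind of “small variance” shrinkage prior described above, written with PyMC on simulated data; the variable names, the prior scale (0.1), and the single-lag pooled structure are illustrative assumptions, not the authors' model.

```python
# Sketch: a "small variance" (shrinkage) prior on a cross-lagged effect.
# Assumed setup: standardized panel data, one lag, effect pooled across units.
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
T, N = 10, 50                               # waves, panel units
income = rng.normal(size=(T, N))            # simulated standardized income
# Next-wave SWB depends weakly on current income (true effect = 0.1).
swb_next = 0.1 * income[:-1] + rng.normal(scale=0.5, size=(T - 1, N))

with pm.Model():
    # Shrinkage prior: mass concentrated near zero, so the income -> SWB
    # effect must earn any departure from zero from the data.
    beta = pm.Normal("income_to_swb", mu=0.0, sigma=0.1)
    sigma = pm.HalfNormal("sigma", sigma=1.0)
    pm.Normal("obs", mu=beta * income[:-1], sigma=sigma, observed=swb_next)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=42)

print(float(idata.posterior["income_to_swb"].mean()))
```

Tightening sigma in the prior pulls the estimate further toward zero and stabilizes it across samples, which is the parsimony and out-of-sample benefit the abstract describes; the same mechanism is what permits estimating models that ML alone cannot identify.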
Alan C. Love, Darwinian calisthenics. An athlete engages in calisthenics as part of basic training and as a preliminary to more advanced or intense activity. Whether it is stretching, lunges, crunches, or push-ups, routine calisthenics provide a baseline of strength and flexibility that prevents a variety of injuries that might otherwise be incurred. Peter Bowler has spent 40 years doing Darwinian calisthenics, researching and writing on the development of evolutionary ideas with special attention to Darwin and subsequent filiations among scientists exploring evolution. Therefore, we would expect that when Bowler engages in a counterfactual history—imagining a world without Darwin—he is able to avoid historical injury and generate novel insights. My assessment is that the results are mixed. Before we can see why, it is necessary to walk briskly through the main contours of his argument. Bowler begins with an apologia for a counterfactual approach ...
Contents: Acknowledgments; 1. Culture Is Essential; 2. Culture Exists; 3. Culture Evolves; 4. Culture Is an Adaptation; 5. Culture Is Maladaptive; 6. Culture and Genes Coevolve; 7. Nothing about Culture Makes Sense except in the Light of Evolution.
Peter van Inwagen famously offers a version of the luck objection to libertarianism called the ‘Rollback Argument.’ It involves a thought experiment in which God repeatedly rolls time backward to provide an agent with many opportunities to act in the same circumstance. Because the agent has the kind of freedom that affords her alternative possibilities at the moment of choice, she performs different actions in some of these opportunities. The upshot is that whichever action she performs in the actual sequence is intuitively a matter of mere chance. I explore a new response to the Rollback Argument. If there are true counterfactuals of libertarian freedom, then the agent performs the same action each time she is placed in the same circumstance, because that is what she would freely do in that circumstance. This response appears to negate the chancy intuition. Ultimately, however, I argue that this new response is unsuccessful, because there is a variant of the Rollback Argument that presents the same basic challenge to the libertarian on the assumption that there are true counterfactuals of libertarian freedom. Thus, true counterfactuals of libertarian freedom do not provide the libertarian with a solution to the Rollback Argument.
Metaphysicians should pay attention to quantum mechanics. Why? Not because it provides definitive answers to many metaphysical questions: the theory itself is remarkably silent on the nature of the physical world, and the various interpretations of the theory on offer present conflicting ontological pictures. Rather, quantum mechanics is essential to the metaphysician because it reshapes standard metaphysical debates and opens up unforeseen new metaphysical possibilities. Even if quantum mechanics provides few clear answers, there are good reasons to think that any adequate understanding of the quantum world will result in a radical reshaping of our classical world-view in some way or other. Whatever the world is like at the atomic scale, it is almost certainly not the swarm of particles pushed around by forces that is often presupposed. This book guides readers through the theory of quantum mechanics and its implications for metaphysics in a clear and accessible way. The theory and its various interpretations are presented with a minimum of technicality. The consequences of these interpretations for metaphysical debates concerning realism, indeterminacy, causation, determinism, holism, and individuality are explored in detail, stressing the novel form that the debates take given the empirical facts in the quantum domain. While quantum mechanics may not deliver unconditional pronouncements on these issues, the range of possibilities consistent with our knowledge of the empirical world is relatively small, and each possibility is metaphysically revisionary in some way. This book will appeal to researchers, students, and anybody else interested in how science informs our world-view.
What is the best account of process reliabilism about epistemic justification, especially epistemic entitlement? I argue that entitlement consists in the normal functioning (proper operation) of the belief-forming process when the process has forming true beliefs reliably as an etiological function. Etiological functions involve consequence explanation: a belief-forming process has forming true beliefs reliably as a function just in case forming true beliefs reliably partly explains the persistence of the process. This account paves the way for avoiding standard objections to process reliabilism and situates epistemic entitlement within a normative framework of functions and functional norms.
Although much has been written about the vigorous debates over science and religion in the Victorian era, little attention has been paid to their continuing importance in early twentieth-century Britain. Reconciling Science and Religion provides a comprehensive survey of the interplay between British science and religion from the late nineteenth century to World War II. Peter J. Bowler argues that unlike the United States, where a strong fundamentalist opposition to evolutionism developed in the 1920s (most famously expressed in the Scopes "monkey trial" of 1925), in Britain there was a concerted effort to reconcile science and religion. Intellectually conservative scientists championed the reconciliation and were supported by liberal theologians in the Free Churches and the Church of England, especially the Anglican "Modernists." Popular writers such as Julian Huxley and George Bernard Shaw sought to create a non-Christian religion similar in some respects to the Modernist position. Younger scientists and secularists—including Rationalists such as H. G. Wells and the Marxists—tended to oppose these efforts, as did conservative Christians, who saw the liberal position as a betrayal of the true spirit of their religion. With the increased social tensions of the 1930s, as the churches moved toward a neo-orthodoxy unfriendly to natural theology and biologists adopted the "Modern Synthesis" of genetics and evolutionary theory, the proposed reconciliation fell apart. Because the tensions between science and religion—and efforts at reconciling the two—are still very much with us today, Bowler's book will be important for everyone interested in these issues.
Contents: Illustrations; Preface; Introduction: A Legacy of Conflict? (Confrontation, Cooperation, or Coexistence?; Victorian Background; Science and Religion in the New Century).
Part One: The Sciences and Religion. 1. The Religion of Scientists (Changing Patterns of Belief; Scientists and Christianity; Scientists and Theism; Method and Meaning; Science and Values). 2. Scientists against Superstition (Science and Rationalism; Religion without Revelation; Marxists and Other Radicals; Science, Religion, and the History of Science). 3. Physics and Cosmology (Ether and Spirit; The New Physics; The Earth and the Universe). 4. Evolution and the New Natural Theology (Science and Creation; Evolution and Progress; The Role of Lamarckism; Darwinism Revived). 5. Matter, Life, and Mind (The Origin of Life; Vitalism and Organicism; Mind and Body; Psychology and Religion).
Part Two: The Churches and Science. 6. The Churches in the New Century (The Challenge of the New; The Churches’ Response). 7. The New Theology in the Free Churches (Precursors of the New Theology; Campbell and the New Theology; Modernism in the Free Churches). 8. Anglican Modernism (Modernism and the New Natural Theology; Charles F. D’Arcy; E. W. Barnes; W. R. Inge; Charles Raven). 9. The Reaction against Modernism (Evangelicals against Evolution; Liberal Catholicism; The Menace of the New Psychology; Science and Modern Life; Theology in the Thirties; Roman Catholicism).
Part Three: The Wider Debate. 10. Science and Secularism (Against Idealism; Popular Rationalism; The Social Reformers). 11. Religion’s Defenders (From Idealism to Spiritualism; Creative and Emergent Evolution; Evolution and the Human Spirit; Progress through Struggle; The Christian Response).
Epilogue; Biographical Appendix; Bibliography; Index.
Ambitiously identifying fresh issues in the study of complex systems, Peter J. Taylor, in a model of interdisciplinary exploration, makes these concerns accessible to scholars in the fields of ecology, environmental science, and science studies. Unruly Complexity explores concepts used to deal with complexity in three realms: ecology and socio-environmental change; the collective constitution of knowledge; and the interpretations of science as they influence subsequent research. For each realm Taylor shows that unruly complexity – situations that lack definite boundaries, where what goes on "outside" continually restructures what is "inside," and where diverse processes come together to produce change – should not be suppressed by partitioning complexity into well-bounded systems that can be studied or managed from an outside vantage point. Using case studies from Australia, North America, and Africa, he encourages readers to be troubled by conventional boundaries – especially between science and the interpretation of science – and to reflect more self-consciously on the conceptual and practical choices researchers make.
Behaviour norms are considered for decision trees that allow both objective probabilities and uncertain states of the world with unknown probabilities. Terminal nodes have consequences in a given domain. Behaviour is required to be consistent in subtrees. Consequentialist behaviour, by definition, reveals a consequence choice function independent of the structure of the decision tree. It implies that behaviour reveals a preference ordering satisfying both the independence axiom and a novel form of the sure-thing principle. Continuous consequentialist behaviour must be expected utility maximizing. Other plausible assumptions then imply additive utilities, subjective probabilities, and Bayes' rule.
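For reference, a compact statement of the representation result reported above, in assumed textbook notation (actions a in A, states s in S, consequences c(a, s)); this is a sketch, not the paper's own formulation:

```latex
% Continuous consequentialist behaviour maximizes subjective expected utility:
\[
  \max_{a \in A} \; \sum_{s \in S} \pi(s)\, u\bigl(c(a, s)\bigr),
\]
% with subjective beliefs revised by Bayes' rule once an event E is learned:
\[
  \pi(s \mid E) =
  \begin{cases}
    \pi(s)/\pi(E) & \text{if } s \in E,\\
    0             & \text{otherwise.}
  \end{cases}
\]
```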
This paper argues for the general proper functionalist view that epistemic warrant consists in the normal functioning of the belief-forming process when the process has forming true beliefs reliably as an etiological function. Such a process is reliable in normal conditions when functioning normally. This paper applies this view to so-called testimony-based beliefs. It argues that when a hearer forms a comprehension-based belief that P (a belief based on taking another to have asserted that P) through the exercise of a reliable competence to comprehend and filter assertive speech acts, then the hearer's belief is prima facie warranted. The paper discusses the psychology of comprehension, the function of assertion, and the evolution of filtering mechanisms, especially coherence checking.
This paper investigates the tenability of wavefunction realism, according to which the quantum mechanical wavefunction is not just a convenient predictive tool, but is a real entity figuring in physical explanations of our measurement results. An apparent difficulty with this position is that the wavefunction exists in a many-dimensional configuration space, whereas the world appears to us to be three-dimensional. I consider the arguments that have been given for and against the tenability of wavefunction realism, and note that both the proponents and the opponents assume that quantum mechanical configuration space is many-dimensional in exactly the same sense in which classical space is three-dimensional. I argue that this assumption is mistaken, and that configuration space can be taken as three-dimensional in a relevant sense. I conclude that wavefunction realism is far less problematic than it has been taken to be. Outline: Introduction; Non-separability; The instantaneous solution; The dynamical solution; Invariance; What is configuration space, anyway?; Conclusion.
The nature of persons is a perennial topic of debate in philosophy, currently enjoying something of a revival. In this volume, for the first time, metaphysical debates about the nature of human persons are brought together with related debates in philosophy of religion and theology. Fifteen specially written essays explore idealist, dualist, and materialist views of persons, discuss specifically Christian conceptions of the value of embodiment, and address four central topics in philosophical theology: incarnation, resurrection, original sin, and the Trinity.
Putnam and Laudan separately argue that the falsity of past scientific theories gives us reason to doubt the truth of current theories. Their arguments have been highly influential, and have generated a significant literature over the past couple of decades. Most of this literature attempts to defend scientific realism by attacking the historical evidence on which the premises of the relevant argument are based. However, I argue that both Putnam's and Laudan's arguments are fallacious, and hence attacking their premises is unnecessary. The paper concludes with a discussion of the further historical evidence that would be required if the pessimistic induction is to present a serious threat to scientific realism.
Stewart Cohen has recently presented solutions to two forms of what he calls "The Problem of Easy Knowledge" ("Basic Knowledge and the Problem of Easy Knowledge," Philosophy and Phenomenological Research, LXV, 2, September 2002, pp. 309-329). I offer alternative solutions. Like Cohen's, my solutions allow for basic knowledge. Unlike his, they do not require that we distinguish between animal and reflective knowledge, restrict the applicability of closure under known entailments, or deny the ability of basic knowledge to combine with self-knowledge to provide inductive evidential support. My solution to the closure version of the problem covers a variation on the problem that is immune to Cohen's approach. My response to the bootstrapping version presents reasons to question whether the problem case, as Cohen presents it, is even possible, and, assuming it is, my solution avoids a false implication of Cohen's own. The key to my solutions for both versions is the distinction between an inference's transferring epistemic support, on the one hand, and its not begging the question against skeptics, on the other.
The Sleeping Beauty paradox in epistemology and the many-worlds interpretation of quantum mechanics both raise problems concerning subjective probability assignments. Furthermore, there are striking parallels between the two cases; in both cases personal experience has a branching structure, and in both cases the agent loses herself among the branches. However, the treatment of probability is very different in the two cases, for no good reason that I can see. Suppose, then, that we adopt the same treatment of probability in each case. Then the dominant ‘thirder’ solution to the Sleeping Beauty paradox becomes incompatible with the tenability of the many-worlds interpretation.
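For orientation, the ‘thirder’ assignment mentioned above is the standard textbook one (not a calculation from the paper): Beauty's three possible awakenings are treated as equally likely centered possibilities, so

```latex
\[
  P(\mathrm{Mon}, H) = P(\mathrm{Mon}, T) = P(\mathrm{Tue}, T) = \tfrac{1}{3}
  \quad\Longrightarrow\quad
  P(H \mid \text{awake}) = \tfrac{1}{3}.
\]
```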
I hold that epistemic warrant consists in the normal functioning of the belief-forming process when the process has forming true beliefs reliably as an etiological function. Evolution by natural selection is the central source of etiological functions. This leads many to think that on my view warrant requires a history of natural selection. What then about learning? What then about Swampman? Though functions require history, natural selection is not the only source. Self-repair and trial-and-error learning are both sources. Warrant requires history, but not necessarily that much.
How should we understand the role of norms—especially epistemic norms—governing assertive speech acts? Mitchell Green (2009) has argued that these norms play the role of handicaps in the technical sense from the animal signals literature. As handicaps, they then play a large role in explaining the reliability—and so the stability (the continued prevalence)—of assertive speech acts. But though norms of assertion conceived of as social norms do indeed play this stabilizing role, these norms are best understood as deterrents and not as handicaps. This paper explains the stability problem for the maintenance of animal signals, and so for human communication, for we are animals too, after all; the mechanics of the handicap principle; the role of deterrents and punishments as an alternative mechanism; and the role of social norms governing assertion for the case of human communication.
Everettian accounts of quantum mechanics entail that people branch; every possible result of a measurement actually occurs, and I have one successor for each result. Is there room for probability in such an account? The prima facie answer is no; there are no ontic chances here, and no ignorance about what will happen. But since any adequate quantum mechanical theory must make probabilistic predictions, much recent philosophical labor has gone into trying to construct an account of probability for branching selves. One popular strategy involves arguing that branching selves introduce a new kind of subjective uncertainty. I argue here that the variants of this strategy in the literature all fail, either because the uncertainty is spurious, or because it is in the wrong place to yield probabilistic predictions. I conclude that uncertainty cannot be the ground for probability in Everettian quantum mechanics.
Advances in molecular biological research in the latter half of the twentieth century have made the story of the gene vastly complicated: the more we learn about genes, the less sure we are of what a gene really is. Knowledge about the structure and functioning of genes abounds, but the gene has also become curiously intangible. This collection of essays renews the question: what are genes? Philosophers, historians and working scientists re-evaluate the question in this volume, treating the gene as a focal point of interdisciplinary and international research. It will be of interest to professionals and students in the philosophy and history of science, genetics and molecular biology.
Jennifer Lackey ('Testimonial Knowledge and Transmission', The Philosophical Quarterly, 1999) and Peter Graham ('Conveying Information', Synthese, 2000; 'Transferring Knowledge', Noûs, 2000) offered counterexamples to show that a hearer can acquire knowledge that P from a speaker who asserts that P but does not know that P. These examples suggest testimony can generate knowledge. The showpiece of Lackey's examples is the Schoolteacher case. This paper shows that Lackey's case does not undermine the orthodox view that testimony cannot generate knowledge. This paper explains why Lackey's arguments to the contrary are ineffective: they misunderstand the intuitive rationale for the view that testimony cannot generate knowledge. This paper then elaborates on a version of the case from Graham's paper 'Conveying Information' (the Fossil case) that effectively shows that testimony can generate knowledge. This paper then provides a deeper, informative explanation of how it is that testimony transfers knowledge, and why there should be cases where testimony generates knowledge.
In quantum mechanics it is usually assumed that mutually exclusive states of affairs must be represented by orthogonal vectors. Recent attempts to solve the measurement problem, most notably the GRW theory, require the relaxation of this assumption. It is shown that a consequence of relaxing this assumption is that arithmetic does not apply to ordinary macroscopic objects. It is argued that such a radical move is unwarranted given the current state of understanding of the foundations of quantum mechanics.
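In assumed textbook notation (not drawn from the paper): the standard assumption is that mutually exclusive states of affairs get orthogonal vectors, while GRW-style collapse leaves small “tails,” so the relaxation at issue allows a tiny but nonzero overlap:

```latex
% Standard assumption: mutually exclusive states are orthogonal.
\[ \langle \psi \mid \varphi \rangle = 0 \]
% Relaxed (GRW-style tails): nearly, but not exactly, orthogonal.
\[ 0 < \lvert \langle \psi \mid \varphi \rangle \rvert \ll 1 \]
```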
Epistemic defeat is standardly understood in either evidentialist or responsibilist terms. The seminal treatment of defeat is an evidentialist one, due to John Pollock, who famously distinguishes between undercutting and rebutting defeaters. More recently, an orthogonal distinction due to Jennifer Lackey has become widely endorsed, between so-called doxastic (or psychological) and normative defeaters. We think that neither doxastic nor normative defeaters, as Lackey understands them, exist. Both of Lackey’s categories of defeat derive from implausible assumptions about epistemic responsibility. Although Pollock’s evidentialist view is superior, the evidentialism per se can be purged from it, leaving a general structure of defeat that can be incorporated in a reliabilist theory that is neither evidentialist nor responsibilist in any way.
What is the biological function of perception? I hold that perception, especially visual perception in humans, has the biological function of accurately representing the environment. Tyler Burge argues in Origins of Objectivity (Oxford, 2010) that this cannot be so, for accuracy is a semantical relationship and not, as such, a practical matter. Burge also provides a supporting example. I rebut the argument and the example. Accuracy is sometimes also a practical matter, if accuracy partly explains how perception contributes to survival and reproduction.
The use of socially learned information (culture) is central to human adaptations. We investigate the hypothesis that the process of cultural evolution has played an active, leading role in the evolution of genes. Culture normally evolves more rapidly than genes, creating novel environments that expose genes to new selective pressures. Many human genes that have been shown to be under recent or current selection are changing as a result of new environments created by cultural innovations. Some changed in response to the development of agricultural subsistence systems in the Early and Middle Holocene. Alleles coding for adaptations to diets rich in plant starch (e.g., amylase copy number) and to epidemic diseases evolved as human populations expanded (e.g., sickle cell and G6PD deficiency alleles that provide protection against malaria). Large-scale scans using patterns of linkage disequilibrium to detect recent selection suggest that many more genes evolved in response to agriculture. Genetic change in response to the novel social environment of contemporary modern societies is also likely to be occurring. The functional effects of most of the alleles under selection during the last 10,000 years are currently unknown. Also unknown is the role of paleoenvironmental change in regulating the tempo of hominin evolution. Although the full extent of culture-driven gene-culture coevolution is thus far unknown for the deeper history of the human lineage, theory and some evidence suggest that such effects were profound. Genomic methods promise to have a major impact on our understanding of gene-culture coevolution over the span of hominin evolutionary history.
The complexity of human societies of the past few thousand years rivals that of social insect societies. We hypothesize that two sets of social “instincts” underpin and constrain the evolution of complex societies. One set is ancient and shared with other social primate species, and one is derived and unique to our lineage. The latter evolved by the late Pleistocene, and led to the evolution of institutions of intermediate complexity in acephalous societies. The institutions of complex societies often conflict with our social instincts. The complex societies of the past few thousand years can function only because cultural evolution has created effective “work-arounds” to manage such instincts. We describe a series of work-arounds and use the data on the relative effectiveness of WWII armies to test the work-around hypothesis.