To what extent should states accommodate religious liberty claims? Can the pluralist state be neutral between religions and secularism? This book explores contemporary legal controversies regarding the protection of religious liberty from a theoretical and comparative perspective, looking at issues such as family and parenting, medical treatment, education, employment, religious group autonomy, and freedom of expression.
The approach of the European Court of Human Rights to cases of religiously offensive expression is inconsistent and unsatisfactory. A critical analysis of the Court’s jurisprudence on blasphemy, religious insult and religious hatred identifies three problems with its approach in this field. These are: the embellishment and over-emphasis of freedom of religion, the use of the margin of appreciation and the devaluing of some forms of offensive speech. Nevertheless, it is possible to defend a more coherent approach to the limitation of freedom of expression under the European Convention on Human Rights, designed to protect religious liberty in a narrower category of cases.
Background: The ARRIVE guidelines are widely endorsed but compliance is limited. We sought to determine whether journal-requested completion of an ARRIVE checklist improves full compliance with the guidelines. Methods: In a randomised controlled trial, manuscripts reporting in vivo animal research submitted to PLOS ONE were randomly allocated to either requested completion of an ARRIVE checklist or current standard practice. Authors, academic editors, and peer reviewers were blinded to group allocation. Trained reviewers performed outcome adjudication in duplicate by assessing manuscripts against an operationalised version of the ARRIVE guidelines that consists of 108 items. Our primary outcome was the between-group difference in the proportion of manuscripts meeting all ARRIVE guideline checklist subitems. Results: We randomised 1689 manuscripts, of which 1269 were sent for peer review and 762 were accepted for publication. No manuscript in either group achieved full compliance with the ARRIVE checklist. "Details of animal husbandry" was the only subitem to show improvement in reporting, with the proportion of compliant manuscripts rising from 52.1% in the control group to 74.1% in the intervention group. Conclusions: These results suggest that altering the editorial process to include requests for a completed ARRIVE checklist is not enough to improve compliance with the ARRIVE guidelines. Other approaches, such as more stringent editorial policies or a targeted approach on key quality items, may promote improvements in reporting.
At the turn of the 21st century, Susan Leigh Anderson and Michael Anderson conceived and introduced the Machine Ethics research program, which aimed to highlight the requirements under which autonomous artificial intelligence systems could demonstrate ethical behavior guided by moral values, and at the same time to show that these values, as well as ethics in general, can be representable and computable. Today, the interaction between humans and AI entities is already part of our everyday lives; in the near future it is expected to play a key role in scientific research, medical practice, public administration, education and other fields of civic life. In view of this, the debate over the ethical behavior of machines is more crucial than ever, and the search for answers, directions and regulations is imperative at an academic and institutional as well as a technical level. Our discussion with the two originators of Machine Ethics highlights the epistemological, metaphysical and ethical questions arising from this project, as well as the realistic and pragmatic demands that dominate artificial intelligence and robotics research programs. Most of all, however, it sheds light upon the contribution of Susan and Michael Anderson to the introduction and pursuit of a central objective: the creation of ethical autonomous agents that will not be based on the “imperfect” patterns of human behavior, or on preloaded hierarchical laws and human-centric values.
On September 27, 2016 people across the world looked down at their buzzing phones to see the AP Alert: “Baby born with DNA from 3 people, first from new technique.” It was an announcement met with confusion by many, but one that polarized the scientific community almost instantly. Some celebrated the birth as an advancement that could help women with a family history of mitochondrial diseases prevent the transmission of the disease to future generations; others held it unethical, citing medical tourism and consequences for the future of the therapy. The child in question was actually born a few months earlier, on April 6, 2016, but the research was published in the October edition of Fertility and Sterility. The mother carries DNA that could have given the baby Leigh Syndrome, a severe neurological disorder characterized by psychomotor regression that typically results in death between ages two and three. Also known as subacute necrotizing encephalomyelopathy, Leigh Syndrome is caused by genetic mutations in mitochondrial DNA that result in defective oxidative phosphorylation. Because people with this condition cannot re-form ATP, “demyelination, gliosis, necrosis, spongiosis, or capillary proliferation” can occur, thereby producing bilateral lesions across the central nervous system. The 36-year-old mother previously had four miscarriages and successfully birthed two children, both of whom survived less than six years due to the syndrome. For religious reasons, the mother opted for Spindle Nuclear Transfer instead of Pronuclear Transfer, which many religious organizations oppose because it entails the destruction of fertilized eggs. In Pronuclear Transfer, both the parent and donor egg are fertilized. The parent’s nuclear material is then removed from the egg containing mutated mitochondria and inserted into the fertilized donor egg—from which the original nuclear material has been removed and destroyed.
Spindle Nuclear Transfer, on the other hand, removes the mother’s nuclear material from the egg with unhealthy mitochondria, then implants it into a donor’s egg. The newly created egg is then fertilized with the father’s sperm. This Spindle Nuclear Transfer was performed in Mexico by a team of doctors led by Dr. John Zhang, the founder of the New Hope Fertility Center in New York City. In their report, the doctors outline that five oocytes were successfully reconstituted, four of which developed into blastocysts. Only one was euploid, containing 46,XY chromosomes. Researchers then biopsied that blastocyst and found that the transmission rate of maternal mtDNA was “5.10 ± 1.11% and the heteroplasmy level for 8993T>G was 5.73%.” This indicates that at the blastocyst stage, around five percent of the mitochondria are from the original maternal egg with the mutation for Leigh Syndrome. After birth, they biopsied different tissues and found an average of less than 1.60 ± 0.92% of transmitted mtDNA from the original maternal egg. The baby is reportedly doing well, and the team of doctors concluded that “[h]uman oocytes reconstituted by SNT are capable of producing a healthy live birth. SNT may provide a novel treatment option in minimizing pathogenic mtDNA transmission from mothers to their babies.” Even when considered theoretically, Spindle Nuclear Transfer is controversial. Many doctors believe that the risks at this stage of research are too great. Dr. Trevor Stammers, a bioethicist at St. Mary’s University in London, points out: “We do not yet have a clear picture of the interaction between nuclear DNA and mitochondria.” Others hold that faulty mitochondrial DNA can still be transferred during the procedure. Still others argue this technology could create problems because of germline modification. Dr.
Paul Knoepfler, a cell biologist at the UC Davis School of Medicine, explains: “Since this is uncharted territory and the children born from this technology would have heritable genetic changes, there are also significant unknown risks to future generations.” However, proponents counter these arguments. They say the benefits outweigh the risks, as “mitochondrial replacement techniques [like Spindle Nuclear Transfer] would eliminate maternal transmission of mitochondrial disease… allowing a woman with a family history of mitochondrial diseases to ensure her children would not be affected.” Additionally, experts say that no symptoms will occur if less than 20% of the transferred mitochondrial DNA is faulty. And while opponents see potential germline modification as a problem, advocates answer that this could stop a family history of mitochondrial disease, which they deem a more serious concern. In this case specifically, however, there are more issues at hand. While the team of doctors did eliminate the possibility of germline modification by selecting a male embryo, there are other ethical concerns. Some argue that this case could be described as “medical tourism.” While New Hope Fertility Center does maintain a branch in Mexico, Dr. Zhang stated that Mexico has “no rules.” It is a country with underdeveloped regulations, making it difficult to confirm that doctors adhere to widely held medical and ethical standards. Furthermore, while it is highly unlikely that the child will develop Leigh Syndrome, Dr. Dieter Egli of the New York Stem Cell Foundation says the 5% mitochondrial transfer rate indicates that the technique “was not carried out well.” He points to studies of embryos where the rate of mtDNA transfer was almost ten times lower. Spindle Nuclear Transfer is currently legal in the UK, and many are hoping to see it legalized in the US, although Congress currently prohibits the FDA from considering applications that would entail trials in people. While Dr.
Zhang’s work is arguably revolutionary, many wonder if the manner in which this study was performed—in Mexico and with a high mitochondrial transfer rate—will impact the technology’s future in the US. Last week, Dr. Zhang presented the report to scientists gathered in Salt Lake City, saying that while science isn’t a race, it is “in a sense, a race for the family to find a cure, to find hope.” Only time will tell if this study set back other families racing and hoping for a cure.

References
1. “Leigh Syndrome.” Genetics Home Reference, US National Library of Medicine, 25 Oct. 2016. Accessed 26 Oct. 2016.
2. McKusick, Victor A., and Ada Hamosh. “LEIGH SYNDROME; LS.” Online Mendelian Inheritance in Man, Johns Hopkins University, 20 Jan. 2016. Accessed 30 Oct. 2016.
3. Zhang, J, H Liu, S Luo, A Chavez-Badiola, and Z Liu. “First live birth using human oocytes reconstituted by spindle nuclear transfer for mitochondrial DNA mutation causing Leigh syndrome.” Fertility and Sterility, vol. 106, no. 3, 2016, pp. e375-76. Accessed 30 Oct. 2016.
4. Sample, Ian. “‘Three-parent’ babies explained: what are the concerns and are they justified?” The Guardian, 2 Feb. 2015. Accessed 30 Oct. 2016.
5. Zhang, et al.
6. Ibid.
7. Knapton, Sarah. “Three-parent babies: the arguments for and against.” The Telegraph, 3 Feb. 2015.
8. Ibid.
9. Ibid.
10. “Mitochondrial Replacement Therapy FAQs.” New York Stem Cell Foundation Research Institute, New York Stem Cell Foundation. Accessed 30 Oct. 2016.
11. Reardon, Sara. “‘Three-parent baby’ claim raises hopes — and ethical concerns.” Nature, 28 Sept. 2016. Accessed 30 Oct. 2016.
12. Hamzelou, Jessica. “Exclusive: World’s first baby born with new “3 parent” technique.” New Scientist, 27 Sept. 2016. Accessed 30 Oct. 2016.
13. Reardon, Sara. “‘Three-parent baby’ claim raises hopes — and ethical concerns.” Nature, 28 Sept. 2016. Accessed 30 Oct. 2016.
14. “3-Person IVF: A Resource Page.” Center for Genetics and Society, 24 Oct. 2016. Accessed 30 Oct. 2016.
15. Ritter, Malcolm. “Baby born with DNA from 3 people, first from new technique.” Associated Press, 27 Sept. 2016. Accessed 30 Oct. 2016.
16. Chen, Daphne. “Controversy swirls around first three-parent baby.” Deseret News, 19 Oct. 2016. Accessed 30 Oct. 2016.

Works consulted
Swetlitz, Ike. “FDA urged to approve ‘three-parent embryos,’ a new frontier in reproduction.” STAT, 3 Feb. 2016. Accessed 30 Oct. 2016.
The rosy dawn of my title refers to that optimistic time when the logical concept of a natural kind originated in Victorian England. The scholastic twilight refers to the present state of affairs. I devote more space to dawn than twilight, because one basic problem was there from the start, and by now those origins have been forgotten. Philosophers have learned many things about classification from the tradition of natural kinds. But now it is in disarray and is unlikely to be put back together again. My argument is founded less on objections to the numerous theories now in circulation than on the sheer proliferation of incompatible views. There no longer exists what Bertrand Russell called ‘the doctrine of natural kinds’—one doctrine. Instead we have a slew of distinct analyses directed at unrelated projects.
How is a person's freedom related to his or her preferences? Liberal theorists of negative freedom have generally taken the view that the desire of a person to do or not do something is irrelevant to the question of whether he is free to do it. Supporters of the “pure negative” conception of freedom have advocated this view in its starkest form: they maintain that a person is unfree to Φ if and only if he is prevented from Φ-ing by the conduct or dispositions of some other person. This definition of freedom is value-neutral in the sense that no reference is made to preferences over options or indeed to any other indicators of the values of options, either in the characterization of “Φ-ing” itself or in the characterization of the way in which Φ-ing can be constrained.
From the time of its clearest origins with Pascal, the theory of probabilities seemed to offer means by which the study of human affairs might be reduced to the same kind of mathematical discipline that was already being achieved in the study of nature. Condorcet is to a great extent merely representative of the philosophers of the seventeenth and eighteenth centuries who were led on by the prospect of developing moral and political sciences on the pattern of the natural sciences, specifically physics. The development of economics and the social sciences, from the eighteenth century onwards, may be said in part to have fulfilled and in a manner to have perpetuated these ambitions. In so far as the new sciences have been susceptible of mathematical treatment, this has not been confined to the calculus of probabilities. But there is a temptation at every stage to ascribe fundamental significance and universal applicability to each latest mathematical device that is strikingly useful or illuminating on its first introduction. It is the theory of games that enjoys this position at present, and shapes the common contemporary conception of the very same problems that preoccupied Condorcet.
The _Mozi_ is a key philosophical work written by a major social and political thinker of the fifth century B.C.E. It is one of the few texts to survive the Warring States period and is crucial to understanding the origins of Chinese philosophy and two other foundational works, the _Mengzi_ and the _Xunzi_. Ian Johnston provides an English translation of the entire _Mozi_, as well as the first bilingual edition in any European language to be published in the West. His careful translation reasserts the significance of the text's central doctrines, and his annotations and contextual explanations add vivid historical and interpretive dimensions. Part 1 of the _Mozi_ is called the "Epitomes" and contains seven short essays on the elements of Mohist doctrine. Part 2, the "Core Doctrines," establishes the ten central tenets of Mo Zi's ethical, social, and political philosophy, while articulating his opposition to Confucianism. Part 3, the "Canons and Explanations," comprises observations on logic, language, disputation, ethics, science, and other matters, written particularly in defense of Mohism. Part 4, the "Dialogues," presents lively conversations between Master Mo and various disciples, philosophical opponents, and potential patrons. Part 5, the "Defense Chapters," details the principles and practices of defensive warfare, a subject on which Master Mo was an acknowledged authority. Now available to English-speaking readers of all backgrounds, the _Mozi_ is a rich and varied text, and this bilingual edition provides an excellent tool for learning classical Chinese.
This article offers two arguments for the conclusion that we should refuse on moral grounds to establish a human presence on the surface of Mars. The first argument appeals to a principle constraining the use of invasive or destructive techniques of scientific investigation. The second appeals to a principle governing appropriate human behavior in wilderness. These arguments are prefaced by two preliminary sections. The first preliminary section argues that authors working in space ethics have good reason to shift their focus away from theory-based arguments in favor of arguments that develop in terms of pretheoretic beliefs. The second argues that, of the popular justifications for sending humans to Mars, only appeals to scientific curiosity can survive reflective scrutiny.
In the first part of chapter 2 of book II of the Physics Aristotle addresses the issue of the difference between mathematics and physics. In the course of his discussion he says some things about astronomy and the “more physical branches of mathematics”. In this paper I discuss historical issues concerning the text, translation, and interpretation of the passage, focusing on two cruxes: the first reference to astronomy at 193b25–26 and the reference to the more physical branches at 194a7–8. In section I, I criticize Ross’s interpretation of the passage and point out that his alteration of the text has no warrant in the Greek manuscripts. In the next three sections I treat three other interpretations, all of which depart from Ross's: in section II that of Simplicius, which I commend; in section III that of Thomas Aquinas, which is importantly influenced by a mistranslation; and in section IV that of Ibn Rushd, which is based on an Arabic text corresponding to that printed by Ross. In the concluding section of the paper I describe the modern history of the Greek text of our passage and translations of it from the early twelfth century until the appearance of Ross's text in 1936.
There is, I gloomily suspect, little which is significantly new that remains to be said about psycho-analysis by philosophers. The almost profligate theorising that goes on within the psycho-analytic journals will, no doubt, continue unabated. It simply strikes me as unlikely that such theorising will generate further issues of the kind that excite the philosophical mind. Though in making such an observation, I recognise that I lay claim upon the future in a manner that many might believe to be unwise. The place of psycho-analysis upon the intellectual map, the implications that psycho-analytic theory and practice have for the various kinds of judgements that we make about human behaviour, have been exhaustively discussed in recent times. Rather more specifically, whether psycho-analysis should be accorded the dignity of being labelled a ‘science’, and what the significance is of psycho-analysis for those complex problems bounded by the notions of Reason, Freedom, and Motivation, have occasioned much fruitful philosophical debate. It is not any wish of mine to add to the literature on these problems in the forlorn hope that even slightly different answers might be forthcoming.
The explosive growth in computational power over the past several decades offers new tools and opportunities for economists. This handbook volume surveys recent research on Agent-based Computational Economics (ACE), the computational study of economic processes modeled as open-ended dynamic systems of interacting agents. Empirical referents for “agents” in ACE models can range from individuals or social groups with learning capabilities to physical world features with no cognitive function. Topics covered include: learning; empirical validation; network economics; social dynamics; financial markets; innovation and technological change; organizations; market design; automated markets and trading agents; political economy; social-ecological systems; computational laboratory development; and general methodological issues.
Introductions to the theory of knowledge are plentiful, but none introduce students to the most recent debates that exercise contemporary philosophers. Ian Evans and Nicholas D. Smith aim to change that. Their book guides the reader through the standard theories of knowledge while simultaneously using these as a springboard to introduce current debates. Each chapter concludes with a “Current Trends” section pointing the reader to the best literature dominating current philosophical discussion. These include: the puzzle of reasonable disagreement; the so-called “problem of easy knowledge”; the intellectual virtues; and new theories in the philosophy of language relating to knowledge. Chapters include discussions of skepticism, the truth condition, belief and acceptance, justification, internalism versus externalism, epistemic evaluation, and epistemic contextualism. Evans and Smith do not merely offer a review of existing theories and debates; they also offer a novel theory that takes seriously the claim that knowledge is not unique to humans. Surveying current scientific literature in animal ethology, they discover surprising sophistication and diversity in non-human cognition. In their final analysis the authors provide a unified account of knowledge that manages to respect and explain this diversity. They argue that animals know when they make appropriate use of the cognitive processes available to animals of that kind, in environments within which those processes are veridically well-adapted. _Knowledge_ is a lively and accessible volume, ideal for undergraduate and post-graduate students. It is also set to spark debate among scholars for its novel approaches to traditional topics and its thoroughgoing commitment to naturalism.
He concludes with an assessment of democracy's strengths and limitations as the font of political legitimacy. The book offers a lucid and accessible introduction to urgent ongoing conversations about the sources of political allegiance.
Economies are complicated systems encompassing micro behaviors, interaction patterns, and global regularities. Whether partial or general in scope, studies of economic systems must consider how to handle difficult real-world aspects such as asymmetric information, imperfect competition, strategic interaction, collective learning, and the possibility of multiple equilibria. Recent advances in analytical and computational tools are permitting new approaches to the quantitative study of these aspects. One such approach is Agent-based Computational Economics (ACE), the computational study of economic processes modeled as dynamic systems of interacting agents. This chapter explores the potential advantages and disadvantages of ACE for the study of economic systems. General points are concretely illustrated using an ACE model of a two-sector decentralized market economy. Six issues are highlighted: constructive understanding of production, pricing, and trade processes; the essential primacy of survival; strategic rivalry and market power; behavioral uncertainty and learning; the role of conventions and organizations; and the complex interactions among structural attributes, institutional arrangements, and behavioral dispositions.
Classical logic has been attacked by adherents of rival, anti-realist logical systems: Ian Rumfitt comes to its defence. He considers the nature of logic, and how to arbitrate between different logics. He argues that classical logic may dispense with the principle of bivalence, and may thus be liberated from the dead hand of classical semantics.
This is a work of normative political philosophy that seeks to identify the legitimate goals of public education policy in liberal democratic states and the implications of those goals for arguments about public funding and regulation of religious schools.

The thesis of the first section is that the inferiority of certain types of religious school as instruments of civic education in a pluralist state would not suffice to justify liberal states in a general refusal to fund such schools. States with no position on the value of autonomy for the good life would have to balance civic concerns against the preferences of religious parents who want to send their children to narrowly religious schools to shield them from exposure to ethical diversity. But, I argue, the principles of liberal democracy actually presuppose the value of autonomy.

In the second section, I develop a conception of ethical autonomy and argue for its adoption as a public value. Autonomy, understood to entail distinctively rational reflection that must nonetheless inevitably be situated within an unchosen cultural context, can be publicly justified as having instrumental value to all persons in their quest to live a good life. And I defend the legitimacy of adopting autonomy as a goal of public education policy against a series of objections, most notably those grounded in claims about parental rights and fairness to traditional cultures.

In the third section, I explore the implications of the autonomy goal for religious schools.
After defending secular public schools from several prominent criticisms, I consider the argument that religious secondary schools are unsuitable to deliver education for autonomy because they provide children with inadequate exposure to and rational engagement with ethical diversity: I conclude that states cannot justify prohibiting or even presumptively denying public funding to all religious secondary schools, but that there is need for extensive public regulation. Finally, I argue that religious primary schools should be treated differently because of the particular developmental needs and capacities of pre-adolescents. Religious primary schools whose pedagogy is non-authoritarian are specially suitable to lay the foundations for autonomy in young children from religious families.
Given the centrality of arguments from vicious infinite regress to our philosophical reasoning, it is little wonder that they should also appear on the catalogue of arguments offered in defense of theses that pertain to the fundamental structure of reality. In particular, the metaphysical foundationalist will argue that, on pain of vicious infinite regress, there must be something fundamental. But why think that infinite regresses of grounds are vicious? I explore existing proposed accounts of viciousness cast in terms of contradictions, dependence, failed reductive theories and parsimony. I argue that no one of these accounts adequately captures the conditions under which an infinite regress—any infinite regress—is vicious as opposed to benign. In their place, I suggest an account of viciousness in terms of explanatory failure. If this account is correct, infinite grounding regresses are not necessarily vicious; and we must be much more careful employing such arguments to the conclusion that there has to be something fundamental.
_Learning Communities_ is a groundbreaking book that shows how learning communities can be a flexible and effective approach to enhancing student learning, promoting curricular coherence, and revitalizing faculty. Written by Barbara Leigh Smith, Jean MacGregor, Roberta S. Matthews, and Faith Gabelnick, acclaimed national leaders in the learning communities movement, this important book provides the historical, conceptual, and philosophical context for LCs and clearly demonstrates that they can be a key element in institutional transformation.
Love, fear, hope, calculus, and game shows: how do all these spring from a few delicate pounds of meat? Neurophysiologist Ian Glynn lays the foundation for answering this question in his expansive An Anatomy of Thought, but stops short of committing to one particular theory. The book is a pleasant challenge, presenting the reader with the latest research and thinking about neuroscience and how it relates to various models of consciousness. Combining the aim of a textbook with the style of a popularization, it provides all the lay reader needs to know to participate in the philosophical debate that is redefining our attitudes about our minds. Drawing on the rich history of neurological case studies, Glynn picks through the building blocks of our nervous system, examines our visual and linguistic systems, and probes deeply into our higher thought processes. The stories of great scientists, like Ramon y Cajal, and famous patients, like Sperry's split-brained epileptics, illuminate the scientific issues Glynn selects as essential for understanding consciousness. Some might argue that his lengthy explorations of natural selection overemphasize evolutionary explanations of psychological phenomena, but they must also agree that evolutionary psychology has distanced itself mightily from social Darwinism in recent years and merits a reappraisal. The great consciousness debate may form the core of the 21st-century Zeitgeist; get ready for it with An Anatomy of Thought. -Rob Lightner

From Publishers Weekly: How do we know? What do we think? How could a philosophical problem ('the mind-body problem,' say) induce a headache? What can evolutionary theory, molecular biology, the history of medicine and experimental psychology tell us about the features of human consciousness, and (once again) how do we know?
Glynn, a physician and Cambridge University professor, meticulously attempts to answer these questions and more, setting forth the results of all sorts of research relevant to our brains, from 19th-century dissections to Oliver Sacks-like case studies, work with monkeys and supercomputers, and the enduring puzzles of philosophy, which he rightly saves for near the end. After explaining evolution by natural selection and 'clearing away much dross,' Glynn lays out the experiments and theories that have shown 'how nerve cells can carry information about the body, how they can interact' and how sense organs work; demonstrates the 'mixture of parallel and hierarchical organization' in our brains and 'the striking localization of function within it'; considers where neuroscience is likely to go; and admits that, among the many fields of exciting research just ahead, 'we can be least confident of progress toward a complete, scientific explanation of our sensations and thoughts and feelings.' Other recent explaining-the-brain books have sometimes advanced simplistic, or implausibly grand, claims about the nature and features of consciousness in general. Instead, Glynn offers a patient, informative, well-laid-out researcher's-eye view of what we have learned, how we figured it out and what we still don't know about neurons, senses, feelings, brains and minds. (Apr.) Copyright 2000 Reed Business Information, Inc.

From Library Journal: The nature of consciousness, which perennially troubles the minds of scientists and philosophers, is the subject of an ever-growing body of literature. Two of the latest entries approach the topic from different perspectives. Glynn, a professor of physiology and head of the Physiological Laboratory at Cambridge, offers a comprehensive summary of what we know about the brain, both its evolution and its mechanisms.
Among the topics he covers are natural selection, molecular evolution, nerves and the nervous system, sensory perception, and the specific structures responsible for our intellect. Using the mechanisms involved in vision and speech as models, Glynn skillfully describes various neurological deficiencies that can lead to 'disordered seeing' and problems with the use of language. He carefully distinguishes what we know through experimental evidence from what we know through the observation of patients with neurological damage. He also describes some of the major theories that attempt to explain why these structures arose. While his book concentrates on the structures that make up the mind, Glynn is well aware that some physical events appear explicable only in terms of conscious mental events, a situation that conflicts with the laws of modern physics. Only briefly, however, does he consider the various approaches that have been taken to deal with the issues of mind/body and free will. In contrast, this is the primary focus of The Physics of Consciousness. After reviewing the fundamentals of classical physics, Walker (who has a Ph.D. in physics) summarizes elements of the new physics, in which our knowledge of space, time, matter, and energy is dependent on the moment of observation. Walker explores the meaning of consciousness as a characteristic of the observer. In this context both the observer and the act of measurement are critical. In essence, Walker leads his reader on a journey through his concept of a 'quantum mind,' which can both affect matter (including other minds) and be affected by distant matter/minds. To break up what would otherwise be an extremely dense text, Walker also relates the very touching story of the loss of his high-school sweetheart to leukemia. Indeed, it is his memory of their relationship that drives Walker to seek an understanding of ultimate reality.
At times, he has a tendency to be dogmatic, as when he concludes, 'our consciousness, our mind, and the will of God are the same mind.' While An Anatomy of Thought is appropriate for most academic libraries, The Physics of Consciousness will be most accessible to readers with some knowledge of advanced physics. -Laurie Bartolini, Illinois State Lib., Springfield Copyright 2000 Reed Business Information, Inc.

From Booklist
The codiscoverers of natural selection, Charles Darwin and Alfred Wallace, disagreed over the possibility of finding an evolutionary explanation for the human mind. Glynn here argues Darwin's side of the debate, tracing an eons-long path of development starting from simple amino acids floating in primal seas and extending through the erect hominids in which the powers of a massive brain first manifest themselves. Patiently adducing evidence of an evolutionary origin for the underlying molecular machinery, Glynn dissects the nerve centers that make possible speech and hearing, sight, and reading. Pressing deeper, he lays bare the cortical foundations of personality. But those who deal with the mind must attend also to the arguments advanced by philosophers. And it is when he turns from dendrites to syllogisms (especially the vexing mind-body paradox) that Glynn's empirical reasoning fails him. In the end, he concedes his perplexity in trying to conceive of an evolutionary origin for human consciousness. This concession may set the shade of Alfred Wallace to chortling, but it will draw readers into an honest confrontation with a profound enigma. -Bryce Christensen
In “Global Knowledge Frameworks and the Tasks of Cross-Cultural Philosophy,” Leigh Jenco searches for the conception of knowledge that best justifies the judgment that one can learn from non-local traditions of philosophy. Jenco considers four conceptions of knowledge, namely, in catchwords, the esoteric, Enlightenment, hermeneutic, and self-transformative conceptions of knowledge, and she defends the latter as more plausible than the former three. In this critical discussion of Jenco’s article, I provide reason to doubt the self-transformative conception, and also advance a fifth, pluralist conception of knowledge that I contend best explains the prospect of learning from traditions other than one’s own.
A good book may have the power to change the way we see the world, but a great book actually becomes part of our daily consciousness, pervading our thinking to the point that we take it for granted, and we forget how provocative and challenging its ideas once were—and still are. _The Structure of Scientific Revolutions_ is that kind of book. When it was first published in 1962, it was a landmark event in the history and philosophy of science. Fifty years later, it still has many lessons to teach. With _The Structure of Scientific Revolutions_, Kuhn challenged long-standing linear notions of scientific progress, arguing that transformative ideas don’t arise from the day-to-day, gradual process of experimentation and data accumulation but that the revolutions in science, those breakthrough moments that disrupt accepted thinking and offer unanticipated ideas, occur outside of “normal science,” as he called it. Though Kuhn was writing when physics ruled the sciences, his ideas on how scientific revolutions bring order to the anomalies that amass over time in research experiments are still instructive in our biotech age. This new edition of Kuhn’s essential work in the history of science includes an insightful introduction by Ian Hacking, which clarifies terms popularized by Kuhn, including paradigm and incommensurability, and applies Kuhn’s ideas to the science of today. Usefully keyed to the separate sections of the book, Hacking’s introduction provides important background information as well as a contemporary context. Newly designed, with an expanded index, this edition will be eagerly welcomed by the next generation of readers seeking to understand the history of our perspectives on science.
This book examines Renaissance modes of interpretation as they arise in legal contexts, and relates them to modern debates about meaning and its determination. By placing legal hermeneutic theories in their institutional and pedagogical contexts, the author is able to give an account of Renaissance thought showing how it operates in its own terms, and in relation to the thought of the medieval period. Renaissance legal thought is also compared to modern discussions of interpretation, allowing a critical examination of its coherence and consistency.
In this paper I argue for the superiority of a critical realist understanding of interdisciplinarity over a mainstream understanding of it. I begin by exploring the reasons for the failure of mainstream researchers to achieve interdisciplinarity. My main argument is that mainstream interdisciplinary researchers tend to hypostatize facts, fetishize constant conjunctions of events and apply to open systems an epistemology designed for closed systems. I also explain how mainstream interdisciplinarity supports oppression and gross inequality. I argue that mainstream interdisciplinarity is not true interdisciplinarity and refer to it accordingly as ‘condisciplinarity’. By way of example, I examine the condisciplinarity of the World Health Organization’s ecological model applied to the issue of men’s violence against women. Specifically, I argue that critical realist interdisciplinarity is preferable because it acknowledges inter alia the empirical, actual and real layers of reality, which allows it to develop depth-explanations of phenomena. In practice, this means that critical realist interdisciplinarity can potentially provide explanations that, compared to condisciplinarity, are broader and deeper. In the World Health Organization’s example of the causes of men’s violence against women, condisciplinarity resulted in the omission of historical, global and unconscious aspects of the problem. It also restricted the analysis to reductive, constant-conjunction based theories of the causes of the problem, specifically ‘risk factors’, thereby providing a relatively shallow explanation for the problem.
In days past, epistemologists expended a good deal of effort trying to analyze the basing relation—the relation between a belief and its basis. No satisfying account was offered, and the project was largely abandoned. Younger epistemologists, however, have begun to yearn for an adequate theory of basing. I aim to deliver one. After establishing some data and arguing that traditional accounts of basing are unsatisfying, I introduce a novel theory of the basing relation: the dispositional theory. It begins with the pedestrian observation that beliefs stand or fall with their bases. The theory I offer is an elucidation and refinement of this thought.
This Element considers the relationship between the traditional view of God as all-powerful, all-knowing and wholly good on the one hand, and the idea of human free will on the other. It focuses on the potential threats to human free will arising from two divine attributes: God's exhaustive foreknowledge and God's providential control of creation.
The analytical notion of ‘scientific style of reasoning’, introduced by Ian Hacking in the middle of the 1980s, has become widespread in the literature of the history and philosophy of science. However, scholars have rarely made explicit the philosophical assumptions and the research objectives underlying the notion of style: what are its philosophical roots? How does the notion of style fit into the area of research of historical epistemology? What does a comparison between Hacking’s project on styles of thinking and other similar projects suggest? My aim in this paper is to answer these questions. Hacking has denied that his project of styles of thinking falls into the field of historical epistemology. I shall challenge his remark by tracing out the connections of the notion of style with historical epistemology and, more generally, with a tradition of thought born in France at the beginning of the twentieth century.
Written by Thomas Hobbes and first published in 1651, _Leviathan_ is widely considered the greatest work of political philosophy ever composed in the English language. Hobbes's central argument—that human beings are first and foremost concerned with their own fears and desires, and that they must relinquish basic freedoms in order to maintain a peaceful society—has found new adherents and critics in every generation. This new edition, which uses modern text and relies on large-sheet copies from the 1651 Head version, includes interpretive essays by four leading Hobbes scholars: John Dunn, David Dyzenhaus, Elisabeth Ellis, and Bryan Garsten. Taken together with Ian Shapiro’s wide-ranging introduction, they provide fresh and varied interpretations of _Leviathan_ for our time.
Humanity has sat at the center of philosophical thinking for too long. The recent advent of environmental philosophy and posthuman studies has widened our scope of inquiry to include ecosystems, animals, and artificial intelligence. Yet the vast majority of the stuff in our universe, and even in our lives, remains beyond serious philosophical concern. In _Alien Phenomenology, or What It’s Like to Be a Thing_, Ian Bogost develops an object-oriented ontology that puts things at the center of being—a philosophy in which nothing exists any more or less than anything else, in which humans are elements but not the sole or even primary elements of philosophical interest. And unlike experimental phenomenology or the philosophy of technology, Bogost’s alien phenomenology takes for granted that _all_ beings interact with and perceive one another. This experience, however, withdraws from human comprehension and becomes accessible only through a speculative philosophy based on metaphor. Providing a new approach for understanding the experience of things _as_ things, Bogost also calls on philosophers to rethink their craft. Drawing on his own background as a videogame designer, Bogost encourages professional thinkers to become makers as well, engineers who construct things as much as they think and write about them.