Current research suggests that nonclinical forensic psychologists are appearing increasingly often in the legal arena. We argue that many of the ethical dilemmas that face these psychologists differ from those encountered by clinical forensic psychologists. To test the accuracy of this assertion, 37 nonclinical forensic psychologists were surveyed to identify some of the ethical issues and dilemmas they have encountered while engaging in expert testimony or pretrial consulting. Respondents were also asked how they have resolved these ethical issues and whether they were aware of the "Specialty Guidelines for Forensic Psychologists" (Committee on Ethical Guidelines for Forensic Psychologists, 1991). Results of the survey are discussed in terms of the need for additional regulatory guidelines or professional standards that speak directly to the ethical issues confronting nonclinical forensic expert witnesses and consultants.
Hobbes's conception of reason as computation or reckoning in Part I of De Corpore is significantly different from what I take to be the later treatment in Leviathan. In the latter, actual computation with words starts with making an affirmation, framing a proposition. Reckoning then has to do with the consequences of propositions, or how they connect the facts, states of affairs, or actions to which they refer. Starting from this, it can be made clear how Hobbes understood the crucial application of this conception to natural law, identified as 'right reason'.
The interview reconstructs Jeffrey Schnapp's brilliant career from his origins as a scholar of Dante and the Middle Ages to his current multiple interdisciplinary interests. Among other things, Schnapp deals with knowledge design, media history and theory, history of the book, the future of archives, museums, and libraries. The main themes of the interview concern the relationships between technology and pedagogy, the future of reading, and artificial intelligence.
In 1929 Ernst Cassirer and Martin Heidegger participated in a momentous debate in Davos, Switzerland, which is widely held to have marked an important division in twentieth-century European thought. Peter E. Gordon’s recent book, Continental Divide: Heidegger, Cassirer, Davos, centers on this debate between these two philosophical adversaries. In his book Gordon examines the background of the debate, the issues that distinguished the respective positions of Cassirer and Heidegger, and the legacy of the debate for later decades. Throughout the work, Gordon concisely portrays the source of disagreement between the two adversaries in terms of a difference between Cassirer’s philosophy of spontaneity and Heidegger’s philosophy of receptivity, or of “thrownness” into a situation that finite human beings can never hope to master. Although it recognizes that this work provides an important contribution to our understanding of the Davos debate and to twentieth-century European thought, this review essay subjects Gordon’s manner of interpreting the distinction between Cassirer and Heidegger to critical scrutiny. Its purpose is to examine the possibility that important aspects of the debate, which do not conform to the grid imposed by Gordon’s interpretation, might have been set aside in the context of his analysis.
For nearly half a century, Quentin Skinner has been the world's foremost interpreter of Thomas Hobbes. When the contextualist mode of intellectual history now known as the “Cambridge School” was first asserting itself in the 1960s, the life and writings of John Locke were the primary topic for pioneers such as Peter Laslett and John Dunn. At that time, Hobbes was still the plaything of philosophers and political scientists, virtually all of whom wrote in an ahistorical, textual-analytic manner. Hobbes had not been the subject of serious contextual research for decades, since the foundational writings of Ferdinand Tönnies. For Skinner, he was thus an ideal subject, providing a space for original research on a major figure, and an occasion for some polemically charged methodological manifestos. Both of these purposes animated his 1965 article “History and Ideology in the English Revolution,” and his 1966 article “The Ideological Context of Hobbes's Political Thought”. The latter of these remains to this day one of the most widely cited scholarly articles in the fifty-year run of Cambridge's Historical Journal. Among other results of these early efforts was the scholarly controversy during which Howard Warrender chided Skinner for having reduced the “classic texts in political philosophy” to mere “tracts for the times”.
In his introduction, Jeffrey Metzger states that “at some point in the past 20 or 30 years … Nietzsche’s name [became] no longer associated primarily with nihilism” (1). Metzger is pointing to the increasing contemporary scholarly interest in Nietzsche’s epistemology, naturalism, and metaethics. The worthy aim of this volume is to ask us to examine once again the underlying philosophical problem to which these views are a response, namely, nihilism. This volume helpfully reminds us that Nietzsche’s philosophical motivation still requires clarification, and that we can only fully understand Nietzsche’s particular views by grasping Nietzsche’s fundamental philosophical aims. As with so many edited volumes on …
In Zadig, published in 1748, Voltaire wrote of “the great principle that it is better to run the risk of sparing the guilty than to condemn the innocent.” At about the same time, Blackstone noted approvingly that “the law holds that it is better that ten guilty persons escape, than that one innocent suffer.” In 1824, Thomas Fielding cited the principle as an Italian proverb and a maxim of English law. John Stuart Mill endorsed it in an address to Parliament in 1868. General acceptance of this maxim continues into our own period, yet it is difficult to find systematic attempts to defend the maxim. It is treated as a truism in no need of defense. But the principle within it is not at all obvious; and since it undergirds many of our criminal justice policies, we should be sure that it is justifiable. First, however, we must clarify what the principle means.
Sandra Field, Jeffrey Flynn, Stephen Macedo, Longxi Zhang, and Martin Powers discussed Powers’ book China and England: The Preindustrial Struggle for Social Justice in Word and Image at the American Philosophical Association’s 2020 Eastern Division meeting in Philadelphia. The panel was sponsored by the APA’s “Committee on Asian and Asian-American Philosophers and Philosophies” and organized by Brian Bruya.
While Classical Logic (CL) used to be the gold standard for evaluating the rationality of human reasoning, certain non-theorems of CL—like Aristotle’s and Boethius’ theses—appear intuitively rational and plausible. Connexive logics have been developed to capture the underlying intuition that conditionals whose antecedents contradict their consequents should be false. We present results of two experiments (total n = 72), the first to investigate connexive principles and related formulae systematically. Our data suggest that connexive logics provide more plausible rationality frameworks for human reasoning compared to CL. Moreover, we experimentally investigate two approaches for validating connexive principles within the framework of coherence-based probability logic (Pfeifer & Sanfilippo, 2021). Overall, we observed good agreement between our predictions and the data, especially for Approach 2.
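As an illustrative sketch (my own, not the authors' experimental materials): on the conditional-probability reading of the conditional, Aristotle's Thesis ~(~A → A) comes out highly plausible, because "if not-A then A" receives P(A | not-A), which is 0 whenever it is defined.

```python
def cond_prob(p_joint, p_cond_event):
    """P(X|Y) = P(X and Y) / P(Y); undefined (None) when P(Y) = 0."""
    return p_joint / p_cond_event if p_cond_event > 0 else None

# Aristotle's thesis ~(~A -> A): since A and not-A are incompatible,
# P(A and not-A) = 0, so the conditional "if not-A then A" gets
# probability 0, and its negation probability 1, for any non-extreme P(A).
p_a = 0.7
p_not_a = 1 - p_a
p_a_and_not_a = 0.0  # contradiction
p_conditional = cond_prob(p_a_and_not_a, p_not_a)
print(p_conditional, 1 - p_conditional)  # -> 0.0 1.0
```

This only gestures at the intuition; the coherence-based treatment in the paper works with conditional events and imprecise assessments rather than a single point-valued distribution.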
In this paper, I discuss how information theory has been used in the study of animal communication, as well as how these uses are justified. Biologists justify their use of Shannon’s information measures by the work they do in allowing for comparisons between different organisms and because they measure a quantity that is purported to be important for natural selection. I argue that there are problems with both sorts of justification. To make these difficulties clear, I focus on the use of Shannon’s information measures to quantify the amount of information transmitted by the fire ant’s odor trail and the honeybee’s waggle dance. Both of these systems are relatively simple and well understood, and the application of Shannon’s information measure to these systems initially seemed very promising and relatively straightforward. They are therefore particularly suitable for revealing the benefits and difficulties of applying Shannon’s information measures to biological systems in general, and animal communication systems in particular.
Retributive restrictions are principles of justice according to which what a criminal deserves on account of his individual conduct and character restricts how states are morally permitted to treat him. The main arguments offered in defense of retributive restrictions involve thought experiments in which the state punishes the innocent, a practice known as telishment. In order to derive retributive restrictions from the wrongness of telishment, one must engage in moral argument from generalization. I show how generalization arguments of the same form can be used subversively to derive morally unacceptable conclusions from other scenarios in which the state intentionally inflicts undeserved coercion. For example, our considered moral convictions approve of punishment policies that inflict collateral damage, such as the ubiquitous policy of excluding the family members of inmates from prison facilities outside visiting hours. I present a generalization argument for the conclusion that these policies are seriously unjust. If we firmly believe that these policies are not unjust, then we should put less stock in generalization arguments. We should not use them to support retributive restrictions. This conclusion has broad implications for the theory and practice of criminal justice.
Legal decision-support systems have the potential to improve access to justice, administrative efficiency, and judicial consistency, but broad adoption of such systems is contingent on development of technologies with low knowledge-engineering, validation, and maintenance costs. This paper describes two approaches to an important form of legal decision support—explainable outcome prediction—that obviate both annotation of an entire decision corpus and manual processing of new cases. The first approach, which uses an attention network for prediction and attention weights to highlight salient case text, was shown to be capable of predicting decisions, but attention-weight-based text highlighting did not demonstrably improve human decision speed or accuracy in an evaluation with 61 human subjects. The second approach, termed semi-supervised case annotation for legal explanations, exploits structural and semantic regularities in case corpora to identify textual patterns that have both predictable relationships to case decisions and explanatory value.
Geoffrey Marnell presents philosophical arguments favoring grammatical descriptivism over grammatical prescriptivism. I argue that his explanation and defence of descriptivism reveal that his descriptivism is itself prescriptivist.
This collection of essays looks at the relation between phenomenology and the political from a variety of possible positions, both critical and complementary.
It is shown in detail that recent accounts fail to distinguish between intentionality and merely causally dispositional states of inorganic physical objects—a quick road to panpsychism. The clear need to make such a distinction gives direction for future work. A beginning is made toward providing such an account.
In "The Function of Consciousness on Matter", Chinese Studies in Philosophy 12 (1981), pp. 38-54, Yu claims that in order to understand how consciousness can affect the physical world, two categories of matter must be distinguished. I argue that Yu's distinction has no explanatory force and, moreover, is at odds with his materialist assumptions. I then suggest other strategies.
The Jeffreys–Lindley paradox displays how the use of a p value (or observed significance level) in a frequentist hypothesis test can lead to an inference that is radically different from that of a Bayesian hypothesis test in the form advocated by Harold Jeffreys in the 1930s and common today. The setting is the test of a well-specified null hypothesis versus a composite alternative. The p value, as well as the ratio of the likelihood under the null hypothesis to the maximized likelihood under the alternative, can strongly disfavor the null hypothesis, while the Bayesian posterior probability for the null hypothesis can be arbitrarily large. The academic statistics literature contains many impassioned comments on this paradox, yet there is no consensus either on its relevance to scientific communication or on its correct resolution. The paradox is quite relevant to frontier research in high energy physics. This paper is an attempt to explain the situation to both physicists and statisticians, in the hope that further progress can be made.
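The paradox can be reproduced in a few lines. The following is a minimal numerical sketch (not taken from the paper; all names and values are illustrative), assuming a normal model with known variance, a point null H0: mu = mu0, and a N(mu0, tau^2) prior on mu under the alternative. Holding the z-statistic fixed while the sample size grows, the p value stays constant but the posterior probability of the null climbs toward 1.

```python
import math

def normal_pdf(x, mean, var):
    """Density of a normal distribution with the given mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def lindley_demo(n, z=2.5, sigma=1.0, tau=1.0, mu0=0.0):
    """Return (two-sided p value, posterior P(H0)) for a sample of size n
    whose mean is chosen so the z-test statistic is exactly z."""
    xbar = mu0 + z * sigma / math.sqrt(n)               # data with statistic z
    p_value = 2 * (1 - phi(z))                          # frequentist tail area
    m0 = normal_pdf(xbar, mu0, sigma**2 / n)            # marginal lik. under H0
    m1 = normal_pdf(xbar, mu0, sigma**2 / n + tau**2)   # under H1, prior N(mu0, tau^2)
    bf01 = m0 / m1                                      # Bayes factor for H0
    posterior_h0 = bf01 / (1 + bf01)                    # prior odds 1:1
    return p_value, posterior_h0

print(lindley_demo(10))      # p ~ 0.012, P(H0|data) ~ 0.16
print(lindley_demo(100000))  # p ~ 0.012, P(H0|data) ~ 0.93
```

The same "significant" z = 2.5 thus disfavors the null at n = 10 but supports it at n = 100000, which is the heart of the disagreement between the two procedures.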
There is wide support in logic, philosophy, and psychology for the hypothesis that the probability of the indicative conditional of natural language, P(if A then B), is the conditional probability of B given A, P(B|A). We identify a conditional which is such that P(if A then B)=P(B|A) with de Finetti's conditional event, B|A. An objection to making this identification in the past was that it appeared unclear how to form compounds and iterations of conditional events. In this paper, we illustrate how to overcome this objection with a probabilistic analysis, based on coherence, of these compounds and iterations. We interpret the compounds and iterations as conditional random quantities which, given some logical dependencies, may reduce to conditional events. We show how the inference to B|A from A and B can be extended to compounds and iterations of both conditional events and biconditional events. Moreover, we determine the respective uncertainty propagation rules. Finally, we make some comments on extending our analysis to counterfactuals.
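As a small worked example (my own sketch, not from the paper): de Finetti's conditional event B|A is three-valued. As a random quantity it equals 1 on A-and-B, 0 on A-and-not-B, and, in the coherence-based treatment, takes the value x = P(B|A) itself when A is false; coherence then forces its prevision (expected value) to be exactly P(B|A).

```python
def conditional_event_prevision(p_ab, p_a_notb, p_nota):
    """de Finetti's conditional event B|A as a random quantity:
    value 1 on A&B, 0 on A&not-B, and x = P(B|A) when A is false.
    Returns (x, prevision); coherence makes them equal."""
    p_a = p_ab + p_a_notb
    x = p_ab / p_a                                   # P(B|A)
    prevision = 1 * p_ab + 0 * p_a_notb + x * p_nota
    return x, prevision

x, prev = conditional_event_prevision(0.3, 0.1, 0.6)
print(x, prev)  # both ~ 0.75: the prevision of B|A is P(B|A)
```

This is only the base case; the paper's contribution is extending this treatment to compounds and iterations of such conditional events.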
The physical activity-related health competence (PAHCO) model assumes that individuals require movement competence, control competence, and self-regulation competence to lead a healthy, physically active lifestyle. Although previous research has already established some measurement factors of the three dimensions, no attempts have so far been made to statistically aggregate them on the sub-competence level. Therefore, the goal of the present study was to test two additional factors for PAHCO and subsequently model the second-order structure with two samples from the fields of rehabilitation and prevention. We conducted two questionnaire surveys with persons with multiple sclerosis and teaching students undergoing a basic qualification course in physical education. After performing exploratory item analysis, we used second-order confirmatory factor analysis (CFA) and multidimensional scaling to investigate whether the scales could be bundled in accordance with the PAHCO model. The CFAs with 10 factors demonstrated a good model fit. In contrast, the second-order analysis with a simple loading structure on the three sub-competencies revealed an unacceptable model fit. Instead, a second-order model variant was preferred (comparative fit index = 0.926, root mean square error of approximation = 0.048, standardized root mean square residual = 0.065) in which body awareness and self-efficacy had theory-conforming cross-loadings. The results of multidimensional scaling were in line with the extracted second-order structure. The present results suggested that the extension of the measurement instrument to 10 first-order factors was psychometrically justified for the two populations. The results from the second-order analyses provided the basis for the creation of sum scores, representing manifest indicators of movement competence, control competence, and self-regulation competence.
Future studies are needed that cross-validate the extended measurement model with other populations and that relate the sub-competencies of PAHCO to indicators of health-enhancing physical activity.
Belief in propositions has had a long and distinguished history in analytic philosophy. Three of the founding fathers of analytic philosophy, Gottlob Frege, Bertrand Russell, and G. E. Moore, believed in propositions. Many philosophers since then have shared this belief; and the belief is widely, though certainly not universally, accepted among philosophers today. Among contemporary philosophers who believe in propositions, many, and perhaps even most, take them to be structured entities with individuals, properties, and relations as constituents. For example, the proposition that Glenn loves Tracy has Glenn, the loving relation, and Tracy as constituents. What is it, then, that binds these constituents together and imposes structure on them? And if the proposition that Glenn loves Tracy is distinct from the proposition that Tracy loves Glenn yet both have the same constituents, what is it about the way these constituents are structured or bound together that makes them two different propositions? In The Nature and Structure of Content, Jeffrey C. King formulates a detailed account of the metaphysical nature of propositions, and provides fresh answers to the above questions. In addition to explaining what it is that binds together the constituents of structured propositions and imposes structure on them, King deals with some of the standard objections to accounts of propositions: he shows that there is no mystery about what propositions are; that given certain minimal assumptions, it follows that they exist; and that on his approach, we can see how and why propositions manage to have truth conditions and represent the world as being a certain way. The Nature and Structure of Content also contains a detailed account of the nature of tense and modality, and provides a solution to the paradox of analysis. Scholars and students working in the philosophy of mind and language will find this book rewarding reading.
The Philosophy of Philip Kitcher contains eleven chapters on the work of noted philosopher Philip Kitcher, whose work is known for its broad range and insightfulness. Topics covered include philosophy of science, philosophy of biology, philosophy of mathematics, ethics, epistemology, and philosophy of religion. Each of the chapters is followed by a reply from Kitcher himself. This first significant edited volume devoted to examining Kitcher's work is an essential reference for anyone interested in understanding this important philosopher.
In four closely interwoven studies, Jeffrey Alexander identifies the central dilemma that provokes contemporary social theory and proposes a new way to resolve it. The dream of reason that marked the previous fin de siècle foundered in the face of the cataclysms of the twentieth century, when war, revolution, and totalitarianism came to be seen as themselves products of reason. In response there emerged the profound skepticism about rationality that has so starkly defined the present fin de siècle. From Wittgenstein through Rorty and postmodernism, relativism rejects the very possibility of universal standards, while for both positivism and neo-Marxists like Bourdieu, reductionism claims that ideas simply reflect their social base. In a readable and spirited argument, Alexander develops the alternative of a "neo-modernist" position that defends reason from within a culturally centered perspective while remaining committed to the goal of explaining, not merely interpreting, contemporary social life. On the basis of a sweeping reinterpretation of postwar society and its intellectuals, he suggests that both antimodernist radicalism and postmodernist resignation are now in decline; a more democratic, less ethnocentric and more historically contingent universalizing social theory may thus emerge. Developing in his first two studies a historical approach to the problem of "absent reason," Alexander moves via a critique of Richard Rorty to construct his case for "present reason." Finally, focusing on the work of Pierre Bourdieu, he provokes the most sustained critical reflection yet on this influential thinker. Fin de Siècle Social Theory is a tonic intervention in contemporary debates, showing how social and cultural theory can properly take the measure of the extraordinary times in which we live.
In these two essays, two of the most important French thinkers of our time reflect on each other's work. In so doing, novelist/essayist Maurice Blanchot and philosopher Michel Foucault develop a new perspective on the relationship between subjectivity, fiction, and the will to truth. The two texts present reflections on writing, language, and representation which question the status of the author/subject and explore the notion of a "neutral" voice that arises from the realm of the "outside." This book is crucial not only to an understanding of these two thinkers, but also to any overview of recent French thought. Michel Foucault was the holder of a chair at the Collège de France. Among his works are Madness and Civilization, The Order of Things, Discipline and Punish, and The History of Sexuality. Maurice Blanchot, born in 1907, is a novelist and critic. His works include Death Sentence, Thomas the Obscure, and The Space of Literature.
Modus ponens (from A and “if A then C” infer C) is one of the most basic inference rules. The probabilistic modus ponens allows for managing uncertainty by transmitting assigned uncertainties from the premises to the conclusion (i.e., from P(A) and P(C|A) infer P(C)). In this paper, we generalize the probabilistic modus ponens by replacing A by the conditional event A|H. The resulting inference rule involves iterated conditionals (formalized by conditional random quantities) and propagates previsions from the premises to the conclusion. Interestingly, the propagation rules for the lower and the upper bounds on the conclusion of the generalized probabilistic modus ponens coincide with the respective bounds on the conclusion for the (non-nested) probabilistic modus ponens.
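The (non-nested) probabilistic modus ponens referenced above has well-known coherent bounds: from P(A) = x and P(C|A) = y, coherence constrains P(C) to the interval [xy, xy + 1 - x]. A minimal sketch (function and variable names are mine):

```python
def modus_ponens_bounds(p_a, p_c_given_a):
    """Coherent interval for P(C) given P(A) = p_a and P(C|A) = p_c_given_a.
    Lower bound: all of C's probability comes through A (P(C & not-A) = 0).
    Upper bound: in addition, everything outside A is a C-world."""
    lower = p_a * p_c_given_a
    upper = p_a * p_c_given_a + (1 - p_a)
    return lower, upper

lo, hi = modus_ponens_bounds(0.9, 0.8)
print(lo, hi)  # P(C) must lie in roughly [0.72, 0.82]

# With a certain premise P(A) = 1, the interval collapses to a point,
# recovering deductive modus ponens at the level of probabilities:
print(modus_ponens_bounds(1.0, 0.8))  # (0.8, 0.8)
```

The paper's result is that these same bounds survive when the categorical premise A is replaced by the conditional event A|H.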
Philosophy, science, and common sense all refer to propositions--things we believe and say, and things which are true or false. But there is no consensus on what sorts of things these entities are. Jeffrey C. King, Scott Soames, and Jeff Speaks argue that commitment to propositions is indispensable, and each defends his own view in the debate.
Karl Pfeifer attempts to present a coherent view of panentheism that eschews Pickwickian senses of “in” and aligns itself with, and builds upon, familiar diagrammed portrayals of panentheism. The account is accordingly spatial-locative and moreover accepts the proposal of R.T. Mullins that absolute space and time be regarded as attributes of God. In addition, however, it argues that a substantive parthood relation between the world and God is required. Pfeifer’s preferred version of panpsychism, viz. panintentionalism, is thrown into the mix as an optional add-on. On this account, God is conceived of as a “spiritual field” whose nature can be made more intelligible by regarding “God” as having a mass-noun sense in some contexts. Pfeifer closes with the suggestion that we look to topology and mereology for further development of the position outlined in his paper.
This book introduces Gottfried Wilhelm Leibniz's Principle of Optimality and argues that it plays a central role in his physics and philosophy, with profound implications for both. Each chapter begins with an introduction to one of Leibniz's ground-breaking studies in natural philosophy, paying special attention to the role of optimal form in those investigations. Each chapter then goes on to explore the philosophical implications of optimal form for Leibniz's broader philosophical system. Individual chapters include discussions of Leibniz's understanding of teleology, the nature of bodies, laws of nature, and free will. The final chapter explores the legacy of Leibniz's physics in light of his work on optimal form.
I show that David Lewis’s principal principle is not preserved under Jeffrey conditionalization. Using this observation, I argue that Lewis’s reason for rejecting the desire as belief thesis and Adams’s thesis applies also to his own principal principle. 1 Introduction 2 Adams’s Thesis, the Desire as Belief Thesis, and the Principal Principle 3 Jeffrey Conditionalization 4 The Principal Principle Is Not Preserved under Jeffrey Conditionalization 5 Inadmissible Experiences.
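For readers unfamiliar with the update rule at issue: Jeffrey conditionalization shifts probability across a partition {E_i} to new weights q_i while keeping probabilities within each cell proportional to the prior, so P_new(w) = q_i * P(w) / P(E_i) for w in E_i. A minimal sketch (world and cell names are illustrative):

```python
def jeffrey_update(prior, partition):
    """Jeffrey conditionalization over a finite set of worlds.
    prior: dict mapping world -> prior probability
    partition: list of (set_of_worlds, new_cell_probability) pairs
    covering all worlds, with the new probabilities summing to 1.
    Each cell keeps its internal proportions; cell totals move to the q_i."""
    posterior = {}
    for cell, q in partition:
        cell_prior = sum(prior[w] for w in cell)
        for w in cell:
            posterior[w] = q * prior[w] / cell_prior
    return posterior

prior = {"rain&cold": 0.25, "rain&warm": 0.25, "dry": 0.5}
# Experience makes "rain" (the first two worlds) 0.8-likely instead of 0.5:
post = jeffrey_update(prior, [({"rain&cold", "rain&warm"}, 0.8), ({"dry"}, 0.2)])
print(post)  # rain worlds move to 0.4 each, dry drops to 0.2
```

Setting one cell's new probability to 1 recovers ordinary conditionalization on that cell, so the rule strictly generalizes Bayesian updating; Lewis's principal principle concerns chance-credence alignment under such updates.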