We introduce the concept of fraud tolerance, validate the conceptualization using prior studies in economics and criminology as well as our own independent tests, and explore the relationship of fraud tolerance with numerous cultural attributes using data from the World Values Survey. Applying partial least squares path modeling, we find that people with stronger self-enhancing values exhibit higher fraud tolerance. Further, respondents who believe in the importance of hard work exhibit lower fraud tolerance, and such beliefs mediate the relationship between locus of control and fraud tolerance. Finally, we find that people prone to traditional gender stereotypes demonstrate higher fraud tolerance, and we document subtle differences in the influence of these cultural attributes across age, religiosity, and gender groups. Our study contributes to research on corporate governance, ethics, and the antecedents of workplace dishonesty.
Predictive Sentencing addresses the role of risk assessment in contemporary sentencing practices. Predictive sentencing has become so deeply ingrained in Western criminal justice decision-making that despite early ethical discussions about selective incapacitation, it currently attracts little critique. Nor has it been subjected to a thorough normative and empirical scrutiny. This is problematic since much current policy and practice concerning risk predictions is inconsistent with mainstream theories of punishment. Moreover, predictive sentencing exacerbates discrimination and disparity in sentencing. Although structured risk assessments may have replaced 'gut feelings', and have now been systematically implemented in Western justice systems, the fundamental issues and questions that surround the use of risk assessment instruments at sentencing remain unresolved. This volume critically evaluates these issues and will be of great interest to scholars of criminal justice and criminology.
A significant feature of John Stuart Mill's moral theory is the introduction of qualitative differences as relevant to the comparative value of pleasures. Despite its significance, Mill presents his doctrine of qualities of pleasures in only a few paragraphs in the second chapter of Utilitarianism, where he begins the brief discussion by saying: utilitarian writers in general have placed the superiority of mental over bodily pleasures chiefly … in their circumstantial advantages rather than in their intrinsic nature.… [B]ut they might have taken the … higher ground with entire consistency. It is quite compatible with the principle of utility to recognize the fact, that some kinds of pleasure are more desirable and more valuable than others. It would be absurd that while, in estimating all other things, quality is considered as well as quantity, the estimation of pleasures should be supposed to depend on quantity alone.
In contradistinction to the many monographs and edited volumes devoted to historical, cultural, or theological treatments of demonology, this collection features newly written papers by philosophers and other scholars engaged specifically in philosophical argument, debate, and dialogue involving ideas and topics in demonology. The contributors to the volume approach the subject from the perspective of the broadest areas of Western philosophy, namely metaphysics, epistemology, logic, and moral philosophy. The collection also features a plurality of religious, cultural, and theological views on the nature of demons from both Eastern and Western thought, in addition to views that may diverge from these traditional roots. _Philosophical Approaches to Demonology_ will be of interest to philosophers of religion, theologians, and scholars working in philosophical theology and demonology, as well as historians, cultural anthropologists, and sociologists interested more broadly in the concept of demons.
Robert Batterman examines a form of scientific reasoning called asymptotic reasoning, arguing that it has important consequences for our understanding of the scientific process as a whole. He maintains that asymptotic reasoning is essential for explaining what physicists call universal behavior. With clarity and rigor, he simplifies complex questions about universal behavior, demonstrating a profound understanding of the underlying structures that ground them. This book introduces a valuable new method that is certain to fill explanatory gaps across disciplines.
In 1935, as the Nazis’ state-of-the-art eugenics exhibition from the Deutsches Hygiene Museum was concluding its American tour, a decision had to be made about whether to return the displays to Germany or to house them in an American museum. After the American Academy of Medicine decided against the display because of its political implications, the director of the Buffalo Museum of Science, Carlos Cummings, himself a physician, offered his institution as the exhibition's permanent home. “What is the astounding eugenics program upon which Chancellor Hitler has launched the German people?” Cummings wondered aloud. “As a matter of public interest, without endorsement,” he added, “the Museum will display in the Central Hall throughout this final quarter of 1935, a set of fifty-one posters and charts . . . which gives Americans a graphic explanation of Germany's campaign to rear in posterity ‘a new race nobility.’” Seven years later, with war raging, the museum received permission from the company that had insured the exhibition to dismantle it from its permanent home in the museum's Hall of Heredity. An exhibition about eugenics, Nazi eugenics no less, that had been enthusiastically received as it traveled the United States in the mid-1930s had seemingly fallen victim to the war against eugenics launched by cultural anthropologists and geneticists. In light of the broad scholarship on eugenics, this certainly would be a plausible reading of the deinstallation of the Nazi eugenics exhibition. But the three books under review here suggest a more complex reading, one in which eugenics and racism, considered as ideological systems, proved less easily dislodged from American culture than from Buffalo's Museum of Science.
Taste-aversion learning has been a popular paradigm for examining associative processes because it often produces outcomes that are different from those observed in other classical conditioning paradigms. One such outcome is taste-mediated odor potentiation, in which aversion conditioning with a weak odor and a strong taste results in increased or synergistic conditioning to the odor. Because this strengthened odor aversion was not anticipated by formal models of learning, investigation of taste-mediated odor potentiation was a hot topic in the 1980s. The present manuscript reviews the history of potentiation research, with particular focus given to the stimuli that produce potentiation, the conditions that produce potentiation, the possible mechanism of this phenomenon, and possible reasons for the decline of research in this area. Although the number of published reports of potentiation has decreased since the 1980s, recent physiological and behavioral assessments have advanced the field considerably, and the opportunities for future research are bountiful. Recent physiological experiments, for example, have identified the basolateral nucleus of the amygdala as the key brain region to produce taste-mediated odor potentiation (e.g., Hatfield and Gallagher, 1995). Also, recent behavioral experiments have extended the generality of synergistic conditioning effects. Studies have shown that odor can potentiate responding to taste (Slotnick, Westbrook and Darling, 1997) and that augmented responding can be produced in the A+/AX+ blocking design (e.g., Batsell and Batson, 1999). With the current understanding of where synergistic conditioning may occur in the brain and the new tools to explore synergistic conditioning, we propose various directions for future research to determine whether taste-aversion learning and synergistic conditioning require unique explanations.
This article discusses minimal model explanations, which we argue are distinct from the causal, mechanical, difference-making, and similar strategies prominent in the philosophical literature. We contend that what accounts for the explanatory power of these models is not that they have certain features in common with real systems. Rather, the models are explanatory because of a story about why a class of systems will all display the same large-scale behavior: the details that distinguish them are irrelevant. This story explains patterns across extremely diverse systems and shows how minimal models can be used to understand real systems.
Do animals know that other creatures have minds? And how would we know if they do? In "Mindreading Animals," Robert Lurz offers a fresh approach to the hotly debated question of mental-state attribution in nonhuman animals.
This paper examines contemporary attempts to explicate the explanatory role of mathematics in the physical sciences. Most such approaches involve developing so-called mapping accounts of the relationships between the physical world and mathematical structures. The paper argues that the use of idealizations in physical theorizing poses serious difficulties for such mapping accounts. A new approach to the applicability of mathematics is proposed.
This paper examines the role of mathematical idealization in describing and explaining various features of the world. It examines two cases: first, briefly, the modeling of shock formation using the idealization of the continuum. Second, and in more detail, the breaking of droplets from the points of view of both analytic fluid mechanics and molecular dynamical simulations at the nano-level. It argues that the continuum idealizations are explanatorily ineliminable and that a full understanding of certain physical phenomena cannot be obtained through completely detailed, non-idealized representations.
Patricia Williams made a number of claims concerning the methods and practice of cladistic analysis and classification. Her argument rests upon the distinction of two kinds of hierarchy: a divisional hierarchy depicting evolutionary descent and the Linnean hierarchy describing taxonomic groups in a classification. Williams goes on to outline five problems with cladistics that lead her to the conclusion that systematists should eliminate cladism as a school of biological taxonomy and replace it either with something that is philosophically coherent or with pure methodology, untainted by theory (Williams 1992, 151). Williams makes a number of points that she feels collectively add up to insurmountable problems for cladistics. We examine Williams' views concerning the two hierarchies and consider what cladists currently understand about the status of ancestors. We will demonstrate that Williams has seriously misunderstood many modern commentators on this subject and that all of her five persistent problems are derivable from this misunderstanding. "Some persons believe and argue, on grounds approaching faith it seems to me, that phylogeny comes from our knowledge of evolution. Others have found to their surprise, and sometimes dismay, that phylogeny comes from our knowledge of systematics." Nelson (1989, 67).
This paper aims to draw attention to an explanatory problem posed by the existence of multiply realized or universal behavior exhibited by certain physical systems. The problem is to explain how it is possible that systems radically distinct at lower scales can nevertheless exhibit identical or nearly identical behavior at upper scales. Theoretically this is reflected by the fact that continuum theories such as fluid mechanics are spectacularly successful at predicting, describing, and explaining fluid behaviors despite the fact that they do not recognize the discrete nature of fluids. A standard attempt to reduce one theory to another is shown to fail to answer the appropriate question about autonomy.
Record of papers given at a symposium held at the University of Texas at Austin, April 1967; includes: C.J. Fillmore - The case for case; E. Bach - Nouns and noun phrases; J.D. McCawley - The role of semantics in a grammar; P. Kiparsky - Linguistic universals and linguistic change.
This paper looks at emergence in physical theories and argues that an appropriate way to understand so-called “emergent protectorates” is via the explanatory apparatus of the renormalization group. It is argued that mathematical singularities play a crucial role in our understanding of at least some well-defined emergent features of the world.
Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternative explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternative explanations for both the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across sub-disciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternative models might be empirically distinguished.