This article develops a sociological theory of ambivalence to explain several puzzling and contradictory ethical attitudes of business people: (1) a disposition to behavior that is simultaneously more self-interested and more charitable than that of many other occupational groups, and (2) a moderate level of receptiveness to the inculcation of moral principles through social channels such as higher education. We test the theory by comparing the way that business students rate the ethical acceptability of various ethically challenging scenarios with the way that criminal justice students rate these same scenarios. We also explore the malleability of ethical views by measuring differences between the responses of sophomores and seniors. The data generally support hypotheses based on a theory of ambivalence. At the same time, however, we also report findings that suggest explanations other than ambivalence.
Alexius Meinong's specific use of the term "self-presentation" had a significant influence on modern epistemology and philosophical psychology. One of this paper's main objectives is to show that there are remarkable parallels between Meinong's account of the self-presentation of experiences and Lehrer's account of the exemplarization of experiences. Another objective is to put forward some comments and critical remarks on Lehrer's approach. One of the main problems can be put as follows: Lehrer calls the process of using a particular experience as a sample, that is, an exemplar that we use to stand for and refer to a plurality of experiences, "exemplarization". As concrete experiences are multifarious (red and round, for example), how can we single out a specific sort of experience (the red ones) by the process of exemplarization when we use such a multifarious experience as a sample?
Intentionality is a mark of the mental, as Brentano (1874) noted. Any representation or conception of anything has the feature of intentionality, which, informally put, is the feature of being about something that may or may not exist. Visual artworks are about something, whether literal or abstract. The artwork is a mentalized physical object. Aesthetic experience of the artwork illustrates the nature of intentionality as we focus attention on the phenomenology of the sensory exemplar. This focus of attention on the exemplar in aesthetic experience simultaneously exhibits what the intentional object is like and what our conception of it is like. The exemplar is Janus-faced, looking in one direction outward toward the objects conceived and in the other direction inward toward our conceiving of them. It shows us what intentionality is like and how we know it.
Nelson Goodman and, following him, Catherine Z. Elgin and Keith Lehrer have claimed that sometimes a sample is a symbol that stands for the property it is a sample of. The relation between the sample and the property it stands for is called 'exemplification' (Goodman, Elgin) or 'exemplarisation' (Lehrer). Goodman and Lehrer argue that the notion of exemplification sheds light on central problems in aesthetics and the philosophy of mind. However, while there seems to be a phenomenon to be captured, Goodman's account of exemplification has several flaws. In this paper I will offer an alternative account of exemplification that is inspired by Grice's idea that one can communicate something by providing one's audience with intention-independent evidence and letting them draw the obvious conclusion for themselves. This explication of exemplification will solve the problems that arose for Goodman's theory in the spirit of his approach.
This paper explores the scope and limits of rational consensus through mutual respect, with the primary focus on the best known formal model of consensus: the Lehrer–Wagner model. We consider various arguments against the rationality of the Lehrer–Wagner model as a model of consensus about factual matters. We conclude that models such as this face problems in achieving rational consensus on disagreements about unknown factual matters, but that they hold considerable promise as models of how to rationally resolve non-factual disagreements.
Group decisions raise a number of substantial philosophical and methodological issues. We focus on the goal of the group decision exercise itself. We ask: What should be counted as a good group decision-making result? The right decision might not be accessible to, or please, any of the group members. Conversely, a popular decision can fail to be the correct decision. In this paper we discuss what it means for a decision to be "right" and what components are required in a decision process to produce happy decision-makers. Importantly, we discuss how "right" decisions can produce happy decision-makers, or rather, the conditions under which happy decision-makers and right decisions coincide. In a large range of contexts, we argue for the adoption of formal consensus models to assist in the group decision-making process. In particular, we advocate the formal consensus convergence model of Lehrer and Wagner (1981), because a strong case can be made as to why the underlying algorithm produces a result that should make each of the experts in a group happy. Arguably, this model facilitates true consensus, where the group choice is effectively each person's individual choice. We analyse Lehrer and Wagner's algorithm for reaching consensus on group probabilities/utilities in the context of complex decision-making for conservation biology. While many conservation decisions are driven by a search for objective utility/probability distributions (regarding extinction risks of species and the like), other components of conservation management primarily concern the interests of stakeholders. We conclude with cautionary notes on mandating consensus in decision scenarios for which no fact of the matter exists. For such decision settings, alternative types of social choice methods are more appropriate.
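A worked sketch may help readers unfamiliar with the Lehrer–Wagner model discussed in the two abstracts above. On its standard reading, each expert assigns respect weights to every member of the group (including herself), and everyone's estimate is repeatedly replaced by the weighted average of the group's current estimates; when all weights are positive, the estimates converge to a single consensus value. The weights and initial estimates below are invented purely for illustration and are not taken from any work listed here.

```python
import numpy as np

# Illustrative respect weights: row i holds the weights expert i assigns
# to experts 1..3 (each row sums to 1). These numbers are made up.
W = np.array([
    [0.6,  0.3,  0.1],
    [0.2,  0.5,  0.3],
    [0.25, 0.25, 0.5],
])

# Illustrative initial probability estimates (e.g., of a species' extinction risk).
p = np.array([0.8, 0.4, 0.6])

# Iterated weighted averaging: in each round, every expert adopts the
# weighted average of the group's current estimates.
for _ in range(200):
    p = W @ p

print(p)  # All three entries are (numerically) equal: the consensus estimate.
```

Because the weight matrix here is positive and row-stochastic, repeated averaging amounts to computing the limit of W^n applied to the initial estimates, which is why each expert, weighting the others by her own lights, ends up endorsing the same group value.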
The epistemic basing relation is the relation that holds between a belief and a reason when the reason is the reason for which the belief is held. It marks the distinction between a belief's being justifiable for a person and the person's being justified in holding the belief. As such, it is an essential component of any complete theory of epistemic justification. I survey and evaluate all theories of the basing relation published between 1965 and 1995 of which I am aware. These include causal theories, theories involving pseudo-overdetermination relations, theories involving what a person would appeal to in defense of her beliefs, and doxastic theories involving an appeal to meta-beliefs. My discussion of these theories includes a detailed analysis of variations of Lehrer's case of the gypsy lawyer which, I show, can be reformulated to pose a decisive objection to causal theories, if not causal analyses, of the basing relation. Rejecting all published theories, I present a new kind of causal analysis of the basing relation which I call the causal-doxastic theory. This theory states that a belief is based on a reason if the reason bears an appropriate causal relation to the belief, or, failing such a causal relation, an appropriate meta-belief is present. A causal analysis of which meta-beliefs are appropriate is given, so as to count rationalizations, mistaken meta-beliefs, and the like as inappropriate. In developing the causal-doxastic theory, I present a solution to the problem of deviant causal chains, a discussion of the causal sustaining of beliefs, an account of rejecting reasons, and a partial analysis of showing that one is justified. I then discuss implications of my theory regarding foundationalism, inference, basic and non-basic belief, holistic and linear coherentism, process reliabilism, internalism and externalism, and various closure principles.
Keith Lehrer has been publishing on free will and compatibilism since 1960. Our concern here is to present an account of the development of his work on the subject.
In the philosophical tradition marked by Descartes and empiricism, epistemic justification was most often understood in terms of construction on foundations consisting of immediately justified starting points. The article presents a completely different approach to the question, due to the philosopher Keith Lehrer. On this approach, epistemic justification derives from a coherence relationship among beliefs, which are never immediately justified starting points. What is then decisive for the justification of a belief is to discard or neutralize all the objections that can be raised against it. Using an example, the article presents this approach to epistemic justification and brings out a difficulty that it encounters.
In this important new text, Keith Lehrer introduces students to the major traditional and contemporary accounts of knowing. Beginning with the accepted definition of knowledge as justified true belief, Lehrer explores the truth, belief and justification conditions on the way to a thorough examination of foundation theories of knowledge, externalism and naturalized epistemologies, internalism and modern coherence theories as well as recent reliabilist and causal theories. Lehrer gives all views careful examination and concludes that external factors must be matched by appropriate internal ones to yield knowledge. Readers of Professor Lehrer's earlier book _Knowledge_ will want to know that this text adopts the framework of that classic text. But _Theory of Knowledge_ is a completely rewritten and updated version of that book that has been simplified throughout for student use.
Special issue. With contributions by Luc Bovens and Stephan Hartmann, David Glass, Keith Lehrer, Erik Olsson, Tomoji Shogenji, Mark Siebel, and Paul Thagard.
Kants Theorie des reinen Geschmacksurteils. By Christel Fricke. Verlag Walter de Gruyter, 1990 (Quellen und Studien zur Philosophie, 26). Pp. 196. ISBN 3-11-012585-4. DM98.00. The Ontology of Physical Objects. By Mark Heller. Cambridge University Press, 1990. Pp. iv + 162. ISBN 0-521-38544-X. £25.00. Theory of Knowledge. By Keith Lehrer. Routledge, 1990. Pp. xii + 212. ISBN 0-415-05407-9. £30.00 hbk. £9.99 pbk. Disciplining Foucault: Feminism, Power and the Body. By Jana Sawicki. Routledge, 1991. Pp. xii + 130. ISBN 0-415-90187-1. £30.00 hbk. £8.99 pbk.
Volume Four, as indicated by the anthology's subtitle, is in honor of Simone de Beauvoir (1908-1986) and Martin Heidegger (1889-1976). The chapters do not necessarily mention Simone de Beauvoir or Martin Heidegger. The 16 chapters (by professional philosophers and other professional scholars) address issues related to death, life extension, and anti-death. Most of the 400-plus pages consist of scholarship unique to this volume. Includes index. The titles of the 16 chapters are as follows:
1. Mechanism, Galileo’s Animale And Heidegger’s Gestell: Reflections On The Lifelessness Of Modern Science by Giorgio Baruchello
2. Simone De Beauvoir by Debra Bergoffen
3. Existentialism by Steven Crowell
4. Time Wounds All Heels by William Grey
5. The Ethical Importance Of Death by Jenann Ismael
6. The Poetics Of Death: Intimations And Illusions by Lawrence Kimmel
7. Death And Aesthetics by Keith Lehrer
8. Ageing And Existentialism: Simone De Beauvoir And The Limits Of Freedom by Shannon M. Mussett
9. Life Extension And Meaning by Carol O’Brien
10. Consciousness As Computation: A Defense Of Strong AI Based On Quantum-State Functionalism by R. Michael Perry
11. Reality Shifts: On The Death And Dying Of Dr. Timothy Leary by Carol Sue Rosin
12. Extraterrestrial Liberty And The Great Transmutation by Charles Tandy
13. A Time Travel Schema And Eight Types Of Time Travel by Charles Tandy
14. Boredom, Experimental Ethics, And Superlongevity by Mark Walker
15. Exopolitics: The Death Of Death by Alfred Lambremont Webre
16. Embryo Cloning: Current State Of The Medical Art And Its Far-Reaching Consequences For Multiple Applications by Panayiotis M. Zavos
According to the thesis of the extended mind (EM), at least some token cognitive processes extend into the cognizing subject's environment in the sense that they are (partly) composed of manipulative, exploitative, and transformative operations performed by that subject on suitable environmental structures. EM has attracted four ostensibly distinct types of objection. This paper has two goals. First, it argues that these objections all reduce to one basic sort: all the objections can be resolved by the provision of an adequate and properly motivated criterion—or mark—of the cognitive. Second, it provides such a criterion—one made up of four conditions that are sufficient for a process to count as cognitive.
Is Bayesian decision theory a panacea for many of the problems in epistemology and the philosophy of science, or is it philosophical snake-oil? For years a debate has been waged amongst specialists regarding the import and legitimacy of this body of theory. Mark Kaplan has written the first accessible and non-technical book to address this controversy. Introducing a new variant on Bayesian decision theory, the author offers a compelling case that, while no panacea, decision theory does in fact have the most profound consequences for the way in which philosophers think about inquiry, criticism and rational belief. The new variant on Bayesian theory is presented in such a way that a non-specialist will be able to understand it. The book also offers new solutions to some classic paradoxes. It focuses on the intuitive motivations of the Bayesian approach to epistemology and addresses the philosophical worries to which it has given rise.
In this paper, we review Keith Lehrer’s account of the basing relation, with particular attention to the two cases he offered in support of his theory: Raco (Lehrer, Theory of Knowledge, 1990; Theory of Knowledge, 2nd ed., 2000) and the earlier case of the superstitious lawyer (Lehrer, The Journal of Philosophy, 68, 311–313, 1971). We show that Lehrer’s examples succeed in making his case that beliefs need not be based on the evidence in order to be justified. These cases show that it is the justification (rather than the belief) that must be based on the evidence. We compare Lehrer’s account of basing with some alternative accounts that have been offered, and show why Lehrer’s own account is more plausible.
Mark Olssen is one of the world's leading social scientists writing today. Inspired by the writings of Michel Foucault, Olssen’s work traverses philosophy, politics, education, and epistemology. This book comprises a selection of his papers published in academic journals and books over thirty-five years.
In _A Mark of the Mental_, Karen Neander considers the representational power of mental states—described by the cognitive scientist Zenon Pylyshyn as the “second hardest puzzle” of philosophy of mind. Drawing on insights from causal theories of reference, teleosemantics, and state space semantics, she develops a theory of naturalized mental representation. The puzzle at the heart of the book is sometimes called “the problem of mental content,” “Brentano's problem,” or “the problem of intentionality.” Its motivating mystery is how neurobiological states can have semantic properties such as meaning or reference. Neander proposes a naturalistic account of sensory-perceptual representations that draws on state-space semantics, causal theories of reference, and teleosemantic theories. She proposes and defends an intuitive, theoretically well-motivated but highly controversial thesis: sensory-perceptual systems have the function of producing inner state changes that are the analogs of, as well as caused by, their referents. Neander shows that the three main elements—functions, causal-information relations, and relations of second-order similarity—complement rather than conflict with each other. After developing an argument for teleosemantics by examining the nature of explanation in the mind and brain sciences, she develops a theory of mental content and defends it against six main content-determinacy challenges to a naturalized semantics.
In _Buddhism As Philosophy_, Mark Siderits makes the Buddhist philosophical tradition accessible to a Western audience. Offering generous selections from the canonical Buddhist texts and providing an engaging, analytical introduction to the fundamental tenets of Buddhist thought, this revised, expanded, and updated edition builds on the success of the first edition in clarifying the basic concepts and arguments of the Buddhist philosophers.
Mark Balaguer’s project in this book is extremely ambitious; he sets out to defend both platonism and fictionalism about mathematical entities. Moreover, Balaguer argues that at the end of the day, platonism and fictionalism are on an equal footing. Not content to leave the matter there, however, he advances the anti-metaphysical conclusion that there is no fact of the matter about the existence of mathematical objects. Despite the ambitious nature of this project, for the most part Balaguer does not shortchange the reader on rigor; all the main theses advanced are argued for at length and with remarkable clarity and cogency. There are, of course, gaps in the account, but these should not be allowed to overshadow the sig-.
The monograph explains how knowledge requires the capacity to justify or defend the target claim of knowledge. Defensibility is based on a background system. Lehrer argues that reflection on experience yields a self-referential exemplar representation. This is the novel contribution of his new book to truth about the perceptual world.
Keith Lehrer has put forward an argument for skepticism which trades on the possibility that a group of creatures in another galaxy (Googols) may be rendering our beliefs about reality largely false (this is ‘Lehrer’s Skeptical Hypothesis’). Since there are no arguments against the Lehrer-Googol hypothesis, it cannot be rejected as unjustified. But since we can be completely justified in believing that p only when hypotheses which conflict with our belief are unjustified, we cannot be completely justified in believing that p. Hence, we can never know that p. I argue that Lehrer’s argument fails in so far as believers on both sides of a question may be completely justified in their beliefs. Since this is so, one can be completely justified in believing p, and thereby know that p, even when an opposing view is itself completely justified.
I will compare Lehrer’s anti-skeptical strategy from a coherentist point of view with the anti-skeptical strategy of the Mooreans. I will argue that there are strong similarities between them: neither can present a persuasive argument to the skeptic and both face the problem of easy knowledge in one way or another. However, both can offer a complete and self-explanatory explanation of knowledge although Mooreanism can offer the more natural one. Hence, one has good reasons to prefer Mooreanism to Lehrer’s anti-skeptical approach, if one does not prefer coherentism to foundationalism for other reasons.
The aim of this series is to inform both professional philosophers and a larger readership about what is going on, who's who, and who does what in contemporary philosophy and logic. PROFILES is designed to present the research activity and the results of already outstanding personalities and schools and of newly emerging ones in the various fields of philosophy and logic. There are many Festschrift volumes dedicated to various philosophers. There is the celebrated Library of Living Philosophers edited by P. A. Schilpp, whose format influenced the present enterprise. Still, they can only cover very little of the contemporary philosophical scene. Faced with a tremendous expansion of philosophical information and with an almost frightening division of labor and increasing specialization, we need systematic and regular ways of keeping track of what happens in the profession. PROFILES is intended to perform such a function. Each volume is devoted to one or several philosophers whose views and results are presented and discussed. The profiled philosopher will summarize and review his own work in the main fields of significant contribution. This work will be discussed and evaluated by invited contributors. Relevant historical and/or biographical data, an up-to-date bibliography with short abstracts of the most important works and, whenever possible, references to significant reviews and discussions will also be included.
In _The Meaning of the Body_, Mark Johnson continues his pioneering work on the exciting connections between cognitive science, language, and meaning first begun in the classic _Metaphors We Live By_. Johnson uses recent research into infant psychology to show how the body generates meaning even before self-consciousness has fully developed. From there he turns to cognitive neuroscience to further explore the bodily origins of meaning, thought, and language and examines the many dimensions of meaning—including images, qualities, emotions, and metaphors—that are all rooted in the body’s physical encounters with the world. Drawing on the psychology of art and pragmatist philosophy, Johnson argues that all of these aspects of meaning-making are fundamentally aesthetic. He concludes that the arts are the culmination of human attempts to find meaning and that studying the aesthetic dimensions of our experience is crucial to unlocking meaning's bodily sources. Throughout, Johnson puts forth a bold new conception of the mind rooted in the understanding that philosophy will matter to nonphilosophers only if it is built on a visceral connection to the world. “Mark Johnson demonstrates that the aesthetic and emotional aspects of meaning are fundamental—central to conceptual meaning and reason, and that the arts show meaning-making in its fullest realization. If you were raised with the idea that art and emotion were external to ideas and reason, you must read this book. It grounds philosophy in our most visceral experience.”—George Lakoff, author of _Moral Politics_.
Mark Jago presents and defends a novel theory of what truth is, in terms of the metaphysical notion of truthmaking. This is the relation which holds between a truth and some entity in the world, in virtue of which that truth is true. By coming to an understanding of this relation, he argues, we gain better insight into the metaphysics of truth. The first part of the book discusses the property being true, and how we should understand it in terms of truthmaking. The second part focuses on truthmakers, the worldly entities which make various kinds of truths true, and how they do so. Jago argues for a metaphysics of states of affairs, which account for things having properties and standing in relations. The third part analyses the logic and metaphysics of the truthmaking relation itself, and links it to the metaphysical concept of grounding. The final part discusses consequences of the theory for language and logic. Jago shows how the theory delivers a novel and useful theory of propositions, the entities which are true or false, depending on how things are. A notable feature of this approach is that it avoids the Liar paradox and other puzzling paradoxes of truth.
Mark Richard presents an original theory of meaning, on which an expression's meaning is the collection of assumptions speakers make in using it and expect their hearers to recognize as being made. Meaning is spread across a population, inherited by each new generation of speakers from the last, and evolving through the interactions of speakers with their environment.
Mark Sainsbury presents an original account of how language works when describing mental states, based on a new theory of what is involved in attributing attitudes like thinking, hoping, and wanting. He offers solutions to longstanding puzzles about how we can direct our thought to such a diversity of things, including things that do not exist.