The idea of an "inversion principle", and the name itself, originated in the work of Paul Lorenzen in the 1950s, as a method to generate new admissible rules within a certain syntactic context. Some fifteen years later, the idea was taken up by Dag Prawitz to devise a strategy of normalization for natural deduction calculi (this being an analogue of Gentzen's cut-elimination theorem for sequent calculi). Later, Prawitz used the inversion principle again, attributing to it a semantic role. Still working in natural deduction calculi, he formulated a general type of schematic introduction rules to be matched, thanks to the idea supporting the inversion principle, by a corresponding general schematic elimination rule. This was an attempt to provide a solution to the problem suggested by the often quoted note of Gentzen. According to Gentzen, "it should be possible to display the elimination rules as unique functions of the corresponding introduction rules on the basis of certain requirements". Many people have since worked on this topic, which can be appropriately seen as the birthplace of what are now referred to as "general elimination rules", recently studied thoroughly by Sara Negri and Jan von Plato. In this study, we retrace the main threads of this chapter of proof-theoretical investigation, using Lorenzen's original framework as a general guide.
In The Varieties of Reference, Gareth Evans argues that the content of perceptual experience is nonconceptual, in a sense I shall explain momentarily. More recently, in his book Mind and World, John McDowell has argued that the reasons Evans gives for this claim are not compelling and, moreover, that Evans’s view is a version of “the Myth of the Given”. More precisely, Evans’s view is alleged to suffer from the same sorts of problems that plague sense-datum theories of perception. In particular, McDowell argues that perceptual experience must be within “the space of reasons,” that perception must be able to give us reasons for, that is, to justify, our beliefs about the world. And, according to him, no state that does not have conceptual content can be a reason for a belief. Now, there are many ways in which Evans’s basic idea, that perceptual content is nonconceptual, might be developed; some of these, I shall argue, would be vulnerable to the objections McDowell brings against him. But I shall also argue that there is a way of developing it that is not vulnerable to these objections.
Charles Griswold has written a comprehensive philosophical study of Smith's moral and political thought. Griswold sets Smith's work in the context of the Enlightenment and relates it to current discussions in moral and political philosophy. Smith's appropriation as well as criticism of ancient philosophy, and his carefully balanced defence of a liberal and humane moral and political outlook, are also explored. This 1999 book is a major philosophical and historical reassessment of a key figure in the Enlightenment that will be of particular interest to philosophers and political and legal theorists, as well as historians of ideas, rhetoric, and political economy.
Measurement is fundamental to all the sciences, the behavioural and social as well as the physical, and in the latter its results provide our paradigms of 'objective fact'. But the basis and justification of measurement is not well understood and is often simply taken for granted. Henry Kyburg Jr proposes here an original, carefully worked out theory of the foundations of measurement, to show how quantities can be defined, why certain mathematical structures are appropriate to them and what meaning attaches to the results generated. Crucial to his approach is the notion of error: it cannot be eliminated entirely, and from its introduction and control, he argues, arises the very possibility of measurement. Professor Kyburg's approach emphasises the empirical process of making measurements. In developing it he discusses vital questions concerning the general connection between a scientific theory and the results which support it.
Cognitive theories of metaphor understanding are typically described in terms of the mappings between different kinds of abstract, schematic, disembodied knowledge. My claim in this paper is that part of our ability to make sense of metaphorical language, both individual utterances and extended narratives, resides in the automatic construction of a simulation whereby we imagine performing the bodily actions referred to in the language. Thus, understanding metaphorical expressions like ‘grasp a concept’ or ‘get over’ an emotion involves simulating what it must be like to engage in these specific activities, even though these actions are, strictly speaking, impossible to physically perform. This process of building a simulation, one that is fundamentally embodied in being constrained by past and present bodily experiences, has specific consequences for how verbal metaphors are understood, and how cognitive scientists, more generally, characterize the nature of metaphorical language and thought.
Can there be knowledge and rational belief in the absence of a rational degree of confidence? Yes, and cases of "mistuned knowledge" demonstrate this. In this paper we leverage this normative possibility in support of advancing our understanding of the metaphysical relation between belief and credence. It is generally assumed that a Lockean metaphysics of belief that reduces outright belief to degrees of confidence would immediately effect a unification of coarse-grained epistemology of belief with fine-grained epistemology of confidence. Scott Sturgeon has suggested that the unification is effected by understanding the relation between outright belief and confidence as an instance of the determinable-determinate relation. But determination of belief by confidence would not by itself yield the result that norms for confidence carry over to norms for outright belief unless belief and high confidence are token identical. We argue that this token-identity thesis is incompatible with the neglected phenomenon of “mistuned knowledge”—knowledge and rational belief in the absence of rational confidence. We contend that there are genuine cases of mistuned knowledge and that, therefore, epistemological unification must forego token identity of belief and high confidence. We show how partial epistemological unification can be secured given determination of outright belief by degrees of confidence even without token identity. Finally, we suggest a direction for the pursuit of thoroughgoing epistemological unification.
Originally published by Routledge in 1988, this pioneering collection of essays now features a new preface and updated bibliography by the editor, reflecting the most significant developments in Plato scholarship during the past decade.
John Etchemendy has argued that it is but "a fortuitous accident" that Tarski's work on truth has any significance at all for semantics. I argue, in response, that Etchemendy and others, such as Scott Soames and Hilary Putnam, have been misled by Tarski's emphasis on definitions of truth rather than theories of truth and that, once we appreciate how Tarski understood the relation between these, we can answer Etchemendy's implicit and explicit criticisms of neo-Davidsonian semantics.
I said that the book is brilliant. This is not so much because of the conclusions eventually reached about the inadequacy of a purely naturalistic approach to mind. These conclusions are already familiar in the work of Donald Davidson and others. Rather, it is because of the accumulation of historical detail and insight on the basis of which these conclusions are reached. It is often said, for instance, that Kant is a watershed figure, in some sense synthesizing and then moving beyond both empiricism and naturalism. Hatfield uses these terms. But he reinvests them with meaning and energy and in the process shows how radical Kant was in refocusing, by way of a sharp distinction between empirical and transcendental inquiries, the central questions of philosophy.
Christian tradition has largely held three affirmations on the resurrection of the physical body. Firstly, that bodily resurrection is not a superfluous hope of afterlife. Secondly, that there is immediate post-mortem existence in Paradise. Finally, that there is numerical identity between pre-mortem and post-resurrection human beings. The same tradition also largely adheres to a robust doctrine of the Intermediate State, a paradisiacal disembodied state of existence following the biological death of a human being. This book argues that these positions are in fact internally inconsistent, and so a new metaphysics for life after death is required.
This study tests the usefulness of a person-situation interactionist framework in examining the willingness of a salesperson to lie to get an order. Using a survey of 389 salespersons, our results demonstrate that organizational relationships influence willingness to lie. Specifically, salespersons are less willing to lie to their own company than to a customer, less willing to lie to a customer than to a channel partner, and less willing to lie to a channel partner than to a competitor firm. Furthermore, respondents from firms with a clear and positive ethical climate are less willing to lie. Finally, our study finds that interactions between personality factors, such as high Machiavellianism and high self-monitoring, and situational factors have an impact on willingness to lie. Our results suggest that firms can take steps to influence employee ethical behavior.
Merleau-Ponty's phenomenology of the intentional arc uniting body and world is viewed as grounded in the meaningfulness and materiality of both. The genetic constitution of the interrelated meaning and physicality of body and world is sketched in a phenomenological interpretation of Jean Piaget's "The Origin of Intelligence in Children". From this sketch emerges an assertion of the priority of action over perception in prepredicative experience.
This book defends the fundamental place of the marital family in modern liberal societies. While applauding modern sexual freedoms, John Witte, Jr. also defends the traditional Western teaching that the marital family is an essential cradle of conscience, chrysalis of care, and cornerstone of ordered liberty. He thus urges churches, states, and other social institutions to protect and promote the marital family. He encourages reticent churches to embrace the rights of women and children, as Christians have long taught, and encourages modern states to promote responsible sexual freedom and family relations, as liberals have long said. He counsels modern churches and states to share in family law governance, and to resist recent efforts to privatize, abolish, or radically expand the marital family sphere. Witte also invites fellow citizens to end their bitter battles over same-sex marriage and tend to the vast family field that urgently needs concerted attention and action.
Leading legal scholar John Witte, Jr. explores the role religion played in the development of rights in the Western legal tradition and traces the complex interplay between human rights and religious freedom norms in modern domestic and international law. He examines how US courts are moving towards greater religious freedom, while recent decisions of the pan-European courts in Strasbourg and Luxembourg have harmed new religious minorities and threatened old religious traditions in Europe. Witte argues that the robust promotion and protection of religious freedom is the best way to protect many other fundamental rights today, even though religious freedom and other fundamental rights sometimes clash and need judicious balancing. He also responds to various modern critics who see human rights as a betrayal of Christianity and religious freedom as a betrayal of human rights.
The authors present a critical edition of the Quaestio de formalitatibus of John Duns Scotus. In the introduction to their edition, they examine the manuscripts, along with the external and internal evidence, to determine the authorship, place and date of the question. They conclude that the Quaestio was disputed by John Duns Scotus at Paris in the Franciscan studium sometime between 1305 and 1307. Chronologically, Scotus’ Quaestio, disputed at Paris, would seem to be his final, magisterial word on the subject of the formal distinction. Finally, the authors examine the transmission of the text in each of the manuscripts in order to establish a stemma codicum and the principles that govern their edition.
Philosophical defenses of cognitive/evolutionary psychological accounts of racialism claim that classification based on phenotypical features of humans was common historically and is evidence for a species-typical, cognitive mechanism for essentializing. They conclude that social constructionist accounts of racialism must be supplemented by cognitive/evolutionary psychology. This article argues that phenotypical classifications were uncommon historically until such classifications were socially constructed. Moreover, some philosophers equivocate between two different meanings of “racial thinking.” The article concludes that social constructionist accounts are far more robust than psychological accounts for the origins of racialism.