Is there a universal set of rules for discovering and testing scientific hypotheses? Since the birth of modern science, philosophers, scientists, and other thinkers have wrestled with this fundamental question of scientific practice. Efforts to devise rigorous methods for obtaining scientific knowledge include the twenty-one rules Descartes proposed in his Rules for the Direction of the Mind and the four rules of reasoning that begin the third book of Newton's Principia, and continue today in debates over the very possibility of such rules. Bringing together key primary sources spanning almost four centuries, Science Rules introduces readers to scientific methods that have played a prominent role in the history of scientific practice. Editor Peter Achinstein includes works by scientists and philosophers of science to offer a new perspective on the nature of scientific reasoning. For each of the methods discussed, he presents the original formulation of the method (selections written by a proponent of the method), together with an application to a particular scientific example and a critical analysis of the method that draws on historical and contemporary sources. The methods included in this volume are: Cartesian rationalism, with an application to Descartes' laws of motion; Newton's inductivism and the law of gravity; two versions of hypothetico-deductivism, those of William Whewell and Karl Popper, and the nineteenth-century wave theory of light; and Paul Feyerabend's principle of proliferation and Thomas Kuhn's views on scientific values, both of which deny that there are universal rules of method, with an application to Galileo's tower argument.
Also included are a famous nineteenth-century debate about scientific reasoning between the hypothetico-deductivist William Whewell and the inductivist John Stuart Mill, and an account of the realism-antirealism dispute about unobservables in science, with a consideration of Perrin's argument for the existence of molecules in the early twentieth century.
Can there be rules of language which serve both to determine meaning and to guide speakers in ordinary linguistic usage, i.e., in the production of speech acts? We argue that the answer is no. We take the guiding function of rules to be the function of serving as reasons for actions, and the question of guidance is then considered within the framework of practical reasoning. It turns out that those rules that can serve as reasons for linguistic utterances cannot be considered as normative or meaning determining. Acceptance of such a rule is simply equivalent to a belief about meaning, and does not even presuppose that meaning is determined by rules. Rules that can determine meaning, on the other hand, i.e., rules that can be regarded as constitutive of meaning, are not capable of guiding speakers in the ordinary performance of speech acts.
The paper explores the concept of a Narrative, which is defined as a connected structure on a set of constrained actions and forbearances. Explanation via Narratives is compared with explanation through variable-centred methodology, and an interpretation of correlations in terms of Narratives is also outlined.
This book offers a systematic and critical discussion of Peter Winch's writings on the philosophy of the social sciences. The author points to Winch's tendency to over-emphasize the importance of language and communication, and his insufficient attention to the role of practical, technological activities in human life and society. It also offers an appendix devoted to the controversy between the anthropologists Marshall Sahlins and Gananath Obeyesekere regarding Captain James Cook's Hawaiian adventures. Essential reading for those studying the development of philosophy in the twentieth century, this book will also be of great interest to anthropologists, sociologists, scholars of religion, and all those with an interest in the relationship between philosophy and the social sciences.
The thesis that, in a system of natural deduction, the meaning of a logical constant is given by some or all of its introduction and elimination rules has been developed recently in the work of Dummett, Prawitz, Tennant, and others, by the addition of harmony constraints. Introduction and elimination rules for a logical constant must be in harmony. By deploying harmony constraints, these authors have arrived at logics no stronger than intuitionist propositional logic. Classical logic, they maintain, cannot be justified from this proof-theoretic perspective. This paper argues that, while classical logic can be formulated so as to satisfy a number of harmony constraints, the meanings of the standard logical constants cannot all be given by their introduction and/or elimination rules; negation, in particular, comes under close scrutiny.
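A standard illustration of why harmony constraints matter is Prior's connective "tonk" (not discussed in this abstract; added here only as background). It pairs a disjunction-style introduction rule with a conjunction-style elimination rule:

```latex
% Prior's "tonk": introduction borrowed from disjunction,
% elimination borrowed from conjunction.
\[
\frac{A}{A \ \mathrm{tonk}\ B}\ (\mathrm{tonk}\text{-I})
\qquad\qquad
\frac{A \ \mathrm{tonk}\ B}{B}\ (\mathrm{tonk}\text{-E})
\]
% The pair is disharmonious: chaining tonk-I with tonk-E derives an
% arbitrary B from any A, trivializing the consequence relation.
% Harmony constraints are designed to rule out such pairs.
```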
Measures of epistemic utility are used by formal epistemologists to make determinations of epistemic betterness among cognitive states. The Brier rule is the most popular choice among formal epistemologists for such a measure. In this paper, however, we show that the Brier rule is sometimes seriously wrong about whether one cognitive state is epistemically better than another. In particular, there are cases where an agent gets evidence that definitively eliminates a false hypothesis, but where the Brier rule says that things have become epistemically worse. Along the way to this ‘elimination experiment’ counter-example to the Brier rule as a measure of epistemic utility, we identify several useful monotonicity principles for epistemic betterness. We also reply to several potential objections to this counter-example.
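The Brier rule referred to here is easy to state: an agent's Brier score is the sum, over an exhaustive set of hypotheses, of the squared gap between the credence in each hypothesis and its truth value (1 if true, 0 if false), with lower scores counting as epistemically better. The following sketch uses illustrative numbers of my own, not the paper's own case, to convey the flavor of an elimination experiment: evidence definitively rules out a false hypothesis, the agent conditionalizes, and yet the Brier score goes up.

```python
# Brier score of a credence function over an exhaustive set of
# hypotheses: sum of squared distances between each credence and the
# truth value (1 for the true hypothesis, 0 for each false one).
# Lower = epistemically better, according to the Brier rule.

def brier(credences, true_hyp):
    return sum((c - (1.0 if h == true_hyp else 0.0)) ** 2
               for h, c in credences.items())

def eliminate(credences, eliminated):
    """Conditionalize on evidence that `eliminated` is false."""
    rest = {h: c for h, c in credences.items() if h != eliminated}
    total = sum(rest.values())
    updated = {h: c / total for h, c in rest.items()}
    updated[eliminated] = 0.0
    return updated

# Illustrative credences (my own numbers): H1 is in fact true, but the
# agent's credence is concentrated on the false hypothesis H2.
before = {"H1": 0.1, "H2": 0.8, "H3": 0.1}
after = eliminate(before, "H3")   # evidence rules out the false H3

b0 = brier(before, "H1")   # 1.46
b1 = brier(after, "H1")    # 128/81 ~ 1.58: Brier says *worse*
```

The intuition is that eliminating H3 is clear epistemic progress, yet because the freed credence flows mostly to the dominant false hypothesis H2, the Brier rule scores the updated state as worse than the original one.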
Criminal law scholars approach legality in various ways. Some scholars eschew over-arching principles and proceed directly to one or more distinct “rules”: (1) the rule against retroactive criminalization; (2) the rule that criminal statutes be construed narrowly; (3) the rule against the judicial creation of common-law offenses; and (4) the rule that vague criminal statutes are void. Other scholars seek a single principle, i.e., the “principle of legality,” that they claim underlies the four rules. In contrast, I believe that both approaches are misguided. There is no such thing as a single “principle of legality;” yet, the four aforementioned rules are not unrelated to each other. The so-called “principle of legality” consists of two distinct norms that derive, respectively, from two fundamental principles of criminal justice, viz., the principle, “No person shall be punished in the absence of a bad mind,” and the principle that underlies the maxim, “Every person is presumed innocent until proven guilty.” The first norm of legality explains the rules regarding ex post facto legislation, and rules regarding “notice” and “fair warning” of judicial decisions. When a person is punished for violating a rule that was non-existent or unclear at the time he acted, he is punished for conduct that the state now condemns and seeks to prevent by means of penal sanctions. Accordingly, at the time the person is prosecuted, his claim is not that he did not do anything that the state regards as wrong, but, rather, that he neither knew nor should have known that he was doing something that the state would come to regard as wrong.
He ought, indeed, to be excused for his mistake, but only because of a principle that is common to excuses generally: “No person ought to be punished in the absence of a guilty mind.” He should be excused because even when a person does something the state condemns and seeks to prevent, he ought not to be blamed for it unless he was motivated in a certain way, namely, by an attitude of disrespect for the legitimate interests of the political community by whose norms he is bound. The second norm of legality informs several of the remaining rules, though not all of them. The second norm is that a person ought not to be punished in the name of a political community unless it can confidently be said that the community officially regards his conduct as warranting the criminal punishment at issue. It is a norm that is most commonly associated with the rule of lenity, but it is not confined to the construction of statutes that are ambiguous or vague. It can also be violated when a person is punished for violating a statute that has fallen into desuetude, regardless of how widely promulgated or narrowly defined the statute may be. This second norm derives from a principle that also underlies the presumption of innocence, the only difference being that the presumption of innocence is a preference for acquittal in the event of uncertainty regarding the facts with which an actor is charged, while the second norm of legality is a preference for acquittal in the event of uncertainty regarding the scope of the offense with which he is charged. Nevertheless, one “rule” remains that this analysis throws into question, namely, the rule that “vague” criminal statutes are void. Criminal statutes are sometimes so broadly defined that they do, indeed, infringe constitutionally protected rights of speech, movement, etc., in which event they ought to be invalidated on those very grounds.
And criminal statutes are sometimes so broadly drafted that, before applying them, courts ought to construe them to apply only to constitutionally unprotected acts that courts can confidently say the relevant political community regards as warranting punishment. But once statutes are so construed to apply only to constitutionally unprotected conduct, courts have no further reason to invalidate them on grounds of “vagueness.” Lack of “notice” is no reason to invalidate them because, with respect to the narrowly defined conduct such statutes are construed to prohibit, “common social duty” alone ought to alert actors that their conduct is suspect.
However, if Wittgenstein’s so-called rule-following considerations are correct, then this reason for believing in the validity of (C) is mistaken. The conclusion of those considerations is that we must reject the idea that rules are things which determine possible cases of application before those cases are actually encountered and decided by speakers. If this is right, then there is no rule which determines the meanings of new sentences, i.e., before those sentences have actually been used. Therefore, it might seem that (C) is not valid for natural languages.
The aim of the present article is twofold. Firstly, it aims to study the problems arising from the notion of rule proposed by Peter Winch in The Idea of a Social Science and Its Relation to Philosophy to account for all meaningful behavior. On the one hand, it will analyze the problems in the argument posed by Winch in order to state that all meaningful behavior is governed by rules. On the other hand, it will focus on the problems concerning his conception of rules and rule-following, with specific emphasis on pointing out the issues that arise from the criterion posed by Winch in order to determine when a rule is being followed. Secondly, it aims to reassess Winch’s proposal and to reformulate, accommodate and define his notion of rule within an intentional account of meaningful behavior, thus solving the problems presented. In addition, it will provide a criterion that allows one to determine when a rule is being followed.
Can judging that an agent blamelessly broke a rule lead us to claim, paradoxically, that no rule was broken at all? Surprisingly, it can. Across seven experiments, we document and explain the phenomenon of excuse validation. We found that when an agent blamelessly breaks a rule, it significantly distorts people’s description of the agent’s conduct: roughly half of people deny that a rule was broken. The results suggest that people engage in excuse validation in order to avoid indirectly blaming others for blameless transgressions. Excuse validation has implications for recent debates in normative ethics, epistemology and the philosophy of language. These debates have featured thought experiments perfectly designed to trigger excuse validation, inhibiting progress in these areas.
Dialogue is a seminal concept within the work of the Brazilian adult education theorist, Paulo Freire, and the Russian literary critic and philosopher, Mikhail Bakhtin. While there are commonalities in their understanding of dialogue, they differ in their treatment of dialectic. This paper addresses commonalities and dissonances within a Bakhtin-Freire dialogue on the notions of dialogue and dialectic. It then teases out some of the implications for education theory and practice in relation to two South African contexts of learning that facilitate access to education for disadvantaged groups, one in higher education and the other in early childhood education.
In this paper it is argued that different understandings of the requirements of the Rule of Law can to a large extent be explained by the position taken with regard to two interrelated distinctions. On the one hand, the Rule of Law can be regarded as either a principle of law or as a principle of governance. On the other hand, the requirements of the Rule of Law can be regarded as defining either a minimum standard which something has to meet in order to be law or as an aspirational standard identifying what it means to be good law. In combination these two distinctions define a range of perspectives on the nature of the Rule of Law that are complementary rather than mutually exclusive.
The claim that rationality is ‘limited, falsified and unhelpful’ for explaining norms is false, for it does not apply to rationality as conceptualized by rational choice theory. Rationality as conceptualized by rational choice theory is not limited: it can be used to develop explanations of any observed human behavior. Rationality as conceptualized by rational choice theory has not been falsified: indeed, it is not falsifiable. Rationality as conceptualized by rational choice theory is not unhelpful for explaining norms: it is often used to develop explanations of observed norms, including norms that seem most puzzling. Rationality as conceptualized by rational choice theory provides a universal framework for developing explanations of human behavior.
Jesus of Nazareth, like Socrates, left nothing behind written by himself. Yet, the records of his teaching indicate a rich interest in dialogic pedagogy, reflected in his use of the parable, primarily an oral genre, as a dialogic provocation. Working at the interface of pedagogy, theology and philosophy, this article explores the parable of the Good Samaritan from the perspective of dialogic pedagogy. It employs an analytical approach termed diacognition, developed from the notions of dialogue, position and cognition, to analyse the moves within the parable and the teaching situation in which it is located. The article explores how Jesus engages the dialogue of and around the parable to position and reposition his interlocutor, provoking a re-cognition of what it means to love one’s neighbour. It concludes by reflecting on the implications of this analysis for the relation of meaning to knowing and doing.
The authors explore the history of experiments in economics, provide examples of different types of experiments and show that the growing use of experimental methods is transforming economics into an empirical science.
Are rules processes or similarity processes the default for acquisition of grammatical knowledge during natural second language acquisition? Whereas Pothos argues similarity processes are the default in the many areas he reviews, including artificial grammar learning and first language development, I suggest, citing evidence, that in second language acquisition of grammatical morphology “rules processes” may be the default.
The case for anti-realism in the theory of meaning, as presented by Dummett and Wright, is only partly convincing. There is, I shall suggest, a crucial lacuna in the argument, one that can only be filled by the later Wittgenstein's following-a-rule considerations. So it is the latter that provides the strongest argument for the rejection of semantic realism. By 'realism', throughout, I should be taken as referring to any conception of meaning that leaves open the possibility that a sentence may have a determinate truth-value although we are incapable, either in practice or in principle, of discovering what truth-value it has ('the possibility of verification-transcendence' for short). I shall say nothing further about what an anti-realist semantics might look like, nor about the possible consequences for logic, epistemology and metaphysics, beyond the fact that it must involve the rejection of any such conception of meaning.
It is argued in this paper that a multimodal analysis of turn-taking, one of the core areas of conversation analytic research, is needed and has to integrate gaze as one of the most central resources for allocating turns, and that new technologies are available that can provide a solid and reliable empirical foundation for this analysis. On the basis of eye-tracking data of spontaneous conversations, it is shown that gaze is the most ubiquitous next-speaker-selection technique. It can function alone or enhance other techniques. I also discuss the interrelationship between the strength for sequential projection and the choice of next-speaker-selection techniques by a current speaker. The appropriate consideration of gaze leads to a revision of the turn-taking model in that it reduces the domain of self-selection and expands that of the current-speaker-selects-next sub-rule. It also has consequences for the analysis of “simultaneous starts”.
The interpretation of implications as rules motivates a different left-introduction schema for implication in the sequent calculus, which is conceptually more basic than the implication-left schema proposed by Gentzen. Corresponding to results obtained for systems with higher-level rules, it enjoys the subformula property and cut elimination in a weak form.
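For orientation, Gentzen's implication-left schema is shown below alongside a modus-ponens-style left rule of the kind the implications-as-rules interpretation suggests; the second schema is a sketch of the idea, and the paper's actual formulation may differ:

```latex
% Gentzen's implication-left schema:
\[
\frac{\Gamma \vdash A \qquad \Gamma, B \vdash C}
     {\Gamma, A \rightarrow B \vdash C}\ (\rightarrow\!L)
\]
% Reading A -> B as the rule "from A, infer B" instead suggests a
% left schema of modus ponens form (sketch, assumed here):
\[
\frac{\Gamma \vdash A}
     {\Gamma, A \rightarrow B \vdash B}\ (\rightarrow\!L')
\]
```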
Until recently most militaries tended to see moral issues through the lens of rules and regulations. Today, however, many armed forces consider teaching virtues to be an important complement to imposing rules and codes from above. A closer look reveals that it is mainly established military virtues such as honour, courage and loyalty that dominate both the lists of virtues and values of most militaries and the growing body of literature on military virtues. Although there is evidently still a role for these traditional martial virtues, it is equally evident that they are not particularly relevant to, for instance, military personnel operating drones. This chapter looks into the ethics of unmanned warfare from the perspective of military virtues and military ethics education, and addresses the question of what we need to solve the just-mentioned misalignment: 1) a new set of virtues; 2) a different interpretation of the existing virtues; or 3) a different approach altogether, that is, an alternative to teaching virtues? That we have to think about such questions is at least partly because unmanned systems bring risk asymmetry in war to a new level, making warlike virtues such as physical courage by and large obsolete. The last section of this chapter therefore addresses the question: to what extent does the possibility of riskless warfare make drone use ‘virtue-less’?
Table of Contents

1. Introduction and Overview: Two Entitlement Projects, Peter J. Graham, Nikolaj J.L.L. Pedersen, Zachary Bachman, and Luis Rosa

Part I. Engaging Burge's Project
2. Entitlement: The Basis of Empirical Warrant, Tyler Burge
3. Perceptual Entitlement and Scepticism, Anthony Brueckner and Jon Altschul
4. Epistemic Entitlement: Its Scope and Limits, Mikkel Gerken
5. Why Should Warrant Persist in Demon Worlds?, Peter J. Graham

Part II. Extending the Externalist Project
6. Epistemic Entitlement and Epistemic Competence, Ernest Sosa
7. Extended Entitlement, Adam Carter and Duncan Pritchard
8. Moorean Pragmatics, Social Comparisons and Common Knowledge, Allan Hazlett
9. Internalism and Entitlement to Rules and Methods, Joshua Schecter

Part III. Engaging Wright's Project
10. Full Bloodied Entitlement, Martin Smith
11. Pluralist Consequentialist Anti-Scepticism, Nikolaj Jang Lee Linding Pedersen
12. Against (Neo-Wittgensteinian) Entitlements, Annalisa Coliva
13. The Truth Fairy and the Indirect Consequentialist, Daniel Elstein and Carrie S. I. Jenkins
14. Knowledge for Nothing, Patrick Greenough
We present our calculus of higher-level rules, extended with propositional quantification within rules. This makes it possible to present general schemas for introduction and elimination rules for arbitrary propositional operators and to define what it means that introductions and eliminations are in harmony with each other. This definition does not presuppose any logical system, but is formulated in terms of rules themselves. We therefore speak of a foundational account of proof-theoretic harmony. With every set of introduction rules a canonical elimination rule, and with every set of elimination rules a canonical introduction rule, is associated in such a way that the canonical rule is in harmony with the set of rules it is associated with. An example given by Hazen and Pelletier is used to demonstrate that there are significant connectives which are characterized by their elimination rules, and whose introduction rule is the canonical introduction rule associated with these elimination rules. Due to the availability of higher-level rules and propositional quantification, the means of expression of the framework developed are sufficient to ensure that the construction of canonical elimination or introduction rules is always possible and does not lead out of this framework.
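As a familiar special case (an illustration of the general-elimination idea, not the paper's own schema): for conjunction, the canonical elimination rule associated with the usual introduction rule can be written with a rule-premise in the higher-level sense, where "A, B ⇒ C" stands for the rule "from A and B, infer C":

```latex
% Introduction rule for conjunction:
\[
\frac{A \qquad B}{A \wedge B}\ (\wedge\text{-I})
\]
% Canonical elimination rule in harmony with it: whatever C follows
% by the rule "from A and B, infer C" already follows from A ∧ B.
\[
\frac{A \wedge B \qquad (A, B \Rightarrow C)}{C}\ (\wedge\text{-E})
\]
```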
Moral perception, for the purposes of this article, is taken to be the perception of moral properties, unless context dictates otherwise. While both particularists and generalists agree that we can perceive the moral properties of an action or a feature, they disagree over whether rules play any essential role in moral perception. The particularists argue for a ‘no’ answer, whereas the generalists say ‘yes’. In this paper, I provide a limited defense of particularism by rebutting several powerful generalist arguments. It is hoped that particularism can thus be made more attractive as a theory of moral perception. Positive arguments for particularism will also be provided along the way.
'Knowledge' doesn't correctly describe our relation to linguistic rules. It is too thick a notion. On the other hand, 'cognize', without further elaboration, is too thin a notion, which is to say that it is too thin to play a role in a competence theory. One advantage of the term 'knowledge' (and presumably Chomsky's original motivation for using it) is that knowledge would play the right kind of role in a competence theory: our competence would consist in a body of knowledge which we have and which we may or may not act upon; our performance need not conform to the linguistic rules that we know. Is there a way out of the dilemma? I'm going to make the case that the best way to talk about grammatical rules is simply to say that we have them. That doesn't sound very deep, I know, but saying that we have individual rules leaves room for individual norm guidance in a way that 'cognize' does not. Saying we have a rule like subjacency is also thicker than merely saying we cognize it. Saying I have such a rule invites the interpretation that it is a rule for me, that I am normatively guided by it. The competence theory thus becomes a theory of the rules that we have. Whether we follow those rules is another matter entirely.
In arguing for a rules-similarity continuum, Pothos should demonstrate that a single process or mechanism (a neural network model, for example) can handle the entire continuum. Pothos deliberately avoids this exercise as beyond the scope of the current research. In this context, I will present simulation, neuropsychological, neurophysiological, and experimental psychological results, arguing against the continuity hypothesis.