Bishop and Trout here present a unique and provocative new approach to epistemology. Their approach aims to liberate epistemology from the scholastic debates of standard analytic epistemology, and treat it as a branch of the philosophy of science. The approach is novel in its use of cost-benefit analysis to guide people facing real reasoning problems and in its framework for resolving normative disputes in psychology. Based on empirical data, Bishop and Trout show how people can improve their reasoning by relying on Statistical Prediction Rules. They then develop and articulate the positive core of the book. Their view, Strategic Reliabilism, claims that epistemic excellence consists in the efficient allocation of cognitive resources to reliable reasoning strategies, applied to significant problems. The last third of the book develops the implications of this view for standard analytic epistemology; for resolving normative disputes in psychology; and for offering practical, concrete advice on how this theory can improve real people's reasoning. This is a truly distinctive and controversial work that spans many disciplines and will speak to an unusually diverse group, including people in epistemology, philosophy of science, decision theory, cognitive and clinical psychology, and ethics and public policy.
Scientists and laypeople alike use the sense of understanding that an explanation conveys as a cue to good or correct explanation. Although the occurrence of this sense or feeling of understanding is neither necessary nor sufficient for good explanation, it does drive judgments of the plausibility and, ultimately, the acceptability of an explanation. This paper presents evidence that the sense of understanding is in part the routine consequence of two well-documented biases in cognitive psychology: overconfidence and hindsight. In light of the prevalence of counterfeit understanding in the history of science, I argue that many forms of cognitive achievement do not involve a sense of understanding, and that only the truth or accuracy of an explanation makes the sense of understanding a valid cue to genuine understanding.
Philosophers agree that scientific explanations aim to produce understanding, and that good ones succeed in this aim. But few seriously consider what understanding is, or what cues tell us that we have it. If it is a psychological state or process, describing its specific nature is the job of psychological theorizing. This article examines the role of understanding in scientific explanation. It warns that the seductive, phenomenological sense of understanding is often, but mistakenly, viewed as a cue to genuine understanding. The article closes with a discussion of several new paths of research that tie the psychology of scientific explanation to cognate notions of learning, testimony, and understanding.
The more than 40 readings in this anthology cover the most important developments of the past six decades, charting the rise and decline of logical positivism ...
A fresh, daring, and genuine alternative to the traditional story of scientific progress. Explaining the world around us, and the life within it, is one of the most uniquely human drives, and the most celebrated activity of science. Good explanations are what provide accurate causal accounts of the things we wonder at, but explanation's earthly origins haven't grounded it: we have used it to account for the grandest and most wondrous mysteries in the natural world. Explanations give us a sense of understanding, but an explanation that feels right isn't necessarily true. For every true explanation, there is a false one that feels just as good. A good theory's explanations, though, have a much easier path to truth. This push for good explanations elevated science from medieval alchemy to electro-chemistry, and from a pre-inertial physics to the forces underlying nanoparticles. And though the attempt to explain has existed as long as we have been able to wonder, a science timeline from pre-history to the present will reveal a steep curve of theoretical discovery that explodes around 1600, primarily in the West. Ranging over neuroscience, psychology, history, and policy, Wondrous Truths answers two fundamental questions: Why did science progress in the West? And why so quickly? J.D. Trout's answers are surprising. His central idea is that Western science rose above all others because it hit upon successive theories that were approximately true, through an awkward assortment of accident and luck, geography and personal idiosyncrasy. Of course, intellectual ingenuity partially accounts for this persistent drive forward. But so too does the persistence of the objects of wonder. Wondrous Truths recovers the majesty of science, and provides a startling new look at the grand sweep of its biggest ideas.
Scientific realism has been advanced as an interpretation of the natural sciences but never as an interpretation of the behavioural sciences. This book introduces a novel version of scientific realism -- Measured Realism -- that characterizes the kind of theoretical progress in the social and psychological sciences that is uneven but indisputable. Measuring the Intentional World proposes a theory of measurement -- Population-Guided Estimation -- that connects natural, psychological, and social scientific inquiry.
Standard Analytic Epistemology (SAE) names a contingently clustered class of methods and theses that have dominated English-speaking epistemology for about the past half-century. The major contemporary theories of SAE include versions of foundationalism, coherentism, reliabilism, and contextualism. While proponents of SAE don’t agree about how to define naturalized epistemology, most agree that a thoroughgoing naturalism in epistemology can’t work. For the purposes of this paper, we will suppose that a naturalistic theory of epistemology takes as its core, as its starting-point, an empirical theory. The standard argument against naturalistic approaches to epistemology is that empirical theories are essentially descriptive, while epistemology is essentially prescriptive, and a descriptive theory cannot yield normative, evaluative prescriptions. In short, naturalistic theories cannot overcome the is-ought divide. Our main goal in this paper is to show that the standard argument against naturalized epistemology has it almost exactly backwards.
Our aim in this paper is to bring the woefully neglected literature on predictive modeling to bear on some central questions in the philosophy of science. The lesson of this literature is straightforward: For a very wide range of prediction problems, statistical prediction rules (SPRs), often rules that are very easy to implement, make predictions that are as reliable as, and typically more reliable than, human experts. We will argue that the success of SPRs forces us to reconsider our views about what is involved in understanding, explanation, and good reasoning, and about how we ought to do philosophy of science.
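To make the flavor of an SPR concrete, here is a minimal sketch, assuming a Dawes-style unit-weighted ("improper") linear rule; the cue values, weights, and cutoff are invented for illustration and are not drawn from the paper.

```python
# Minimal sketch of a statistical prediction rule (SPR): a unit-weighted
# ("improper") linear model in the spirit of Dawes. All cue values and the
# decision cutoff below are hypothetical illustrations.
import numpy as np

def unit_weight_spr(cues, cutoff=2.0):
    """Sum standardized cue scores and compare to a fixed cutoff.

    cues: 1-D array of standardized predictors (e.g., test scores, ratings).
    Returns True if the summed score clears the cutoff, else False.
    """
    score = np.sum(cues)          # equal weights: no model fitting required
    return score >= cutoff

# Example: three standardized cues for a hypothetical case.
cues = np.array([0.8, 1.1, 0.4])
print(unit_weight_spr(cues))      # True -> predict "success"
```

The literature's striking claim is that even such crude, unfitted rules tend to match or beat unaided expert judgment across a wide range of prediction tasks.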
Contemporary Materialism brings together the best recent work on materialism from many of our leading contemporary philosophers. This is the first comprehensive reader on the subject. The majority of philosophers and scientists today hold the view that all phenomena are physical; as a result, materialism or 'physicalism' is now the dominant ontology in a wide range of fields. Surprisingly, no single book, until now, has collected the key investigations into materialism, to reflect the impact it has had on current thinking in metaphysics, philosophy of mind, and the theory of value. The classic papers in this collection chart contemporary problems, positions, and themes in materialism. At the invitation of the editors, many of the papers have been specially updated for this collection: follow-on pieces written by the contributors enable them to appraise the original paper and assess developments since the work was first published. The book's selections are largely non-technical and accessible to advanced undergraduates. The editors have provided a useful general introduction, outlining and contextualising this central system of thought, as well as a topical bibliography. Contemporary Materialism will be vital reading for anyone concerned to discover the ideas underlying contemporary philosophy. David Armstrong, University of Sydney; Jerry Fodor, Rutgers University, New Jersey; Tim Crane, University College, London; D. H. Mellor, University of Cambridge; J.J.C.
Significance testing is the primary method for establishing causal relationships in psychology. Meehl [1978, 1990a, 1990b] and Faust [1984] argue that significance tests and their interpretation are subject to actuarial and psychological biases, making continued adherence to these practices irrational, and even partially responsible for the slow progress of the 'soft' areas of psychology. I contend that familiar standards of testing and literature review, along with recently developed meta-analytic techniques, are able to correct the proposed actuarial and psychological biases. In particular, psychologists embrace a principle of robustness which states that real psychological effects are (1) reproducible by similar methods, (2) detectable by diverse means, and (3) able to survive theoretical integration. By contrast, spurious significant findings perish under the strain of persistent tests of their robustness. The resulting vindication of significance testing confers on the world a role in determining the rationality of a method, and also affords us an explanation for the fast progress of 'hard' areas of psychology.
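As a hedged illustration of the meta-analytic pooling the abstract appeals to (all effect sizes and variances below are invented), a fixed-effect, inverse-variance-weighted average shows how a single spurious "significant" finding gets diluted by studies that fail to reproduce it:

```python
# Toy fixed-effect meta-analysis: pool study effect sizes by inverse-variance
# weighting. Effect estimates and variances are invented for illustration.
import numpy as np

effects = np.array([0.45, 0.38, 0.02, 0.41])   # per-study effect estimates
variances = np.array([0.04, 0.05, 0.03, 0.06]) # per-study sampling variances

weights = 1.0 / variances
pooled = np.sum(weights * effects) / np.sum(weights)   # weighted mean effect
pooled_se = np.sqrt(1.0 / np.sum(weights))             # standard error of the pool

print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")
```

A robust effect shows up across such pooled estimates under varied methods; a one-off finding shrinks toward noise.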
Recently, Rueger and Sharp, and Koperski, have been concerned to show that certain procedural accounts of model confirmation are compromised by non-linear dynamics. We suggest that the issues raised are better approached by considering whether chaotic data analysis methods allow for reliable inference from data. We provide a framework and an example of this approach.
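The paper's own framework and example are not reproduced here, but one rough sketch of reliable inference from chaotic data is estimating the largest Lyapunov exponent of a simulated system and checking it against a known value; for the fully chaotic logistic map the exponent is ln 2. The parameters below are standard textbook choices, not the paper's case study.

```python
# Rough sketch of a chaotic-data diagnostic: estimate the Lyapunov exponent
# of the logistic map x -> r*x*(1-x) with r = 4 and compare to ln(2).
# Illustration only, not the framework developed in the paper.
import numpy as np

r, x, n_transient, n_samples = 4.0, 0.2, 1000, 50_000

for _ in range(n_transient):          # discard transient behaviour
    x = r * x * (1 - x)

log_deriv_sum = 0.0
for _ in range(n_samples):
    x = r * x * (1 - x)
    log_deriv_sum += np.log(abs(r * (1 - 2 * x)))   # log |f'(x)| along the orbit

print(log_deriv_sum / n_samples, np.log(2))         # both approximately 0.693
```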
Strategic Reliabilism is a framework that yields relative epistemic evaluations of belief-producing cognitive processes. It is a theory of cognitive excellence, or more colloquially, a theory of reasoning excellence (where 'reasoning' is understood very broadly as any sort of cognitive process for coming to judgments or beliefs). First introduced in our book, Epistemology and the Psychology of Human Judgment (henceforth EPHJ), the basic idea behind SR is that epistemically excellent reasoning is efficient reasoning that leads in a robustly reliable fashion to significant, true beliefs. It differs from most contemporary epistemological theories in two ways. First, it is not a theory of justification or knowledge – a theory of epistemically worthy belief. Strategic Reliabilism is a theory of epistemically worthy ways of forming beliefs. And second, Strategic Reliabilism does not attempt to account for an epistemological property that is assumed to be faithfully reflected in the epistemic judgments and intuitions of philosophers. If SR makes recommendations that accord with our reflective epistemic judgments and intuitions, great. If not, then so much the worse for our reflective epistemic judgments and intuitions.
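As a very crude toy, and not a formula from EPHJ, the comparative evaluations Strategic Reliabilism is meant to deliver can be pictured by scoring hypothetical reasoning strategies on reliability, cost, and problem significance; every name and number below is an invented placeholder.

```python
# Crude toy, not from the book: picture SR's comparative evaluations by
# ranking reasoning strategies on expected significant true beliefs per
# unit of cognitive cost. All values are invented for illustration.
strategies = {
    # name: (reliability, cost, significance)
    "statistical prediction rule": (0.85, 0.8, 0.9),
    "unaided expert judgment":     (0.70, 3.0, 0.9),
    "gut feeling":                 (0.50, 0.8, 0.9),
}

def sr_score(reliability, cost, significance):
    return significance * reliability / cost

ranked = sorted(strategies.items(),
                key=lambda kv: sr_score(*kv[1]), reverse=True)
for name, params in ranked:
    print(f"{name}: score = {sr_score(*params):.2f}")
```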
Scientific realists contend that theory-conjunction presents a problem for empiricist conceptions of scientific knowledge and practice. Van Fraassen (1980) has offered a competing account of theory-conjunction which I argue fails to capture the mercenary character of epistemic dependence in science. Representative cases of theory-conjunction developed in the present paper show that mercenary reliance implies a "principle of epistemic symmetry" which only a realist can consistently accommodate. Finally, because the practice in question involves the conjunction of theories, a version of realism more robust than the "entity realism" of Cartwright (1983, 1989) and Hacking (1983) is required to explain the success of theory-conjunction.
Wondrous Truths answers two questions about the steep rise of theoretical discoveries around 1600: Why in the European West? And why so quickly? The history of science's awkward assortment of accident and luck, geography and personal idiosyncrasy, explains scientific progress alongside experimental method. J.D. Trout's blend of scientific realism and epistemic naturalism carries us through neuroscience, psychology, history, and policy, and explains how the corpuscular hunch of Boyle and Newton caught on.
Realizing the ideal of democracy requires political inclusion for citizens. A legitimate democracy must give citizens the opportunity to express their attitudes about the relative attractions of different policies, and access to political mechanisms through which they can be counted and heard. Actual governance often aims not at accurate belief, but at nonepistemic factors like achieving and maintaining institutional stability, creating the feeling of government legitimacy among citizens, or managing access to influence on policy decision-making. I examine the traditional relationship between inclusiveness and accuracy, and illustrate this connection by discussing empirical work on how group decision-making can improve accuracy. I also advance a Generic Epistemic Principle that any evidence-based decision-making procedures must embrace. Focusing on policy-making, I then measure the distance between these standards and the ones actually implemented in U.S. political settings. Psychological research on individual and group decision-making is a source of normative assessment for existing policy judgment, but it neither rationalizes nor legitimates the actual and typical processes used in U.S. institutions of political decision-making. To establish this point, I focus on one characteristic government institution—the U.S. House of Representatives Committee on Science, Space, and Technology—that displays deliberative processes at odds with the sciences they advocate, and with the Generic Epistemic Principle. I explain this discouraging condition in terms of several inveterate factors in U.S. politics: a limitlessly money-driven and endless campaigning process that effectively forces elected representatives to align themselves with money and vote strategically, the use of procedural arrangements known to make people feel politically included when they are not, and the unresponsiveness of a majoritarian (vs. consensus) democracy.
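The empirical work on group accuracy is not reproduced here, but the basic inclusion-accuracy link can be sketched with a toy Condorcet-style simulation (the competence level and group size are arbitrary assumptions): a majority of many modestly reliable, independent voters is right far more often than any one of them.

```python
# Toy Condorcet-style simulation: majority vote among independent voters,
# each correct with probability p > 0.5, beats a lone voter's accuracy.
# The competence p and group size are arbitrary assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
p, group_size, trials = 0.6, 101, 20_000

votes = rng.random((trials, group_size)) < p          # True = a correct vote
majority_correct = votes.sum(axis=1) > group_size / 2

print(f"individual accuracy: {p:.2f}")
print(f"majority accuracy:   {majority_correct.mean():.3f}")   # roughly 0.98
```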
Scientific realism is usually a thesis or theses advanced about our best natural science. In contrast, this book defends scientific realism applied to the social and behavioral sciences. It does so, however, by applying the same argument strategy that many have found convincing for the natural sciences, namely, by arguing that we can only explain the success of the sciences by postulating their approximate truth. The particular success that Trout emphasizes for the social sciences is the effective use of statistical testing. Social scientists apply diverse statistical measurement tools to social reality; they are able to refine and improve those measurements over time. The best explanation for such success is that the social sciences give us an approximately true account of some of the laws and entities of the social world.
The use of null hypothesis significance testing (NHST) in psychology has been under sustained attack, despite its reliable use in the notably successful, so-called "hard" areas of psychology, such as perception and cognition. I argue that, in contrast to merely methodological analyses of hypothesis testing (in terms of "test severity," or other confirmation-theoretic notions), only a patently metaphysical position can adequately capture the uneven but undeniable successes of theories in "hard psychology." I contend that Measured Realism satisfies this description, and characterizes the role of NHST in hard psychology.
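For readers who want the bare mechanics of NHST in front of them (this is a generic illustration, not the paper's argument), a two-sample t-test on simulated data shows what a "significant" verdict amounts to; the group sizes, means, and alpha level are arbitrary choices.

```python
# Bare-bones illustration of NHST: simulate two groups with a real mean
# difference and run a two-sample t-test. All parameters are arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(loc=0.0, scale=1.0, size=40)
treatment = rng.normal(loc=0.5, scale=1.0, size=40)

t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # reject H0 if p < 0.05
```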
The role of aesthetic factors in science is often mentioned, but seldom discussed in a sustained and systematic way. This thoughtful book is James McAllister’s attempt to do so. McAllister’s treatment engages a broad range of issues, relating aesthetic criteria to such diverse topics as the history of astronomy and twentieth-century physics, theoretical ruptures, and architecture. It has two core goals. One goal is to show that there is a role for aesthetic considerations in theory choice that is compatible with the rationalist tradition. Here he defends the rationalist image against two recent competitors: a Kuhnian image of science in which the history of science is not incremental and is punctuated by revolutions, and another which holds that “nonrational” aesthetic criteria play a significant role in theory choice. The second goal of the book is to defend a specific account of the role of aesthetic factors in theory choice, what McAllister calls the “aesthetic induction.”
This paper advances a novel argument that speech perception is a complex system best understood nonindividualistically and therefore that individualism fails as a general philosophical program for understanding cognition. The argument proceeds in four steps. First, I describe a "replaceability strategy", commonly deployed by individualists, in which one imagines replacing an object with an appropriate surrogate. This strategy conveys the appearance that relata can be substituted without changing the laws that hold within the domain. Second, I advance a "counterfactual test" as an alternative to the replaceability strategy. Third, I show how the typical objects of cross-modal processes (in this case, auditory-visual speech perception), more clearly irreplaceable than the objects of the unimodal process examined by Burge [(1986) Individualism and psychology, The Philosophical Review, XCV, 3-45], supply a firm basis for a nonindividualist interpretation of such cases. Finally, I demonstrate that the routine violation of the individualist's Replaceability Condition occurs even in unimodal cases - so the violation of the replaceability constraint does not derive simply from the diversity of modal sources but rather from the causal complexity of psychological processes generally. The conclusion is that philosophical progress on this issue must await progress in psychology, or, at least, philosophical progress in accounting for psychological complexity--precisely the vicissitude predicted by a thoroughgoing naturalism.
All Talked Out is an exercise in applied philosophy. It is a study of what the examination of knowledge, explanation, and well-being would look like if freed from the peculiar tools and outlook of modern philosophy and handed over to scientists - or scientifically trained philosophers - who had a reflective aim.
In “Forced to be Free”, Neil Levy surveys the raft of documented decision-making biases that humans are heir to, and advances several bold proposals designed to enhance the patient's judgment. To his credit, Levy is moved by the psychological research on judgment and decision-making that documents people's inaccuracy when identifying which courses of action will best promote their subjective well-being. But Levy is quick to favour the patient's present preferences, to ensure they get “final say” about their treatment. I urge the opposite inclination, raising doubts about whether the patient's “present preferences” are the best expression of their “final say”. When there is adequate evidence that people, by their own lights, overemphasize their present preferences about the future, we should carefully depreciate those preferences, in effect biasing them to make the right decision by their own lights.
Inferential statistical tests, such as analysis of variance, t-tests, chi-square, and Wilcoxon signed-ranks tests, now constitute a principal class of methods for the testing of scientific hypotheses. In this paper I will consider the role of one statistical concept (statistical power) and two statistical principles or assumptions (homogeneity of variance and the independence of random error) in the reliable application of selected statistical methods. I defend a tacit but widely-deployed naturalistic principle of explanation (E): Philosophers should not treat as inexplicable or basic those correlational facts that scientists themselves do not treat as irreducible. In light of (E), I contend that the conformity of epistemically reliable statistical tests to these concepts and assumptions entails at least the following modest or austere realist commitment: (C) The populations under study have a stable theoretical or unobserved structure that metaphysically grounds the observed values; the objects therefore have a fixed value independent of our efforts to measure them. (C) provides the best explanation for the correlation between the joint use of statistical assumptions and statistical tests, on the one hand, and methodological success on the other.
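As a worked illustration of the statistical power concept mentioned above (the effect size, sample size, and alpha are arbitrary choices, not values from the paper), the normal approximation for a two-sided, two-sample test recovers the familiar result that roughly 64 subjects per group give about 80% power at a medium effect size:

```python
# Normal-approximation power calculation for a two-sided two-sample test.
# Effect size (Cohen's d), per-group n, and alpha are arbitrary illustrations.
from scipy.stats import norm

d, n_per_group, alpha = 0.5, 64, 0.05

se_factor = (2.0 / n_per_group) ** 0.5   # SE of the mean difference in d units
z_crit = norm.ppf(1 - alpha / 2)         # two-sided critical value, about 1.96
z_effect = d / se_factor                 # standardized expected difference
power = norm.cdf(z_effect - z_crit) + norm.cdf(-z_effect - z_crit)

print(f"power ~ {power:.2f}")            # about 0.80 for d = 0.5, n = 64/group
```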
Some eliminativists have predicted that a developed neuroscience will eradicate the principles and theoretical kinds (belief, desire, etc.) implicit in our ordinary practices of mental state attribution. Prevailing defenses of common-sense psychology infer its basic integrity from its familiarity and instrumental success in everyday social commerce. Such common-sense defenses charge that eliminativist arguments are self-defeating in their folk psychological appeal to the belief that eliminativism is true. I argue that eliminativism is untouched by this simple charge of inconsistency, and introduce a different dialectical strategy for arguing against the eliminativist. In keeping with the naturalistic trend in the sociology and philosophy of science, I show that neuroscientists routinely rely on folk psychological procedures of intentional state attribution in applying epistemically reliable standards of scientific evaluation. These scientific contexts place ordinary procedures of attribution under greater stress, producing evidence of folk psychological success that is less equivocal than the evidence in mundane settings. Therefore, the dependence of science on folk psychology, when combined with an independently plausible explanatory constraint on reduction and an independently motivated notion of theoretical stress, allows us to reconstitute the charge of (neurophilic) eliminativist inconsistency in a more sophisticated form.