What does it mean to trust the results of a computer simulation? This paper argues that trust in simulations should be grounded in empirical evidence, good engineering practice, and established theoretical principles. Without these constraints, computer simulation risks becoming little more than speculation. We argue against two prominent positions in the epistemology of computer simulation and defend a conservative view that emphasizes the difference between the norms governing scientific investigation and those governing ordinary epistemic practices.
We address some of the epistemological challenges highlighted by the Critical Data Studies literature by reference to some of the key debates in the philosophy of science concerning computational modeling and simulation. We provide a brief overview of these debates, focusing particularly on what Paul Humphreys calls epistemic opacity. We argue that debates in Critical Data Studies and philosophy of science have neglected the problem of error management and error detection. This is an especially important feature of the epistemology of Big Data. In the “Error” section we explain the main characteristics of error detection and correction, along with the relationship between error and path complexity in software. There we also provide an overview of conventional statistical methods for error detection and review their limitations when faced with the high degree of conditionality inherent in modern software systems.
This paper argues that the difference between contemporary software-intensive scientific practice and more traditional, non-software-intensive varieties results from the characteristically high conditionality of software. We explain why the path complexity of programs with high conditionality imposes limits on standard error correction techniques and why this matters. While it is possible, in general, to characterize the error distribution in inquiry that does not involve high conditionality, we cannot characterize the error distribution in inquiry that depends on software. Software-intensive science presents distinctive error and uncertainty modalities that pose new challenges for the epistemology of science.
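The point about path complexity can be made concrete with a back-of-the-envelope sketch (ours, not the paper's formalism; the function name `path_count` is invented for illustration): a program with n independent two-way conditionals has on the order of 2^n execution paths, which is why exhaustive path-by-path error analysis fails for realistic software.

```python
# Illustrative sketch of path complexity (not taken from the paper):
# each independent conditional doubles the number of execution paths.

def path_count(n_conditionals):
    """Number of paths through code with n independent two-way branches."""
    return 2 ** n_conditionals

print(path_count(10))   # a small routine with 10 branches: 1024 paths
# A modest program with 300 independent branches already has more
# paths than the estimated number of atoms in the observable
# universe (~10**80), so testing every path is hopeless.
print(path_count(300) > 10 ** 80)
```

Real branches are rarely fully independent, so this overstates the count for any particular program, but the exponential trend is what places software beyond the reach of the standard error-characterization techniques discussed in the abstract.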
Intuition serves a variety of roles in contemporary philosophy. This paper provides a historical discussion of the revival of intuition in the 1970s, untangling some of the ways that intuition has been used and offering some suggestions concerning its proper place in philosophical investigation. Contrary to some interpretations of the results of experimental philosophy, it is argued that generalized skepticism with respect to intuition is unwarranted. Intuition can continue to play an important role as part of a methodologically conservative stance towards philosophical investigation. I argue that methodological conservatism should be sharply distinguished from the process of evaluating individual propositions. Nevertheless, intuition is not always a reliable guide to truth and experimental philosophy can serve a vital ameliorative role in determining the scope and limits of our intuitive competence with respect to various areas of inquiry.
Technologies that deploy data science methods are liable to result in epistemic harms involving the diminution of individuals with respect to their standing as knowers or their credibility as sources of testimony. Not all harms of this kind are unjust but when they are we ought to try to prevent or correct them. Epistemically unjust harms will typically intersect with other more familiar and well-studied kinds of harm that result from the design, development, and use of data science technologies. However, we argue that epistemic injustices can be distinguished conceptually from more familiar kinds of harm. We argue that epistemic harms are morally relevant even in cases where those who suffer them are unharmed in other ways. Via a series of examples from the criminal justice system, workplace hierarchies, and educational contexts we explain the kinds of epistemic injustice that can result from common uses of data science technologies.
Epistemic logic is the logic of knowledge and belief. It provides insight into the properties of individual knowers, has provided a means to model complicated scenarios involving groups of knowers and has improved our understanding of the dynamics of inquiry.
How can we be certain that software is reliable? Is there any method that can verify the correctness of software for all cases of interest? Computer scientists and software engineers have informally assumed that there is no fully general solution to the verification problem. In this paper, we survey approaches to the problem of software verification and offer a new proof for why there can be no general solution.
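One way to see the limits of non-exhaustive verification is a toy adversarial construction (our illustration, with invented names `bounded_verifier`, `spec`, and `adversarial`; this is not the proof offered in the paper): any verifier that checks only finitely many inputs can be fooled by a program that agrees with the specification on exactly those inputs.

```python
# Toy illustration (not the paper's proof): a testing-based "verifier"
# that checks inputs 0..bound-1 certifies a program that is wrong
# on every input beyond the bound.

def bounded_verifier(f, spec, bound=1000):
    """Declare f correct if it matches spec on every tested input."""
    return all(f(n) == spec(n) for n in range(bound))

def spec(n):
    return 2 * n            # intended behavior: double the input

def adversarial(n):
    # agrees with spec on every input the verifier will check...
    return 2 * n if n < 1000 else 0   # ...but fails beyond the bound

assert bounded_verifier(adversarial, spec)   # verdict: "correct"
assert adversarial(1000) != spec(1000)       # yet it is wrong
```

The sketch only defeats testing-based methods; ruling out every fully general method requires a diagonalization argument of the kind the paper develops.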
Epistemic logic begins with the recognition that our everyday talk about knowing and believing has some systematic features that we can track and reflect upon. Epistemic logicians have studied and extended these glints of systematic structure in fascinating and important ways since the early 1960s. However, for one reason or another, mainstream epistemologists have shown little interest. It is striking to contrast the marginal role of epistemic logic in contemporary epistemology with the centrality of modal logic for metaphysicians. This article is intended to help in correcting this oversight by presenting some important developments in epistemic logic and suggesting ways to understand their applicability to traditional epistemological problems. Obviously, by itself, tweaking the formal apparatus of epistemic logic does not solve traditional epistemological problems. Epistemic logic can help us to navigate through problems in a systematic fashion by unpacking the logic of the problematic concepts; it can also lead us to recognize problems that we had not anticipated. This is basically analogous to the role that modal logic has played in contemporary metaphysics.
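To give a flavor of how epistemic logic "unpacks" a concept, here is a minimal possible-worlds sketch (a toy encoding of our own; the names `worlds`, `access`, `p_true`, and `knows_p` are invented for illustration): an agent knows p at a world just in case p holds at every world the agent cannot distinguish from it.

```python
# Minimal Kripke-model sketch of the epistemic operator K
# (a toy encoding for illustration, not drawn from the article).

worlds = {"w1", "w2", "w3"}
# accessibility: the worlds the agent considers possible from each world
access = {"w1": {"w1", "w2"}, "w2": {"w1", "w2"}, "w3": {"w3"}}
# valuation: the worlds at which the proposition p is true
p_true = {"w1", "w2"}

def knows_p(w):
    """K p holds at w iff p is true at every world accessible from w."""
    return access[w] <= p_true

assert knows_p("w1")        # p holds at every world the agent entertains
assert not knows_p("w3")    # p fails at w3, which the agent entertains
```

Factivity (K p implies p) corresponds to reflexivity of the accessibility relation, which this little model satisfies; such correspondences between axioms and frame conditions are exactly the kind of systematic structure the article describes.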
There are many tangled normative and technical questions involved in evaluating the quality of software used in epidemiological simulations. In this paper we answer some of these questions and offer practical guidance to practitioners, funders, scientific journals, and consumers of epidemiological research. The heart of our paper is a case study of the Imperial College London COVID-19 simulator, set in the context of recent work in epistemology of simulation and philosophy of epidemiology.
Computational modeling plays an increasingly important explanatory role in cases where we investigate systems or problems that exceed our native epistemic capacities. One clear case where technological enhancement is indispensable involves the study of complex systems. However, even in contexts where the number of parameters and interactions that define a problem is small, simple systems sometimes exhibit non-linear features which computational models can illustrate and track. In recent decades, computational models have been proposed as a way to assist us in understanding emergent phenomena.
In this paper, we argue for the centrality of prediction in the use of computational models in science. We focus on the consequences of the irreversibility of computational models and on the conditional, or ceteris paribus, nature of their predictions. By irreversibility, we mean the fact that computational models can generally arrive at the same state via many possible sequences of previous states. Thus, while in the natural world it is generally assumed that physical states have a unique history, representations of those states in a computational model will usually be compatible with more than one possible history in the model. We describe some of the challenges involved in prediction and retrodiction in computational models while arguing that prediction is an essential feature of non-arbitrary decision making. Furthermore, we contend that the non-predictive virtues of computational models are dependent to a significant degree on the predictive success of the models in question.
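The irreversibility claim can be illustrated with a two-line toy update rule (our example, not one of the paper's models): a deterministic, many-to-one transition function makes the current state compatible with more than one history.

```python
# Toy sketch of irreversibility (illustration only): a deterministic
# update that is many-to-one, so distinct histories converge on the
# same present state and retrodiction is underdetermined.

def step(state):
    return state // 2      # integer halving collapses pairs of states

history_a = [9, step(9)]   # 9 -> 4
history_b = [8, step(8)]   # 8 -> 4
assert history_a[-1] == history_b[-1] == 4
# Forward prediction from state 4 is well defined; recovering whether
# the model was in state 8 or 9 one step earlier is not.
```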
Software-intensive systems are ubiquitous in the industrialized world. The reliability of software has implications for how we understand scientific knowledge produced using software-intensive systems and for our understanding of the ethical and political status of technology. The reliability of a software system is largely determined by the distribution of errors and by the consequences of those errors in the usage of that system. We select a taxonomy of software error types from the literature on empirically observed software errors and compare that taxonomy to Giuseppe Primiero’s (Minds and Machines 24: 249–273, 2014) taxonomy of error in information systems. Because Primiero’s taxonomy is articulated in terms of a coherent, explicit model of computation and is more fine-grained than the empirical taxonomy we select, we might expect Primiero’s taxonomy to provide better insights into how to reduce the frequency of software error than the empirical taxonomy. Whether using one software error taxonomy can help to reduce the frequency of software errors better than another is ultimately an empirical question.
Skeptics argue that the acquisition of knowledge is impossible given the standing possibility of error. We present the limiting convergence strategy for responding to skepticism and discuss the relationship between conceivable error and an agent’s knowledge in the limit. We argue that the skeptic must demonstrate that agents are operating with a bad method or are in an epistemically cursed world. Such demonstration involves a significant step beyond conceivability and commits the skeptic to potentially convergent inquiry.
This paper responds to Jaegwon Kim's powerful objection to the very possibility of genuinely novel emergent properties. Kim argues that the incoherence of reflexive downward causation means that the causal power of an emergent phenomenon is ultimately reducible to the causal powers of its constituents. I offer a simple argument showing how to characterize emergent properties in terms of the effects of structural relations on the causal powers of their constituents.
Philosophers and cognitive scientists reassess systematicity in the post-connectionist era, offering perspectives from ecological psychology, embodied and distributed cognition, enactivism, and other methodologies.
Formal Philosophy is a collection of short interviews based on five questions presented to some of the most influential and prominent scholars in formal philosophy.
The following analysis shows how developments in epistemic logic can play a nontrivial role in cognitive neuroscience. We argue that the striking correspondence between two modes of identification, as distinguished in the epistemic context, and two cognitive systems distinguished by neuroscientific investigation of the visual system (the "where" and "what" systems) is not coincidental, and that it can play a clarificatory role at the most fundamental levels of neuroscientific theory.
_The Routledge Companion to Philosophy of Psychology, Second Edition_ is an invaluable guide and major reference source to the major topics, problems, concepts and debates in philosophy of psychology and is the first companion of its kind. A team of renowned international contributors provide forty-nine chapters organised into six clear parts: Historical background to Philosophy of Psychology; Psychological Explanation; Cognition and Representation; The biological basis of psychology; Perceptual Experience; and Personhood. _The Companion_ covers key topics such as the origins of experimental psychology; folk psychology; behaviorism and functionalism; philosophy, psychology and neuroscience; the language of thought, modularity, nativism and representational theories of mind; consciousness and the senses; dreams, emotion and temporality; personal identity and the philosophy of psychopathology. For the _second edition_ many of the current chapters have been updated, and seven new chapters added on important new topics such as predictive processing, comparative cognition, learning, and group cognition, as well as a new introductory chapter by the editors on the demarcation between philosophy and psychology. Essential reading for all students of philosophy of mind, science and psychology, _The Routledge Companion to Philosophy of Psychology_ will also be of interest to anyone studying psychology and its related disciplines.
In his “A New Program for Philosophy of Science?”, Ronald Giere expresses qualms regarding the critical and political projects I advocate for philosophy of science—that the critical project assumes an underdetermination absent from actual science, and the political project takes us outside the professional pursuit of philosophy of science. In reply I contend that the underdetermination the critical project assumes does occur in actual science, and I provide a variety of examples to support this. And I contend that the political project requires no more than what other academic fields, even in science studies, are already providing.
Quantum computing is of high interest because it promises to perform at least some kinds of computations much faster than classical computers. Arute et al. 2019 (informally, “the Google Quantum Team”) report the results of experiments that purport to demonstrate “quantum supremacy” – the claim that the performance of some quantum computers is better than that of classical computers on some problems. Do these results close the debate over quantum supremacy? We argue that they do not. In the following, we provide an overview of the Google Quantum Team’s experiments, then identify some open questions in the quest to demonstrate quantum supremacy.
While ontic structural realism (OSR) has been a central topic in contemporary philosophy of science, the relation between OSR and the concept of emergence has received little attention. We will argue that OSR is fully compatible with emergentism. The denial of ontological emergence requires additional assumptions that, strictly speaking, go beyond OSR. We call these physicalist closure assumptions. We will explain these assumptions and show that they are independent of the central commitments of OSR and inconsistent with its core goals. Recognizing the compatibility of OSR and ontological emergence may contribute to the solution of ontological puzzles in physics while offering new ways to achieve the goals that advocates of OSR set for their view.
Scientific modelling is a value-laden process: the decisions involved can seldom be made using 'scientific' criteria alone, but rather draw on social and ethical values. In this paper, we draw on a body of philosophical literature to analyze a COVID-19 vaccination model, presenting a case study of social and ethical value judgments in health-oriented modelling. This case study urges us to make value judgments in health-oriented models explicit and interpretable by non-experts and to invite public involvement in making them.
This paper argues that open-mindedness is a corrective virtue. It serves as a corrective to the epistemic vice of confirmation bias. Specifically, open-mindedness is the epistemically virtuous disposition to resist the negative effects of confirmation bias on our ability to reason well and to evaluate evidence and arguments. As part of the defense and presentation of our account, we explore four discussions of open-mindedness in the recent literature. All four approaches have strengths and shed light on aspects of the virtue of open-mindedness. Each mentions various symptoms of confirmation bias and some explore aspects of the corrective role of open-mindedness. However, ours is the first to explicitly identify open-mindedness as a corrective virtue to the specific epistemic vice of confirmation bias. We show how the corrective account also permits a response to the concern that open-mindedness might not actually count as a virtue.
Science is a dynamic process in which the assimilation of new phenomena, perspectives, and hypotheses into the scientific corpus takes place slowly. The apparent disunity of the sciences is the unavoidable consequence of this gradual integration process. Some thinkers label this dynamical circumstance a ‘crisis’. However, a retrospective view of the practical results of the scientific enterprise, and of science itself, grants us a clear view of the unity of the human knowledge-seeking enterprise. This book provides many arguments, case studies and examples in favor of the unity of science. These contributions touch upon various scientific perspectives and disciplines, such as physics, computer science, biology, neuroscience, cognitive psychology, and economics.
In Philosophy Within its Proper Bounds, Édouard Machery argues that the results of experimental philosophy should lead us to abandon much of traditional philosophical practice. In its place Machery defends naturalized conceptual analysis as a more modest and pragmatic alternative to standard analytic philosophy. This paper argues that Machery overstates the metaphilosophical significance of x-phi’s results. We can and should keep many of the insights and good methodological habits that come with x-phi. However, if one is not already convinced of pragmatism or naturalism, the discoveries of x-phi are unlikely to make much difference to one’s metaphilosophical position.
Meaningfulness is the dimension of importance that exists for beings capable of adjudicating between competing kinds of normative reasons. The way an agent decides to rank competing values in terms of importance reflects that agent’s understanding of what counts as meaningful. We can imagine agents who do not engage in this kind of deliberation. Agents who fail to adjudicate between kinds of normative reasons can still act in ways that are prudentially valuable, aesthetically pleasing, and morally praiseworthy. While the actions of such agents can be good in a variety of ways, such actions can also be meaningless. This paper explains how meaningfulness is connected to deliberation, how one can be mistaken in one’s judgments of meaningfulness, and how some lives and practices can be more meaningful than others.
Masses of Formal Philosophy is an outgrowth of Formal Philosophy. That book gathered the responses of some of the most prominent formal philosophers to five relatively open and broad questions initiating a discussion of metaphilosophical themes and problems surrounding the use of formal methods in philosophy. Including contributions from a wide range of philosophers, Masses of Formal Philosophy contains important new responses to the original five questions.
Daniel Dennett has been one of the central voices in the philosophy of mind for at least the past forty years. Unlike most philosophers of his generation, Dennett’s work has resonated far and wide. It has powerfully influenced the development of cognitive science, robotics, developmental psychology, and artificial intelligence. Indeed, his work has led to many new lines of inquiry. For example, he has developed a theory of consciousness which provides an approach to naturalizing mind which circumvents many of the most significant philosophical arguments against the possibility of a scientific explanation of consciousness. The daunting quantity of literature available on Dennett makes it difficult to discriminate the useful from the tendentious, superficial, and otiose. Moreover, because no comparable philosopher has had a profound impact across such a wide range of disciplines and on intellectual culture in general, responses to Dennett’s philosophy are dispersed across a broad range of scientific, philosophical, and cultural domains. That is why this new title in the highly regarded Routledge series, Critical Assessments of Leading Philosophers, is so urgently needed. Edited by John Symons, this new Routledge Major Work is a four-volume collection of the best scholarship on Dennett; the collected materials have been carefully selected from a wide range of academic journals, edited collections, research monographs, and other sources. The tightly focused organization of this collection allows users quickly and easily to access both established and cutting-edge assessments of Dennett’s work. The set is also made for irresistible browsing. With comprehensive introductions to each volume, providing essential background information and relating the various works to each other, Daniel Dennett is destined to be an indispensable resource for research and study.
This brief text assists students in understanding Dennett's philosophy and thinking so they can more fully engage in useful, intelligent class dialogue and improve their understanding of course content. Part of the Wadsworth Notes Series, (which will eventually consist of approximately 100 titles, each focusing on a single "thinker" from ancient times to the present), ON DENNETT is written by a philosopher deeply versed in the philosophy of this key thinker. Like other books in the series, this concise book offers sufficient insight into the thinking of a notable philosopher, better enabling students to engage in reading and to discuss the material in class and on paper.
Horgan’s perceptive discussion of Freudian psychology, Prozac and evolutionary biology cannot mitigate the problems that seriously weaken his book (Horgan, 1999). While he certainly manages to deflate some of the more outrageous hype surrounding the scientific and often not-so-scientific study of the mind, his criticism of the brain and behavioral sciences contains a number of flaws, some of which I will address below. My response focuses on his discussion of neuroscience. As we shall see, the three mysteries that Horgan believes cripple neuroscience are certainly not as serious as he insists.
An asymmetry between the demands at the computational and algorithmic levels of description furnishes the illusion that the abstract profile at the computational level can be multiply realized, and that something is actually being shared at the algorithmic one. A disembodied rendering of the situation lays the stress upon the different ways in which an algorithm can be implemented. However, from an embodied approach, things look rather different. The relevant pairing, I shall argue, is not between implementation and algorithm, but rather between algorithm and computation. The autonomy of psychology is a result of the failure to appreciate this point.
Rather than taking the ontological fundamentality of an ideal microphysics as a starting point, this article sketches an approach to the problem of levels that swaps assumptions about ontology for assumptions about inquiry. These assumptions can be implemented formally via computational modeling techniques that will be described below. It is argued that these models offer a way to save some of our prominent commonsense intuitions concerning levels. This strategy offers a way of exploring the individuation of higher-level properties in a systematic and formally constrained manner. †To contact the author, please write to: Department of Philosophy, Worrell Hall 306, 500 University Avenue, University of Texas, El Paso, TX 79968; e‐mail: [email protected]
Giandomenico Sica’s volume is a collection of eleven papers on category theory by philosophers, mathematicians, and mathematical physicists. In addition to papers of direct interest to philosophers of mathematics, the volume contains some introductory expositions of category theory along with a valuable discussion of the relationship between category theory and physics by Bob Coecke. While there are several technically difficult papers, the volume as a whole is reasonably accessible to those with some familiarity with the basics of category theory. The importance of the volume lies in the possibility that it will encourage broader interest in category theory among philosophers.