Bogen and Woodward claim that the function of scientific theories is to account for 'phenomena', which they describe both as investigator-independent constituents of the world and as corresponding to patterns in data sets. I argue that, if phenomena are considered to correspond to patterns in data, it is inadmissible to regard them as investigator-independent entities. Bogen and Woodward's account of phenomena is thus incoherent. I offer an alternative account, according to which phenomena are investigator-relative entities. All the infinitely many patterns that data sets exhibit have equal intrinsic claim to the status of phenomenon: each investigator may stipulate which patterns correspond to phenomena for him or her. My notion of phenomena accords better both with experimental practice and with the historical development of science.
Thought experiment acquires evidential significance only on particular metaphysical assumptions. These include the thesis that science aims at uncovering "phenomena" (universal and stable modes in which the world is articulated) and the thesis that phenomena are revealed imperfectly in actual occurrences. Only on these Platonically inspired assumptions does it make sense to bypass experience of actual occurrences and perform thought experiments. These assumptions are taken to hold in classical physics and other disciplines, but not in sciences that emphasize variety and contingency, such as Aristotelian natural philosophy and some forms of historiography. This explains why thought experiments carry weight in the former but not the latter disciplines.
This article discusses the relation between features of empirical data and structures in the world. I defend the following claims. Any empirical data set exhibits all possible patterns, each with a certain noise term. The magnitude and other properties of this noise term are irrelevant to the evidential status of a pattern: all patterns exhibited in empirical data constitute evidence of structures in the world. Furthermore, distinct patterns constitute evidence of distinct structures in the world. It follows that the world must be regarded as containing all possible structures. The remainder of the article is devoted to elucidating the meaning and implications of the latter claim.
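The claim that any data set exhibits every pattern, each with a noise term, can be illustrated with a short sketch (my own illustration, not taken from the article; the data and both candidate patterns are hypothetical). Any candidate pattern decomposes the same data set into pattern plus residual; the decompositions differ only in the magnitude of the residual "noise term".

```python
import numpy as np

# Hypothetical measurements: an underlying linear trend plus small noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
data = 2.0 * x + 0.1 * rng.standard_normal(50)

def noise_term(data, pattern):
    """Residual left over when a candidate pattern is subtracted from the data."""
    return data - pattern

# Two of the infinitely many available decompositions of the same data:
linear_noise = noise_term(data, 2.0 * x)             # small residual
constant_noise = noise_term(data, np.full(50, 1.0))  # large residual

# Both decompositions are formally available; on the article's thesis, the
# difference in residual magnitude does not affect evidential status.
print(np.abs(linear_noise).mean(), np.abs(constant_noise).mean())
```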
Michael Dummett and Storrs McCall have claimed that time travel scenarios in which an artist copies an artwork from a reproduction of it that has been sent from the future introduce a causal loop of a new kind: one involving artistic value. They have suggested that this poses a hitherto unacknowledged challenge to time travel theories. I argue that their conclusion depends on some unstated essentialist assumptions about the metaphysics of art and the status of representations. By relaxing these assumptions, I show that Dummett and McCall's scenarios contain no causal loop involving artistic value, and thus pose no new problem for time travel theories.
A rationalist and realist model of scientific revolutions will be constructed by reference to two categories of criteria of theory-evaluation, denominated indicators of truth and of beauty. Whereas indicators of truth are formulated a priori and thus unite science in the pursuit of verisimilitude, aesthetic criteria are inductive constructs which lag behind the progression of theories in truthlikeness. Revolutions occur when the evaluative divergence between the two categories of criteria proves too wide to be recomposed or overlooked. This model of revolutions depends upon a substantial new treatment of aesthetic criteria in science with which much of the paper will therefore be occupied.
Murray Gell-Mann has proposed the concept of effective complexity as a measure of information content. The effective complexity of a string of digits is defined as the algorithmic complexity of the regular component of the string. This paper argues that the effective complexity of a given string is not uniquely determined. The effective complexity of a string admitting a physical interpretation, such as an empirical data set, depends on the cognitive and practical interests of investigators. The effective complexity of a string as a purely formal construct, lacking a physical interpretation, is either close to zero, or equal to the string's algorithmic complexity, or arbitrary, depending on the auxiliary criterion chosen to pick out the regular component of the string. Because of this flaw, the concept of effective complexity is unsuitable as a measure of information content.
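The non-uniqueness claim can be made concrete with a sketch. Algorithmic (Kolmogorov) complexity is uncomputable, so the following illustration (mine, not Gell-Mann's or the paper's) uses zlib's compressed length as a crude computable proxy; the example string and both choices of "regular component" are hypothetical.

```python
import zlib

def c(s: bytes) -> int:
    """Compressed length: a crude, computable stand-in for algorithmic complexity."""
    return len(zlib.compress(s, 9))

# A string with an obviously regular part followed by an irregular tail.
s = (b"ab" * 100) + bytes([7, 93, 41, 180, 2, 250, 66, 133])

# Two admissible choices of "regular component" for the same string:
regular_a = b"ab" * 100  # count only the repetition as regular -> small estimate
regular_b = s            # count the whole string as regular -> large estimate

# The resulting effective-complexity estimates differ, depending entirely on
# the auxiliary criterion used to pick out the regular component.
print(c(regular_a), c(regular_b))
```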
This article defends the following claims. First, for patterns exhibited in empirical data, there is no criterion on which to demarcate patterns that are physically significant from patterns that are not physically significant. I call a pattern physically significant if it corresponds to a structure in the world. Second, all patterns must be regarded as physically significant. Third, distinct patterns must be regarded as providing evidence for distinct structures in the world. Fourth, in consequence, the world must be conceived as showing all possible structures.
According to a traditional view, scientific laws and theories constitute algorithmic compressions of empirical data sets collected from observations and measurements. This article defends the thesis that, to the contrary, empirical data sets are algorithmically incompressible. The reason is that individual data points are determined partly by perturbations, or causal factors that cannot be reduced to any pattern. If empirical data sets are incompressible, then they exhibit maximal algorithmic complexity, maximal entropy and zero redundancy. They are therefore maximally efficient carriers of information about the world. Since, on algorithmic information theory, a string is algorithmically random just if it is incompressible, the thesis entails that empirical data sets consist of algorithmically random strings of digits. Rather than constituting compressions of empirical data, scientific laws and theories pick out patterns that data sets exhibit with a certain noise. Author Keywords: Algorithmic randomness; Compression; Empirical data; Information; Law; Pattern.
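The link between pattern, perturbation, and compressibility can be sketched with a real compressor (my illustration, not the article's; zlib's compressed length again serves as a rough proxy for algorithmic complexity, and both byte strings are hypothetical). A perfectly regular string compresses well; the same string with pattern-less perturbations added barely compresses at all.

```python
import random
import zlib

random.seed(0)

# A perfectly regular, "law-like" string.
pattern = bytes(range(256)) * 8

# The same string with a pattern-less perturbation added to every data point:
# adding a uniform random byte (mod 256) makes each byte uniformly random.
perturbed = bytes((b + random.randrange(256)) % 256 for b in pattern)

def ratio(s: bytes) -> float:
    """Compressed length over original length; near or above 1.0 means incompressible."""
    return len(zlib.compress(s, 9)) / len(s)

# The regular string compresses substantially; the perturbed one does not.
print(ratio(pattern), ratio(perturbed))
```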
Several quantitative techniques for choosing among data models are available. Among these are techniques based on algorithmic information theory, minimum description length theory, and the Akaike information criterion. All these techniques are designed to identify a single model of a data set as being the closest to the truth. I argue, using examples, that many data sets in science show multiple patterns, providing evidence for multiple phenomena. For any such data set, there is more than one data model that must be considered close to the truth. I conclude that, since the established techniques for choosing among data models are unequipped to handle these cases, they cannot be regarded as adequate. ‡I presented a previous version of this paper at the 20th Biennial Meeting of the Philosophy of Science Association, Vancouver, November 2006. I am grateful to the audience for constructive discussion. I thank Leiden University students Marjolein Eysink Smeets and Lenneke Schrier for suggesting the cortisol example, and Remko van der Geest for comments on a draft. †To contact the author, please write to: Faculty of Philosophy, University of Leiden, P.O. Box 9515, 2300 RA Leiden, The Netherlands; e-mail: [email protected]
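The single-winner behaviour of such techniques can be sketched with the Akaike information criterion (my illustration, not one of the paper's examples; the data, with a linear trend and a periodic component superimposed, and both candidate models are hypothetical). Under Gaussian errors, AIC reduces to n·ln(RSS/n) + 2k, and it ranks the candidates so that exactly one comes out best, even when the data exhibit more than one pattern.

```python
import numpy as np

# Hypothetical data exhibiting two superimposed patterns:
# a linear trend plus a sinusoidal component, with small noise.
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
y = 0.5 * x + np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(x.size)

def aic(residuals, k):
    """Akaike information criterion under Gaussian errors: n*ln(RSS/n) + 2k."""
    n = residuals.size
    return n * np.log(np.sum(residuals**2) / n) + 2 * k

# Candidate model 1: linear trend only (k = 2 parameters).
coef = np.polyfit(x, y, 1)
aic_linear = aic(y - np.polyval(coef, x), 2)

# Candidate model 2: linear trend plus sinusoid (k = 4), fitted by least squares.
design = np.column_stack([x, np.ones_like(x),
                          np.sin(2 * np.pi * x), np.cos(2 * np.pi * x)])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
aic_both = aic(y - design @ beta, 4)

# AIC declares a single winner (the lower value), by construction.
print(aic_linear, aic_both)
```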
The central terms of certain theories which were valued highly in the past, such as the phlogiston theory, are now believed by realists not to refer. Laudan and others have claimed that, in the light of the existence of such theories, scientific realism is untenable. This paper argues in response that realism is consistent with — and indeed is able to explain — such theories' having been highly valued and yet not being close to the truth. It follows that the set of highly-valued past theories cited by Laudan, presumed to militate against realism, is in fact innocuous to the doctrine. The argument hinges largely on identifying the grounds on which theory-adoption is actually performed.
This introduction to the special issue on "Aesthetics of Science" reviews recent philosophical research on aesthetic aspects of science. Topics represented in this research include the aesthetic properties of scientific images, theories, and experiments; the relation of science and art; the role of aesthetic criteria in scientific practice and their effect on the development of science; aesthetic aspects of mathematics; the contrast between a classic and a Romantic aesthetic; and the relation between emotion, cognition, and rationality.
Inconsistencies in science take several forms. Some occur at the level of substantive claims about the world. Others occur at the level of methodology, and take the form of dilemmas, or cases of conflicting epistemic or cognitive values. In this article, I discuss how methodological dilemmas arise. I then consider how scientists resolve them. There are strong grounds for thinking that emotional judgement plays an important role in resolving methodological dilemmas. Lastly, I discuss whether and under what conditions this reliance on emotional judgement is rationally warranted. I consider two possible mechanisms, based on coherence and induction, able to ensure that scientists' emotional responses to methodological dilemmas are rationally warranted.
Almost all commentators acknowledge that among the grounds on which scientists perform theory-choices are criteria of simplicity. In general, simplicity is regarded either as only a logico-empirical quality of a theory, diagnostic of the theory's future predictive success, or as a purely aesthetic or otherwise extra-empirical property of it. This paper attempts to demonstrate that the simplicity-criteria applied in scientific practice include both a logico-empirical and a quasi-aesthetic criterion: to conflate these in an account of scientists' theory-choice is to court confusion.
The Newtonian universe is usually understood to contain two classes of causal factors: universal regularities and initial conditions. I demonstrate that, in fact, the Newtonian universe contains no causal factors other than universal regularities: the initial conditions of any physical system are merely the consequence of universal regularities acting on previous systems. It follows that a Newtonian universe lacks the degree of contingency that is usually attributed to it. This is a necessary precondition for maintaining that the Newtonian universe is a block universe that exhibits no temporal development. It follows also that Newtonian physics is inconsistent, since a Newtonian universe as a whole exhibits some properties – such as the total mass of the universe – that are not determined by the laws of Newtonian physics, and that must therefore be considered contingent.
This article investigates epistemological aspects of scientists' reuse of empirical data over decades and centuries. Giving examples, I discuss three respects in which empirical data are historical entities and the implications for the notion of data reuse. First, any data reuse necessitates metadata, which specify the data's circumstances of origin. Second, interpretation of historical data often requires the tools of humanities disciplines, which produce a further historicization of data. Finally, some qualitative social scientists hold that data are personal to the researcher who coconstructs them in the research process and are therefore skeptical about the prospects of reusing data.
Some classic historical vignettes depict scientists achieving breakthroughs without effort: Archimedes grasping the principles of buoyancy while bathing, Galileo Galilei discovering the isochrony of the pendulum while sitting in a cathedral, James Watt noticing the motive power of steam while passing time in a kitchen, Alexander Fleming finding penicillin in Petri dishes that he had omitted to clean before going on holiday. These stories suggest that, to establish important findings in science, hard work is not always necessary. In this article, I suggest that such stories capture an important aspect of scientific...
In this article, I discuss calls for access to empirical data within controversies about climate science, as revealed and highlighted by the publication of the e-mail correspondence involving scientists at the Climatic Research Unit at the University of East Anglia in 2009. I identify several arguments advanced for and against the sharing of scientific data. My conclusions are that, whereas transparency in science is to be valued, appeals to an unproblematic category of 'empirical data' in climate science do not reflect the complexities of scientific practice in this field.
This paper argues that evaluation of the truth and rationality of past scientific theories is both possible and profitable. The motivation for this enterprise is traced to recent discussions by I. Lakatos, L. Laudan and others on the import of history for the philosophy of science; several objections to it are considered, and T. S. Kuhn is found to advance the most substantive. An argument for establishing judgements of rationality and truth in the face of scientific revolutions is presented; finally, evidence is offered for the value of such assessments to historiography and to debates on scientific progress.
The Argument: In the controversy in 1989 over the reported achievement of cold nuclear fusion, parts of the physics and chemistry communities were opposed in both a theoretic and a professional competition. Physicists saw the chemists' announcement as an incursion into territory allocated to their own discipline and strove to restore the interdisciplinary boundaries that had previously held. The events that followed throw light on the manner in which scientists' knowledge claims and metascientific beliefs are affected by their membership of disciplinary communities. In particular, the controversy offers evidence for a constructivist reinterpretation of the "division of nature into levels," which is customarily held to underpin the division of science into disciplines.
The modern sciences are divided into two groups: law-formulating and natural historical sciences. Sciences of both groups aim at describing the world, but they do so differently. Whereas the natural historical sciences produce "transcriptions" intended to be literally true of actual occurrences, laws of nature are expressive symbols of aspects of the world. The relationship between laws and the world thus resembles that between the symbols of classical iconography and the objects for which they stand. The natural historical approach was founded by Aristotle and is retained in such present-day sciences as botany. Modern physics differentiated itself from the natural historical sciences and developed a symbolizing approach at the hands of Galileo and Descartes. Our knowledge of the physical domain is provided by two disciplines: the law-formulating science of physics and a natural historical science on which we depend in the everyday manipulation of our surroundings.
Conditions for philosophy of science in the Netherlands are not optimal. The climate of opinion in Dutch philosophy is unsympathetic to the sciences, partly because of the influence of theology. Dutch universities offer no taught graduate programmes in philosophy of science, which would provide an entry route for science graduates. A great deal of Dutch research in philosophy of science is affected by an exegetical attitude, which fosters the interpretation and evaluation of other writers rather than the development of original theories. Doctoral candidates in particular should be trained to greater originality and assertiveness. Nonetheless, much good research in philosophy of science is conducted in the Netherlands, both in philosophy faculties and in institutes dedicated to the foundations of the special sciences. Distinguished work is done also in the neighbouring disciplines of logic, history of science, and social studies of science.
The explanative recourse to realism (1988). International Studies in the Philosophy of Science, Vol. 3, No. 1, pp. 2-18. doi: 10.1080/02698598808573321.