Most scientific models are not physical objects, and this raises important questions. What sort of entities are models, what is truth in a model, and how do we learn about models? In this paper I argue that models share important aspects in common with literary fiction, and that theories of fiction can therefore be brought to bear on these questions. In particular, I argue that the pretence theory as developed by Walton has the resources to answer these questions. I introduce this account, outline the answers that it offers, and develop a general picture of scientific modelling based on it.
Models are of central importance in many scientific contexts. The centrality of models such as the billiard ball model of a gas, the Bohr model of the atom, the MIT bag model of the nucleon, the Gaussian-chain model of a polymer, the Lorenz model of the atmosphere, the Lotka-Volterra model of predator-prey interaction, the double helix model of DNA, agent-based and evolutionary models in the social sciences, or general equilibrium models of markets in their respective domains is a case in point. Scientists spend a great deal of time building, testing, comparing and revising models, and much journal space is dedicated to introducing, applying and interpreting these valuable tools. In short, models are one of the principal instruments of modern science.
We reconsider the Nagelian theory of reduction and argue that, contrary to a widely held view, it is the right analysis of intertheoretic reduction. The alleged difficulties of the theory either vanish upon closer inspection or turn out to be substantive philosophical questions rather than knock-down arguments.
Scientific discourse is rife with passages that appear to be ordinary descriptions of systems of interest in a particular discipline. Equally, the pages of textbooks and journals are filled with discussions of the properties and the behavior of those systems. Students of mechanics investigate at length the dynamical properties of a system consisting of two or three spinning spheres with homogeneous mass distributions gravitationally interacting only with each other. Population biologists study the evolution of one species procreating at a constant rate in an isolated ecosystem. And when studying the exchange of goods, economists consider a situation in which there are only two goods, two perfectly rational agents, no restrictions on available information, no transaction costs, no money, and dealings are done immediately. Their surface structure notwithstanding, no competent scientist would mistake descriptions of such systems for descriptions of actual systems: we know very well that there are no such systems. These descriptions are descriptions of a model-system, and scientists use model-systems to represent parts or aspects of the world they are interested in. Following common practice, I refer to those parts or aspects as target-systems. What are we to make of this? Is discourse about such models merely a picturesque and ultimately dispensable façon de parler? This was the view of some early twentieth century philosophers. Duhem (1906) famously guarded against confusing model building with scientific theorizing and argued that model building has no real place in science, beyond a minor heuristic role. The aim of science was, instead, to construct theories, with theories understood as classificatory or representative structures systematically presented and formulated in precise symbolic language.
In this paper we explore the constraints that our preferred account of scientific representation places on the ontology of scientific models. Pace the Direct Representation view associated with Arnon Levy and Adam Toon, we argue that scientific models should be thought of as imagined systems, and we clarify the relationship between imagination and representation.
It is now part and parcel of the official philosophical wisdom that models are essential to the acquisition and organisation of scientific knowledge. It is also generally accepted that most models represent their target systems in one way or another. But what does it mean for a model to represent its target system? I begin by introducing three conundrums that a theory of scientific representation has to come to terms with, and then address the question of whether the semantic view of theories, which is currently the most widely accepted account of theories and models, provides us with adequate answers to these questions. After having argued in some detail that it does not, I conclude by pointing out in what direction a tenable account of scientific representation might be sought.
Everything you always wanted to know about structural realism but were afraid to ask. Roman Frigg (Department of Philosophy, Logic and Scientific Method, London School of Economics and Political Science) and Ioannis Votsis (Philosophisches Institut, Heinrich-Heine-Universität Düsseldorf). European Journal for Philosophy of Science 1(2), 227-276. DOI 10.1007/s13194-011-0025-7.
Computer simulations are an exciting tool that plays important roles in many scientific disciplines. This has attracted the attention of a number of philosophers of science. The main tenor in this literature is that computer simulations not only constitute interesting and powerful new science, but that they also raise a host of new philosophical issues. The protagonists in this debate claim no less than that simulations call into question our philosophical understanding of scientific ontology, the epistemology and semantics of models and theories, and the relation between experimentation and theorising, and submit that simulations demand a fundamentally new philosophy of science in many respects. The aim of this paper is to critically evaluate these claims. Our conclusion will be sober. We argue that these claims are overblown and that simulations, far from demanding a new metaphysics, epistemology, semantics and methodology, raise few if any new philosophical problems. The philosophical problems that do come up in connection with simulations are not specific to simulations and most of them are variants of problems that have been discussed in other contexts before.
Science provides us with representations of atoms, elementary particles, polymers, populations, genetic trees, economies, rational decisions, aeroplanes, earthquakes, forest fires, irrigation systems, and the world’s climate. It is through these representations that we learn about the world. This entry explores various accounts of scientific representation, with a particular focus on how scientific models represent their target systems. As philosophers of science are increasingly acknowledging the importance, if not the primacy, of scientific models as representational units of science, it is important to stress that how they represent plays a fundamental role in how we are to answer other questions in the philosophy of science. This entry begins by disentangling ‘the’ problem of scientific representation, before critically evaluating the current options available in the literature.
Classical statistical mechanics posits probabilities for various events to occur, and these probabilities seem to be objective chances. This does not seem to sit well with the fact that the theory’s time evolution is deterministic. We argue that the tension between the two is only apparent. We present a theory of Humean objective chance and show that chances thus understood are compatible with underlying determinism and provide an interpretation of the probabilities we find in Boltzmannian statistical mechanics.
Many scientific models are representations. Building on Goodman and Elgin’s notion of representation-as, we analyse what this claim involves by providing a general definition of what makes something a scientific model and formulating a novel account of how models represent. We call the result the DEKI account of representation, which offers a complex kind of representation involving an interplay of denotation, exemplification, keying-up of properties, and imputation. Throughout we focus on material models, and we illustrate our claims with the Phillips-Newlyn machine. In the conclusion we suggest that, mutatis mutandis, the DEKI account can be carried over to other kinds of models, notably fictional and mathematical models.
On the face of it ‘deterministic chance’ is an oxymoron: either an event is chancy or deterministic, but not both. Nevertheless, the world is rife with events that seem to be exactly that: chancy and deterministic at once. Simple gambling devices like coins and dice are cases in point. On the one hand they are governed by deterministic laws – the laws of classical mechanics – and hence, given the initial condition of, say, a coin toss, it is determined whether it will land heads or tails. On the other hand, we commonly assign probabilities to the different outcomes of a coin toss, and doing so has proven successful in guiding our actions. The same dilemma also emerges in less mundane contexts. Classical statistical mechanics (which is still an important part of modern physics) assigns probabilities to the occurrence of certain events – for instance to the spreading of a gas that is originally confined to the left half of a container – but at the same time assumes that the relevant systems are deterministic. How can this apparent conflict be resolved?
GRW theory postulates a stochastic mechanism assuring that every so often the wave function of a quantum system is ‘hit’, which leaves it in a localised state. How are we to interpret the probabilities built into this mechanism? GRW theory is a firmly realist proposal and it is therefore clear that these probabilities are objective probabilities (i.e. chances). A discussion of the major theories of chance leads us to the conclusion that GRW probabilities can be understood only as either single case propensities or Humean objective chances. Although single case propensities have some intuitive appeal in the context of GRW theory, on balance it seems that Humean objective chances are preferable on conceptual grounds, because single case propensities suffer from various well-known problems such as unlimited frequency tolerance and the lack of a rationalisation of the Principal Principle.
Understanding scientific modelling can be divided into two sub-projects: analysing what model-systems are, and understanding how they are used to represent something beyond themselves. The first is a prerequisite for the second: we can only start analysing how representation works once we understand the intrinsic character of the vehicle that does the representing. Coming to terms with this issue is the project of the first half of this chapter. My central contention is that models are akin to places and characters of literary fictions, and that therefore theories of fiction play an essential role in explaining the nature of model-systems. This sets the agenda. Section 2 provides a statement of this view, which I label the fiction view of model-systems, and argues for its prima facie plausibility. Section 3 presents a defence of this view against its main rival, the structuralist conception of models. In Section 4 I develop an account of model-systems as imagined objects on the basis of the so-called pretence theory of fiction. This theory needs to be discussed in great detail for two reasons. First, developing an acceptable account of imagined objects is mandatory to make the fiction view acceptable, and I will show that the pretence theory has the resources to achieve this goal. Second, the term ‘representation’ is ambiguous; in fact, there are two very different relations that are commonly called ‘representation’ and a conflation between the two is the root of some of the problems that beset scientific representation. Pretence theory provides us with the conceptual resources to articulate these two different forms of representation, which I call p-representation and t-representation respectively. Putting these elements together provides us with a coherent overall picture of scientific modelling, which I develop in Section 5.
Models occupy a central role in the scientific endeavour. Among the many purposes they serve, representation is of great importance. Many models are representations of something else; they stand for, depict, or imitate a selected part of the external world (often referred to as target system, parent system, original, or prototype). Well-known examples include the model of the solar system, the billiard ball model of a gas, the Bohr model of the atom, the Gaussian-chain model of a polymer, the MIT bag model of quark confinement, the Lorenz model of the atmosphere, the Lotka-Volterra model of the predator-prey interaction, or the hydraulic model of an economy, to mention just a few. All these models represent their target systems (or selected parts of them) in one way or another.
An important contemporary version of Boltzmannian statistical mechanics explains the approach to equilibrium in terms of typicality. The problem with this approach is that it comes in different versions, which are, however, not recognized as such and not clearly distinguished. This article identifies three different versions of typicality-based explanations of thermodynamic-like behavior and evaluates their respective successes. The conclusion is that the first two are unsuccessful because they fail to take the system's dynamics into account. The third, however, is promising. I give a precise formulation of the proposal and present an argument in support of its central contention.
Various processes are often classified as both deterministic and random or chaotic. The main difficulty in analysing the randomness of such processes is the apparent tension between the notions of randomness and determinism: what type of randomness could exist in a deterministic process? Ergodic theory seems to offer a particularly promising theoretical tool for tackling this problem by positing a hierarchy, the so-called ergodic hierarchy (EH), which is commonly assumed to provide a hierarchy of increasing degrees of randomness. However, that notion of randomness requires clarification. The mathematical definition of EH does not make explicit appeal to randomness; nor does the usual way of presenting EH involve a specification of the notion of randomness that is supposed to underlie the hierarchy. In this paper we argue that EH is best understood as a hierarchy of random behaviour if randomness is explicated in terms of unpredictability. We then show that, contrary to common wisdom, EH is useful in characterising the behaviour of Hamiltonian dynamical systems.
The sensitive dependence on initial conditions (SDIC) associated with nonlinear models imposes limitations on the models’ predictive power. We draw attention to an additional limitation that has been underappreciated, namely structural model error (SME). A model has SME if the model dynamics differ from the dynamics in the target system. If a nonlinear model has only the slightest SME, then its ability to generate decision-relevant predictions is compromised. Given a perfect model, we can take the effects of SDIC into account by substituting probabilistic predictions for point predictions. This route is foreclosed in the case of SME, which puts us in a worse epistemic situation than SDIC.
Veritism, the position that truth is necessary for epistemic acceptability, seems to be in tension with the observation that much of our best science is not, strictly speaking, true when interpreted literally. This generates a paradox: truth is necessary for epistemic acceptability; the claims of science have to be taken literally; much of what science produces is not literally true and yet it is acceptable. We frame Elgin’s project in True Enough as being motivated by, and offering a particular resolution to, this paradox. We discuss the paradox with a focus on scientific models and argue that there is another resolution available which is compatible with retaining veritism: rejecting the idea that scientific models should be interpreted literally.
Entropy is ubiquitous in physics, and it plays important roles in numerous other disciplines ranging from logic and statistics to biology and economics. However, a closer look reveals a complicated picture: entropy is defined differently in different contexts, and even within the same domain different notions of entropy are at work. Some of these are defined in terms of probabilities, others are not. The aim of this chapter is to arrive at an understanding of some of the most important notions of entropy and to clarify the relations between them. After setting the stage by introducing the thermodynamic entropy, we discuss notions of entropy in information theory, statistical mechanics, dynamical systems theory and fractal geometry.
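As a rough guide to the notions compared in the chapter, the standard textbook definitions can be glossed as follows (an illustrative sketch added here for orientation, not part of the abstract): the Clausius entropy of thermodynamics, the Boltzmann and Gibbs entropies of statistical mechanics, and the Shannon entropy of information theory are usually written as
\[
dS_{TD} = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad S_B = k_B \log W, \qquad S_G = -k_B \int \rho \log \rho \, d\Gamma, \qquad H = -\sum_i p_i \log p_i .
\]
The first two make no explicit reference to probabilities, whereas the Gibbs and Shannon entropies are defined over probability distributions, which already illustrates the divergence of notions mentioned above.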
Gases reach equilibrium when left to themselves. Why do they behave in this way? The canonical answer to this question, originally proffered by Boltzmann, is that the systems have to be ergodic. This answer has been criticised on different grounds and is now widely regarded as flawed. In this paper we argue that some of the main arguments against Boltzmann's answer, in particular arguments based on the KAM theorem and the Markus-Meyer theorem, are beside the point. We then argue that something close to Boltzmann's original proposal is true for gases: gases behave thermodynamic-like if they are epsilon-ergodic, i.e., ergodic on the entire accessible phase space except for a small region of measure epsilon. This answer is promising because there are good reasons to believe that relevant systems in statistical mechanics are epsilon-ergodic.
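As a gloss on the terminology used here (standard definitions, not part of the abstract itself): a Hamiltonian system with flow \(\phi_t\) and invariant measure \(\mu\) is ergodic on a set if, for almost every initial condition \(x\) in that set, infinite time averages of phase functions equal their phase averages,
\[
\lim_{T\to\infty} \frac{1}{T} \int_0^T f(\phi_t(x))\, dt \;=\; \int f \, d\mu ,
\]
and it is epsilon-ergodic if it is ergodic on a subset of the accessible phase space whose complement has measure at most \(\varepsilon\).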
How does mathematics apply to something non-mathematical? We distinguish between a general application problem and a special application problem. A critical examination of the answer that structural mapping accounts offer to the former problem leads us to identify a lacuna in these accounts: they have to presuppose that target systems are structured and yet leave this presupposition unexplained. We propose to fill this gap with an account that attributes structures to targets through structure generating descriptions. These descriptions are physical descriptions and so there is no such thing as a solely mathematical account of a target system.
The United Kingdom Climate Impacts Programme’s UKCP09 project makes high-resolution projections of the climate out to 2100 by post-processing the outputs of a large-scale global climate model. The aim of this paper is to describe and analyse the methodology used and then urge some caution. Given the acknowledged systematic, shared errors of all current climate models, treating model outputs as decision-relevant projections can be significantly misleading. In extrapolatory situations, such as projections of future climate change, there is little reason to expect that post-processing of model outputs can correct for the consequences of such errors. This casts doubt on our ability, today, to make trustworthy probabilistic projections at high resolution out to the end of the century.
Boltzmannian statistical mechanics partitions the phase space of a system into macro-regions, and the largest of these is identified with equilibrium. What justifies this identification? Common answers focus on Boltzmann’s combinatorial argument, the Maxwell-Boltzmann distribution, and maximum entropy considerations. We argue that they fail and present a new answer. We characterise equilibrium as the macrostate in which a system spends most of its time and prove a new theorem establishing that equilibrium thus defined corresponds to the largest macro-region. Our derivation is completely general in that it does not rely on assumptions about a system’s dynamics or internal interactions.
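In symbols, and merely as an illustrative gloss on the result described (the notation is assumed here, not taken from the abstract): if \(\Gamma_{M_1}, \ldots, \Gamma_{M_n}\) are the macro-regions into which the phase space is partitioned, \(\mu\) is the phase-space measure, and \(M_{eq}\) is the macro-state in which the system spends most of its time, the theorem says that
\[
\mu(\Gamma_{M_{eq}}) \;\geq\; \mu(\Gamma_{M_i}) \quad \text{for all } i = 1, \ldots, n ,
\]
without any assumptions about the system's dynamics or internal interactions.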
Climate change adaptation is largely a local matter, and adaptation planning can benefit from local climate change projections. Such projections are typically generated by accepting climate model outputs in a relatively uncritical way. We argue, based on the IPCC’s treatment of model outputs from the CMIP5 ensemble, that this approach is unwarranted and that subjective expert judgment should play a central role in the provision of local climate change projections intended to support decision-making.
There are two theoretical approaches in statistical mechanics, one associated with Boltzmann and the other with Gibbs. The theoretical apparatuses of the two approaches offer distinct descriptions of the same physical system, with no obvious way to translate the concepts of one formalism into those of the other. This raises the question of the status of one approach vis-à-vis the other. We answer this question by arguing that the Boltzmannian approach is a fundamental theory while Gibbsian statistical mechanics (GSM) is an effective theory, and we describe circumstances under which Gibbsian calculations coincide with the Boltzmannian results. We then point out that regarding GSM as an effective theory has important repercussions for a number of projects, in particular attempts to turn GSM into a nonequilibrium theory.
The United Kingdom Climate Impacts Programme’s UKCP09 project makes high-resolution forecasts of climate during the 21st century using state-of-the-art global climate models. The aim of this paper is to introduce and analyze the methodology used and then urge some caution. Given the acknowledged systematic errors in all current climate models, treating model outputs as decision-relevant probabilistic forecasts can be seriously misleading. This casts doubt on our ability, today, to make trustworthy, high-resolution predictions out to the end of this century.
Gibbsian statistical mechanics (GSM) is the most widely used version of statistical mechanics among working physicists. Yet a closer look at GSM reveals that it is unclear what the theory actually says and how it bears on experimental practice. The root cause of the difficulties is the status of the averaging principle, the proposition that what we observe in an experiment is the ensemble average of a phase function. We review different stances toward this principle, and eventually present a coherent interpretation of GSM that provides an account of the status and scope of the principle.
Various scientific theories stand in a reductive relation to each other. In a recent article, we have argued that a generalized version of the Nagel-Schaffner model (GNS) is the right account of this relation. In this article, we present a Bayesian analysis of how GNS impacts on confirmation. We formalize the relation between the reducing and the reduced theory before and after the reduction using Bayesian networks, and thereby show that, post-reduction, the two theories are confirmatory of each other. We then ask when a purported reduction should be accepted on epistemic grounds. To do so, we compare the prior and posterior probabilities of the conjunction of both theories before and after the reduction and ask how well each is confirmed by the available evidence.
In Boltzmannian statistical mechanics macro-states supervene on micro-states. This leads to a partitioning of the state space of a system into regions of macroscopically indistinguishable micro-states. The largest of these regions is singled out as the equilibrium region of the system. What justifies this association? We review currently available answers to this question and find them wanting both for conceptual and for technical reasons. We propose a new conception of equilibrium and prove a mathematical theorem which establishes in full generality – i.e. without making any assumptions about the system's dynamics or the nature of the interactions between its components – that the equilibrium macro-region is the largest macro-region. We then turn to the question of the approach to equilibrium, to which there exists no satisfactory general answer so far. In our account, this question is replaced by the question of when an equilibrium state exists. We prove another – again fully general – theorem providing necessary and sufficient conditions for the existence of an equilibrium state. This theorem changes the way in which the question of the approach to equilibrium should be discussed: rather than launching a search for a crucial factor, the focus should be on finding triplets of macro-variables, dynamical conditions, and effective state spaces that satisfy the conditions of the theorem.
Determinism and chance seem to be irreconcilable opposites: either something is chancy or it is deterministic, but not both. Yet there are processes which appear to square the circle by being chancy and deterministic at once, and the appearance is backed by well-confirmed scientific theories such as statistical mechanics, which also seem to provide us with chances for deterministic processes. Is this possible, and if so how? In this essay I discuss this question for probabilities as they occur in the empirical sciences, setting aside metaphysical questions in connection with free will, divine intervention and determinism in history.
Why do systems prepared in a non-equilibrium state approach, and eventually reach, equilibrium? An important contemporary version of the Boltzmannian approach to statistical mechanics answers this question by an appeal to the notion of typicality. The problem with this approach is that it comes in different versions, which are, however, not recognised as such, much less clearly distinguished, and we often find different arguments pursued side by side. The aim of this paper is to disentangle different versions of typicality-based explanations of thermodynamic behaviour and evaluate their respective success. My conclusion will be that the boldest version fails for technical reasons, while more prudent versions leave unanswered essential questions.
The so-called ergodic hierarchy (EH) is a central part of ergodic theory. It is a hierarchy of properties that dynamical systems can possess. Its five levels are ergodicity, weak mixing, strong mixing, Kolmogorov, and Bernoulli. Although EH is a mathematical theory, its concepts have been widely used in the foundations of statistical physics, accounts of randomness, and discussions about the nature of chaos. We introduce EH and discuss its applications in these fields.
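For orientation (a standard fact about EH, added here as an illustrative gloss rather than quoted from the abstract), the five levels form a chain of nested classes of dynamical systems, with each level implying all the levels below it:
\[
\text{Bernoulli} \;\subset\; \text{Kolmogorov} \;\subset\; \text{strong mixing} \;\subset\; \text{weak mixing} \;\subset\; \text{ergodic} .
\]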
A gas prepared in a non-equilibrium state will approach equilibrium and stay there. An influential contemporary approach to statistical mechanics explains this behaviour in terms of typicality. However, this explanation has been criticised as mysterious as long as no connection with the dynamics of the system is established. We take this criticism as our point of departure. Our central claim is that Hamiltonians of gases which are epsilon-ergodic are typical with respect to the Whitney topology. Because equilibrium states are typical, we argue that the desired conclusion follows: typical initial conditions approach equilibrium and stay there.
In two recent papers, Barry Loewer (2001, 2004) has suggested interpreting probabilities in statistical mechanics as Humean chances in David Lewis’s (1994) sense. I first give a precise formulation of this proposal, then raise two fundamental objections, and finally conclude that these can be overcome only at the price of interpreting these probabilities epistemically.
Many policy decisions take input from collections of scientific models. Such decisions face significant and often poorly understood uncertainty. We rework the so-called confidence approach to tackle decision-making under severe uncertainty with multiple models, and we illustrate the approach with a case study: insurance pricing using hurricane models. The confidence approach has important consequences for this case and offers a powerful framework for a wide class of problems. We end by discussing different ways in which model ensembles can feed information into the approach, appropriate to different collections of models.
There are two main theoretical frameworks in statistical mechanics, one associated with Boltzmann and the other with Gibbs. Despite their well-known differences, there is a prevailing view that equilibrium values calculated in both frameworks coincide. We show that this is wrong. There are important cases in which the Boltzmannian and Gibbsian equilibrium concepts yield different outcomes. Furthermore, the conditions under which an equilibrium exists are different for Gibbsian and Boltzmannian statistical mechanics. There are, however, special circumstances under which it is true that the equilibrium values coincide. We prove a new theorem providing sufficient conditions for this to be the case.
At first blush, the idea that fictions play a role in science seems to be off the mark. Realists and antirealists alike believe that science instructs us about how the world is. Fiction not only seems to play no role in such an endeavour; it seems to detract from it. The aims of science and fiction seem to be diametrically opposed and a view amalgamating the two rightly seems to be the cause of discomfort and concern.
Consider a gas that is adiabatically isolated from its environment and confined to the left half of a container. Then remove the wall separating the two parts. The gas will immediately start spreading and soon be evenly distributed over the entire available space. The gas has approached equilibrium. Thermodynamics (TD) characterizes this process in terms of an increase of thermodynamic entropy, which attains its maximum value at equilibrium. The second law of thermodynamics captures the irreversibility of this process by positing that in an isolated system such as the gas, entropy cannot decrease. The aim of statistical mechanics (SM) is to explain the behavior of the gas and, in particular, its conformity with the second law in terms of the dynamical laws governing the individual molecules of which the gas is made up. In what follows these laws are assumed to be those of Hamiltonian classical mechanics. We should not, however, ask for an explanation of the second law literally construed. This law is a universal law and as such cannot be explained by a statistical theory. But this is not a problem because we…
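For concreteness (an illustrative gloss on the quantities just mentioned, not part of the text): the second law asserts that the thermodynamic entropy of an adiabatically isolated system cannot decrease, and Boltzmannian SM assigns to each macro-state \(M\) an entropy proportional to the logarithm of the phase-space volume of its macro-region \(\Gamma_M\),
\[
\Delta S_{TD} \;\geq\; 0 , \qquad S_B(M) = k_B \log \mu(\Gamma_M) ,
\]
so that the approach to equilibrium corresponds to the system's micro-state wandering into ever larger macro-regions.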
Featuring contributions from leading experts, this book represents the first collection of essays on the topic of art and science in the analytic tradition of ...
On an influential account, chaos is explained in terms of random behaviour; and random behaviour in turn is explained in terms of having positive Kolmogorov-Sinai entropy (KSE). Though intuitively plausible, the association of the KSE with random behaviour needs justification since the definition of the KSE does not make reference to any notion that is connected to randomness. I provide this justification for the case of Hamiltonian systems by proving that the KSE is equivalent to a generalized version of Shannon's communication-theoretic entropy under certain plausible assumptions. I then discuss consequences of this equivalence for randomness in chaotic dynamical systems. Contents: Introduction; Elements of dynamical systems theory; Entropy in communication theory; Entropy in dynamical systems theory; Comparison with other accounts; Product versus process randomness.
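As a reminder of the two quantities being connected (standard definitions, given here only as an illustrative gloss): the Shannon entropy of a discrete probability distribution and the Kolmogorov-Sinai entropy of a measure-preserving dynamical system \((X, \mu, T)\) are
\[
H(p_1,\ldots,p_n) = -\sum_{i=1}^{n} p_i \log p_i , \qquad
h_{KS}(T) = \sup_{\alpha} \, \lim_{n\to\infty} \frac{1}{n}\, H\!\Big(\bigvee_{k=0}^{n-1} T^{-k}\alpha\Big) ,
\]
where the supremum ranges over finite measurable partitions \(\alpha\) of \(X\); the KSE thus measures the asymptotic rate at which the dynamics generates Shannon information.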