It is often claimed that epistemic bubbles and echo chambers foster post-truth by filtering our access to information and manipulating our epistemic attitudes. In this paper, I add a further level of analysis by considering the issue of belief formation. Building on work in cognitive psychology, I argue for a dual-system theory according to which beliefs derive from a default system and a critical system. The former produces beliefs in a quasi-automatic, effortless way; the latter in a slow, effortful way. I also argue that digital socio-epistemic environments tend to inculcate disvalues in their agents' epistemic identities, a process that causes the cognitive short-circuits typical of conspiracy theories.
An intricate, long, and occasionally heated debate surrounds Boltzmann’s H-theorem (1872) and his combinatorial interpretation of the second law (1877). After almost a century of devoted and knowledgeable scholarship, there is still no agreement as to whether Boltzmann changed his view of the second law after Loschmidt’s 1876 reversibility argument or whether he had already held a probabilistic conception for some years at that point. In this paper, I argue that there was no abrupt statistical turn. In the first part, I discuss the development of Boltzmann’s research from 1868 to the formulation of the H-theorem. This reconstruction shows that Boltzmann adopted a pluralistic strategy based on the interplay between a kinetic and a combinatorial approach. Moreover, it shows that the extensive use of asymptotic conditions allowed Boltzmann to bracket the problem of exceptions. In the second part, I suggest that neither Loschmidt’s challenge nor Boltzmann’s response to it concerned the H-theorem. The close relation between the theorem and the reversibility argument is a consequence of later investigations on the subject.
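For orientation, and in standard modern notation rather than Boltzmann’s own 1872 formulation (which was written in terms of kinetic energy rather than velocity), the H-theorem concerns the quantity

$$ H(t) = \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\,d^3v, \qquad \frac{dH}{dt} \leq 0, $$

where $f(\mathbf{v},t)$ is the molecular velocity distribution evolving under the Boltzmann equation. It is this monotonic decrease of $H$ that the reversibility argument challenges: reversing all molecular velocities yields an equally admissible trajectory along which $H$ increases. The 1877 combinatorial interpretation is usually summarized by the relation $S = k \log W$ between entropy and the number of microstates, although that notation postdates Boltzmann.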
The foundation of statistical mechanics and the explanation of the success of its methods rest on the fact that the theoretical values of physical quantities (phase averages) may be compared with the results of experimental measurements (infinite time averages). In the 1930s, this problem, called the ergodic problem, was addressed by ergodic theory, which sought a solution based above all on considerations of a dynamical nature. In the present paper, this solution is analyzed first, highlighting the fact that its very general character does not duly consider the specific features of the systems of statistical mechanics. Second, Khinchin’s approach is presented, which, starting from more specific assumptions about the nature of the systems, achieves an asymptotic version of the result obtained by ergodic theory. Third, the statistical meaning of Khinchin’s approach is analyzed and a comparison between it and the point of view of ergodic theory is proposed. It is demonstrated that the difference consists principally in two different perspectives on the ergodic problem: ergodic theory puts the state of equilibrium at the center, while Khinchin attempts to generalize the result to non-equilibrium states.
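In standard notation (a sketch for the reader, not the paper’s own formalism), the two quantities to be compared are the phase average and the infinite time average of a phase function $f$ on the energy surface $\Gamma_E$ with invariant measure $\mu$:

$$ \langle f \rangle = \int_{\Gamma_E} f(x)\, d\mu(x), \qquad \bar{f}(x_0) = \lim_{T\to\infty} \frac{1}{T}\int_0^T f\big(x(t;x_0)\big)\, dt. $$

Ergodic theory demands $\bar{f} = \langle f \rangle$ for almost every initial condition $x_0$. Khinchin’s theorem instead shows that, for “sum functions” $f = \sum_{i=1}^{n} \phi_i$ composed of single-molecule terms, the measure of the set on which the two averages differ appreciably is bounded by a quantity of order $n^{-1/4}$, and thus vanishes asymptotically as the number of degrees of freedom grows.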
The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept that has recently aroused some interest among philosophers. The most salient trait of this concept is that it works as a junction between such diverse fields as statistical mechanics, information theory, and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the beginnings of celestial mechanics and through some of the most exciting developments of mathematical physics in the 19th century.
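For reference, the standard definition (supplied here as an aid, not quoted from the abstract) runs as follows: given a measure-preserving transformation $T$ on a probability space and a finite measurable partition $\mathcal{P}$, the Kolmogorov-Sinai entropy is

$$ h_{KS}(T) = \sup_{\mathcal{P}}\; \lim_{n\to\infty} \frac{1}{n}\, H\!\left( \bigvee_{k=0}^{n-1} T^{-k}\mathcal{P} \right), \qquad H(\mathcal{P}) = -\sum_{A \in \mathcal{P}} \mu(A)\log\mu(A). $$

The junction alluded to above is visible in the definition itself: $H$ is the Shannon entropy of a partition (information theory), $T$ is the dynamics (statistical mechanics), and, by Brudno’s theorem, for ergodic systems $h_{KS}$ equals the algorithmic complexity per symbol of almost every trajectory (algorithm theory).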
The recent use of typicality in statistical mechanics for foundational purposes has stirred an important debate involving both philosophers and physicists. While this debate customarily focuses on technical issues, in this paper I try to approach the problem from an epistemological angle. The discussion is driven by two questions: (1) What does typicality add to the concept of measure? (2) What kind of explanation, if any, does typicality yield? By distinguishing the notions of ‘typicality-as-vast-majority’ and ‘typicality-as-best-exemplar’, I argue that the former goes beyond the concept of measure. Furthermore, I also argue that typicality aims at providing us with a form of causal explanation of equilibrium.
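Schematically, and in a formulation common in the typicality literature rather than one drawn from the paper itself, ‘typicality-as-vast-majority’ says that a property $P$ is typical on the energy surface $\Gamma_E$ with respect to a measure $\mu$ when

$$ \mu\big(\{ x \in \Gamma_E : x \text{ does not have } P \}\big) < \varepsilon \quad \text{for some very small } \varepsilon \geq 0. $$

The epistemological question at stake is what the qualitative judgment “the vast majority of microstates have $P$” adds to this bare measure-theoretic statement, and whether that surplus can sustain an explanation of the approach to equilibrium.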
Most philosophical accounts of scientific theories are affected by three dogmas or ingrained attitudes. These dogmas have led philosophers to choose between analyzing the internal structure of theories and analyzing their historical evolution. In this paper, I turn these three dogmas upside down. I argue (i) that mathematical practices are not epistemically neutral, (ii) that the morphology of theories can be very complex, and (iii) that one should view theoretical knowledge as the combination of internal factors and their intrinsic historicity.
The aim of this paper is not only to deal with the concept of infinity, but also to develop some considerations about the epistemological status of cosmology. These problems are connected because, from an epistemological point of view, cosmology, understood as the study of the universe as a whole, is not merely a physical (or empirical) science. On the contrary, it has an unavoidable metaphysical character, which can be found in questions like “Why is there this universe (or a universe at all)?”. As a consequence, questions concerning the infinity of the universe in space and time can correctly arise only by taking this metaphysical character of cosmology into account. Accordingly, it will be shown that two different concepts of the physical infinity of the universe (the relativistic one and the inflationary one) rely on two different ways of solving a metaphysical problem. The difference between these concepts cannot be analysed using the classical distinctions between actual/potential infinity or denumerable/continuum infinity; the introduction of a new “modal” distinction will be necessary. Finally, the role of a philosophical concept of the infinity of the universe will be illustrated.
Boltzmann’s equilibrium theory has not received from scholars the attention it deserves. It has always been interpreted as a mere generalization of Maxwell’s work or, at best, as a sketch of ideas more consistently developed in the 1872 memoir. In this paper, I try to show that this view is ungenerous. My claim is that, in the theory developed during the period 1866-1871, the generalization of Maxwell’s distribution was mainly a means to a more general end: a theory of the equilibrium of a system of mechanical points from a general point of view. To address this issue, Boltzmann analyzed and discussed probabilistic assumptions, so that his equilibrium theory cannot be considered a purely mechanical theory. I also claim that the special perspective adopted by Boltzmann and his view of probabilistic requirements played a role in the transition to the non-equilibrium theory of 1872.
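The starting point Boltzmann generalized is Maxwell’s equilibrium distribution which, in modern notation (the constant $k$ postdates Boltzmann, having been introduced by Planck), gives the fraction of molecules with speed in $(v, v+dv)$:

$$ f(v)\,dv = 4\pi \left( \frac{m}{2\pi k T} \right)^{3/2} v^2\, e^{-m v^2 / 2kT}\, dv. $$

Boltzmann’s 1868 generalization extends this to molecules subject to external forces, replacing the kinetic energy in the exponent with the total energy, $f \propto e^{-E/kT}$, the form now known as the Maxwell-Boltzmann distribution.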
This paper analyzes the epistemological significance of the problem of induction. In the first section, the foundation of this problem is identified in the thesis of gnoseological dualism: we only know our representations, as separate from ‘the world in itself’. This thesis is countered by the thesis of gnoseological monism. In the second section, the implications of Hume’s skeptical thesis are highlighted, and it is shown how the point of view of gnoseological monism can offer a way out, which I call the hermeneutic theory of induction. In the third section, a formal approach is proposed in agreement with this theory. Using tools from information theory, this approach defines the conditions for accepting or rejecting a hypothesis on the basis of an experiment. In the fourth section, the epistemological consequences of this approach are analyzed.
This book examines the different areas of knowledge, traditions, and conceptual resources that contributed to the building of Max Planck's theory of radiation. It presents an insightful comparative analysis that not only sheds light upon a fundamental chapter in the history of modern physics, but also enlarges our understanding of how theoreticians work. The book offers a deep investigation into the technical aspects behind the theory and extends the notion of the quantum revolution in time. It also presents a full-fledged discussion of the combinatorial part of Planck's theory and places emphasis on the epistemological role of mathematical practices. By painstakingly reconstructing both the electromagnetic and the combinatorial parts of Planck's black-body theory, the author shows how some apparently merely technical resources, such as the Fourier series, effectively contributed to shaping the final form of Planck's theory. For decades, historians have debated the conditions of possibility of Max Planck's discovery as a paradigmatic example of scientific revolution. In particular, the use of combinatorics, which eventually paved the way for the introduction of the quantum hypothesis, has remained a puzzle for experts. This book presents a fresh perspective on this important debate that will appeal to historians and philosophers of science.
Ever since Thomas Kuhn's influential The Structure of Scientific Revolutions (1962), textbooks have suffered from a bad reputation. They have been accused of distorting history, at times purposely, and of feeding students an unacceptably simplified and optimistic view of science. This attitude started to change only in recent times. With the increase of attention paid not only to how theories are conceived, but also to how they are practiced, disseminated, and appropriated, historians have rehabilitated textbooks as a legitimate site of knowledge production. In this paper, I use textbooks as an instrument to unfold multiple facets of the culture that allowed quantum physics to flourish between 1900 and the early 1930s. I organize the article around two stories about two major textbooks: Sommerfeld's Atombau und Spektrallinien and Dirac's Principles of Quantum Mechanics. I explore the complex pedagogical cultures underlying these two masterpieces and how they intersect with local agendas.
Selective realism is the thesis that some wisely chosen theoretical posits are essential to science and can therefore be considered true or approximately true. How to choose them wisely, however, is a matter of fierce contention. Generally speaking, we should favor posits that are effectively deployed in successful predictions. In this paper, I propose a refinement of the notion of deployment, and I argue that selective realism can be extended to include the analysis of how theoretical posits are actually deployed in symbolic practices.
There are two basic approaches to the problem of induction: the empirical one, which holds that the possibility of induction depends on how the world is made (and how it works), and the logical one, which considers the formation (and function) of language. The first comes closer to being useful for induction, while the second is more rigorous and clearer. The purpose of this paper is to create an empirical approach to induction that has the same formal exactitude as the logical approach. This requires (a) that the empirical conditions for induction be enunciated, and (b) that the most important results already obtained in inductive logic be shown to remain valid. Here we deal only with induction by elimination, namely the analysis of the experimental confutation of a theory. The result will be a rule of refutation that takes into consideration all the empirical aspects of the experiment and has each of the asymptotic properties that inductive logic has shown to be characteristic of induction.
When, at the end of 1900, Planck introduced the constant h into the black-body radiation law together with the constant k, he provided no explanation of either its meaning or why it had that particular value. He simply introduced it. In reality, the history of the constant was far from straightforward. Planck was confident enough to introduce it in this way because he had been working on the question for over a year. In this paper we reconstruct the process that began with the first two constants (c' and C).
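For reference, the law in question is Planck’s radiation formula of December 1900, given here in its standard form for the spectral energy density (an aid to the reader; the abstract itself does not display it):

$$ u(\nu, T) = \frac{8\pi h \nu^3}{c^3}\, \frac{1}{e^{h\nu / kT} - 1}. $$

From fits to the radiation measurements, Planck obtained $h \approx 6.55 \times 10^{-27}\,\mathrm{erg\,s}$ and $k \approx 1.346 \times 10^{-16}\,\mathrm{erg/K}$, the values he published in 1901.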