This monograph presents new ideas in nomic truth approximation. It features original and revised papers from a philosopher of science who has studied the concept for more than 35 years. Over the course of time, the author's initial ideas evolved. He discovered a way to generalize his first theory of nomic truth approximation, viz. by dropping an unnecessarily strong assumption. In particular, he initially believed he had to assume that theories were maximally specific, in the sense that they did not only exclude certain conceptual possibilities, but also claimed that all non-excluded possibilities were in fact nomically possible. Now he argues that the exclusion claim alone, or for that matter the inclusion claim alone, is sufficient to motivate the formal definition of being closer to the nomic truth. The papers collected here detail this generalized view of nomic truthlikeness or verisimilitude. Besides this, the book presents, in adapted form, the relation of this view to several other topics, such as domain revision, aesthetic progress, abduction, inference to the best explanation, pragmatic aspects, probabilistic methods, belief revision and epistemological positions, notably constructive realism. Overall, the volume presents profound insight into nomic truth approximation. This idea seeks to determine how one theory can be closer to, or more similar to, the truth about what is nomically (e.g. physically, chemically, biologically) possible than another theory. As such, it represents the ultimate goal of theory-oriented empirical science. Theo Kuipers is the author of Studies in Inductive Probability and Rational Expectation, From Instrumentalism to Constructive Realism and Structures in Science. He is the volume editor of the Handbook on General Philosophy of Science. In 2005 two volumes of Essays in Debate with Theo Kuipers appeared, entitled Confirmation, Empirical Progress, and Truth Approximation and Cognitive Structures in Scientific Inquiry.
INTRODUCTION When Karl Popper published his definition of closer-to-the-truth, this was an important intellectual event, but not a shocking one. ...
The naive structuralist definition of truthlikeness is an idealization in the sense that it assumes that all mistaken models of a theory are equally bad. The natural concretization is a refined definition based on an underlying notion of structurelikeness. In Section 1 the naive definition of truthlikeness of theories is presented, using a new conceptual justification in terms of instantial and explanatory mistakes.
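For illustration only, here is a minimal set-theoretic sketch of a naive comparative clause of the kind described above, phrased in terms of the two kinds of mistakes; the exact clauses are an assumed reconstruction rather than a quotation from the paper, and the refined, structurelikeness-based definition is not captured:

```python
def at_least_as_truthlike(Y, X, T):
    """Naive comparison (assumed reconstruction): theory Y is at least as close to
    the nomic truth T as theory X iff Y makes no new explanatory mistakes
    (wrongly admitted models, Y - T) and no new instantial mistakes
    (wrongly excluded possibilities, T - Y)."""
    Y, X, T = set(Y), set(X), set(T)
    return (Y - T) <= (X - T) and (T - Y) <= (T - X)

# Toy example: conceptual possibilities 1..6, nomic truth T = {1, 2, 3}
T = {1, 2, 3}
X = {1, 4, 5}   # wrongly admits 4 and 5, wrongly excludes 2 and 3
Y = {1, 2, 4}   # wrongly admits only 4, wrongly excludes only 3
print(at_least_as_truthlike(Y, X, T))  # True: Y improves on X in both respects
```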
In this article I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called 'mere-exposure effect' which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular (Zajonc 1968; Temme 1983; Bornstein 1989; Ye 2000). On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my From Instrumentalism to Constructive Realism (2000). The analysis supports the findings of James McAllister in his beautiful Beauty and Revolution in Science (1996), by explaining and justifying them. First, scientists are essentially right in regarding aesthetic criteria as useful for empirical progress and even for truth approximation, provided they conceive of them as less hard than empirical criteria. Second, the aesthetic criteria of the time, the 'aesthetic canon', may well be based on 'aesthetic induction' regarding nonempirical features of paradigms of successful theories which scientists have come to appreciate as beautiful. Third, aesthetic criteria can play a crucial, schismatic role in scientific revolutions. Since they may well be wrong, they may, in the hands of aesthetic conservatives, retard empirical progress and hence truth approximation, but this does not happen in the hands of aesthetically flexible, 'revolutionary' scientists.
In this paper, we address the problem of truth approximation through theory change, asking whether revising our theories by newly acquired data leads us closer to the truth about a given domain. More particularly, we focus on “nomic conjunctive theories”, i.e., theories expressed as conjunctions of logically independent statements concerning the physical or, more generally, nomic possibilities and impossibilities of the domain under inquiry. We define both a comparative and a quantitative notion of the verisimilitude of such theories, and identify suitable conditions concerning the (partial) correctness of acquired data, under which revising our theories by data leads us closer to “the nomic truth”, construed as the target of scientific inquiry. We conclude by indicating some further developments, generalizations, and open issues arising from our results.
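By way of illustration, a sketch of a simple quantitative measure for conjunctive theories: the particular contrast measure below (true minus false conjuncts, normalized by the number of basic claims) and the toy data are assumptions made for the example, not necessarily the definitions given in the paper:

```python
def verisimilitude(theory, truth, n_claims):
    """Illustrative contrast measure for a conjunctive theory, represented as a
    dict mapping elementary statements to the truth values it claims; `truth`
    is the complete true assignment."""
    true_part = sum(1 for s, v in theory.items() if truth.get(s) == v)
    false_part = sum(1 for s, v in theory.items() if s in truth and truth[s] != v)
    return (true_part - false_part) / n_claims

truth = {'a': True, 'b': False, 'c': True, 'd': False}
t1 = {'a': True, 'b': False}            # cautious but fully correct
t2 = {'a': True, 'b': True, 'c': True}  # bolder, with one mistake
print(verisimilitude(t1, truth, 4))     # 0.5
print(verisimilitude(t2, truth, 4))     # 0.25
```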
... in philosophy, and therefore in metaphilosophy, cannot be based on rules that avoid spending time on pseudo-problems. Of course, this implies that, if one succeeds in demonstrating convincingly the pseudo-character of a problem by giving its 'solution', the time spent on it need not be seen as wasted. We conclude this section with a brief statement of the criteria for concept explication as they have been formulated in several places by Carnap, Hempel and Stegmüller. Hempel's account is still very adequate for a detailed introduction. The process of explication starts with the identification of one or more vague and, perhaps, ambiguous concepts, the so-called explicanda. Next, one tries to disentangle the ambiguities. This, however, need not be possible at once. Ultimately the explicanda are to be replaced by certain counterparts, the so-called explicata, which have to conform to four requirements. They have to be as precise as possible and as simple as possible. In addition, they have to be useful in the sense that they give rise to the formulation of theories and the solution of problems. The three requirements of preciseness, simplicity and usefulness have, of course, to be pursued in all concept formation.
In this article I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called 'mere-exposure effect' which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular (Zajonc 1968; Temme 1983; Bornstein 1989; Ye 2000). On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my From Instrumentalism to Constructive Realism (2000). The analysis supports the findings of James McAllister in his beautiful Beauty and Revolution in Science (1996), by explaining and justifying them. First, scientists are essentially right in regarding aesthetic criteria as useful for empirical progress and even for truth approximation, provided they conceive of them as less hard than empirical criteria. Second, the aesthetic criteria of the time, the 'aesthetic canon', may well be based on 'aesthetic induction' regarding nonempirical features of paradigms of successful theories which scientists have come to appreciate as beautiful. Third, aesthetic criteria can play a crucial, schismatic role in scientific revolutions. Since they may well be wrong, they may, in the hands of aesthetic conservatives, retard empirical progress and hence truth approximation, but this does not happen in the hands of aesthetically flexible, 'revolutionary' scientists. We may find totally opposite things beautiful: a simple mathematical principle as well as a series of unrepeatable complex contingencies. It is a matter of psychology. (Stephen Jay Gould, translated passage from Kayzer 2000, 30).
In section I the notions of logical and inductive probability will be discussed as well as two explicanda, viz. degree of confirmation, the base for inductive probability, and degree of evidential support, Popper's favourite explicandum. In section II it will be argued that Popper's paradox of ideal evidence is no paradox at all; however, it will also be shown that Popper's way out has its own merits.
This paper primarily deals with the conceptual prospects for generalizing the aim of abduction from the standard one of explaining surprising or anomalous observations to that of empirical progress or even truth approximation. It turns out that the main abduction task then becomes the instrumentalist task of theory revision aiming at an empirically more successful theory, relative to the available data, but not necessarily compatible with them. The rest, that is, genuine empirical progress as well as observational, referential and theoretical truth approximation, is a matter of evaluation and selection, and possibly new generation tasks for further improvement. The paper concludes with a survey of possible points of departure, in AI and logic, for computational treatment of the instrumentalist task guided by the 'comparative evaluation matrix'.
The philosophy of science has lost its self-confidence, witness the lack of advanced textbooks in contrast to the abundance of elementary textbooks. Structures in Science is an advanced textbook that explicates, updates, accommodates, and integrates the best insights of logical-empiricism and its main critics. This `neo-classical approach' aims at providing heuristic patterns for research. The book introduces four ideal types of research programs and reanimates the distinction between observational laws and proper theories. It explicates various patterns of explanation by subsumption and specification as well as structures in reductive and other types of interlevel research. Its analysis of theory evaluation leads to new characterizations of confirmation, empirical progress, and pseudoscience. Partial analogies between progress in nomological research and progress in explicative and design research emerge. Finally, special chapters are devoted to design research programs, computational philosophy of science, the structuralist approach to theories, and research ethics.
Straightforward theory revision, taking into account as effectively as possible the established nomic possibilities and the empirical laws induced on their basis, is conducive to (unstratified) nomic truth approximation. The question this paper asks is: is it possible to reconstruct the relevant theory revision steps, on the basis of incoming evidence, in AGM terms? A positive answer will be given in two rounds, first for the case in which the initial theory is compatible with the established empirical laws, then for the case in which it is incompatible with at least one such law.
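A schematic, model-theoretic rendering of the two cases distinguished above (evidence compatible vs. incompatible with the theory); the expansion/revision clauses are the usual textbook readings of AGM-style operations on model sets, much simplified, and not the paper's own reconstruction:

```python
def revise(theory_models, law_models):
    """If the theory is compatible with an established empirical law, revision
    reduces to expansion (intersecting the model sets); otherwise the theory is
    given up in favour of the law (a crude, full-meet-like fallback)."""
    expansion = theory_models & law_models
    return expansion if expansion else set(law_models)

# Conceptual possibilities 1..5; theory X and two induced empirical laws
X = {1, 2, 3}
print(revise(X, {2, 3, 4}))  # {2, 3}: compatible case, expansion suffices
print(revise(X, {4, 5}))     # {4, 5}: incompatible case, genuine revision
```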
In this article I give a naturalistic-cum-formal analysis of the relation between beauty, empirical success, and truth. The analysis is based on the one hand on a hypothetical variant of the so-called 'mere-exposure effect' which has been more or less established in experimental psychology regarding exposure-affect relationships in general and aesthetic appreciation in particular. On the other hand it is based on the formal theory of truthlikeness and truth approximation as presented in my "From Instrumentalism to Constructive Realism". The analysis supports the findings of James McAllister in his beautiful "Beauty and Revolution in Science", by explaining and justifying them. First, scientists are essentially right in regarding aesthetic criteria as useful for empirical progress and even for truth approximation, provided they conceive of them as less hard than empirical criteria. Second, the aesthetic criteria of the time, the 'aesthetic canon', may well be based on 'aesthetic induction' regarding nonempirical features of paradigms of successful theories which scientists have come to appreciate as beautiful. Third, aesthetic criteria can play a crucial, schismatic role in scientific revolutions. Since they may well be wrong, they may, in the hands of aesthetic conservatives, retard empirical progress and hence truth approximation, but this does not happen in the hands of aesthetically flexible, 'revolutionary' scientists.
While the special volumes of the series of Handbooks of the Philosophy of Science address topics relative to a specific discipline, this general volume deals ...
In earlier publications of the first author it was shown that intentional explanation of actions, functional explanation of biological traits and causal explanation of abnormal events share a common structure. They are called explanation by specification (of a goal, a biological function, an abnormal causal factor, respectively) as opposed to explanation by subsumption under a law. Explanation by specification is guided by a schematic train of thought, of which the argumentative steps not concerning questions were already shown to be logically valid (elementary) arguments. Independently, the second author developed a new, inferential approach to erotetic logic, the logic of questions. In this approach arguments resulting in questions, with declarative sentences and/or other questions as premises, are analyzed, and validity of such arguments is defined.
Assuming that the target of theory-oriented empirical science in general, and of nomic truth approximation in particular, is to characterize the boundary or demarcation between nomic possibilities and nomic impossibilities, I have presented, in my article entitled “Models, postulates, and generalized nomic truth approximation” (Synthese 193:3057–3077, 2016, 10.1007/s11229-015-0916-9), the ‘basic’ version of generalized nomic truth approximation, starting from ‘two-sided’ theories. Its main claim is that nomic truth approximation can perfectly be achieved by combining two prima facie opposing views on theories: the traditional view, on which theories are postulates that exclude certain possibilities from being realizable, enabling explanation and prediction, and the model view, on which theories are sets of models that claim to represent certain realizable possibilities. Nomic truth approximation, i.e. increasing truth-content and decreasing falsity-content, becomes in this way revising theories by revising their models and/or their postulates in the face of increasing evidence. The basic version of generalized nomic truth approximation is in many respects as simple as possible. Among other things, it does not take into account that one conceptual possibility may be more similar to another than a third one. However, one theory may, for example, include a possibility that is more similar to a wrongly not included possibility than anything another theory can offer; similarly for wrongly not excluded possibilities. In this article it will be shown that such ‘refined’ considerations can be taken into account by adapted clauses based on a ternary similarity relation between possibilities. This again allows abductive conclusions about refined truth approximation if a theory is persistently more successful in the refined sense than another. It will also be indicated and illustrated that this refined approach enables a specification to the effect that refined truth approximation can be obtained by the method of idealization and subsequent concretization. Finally, the basic and the refined approach will be evaluated with regard to some general principles and objections that have been discussed in the literature.
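A sketch of the basic (non-refined) comparative clause for two-sided theories, on the assumption that it amounts to symmetric-difference comparison of the model sets and of the postulate sets separately; this is a paraphrase for illustration, and the refined, similarity-sensitive clauses are not captured:

```python
def sym_diff(A, B):
    return (A - B) | (B - A)

def basic_closer_to_truth(theory2, theory1, T):
    """Assumed basic clause: theory_i = (M_i, P_i), with models M_i and postulates
    P_i (M_i a subset of P_i); theory2 is at least as close to the nomic truth T
    as theory1 iff both its model set and its postulate set deviate from T no
    more than theory1's do."""
    (M2, P2), (M1, P1) = theory2, theory1
    return sym_diff(M2, T) <= sym_diff(M1, T) and sym_diff(P2, T) <= sym_diff(P1, T)

T = {1, 2, 3}                  # the nomic possibilities
old = ({1}, {1, 2, 4, 5})      # (models, postulates)
new = ({1, 2}, {1, 2, 3, 4})
print(basic_closer_to_truth(new, old, T))  # True
```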
Standard accounts of the micro-reduction of phenomenological to kinetic thermostatics, based on the postulate relating empirical absolute temperature to mean kinetic energy ū=(3/2)kT, face two problems. The standard postulate also allows 'reduction' in the other direction and it can be criticized from the point of view that reduction postulates need to be ontological identities. This paper presents a detailed account of the reduction, based on the postulate that thermal equilibrium is ontologically identical to having equal mean kinetic energy. In particular, it is shown that this postulate enables reduction only in the appropriate direction, but leaves room for 'evidence transport' in the other. Moreover, it makes possible the derivation (explanation) of the standard postulate, using the existential kinetic hypothesis and phenomenological laws with which it turns out to be laden.
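For concreteness, the bridge postulate mentioned above equates mean translational kinetic energy per molecule with (3/2)kT; a small numerical check (the temperature is just an example value):

```python
k_B = 1.380649e-23  # Boltzmann constant in J/K

def mean_kinetic_energy(T_kelvin):
    """u_bar = (3/2) k T, the standard kinetic-theory bridge postulate."""
    return 1.5 * k_B * T_kelvin

print(mean_kinetic_energy(300.0))  # about 6.2e-21 J at room temperature
```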
Three related intuitions are explicated in this paper. The first is the idea that there must be some kind of probabilistic version of the HD-method, a ‘Hypothetico-Probabilistic (HP-) method’, in terms of something like probabilistic consequences, instead of deductive consequences. According to the second intuition, the comparative application of this method should also be functional for some probabilistic kind of empirical progress, and according to the third intuition this should be functional for something like probabilistic truth approximation. In all three cases, the guiding idea is to explicate these intuitions by explicating the crucial notions as appropriate ‘concretizations’ of their deductive analogs, being ‘idealizations’. It turns out that the comparative version of the proposed HP-method amounts to the likelihood comparison (LC-) method applied to the cumulated evidence. This method turns out to be not only functional for probabilistic empirical progress but also for probabilistic truth approximation. The latter is based on a probabilistic threshold theorem constituting for this reason the analog of the deductive success theorem.
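The likelihood comparison (LC-) idea can be illustrated with a generic Bernoulli example; the hypotheses and outcomes below are invented for the illustration, and the cumulated likelihood ratio is only the comparative core, not the full HP-method of the paper:

```python
from math import prod

def cumulated_likelihood(p_success, outcomes):
    """Likelihood of a simple Bernoulli hypothesis on the cumulated evidence."""
    return prod(p_success if o else 1 - p_success for o in outcomes)

outcomes = [True, True, False, True, True, True, False, True]
L_h1 = cumulated_likelihood(0.8, outcomes)  # hypothesis: success chance 0.8
L_h2 = cumulated_likelihood(0.5, outcomes)  # hypothesis: success chance 0.5
print(L_h1 > L_h2)  # True: h1 is more successful in the likelihood sense
```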
This paper supplies a structuralist reconstruction of the Modigliani-Miller theory and shows that the economic literature following their results reports on research with an implicit strategy to come "closer-to-the-truth" in the modern technical sense in philosophy of science.
The structuralist theory of truth approximation essentially deals with truth approximation by theory revision for a fixed domain. However, variable domains can also be taken into account, where the main changes concern domain extensions and restrictions. In this paper I will present a coherent set of definitions of “more truth-likeness”, “empirical progress” and “truth approximation” due to a revision of the domain of intended applications. This set of definitions seems to be the natural counterpart of the basic definitions of similar notions as far as theory revision is concerned. The formal aspects of theory revision strongly suggest an analogy between truth approximation and design research, for example, drug research. Whereas a new drug may be better for a certain disease than an old one, a certain drug may also be better for another disease than for the original target disease, a phenomenon nicely captured by the title of a study by Rein Vos [1991]: Drugs Looking for Diseases. Similarly, truth approximation may take the shape not only of theory revision but also of domain revision, naturally suggesting the phenomenon of “Theories looking for domains”. However, whereas Vos documented his title with a number of examples, so far, apart from plausible cases of “truth accumulation by domain extension”, I have not found clear-cut empirical instantiations of the analogy, only non-empirical examples that are, as such, very interesting.
The main formal notion involved in qualitative truth approximation by the HD-method, viz. ‘more truthlike’, is shown to not only have, by its definition, an intuitively appealing ‘model foundation’, but also, at least partially, a conceptually plausible ‘consequence foundation’. Moreover, combining the relevant parts of both leads to a very appealing ‘dual foundation’, the more so since the relevant methodological notions, viz. ‘more successful’ and its ingredients provided by the HD-method, can be given a similar dual foundation. According to the resulting dual foundation of ‘naive truth approximation’, the HD-method provides successes (established true consequences) and counterexamples (established wrongly missing models) of theories. Such HD-results may support the tentative conclusion that one theory seems to remain more successful than another in the naive sense of having more successes and fewer counterexamples. If so, this provides good reasons for believing that the more successful theory is also more truthlike in the naive sense of having more correct models and more true consequences.
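A toy rendering of the naive 'more successful' comparison in the dual sense sketched above; the representation of successes as established general consequences (model-set inclusion) and of counterexamples as realized possibilities missed by a theory is schematic and assumed for the example:

```python
def more_successful(theory2, theory1, laws, realized):
    """theory2 is at least as successful as theory1 iff it entails at least the
    established laws (successes) theory1 entails and misses at most the realized
    possibilities (counterexamples) theory1 misses; theories and laws are sets
    of admitted possibilities, entailment is subset-hood."""
    entailed2 = {i for i, law in enumerate(laws) if theory2 <= law}
    entailed1 = {i for i, law in enumerate(laws) if theory1 <= law}
    return entailed1 <= entailed2 and (realized - theory2) <= (realized - theory1)

laws = [{1, 2, 3, 4}, {1, 2, 5}]   # established true consequences
realized = {1, 2}                  # established (realized) possibilities
X, Y = {1, 3, 6}, {1, 2, 3}
print(more_successful(Y, X, laws, realized))  # True
```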
Design research programs attempt to bring together the properties of available materials and the demands derived from intended applications. The logic of problem states and state transitions in such programs, including assessment criteria and heuristic principles, is described in set-theoretic terms, starting with a naive model comprising an intended profile and the operational profile of a prototype. In a first concretization the useful distinction between structural and functional properties is built into the model. In two further concretizations the inclusion of potential applications is motivated and described for the case of drug research, as well as the inclusion of potential realizations for the case of complex products. Next, another line of concretization of the naive model, the incorporation of potentially relevant properties, is sketched. Then the partial analogy between product- and truth-approximation is indicated. We conclude with some remarks about the usefulness of our models for products reaching the market, in comparison to the so-called social construction of technology approach.
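The naive model of an intended profile versus the operational profile of a prototype invites a comparison analogous to the truthlikeness one; the clauses below are an illustrative reconstruction of that analogy, not definitions taken from the paper:

```python
def better_prototype(proto2, proto1, intended):
    """proto2 is at least as good as proto1 relative to the intended profile iff
    it lacks no desired property that proto1 realizes and realizes no undesired
    property that proto1 avoids (profiles as sets of properties)."""
    return (intended - proto2) <= (intended - proto1) and \
           (proto2 - intended) <= (proto1 - intended)

intended = {"stable", "cheap", "recyclable"}
p1 = {"stable", "toxic"}
p2 = {"stable", "cheap", "toxic"}
print(better_prototype(p2, p1, intended))  # True
```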
Three related intuitions are explicated in this paper. The first is the idea that there must be some kind of probabilistic version of the HD-method, a 'Hypothetico-Probabilistic method', in terms of something like probabilistic consequences, instead of deductive consequences. According to the second intuition, the comparative application of this method should also be functional for some probabilistic kind of empirical progress, and according to the third intuition this should be functional for something like probabilistic truth approximation. In all three cases, the guiding idea is to explicate these intuitions by explicating the crucial notions as appropriate 'concretizations' of their deductive analogs, being 'idealizations'. It turns out that the comparative version of the proposed HP-method amounts to the likelihood comparison method applied to the cumulated evidence. This method turns out to be not only functional for probabilistic empirical progress but also for probabilistic truth approximation. The latter is based on a probabilistic threshold theorem constituting for this reason the analog of the deductive success theorem.
Table of Contents: Andrzej KLAWITER, Krzysztof ŁASTOWSKI: Introduction: Originality, Courage and Responsibility. List of Books by Leszek Nowak. Selected Bibliography of Leszek Nowak's Writings. Science and Idealization. Theo A.F. KUIPERS: On Two ...
In this paper it is shown that there is a natural way of dealing with analogy by similarity in inductive systems by extending intuitive ways of introduction of systems without analogy. This procedure leads to Carnap-like systems, with zero probability for contingent generalizations, satisfying a general principle of so-called virtual analogy. This new principle is different from, but compatible with, Carnap's principle. It will be shown that the latter principle is satisfied, and should only be satisfied, if the underlying distance function is such that all predicates have the same "predicate-environment". Finally, the claim that the new systems have the property of order indifference only with respect to the past will be defended.
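The underlying Carnapian machinery (without the analogy component) is the familiar lambda-continuum; the snippet below shows only that standard predictive rule, not the paper's analogy-by-similarity refinement:

```python
def carnap_lambda(counts, lam=2.0):
    """Carnap's lambda-continuum: predictive probability that the next individual
    falls under predicate i, given n_i observed instances out of n, for k
    predicates and parameter lambda."""
    k, n = len(counts), sum(counts)
    return [(n_i + lam / k) / (n + lam) for n_i in counts]

print(carnap_lambda([7, 2, 1]))  # roughly [0.64, 0.22, 0.14]
```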
This collection of 17 articles offers an overview of the philosophical activities of a group of philosophers working at the Groningen University. The meta-methodological assumption which unifies the research of this group holds that there is a way to do philosophy which is a middle course between abstract normative philosophy of science and descriptive social studies of science. On the one hand it is argued, with social studies of science, that philosophy should take notice of what scientists actually do. On the other hand, however, it is claimed that philosophy can and should aim to reveal cognitive patterns in the processes and products of scientific and common sense knowledge. Since it is thought that those patterns can function as guidelines in new research and/or in research in other disciplines, philosophy can nevertheless hold on to the normative aim which is characteristic of 'classical' philosophy of science. Compared to this common assumption, there is a diversity of subjects. Some papers deal with general problems of science, knowledge, cognition and argumentation, others with topics relating to foundational problems of particular sciences. Therefore this volume is of interest to philosophers of science, to philosophers of knowledge and argumentation in general, to philosophers of mind, as well as to scientists working in the physical and applied sciences, biology, psychology and economics who are interested in the foundations of their disciplines. After a foreword by Leszek Nowak and a general introduction by the editors, the book is divided into four parts, with special introductions: I: CONCEPTUAL ANALYSIS IN SERVICE OF VARIOUS RESEARCH PROGRAMMES; II: THE LOGIC OF THE EVALUATION OF ARGUMENTS, HYPOTHESES, DEFAULT RULES, AND INTERESTING THEOREMS; III: THREE CHALLENGES TO THE TRUTH APPROXIMATION PROGRAMME; IV: EXPLICATING PSYCHOLOGICAL INTUITIONS. The Groningen research group was recently qualified, by an official international assessment committee, as one of the best philosophy research groups in the Netherlands.
The philosophy of science has lost its self-confidence. Structures in Science (2001) is an advanced textbook that explicates, updates and integrates the best insights of logical empiricism and its main critics. This "neo-classical approach" aims at providing heuristic patterns for research. The book introduces four ideal types of research programs (descriptive, explanatory, design and explicative) and reanimates the distinction between observational laws and proper theories without assuming a theory-free language. It explicates various patterns of explanation by subsumption and specification as well as structures in reductive and other types of interlevel research. Its threefold analysis of theory evaluation leads to new characterizations of confirmation, empirical progress, and truth approximation. What emerges are partial analogies between progress in nomological research, presented in detail in From Instrumentalism to Constructive Realism (2000), and progress in explicative and design research. Finally, special chapters are devoted to design research programs, computational philosophy of science, the structuralist approach to theories, and research ethics. The present synopsis of Structures in Science highlights the main topics, the final emphasis being on design research and research ethics.
It is argued that the conjunction effect has a disjunctive analog of strong interest for the realism–antirealism debate. It is possible that a proper theory is more confirmed than its (more probable) observational sub-theory and hence than the latter’s disjunctive equivalent, i.e., the disjunction of all proper theories that are empirically equivalent to the given one. This is illustrated by a toy model.
Surprisingly enough, modified versions of the confirmation theory of Carnap and Hempel and the truth approximation theory of Popper turn out to be smoothly synthesizable. The glue between confirmation and truth approximation appears to be the instrumentalist methodology, rather than the falsificationist one. By evaluating theories separately and comparatively in terms of their successes and problems (hence even if they are already falsified), the instrumentalist methodology provides – both in theory and in practice – the straight route for short-term empirical progress in science in the spirit of Laudan. However, it is argued that such progress is also functional for all kinds of truth approximation: observational, referential, and theoretical. This sheds new light on the long-term dynamics of science and hence on the relation between the main epistemological positions, viz., instrumentalism (Toulmin, Laudan), constructive empiricism (van Fraassen), referential realism (Hacking and Cartwright), and theory realism of a non-essentialist nature (Popper), here called constructive realism. In From Instrumentalism to Constructive Realism (2000) the above story is presented in great detail. The present synopsis highlights the main ways of theory evaluation presented in that book, viz. evaluation in terms of confirmation (or falsification), empirical progress and truth approximation.
This paper primarily deals with the conceptual prospects for generalizing the aim of abduction from the standard one of explaining surprising or anomalous observations to that of empirical progress or even truth approximation. It turns out that the main abduction task then becomes the instrumentalist task of theory revision aiming at an empirically more successful theory, relative to the available data, but not necessarily compatible with them. The rest, that is, genuine empirical progress as well as observational, referential and theoretical truth approximation, is a matter of evaluation and selection, and possibly new generation tasks for further improvement. The paper concludes with a survey of possible points of departure, in AI and logic, for computational treatment of the instrumentalist task guided by the ‘comparative evaluation matrix’.
I sketch the most important epistemological positions in the instrumentalism-realism debate, viz., instrumentalism, constructive empiricism, referential realism, and theory realism. I order them according to their answers to a number of successive leading questions, where every next question presupposes an affirmative answer to the foregoing one. I include the answers to questions concerning truth, as well as the most plausible answers to questions concerning truth approximation. Restricting my survey to the natural sciences and hence to the natural world, I indicate the implications of the results of the study of empirical progress and truth approximation for the way these epistemological positions are related. I conclude that there are good reasons for the instrumentalist to become a constructive empiricist; that, in order to give deeper explanations of success differences, the constructive empiricist is forced to become a referential realist; and that there are good reasons for the referential realist to become a theory realist of a non-essentialist nature, here called a constructive realist.
Theories of truth approximation in terms of truthlikeness almost always deal with approaching deterministic truths, either actual or nomic. This paper deals first with approaching a probabilistic nomic truth, viz. a true probability distribution. It assumes a multinomial probabilistic context, hence with a lawlike true, but usually unknown, probability distribution. We will first show that this true multinomial distribution can be approached by Carnapian inductive probabilities. Next we will deal with the corresponding deterministic nomic truth, that is, the set of conceptually possible outcomes with a positive true probability. We will introduce Hintikkian inductive probabilities, based on a prior distribution over the relevant deterministic nomic theories and on conditional Carnapian inductive probabilities, and first show that they enable again probabilistic approximation of the true distribution. Finally, we will show, in terms of a kind of success theorem, based on Niiniluoto’s estimated distance from the truth, in what sense Hintikkian inductive probabilities enable the probabilistic approximation of the relevant deterministic nomic truth. In sum, the truth approximation perspective on Carnapian and Hintikkian inductive probabilities leads to the unification of the inductive probability field and the field of truth approximation.
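A simulation-style sketch of the first step described above: Carnapian predictive probabilities approaching a true multinomial distribution as evidence accumulates. The true distribution and sample size are invented for the illustration, and the Hintikkian prior over deterministic nomic theories is not modelled:

```python
import random

def carnap_predictive(counts, lam=2.0):
    """Carnapian predictive probabilities for the observed category counts."""
    k, n = len(counts), sum(counts)
    return [(c + lam / k) / (n + lam) for c in counts]

random.seed(0)
true_dist = [0.5, 0.3, 0.2, 0.0]   # the fourth outcome is nomically impossible
counts = [0, 0, 0, 0]
for _ in range(1000):
    r, acc = random.random(), 0.0
    for i, p in enumerate(true_dist):
        acc += p
        if r < acc:
            counts[i] += 1
            break

# The estimates approach the true distribution, while the impossible outcome
# keeps a small positive probability -- the Carnapian feature noted above.
print(carnap_predictive(counts))
```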