The ideal of value-free science states that the justification of scientific findings should not be based on non-epistemic (e.g. moral or political) values. It has been criticized on the grounds that scientists have to employ moral judgements in managing inductive risks. The paper seeks to defuse this methodological critique. Allegedly value-laden decisions can be systematically avoided, it argues, by making uncertainties explicit and articulating findings carefully. Such careful uncertainty articulation, understood as a methodological strategy, is exemplified by the current practice of the Intergovernmental Panel on Climate Change (IPCC).
The book brings together research in formal epistemology and argumentation theory. By means of multi-agent simulations, it investigates the truth- and consensus-conduciveness of controversial debates.
Climate models don’t give us probabilistic forecasts. Interpreting their results instead as serious possibilities seems problematic inasmuch as climate models rely on contrary-to-fact assumptions: why should we consider their implications as possible if their assumptions are known to be false? The paper explores a way to address this possibilistic challenge. It introduces the concepts of a perfect and of an imperfect credible world, and discusses whether climate models can be interpreted as imperfect credible worlds. That would allow one to use models for possibilistic prediction and salvage widespread scientific practice.
Where opinions clash, where understanding is sought and persuasion attempted, justifications are never far away. For every conviction there are always one or two reasons, which are confronted with counter-reasons and, in turn, defended with further considerations, and so on. We quickly become confused and, unless we recall the "grammatical rules" of reasonable argumentation, risk losing track altogether. The theory of dialectical structures contributes to this grammar of reasonable argumentation. It provides concepts and procedures for answering questions that arise in the face of a complex argumentation: How do the individual arguments of the debate relate to one another? Which positions do the various proponents hold, and how can these be evaluated? How well justified, all things considered, is the central thesis of the controversy? Is a given argumentative move actually expedient and rational given the state of the debate? Is the argumentation at hand circular, and thus flawed? Classical logic, so-called informal logic, and argumentation-theoretic approaches from artificial intelligence serve as points of departure for the theory of dialectical structures. By way of example, the argumentation-theoretic methods developed are applied to the analysis of extensive philosophical controversies.
This article discusses how inference to the best explanation (IBE) can be justified as a practical meta-argument. It is, firstly, justified as a practical argument insofar as accepting the best explanation as true can be shown to further a specific aim. And because this aim is a discursive one which proponents can rationally pursue in, and relative to, a complex controversy, namely maximising the robustness of one’s position, IBE can be conceived, secondly, as a meta-argument. My analysis thus bears a certain analogy to Sellars’ well-known justification of inductive reasoning; it is based on recently developed theories of complex argumentation.
As climate policy decisions are decisions under uncertainty, being based on a range of future climate change scenarios, it becomes a crucial question how to set up this scenario range. By failing to comply with the precautionary principle, the scenario methodology widely used in the Third Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) seems to violate international environmental law, in particular a provision of the United Nations Framework Convention on Climate Change. Placing climate policy advice on a sound methodological basis would imply that climate simulations based on complex climate models have, in stark contrast to their current hegemony, hardly any epistemic role to play in climate scenario analysis at all. Their main function might actually consist in ‘foreseeing future ozone-holes’. In order to argue for these theses, I first explain the plurality of climate models used in climate science by the failure to avoid the problem of underdetermination. As a consequence, climate simulation results have to be interpreted as modal sentences, stating what is possibly true of our climate system. This indicates that climate policy decisions are decisions under uncertainty. Two general methodological principles which may guide the construction of the scenario range are formulated and contrasted with each other: modal inductivism and modal falsificationism. I argue that modal inductivism, the methodology implicitly underlying the third IPCC report, is severely flawed. Modal falsificationism, the sound alternative, would in turn require an overhaul of the IPCC practice.
This paper gives an explication of our intuitive notion of strength of justification in a controversial debate. Within the bipolar argumentation framework of the theory of dialectical structures, it defines a thesis’ degree of justification as the ratio of coherently adoptable positions according to which that thesis is true over all coherently adoptable positions. Broadening this definition, the notion of conditional degree of justification, i.e. degree of partial entailment, is introduced. Degrees of justification thus defined correspond to our pre-theoretic intuitions in the sense that supporting and defending a thesis t increases, whereas attacking it decreases, t’s degree of justification. Moreover, it is shown that (conditional) degrees of justification are (conditional) probabilities. Eventually, the paper explains that it is rational to believe theses with a high degree of justification insofar as this strengthens the robustness of one’s position.
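The core ratio can be illustrated with a small, self-contained sketch. This is our own toy model, not the paper's actual framework: positions are modelled crudely as truth-value assignments over a few sentences, and deductively valid arguments act as coherence constraints (accepting all premises commits one to the conclusion). The sentences and arguments below are purely hypothetical.

```python
from itertools import product

# Toy sketch (not Betz's implementation): a "position" assigns True/False
# to each sentence; each argument (premises, conclusion) is a coherence
# constraint: if all premises are accepted, the conclusion must be too.
sentences = ["t", "p", "q"]
arguments = [
    (["p"], "t"),   # hypothetical argument: p supports the thesis t
    (["q"], "p"),   # hypothetical argument: q supports p
]

def coherent(position):
    """A position is coherent iff it violates no argument-constraint."""
    return all(
        not all(position[s] for s in prem) or position[concl]
        for prem, concl in arguments
    )

positions = [
    dict(zip(sentences, vals))
    for vals in product([True, False], repeat=len(sentences))
]
coherent_positions = [pos for pos in positions if coherent(pos)]

# Degree of justification of t:
# coherent positions with t true / all coherent positions
doj = sum(pos["t"] for pos in coherent_positions) / len(coherent_positions)
print(doj)  # → 0.75
```

Note how the ratio behaves like a probability over the space of coherent positions, in line with the paper's claim that degrees of justification are probabilities.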
Frank Knight (1921) famously distinguished the epistemic modes of certainty, risk, and uncertainty in order to characterize situations where deterministic, probabilistic or possibilistic foreknowledge is available. Because our probabilistic knowledge is limited, i.e. because many systems, e.g. the global climate, cannot be described and predicted probabilistically in a reliable way, Knight’s third category, possibilistic foreknowledge, is not simply absorbed by the probabilistic mode. This raises the question of how to justify possibilistic predictions, including the identification of the worst case. The development of such a modal methodology is particularly vital with respect to predictions of climate change. I show that a methodological dilemma emerges when possibilistic predictions are framed in traditional terms and argue that a more nuanced conceptual framework, distinguishing different types of possibility, should be used in order to convey our uncertain knowledge about the future. The new conceptual scheme, however, questions the applicability of standard rules of rational decision-making, thus generating new challenges.
We use recently developed approaches in argumentation theory to revamp the hypothetico-deductive (H-D) model of confirmation, thus alleviating the well-known paradoxes the H-D account faces. More specifically, we introduce the concept of dialectic confirmation against the background of the so-called theory of dialectical structures (Betz 2010, 2012b). Dialectic confirmation generalises hypothetico-deductive confirmation and mitigates the raven paradox, the grue paradox, the tacking paradox, the paradox from conceptual difference, and the problem of surprising evidence.
Gregor Betz explores the following questions: Where are the limits of economics, in particular the limits of economic foreknowledge? Are macroeconomic forecasts credible predictions or mere prophecies and what would this imply for the way economic policy decisions are taken? Is rational economic decision making possible without forecasting at all?
This paper develops concepts and procedures for the evaluation of complex debates. They provide means for answering such questions as whether a thesis has to be considered as proven or disproven in a debate or who carries a burden of proof. While being based on classical logic, this framework represents an (argument-based) approach to non-monotonic, or defeasible reasoning. Debates are analysed as dialectical structures, i.e. argumentation systems with an attack- as well as a support-relationship. The recursive status assignment over the arguments is conditionalised on proponents in a debate. The problem of multiple status assignments arising on circular structures is solved by showing that uniqueness can be guaranteed qua reconstruction of a debate. The notion of burden of proof as well as other discursive aims rational proponents pursue in a debate is defined within the framework.
This study investigates the ethical aspects of deploying, and researching into, so-called climate engineering (CE) methods, i.e. large-scale technical interventions in the climate system with the objective of offsetting anthropogenic climate change. The moral reasons for and against R&D into, and deployment of, CE methods are analysed by means of argument maps. These argument maps provide an overview of the CE controversy and help to structure the complex debate.
This article sets up a graph-theoretical framework for argumentation-analysis (dialectical analysis) which expands classical argument-analysis. Within this framework, a main theorem on the existence of inconsistencies in debates is stated and proved: the vicious circle theorem. Subsequently, two corollaries which generalize the main theorem are derived. Finally, a brief outlook is given on further expansions and possible applications of the developed framework.
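The flavour of such a graph-theoretical treatment can be conveyed by a minimal sketch, under strong simplifying assumptions of our own: a debate is reduced to a plain directed graph (nodes are arguments, edges are support or attack relations), and a "vicious circle" is detected as an ordinary directed cycle via depth-first search. The framework's actual definitions and theorems are of course richer than this toy model.

```python
# Illustrative sketch only, not the article's formal framework:
# find a directed cycle in a debate graph via depth-first search.
def find_cycle(graph):
    """Return one directed cycle as a list of nodes, or None if acyclic."""
    WHITE, GREY, BLACK = 0, 1, 2          # unvisited / on stack / done
    colour = {v: WHITE for v in graph}
    stack = []

    def dfs(v):
        colour[v] = GREY
        stack.append(v)
        for w in graph.get(v, []):
            if colour[w] == GREY:          # back edge: circle found
                return stack[stack.index(w):]
            if colour[w] == WHITE:
                cycle = dfs(w)
                if cycle:
                    return cycle
        colour[v] = BLACK
        stack.pop()
        return None

    for v in graph:
        if colour[v] == WHITE:
            cycle = dfs(v)
            if cycle:
                return cycle
    return None

# Hypothetical debate: a1 supports a2, a2 supports a3, a3 attacks a1.
debate = {"a1": ["a2"], "a2": ["a3"], "a3": ["a1"]}
print(find_cycle(debate))  # → ['a1', 'a2', 'a3']
```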
With the evidence for anthropogenic climate change piling up, suggesting that climate impacts of GHG emissions might have been underestimated in the past (Allison et al. 2009; WBGU 2009), and with mitigation policies apparently lagging behind what many scientists consider necessary reductions to prevent dangerous climate change, the debate about intentional climate change, or “climate engineering”, as we shall say in the following, has gained momentum in recent years. While efforts to technically modify earth’s climate had been the focus of sporadic discussions at least since the White House’s report “Restoring the Quality of Our Environment” (cf. Keith 2000), Paul Crutzen’s cautious plea for research into the feasibility and side-effects of stratospheric sulphur injections (Crutzen 2006) has incited an inter-disciplinary controversy (with a preliminary culmination in the Royal Society’s assessment (Royal Society 2009)), while also increasing public awareness of, and debate about, climate engineering. The controversy, though, does not focus on the question whether climate engineering should be carried out today (which is largely reckoned to be a bad idea, unnecessary, or premature) or at some point in the future (which is considered a decision we don’t have to take now), but on whether to engage in large-scale research into the alternative technological options for carrying out intentional climate change. It is this paper’s purpose to make that controversy more transparent. In order to do so, we analyse what seems to be the major argument in favour of research into climate engineering: the lesser-evil argument or, as Stephen Gardiner has called it, the arm-the-future argument, in short: the AF-argument (Gardiner 2010).
Such an argumentative analysis makes explicit the normative and descriptive assumptions which underlie the reasoning, without ascertaining or denying them, and thus enables one to assess the overall strength of the argument as well as to determine which objections do, and which don’t, undermine it.
This paper shows how complex argumentation, analyzed as dialectical structures, can be evaluated within a Bayesian framework by interpreting them as coherence constraints on subjective degrees of belief. A dialectical structure is a set of arguments (premiss-conclusion structure) among which support- and attack-relations hold. This approach addresses the observation that some theses in a debate can be better justified than others and thus fixes a shortcoming of a theory of defeasible reasoning which applies the bivalence principle to argument evaluations by assigning them the status of being either defeated or undefeated. Evaluation procedures which are based on the principle of bivalence can, however, be embedded as a special case within the Bayesian framework. The approach developed in this paper rests on the assumptions that arguments can be reconstructed as deductively valid and that complex argumentation can be reconstructed such that premisses of arguments with equivalent conclusions are pairwise independent.
Science advances by means of argument and debate. Based on a formal model of complex argumentation, this article assesses the interplay between evidential and inferential drivers in scientific controversy, and explains, in particular, why both evidence accumulation and argumentation are veritistically valuable. By improving the conditions for applying veritistic indicators, novel evidence and arguments allow us to distinguish true from false hypotheses more reliably. Because such veritistic indicators also underpin inductive reasoning, evidence accumulation and argumentation enhance the reliability of inductive inference, for example, inference to the best explanation.
Predictive success as an aim of science -- On the very possibility of prediction in the social sciences -- Empirical facts about social prediction: its mode, object and performance -- Understanding poor forecast performance.
This paper investigates to what extent a theory of dialectical structures sheds new light on the old problem of giving a satisfying account of the fallacy of petitio principii, or begging the question. It argues (i) that circular argumentation on the one hand and petitio principii on the other are two distinct features of complex argumentation, and (ii) that it is impossible to make general statements about the defectiveness of an argumentation that exhibits these features. Such an argumentation, in contrast, has to be evaluated on a case-by-case basis. “Petitio principii”, this paper therefore suggests, is one name for, in fact, a multitude of different and quite complex dialectical situations which require specific analysis and evaluation.
Climate policy decisions are decisions under uncertainty and are, therefore, based on a range of future climate scenarios describing possible consequences of alternative policies. Accordingly, the methodology for setting up such a scenario range becomes pivotal in climate policy advice. The preferred methodology of the Intergovernmental Panel on Climate Change is characterised as “modal verificationism”; it suffers from severe shortcomings which disqualify it for scientific policy advice. Modal falsificationism, a sounder alternative, would radically alter the way the climate scenario range is set up. Climate science’s inability to find robust upper bounds for future temperature rise in line with modal falsificationism does not disprove that methodology; rather, this very fact prescribes even more drastic efforts to curb CO2 emissions than currently proposed.
The Stern Review on the Economics of Climate Change, published in 2006, is a highly influential welfare analysis of climate policy measures. This paper identifies and systematically assesses the long-term socioeconomic and climatic predictions the Stern Review relies on, and reflects on them philosophically. Being a cost-benefit analysis, the Stern Review has to predict the benefits of climate mitigation policies, i.e. the damaging consequences of climate change which might be avoided, as well as the costs of implementing such policies. Distinguishing deterministic, probabilistic, and possibilistic forecasts, this paper finds that the Review’s major predictions severely suffer from a lack of robustness. It argues, moreover, that the use of subjective probabilities as well as the fact/value entanglement pose additional problems. Given our ignorance, this assessment finally raises the question of how detailed an analysis of climate policy decisions should reasonably be at all, and whether the argument for acting against climate change is perhaps very simple.
Epistemic trust figures prominently in our socio-cognitive practices. By assigning different degrees of competence to agents, we distinguish between experts and novices and determine the trustworthiness of testimony. This paper probes the claim that epistemic trust furthers our epistemic enterprise. More specifically, it assesses the veritistic value of competence attribution in an epistemic community, i.e., in a group of agents that collaboratively seek to track down the truth. The results, obtained by simulating opinion dynamics, tend to subvert the very idea that competence ascription is essential for the functioning of epistemic collaboration and hence veritistically valuable. On the contrary, we find that, in specific circumstances at least, epistemic trust may prevent a community from finding the truth effectively.
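How competence ascription can steer a group away from the truth may be illustrated by a schematic sketch of our own (DeGroot-style weighted averaging, not the paper's actual simulation model): agents repeatedly average the opinions of those they trust, and if everyone ascribes high competence to one "expert" whose opinion is off the mark, the group converges on the expert's error. All numbers below are hypothetical.

```python
# Toy illustration (our own sketch, not the simulations reported in the
# paper): opinions lie in [0, 1]; the true value is taken to be 0.7.
TRUTH = 0.7

def update(opinions, weights):
    """One round of trust-weighted opinion averaging (DeGroot-style)."""
    n = len(opinions)
    return [
        sum(weights[i][j] * opinions[j] for j in range(n))
        for i in range(n)
    ]

opinions = [0.6, 0.7, 0.8, 0.2]   # agent 3 is the confident "expert"
# Competence-based trust: the three lay agents give the expert weight 0.7;
# the expert listens only to herself, so her (false) opinion never moves.
trust = [[0.1, 0.1, 0.1, 0.7]] * 3 + [[0.0, 0.0, 0.0, 1.0]]

for _ in range(50):
    opinions = update(opinions, trust)

print([round(x, 2) for x in opinions])  # group ends up near 0.2, far from 0.7
```

With equal trust weights and no self-anchored expert, the same dynamics would instead average out toward the lay agents' initial spread around the truth.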
Philosophy of science attempts to reconstruct science as a rational cognitive enterprise. In doing so, it depicts a normative ideal of knowledge acquisition and does not primarily seek to describe actual scientific practice in an empirically adequate way. A comprehensive picture of what good science consists in may serve as a standard against which we evaluate and criticize actual scientific practices. Such a normative picture may also explain why it is reasonable for us to trust scientists – to the extent that they live up to the ideal – and to rely on their findings in decision making. Likewise, a sound normative understanding of science exposes the limits of scientific understanding and prevents us from placing blind faith in scientists and experts. For these reasons, philosophy of science represents a useful resource and background theory for the practice and study of science communication. In this handbook article, we provide an opinionated introduction to philosophy of science by shedding light on 22 central issues which (we think) are of special interest to scholars and practitioners of science communication – and, in particular, to scholars and practitioners of external science communication.
In this contribution I want to show why the 2-degree target of international climate policy represents a reasonable way of dealing with fuzzy boundaries. I will first sketch the considerations from which the 2-degree target emerged and how it found its way into international climate policy. I will then argue that traditional methods of decision analysis (cost-benefit analysis, CBA) cannot be applied to climate policy questions without difficulty. The guardrail approach, developed as an alternative to CBA, avoids such difficulties. Within the guardrail approach, I will argue, the 2-degree target can be cogently justified as a climate guardrail. Finally, I will draw attention to unjustified criticism of the 2-degree target.
‘Climate engineering’ refers to large-scale technical interventions in the climate system that aim to compensate for anthropogenic climate change. Alongside mitigation and adaptation measures, climate engineering methods thus form a third category of possible responses to anthropogenic climate change.
Descartes' "Meditations" are perhaps 'the' classic of philosophy. They address fundamental questions: What kinds of objects occur in the world? What kind of thing am I? Am I free? What is truth? What is the status of logical truths or mathematical theorems? What can I know? Gregor Betz's systematic commentary reconstructs the corresponding lines of thought and justifications and attempts to answer Descartes' questions. Other philosophers, particularly of the 20th century, are also brought into dialogue with Descartes. For, ultimately, to understand a philosophical position also means to survey "which further theses it implies and with which other positions it comes into conflict", as the volume's author puts it.
This contribution discusses Oskar Morgenstern's thesis of the impossibility of economic forecasting. After a critical reconstruction of Morgenstern's arguments, this thesis is rejected in its strong, a priori reading. The results of empirical forecast evaluations, however, allow Morgenstern's considerations to be reinterpreted as contingent explanations of the failure of macroeconomic predictions. The contribution therefore closes with a provocative conclusion that Morgenstern himself already drew: the demand to abandon attempts at macroeconomic forecasting.
Whoever philosophizes, argues. The volume brings together contributions on argumentation theory, epistemology, philosophy of science, existential philosophy, philosophy of religion, and metaphilosophy. It shows that theoretical questions, too, are of practical significance for our lives.
In the autumn of the year 20–, rumours spread that, contrary to the assurances of leading particle physicists, stable black holes have been produced at the CERN research centre in Geneva. Looting breaks out in many places. Numerous companies and public employers report that a considerable share of their workforce has failed to show up for work. Around the globe, people confronted with the dire news from Geneva ask themselves: is the end of the world really imminent? On the radio programme "Kontrovers" on Deutschlandfunk, a particle physicist, a lawyer, a science journalist, and a philosopher discuss these events. We are pleased to publish a transcript of the programme here in advance.
Based on the theory of dialectical structures, I review the concept of degree of justification of a partial position a proponent may hold in a controversial debate. The formal concept of degree of justification dovetails with our pre-theoretic intuitions about a thesis’ strength of justification. The central claim I defend in this paper is that degrees of justification, as defined within the theory of dialectical structures, correlate with a proponent position’s verisimilitude. I vindicate this thesis with the results of simulations of controversial argumentation.
In a world where dealing with complexity and uncertainty is becoming ever more important, political decision-makers increasingly depend on scientific advice. Despite policymakers' demand for concrete recommendations for action, serious policy advice should not lose sight of the fundamental values of scientific work.
There are different varieties of conservatism concerning belief formation and revision. We assess the veritistic effects of a particular kind of conservatism commonly attributed to Quine: the so-called maxim of minimum mutilation, which states that agents should give up as few beliefs as possible when facing recalcitrant evidence. Based on a formal bounded-rationality model of belief revision, which parametrizes degree of conservatism, and corresponding multi-agent simulations, we eventually argue against doxastic conservatism from the vantage point of veritistic social epistemology.
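The maxim of minimum mutilation can be sketched in a few lines, under crude assumptions of our own (not the paper's actual bounded-rationality model): beliefs are modelled as literals like "p" and "not p", and on receiving recalcitrant evidence the agent retracts the fewest old beliefs needed to restore consistency.

```python
from itertools import combinations

# Minimal sketch of the maxim of minimum mutilation (our illustration only):
# keep a maximal consistent subset of the old beliefs when new evidence
# arrives. Consistency is checked naively on literal pairs "x" / "not x".

def consistent(beliefs):
    return not any(("not " + b) in beliefs for b in beliefs)

def revise_conservatively(beliefs, evidence):
    """Retract as few old beliefs as possible to accommodate the evidence."""
    for k in range(len(beliefs) + 1):        # try removing 0, 1, 2, ... beliefs
        for removed in combinations(beliefs, k):
            kept = [b for b in beliefs if b not in removed]
            candidate = set(kept) | set(evidence)
            if consistent(candidate):
                return candidate
    return set(evidence)

old = ["p", "q", "r"]
new = ["not p"]                              # recalcitrant evidence
print(sorted(revise_conservatively(old, new)))  # → ['not p', 'q', 'r']
```

A less conservative agent would be willing to drop larger chunks of the old belief set; the paper's parametrized model interpolates between such dispositions.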
If we want to understand how extremist group ideologies are established, we have to comprehend the social processes which form the basis of the emergence and distribution of such beliefs. In our chapter, we present an innovative approach to examining these processes and explaining how they function: with the method of computer-based simulation of opinion formation, we develop heuristic explanatory models which help to generate new and interesting hypotheses. The focus is not on individuals and their idiosyncrasies but on the dynamic mutual adaptation of beliefs in a group. These dynamics can produce an incremental establishment of “charismatic” opinion leaders and an increasing radicalization and alienation. A prototype of such a simulation model has produced promising first results, which are presented and discussed.