Theories of Truth provides a clear, critical introduction to one of the most difficult areas of philosophy. It surveys all of the major philosophical theories of truth, presenting the crux of the issues involved at a level accessible to nonexperts yet in a manner sufficiently detailed and original to be of value to professional scholars. Kirkham's systematic treatment and meticulous explanations of terminology ensure that readers will come away from this book with a comprehensive general understanding of one of philosophy's thorniest sets of topics.

Included are discussions of the correspondence, coherence, pragmatic, semantic, performative, redundancy, appraisal, and truth-as-justification theories. There are also chapters or sections of chapters on the liar paradox, three-valued logic, Field's critique of Tarski, Davidson's program, Dummett's theory of linguistic competence, satisfaction, recursion, the extension/intension distinction, and an explanation of how theories of justification, properly understood, differ from theories of truth.

A persistent theme is that philosophers have too often failed to recognize that not all theories of truth are intended to answer the same question. When the various questions are made distinct, it is apparent that many of the "debates" in this field are really cases of philosophers talking past one another. There is much less disagreement within the field than has commonly been thought.
At the centre of the traditional discussion of truth is the question of how truth is defined. Recent research, especially with the development of deflationist accounts of truth, has tended to take truth as an undefined primitive notion governed by axioms, while the liar paradox and cognate paradoxes pose problems for certain seemingly natural axioms for truth. In this book, Volker Halbach examines the most important axiomatizations of truth, explores their properties and shows how the logical results impinge on the philosophical topics related to truth. In particular, he shows that the discussion on topics such as deflationism about truth depends on the solution of the paradoxes. His book is an invaluable survey of the logical background to the philosophical discussion of truth, and will be indispensable reading for any graduate or professional philosopher working on theories of truth.
I argue that an influential strategy for understanding conspiracy theories stands in need of radical revision. According to this approach, called ‘generalism’, conspiracy theories are epistemically defective by their very nature. Generalists are typically opposed by particularists, who argue that conspiracy theories should be judged case-by-case, rather than definitionally indicted. Here I take a novel approach to criticizing generalism. I introduce a distinction between ‘Dominant Institution Conspiracy Theories and Theorists’ and ‘Non-Dominant Institution Conspiracy Theories and Theorists’. Generalists uncritically center the latter in their analysis, but I show why the former must be centered by generalists’ own lights: they are the clearest representatives of their views, and they are by far the most harmful. Once we make this change in paradigm cases, however, various typical generalist theses turn out to be false or in need of radical revision. Conspiracy theories are not primarily produced by extremist ideologies, as generalists typically claim, since mainstream, purportedly non-extremist political ideologies turn out to be just as responsible for such theories, if not more so. Conspiracy theories are also, we find, not the province of amateurs: they are often created and pushed by individuals widely viewed as experts, who have the backing of our most prestigious intellectual institutions. While generalists may be able to take this novel distinction and shift in paradigm cases on board, this remains to be seen. Subsequent generalist accounts that do absorb this distinction and shift will look radically different from previous incarnations of the view.
The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. _Philosophical Theories of Probability_ is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.
What are conspiracy theories? And what, if anything, is epistemically wrong with them? I offer an account on which conspiracy theories are a unique way of holding a belief in a conspiracy. Specifically, I take conspiracy theories to be self-insulating beliefs in conspiracies. On this view, conspiracy theorists hold their conspiratorial beliefs in a way that is immune to revision by counter-evidence. I argue that conspiracy theories are always irrational. Although conspiracy theories involve an expectation of encountering some seemingly disconfirming evidence (allegedly planted by the conspirators), resistance to all counter-evidence cannot be justified on these grounds.
Most expressions in natural language are vague. But what is the best semantic treatment of terms like 'heap', 'red' and 'child'? And what is the logic of arguments involving this kind of vague expression? These questions are receiving increasing philosophical attention, and in this book, first published in 2000, Rosanna Keefe explores the questions of what we should want from an account of vagueness and how we should assess rival theories. Her discussion ranges widely and comprehensively over the main theories of vagueness and their supporting arguments, and she offers a powerful and original defence of a form of supervaluationism, a theory that requires almost no deviation from standard logic yet can accommodate the lack of sharp boundaries to vague predicates and deal with the paradoxes of vagueness in a methodologically satisfying way. Her study will be of particular interest to readers in philosophy of language and of mind, philosophical logic, epistemology and metaphysics.
Additive theories of rationality, as I use the term, are theories that hold that an account of our capacity to reflect on perceptually-given reasons for belief and desire-based reasons for action can begin with an account of what it is to perceive and desire, in terms that do not presuppose any connection to the capacity to reflect on reasons, and then can add an account of the capacity for rational reflection, conceived as an independent capacity to ‘monitor’ and ‘regulate’ our believing-on-the-basis-of-perception and our acting-on-the-basis-of-desire. I show that a number of recent discussions of human rationality are committed to an additive approach, and I raise two difficulties for this approach, each analogous to a classic problem for Cartesian dualism. The interaction problem concerns how capacities conceived as intrinsically independent of the power of reason can interact with this power in what is intuitively the right way. The unity problem concerns how an additive theorist can explain a rational subject's entitlement to conceive of the animal whose perceptual and desiderative life he or she oversees as ‘I’ rather than ‘it’. I argue that these difficulties motivate a general skepticism about the additive approach, and I sketch an alternative, ‘transformative’ framework in which to think about the cognitive and practical capacities of a rational animal.
Of the many problems that would have to be solved by a satisfactory theory of empirical knowledge, perhaps the most central is a general structural problem which I shall call the epistemic regress problem: the problem of how to avoid an infinite and presumably vicious regress of justification in one's account of the justification of empirical beliefs. Foundationalist theories of empirical knowledge, as we shall see further below, attempt to avoid the regress by locating a class of empirical beliefs whose justification does not depend on that of other empirical beliefs. Externalist theories, the topic of the present paper, represent one species of foundationalism.
This paper analyzes the generation and function of hitherto ignored or misrepresented interfield theories, theories which bridge two fields of science. Interfield theories are likely to be generated when two fields share an interest in explaining different aspects of the same phenomenon and when background knowledge already exists relating the two fields. The interfield theory functions to provide a solution to a characteristic type of theoretical problem: how are the relations between fields to be explained? In solving this problem the interfield theory may provide answers to questions which arise in one field but cannot be answered within it alone, may focus attention on domain items not previously considered important, and may predict new domain items for one or both fields. Implications of this analysis for the problems of reduction and the unity and progress of science are mentioned.
Informational theories of semantic content have been recently gaining prominence in the debate on the notion of mental representation. In this paper we examine new-wave informational theories which have a special focus on cognitive science. In particular, we argue that these theories face four important difficulties: they do not fully solve the problem of error, fall prey to the wrong distality attribution problem, have serious difficulties accounting for ambiguous and redundant representations and fail to deliver a metasemantic theory of representation. Furthermore, we argue that these difficulties derive from their exclusive reliance on the notion of information, so we suggest that pure informational accounts should be complemented with functional approaches.
This paper argues that we should replace the common classification of theories of welfare into the categories of hedonism, desire theories, and objective list theories. The tripartite classification is objectionable because it is unduly narrow and it is confusing: it excludes theories of welfare that are worthy of discussion, and it obscures important distinctions. In its place, the paper proposes two independent classifications corresponding to a distinction emphasised by Roger Crisp: a four-category classification of enumerative theories (about which items constitute welfare), and a four-category classification of explanatory theories (about why these items constitute welfare).
Conspiracy theories should be neither believed nor investigated - that is the conventional wisdom. I argue that it is sometimes permissible both to investigate and to believe. Hence this is a dispute in the ethics of belief. I defend epistemic “oughts” that apply in the first instance to belief-forming strategies that are partly under our control. But the belief-forming strategy of not believing conspiracy theories would be a political disaster and the epistemic equivalent of self-mutilation. I discuss several variations of this strategy, interpreting “conspiracy theory” in different ways, but conclude that on all these readings, the conventional wisdom is deeply unwise.
Theories of Theories of Mind brings together contributions by a distinguished international team of philosophers, psychologists, and primatologists, who between them address such questions as: what is it to understand the thoughts, feelings, and intentions of other people? How does such an understanding develop in the normal child? Why, in unusual cases, does it fail to develop? And is any such mentalistic understanding shared by members of other species? The volume's four parts together offer a state-of-the-art survey of the major topics in the theory-theory/simulationism debate within philosophy of mind, developmental psychology, the aetiology of autism and primatology. The volume will be of great interest to researchers and students in all areas interested in the 'theory of mind' debate.
Conspiracy theories are often portrayed as unwarranted beliefs, typically supported by suspicious kinds of evidence. Yet contemporary work in Philosophy argues that provisional belief in conspiracy theories is at the very least understandable and that, if we take an evidential approach (judging individual conspiracy theories on their particular merits), belief in such theories turns out to be warranted in a range of cases. Drawing on this work, I examine the kinds of evidence typically associated with conspiracy theories, showing that the evidential problems typically associated with conspiracy theories are not unique to such theories. As such, if there is a problem with the conspiracy theorist’s use of evidence, it is one of principle: is the principle which guides their use of evidence somehow in error? I argue that whatever we might think about conspiracy theories generally, there is no prima facie case for a scepticism of conspiracy theories based purely on their use of evidence.
Conspiracy theories should be neither believed nor investigated - that is the conventional wisdom. I argue that it is sometimes permissible both to investigate and to believe. Hence this is a dispute in the ethics of belief. I defend epistemic ‘oughts’ that apply in the first instance to belief-forming strategies that are partly under our control. I argue that the policy of systematically doubting or disbelieving conspiracy theories would be both a political disaster and the epistemic equivalent of self-mutilation, since it leads to the conclusion that history is bunk and the nightly news unbelievable. In fact (of course) the policy is not employed systematically but is only wheeled on to do down theories that the speaker happens to dislike. I develop a deductive argument from hard-to-deny premises that if you are not a ‘conspiracy theorist’ in my anodyne sense of the word then you are an ‘idiot’ in the Greek sense of the word, that is, someone so politically purblind as to have no opinions about either history or public affairs. The conventional wisdom can only be saved (if at all) if ‘conspiracy theory’ is given a slanted definition. I discuss some slanted definitions apparently presupposed by proponents of the conventional wisdom (including, amongst others, Tony Blair) and conclude that even with these definitions the conventional wisdom comes out as deeply unwise. I finish up with a little harmless fun at the expense of David Aaronovitch, whose abilities as a rhetorician and a popular historian are not perhaps matched by a corresponding capacity for logical thought.
“The universe is expanding, not contracting.” Many statements of this form appear unambiguously true; after all, the discovery of the universe’s expansion is one of the great triumphs of empirical science. However, the statement is time-directed: the universe expands towards what we call the future; it contracts towards the past. If we deny that time has a direction, should we also deny that the universe is really expanding? This article draws together and discusses what I call ‘C-theories’ of time — in short, philosophical positions that hold time lacks a direction — from different areas of the literature. I set out the various motivations, aims, and problems for C-theories, and outline different versions of antirealism about the direction of time.
Our topic is the theory of topics. My goal is to clarify and evaluate three competing traditions: what I call the way-based approach, the atom-based approach, and the subject-predicate approach. I develop criteria for adequacy using robust linguistic intuitions that feature prominently in the literature. Then I evaluate the extent to which various existing theories satisfy these constraints. I conclude that recent theories due to Parry, Perry, Lewis, and Yablo do not meet the constraints in total. I then introduce the issue-based theory—a novel and natural entry in the atom-based tradition that meets our constraints. In a coda, I categorize a recent theory from Fine as atom-based, and contrast it to the issue-based theory, concluding that they are evenly matched, relative to our main criteria of adequacy. I offer tentative reasons to nevertheless favour the issue-based theory.
Definitional and axiomatic theories of truth -- Objects of truth -- Tarski -- Truth and set theory -- Technical preliminaries -- Comparing axiomatic theories of truth -- Disquotation -- Classical compositional truth -- Hierarchies -- Typed and type-free theories of truth -- Reasons against typing -- Axioms and rules -- Axioms for type-free truth -- Classical symmetric truth -- Kripke-Feferman -- Axiomatizing Kripke's theory in partial logic -- Grounded truth -- Alternative evaluation schemata -- Disquotation -- Classical logic -- Deflationism -- Reflection -- Ontological reduction -- Applying theories of truth.
Three leading philosopher-logicians present a clear and concise overview of formal theories of truth, explaining key logical techniques. Truth is a central topic in philosophy: formal theories study the connections between truth and logic, including the intriguing challenges presented by paradoxes like the Liar.
Infectious logics are systems containing a truth-value that is assigned to a compound formula whenever it is assigned to one of its components. This paper studies four-valued infectious logics as the basis of transparent theories of truth. This take is motivated as a way to treat different pathological sentences differently, namely, by allowing some of them to be truth-value gluts and some others to be truth-value gaps, and as a way to treat the semantic pathology suffered by at least some of these sentences as infectious. This leads us to consider four distinct four-valued logics: one where truth-value gaps are infectious, but gluts are not; one where truth-value gluts are infectious, but gaps are not; and two logics where both gluts and gaps are infectious, in some sense. Additionally, we focus on the proof theory of these systems, by offering a discussion of two related topics. On the one hand, we prove some limitations regarding the possibility of providing standard Gentzen sequent calculi for these systems, by dualizing and extending some recent results for infectious logics. On the other hand, we provide sound and complete four-sided sequent calculi, arguing that the most important technical and philosophical features taken into account to usually prefer standard calculi are, indeed, enjoyed by the four-sided systems.
Conspiracy theories are often portrayed as unwarranted beliefs, typically supported by suspicious kinds of evidence. Yet contemporary work in Philosophy argues provisional belief in conspiracy theories is at the very least understandable---because conspiracies occur---and that if we take an evidential approach, judging individual conspiracy theories on their particular merits, belief in such theories turns out to be warranted in a range of cases.

Drawing on this work, I examine the kinds of evidence typically associated with conspiracy theories, and show how the so-called evidential problems with conspiracy theories are also problems for the kinds of evidence put forward in support of other theories. As such, if there is a problem with the conspiracy theorist's use of evidence, it is one of principle: is the principle which guides the conspiracy theorist's use of evidence somehow in error? I argue that whatever we might think about conspiracy theories generally, there is no prima facie case for a scepticism of conspiracy theories based purely on their use of evidence.
Subjective theories of well-being claim that how well our lives go for us is a matter of our attitudes towards what we get in life rather than the nature of the things themselves. This article explains in more detail the distinction between subjective and objective theories of well-being; describes, for each approach, some reasons for thinking it is true; outlines the main kinds of subjective theory; and explains their advantages and disadvantages.
Many millions of people hold conspiracy theories; they believe that powerful people have worked together in order to withhold the truth about some important practice or some terrible event. A recent example is the belief, widespread in some parts of the world, that the attacks of 9/11 were carried out not by Al Qaeda, but by Israel or the United States. Those who subscribe to conspiracy theories may create serious risks, including risks of violence, and the existence of such theories raises significant challenges for policy and law. The first challenge is to understand the mechanisms by which conspiracy theories prosper; the second challenge is to understand how such theories might be undermined. Such theories typically spread as a result of identifiable cognitive blunders, operating in conjunction with informational and reputational influences. A distinctive feature of conspiracy theories is their self-sealing quality. Conspiracy theorists are not likely to be persuaded by an attempt to dispel their theories; they may even characterize that very attempt as further proof of the conspiracy. Because those who hold conspiracy theories typically suffer from a crippled epistemology, in accordance with which it is rational to hold such theories, the best response consists in cognitive infiltration of extremist groups. Various policy dilemmas, such as the question whether it is better for government to rebut conspiracy theories or to ignore them, are explored in this light.
Theories of Consciousness provides an introduction to a variety of approaches to consciousness, questions the nature of consciousness, and contributes to current debates about whether a scientific understanding of consciousness is possible. While discussing key figures including Descartes, Fodor, Dennett and Chalmers, the book incorporates identity theories, representational theories, intentionality, externalism and new information-based theories.
Causal theories of mental content attempt to explain how thoughts can be about things. They attempt to explain how one can think about, for example, dogs. These theories begin with the idea that there are mental representations and that thoughts are meaningful in virtue of a causal connection between a mental representation and some part of the world that is represented. In other words, the point of departure for these theories is that thoughts of dogs are about dogs because dogs cause the mental representations of dogs.
Necessitarianism, as we shall use the term, is the view that natural properties and causal powers are necessarily connected in some way. In recent decades the most popular forms of necessitarianism have been the anti-Humean powers-based theories of properties, such as dispositional essentialism and the identity theory. These versions of necessitarianism have come under fire in recent years and I believe it is time for necessitarians to develop a new approach. In this paper I identify unexplored ways of positing metaphysically necessary connections in nature, using the concepts of grounding and essential dependence. For example, I show that one could be a necessitarian by insisting that the properties of things necessarily ground their powers, and that one can maintain this while rejecting dispositional essentialism. Using different combinations of claims about grounding and essential dependence, I map out a spectrum of new positions and compare them to previous theories of natural modality. Some of these positions are compatible with Humean metaphysics while others are not. The overall aim of the paper is to provide a new metaphysical framework for understanding theories of powers and thereby launch a new necessitarian research programme.
Theories of Truth introduces readers to issues that have been connected with truth—the only book of its kind. Richard Kirkham has an easy writing style and a good sense of what needs to be explained to students new to the literature. These facts make Theories of Truth a serious contender for use in the classroom. As with most introductions, use of the book should be supplemented with readings from the major authors covered. Beyond that supplementation, however, the text still needs to be used with some caution, for there are shortcomings that could seriously mislead students.
Since the beginning of the 20th century, philosophers of science have asked, "what kind of thing is a scientific theory?" The logical positivists answered: a scientific theory is a mathematical theory, plus an empirical interpretation of that theory. Moreover, they assumed that a mathematical theory is specified by a set of axioms in a formal language. Later 20th century philosophers questioned this account, arguing instead that a scientific theory need not include a mathematical component; or that the mathematical component need not be specified by a set of axioms in a formal language. We survey various accounts of scientific theories entertained in the 20th century -- removing some misconceptions, and clearing a path for future research.
This chapter presents a new argument for thinking of traditional ethical theories as methods that can be used in first-order ethics - as deliberation procedures rather than as criteria of right and wrong. It begins by outlining how ethical theories, such as consequentialism and contractualism, are flexible frameworks in which different versions of these theories can be formulated to correspond to different first-order ethical views. The chapter then argues that, as a result, the traditional ethical theories cannot be evaluated in terms of their truth or correctness. Instead, I suggest that these theories should be understood as providing different ways of thinking about difficult moral problems. I then recommend a certain form of pragmatic pluralism - it may well be that different moral problems are better approached through different ethical theories.
Reliabilists hold that a belief is doxastically justified if and only if it is caused by a reliable process. But since such a process is one that tends to produce a high ratio of true to false beliefs, reliabilism is on the face of it applicable to binary beliefs, but not to degrees of confidence or credences. For while beliefs admit of truth or falsity, the same cannot be said of credences in general. A natural question now arises: Can reliability theories of justified belief be extended or modified to account for justified credence? In this paper, I address this question. I begin by showing that, as it stands, reliabilism cannot account for justified credence. I then consider three ways in which the reliabilist may try to do so by extending or modifying her theory, but I argue that such attempts face certain problems. After that, I turn to a version of reliabilism that incorporates evidentialist elements and argue that it allows us to avoid the problems that the other theories face. If I am right, this gives reliabilists a reason, aside from those given recently by Comesaña and Goldman, to move towards such a kind of hybrid theory.
In books such as The World Within the World and The Anthropic Cosmological Principle, astronomer John Barrow has emerged as a leading writer on our efforts to understand the universe. Timothy Ferris, writing in The Times Literary Supplement of London, described him as "a temperate and accomplished humanist, scientist, and philosopher of science--a man out to make a contribution, not a show." Now Barrow offers the general reader another fascinating look at modern physics, as he explores the quest for a single, unifying theory that will unlock nature's secrets. Theories of Everything is more than a history of science, more than a popular report on recent research and discoveries. Barrow provides a reflective, intelligent commentary on what a true Theory of Everything would be--its ingredients, its limitations, and what it could tell us about the universe. Never before, he writes, have physicists been so confident and so eager in the hunt for this "cosmic Rosetta Stone," as he calls it: "a single all-embracing picture of all the laws of nature from which the inevitability of all things seen must follow with unimpeachable logic." He lays out eight essential ingredients for a Theory of Everything and then explores each in turn, tracing how our knowledge has developed and how scientific discovery relates to our changing philosophy and religious thought in each area. Some of these ingredients are obvious--the laws of nature must be explained, for example, as well as its organizing principles--but others may be surprising, such as broken symmetries and selection biases. A Theory of Everything must account for the fact that the universe is "messy and complicated," he tells us, and for the limitations imposed by the questions we ask and the information we can obtain. The key lies in the remarkable capacity of mathematics to express the fundamental workings of the physical world--a language that the human mind is uniquely equipped to understand and manipulate.
Barrow examines what mathematics actually is and describes how it makes the universe intelligible and provides a path to the underlying coherence in nature--which has led, in fact, to arguments that the universe itself is a vast computer. Yet even the most complete theory, even the most comprehensive mathematical explanation, cannot account for the uncomputable varieties of human experience and thought. "No non-poetic account of reality," he writes, "can be complete." In a field where the authorities converse in equations and mathematical notations, John Barrow speaks with the voice of a thoughtful and knowledgeable humanist. Written with eloquence and expertise, Theories of Everything establishes a new perspective on humanity's efforts to explain the universe.
This paper examines folk theories of algorithmic recommendations on Spotify in order to make visible the cultural specificities of data assemblages in the global South. The study was conducted in Costa Rica and draws on triangulated data from 30 interviews, 4 focus groups with 22 users, and the study of “rich pictures” made by individuals to graphically represent their understanding of algorithmic recommendations. We found two main folk theories: one that personifies Spotify and another one that envisions it as a system full of resources. Whereas the first theory emphasizes local conceptions of social relations to make sense of algorithms, the second one stresses the role of algorithms in providing a global experience of music and technology. We analyze why people espouse either one of these theories and how these theories provide users with resources to enact different modalities of power and resistance in relation to recommendation algorithms. We argue that folk theories thus offer a productive way to broaden understanding of what agency means in relation to algorithms.
In “The Toolbox of Science” (1995), written together with Towfic Shomar, we advocated a form of instrumentalism about scientific theories. We separately developed this view further in a number of subsequent works. Steven French, James Ladyman, Otavio Bueno and Newton Da Costa (FLBD) have since written at least eight papers and a book criticising our work. Here we defend ourselves. First we explain what we mean in denying that models derive from theory – and why their failure to do so should be lamented. Second we defend our use of the London model of superconductivity as an example. Third we point out both advantages and weaknesses of FLBD’s techniques in comparison to traditional Anglophone versions of the semantic conception. Fourth we show that FLBD’s version of the semantic conception has not been applied to our case study. We conclude by raising doubts about FLBD’s overall project.
Skeptical hypotheses such as the brain-in-a-vat hypothesis provide extremely poor explanations for our sensory experiences. Because these scenarios accommodate virtually any possible set of evidence, the probability of any given set of evidence on the skeptical scenario is near zero; hence, on Bayesian grounds, the scenario is not well supported by the evidence. By contrast, serious theories make reasonably specific predictions about the evidence and are then well supported when these predictions are satisfied.
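The Bayesian point here (a hypothesis compatible with virtually any evidence assigns near-zero likelihood to any particular evidence set, and so gains little support from that evidence) can be illustrated with a toy calculation. All of the numbers below are illustrative assumptions, not figures from the text:

```python
def posterior(prior, likelihood, prior_alt, likelihood_alt):
    """Posterior P(H | E) via Bayes' theorem, normalizing over H and
    a single rival hypothesis."""
    numerator = prior * likelihood
    return numerator / (numerator + prior_alt * likelihood_alt)

# Suppose the skeptical scenario (e.g. brain-in-a-vat) accommodates
# a billion equally likely evidence sets, so P(E | skeptical) = 1e-9,
# while a serious theory predicts our actual evidence with P(E | serious) = 0.5.
p_e_given_skeptical = 1e-9
p_e_given_serious = 0.5

# Even granting the skeptical hypothesis a generous prior of 0.5,
# its posterior collapses once the specific evidence comes in:
post = posterior(0.5, p_e_given_skeptical, 0.5, p_e_given_serious)
print(f"P(skeptical | E) = {post:.2e}")  # prints P(skeptical | E) = 2.00e-09
```

The design choice that does the work is the diffuse likelihood: because the flexible hypothesis spreads its probability over every possible evidence set, no generosity in the prior can rescue it, which is the sense in which specificity of prediction translates into evidential support.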
For a long time, regularity accounts of causation have virtually vanished from the scene. Problems encountered within other theoretical frameworks have recently induced authors working on causation, laws of nature, or methodologies of causal reasoning – e.g. May (Kausales Schliessen. Eine Untersuchung über kausale Erklärungen und Theorienbildung. Ph.D. thesis, Universität Hamburg, Hamburg, 1999), Ragin (Fuzzy-set social science. Chicago: University of Chicago Press, 2000), Graßhoff and May (Causal regularities. In W. Spohn, M. Ledwig, & M. Esfeld (Eds.), Current issues in causation (pp. 85–114). Paderborn: Mentis, 2001), Swartz (The concept of physical law (2nd ed.). http://www.sfu.ca/philosophy/physical-law/, 2003), Halpin (Erkenntnis, 58, 137–168, 2003) – to direct their attention back to regularity theoretic analyses. In light of the latest proposals of regularity theories, the paper at hand therefore reassesses the criticism raised against regularity accounts since the INUS theory of causation of Mackie (The cement of the universe. A study of causation. Oxford: Clarendon Press, 1974). It is shown that most of these objections target strikingly over-simplified regularity theoretic sketches. By outlining ways to refute these objections it is argued that the prevalent conviction as to the overall failure of regularity theories has been hasty.
9/11 was an inside job. The Holocaust is a myth promoted to serve Jewish interests. The shootings at Sandy Hook Elementary School were a false flag operation. Climate change is a hoax perpetrated by the Chinese government. These are all conspiracy theories. A glance online or at bestseller lists reveals how popular some of them are. Even if there is plenty of evidence to disprove them, people persist in propagating them. Why? Philosopher Quassim Cassam explains how conspiracy theories are different from ordinary theories about conspiracies. He argues that conspiracy theories are forms of propaganda and their function is to promote a political agenda. Although conspiracy theories are sometimes defended on the grounds that they uncover evidence of bad behaviour by political leaders, they do much more harm than good, with some resulting in the deaths of large numbers of people. There can be no clearer indication that something has gone wrong with our intellectual and political culture than the fact that conspiracy theories have become mainstream. When they are dangerous, we cannot afford to ignore them. At the same time, refuting them by rational argument is difficult because conspiracy theorists discount or reject evidence that disproves their theories. As conspiracy theories are so often smokescreens for political ends, we need to come up with political as well as intellectual responses if we are to have any hope of defeating them.
The basic idea of counterfactual theories of causation is that the meaning of causal claims can be explained in terms of counterfactual conditionals of the form “If A had not occurred, C would not have occurred”. While counterfactual analyses have been given of type-causal concepts, most counterfactual analyses have focused on singular causal or token-causal claims of the form “event c caused event e”. Analyses of token-causation have become popular in the last thirty years, especially since the development in the 1970s of possible world semantics for counterfactuals. The best known counterfactual analysis of causation is David Lewis's (1973b) theory. However, intense discussion over thirty years has cast doubt on the adequacy of any simple analysis of singular causation in terms of counterfactuals. Recent years have seen a proliferation of different refinements of the basic idea to achieve a closer match with commonsense judgements about causation.
I argue that, under the glitz, dual theories are examples of theoretically equivalent descriptions of the same underlying physical content: I distinguish them from cases of genuine underdetermination on the grounds that there is no real incompatibility involved between the descriptions. The incompatibility is at the level of unphysical structure. I argue that dual pairs are in fact very strongly analogous to gauge-related solutions even for dual pairs that look the most radically distinct, such as AdS/CFT.
All theories of the right to secede either understand the right as a remedial right only or also recognize a primary right to secede. By a right in this context is meant a general, not a special, right (one generated through promising, contract, or some special relationship). Remedial Right Only Theories assert that a group has a general right to secede if and only if it has suffered certain injustices, for which secession is the appropriate remedy of last resort. Different Remedial Right Only Theories identify different injustices as warranting the remedy of secession.