We argue that David Lewis’s principal principle implies a version of the principle of indifference. The same is true for similar principles that need to appeal to the concept of admissibility. Such principles are thus in accord with objective Bayesianism, but in tension with subjective Bayesianism.
When making decisions, governments can and should strive consciously to balance the demands of the present with the needs of future generations. Various advocates for greater governmental foresight have created new processes or institutions within existing systems of democratic government. These include long-range planning departments, futures commissions, impact statements on proposed legislation, environmental protection agencies, and formal technology assessment. But much more remains to be done. Based on their extensive scholarly and practical experience, the contributors to this volume propose a variety of techniques that will enable foresight to be effectively incorporated into governmental institutions and practices.
Direct inferences identify certain probabilistic credences or confirmation-function-likelihoods with values of objective chances or relative frequencies. The best known version of a direct inference principle is David Lewis’s Principal Principle. Certain kinds of statements undermine direct inferences. Lewis calls such statements inadmissible. We show that on any Bayesian account of direct inference several kinds of intuitively innocent statements turn out to be inadmissible. This may pose a significant challenge to Bayesian accounts of direct inference. We suggest some ways in which these challenges may be addressed.
In face of the multiple controversies surrounding the DSM process in general and the development of DSM-5 in particular, we have organized a discussion around what we consider six essential questions in further work on the DSM. The six questions involve: 1) the nature of a mental disorder; 2) the definition of mental disorder; 3) the issue of whether, in the current state of psychiatric science, DSM-5 should assume a cautious, conservative posture or an assertive, transformative posture; 4) the role of pragmatic considerations in the construction of DSM-5; 5) the issue of utility of the DSM - whether DSM-III and IV have been designed more for clinicians or researchers, and how this conflict should be dealt with in the new manual; and 6) the possibility and advisability, given all the problems with DSM-III and IV, of designing a different diagnostic system. Part I of this article will take up the first two questions. With the first question, invited commentators express a range of opinion regarding the nature of psychiatric disorders, loosely divided into a realist position that the diagnostic categories represent real diseases that we can accurately name and know with our perceptual abilities, a middle, nominalist position that psychiatric disorders do exist in the real world but that our diagnostic categories are constructs that may or may not accurately represent the disorders out there, and finally a purely constructivist position that the diagnostic categories are simply constructs with no evidence of psychiatric disorders in the real world. The second question again offers a range of opinion as to how we should define a mental or psychiatric disorder, including the possibility that we should not try to formulate a definition. The general introduction, as well as the introductions and conclusions for the specific questions, are written by James Phillips, and the responses to commentaries are written by Allen Frances.
John Locke proposed a straightforward relationship between qualitative and quantitative doxastic notions: belief corresponds to a sufficiently high degree of confidence. Richard Foley has further developed this Lockean thesis and applied it to an analysis of the preface and lottery paradoxes. Following Foley's lead, we exploit various versions of these paradoxes to chart a precise relationship between belief and probabilistic degrees of confidence. The resolutions of these paradoxes emphasize distinct but complementary features of coherent belief. These features suggest principles that tie together qualitative and quantitative doxastic notions. We show how these principles may be employed to construct a quantitative model - in terms of degrees of confidence - of an agent's qualitative doxastic state. This analysis fleshes out the Lockean thesis and provides the foundation for a logic of belief that is responsive to the logic of degrees of confidence.
We chart the ways in which closure properties of consequence relations for uncertain inference take on different forms according to whether the relations are generated in a quantitative or a qualitative manner. Among the main themes are: the identification of watershed conditions between probabilistically and qualitatively sound rules; failsafe and classicality transforms of qualitatively sound rules; non-Horn conditions satisfied by probabilistic consequence; representation and completeness problems; and threshold-sensitive conditions such as 'preface' and 'lottery' rules.
In a penetrating investigation of the relationship between belief and quantitative degrees of confidence (or degrees of belief) Richard Foley (1992) suggests the following thesis: ... it is epistemically rational for us to believe a proposition just in case it is epistemically rational for us to have a sufficiently high degree of confidence in it, sufficiently high to make our attitude towards it one of belief. Foley goes on to suggest that rational belief may be just rational degree of confidence above some threshold level that the agent deems sufficient for belief. He finds hints of this view in Locke's discussion of probability and degrees of assent, so he calls it the Lockean Thesis. The Lockean Thesis has important implications for the logic of belief. Most prominently, it implies that even a logically ideal agent whose degrees of confidence satisfy the axioms of probability theory may quite rationally believe each of a large body of propositions that are jointly inconsistent. For example, an agent may legitimately believe that on each given occasion her well-maintained car will start, but nevertheless believe that she will eventually encounter an occasion on which it fails to start.
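To make the point concrete, here is a minimal numeric sketch of the Lockean Thesis applied to a fair 1000-ticket lottery; the threshold value is an illustrative assumption, not one drawn from Foley or the paper:

```python
# A minimal numeric sketch of the Lockean Thesis applied to a fair
# 1000-ticket lottery. The threshold value is an illustrative assumption.

THRESHOLD = 0.99        # degree of confidence deemed sufficient for belief
N_TICKETS = 1000        # exactly one ticket wins

p_ticket_loses = 1 - 1 / N_TICKETS          # confidence that ticket i loses

# The agent believes each individual claim 'ticket i loses'...
believes_each = p_ticket_loses >= THRESHOLD           # True (0.999 >= 0.99)

# ...yet the conjunction 'every ticket loses' contradicts the setup:
believes_conjunction = 0.0 >= THRESHOLD               # False

print(believes_each, believes_conjunction)            # True False
```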
Sections 1 through 3 present all of the main ideas behind the probabilistic logic of evidential support. For most readers these three sections will suffice to provide an adequate understanding of the subject. Those readers who want to know more about how the logic applies when the implications of hypotheses about evidence claims (called likelihoods) are vague or imprecise may, after reading sections 1-3, skip to section 6. Sections 4 and 5 are for the more advanced reader who wants a detailed understanding of some telling results about how this logic may bring about convergence to the truth.
I will describe the logics of a range of conditionals that behave like conditional probabilities at various levels of probabilistic support. Families of these conditionals will be characterized in terms of the rules that their members obey. I will show that for each conditional, →, in a given family, there is a probabilistic support level r and a conditional probability function P such that, for all sentences C and B, 'C → B' holds just in case P[B | C] ≥ r. Thus, each conditional in a given family behaves like conditional probability above some specific support level.
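As an illustration of the intended reading, here is a toy model of a threshold conditional on a finite probability space; the worlds-and-events representation is my own construction, not the paper's formal semantics:

```python
# A toy model of a threshold conditional on a finite probability space.
# The worlds/events representation is my construction, not the paper's
# formal semantics.

def prob(event, space):
    """Probability of a set of worlds."""
    return sum(p for w, p in space.items() if w in event)

def conditional_holds(C, B, space, r):
    """'C -> B' holds just in case P(B | C) >= r."""
    pC = prob(C, space)
    if pC == 0:
        return False          # antecedent carries no probability mass
    return prob(C & B, space) / pC >= r

space = {"w1": 0.25, "w2": 0.25, "w3": 0.25, "w4": 0.25}
C, B = {"w1", "w2", "w3"}, {"w1", "w2"}   # P(B | C) = 2/3

print(conditional_holds(C, B, space, r=0.6))    # True:  2/3 >= 0.6
print(conditional_holds(C, B, space, r=0.75))   # False: 2/3 <  0.75
```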
I argue that Bayesians need two distinct notions of probability. We need the usual degree-of-belief notion that is central to the Bayesian account of rational decision. But Bayesians also need a separate notion of probability that represents the degree to which evidence supports hypotheses. Although degree-of-belief is well suited to the theory of rational decision, Bayesians have tried to apply it to the realm of hypothesis confirmation as well. This double duty leads to the problem of old evidence, a problem that, we will see, is much more extensive than usually recognized. I will argue that degree-of-support is distinct from degree-of-belief, that it is not just a kind of counterfactual degree-of-belief, and that it supplements degree-of-belief in a way that resolves the problems of old evidence and provides a richer account of the logic of scientific inference and belief.
In face of the multiple controversies surrounding the DSM process in general and the development of DSM-5 in particular, we have organized a discussion around what we consider six essential questions in further work on the DSM. The six questions involve: 1) the nature of a mental disorder; 2) the definition of mental disorder; 3) the issue of whether, in the current state of psychiatric science, DSM-5 should assume a cautious, conservative posture or an assertive, transformative posture; 4) the role of pragmatic considerations in the construction of DSM-5; 5) the issue of utility of the DSM - whether DSM-III and IV have been designed more for clinicians or researchers, and how this conflict should be dealt with in the new manual; and 6) the possibility and advisability, given all the problems with DSM-III and IV, of designing a different diagnostic system. Part I of this article took up the first two questions. Part II will take up the next two questions. Question 3 addresses whether DSM-5 should assume a conservative or assertive posture in making changes from DSM-IV. That question in turn breaks down into discussion of diagnoses that depend on, and aim toward, empirical, scientific validation, and diagnoses that are more value-laden and less amenable to scientific validation. Question 4 takes up the role of pragmatic considerations in a psychiatric nosology - whether the purely empirical considerations need to be tempered by considerations of practical consequence. As in Part I of this article, the general introduction, as well as the introductions and conclusions for the specific questions, are written by James Phillips, and the responses to commentaries are written by Allen Frances.
"In the midst of the world, darkened with many sins and sorrows, in which the majority live, there abides another world, lighted up with shining virtues and unpolluted joy, in which the perfect ones live. This world can be found and entered, and the way to it is by self-control and moral excellence. It is the world of the perfect life, and it rightly belongs to man, who is not complete until crowned with perfection. The perfect life is not the (...) far-away, impossible thing that men who are in darkness imagine it to be; it is supremely possible, and very near and real. Man remains a craving, weeping, sinning, repenting creature just so long as he wills to do so by clinging to those weak conditions; but when he wills to shake off his dark dreams and to rise, he arises and achieves." - JAMESALLEN - A Complete and Unabridged edition of JamesAllen's book "The Life Triumphant." Part of The Works of JamesAllen Series. Other Works by JamesAllen:- Above Life's Turmoil All These Things Added As a Man Thinketh Byways of Blessedness Entering the Kingdom From Passion to Peace From Poverty to Power Foundation Stones to Happiness and Success JamesAllen's Book of Meditations for Every Day in the Year Light on Life's Difficulties Man: King of Mind, Body and Circumstance Men and Systems Morning and Evening Thoughts Out from the Heart Poems of Peace The Divine Companion The Eight Pillars of Prosperity The Heavenly Life The Mastery of Destiny The Path to Prosperity The Shining Gateway The Way of Peace Through the Gate of Good. (shrink)
"The discovery of the law of Evolution in the material world has prepared men for a knowledge of the law of cause and effect in the mental world.... In the realm of thought and deed, the good survives, for it is ''fittest;'' the evil ultimately perishes. To know that the ''perfect law'' of Causation is as all-embracing in mind as in matter, is to be relieved from all anxiety concerning the ultimate destiny of individuals and of humanity-''For man is man (...) and master of his fate'' and the will in man which is conquering the knowledge of natural law will conquer the knowledge of spiritual law.... In this volume I have tried to set down some words indicative of this Law and this Destiny, and the manner of its working and its building." - JAMESALLEN - A Complete and Unabridged edition of JamesAllen''s book The Mastery of Destiny. Part of The Works of JamesAllen Series. Other Books by JamesAllen:- Above Life''s Turmoil All These Things Added As a Man Thinketh Byways of Blessedness Entering the Kingdom Foundation Stones to Happiness and Success From Passion to Peace From Poverty to Power JamesAllen''s Book of Meditations for Every Day in the Year Light on Life''s Difficulties Man: King of Mind, Body and Circumstance Men and Systems Morning and Evening Thoughts Out from the Heart Poems of Peace The Divine Companion The Eight Pillars of Prosperity The Heavenly Life The Life Triumphant The Path to Prosperity The Shining Gateway The Way of Peace Through the Gate of Good. (shrink)
Scientific realists often appeal to some version of the conjunction objection to argue that scientific instrumentalism fails to do justice to the full empirical import of scientific theories. Whereas the conjunction objection provides a powerful critique of scientific instrumentalism, I will show that mathematical instrumentalism escapes the conjunction objection unscathed.
Eliminative induction is a method for finding the truth by using evidence to eliminate false competitors. It is often characterized as "induction by means of deduction"; the accumulating evidence eliminates false hypotheses by logically contradicting them, while the true hypothesis logically entails the evidence, or at least remains logically consistent with it. If enough evidence is available to eliminate all but the most implausible competitors of a hypothesis, then (and only then) will the hypothesis become highly confirmed. I will argue that, with regard to the evaluation of hypotheses, Bayesian inductive inference is essentially a probabilistic form of induction by elimination. Bayesian induction is an extension of eliminativism to cases where, rather than contradict the evidence, false hypotheses imply that the evidence is very unlikely, much less likely than the evidence would be if some competing hypothesis were true. This is not, I think, how Bayesian induction is usually understood. The recent book by Howson and Urbach, for example, provides an excellent, comprehensive explanation and defense of the Bayesian approach; but this book scarcely remarks on Bayesian induction's eliminative nature. Nevertheless, the very essence of Bayesian induction is the refutation of false competitors of a true hypothesis, or so I will argue.
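The eliminative character of Bayesian updating can be seen in a small worked example; the hypotheses, priors, and likelihoods below are illustrative assumptions of mine, not drawn from the paper:

```python
# A hedged sketch of Bayesian updating as probabilistic elimination:
# the false hypothesis assigns the evidence a low likelihood, so repeated
# updating drives its posterior toward 0. All numbers are illustrative.

posterior = {"H1": 0.5, "H2": 0.5}          # equal priors
likelihood = {"H1": 0.9, "H2": 0.3}         # H2 makes each datum unlikely

for _ in range(10):                         # ten independent pieces of evidence
    total = sum(posterior[h] * likelihood[h] for h in posterior)
    posterior = {h: posterior[h] * likelihood[h] / total for h in posterior}

# H2 is never strictly contradicted, yet its posterior is now below 1e-4:
print(posterior)
```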
From a leading authority in artificial intelligence, this book delivers a synthesis of the major modern techniques and the most current research in natural language processing. The approach is unique in its coverage of semantic interpretation and discourse alongside the foundational material in syntactic processing.
Naive deductivist accounts of confirmation have the undesirable consequence that if E confirms H, then E also confirms the conjunction H·X, for any X - even if X is completely irrelevant to E and H. Bayesian accounts of confirmation may appear to have the same problem. In a recent article in this journal Fitelson (2002) argued that existing Bayesian attempts to resolve this problem are inadequate in several important respects. Fitelson then proposes a new-and-improved Bayesian account that overcomes the problem of irrelevant conjunction, and does so in a more general setting than past attempts. We will show how to simplify and improve upon Fitelson's solution.
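The flavor of the Bayesian resolution can be conveyed with a small numeric example; the numbers below are my own construction, not Fitelson's: when X is irrelevant to E, E still confirms H·X, but less strongly than it confirms H on the likelihood-ratio measure.

```python
# Illustrative numbers (mine, not Fitelson's). H and an irrelevant X are
# independent, and E depends only on H; then E confirms H&X, but less
# strongly than H on the likelihood-ratio measure.

pH, pX = 0.5, 0.5
pE_H, pE_notH = 0.8, 0.2                 # X makes no difference to E

pE = pH * pE_H + (1 - pH) * pE_notH      # 0.5

# Both H and H&X are confirmed (posterior exceeds prior)...
post_H  = pE_H * pH / pE                 # 0.8  > prior 0.5
post_HX = pE_H * (pH * pX) / pE          # 0.4  > prior 0.25

# ...but the likelihood-ratio measure ranks the conjunction lower.
lr_H = pE_H / pE_notH                                       # 4.0
# P(E | not-(H&X)): average over the worlds H&~X, ~H&X, ~H&~X,
# each with prior 0.25:
pE_not_HX = (0.25 * 0.8 + 0.25 * 0.2 + 0.25 * 0.2) / 0.75   # 0.4
lr_HX = pE_H / pE_not_HX                                    # 2.0

print(post_H, post_HX, lr_H, lr_HX)
```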
Confirmation theory is the study of the logic by which scientific hypotheses may be confirmed or disconfirmed, or even refuted by evidence. A specific theory of confirmation is a proposal for such a logic. Presumably the epistemic evaluation of scientific hypotheses should largely depend on their empirical content – on what they say the evidentially accessible parts of the world are like, and on the extent to which they turn out to be right about that. Thus, all theories of confirmation rely on measures of how well various alternative hypotheses account for the evidence. Most contemporary confirmation theories employ probability functions to provide such a measure. They measure how well the evidence fits what the hypothesis says about the world in terms of how likely it is that the evidence should occur were the hypothesis true. Such hypothesis-based probabilities of evidence claims are called likelihoods. Clearly, when the evidence is more likely according to one hypothesis than according to an alternative, that should redound to the credit of the former hypothesis and the discredit of the latter. But various theories of confirmation diverge on precisely how this credit is to be measured.
I’ll describe a range of systems for nonmonotonic conditionals that behave like conditional probabilities above a threshold. The rules that govern each system are probabilistically sound in that each rule holds when the conditionals are interpreted as conditional probabilities above a threshold level specific to that system. The well-known preferential and rational consequence relations turn out to be special cases in which the threshold level is 1. I’ll describe systems that employ weaker rules appropriate to thresholds lower than 1, and compare them to these two standard systems.
The objectivity of Bayesian induction relies on the ability of evidence to produce a convergence to agreement among agents who initially disagree about the plausibilities of hypotheses. I will describe three sorts of Bayesian convergence. The first reduces the objectivity of inductions about simple "occurrent events" to the objectivity of posterior probabilities for theoretical hypotheses. The second reveals that evidence will generally induce convergence to agreement among agents on the posterior probabilities of theories only if the convergence is to 0 or 1. The third establishes conditions under which evidence will very probably compel posterior probabilities of theories to converge to 0 or 1.
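A Monte Carlo sketch can convey the third sort of convergence; the hypotheses, likelihoods, and priors here are illustrative assumptions, not the paper's formal conditions:

```python
# A Monte Carlo sketch of the third sort of convergence; hypotheses,
# likelihoods, and priors are illustrative assumptions.

import random
random.seed(0)

lik = {"H1": 0.6, "H2": 0.4}     # chance of 'success' under each hypothesis

def update(post_H1, outcome):
    """One Bayesian update of P(H1) on a binary outcome."""
    l1 = lik["H1"] if outcome else 1 - lik["H1"]
    l2 = lik["H2"] if outcome else 1 - lik["H2"]
    return post_H1 * l1 / (post_H1 * l1 + (1 - post_H1) * l2)

agent_a, agent_b = 0.2, 0.8      # agents who start in sharp disagreement
for _ in range(500):
    outcome = random.random() < lik["H1"]     # H1 actually generates the data
    agent_a = update(agent_a, outcome)
    agent_b = update(agent_b, outcome)

# Both posteriors are very probably near 1: convergence to agreement
# via convergence to the truth.
print(round(agent_a, 4), round(agent_b, 4))
```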
The (recent, Bayesian) cognitive science literature on The Wason Task (WT) has been modeled largely after the (not-so-recent, Bayesian) philosophy of science literature on The Paradox of Confirmation (POC). In this paper, we apply some insights from more recent Bayesian approaches to the (POC) to analogous models of (WT). This involves, first, retracing the history of the (POC), and, then, reexamining the (WT) with these historico-philosophical insights in mind.
I argue for an epistemic conception of voting, a conception on which the purpose of the ballot is at least in some cases to identify which of several policy proposals will best promote the public good. To support this view I first briefly investigate several notions of the kind of public good that public policy should promote. Then I examine the probability logic of voting as embodied in two very robust versions of the Condorcet Jury Theorem and some related results. These theorems show that if the number of voters or legislators is sufficiently large and the average of their individual propensities to select the better of two policy proposals is a little above random chance, and if each person votes his or her own best judgment (rather than in alliance with a bloc or faction), then the majority is extremely likely to select the better alternative. Here ‘better alternative’ means the policy or law that will best promote the public good. I also explicate a Convincing Majorities Theorem, which shows the extent to which the majority vote should provide evidence that the better policy has been selected. Finally, I show how to extend all of these results to judgments among multiple alternatives through the kind of sequential balloting typical of the legislative amendment process.
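The core of the Jury Theorem can be checked directly for the simplest case of independent voters with equal competence; this simplification is my own, and the paper's versions are considerably more robust:

```python
# A direct computation behind the Jury Theorem's core claim, assuming
# independent voters with equal competence p (a simplification; the
# paper's versions are more robust).

from math import comb

def p_majority_correct(n, p):
    """Probability that a strict majority of n (odd) voters picks
    the better of two alternatives."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Competence only a little above chance already makes large majorities
# almost certainly correct (values roughly 0.63, 0.84, 0.999):
for n in (11, 101, 1001):
    print(n, round(p_majority_correct(n, 0.55), 3))
```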
Rational consequence relations and Popper functions provide logics for reasoning under uncertainty, the former purely qualitative, the latter probabilistic. But few researchers seem to be aware of the close connection between these two logics. I’ll show that Popper functions are probabilistic versions of rational consequence relations. I’ll not assume that the reader is familiar with either logic. I present them, and explicate the relationship between them, from the ground up. I’ll also present alternative axiomatizations for each logic, showing them to depend on weaker axioms than usually recognized.
The metaphysical conception of the generation of the macroworld from fundamental physics that Hawthorne considers is criticized in this Commentary, and compared with the scientific account offered by Halliwell and Hartle. It is argued that Hawthorne's critique of Everettian quantum mechanics fails.
Scientific theories and hypotheses make claims that go well beyond what we can immediately observe. How can we come to know whether such claims are true? The obvious approach is to see what a hypothesis says about the observationally accessible parts of the world. If it gets that wrong, then it must be false; if it gets that right, then it may have some claim to being true. Any sensible attempt to construct a logic that captures how we may come to reasonably believe the falsehood or truth of scientific hypotheses must be built on this idea. Philosophers refer to such logics as logics of confirmation or as confirmation theories.
Jeffrey updating is a natural extension of Bayesian updating to cases where the evidence is uncertain. But the resulting degrees of belief appear to be sensitive to the order in which the uncertain evidence is acquired, a rather un-Bayesian looking effect. This order dependence results from the way in which basic Jeffrey updating is usually extended to sequences of updates. The usual extension seems very natural, but there are other plausible ways to extend Bayesian updating that maintain order-independence. I will explore three models of sequential updating, the usual extension and two alternatives. I will show that the alternative updating schemes derive from extensions of the usual rigidity requirement, which is at the heart of Jeffrey updating. Finally, I will establish necessary and sufficient conditions for order-independent updating, and show that extended rigidity is closely related to these conditions.
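A small sketch of sequential Jeffrey updating shows the order effect; the worlds, events, and numbers are my own illustration, not the paper's examples:

```python
# A small sketch of sequential Jeffrey updating; the worlds, events,
# and numbers are my own illustration of the order effect.

def jeffrey(prior, E, q):
    """Shift P(E) to q, holding probabilities rigid within E and not-E."""
    pE = sum(p for w, p in prior.items() if w in E)
    return {w: p * (q / pE if w in E else (1 - q) / (1 - pE))
            for w, p in prior.items()}

prior = {"w1": 0.1, "w2": 0.4, "w3": 0.3, "w4": 0.2}
E, F = {"w1", "w2"}, {"w2", "w3"}

ef = jeffrey(jeffrey(prior, E, 0.8), F, 0.6)    # update on E, then on F
fe = jeffrey(jeffrey(prior, F, 0.6), E, 0.8)    # update on F, then on E

pE_ef = ef["w1"] + ef["w2"]     # about 0.77: the later F-shift disturbed P(E)
pE_fe = fe["w1"] + fe["w2"]     # exactly 0.80: E was updated last
print(pE_ef, pE_fe)
```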
We will formulate two Bell arguments. Together they show that if the probabilities given by quantum mechanics are approximately correct, then the properties exhibited by certain physical systems must be nontrivially dependent on the types of measurements performed and either nonlocally connected or holistically related to distant events. Although a number of related arguments have appeared since John Bell's original paper (1964), they tend to be either highly technical or to lack full generality. The following arguments depend on the weakest of premises, and the structure of the arguments is simpler than most (without any loss of rigor or generality). The technical simplicity is due in part to a novel version of the generalized Bell inequality. The arguments are self-contained and presuppose no knowledge of quantum mechanics. We will also offer a Dutch Book argument for measurement type dependence.
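For readers who want a feel for the numbers, here is the standard CHSH illustration, not the paper's novel version of the inequality: local models bound the statistic S by 2, while quantum singlet-state correlations reach 2√2.

```python
# The standard CHSH illustration (not the paper's novel inequality):
# any local model satisfies |S| <= 2, while quantum singlet-state
# correlations E(a, b) = -cos(a - b) push |S| to 2*sqrt(2).

from math import cos, pi, sqrt

def corr(a, b):
    """Quantum correlation for spin measurements at angles a and b."""
    return -cos(a - b)

a1, a2 = 0.0, pi / 2            # one party's two measurement settings
b1, b2 = pi / 4, 3 * pi / 4     # the other party's two settings

S = corr(a1, b1) - corr(a1, b2) + corr(a2, b1) + corr(a2, b2)
print(abs(S), 2 * sqrt(2))      # ~2.828 > 2: the classical bound fails
```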
In a previous paper I described a range of nonmonotonic conditionals that behave like conditional probability functions at various levels of probabilistic support. These conditionals were defined as semantic relations on an object language for sentential logic. In this paper I extend the most prominent family of these conditionals to a language for predicate logic. My approach to quantifiers is closely related to Hartry Field's probabilistic semantics. Along the way I will show how Field's semantics differs from a substitutional interpretation of quantifiers in crucial ways, and show that Field's approach is closely related to the usual objectual semantics. One of Field's quantifier rules, however, must be significantly modified to be adapted to nonmonotonic conditional semantics. And this modification suggests, in turn, an alternative quantifier rule for probabilistic semantics.
Original and penetrating, this book investigates the notion of inference from signs, which played a central role in ancient philosophical and scientific method. It examines an important chapter in ancient epistemology: the debates about the nature of evidence and of the inferences based on it - or signs and sign-inferences, as they were called in antiquity. As the first comprehensive treatment of this topic, it fills an important gap in the histories of science and philosophy.
In the conclusion to this multi-part article I first review the discussions carried out around the six essential questions in psychiatric diagnosis – the position taken by Allen Frances on each question, the commentaries on the respective question along with Frances’ responses to the commentaries, and my own view of the multiple discussions. In this review I emphasize that the core question is the first – what is the nature of psychiatric illness – and that in some manner all further questions follow from the first. Following this review I attempt to move the discussion forward, addressing the first question from the perspectives of natural kind analysis and complexity analysis. This reflection leads toward a view of psychiatric disorders – and future nosologies – as far more complex and uncertain than we have imagined.
Physicalism is roughly the thesis that everything is physical. The two most popular ways of formulating physicalism rigorously are the ways given by Frank Jackson and David Chalmers. The best objections, in turn, include John Hawthorne’s ‘blocker’ objections. Hawthorne argues that, if it is possible for there to be non-physical beings or properties that prevent certain mental phenomena from existing, Jackson’s and Chalmers’ formulations will be inadequate. Jackson’s formulation will be inadequate by virtue of not capturing all of the right physical dependence principles. Chalmers’ formulation will be inadequate in so far as, when modified to define ‘restricted physicalisms’, such as physicalism of the mental, the restricted formulations will not capture all of the right physical dependence principles. By contrast, I argue that Hawthorne’s blocker arguments are misguided on the grounds that non-physical blockers are impossible; I argue that his critique of Chalmers’ formulation is unsound by virtue of falsely presupposing that restricted physicalisms require restricted formulations of physicalism; and I argue that Jackson’s and Chalmers’ formulations capture all of the right physical dependence principles.
R. G. Bury’s translations of Sextus Empiricus for the Loeb Library have served English language readers well, but new translations, taking account of advances in scholarship since Bury’s day, have long been needed. We now have two new English versions of the Outlines of Pyrrhonism. They take different and in some ways complementary approaches to the task.
Thousands of texts discuss Egyptian cosmology and cosmogony. James Allen has selected sixteen to translate and discuss in order to shed light on one of the questions that clearly preoccupied ancient intellectuals: the origins of the world.