Coping with uncertainty is a necessary part of ordinary life and is crucial to an understanding of how the mind works. For example, it is a vital element in developing artificial intelligence that will not be undermined by its own rigidities. There have been many approaches to the problem of uncertain inference, ranging from probability to inductive logic to nonmonotonic logic. This book seeks to provide a clear exposition of these approaches within a unified framework. The principal market for the book will be students and professionals in philosophy, computer science, and AI. Among the special features of the book are a chapter on evidential probability, which has not received a basic exposition before; chapters on nonmonotonic reasoning and theory replacement, matters rarely addressed in standard philosophical texts; and chapters on Mill's methods and statistical inference that cover material sorely lacking in the usual treatments of AI and computer science.
Recent advances in philosophy, artificial intelligence, mathematical psychology, and the decision sciences have brought a renewed focus to the role and interpretation of probability in theories of uncertain reasoning. Henry E. Kyburg, Jr. has long resisted the now dominant Bayesian approach to the role of probability in scientific inference and practical decision. The sharp contrasts between the Bayesian approach and Kyburg's program offer a uniquely powerful framework within which to study several issues at the heart of scientific inference, decision, and reasoning under uncertainty. The commissioned essays for this volume take measure of the scope and impact of Kyburg's views on probability and scientific inference, and include several new and important contributions to the field. Contributors: Gert de Cooman, Clark Glymour, William Harper, Isaac Levi, Ron Loui, Enrique Miranda, John Pollock, Teddy Seidenfeld, Choh Man Teng, Mariam Thalos, Gregory Wheeler, Jon Williamson, and Henry E. Kyburg, Jr.
Bishop Butler [Butler, 1736] said that probability was the very guide of life. But what interpretations of probability can serve this function? It isn't hard to see that empirical views won't do, and many recent writers (for example John Earman, who has said that Bayesianism is "the only game in town") have been persuaded by various Dutch book arguments that only subjective probability will perform the function required. We will defend the thesis that probability construed in this way offers very little guidance, Dutch book arguments notwithstanding. We will sketch a way out of the impasse.
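For readers unfamiliar with the device the abstract invokes, here is a toy sketch of a Dutch book, with invented numbers that are not from the paper: an agent whose degrees of belief in A and not-A sum to more than 1 will accept a pair of bets that together lose money however A turns out.

    # Toy Dutch book (invented numbers): credences P(A) = 0.6 and P(not-A) = 0.6
    # violate coherence (they sum to 1.2). A bettor who prices bets by these
    # credences pays 0.6 per unit-stake bet on each of A and not-A; exactly one
    # bet pays 1, so the net is 1.0 - 1.2 = -0.2 whatever happens.
    credence_A, credence_not_A = 0.6, 0.6
    stake = 1.0

    for A_true in (True, False):
        cost = stake * (credence_A + credence_not_A)      # price paid for both bets
        payoff = stake * (1.0 if A_true else 0.0) \
               + stake * (0.0 if A_true else 1.0)         # exactly one bet wins
        print(f"A={A_true}: net = {payoff - cost:+.2f}")  # -0.20 in both cases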
There are a number of reasons for being interested in uncertainty, and there are also a number of uncertainty formalisms. These formalisms are not unrelated. It is argued that they can all be represented as special cases of the approach that takes probabilities to be determined by sets of probability functions defined on an algebra of statements. Thus, interval probabilities should be construed as maximum and minimum probabilities within a set of distributions, Glenn Shafer's belief functions should be construed as lower probabilities, etc. Updating probabilities introduces new considerations, and it is shown that the representation of belief as a set of probabilities conflicts in this regard with the updating procedures advocated by Shafer. The attempt to make subjectivistic probability plausible as a doctrine of rational belief by making it more flowery (i.e., by adding new dimensions) does not succeed. But if one is going to represent beliefs by sets of distributions, those sets of distributions might as well be based in statistical knowledge, as they are in epistemological or evidential probability.
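As a toy illustration of that construal (not Kyburg's own formalism), the sketch below computes an interval probability as the minimum and maximum probability an event receives across a set of probability functions; the outcome space, the three distributions, and the helper name interval_probability are all invented for the example.

    # A minimal sketch of interval probability as min/max over a set of
    # probability functions (a "credal set"). The outcomes and the three
    # distributions below are invented purely for illustration.
    credal_set = [
        {"a": 0.2, "b": 0.5, "c": 0.3},
        {"a": 0.4, "b": 0.4, "c": 0.2},
        {"a": 0.3, "b": 0.3, "c": 0.4},
    ]

    def interval_probability(event):
        """Lower and upper probability of an event (a set of outcomes)."""
        values = [sum(p[o] for o in event) for p in credal_set]
        return min(values), max(values)

    # The event {a, b} gets probability 0.7, 0.8, and 0.6 under the three
    # distributions, so its interval probability is (0.6, 0.8).
    print(interval_probability({"a", "b"}))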
If someone comes to my house, saying, "Here is a bone; I hope Obrecht likes it," I might answer with a deductive argument: "You may rest assured on that score. Obrecht is a dog, and all dogs like bones; therefore Obrecht will like it." We may formalize this argument as follows: Let G be the bone, O be Obrecht, D be the class of dogs, B be the class of bones, and, finally, let L be the class of ordered pairs ⟨x, y⟩ such that x likes y. The premises are then: (x)(y)(x ∈ D ⋅ y ∈ B ⋅ ⊃ ⋅ ⟨x, y⟩ ∈ L); G ∈ B; O ∈ D.
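A minimal machine-checked rendering of the same argument, sketched here in Lean 4; treating D and B as predicates and L as a binary relation, rather than as classes and a class of ordered pairs, is an encoding choice of the sketch, not the abstract's.

    -- Premise 1: every dog likes every bone; premises 2 and 3: G is a bone
    -- and O is a dog. The conclusion L O G ("Obrecht likes the bone")
    -- follows by instantiating the universal premise at O and G.
    example {U : Type} (D B : U → Prop) (L : U → U → Prop) (O G : U)
        (h : ∀ x y, D x → B y → L x y) (hG : B G) (hO : D O) : L O G :=
      h O G hO hG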
The dominant argument for the introduction of propensities or chances as an interpretation of probability depends on the difficulty of accounting for single case probabilities. We argue that in almost all cases, the "single case" application of probability can be accounted for otherwise. "Propensities" are needed only in theoretical contexts, and even there applications of probability need only depend on propensities indirectly.
Ralph Waldo Emerson said, "A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines." The alleged evidence has mounted that ordinary folk are prone to inconsistency, and particularly that they are prone to inconsistency when it comes to probabilistic judgments. I write "alleged" because it is open to question whether the experiments that provide this evidence are well designed; in particular, whether Quine's principle of charity has been followed. I also do so because in some cases of probability judgments the untutored intuitions of ordinary people seem to be at least as good as, and perhaps better than, the intuitions of those who run the experiments.
These essays are characterized by meticulous argument in the analytical tradition. The book concerns matters of metaphysics in a broad sense: philosophy of mind and the problems of subjectivity; questions concerning nomological and statistical laws of nature; and, as we would expect, chance and induction.
Measurement is fundamental to all the sciences, the behavioural and social as well as the physical, and in the latter its results provide our paradigms of 'objective fact'. But the basis and justification of measurement is not well understood and is often simply taken for granted. Henry Kyburg Jr proposes here an original, carefully worked out theory of the foundations of measurement, to show how quantities can be defined, why certain mathematical structures are appropriate to them, and what meaning attaches to the results generated. Crucial to his approach is the notion of error: it cannot be eliminated entirely, and from its introduction and control, he argues, arises the very possibility of measurement. Professor Kyburg's approach emphasises the empirical process of making measurements. In developing it he discusses vital questions concerning the general connection between a scientific theory and the results which support it (or fail to).
The system presented by the author in The Logical Foundations of Statistical Inference suffered from certain technical difficulties, and from a major practical difficulty: it was hard to be sure, in discussing examples and applications, when you had got hold of the right reference class. The present paper, concerned mainly with the characterization of randomness, resolves the technical difficulties and provides a well-structured framework for the choice of a reference class. The definition of randomness that leads to this framework is simplified and clarified in a number of respects. It resolves certain puzzles raised by S. Spielman and W. Harper in their contributions to Profiles: Henry E. Kyburg, Jr. and Isaac Levi (1982).