1. WHAT IS ARTIFICIAL INTELLIGENCE? One of the fascinating aspects of the field of artificial intelligence (AI) is that the precise nature of its subject ...
If the decades of the forties through the sixties were dominated by discussion of Hempel's “covering law” explication of explanation, that of the seventies was preoccupied with Salmon's “statistical relevance” conception, which emerged as the principal alternative to Hempel's enormously influential account. Readers of Wesley C. Salmon's Scientific Explanation and the Causal Structure of the World, therefore, ought to find it refreshing to discover that its author has not remained content with a facile defense of his previous investigations; on the contrary, Salmon offers an original account of different kinds of explications, advances additional criticisms of various alternative theories, and elaborates a novel “two-tiered” analysis of explanation that tacitly depends upon a “two-tiered” account of homogeneity. Indeed, if the considerations that follow are correct, Salmon has not merely refined his statistical relevance account but has actually abandoned it in favor of a “causal/mechanistic” construction. This striking development suggests that the theory of explanation is likely to remain as lively an arena of debate in the eighties as it has been in the past.
The purpose of this paper is to explore three alternative frameworks for understanding the nature of language and mentality, which accent syntactical, semantical, and pragmatical aspects of the phenomena with which they are concerned, respectively. Although the computational conception currently exerts considerable appeal, its defensibility appears to hinge upon an extremely implausible theory of the relation of form to content. Similarly, while the representational approach has much to recommend it, its range is essentially restricted to those units of language that can be understood in terms of undefined units. Thus, the only alternative among these three that can account for the meaning of primitive units of language is one emphasizing the basic role of skills, habits, and tendencies in relating signs and dispositions.
The notion of program verification appears to trade upon an equivocation. Algorithms, as logical structures, are appropriate subjects for deductive verification. Programs, as causal models of those structures, are not. The success of program verification as a generally applicable and completely reliable method for guaranteeing program performance is not even a theoretical possibility.
Luciano Floridi (2003) offers a theory of information as a “strongly semantic” notion, according to which information encapsulates truth, thereby making truth a necessary condition for a sentence to qualify as “information”. While Floridi provides an impressive development of this position, the aspects of his approach of greatest philosophical significance are its foundations rather than its formalization. He rejects the conception of information as meaningful data, which entails at least three theses – that information can be false; that tautologies are information; and that “It is true that ...” is non-redundant – each of which appears to be defensible. This inquiry offers various logical, epistemic, and ordinary-language grounds to demonstrate that an account of his kind is too narrow to be true and that its adoption would hopelessly obscure crucial differences between information, misinformation, and disinformation.
The idea that human thought requires the execution of mental algorithms provides a foundation for research programs in cognitive science, which are largely based upon the computational conception of language and mentality. Consideration is given to recent work by Penrose, Searle, and Cleland, who supply various grounds for disputing computationalism. These grounds in turn qualify as reasons for preferring a non-computational, semiotic approach, which can account for them as predictable manifestations of a more adequate conception. Thinking does not ordinarily require the execution of mental algorithms, which appears to be at best no more than one rather special kind of thinking.
The purpose of this paper is to provide a systematic defense of the single-case propensity account of probabilistic explanation from the criticisms advanced by Hanna and by Humphreys and to offer a critical appraisal of the aleatory conception advanced by Humphreys and of the deductive-nomological-probabilistic approach Railton has proposed. The principal conclusion supported by this analysis is that the Requirements of Maximal Specificity and of Strict Maximal Specificity afford the foundation for completely objective explanations of probabilistic explananda, so long as they are employed on the basis of propensity criteria of explanatory relevance.
A debate over the theoretical capabilities of formal methods in computer science has raged for more than two years now. The function of this paper is to summarize the key elements of this debate and to respond to important criticisms others have advanced by placing these issues within a broader context of philosophical considerations about the nature of hardware and of software and about the kinds of knowledge that we have the capacity to acquire concerning their performance.
Cognitive science has been dominated by the computational conception that cognition is computation across representations. To the extent to which cognition as computation across representations is supposed to be a purposive, meaningful, algorithmic, problem-solving activity, however, computers appear to be incapable of cognition. They are devices that can facilitate computations on the basis of semantic grounding relations as special kinds of signs. Even their algorithmic, problem-solving character arises from their interpretation by human users. Strictly speaking, computers as such — apart from human users — are not only incapable of cognition, but even incapable of computation, properly construed. If we want to understand the nature of thought, then we have to study thinking, not computing, because they are not the same thing.
The social exchange theory of reasoning, which is championed by Leda Cosmides and John Tooby, falls under the general rubric of evolutionary psychology and asserts that human reasoning is governed by content-dependent, domain-specific, evolutionarily derived algorithms. According to Cosmides and Tooby, the presumptive existence of what they call cheater-detection algorithms disconfirms the claim that we reason via general-purpose mechanisms or via inductively acquired principles. We contend that the Cosmides/Tooby arguments in favor of domain-specific algorithms or evolutionarily derived mechanisms fail and that the notion of a social exchange rule, which is central to their theory, is not correctly characterized. As a consequence, whether or not their conclusion is true cannot be established on the basis of the arguments they have presented.
This paper pursues the question, To what extent does the propensity approach to probability contribute to plausible solutions to various anomalies which occur in quantum mechanics? The position I shall defend is that of the three interpretations — the frequency, the subjective, and the propensity — only the third accommodates the possibility, in principle, of providing a realistic interpretation of ontic indeterminism. If these considerations are correct, then they lend support to Popper's contention that the propensity construction tends to remove (at least some of) the mystery from quantum phenomena.
Taking Brian Cantwell Smith’s study, “Limits of Correctness in Computers,” as its point of departure, this article explores the role of models in computer science. Smith identifies two kinds of models that play an important role, where specifications are models of problems and programs are models of possible solutions. Both presuppose the existence of conceptualizations as ways of conceiving the world “in certain delimited ways.” But high-level programming languages also function as models of virtual (or abstract) machines, while low-level programming languages function as models of causal (or physical) machines. The resulting account suggests that sets of models embedded within models are indispensable for computer programming.
Carl G. Hempel exerted greater influence upon philosophers of science than any other figure during the 20th century. In this far-reaching collection, distinguished philosophers contribute valuable studies that illuminate and clarify the central problems to which Hempel was devoted. The essays enhance our understanding of the development of logical empiricism as the major intellectual influence for scientifically-oriented philosophers and philosophically-minded scientists of the 20th century.
An approach to inference to the best explanation integrating a Popperian conception of natural laws together with a modified Hempelian account of explanation, on the one hand, and Hacking's law of likelihood (in its nomic guise), on the other, which provides a robust abductivist model of science that appears to overcome the obstacles that confront its inductivist, deductivist, and hypothetico-deductivist alternatives. This philosophy of science clarifies and illuminates some fundamental aspects of ontology and epistemology, especially concerning the relations between frequencies and propensities. Among the most important elements of this conception is the central role of degrees of nomic expectability in explanation, prediction, and inference, for which this investigation provides a theoretical defense.
The distinction between misinformation and disinformation becomes especially important in political, editorial, and advertising contexts, where sources may make deliberate efforts to mislead, deceive, or confuse an audience in order to promote their personal, religious, or ideological objectives. The difference consists in having an agenda. It thus bears comparison with lying, because lies are assertions that are false, that are known to be false, and that are asserted with the intention to mislead, deceive, or confuse. One context in which disinformation abounds is the study of the death of JFK, which I know from more than a decade of personal research experience. Here I reflect on that experience and advance a preliminary theory of disinformation that is intended to stimulate thinking on this increasingly important subject. Five kinds of disinformation are distinguished and exemplified by real-life cases I have encountered. It follows that the story you are about to read is true.
Cosmides, Wason, and Johnson-Laird, among others, have presented evidence suggesting that reasoning abilities tend to be domain specific, insofar as humans do not appear to acquire capacities for logical reasoning that are applicable across different contexts. Unfortunately, the significance of these findings depends upon the specific variety of logical reasoning under consideration. Indeed, there seem to be at least three grounds for doubting such conclusions, since: (1) tests of reasoning involving the use of material conditionals may not be appropriate for representing ordinary thinking, especially when it concerns causal processes involving the use of causal conditionals instead; (2) tests of domain specificity may fail to acknowledge the crucial role fulfilled by rules of inference, such as modus ponens and modus tollens, which appear to be completely general across different contexts; and (3) tests that focus exclusively upon deductive reasoning may misinterpret findings involving the use of inductive reasoning, which is of primary importance for human evolution.
The purpose of this essay is to investigate the properties of singular causal systems and their population manifestations, with special concern for the thesis of methodological individualism, which claims that there are no properties of social groups that cannot be adequately explained exclusively by reference to properties of individual members of those groups, i.e., at the level of individuals. Individuals, however, may be viewed as singular causal systems, i.e., as instantiations of (arrangements of) dispositional properties. From this perspective, methodological individualism appears to be an ambiguous thesis: some properties of collections of (independent) systems of the same kind are reducible, but other properties of collections of (dependent) systems of the same kind are not. In cases of the first kind, therefore, methodological individualism is true, but trivial; while in cases of the second kind, it is significant, but false. Hence, if the arguments that follow are correct, at least some of the properties of social groups should qualify as emergent.
To clarify and illuminate the place of probability in science, Ellery Eells and James H. Fetzer have brought together some of the most distinguished philosophers ...
Hempel was one of the most influential philosophers of science in the 20th century, along with Thomas Kuhn and Sir Karl Popper. His work defined the central problems of the field and its proper methods of investigation. By presenting an analytical and historical introduction and a comprehensive bibliography together with a selection of many of Carl G. Hempel's most important studies, this volume provides an ideal opportunity for students and scholars to appreciate the enduring contributions of one of the most important philosophers of science of the 20th century.
The purpose of this paper is to provide a systematic appraisal of the covering law and statistical relevance theories of statistical explanation advanced by Carl G. Hempel and by Wesley C. Salmon, respectively. The analysis is intended to show that the difference between these accounts is in principle analogous to the distinction between truth and confirmation, where Hempel's analysis applies to what is taken to be the case and Salmon's analysis applies to what is the case. Specifically, it is argued (a) that statistical explanations exhibit the nomic expectability of their explanandum events, which in some cases may be strong but in other cases will not be; (b) that the statistical relevance criterion is more fundamental than the requirement of maximal specificity and should therefore displace it; and (c) that if statistical explanations are to be envisioned as inductive arguments at all, then only in a qualified sense since, in particular, the requirement of high inductive probability between explanans and explanandum must be abandoned.
My purpose here is to elaborate the reasons I maintain that Salmon has not been completely successful in reporting the history of work on explanation. The most important limitation of his account is that it does not emphasize the critical necessity of embracing a suitable conception of probability in the development of the theory of probabilistic explanation.