Proposes a new way of understanding the nature of metaphysics, focusing on nonreductionist emergence theory in both ancient and modern philosophy, as well as in contemporary philosophy of science.
This paper examines, from the point of view of a philosopher of science, what it is that introductory science textbooks say and do not say about 'scientific method'. Seventy introductory texts in a variety of natural and social sciences provided the material for this study. The inadequacy of these textbook accounts is apparent in three general areas: (a) the simple empiricist view of science that tends to predominate; (b) the demarcation between scientific and non-scientific inquiry; and (c) the avoidance of controversy, in part the consequence of the tendency toward textbook standardization. Most importantly, this study provides some evidence of the gulf that separates philosophy of science from science instruction, and examines some important aspects of the demarcation between science and non-science, an important issue for philosophers, scientists, and science educators.
If a metaphysics identifies transcendental principles with formal principles, the inevitable result will be a reductionist collapse, that is, a theory of the nature of reality that excludes as inessential significant differences among existing things. To avoid this result, we must take some such material differences as transcendental in nature. This produces a metaphysics in which the concept of ontological emergence is central, a metaphysics that will depend essentially on the material content of the natural sciences. While both Aristotle and Hegel provided such a metaphysics, they did not, I argue, accept one of its most important consequences: that it must be as incomplete as our scientific knowledge of these material differences. I examine this failure and suggest some areas in which contemporary scientific conceptions may contribute to a more contemporary metaphysics.
Metaphysics, for Collingwood, is an historical science. Accordingly, nature and the science of nature did not occupy a prominent position within his general scheme. To appreciate this fact and to consider how this deficiency might be overcome requires that we first attend to the disconnected nature of the doctrines that loosely comprise that scheme. More specifically, we must examine the problematic relationship between Collingwood’s familiar theory of presuppositions and his less frequently discussed doctrine of the scale of forms presented in An Essay on Philosophical Method. This latter task will serve as a focus both for an analysis of the general difficulties that beset Collingwood’s particular formulation of a dialectical method and for an exploration of a possible resolution of these difficulties, a resolution along lines which, admittedly, Collingwood himself did not explicitly follow. A few general comments on the tension between dialectical and idealist elements in his metaphysics will conclude the discussion.
An inference to a new explanation may be both logically non-ampliative and epistemically ampliative. Included among the premises of the latter form is the explanandum, a unique premise which is capable of embodying what we do not know about the matter in question, as well as legitimate aspects of what we do know. This double status points to a resolution of the Meno paradox. Ampliative inference of this sort, it is argued, has much in common with Nickles' idea of discoverability and, together with the mapping and correction procedures (briefly summarized) required for such inference, may suggest a broadening of the concept of justification which would incorporate much of what has been defended in theories of discovery.
What is the cognitive significance of talking to ourselves? I criticize two interpretations of this function, and offer a third: I argue that inner speech is a genuine dialogue, not a monologue; that the partners in this dialogue represent the independent interests of experienced meaning and logical articulation; that the former is either silent or capable only of abbreviated speech; that articulation is a logical, not a social, demand; and that neither partner is a full-time subordinate of the other. I examine the views of Plato, Arendt, Gadamer, Ryle, Piaget and Vygotsky on the nature of inner speech, and the views of Gazzaniga and Dennett on the role of inner speech in the constitution of human consciousness.
In recent years, there have been some attempts to defend the legitimacy of a non-inductive generative logic of discovery whose strategy is to analyze a variety of constraints on the actual generation of explanatory hypotheses. These proposed new theories, however, are only weakly generative (relying on sophisticated processes of elimination) rather than strongly generative (embodying processes of correction). This paper develops a strongly generative theory which holds that we can come to know something new only as a variant of what we already know, and that the novelty of this variant is neither thereby eliminated nor beyond our powers of characterization, a double requirement that is vital for resolving the Meno paradox. In this light, the discovery of a new hypothesis is taken as the correction of an antecedent hypothesis in response to the discrepancies between the predictions generated by that antecedent hypothesis and the desired result (e.g. the actual data to be explained). This process comprises two parallel operations. The first, which demonstrates the positive role of the facts in generating new explanations, involves a mapping between multiple hypotheses and the sets of predictions generated from those hypotheses, for the purpose of taking the actual data as a determinable variant of neighboring sets of predictions. This mapping permits the facts to indicate how corrective adjustments in the working hypothesis should be made. The second operation, which demonstrates the positive role of explanations in generating new facts, involves a mapping between differently construed versions of the actual data and the conceptualizations derived from those perceptual versions, for the purpose of taking the working hypothesis as a determinable variant of these neighboring conceptualizations. This mapping permits a given hypothesis to generate predictions increasingly closer to the actual facts.
“Logics of discovery” and “dialectical logics”: two theories of inference occupying quite distant corners of the philosophical world. The former have been advanced by a substantial minority tradition within Anglo-American philosophy of science, while the latter are associated almost exclusively with the Hegelian-Marxist tradition. Given these disparate home bases, it is little wonder that these theories hardly share a common language. I shall argue, however, that despite such significant differences in ancestry, they may actually be intended as answers to the same general epistemological problem.
In this paper, I analyze the particular conception of reciprocal justification proposed by Nelson Goodman and incorporated by John Rawls into what he called reflective equilibrium. I propose a way of avoiding the twin dangers which threaten to push this idea to either of two extremes: the reliance on epistemically privileged observation reports (or moral judgments in Rawls's version), which tends to disrupt the balance struck between the two sides of the equilibrium and to re-establish a foundationalism; and the denial of any privileged status to such reports (or judgments), which makes the equilibrium into a theoretical monolith.
I develop a variant of the constraint interpretation of the emergence of purely physical (non-biological) entities, focusing on the principle of the non-derivability of actual physical states from possible physical states (physical laws) alone. While this is a necessary condition for any account of emergence, it is not sufficient, for it becomes trivial if not extended to types of constraint that specifically constitute physical entities, namely, those that individuate and differentiate them. Because physical organizations with these features are in fact interdependent sets of such constraints, and because such constraints on physical laws cannot themselves be derived from physical laws, physical organization is emergent. These two complementary types of constraint are components of a complete non-reductive physicalism, comprising a non-reductive materialism and a non-reductive formalism.
Evolutionary epistemologists from Popper to Campbell have appropriated the Darwinian principle to explain the apparent fit between the world and our knowledge of it. I argue that this strategy suffers from the lack of any principled distinction among various types of elimination. I offer such a distinction and show that there is a species of elimination that is really corrective, that is, which violates the Darwinian principle as Popper understands it.
Philosophers of science have used various formulations of the "random mutation--natural selection" scheme to explain the development of scientific knowledge. But the uncritical acceptance of this evolutionary model has led to substantive problems concerning the relation between fact and theory. The primary difficulty lies in the fact that those who adopt this model (Popper and Kuhn, for example) are led to claim that theories arise chiefly through the processes of relatively random change. Systems theory constitutes a general criticism of this model insofar as it demonstrates the necessity of supplementing this mechanism with the non-random influences exercised by the internal organization of a system on its own development.
It is a common experience of mental life that we come to articulate meanings which we had initially grasped in only a sketchy way. In this paper, I consider how this idea of an initially unarticulated meaning may fit in a general theory of mental representation. I propose to identify unarticulated meanings with what I call specific concepts, which are quite similar to Rosch's categories of basic objects and are distinct both from images and from generic concepts (which come to articulate meanings). I argue that unarticulated meaning is non-representational in an important respect, a claim which relies on a distinction among levels of representation.
In Peirce's and Hanson's characterization of abductive inference, the abducted hypothesis (but not others) is present in the premises, so that the inference can hardly be taken as ampliative. Abduction has consequently been treated as part of the process whereby already generated hypotheses are judged in terms of their plausibility, simplicity, etc. I propose an interpretation of abduction which supports an ampliative view. It relies on a distinction between two logical stages in the generation of hypotheses, one 'factual' and one 'explanatory'. I also indicate how we may reconstruct Peirce's and Hanson's original inference in an ampliative form.
Much of Western metaphysics has been shaped by the Parmenidean problem of being, as differentiated into the problem of the one and the many and its correlated problem of change; or more precisely, the problem of making sense of any change from not-being to being. The epistemological side of the Parmenidean problem may well be posed as that of how to make sense of any change from not-knowing to knowing. Plato recognized this as an orienting problem for philosophy and posed it as a famous dilemma in the Meno: how can anyone inquire into that which one does not know? A long-standing modern move for handling this problem is to make a basic distinction between the context of justification and the context of discovery. Armed with this distinction, one delimits the proper domain of epistemology to issues of justification and simply skirts the epistemological side of the Parmenidean problem by relegating questions of the discovery or genesis of knowledge to psychology, history, or other social sciences. In this approach, epistemology, like ethics, presupposes an is/ought distinction, and any attempt to include questions of the genesis of knowledge, even partially, in the context of justification commits a genetic fallacy. And though Karl Popper acknowledged the problem of the growth of knowledge as being at the heart of epistemology’s task, he still insisted on the justification/discovery distinction. Notwithstanding Popper, the work of W. V. O. Quine, N. R. Hanson, and most notably Thomas Kuhn has seriously undermined the distinction and renewed the challenge of the Meno for epistemology and for philosophy of science in particular.