The new theory of reference has gained wide popularity. However, a number of noted philosophers have also attempted to reply to the critical arguments of Kripke and others, aiming to vindicate the description theory of reference. Such responses often rest on ingenious novel kinds of descriptions, such as rigidified descriptions, causal descriptions, and metalinguistic descriptions. This prolonged debate raises the doubt whether the different parties really share any understanding of what the central question of the philosophical theory of reference is: what is the main question to which descriptivism and the causal-historical theory have presented competing answers? One aim of the paper is to clarify this issue. The most influential objections to the new theory of reference are critically reviewed. Special attention is also paid to certain important later advances in the new theory of reference, due to Devitt and others.
The issue of downward causation (and mental causation in particular) and the exclusion problem are discussed by taking into account some recent advances in the philosophy of science. The problem is viewed from the perspective of the new interventionist theory of causation developed by Woodward. It is argued that from this viewpoint, a higher-level (e.g., mental) state can sometimes truly be causally relevant, and moreover, that the underlying physical state which realizes it may fail to be so.
The concept of truth and competing philosophical theories on what truth amounts to have an important place in contemporary philosophy. The aim of this chapter is to give a synopsis of different theories of truth and the particular philosophical issues related to the concept of truth. The literature on this topic is vast, and we must necessarily be rather selective and very brief about complex questions of interpretation of various philosophers. The focus of the chapter is mainly on selected systematic issues and the most influential and well-established philosophical theories and key concepts.
The new externalist picture of natural kind terms due to Kripke, Putnam, and others has become quite popular in philosophy. Many philosophers of science, however, have remained sceptical. Häggqvist and Wikforss have recently criticised this view severely. They contend that it depends essentially on a micro-essentialist view of natural kinds that is widely rejected among philosophers of science, and that a scientifically reasonable metaphysics entails the resurrection of some version of descriptivism. It is argued in this paper that the situation is not quite as dark for the new theory of reference as many critics suggest. There are several distinct questions here which should not be conflated and ought to be dealt with one by one. Descriptivism remains arguably problematic.
The rather unrestrained use of second-order logic in the neo-logicist program is critically examined. It is argued in some detail that it brings with it genuine set-theoretical existence assumptions and that the mathematical power that Hume’s Principle seems to provide, in the derivation of Frege’s Theorem, comes largely from the ‘logic’ assumed rather than from Hume’s Principle itself. It is shown that Hume’s Principle is in reality not stronger than the very weak Robinson Arithmetic Q. Consequently, only a few rudimentary facts of arithmetic are logically derivable from Hume’s Principle. And that hardly counts as a vindication of logicism.
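For reference, Hume’s Principle can be stated in second-order notation roughly as follows (a standard formulation, not a quotation from the paper; here #F denotes ‘the number of Fs’):

% Hume's Principle: the number of Fs equals the number of Gs iff
% there is a relation R correlating the Fs one-to-one with the Gs.
\[
\#F = \#G \;\leftrightarrow\; \exists R \,\bigl[\, \forall x\,(Fx \to \exists! y\,(Gy \wedge Rxy)) \;\wedge\; \forall y\,(Gy \to \exists! x\,(Fx \wedge Rxy)) \,\bigr]
\]

Frege’s Theorem is the result that the second-order Peano axioms are derivable from this principle in full second-order logic; the paper’s point is that much of that derivational power resides in the background logic rather than in the principle.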
Intuitionism’s disagreement with classical logic is standardly based on its specific understanding of truth. But different intuitionists have actually explicated the notion of truth in fundamentally different ways. These are considered systematically and separately, and evaluated critically. It is argued that each account faces difficult problems. They all either have implausible consequences or are viciously circular.
In the theory of meaning, it is common to contrast truth-conditional theories of meaning with theories which identify the meaning of an expression with its use. One rather exact version of the somewhat vague use-theoretic picture is the view that the standard rules of inference determine the meanings of logical constants. Often this idea also functions as a paradigm for more general use-theoretic approaches to meaning. In particular, the idea plays a key role in the anti-realist program of Dummett and his followers. In the theory of truth, a key distinction is now made between substantial theories and minimalist or deflationist views. According to the former, truth is a genuine substantial property of the truth-bearers, whereas according to the latter, truth does not have any deeper essence; all that can be said about truth is contained in T-sentences (sentences having the form: ‘P’ is true if and only if P). There is no necessary analytic connection between the above theories of meaning and truth, but they nevertheless have some connections. Realists often favour some kind of truth-conditional theory of meaning and a substantial theory of truth (in particular, the correspondence theory). Minimalists and deflationists about truth characteristically advocate the use theory of meaning (e.g. Horwich). Semantic anti-realism (e.g. Dummett, Prawitz) forms an interesting middle case: its starting point is the use theory of meaning, but it usually accepts a substantial view of truth, namely that truth is to be equated with verifiability or warranted assertability. When truth is so understood, it is also possible to accept the idea that meaning is closely related to truth-conditions, and hence the conflict between use theories and truth-conditional theories in a sense disappears on this view.
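As a concrete instance of a T-sentence of the form just mentioned (the standard textbook example, assuming English serves as both object language and metalanguage):

% A T-sentence: the quoted sentence is mentioned on the left
% and used on the right.
\[
\text{`Snow is white' is true} \;\leftrightarrow\; \text{snow is white}
\]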
A popular approach in philosophy, the so-called Canberra Plan, is critically scrutinized. Two aspects of this research program, the formal and the informal program, are distinguished. It is argued that the formal program runs up against certain serious technical problems. It is also argued that the informal program involves an unclear leap at its core. Consequently, it is argued that the whole program is much more problematic than its advocates recognize.
Gödel's two incompleteness theorems are among the most important results in modern logic, and have deep implications for various issues. They concern the limits of provability in formal axiomatic theories. The first incompleteness theorem states that in any consistent formal system F within which a certain amount of arithmetic can be carried out, there are statements of the language of F which can neither be proved nor disproved in F. According to the second incompleteness theorem, such a formal system cannot prove that the system itself is consistent (assuming it is indeed consistent). These results have had a great impact on the philosophy of mathematics and logic. There have been attempts to apply the results also in other areas of philosophy such as the philosophy of mind, but these attempted applications are more controversial. The present entry surveys the two incompleteness theorems and various issues surrounding them.
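Schematically, the two theorems can be displayed as follows (a standard formulation under the entry’s assumptions that F is consistent, effectively axiomatized, and contains a certain amount of arithmetic):

% First incompleteness theorem: some sentence G_F of the language
% of F is neither provable nor refutable in F.
\[
F \nvdash G_F \qquad \text{and} \qquad F \nvdash \neg G_F
\]
% Second incompleteness theorem: F does not prove the arithmetized
% statement Cons(F) expressing F's own consistency.
\[
F \nvdash \mathrm{Cons}(F)
\]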
The minimalist view of truth endorsed by Paul Horwich denies that truth has any underlying nature. According to minimalism, the truth predicate ‘exists solely for the sake of a certain logical need’; ‘the function of the truth predicate is to enable the explicit formulation of schematic generalizations’. Horwich proposes that all there really is to truth follows from the equivalence schema: The proposition that p is true iff p, or, using Horwich’s notation, ⟨p⟩ is true ↔ p. The (unproblematic) instances of the schema form ‘the minimal theory of truth’. Horwich claims that all the facts involving truth can be explained on the basis of the minimal theory. However, it has been pointed out, e.g. by Gupta (1993), that the minimal theory is too weak to entail any general facts about truth, e.g. the fact that…
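In cleaner notation, the equivalence schema reads as below; the second line illustrates the kind of general fact at issue in Gupta’s objection (this particular example is drawn from the surrounding literature, not from the truncated abstract):

% The equivalence schema: each (unproblematic) instance is an
% axiom of the minimal theory.
\[
\langle p \rangle \text{ is true} \;\leftrightarrow\; p
\]
% A generalization of the sort the minimal theory arguably cannot
% entail from its instances alone:
\[
\text{every proposition of the form } \langle p \to p \rangle \text{ is true}
\]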
Causal descriptivism and its relative nominal descriptivism are critically examined. It is argued that they do not manage to undermine the principal conclusions of the new theory of reference.
Jaegwon Kim’s views on mental causation and the exclusion argument are evaluated systematically. Particular attention is paid to different theories of causation. It is argued that the exclusion argument and its premises do not cohere well with any systematic view of causation.
The aim of this paper is to comprehensively question the validity of the standard way of interpreting Chaitin's famous incompleteness theorem, which says that for every formalized theory of arithmetic there is a finite constant c such that the theory in question cannot prove any particular number to have Kolmogorov complexity larger than c. The received interpretation of the theorem claims that the limiting constant is determined by the complexity of the theory itself, which is assumed to be a good measure of the strength of the theory. I exhibit certain strong counterexamples and establish conclusively that the received view is false. Moreover, I show that the limiting constants provided by the theorem do not in any way reflect the power of formalized theories, but that the values of these constants are actually determined by the chosen coding of Turing machines, and are thus quite accidental.
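Schematically, Chaitin’s theorem states (a standard formulation, with K(n) the Kolmogorov complexity of n and T the formalized theory in question):

% For each T there is a finite constant c such that T proves no
% statement of the form K(n) > c for any particular number n.
\[
\exists c \;\forall n \;\; T \nvdash \ulcorner K(n) > c \urcorner
\]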
An argument, different from the Newman objection, against the view that the cognitive content of a theory is exhausted by its Ramsey sentence is reviewed. The crux of the argument is that Ramsification may ruin inductive systematization between theory and observation. The argument also has some implications concerning the issue of underdetermination.
The problem of mental causation is discussed by taking into account some recent developments in the philosophy of science. The problem is viewed from the perspective of the new interventionist theory of causation developed by Woodward. The import for mental causation of the idea that causal claims involve contrastive classes is also discussed. It is argued that mental causation is much less of a problem than it has appeared to be.
Hilary Putnam's famous arguments criticizing Tarski's theory of truth are evaluated. It is argued that they do not succeed in undermining Tarski's approach. One of the arguments is based on the problematic idea of a false instance of the T-schema. The other ignores various issues essential to Tarski's setting, such as the language-relativity of truth definitions.
The most popular and influential strategies used against semantic externalism and the causal theory of reference are critically evaluated. On closer examination, none of them proves particularly convincing.
Chaitin’s incompleteness result related to random reals and the halting probability has been advertised as the ultimate and the strongest possible version of the incompleteness and undecidability theorems. It is argued that such claims are exaggerations.
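For concreteness, the halting probability in question is Chaitin’s Omega, standardly defined relative to a fixed universal prefix-free machine U (a textbook definition, not a quotation from the paper):

% Omega: the probability that U halts on a randomly generated
% binary program; the sum ranges over all halting programs p,
% with |p| the length of p in bits.
\[
\Omega \;=\; \sum_{U(p)\downarrow} 2^{-|p|}
\]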
In the early 20th century, scepticism was common among philosophers about the very meaningfulness of the notion of truth, and of the related notions of denotation, definition, etc. (i.e., what Tarski called semantical concepts). Awareness was growing of the various logical paradoxes and anomalies arising from these concepts. In addition, more philosophical reasons were being given for this aversion. The atmosphere changed dramatically with Alfred Tarski’s path-breaking contribution. What Tarski did was to show that, assuming that the syntax of the object language is specified exactly enough, and that the metatheory has a certain amount of set-theoretic power, one can explicitly define truth for the object language. And what can be explicitly defined can be eliminated. It follows that the defined concept cannot give rise to any inconsistencies (that is, paradoxes). This gave new respectability to the concept of truth and related notions. Nevertheless, philosophers’ judgements on the nature and philosophical relevance of Tarski’s work have varied. It is my aim here to review and evaluate some threads in this debate.
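The adequacy condition Tarski placed on such a definition, his Convention T, can be put schematically as follows (a standard formulation; s ranges over sentences of the object language and p is the metalanguage translation of s):

% Convention T: an adequate definition of truth for an object
% language L must entail every instance of the schema below.
\[
\mathrm{True}(\ulcorner s \urcorner) \;\leftrightarrow\; p
\]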
David Chalmers’ two-dimensionalism is an ambitious philosophical program that aims to “ground” or “construct” Fregean meanings and restore “the golden triangle” of apriority, necessity, and meaning that Kripke seemingly broke. This paper aims to examine critically what Chalmers’ theory can in reality achieve. It is argued that the theory faces severe challenges. There are some gaps in the overall arguments, and the reasoning is in some places somewhat circular. Chalmers’ theory is effectively founded on certain strong philosophical assumptions. It is concluded that it is unclear whether the theory can deliver all it promises.
For example, Cheryl Misak in her book-length examination of verificationism writes that ‘the holist [such as Quine] need not reject verificationism, if it is suitably formulated. Indeed, Quine often describes himself as a verificationist’.[iii] Misak concludes that Quine ‘can be described as a verificationist who thinks that the unit of meaning is large’;[iv] and when comparing Dummett and Quine, Misak states that ‘both can be, and in fact are, verificationists’.[v]
It has sometimes been suggested that the so-called new theory of reference (NTR) would provide an alternative picture of meaning and reference which avoids the unwelcome consequences of the meaning-variance thesis and incommensurability. However, numerous philosophers of science have been quite critical towards the idea and NTR in general. It is argued that many of them have an over-simplified and, in part, mistaken understanding of what NTR amounts to. It is submitted that NTR, when correctly understood, can be an important ingredient in the realist toolkit for defending the rationality of science.
Here the relationship between understanding and knowledge of meaning is discussed from two different perspectives: that of Dummettian semantic anti-realism, and that of the semantic externalism of Putnam and others. The question addressed is whether the truth of semantic externalism would undermine a central premise in one of Dummett's key arguments for anti-realism, insofar as Dummett's premise involves an assumption about the transparency of meaning, and semantic externalism is often taken to undermine such transparency. Several notions of transparency and conveyability of meaning are distinguished, and it is argued that, though the Dummettian argument for anti-realism presupposes only a weak connection between knowledge of meaning and understanding, even this much is not trivially true in light of semantic externalism. Semantic externalism, if true, would thus provide a reason for rejecting the crucial assumption on which the Dummettian argument depends.
After sketching the main lines of Hilbert's program, certain well-known and influential interpretations of the program are critically evaluated, and an alternative interpretation is presented. Finally, some recent developments in logic related to Hilbert's program are reviewed.
Philosophers’ judgements on the philosophical value of Tarski’s contributions to the theory of truth have varied. For example, Karl Popper, Rudolf Carnap, and Donald Davidson have, in their different ways, celebrated Tarski’s achievements and have been enthusiastic about their philosophical relevance. Hilary Putnam, on the other hand, pronounces that “[a]s a philosophical account of truth, Tarski’s theory fails as badly as it is possible for an account to fail.” Putnam has several alleged reasons for his dissatisfaction, but one of them, the one I call the modal objection (cf. Raatikainen 2003), has been particularly influential. In fact, very similar objections have been presented over and over again in the literature. Already in 1954, Arthur Pap had criticized Tarski’s account with a similar argument (Pap 1954). Moreover, both Scott Soames (1984) and John Etchemendy (1988) use, with an explicit reference to Putnam, similar modal arguments in relation to Tarski. Richard Heck (1997), too, shows some sympathy for such considerations. Simon Blackburn (1984, Ch. 8) has put forward a related argument against Tarski. Recently, Marian David has criticized Tarski’s truth definition with an analogous argument as well (David 2004, pp. 389–390). This line of argument is thus apparently one of the most influential critiques of Tarski. It is certainly worthy of serious attention. Nevertheless, I shall argue that, given closer scrutiny, it does not present such an acute problem for the Tarskian approach to truth as many philosophers think. But I also believe that it is important to understand clearly why this is so. Moreover, I think that a careful consideration of the issue illuminates certain important but somewhat neglected aspects of the Tarskian approach.
The essay examines the views expressed in von Wright's Explanation and Understanding (1971) on human action and historical events from the perspective of recent philosophy of science. Connecting causal explanation tightly to covering laws, as von Wright does, is found to be problematic, and his logical connection argument to be invalid. On the other hand, von Wright's sketched theory of causation, which is based on the concept of manipulation, proves to be on the right track in light of current knowledge. From this perspective, however, there is no obstacle to explaining human action causally. This is illustrated with two examples from historical research. Finally, von Wright's idea that a complete account of the historical past is never achieved, because the past can always be re-evaluated, is briefly discussed.
Quine’s thesis of the indeterminacy of translation has puzzled the philosophical community for several decades. It is unquestionably among the best known and most disputed theses in contemporary philosophy. Quine’s classical argument for the indeterminacy thesis, in his seminal work Word and Object, has even been described by Putnam as “what may well be the most fascinating and the most discussed philosophical argument since Kant’s Transcendental Deduction of the Categories” (Putnam, 1975a: p. 159).
A book review of Jordan B. Peterson, 12 elämänohjetta. Käsikirja kaaosta vastaan (12 Rules for Life: An Antidote to Chaos, 2018). Translated into Finnish by Tero Valkonen. WSOY, Helsinki 2018.
Roads to Reference: An Essay on Reference Fixing in Natural Language by Mario Gómez-Torrente mounts an extensive attack on certain more recent variants of descriptivism in the theory of reference. The book discusses a wide variety of expressions, but the focus of this short note is on proper names and natural kind terms. In the case of proper names, indeterminacy plays an important role in Gómez-Torrente’s critical argument. Some questions related to it are raised. As to natural kind terms, the differences between ordinary kind terms and scientific kind terms are discussed, and the somewhat ignored relevance of reference borrowing for natural kind terms is emphasized.
"Explanation and Understanding" (1971) by Georg Henrik von Wright is a modern classic in analytic hermeneutics, and in the philosophy of the social sciences and humanities in general. In this work, von Wright argues against naturalism, or methodological monism, i.e. the idea that both the natural sciences and the social sciences follow broadly the same general scientific approach and aim to achieve causal explanations. Against this view, von Wright contends that the social sciences are qualitatively different from the natural sciences: (...) according to his view, the natural sciences aim at causal explanations, whereas the purpose of the social sciences is to understand their subjects. In support of this conviction, von Wright also puts forward a version of the so-called logical connection argument. -/- Von Wright views scientific explanation along the lines of the traditional covering law model. He suggests that the social sciences, in contrast, utilize what he calls “practical syllogism” in understanding human actions. In addition, von Wright presents in this work an original picture on causation: a version of the manipulability theory of causation. -/- In the four decades following von Wright’s classic work, the overall picture in in the philosophy of science has changed significantly, and much progress has been made in various fronts. The aim of the contribution is to revisit the central ideas of "Explanation and Understanding" and evaluate them from this perspective. The covering law model of explanation and the regularity theory of causation behind it have since then fallen into disfavor, and virtually no one believes that causal explanations even in the natural sciences comply with the covering law model. No wonder then that covering law explanations are not found in the social sciences either. Ironically, the most popular theory of causal explanation in the philosophy of science nowadays is the interventionist theory, which is a descendant of the manipulability theory of von Wright and others. However, this theory can be applied with no special difficulties in both the natural sciences and the social sciences. -/- Von Wright’s logical connection argument and his ideas concerning practical syllogisms are also critically assessed. It is argued that in closer scrutiny, they do not pose serious problems for the view that the social sciences too provide causal explanations. In sum, von Wright’s arguments against naturalism do not appear, in today’s perspective, particularly convincing. (shrink)
The most popular and influential strategies used against semantic externalism and the causal theory of reference are critically examined. It is argued that upon closer scrutiny, none of them emerges as truly convincing.
Science policy has been dominated by the idea that the use of funds can be made more efficient and the quality of research improved by competition, evaluation, and concentrating funding on top-level research. Top-level research is thought to require large research groups. In addition, funding has increasingly been concentrated on interdisciplinary projects. This essay examines these trends critically in light of the best current knowledge in science studies and the philosophy of science. It appears that the popular policy does not lead to the desired outcomes.
The prospects and limitations of defining truth in a finite model in the same language whose truth one is considering are thoroughly examined. It is shown that in contradistinction to Tarski's undefinability theorem for arithmetic, it is in a definite sense possible in this case to define truth in the very language whose truth is in question.
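For contrast, Tarski’s undefinability theorem, against which the result is set, says that no formula of the language of arithmetic defines arithmetical truth in the standard model (a standard formulation, with ⌜ψ⌝ the Gödel number of ψ):

% Undefinability of arithmetical truth: no arithmetical formula
% phi is true of exactly the Gödel numbers of the true sentences.
\[
\neg \exists \varphi \;\forall \psi \;\; \bigl( \mathbb{N} \models \varphi(\ulcorner \psi \urcorner) \;\leftrightarrow\; \mathbb{N} \models \psi \bigr)
\]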
Three influential forms of realism are distinguished and interrelated: realism about the external world, construed as a metaphysical doctrine; scientific realism about non-observable entities postulated in science; and semantic realism as defined by Dummett. Metaphysical realism about everyday physical objects is contrasted with idealism and phenomenalism, and several potent arguments against these latter views are reviewed.

Three forms of scientific realism are then distinguished: (i) scientific theories and their existence postulates should be taken literally; (ii) the existence of unobservable entities posited by our most successful scientific theories is justified scientifically; and (iii) our best current scientific theories are at least approximately true. It is argued that only some form of scientific realism can make proper sense of certain episodes in the history of science.

Finally, Dummett’s influential formulation of semantic issues about realism is considered. Dummett argued that in some cases the fundamental issue is not about the existence of entities, but rather about whether statements of some specified class (such as mathematics) have an objective truth value, independently of our means of knowing it. Dummett famously argued against such semantic realism and in favor of anti-realism. The relation of semantic realism to the metaphysical construal of realism, and Dummett’s main argument against semantic realism, are examined.
In recent years, a new approach known as “evolutionary psychology” has attracted much attention in the human sciences. It has been claimed, for instance, that evolution has shaped our mate-selection preferences so that men have a tendency to be attracted to young women who look fertile, to seek to mate with as many women as possible whenever the opportunity arises, and to be jealous, whereas women are inclined to prefer older men with power and resources. Natural selection has also been invoked to explain, among other things, rape.