In what sense is the direction of time a matter of convention? In 'The Direction of Time', Hans Reichenbach makes brief reference to parallels between his views about the status of time’s direction and his conventionalism about geometry. In this article, I: (1) provide a conventionalist account of time direction motivated by a number of Reichenbach’s claims in the book; (2) show how forwards and backwards time can give equivalent descriptions of the world despite the former being the ‘natural’ direction of time; and (3) argue that this offers an important middle-ground position between existing realist and antirealist accounts of the direction of time.
Poincaré is well known for his conventionalism and structuralism. However, the relationship between these two theses and their place in Poincaré’s epistemology of science remains puzzling. In this paper I show the scope of Poincaré’s conventionalism and its position in Poincaré’s hierarchical approach to scientific theories. I argue that for Poincaré scientific knowledge is relational and made possible by synthetic a priori, empirical and conventional elements, which, however, are not chosen arbitrarily. By examining his geometric conventionalism, his hierarchical account of science and defence of continuity in theory change, I argue that Poincaré defends a complex structuralist position based on synthetic a priori and conventional elements, the mind-dependence of which precludes epistemic access to mind-independent structures.
Conventionalism about mathematics claims that mathematical truths are true by linguistic convention. This is often spelled out by appealing to facts concerning rules of inference and formal systems, but this leads to a problem: since the incompleteness theorems, we’ve known that syntactic notions can be expressed using arithmetical sentences. There is serious prima facie tension here: how can mathematics be a matter of convention and syntax a matter of fact given the arithmetization of syntax? This challenge has been pressed in the literature by Hilary Putnam and Peter Koellner. In this paper I sketch a conventionalist theory of mathematics, show that this conventionalist theory can meet the challenge just raised, and clarify the type of mathematical pluralism endorsed by the conventionalist by introducing the notion of a semantic counterpart. The paper’s aim is an improved understanding of conventionalism, pluralism, and the relationship between them.
I discuss two subjects in Samir Okasha’s excellent book, Evolution and the Levels of Selection. In consonance with Okasha’s critique of the conventionalist view of the units of selection problem, I argue that conventionalists have not attended to what realists mean by group, individual, and genic selection. In connection with Okasha’s discussion of the Price equation and contextual analysis, I discuss whether the existence of these two quantitative frameworks is a challenge to realism.
One common objection to Conventionalism about modality is that since it is contingent what our conventions are, the modal facts themselves will thereby be contingent. A standard reply is that Conventionalists can accept this, if they reject the S4 axiom, that what is possibly possible is possible. I first argue that this reply is inadequate, but then continue to argue that it is not needed, because the Conventionalist need not concede that the contingency of our conventions has any bearing on the modal status of necessary truths. It is explained why this does not compromise the Conventionalist claim that necessity – and particularly, essence – is due to conventions.
We argue for a new conventionalism about many kinds of evolutionary groups, including clades, cohesive units, and populations. This rejects a consensus, which says that given any one of the many legitimate grouping concepts, only objective biological facts determine whether a collection is such a group. Surprisingly, being any one kind of evolutionary group typically depends on which of many incompatible values are taken by suppressed variables. This is a novel pluralism underlying most any one group concept, rather than a familiar pluralism claiming many concepts are legitimate. Consequently, we must help biological facts determine grouphood, even when given a single grouping concept.
I motivate “Origin Conventionalism”—the view that which facts about one’s origins are essential to one’s existence in part depend on our person-directed attitudes. One important upshot of the view is that it offers a novel and attractive solution to the Nonidentity Problem. The Nonidentity Problem typically assumes that the sperm-egg pair from which a person originates is essential to that person’s existence; if so, then for many future persons that come into existence under adverse conditions, had those conditions not been realized, the individuals wouldn't have existed. This is problematic since it delivers the counter-intuitive conclusion that it’s not wrong to bring about such adverse conditions since they don’t harm anyone. Origin Conventionalism, in contrast, holds that whether a person’s sperm-egg origin is essential to their existence depends on their person-directed attitudes. I argue that this provides a unique and attractive way of preserving the intuition that the actions in the ‘nonidentity cases’ are morally wrong because of the potential harm done to the individuals in question.
This paper examines whether, and in what contexts, Duhem’s and Poincaré’s views can be regarded as conventionalist or structural realist. After analysing the three different contexts in which conventionalism is attributed to them – in the context of the aim of science, the underdetermination problem and the epistemological status of certain principles – I show that neither Duhem’s nor Poincaré’s arguments can be regarded as conventionalist. I argue that Duhem and Poincaré offer different solutions to the problem of theory choice and differ in their stances towards scientific knowledge and the status of scientific principles, making their epistemological claims substantially different.
The daring idea that convention - human decision - lies at the root both of necessary truths and much of empirical science reverberates through twentieth-century philosophy, constituting a revolution comparable to Kant's Copernican revolution. This is the first comprehensive study of Conventionalism. Drawing a distinction between two conventionalist theses, the under-determination of science by empirical fact, and the linguistic account of necessity, Yemima Ben-Menahem traces the evolution of both ideas to their origins in Poincaré's geometric conventionalism. She argues that the radical extrapolations of Poincaré's ideas by later thinkers, including Wittgenstein, Quine, and Carnap, eventually led to the decline of conventionalism. This book provides a new perspective on twentieth-century philosophy. Many of the major themes of contemporary philosophy emerge in this book as arising from engagement with the challenge of conventionalism.
The power to promise is morally fundamental and does not, at its foundation, derive from moral principles that govern our use of conventions. Of course, many features of promising have conventional components—including which words, gestures, or conditions of silence create commitments. What is really at issue between conventionalists and nonconventionalists is whether the basic moral relation of promissory commitment derives from the moral principles that govern our use of social conventions. Other nonconventionalist accounts make problematic concessions to the conventionalist's core instincts, including embracing: the view that binding promises must involve the promisee's belief that performance will occur; the view that through the promise, the promisee and promisor create a shared end; and the tendency to take promises between strangers, rather than intimates, as the prototypes to which a satisfactory account must answer. I argue against these positions and then pursue an account that finds its motivation in their rejection. My main claim is: the power to make promises, and other related forms of commitment, is an integral part of the ability to engage in special relationships in a morally good way. The argument proceeds by examining what would be missing, morally, from intimate relationships if we lacked this power.
We build on Morgan’s deep conventionalist base by offering a pragmatic approach for achieving normative progress on sport’s most intractable problems (e.g. performance enhancement...
Historically, opponents of realism have managed to slip beneath a key objection which realists raise against them. The opponents say that some element of the world is constructed by our cognitive practices; realists retort that the element would have existed unaltered, had our practices differed; the opponents sometimes agree, contending that we construct in just such a way as to render the counterfactual true. The contemporary instalment of this debate starts with conventionalism about modality, which holds that the borders of the world's kinds and the careers of individuals in those kinds obtain only relative to our conventions of individuation. Realists object that the kinds and careers in nature would still have obtained, had our conventions been different, but conventionalists claim to be able to agree. I argue that this claim is false, and that conventionalism contradicts itself.
Conventionalism in sport philosophy has been rejected as unable to provide a theory of normativity and as collapsing into ethical relativism, but this criticism is rather imprecise about its target, which invites doubt about the legitimacy of the concept of conventionalism described by its critics. Instead, a more charitable and legitimate account of conventionalism is proposed, one that draws inspiration from conventionalism in axiomatic geometry and is able to avoid the counterarguments directed against conventionalism. This new model allows for a number of non-conventional elements of sport, namely the definition of sport and certain central moral norms, while at the same time arguing that normativity in sport is not exhausted by them, which leaves athletic communities with authority over a broad range of norms.
The subject of this investigation is the role of conventions in the formulation of Thomas Reid’s theory of the geometry of vision, which he calls the ‘geometry of visibles’. In particular, we will examine the work of N. Daniels and R. Angell who have alleged that, respectively, Reid’s ‘geometry of visibles’ and the geometry of the visual field are non-Euclidean. As will be demonstrated, however, the construction of any geometry of vision is subject to a choice of conventions regarding the construction and assignment of its various properties, especially metric properties, and this fact undermines the claim for a unique non-Euclidean status for the geometry of vision. Finally, a suggestion is offered for trying to reconcile Reid’s direct realist theory of perception with his geometry of visibles. While Thomas Reid is well-known as the leading exponent of the Scottish ‘common-sense’ school of philosophy, his role in the history of geometry has only recently been drawing the attention of the scholarly community. In particular, several influential works, by N. Daniels and R. B. Angell, have claimed Reid as the discoverer of non-Euclidean geometry; an achievement, moreover, that pre-dates the geometries of Lobachevsky, Bolyai, and Gauss by over a half century. Reid’s alleged discovery appears within the context of his analysis of the geometry of the visual field, which he dubs the ‘geometry of visibles’. In summarizing the importance of Reid’s philosophy in this area, Daniels is led to conclude that ‘there can remain little doubt that Reid intends the geometry of visibles to be an alternative to Euclidean geometry’; while Angell, similarly inspired by Reid, draws a much stronger inference: ‘The geometry which precisely and naturally fits the actual configurations of the visual field is a non-Euclidean, two-dimensional, elliptical geometry.
In substance, this thesis was advanced by Thomas Reid in 1764...’. The significance of these findings has not gone unnoticed in mathematical and scientific circles, moreover, for Reid’s name is beginning to appear more frequently in historical surveys of the development of geometry and the theories of space. Implicit in the recent work on Reid’s ‘geometry of visibles’, or GOV, one can discern two closely related but distinct arguments: first, that Reid did in fact formulate a non-Euclidean geometry, and second, that the GOV is non-Euclidean. This essay will investigate mainly the latter claim, although a lengthy discussion will be accorded to the first. Overall, in contrast to the optimistic reports of a non-Euclidean GOV, it will be argued that there is a great deal of conceptual freedom in the construction of any geometry pertaining to the visual field. Rather than single out a non-Euclidean structure as the only geometry consistent with visual phenomena, an examination of Reid, Daniels, and Angell will reveal the crucial role of geometric ‘conventions’, especially of the metric sort, in the formulation of the GOV. Consequently, while a non-Euclidean geometry is consistent with Reid’s GOV, it is only one of many different geometrical structures that a GOV can possess. Angell’s theory that the GOV can only be construed as non-Euclidean is thus incorrect. After an exploration of Reid’s theory and the alleged non-Euclidean nature of the GOV, in sections 1 and 2 respectively, the focus will turn to the tacit role of conventionalism in Daniels’ reconstruction of Reid’s GOV argument, and in the contemporary treatment of a non-Euclidean visual geometry offered by Angell. Finally, in the conclusion, a suggestion will be offered for a possible reconstruction of Reid’s GOV that does not violate his avowed ‘direct realist’ theory of perception, since this epistemological thesis largely prompted his formulation of the GOV.
Conventionalists about promising believe that it is wrong to break a promise because the promisor takes advantage of a useful social convention only to fail to do his part in maintaining it. Anti-conventionalists claim that the wrong of breaking a promise has nothing essentially to do with a social convention. Anti-conventionalists are right that the social convention is not necessary to explain the wrong of breaking most promises. But conventionalists are right that the convention plays an essential role in any satisfactory account of promising. A new conventionalism can explain this by appealing to special features of social conventions. Two of these special features have important implications for any moral requirements they mediate, such as the requirement to keep one's promises and the moral requirements attached to social or occupational roles. First, these requirements will not depend on features of a situation that are inaccessible to typical participants in the convention. Second, these requirements often cannot be tailored to fit the overly unusual circumstances of participants.
This paper distinguishes three concepts of "race": bio-genomic cluster/race, biological race, and social race. We map out realism, antirealism, and conventionalism about each of these, in three important historical episodes: Frank Livingstone and Theodosius Dobzhansky in 1962, A.W.F. Edwards' 2003 response to Lewontin (1972), and contemporary discourse. Semantics is especially crucial to the first episode, while normativity is central to the second. Upon inspection, each episode also reveals a variety of commitments to the metaphysics of race. We conclude by interrogating the relevance of these scientific discussions for political positions and a post-racial future.
A powerful objection against moral conventionalism says that it gives the wrong reasons for individual rights and duties. The reason why I must not break my promise to you, for example, should lie in the damage to you—rather than to the practice of promising or to all other participants in that practice. Common targets of this objection include the theories of Hobbes, Gauthier, Hooker, Binmore, and Rawls. I argue that the conventionalism of these theories is superficial; genuinely conventionalist theories are not vulnerable to the objection; and genuine moral conventionalism is independently plausible.
This paper examines methodological issues that arose in the course of the development of the inertial frame concept in classical mechanics. In particular it examines the origins and motivations of the view that the equivalence of inertial frames leads to a kind of conventionalism. It begins by comparing the independent versions of the idea found in J. Thomson (1884) and L. Lange (1885); it then compares Lange's conventionalist claims with traditional geometrical conventionalism. It concludes by examining some implications for contemporary philosophy of space and time.
This article examines how Quine and Sellars develop informatively contrasting responses to a fundamental tension in Carnap’s semantics ca. 1950. Quine’s philosophy could well be styled ‘Essays in Radical Empiricism’; his assay of radical empiricism is invaluable for what it reveals about the inherent limits of empiricism. Careful examination shows that Quine’s criticism of Carnap’s semantics in ‘Two Dogmas of Empiricism’ fails, that at its core Quine’s semantics is for two key reasons incoherent and that his hallmark Thesis of Extensionalism is untenable. The tension in Carnap’s semantics together with Quine’s exposure of the severe limits of radical empiricism illuminate central features of Sellars’s philosophy: the fully general form of the myth of givenness, together with Sellars’s alternative Kantian characterisation of understanding; the full significance of Carnap’s distinction between conceptual analysis and conceptual explication, and its important methodological implications; the specifically pragmatic character of Sellars’s realism; and Sellars’s methodological reasons for holding that philosophy must be systematic and that systematic philosophy must be deeply historically and textually informed. This paper thus re-examines this recent episode of philosophical history for its philosophical benefits and systematic insights.
The logical positivists adopted Poincaré's doctrine of the conventionality of geometry and made it a key part of their philosophical interpretation of relativity theory. I argue, however, that the positivists deeply misunderstood Poincaré's doctrine. For Poincaré's own conception was based on the group-theoretical picture of geometry expressed in the Helmholtz-Lie solution of the space problem, and also on a hierarchical picture of the sciences according to which geometry must be presupposed by any properly physical theory. But both of these pictures are entirely incompatible with the radically new conception of space and geometry articulated in the general theory of relativity. The logical positivists' attempt to combine Poincaré's conventionalism with Einstein's new theory was therefore, in the end, simply incoherent. Underlying this problem, moreover, was a fundamental philosophical difference between Poincaré and the positivists concerning the status of synthetic a priori truths.
There are two questions I would like to address in this article. The first and main question is whether there are rules of recognition, along the lines suggested by H.L.A. Hart. The second question concerns the age-old issue of the autonomy of law. One of the main purposes of this article is to show how these two issues are closely related. The concept of a social convention is the thread holding these two points tightly knit in one coil. Basically, I will argue that a novel account of social conventions can be employed to reestablish Hart's thesis about the rules of recognition, and that this same account shows why, and to what extent, law is partly an autonomous practice.
We are confident of many of the judgements we make as to what sorts of alterations the members of nature's kinds can survive, and what sorts of events mark the ends of their existences. But is our confidence based on empirical observation of nature's kinds and their members? Conventionalists deny that we can learn empirically which properties are essential to the members of nature's kinds. Judgements of sameness in kind between members, and of numerical sameness of a member across time, merely project our conventions of individuation. Our confidence is warranted because apart from those conventions there are no phenomena of kind-sameness or of numerical sameness across time. There is just 'stuff' displaying properties. This paper argues that conventionalists can assign no properties to the 'stuff' beyond immediate phenomenal properties. Consequently they cannot explain how each of us comes to be able to wield 'our conventions'.
This paper examines popular ‘conventionalist’ explanations of why philosophers need not back up their claims about how ‘we’ use our words with empirical studies of actual usage. It argues that such explanations are incompatible with a number of currently popular and plausible assumptions about language's ‘social’ character. Alternate explanations of the philosopher's purported entitlement to make a priori claims about ‘our’ usage are then suggested. While these alternate explanations would, unlike the conventionalist ones, be compatible with the more social picture of language, they are each shown to face serious problems of their own.
Quine's arguments for the indeterminacy of translation demonstrate the existence and help to explain the rationale of restraints upon what we can say and understand. In particular they show that there are logical truths to which there are no intelligible alternatives. Thus the standard view that the truths of logic and mathematics differ from "synthetic" statements in being true solely by virtue of linguistic convention--which requires for its plausibility the existence of intelligible alternatives to our present logical truth--is opposed directly, and not by the espousal of "a more thorough pragmatism". This raises problems about possibility and conceptual novelty.
According to linguistic conventionalism, necessities are to be explained in terms of the conventionally adopted rules that govern the use of linguistic expressions. A number of influential arguments against this view concern the ‘Truth-Contrast Thesis’. This is the claim that necessary truths are fundamentally different from contingent ones since they are not made true by ‘the facts’. Instead, they are supposed to be something like ‘true in virtue of meaning’. This thesis is widely held to be a core commitment of the conventionalist position, and the view is frequently rejected on the grounds that this thesis is untenable. I argue that this line of reasoning is mistaken. While the thesis should be rejected, it is not, I argue, entailed by linguistic conventionalism – nor was it invariably accepted by the paradigmatic conventionalists.
A new reading of Plato's account of conventionalism about names in the Cratylus. It argues that Hermogenes' position, according to which a name is whatever anybody 'sets down' as one, does not have the counterintuitive consequences usually claimed. At the same time, Plato's treatment of conventionalism needs to be related to his treatment of formally similar positions in ethics and politics. Plato is committed to standards of objective natural correctness in all such areas, despite the problematic consequences which, as he himself shows, arise in the case of language.
This edited volume offers a new approach to understanding social conventions by way of Martin Heidegger. It connects the philosopher's conceptions of the anyone, everydayness, and authenticity with an analysis and critique of social normativity. Heidegger’s account of the anyone is ambiguous. Some see it as a good description of human sociality, others think of it as an important critique of modern mass society. This volume seeks to understand this ambiguity as reflecting the tension between the constitutive function of conventions for human action and the critical aspects of conformism. It argues that Heidegger’s anyone should neither be reduced to its pejorative nor its constitutive dimension. Rather, the concept could show how power and norms function. This volume would be of interest to scholars and students of philosophy and the social sciences who wish to investigate the social applications of the works of Martin Heidegger.
It is shown that moral relativism ('morality is culture-specific') and moral conventionalism ('moral laws are agreements among people as to how to behave') both presuppose the truth of moral realism and are therefore false. It is also shown that every attempt to trivialize moral truth or to prove its non-existence is inconsistent with the fact that moral statements have the same truth-conditions as biological statements.
Roughly speaking, all economists can be divided into two groups--those who agree with Milton Friedman and those who do not. Both groups, however, espouse the view that science is a series of approximations to a demonstrated accord with reality. Methodological controversy in economics is now merely a Conventionalist argument over which comes first--simplicity or generality. Furthermore, this controversy in its current form is not compatible with one important new and up-and-coming economic (welfare) theory called "the theory of the Second Best." In this paper I offer a Second Best meta-theory that says that (1) any compromise between simplicity and generality must yield a theory which is "third best" by these Conventionalist criteria; and (2) there exists a better way than a compromise.
Rawlsians argue for principles of justice that apply exclusively to the basic structure of society, but it can seem strange that those who accept these principles should not also regulate their choices by them. Valid moral principles should seemingly identify ideals for both institutions and individuals. What justifies this nonintuitive distinction between institutional and individual principles is not a moral division of labor but Rawls’s dual commitments to conventionalism and constructivism. Conventionalism distinguishes the relevant ideals for evaluating institutions from those for evaluating actions, while constructivism explains why this distinction is morally fundamental.
What is the source of logical and mathematical truth? This book revitalizes conventionalism as an answer to this question. Conventionalism takes logical and mathematical truth to have their source in linguistic conventions. This was an extremely popular view in the early 20th century, but it was never worked out in detail and is now almost universally rejected in mainstream philosophical circles. Shadows of Syntax is the first book-length treatment and defense of a combined conventionalist theory of logic and mathematics. It argues that our conventions, in the form of syntactic rules of language use, are perfectly suited to explain the truth, necessity, and apriority of logical and mathematical claims, as well as our logical and mathematical knowledge.
Historically, opponents of realism have argued that the world’s objects are constructed by our cognitive activities—or, less colorfully, that they exist and are as they are only relative to our ways of thinking and speaking. To this realists have stoutly replied that even if we had thought or spoken in ways different from our actual ones, the world would still have been populated by the same objects as it actually is, or at least by most of them. (Our thinking differently could cause some differences in which objects exist, or in what some existing objects are like, but that is another matter.) Yet this reply has repeatedly failed to amount to a decisive objection. For opponents of realism have repeatedly argued, in one way or another, that we construct the world’s objects in just such a way as to render such a counterfactual true. We construct them so as to appear not to be our constructs. Just such a debate is currently underway concerning the properties that are essential to the world’s objects. It is widely agreed, with varying caveats, that there are such properties—that by virtue of belonging to one or another natural kind, the world’s objects possess certain properties essentially, and have individual careers that last exactly as long as those essential properties are jointly present. But what underlies the status as essential of the properties that are thus essential to objects in the world? The realist answer treats essential status as mind-independent, and assigns it to the way the world works (Elder...
The law persists because people have reasons to comply with its rules. What characterizes those reasons is their interdependence: each of us only has a reason to comply because he or she expects the others to comply for the same reasons. The rules may help us to solve coordination problems, but the interaction patterns regulated by them also include Prisoner's Dilemma games, Division problems and Assurance problems. In these "games" the rules can only persist if people can be expected to be moved by considerations of fidelity and fairness, not only of prudence. This book takes a fresh look at the perennial problems of legal philosophy—the source of obligation to obey the law, the nature of authority, the relationship between law and morality, and the nature of legal argument—from the perspective of this conventionalist understanding of social rules. It argues that, since the resilience of such rules depends on cooperative dispositions, conventionalism, properly understood, does not imply positivism.
I consider Plato’s argument, in the dialogue Cratylus, against both of two opposed views of the “correctness of names.” The first is a conventionalist view, according to which this relationship is arbitrary, the product of a free inaugural decision made at the moment of the first institution of names. The second is a naturalist view, according to which the correctness of names is initially fixed and subsequently maintained by some kind of natural assignment, rooted in the things themselves. I argue that: 1) Plato’s critical challenge to both views anticipates considerations introduced by Wittgenstein in the Philosophical Investigations’ consideration of rules and rule-following; 2) Understanding Plato’s appeal to the “form” [eidos] of a thing in resolving the problems of both views helps to explicate Wittgenstein’s own appeal to “forms of life” as the “given” ground of linguistic practice; and 3) We should not understand the grounding of language in form-of-life either as a basis in the plural practices of different communities, or as a biological/anthropological basis in the specific nature of the human organism. Rather, it points to an autonomous dimension of form, which articulates the relationship between language and life as it relates to the possibility of truth.
Are the properties of communicative acts grounded in the intentions with which they are performed, or in the conventions that govern them? The latest round in this debate has been sparked by Ernie Lepore and Matthew Stone, who argue that much more of communication is conventional than we thought, and that the rest isn’t really communication after all, but merely the initiation of open-ended imaginative thought. I argue that although Lepore and Stone may be right about many of the specific cases they discuss, their big-picture, conventionalist conclusions don’t follow. My argument focuses on four phenomena that present challenges to conventionalist accounts of communication: ambiguity, indirect communication, communication by wholly unconventional means, and convention acquisition.
Call the conventionalist challenge to natural rights theory the claim that natural rights theory fails to capture the fact that moral rights are shaped by social and legal convention. While the conventionalist challenge is a natural concern, it is less than clear what this challenge amounts to. This paper aims to develop a clear formulation strong enough to put pressure on the natural rights theorist and precise enough to clarify what an adequate response would require.
We discuss in this paper the question of the scope of the principle of tolerance about languages promoted in Carnap's The Logical Syntax of Language and the nature of the analogy between it and the rudimentary conventionalism purportedly exhibited in the work of Poincaré and Hilbert. We take it more or less for granted that Poincaré and Hilbert do argue for conventionalism. We begin by sketching Coffa's historical account, which suggests that tolerance be interpreted as a conventionalism that allows us complete freedom to select whatever language we wish—an interpretation that generalizes the conventionalism promoted by Poincaré and Hilbert which allows us complete freedom to select whatever axiom system we wish for geometry. We argue that such an interpretation saddles Carnap with a theory of meaning that has unhappy consequences, a theory we believe he did not hold. We suggest that the principle of linguistic tolerance in fact has a more limited scope; but within that scope the analogy between tolerance and geometric conventionalism is quite tight.
In this paper, I discuss the influential view that depiction, like language, depends on arbitrary conventions. I argue that this view, however it is elaborated, is false. Any adequate account of depiction must be consistent with the distinctive features of depiction. One such feature is depictive generativity. I argue that, to be consistent with depictive generativity, conventionalism must hold that depiction depends on conventions for the depiction of basic properties of a picture’s object. I then argue that two considerations jointly preclude depiction from being governed by such conventions. Firstly, conventions must be salient to those who employ them. Secondly, those parts of pictures that depict basic properties of objects are not salient to the makers and interpreters of pictures.
This article focuses on the conventions that sustain social interaction and argues that they are central to Simon's decision-making theory. Simon clearly identifies two kinds of coordination by convention: behavioral mores that shape human actions, and shared mental models that govern human perceptions. This article argues that Poincaré–Carnap's conventionalism provides powerful support for Simon's theory; it contends that this theory offers a more convincing account of decision and coordination than Lewis' concept of convention. Simon's approach to applying conventionalist logic to social interaction emphasizes the normative role played by mental models in solving coordination problems and considers rationality in terms of both cognitive and moral considerations. By connecting conventional phenomena to social identifications, Simon stresses the resulting complexity of coordination problems.
Conventionalism as a distinct approach to the social contract has received significant attention in the game-theoretic literature on social contract theory. Peter Vanderschraaf’s sophisticated and innovative theory of conventional justice represents the most recent contribution to this tradition and, in many ways, can be viewed as its culmination. In this article, I focus primarily on Vanderschraaf’s defense of the egalitarian bargaining solution as a principle of justice. I argue that one particular formal feature of this bargaining solution, the baseline consistency requirement, may stand in tension with other features of conventionalism as an approach to the social contract and limit the scope of Vanderschraaf’s theory to societies in which an egalitarian sense of justice in fact evolves. It limits the scope of Vanderschraaf’s theory in the face of moral diversity. A similar limitation applies to Vanderschraaf’s theory of democratic political authority. Despite these minor limitations, Vanderschraaf’s theory must be seen as a major success and a significant contribution to social contract theory.