W. Labov's and T. Labov's findings on their child's acquisition of grammar ("Learning the Syntax of Questions," in Recent Advances in the Psychology of Language, Campbell, R. & Smith, P., Eds., New York: Plenum Press, 1978) are interpreted in terms of the different semantics of why and other wh-questions. Z. Dubiel.
This is an introduction to the structure of sentences in human languages. It assumes no prior knowledge of linguistic theory and little knowledge of elementary grammar. It will suit students coming to syntactic theory for the first time, whether as graduates or undergraduates. It will also be useful for those in fields such as computational science, artificial intelligence, or cognitive psychology who need a sound knowledge of current syntactic theory.
It is commonly argued that the rules of language, as distinct from its semantic features, are the characteristics which most clearly distinguish language from the communication systems of other species. A number of linguists (e.g., Chomsky 1972, 1980; Pinker 1994) have suggested that the universal features of grammar (UG) are unique human adaptations showing no evolutionary continuities with any other species. However, recent summaries of the substantive features of UG are quite remarkable in the very general nature of the features proposed. While the syntax of any given language can be quite complex, the specific rules vary so much between languages that the truly universal (i.e. innate) aspects of grammar are not complex at all. In fact, these features most closely resemble a set of general descriptions of our richly complex semantic cognition, and not a list of specific rules. General principles of the evolutionary process suggest that syntax is more properly understood as an emergent characteristic of the explosion of semantic complexity that occurred during hominid evolution. It is argued that grammatical rules used in given languages are likely to be simply conventionalized, invented features of language, and not the result of an innate, grammar-specific module. The grammatical and syntactic regularities that are found across languages occur simply because all languages attempt to communicate the same sorts of semantic information.
I critically examine some provocative arguments that John Searle presents in his book The Rediscovery of Mind to support the claim that the syntactic states of a classical computational system are "observer relative" or "mind dependent" or otherwise less than fully and objectively real. I begin by explaining how this claim differs from Searle's earlier and better-known claim that the physical states of a machine, including the syntactic states, are insufficient to determine its semantics. In contrast, his more recent claim concerns the syntax, in particular, whether a machine actually has symbols to underlie its semantics. I then present and respond to a number of arguments that Searle offers to support this claim, including whether machine symbols are observer relative because the assignment of syntax is arbitrary, or linked to universal realizability, or linked to the sub-personal interpretive acts of a homunculus, or linked to a person's consciousness. I conclude that a realist about the computational model need not be troubled by such arguments. Their key premises need further support.
This groundbreaking book offers a new and compelling perspective on the structure of human language. The fundamental issue it addresses is the proper balance between syntax and semantics, between structure and derivation, and between rule systems and lexicon. It argues that the balance struck by mainstream generative grammar is wrong. It puts forward a new basis for syntactic theory, drawing on a wide range of frameworks, and charts new directions for research. In the past four decades, theories of syntactic structure have become more abstract, and syntactic derivations have become ever more complex. Peter Culicover and Ray Jackendoff trace this development through the history of contemporary syntactic theory, showing how much it has been driven by theory-internal rather than empirical considerations. They develop an alternative that is responsive to linguistic, cognitive, computational, and biological concerns. At the core of this alternative is the Simpler Syntax Hypothesis: the most explanatory syntactic theory is one that imputes the minimum structure necessary to mediate between phonology and meaning. A consequence of this hypothesis is a far richer mapping between syntax and semantics than is generally assumed. Through concrete analyses of numerous grammatical phenomena, some well studied and some new, the authors demonstrate the empirical and conceptual superiority of the Simpler Syntax approach. Simpler Syntax is addressed to linguists of all persuasions. It will also be of central interest to those concerned with language in psychology, human biology, evolution, computational science, and artificial intelligence.
Much of the best contemporary work in the philosophy of language and content makes appeal to the theories developed in generative syntax. In particular, there is a presumption that—at some level and in some way—the structures provided by syntactic theory mesh with or support our conception of content/linguistic meaning as grounded in our first-person understanding of our communicative speech acts. This paper will suggest that there is no such tight fit. Its claim will be that, if recent generative theories are on the right lines, syntactic structure provides both too much and too little to serve as the structural partner for content, at least as that notion is generally understood in philosophy. The paper will substantiate these claims by an assessment of the recent work of King, Stanley, and others.
Building on the success of the bestselling first edition, the second edition of this textbook provides a comprehensive and accessible introduction to the major issues in Principles and Parameters syntactic theory, including phrase structure, the lexicon, case theory, movement, and locality conditions. Includes new and extended problem sets in every chapter, all of which have been annotated for level and skill type. Features three new chapters on advanced topics including vP shells, object shells, control, gapping and ellipsis, and an additional chapter on advanced topics in binding. Offers a brief survey of both Lexical-Functional Grammar and Head-Driven Phrase Structure Grammar. Succeeds in strengthening the reader's foundational knowledge, and prepares them for more advanced study. Supported by an instructor's manual and online resources for students and instructors, available at www.blackwellpublishing.com/carnie.
Turner argues that computer programs must have purposes, that implementation is not a kind of semantics, and that computers might need to understand what they do. I respectfully disagree: Computer programs need not have purposes, implementation is a kind of semantic interpretation, and neither human computers nor computing machines need to understand what they do.
Proponents of the language of thought (LOT) thesis are realists when it comes to syntactically structured representations, and must defend their view against instrumentalists, who would claim that syntactic structures may be useful in describing cognition, but have no more causal powers in governing cognition than do the equations of physics in guiding the planets. This paper explores what it will take to provide an argument for LOT that can defend its conclusion from instrumentalism. I illustrate a difficulty in this project by discussing arguments for LOT put forward by Horgan and Tienson. When their evidence is viewed in the light of results in connectionist research, it is hard to see how a realist conception of syntax can be formulated and defended.
One of the most important discoveries of the last thirty years is the extent to which the pattern of anaphoric interpretations is determined by the geometry of syntactic structure. As our understanding of these phenomena has steadily grown, the theory of syntax has often been driven by discoveries in this domain, and it is no accident that Chomsky's Binding Theory was a centerpiece of the principles and parameters approach of the 1980s. However, what remained accidental in Chomsky's theory, and in most of the theories that have followed it, is the apparently complementary distribution of forms that support anaphora for a given antecedent. This book argues not only that the complementary distribution in question is robust empirically, but that its existence is derived by a competitive theory of anaphora. It is demonstrated in detail that the competitive theory provides a far better explanation of anti-locality, anti-subject orientation and the range of apparently exceptional distributions that have long been problematic for other approaches, such as Chomsky's Binding Theory and the influential predication-based theory of Reinhart and Reuland.
The study of syntax is fundamental to linguistics and language study, but it is often taught solely within the framework of transformational grammar. This book is unique in several respects: it introduces the basic concepts used in the description of syntax, independently of any single model of grammar. Most grammatical models fail to deal adequately with one aspect of syntax or another, and the authors argue that an understanding of the concepts used in any full description of language is crucial for assessing the strengths and weaknesses of formal grammars. Formal approaches to some of these concepts are critically examined. This book will train students, of either linguistics or language, to understand and make the best use of any grammar they encounter. Secondly, the book deals with the whole of syntax, from immediate constituents to relations between sentences. It also examines concepts like subject and object, agent and patient, topic, comment and theme. Thirdly, there is a section on morphology, and a discussion of the relationship between syntax and morphology. As a book which explains, in a lucid and approachable way, why linguists have adopted certain solutions to problems and not others, this will be an invaluable introductory text. It is profusely illustrated with diagrams, and there are sets of exercises for every chapter which can be used in class, or by students working independently. This second edition has been extensively revised to take account of recent developments in syntactic studies.
Recently several philosophers of science have proposed what has come to be known as the semantic account of scientific theories. It is presented as an improvement on the positivist account, which is now called the syntactic account of scientific theories. Bas van Fraassen claims that the syntactic account does not give a satisfactory definition of "empirical adequacy" and "empirical equivalence". He contends that his own semantic account does define these notions acceptably, through the concept of "embeddability", a concept which he claims cannot be defined syntactically. Here, I define a syntactic relation which corresponds to the semantic relation of "embeddability". I suggest that the critical differences between the positivist account and van Fraassen's account have nothing to do with the distinction between semantics and syntax.
The discrepancy between syntax and semantics is a painstaking issue that hinders a better comprehension of the underlying neuronal processes in the human brain. In order to tackle the issue, we first describe a striking correlation between Wittgenstein's Tractatus, which assesses the syntactic relationships between language and world, and Perlovsky's joint language-cognitive computational model, which assesses the semantic relationships between emotions and the "knowledge instinct". Once a correlation is established between a purely logical approach to language and computable psychological activities, we aim to find the neural correlates of syntax and semantics in the human brain. Starting from topological arguments, we suggest that the semantic properties of a proposition are processed in higher functional dimensions of the brain than the syntactic ones. In a fully reversible process, the syntactic elements embedded in Broca's area project into multiple scattered semantic cortical zones. The presence of higher functional dimensions gives rise to the increase in informational content that takes place in semantic expressions. Therefore, diverse features of human language and the cognitive world can be assessed in terms of both the logic armor described by the Tractatus and the neurocomputational techniques at hand. One of our motivations is to build a neuro-computational framework able to provide a feasible explanation for the brain's semantic processing, in preparation for novel computers with nodes built into higher dimensions.
It is commonly assumed that images, whether in the world or in the head, do not have a privileged analysis into constituent parts. They are thought to lack the sort of syntactic structure necessary for representing complex contents and entering into sophisticated patterns of inference. I reject this assumption. "Image grammars" are models in computer vision that articulate systematic principles governing the form and content of images. These models are empirically credible and can be construed as literal grammars for images. Images can have rich syntactic structure, though of a markedly different form than sentences in language.
Like '&', '=' is not a term; it represents no extrasentential property. It marks an atomic, nonpredicative, declarative structure: sentences true solely by codesignation. Identity (its necessity and total reflexivity, its substitution rule, its metaphysical vacuity) is the objectual face of codesignation. The syntax demands pure reference, without predicative import for the asserted fact. 'Twain is Clemens' is about Twain, but nothing is predicated of him. Its informational value is in its 'metailed' semantic content: the fact of codesignation (that 'Twain' names Clemens) that explains what fact it asserts and why it is necessary. Critiques of the concepts of rigidity and of the elimination of singular terms result.
Dynamic Syntax (DS) is an action-based grammar formalism which models the process of natural language understanding as monotonic tree growth. This paper presents an introduction to the notions of incrementality, underspecification, and update, drawing on the assumptions made by DS. It lays out the tools of the theoretical framework that are necessary to understand the accounts developed in the other contributions to the Special Issue. It also represents an up-to-date account of the framework, combining developments that have previously remained distributed across a diverse body of literature.
The 'syntax' and 'combinatorics' of my title are what Curry (1961) referred to as phenogrammatics and tectogrammatics respectively. Tectogrammatics is concerned with the abstract combinatorial structure of the grammar and directly informs semantics, while phenogrammatics deals with concrete operations on syntactic data structures such as trees or strings. In a series of previous papers (Muskens, 2001a; Muskens, 2001b; Muskens, 2003) I have argued for an architecture of the grammar in which finite sequences of lambda terms are the basic data structures, pairs of terms (syntax, semantics) for example. These sequences then combine with the help of simple generalizations of the usual abstraction and application operations. This theory, which I call Lambda Grammars and which is closely related to the independently formulated theory of Abstract Categorial Grammars (de Groote, 2001; de Groote, 2002), is in fact an implementation of Curry's ideas: the level of tectogrammar is encoded by the sequences of lambda terms and their ways of combination, while the syntactic terms in those sequences constitute the phenogrammatical level. In de Groote's formulation of the theory, tectogrammar is the level of abstract terms, while phenogrammar is the level of object terms.
This book presents a new approach to studying the syntax of human language, one which emphasizes how we think about time. Tilsen argues that many current theories are unsatisfactory because those theories conceptualize syntactic patterns with spatially arranged structures of objects. These object-structures are atemporal and do not lend themselves well to reasoning about time. The book develops an alternative conceptual model in which oscillatory systems of various types interact with each other through coupling forces, and in which the relative energies of those systems are organized in particular ways. Tilsen emphasizes that the two primary mechanisms of the approach, oscillators and energy levels, require alternative ways of thinking about time. Furthermore, his theory leads to a new way of thinking about grammaticality and the recursive nature of language. The theory is applied to a variety of syntactic phenomena: word order, phrase structure, morphosyntax, constituency, case systems, ellipsis, anaphora, and islands. The book also presents a general program for the study of language in which the construction of linguistic theories is itself an object of theoretical analysis.
For almost a century, mathematicians and logicians have successfully striven to make logic a rigorous science. In a certain sense this goal has been achieved: in symbolic logic one has learned to operate rigorously with symbols and formulas similar to those of mathematics. But a book on logic must contain, besides the formulas, connecting text that speaks about the formulas in ordinary word language and makes their interrelations clear. This connecting text often leaves much to be desired in clarity and exactness. In recent years, logicians of various orientations have increasingly come to the insight that this connecting text is the essential part of logic, and that what matters is to develop an exact method for these sentences about sentences. This book aims to give a systematic presentation of such a method, that of "logical syntax" (further explanation in the Introduction, §§ 1, 2). In our "Vienna Circle", and in many similarly oriented groups (in Poland, France, England, the USA, and in isolated cases even in Germany), the view has emerged ever more clearly that traditional metaphysical philosophy can make no claim to scientific status. What is scientifically tenable in the work of the philosopher consists, insofar as it does not concern empirical questions to be assigned to the factual sciences, in logical analysis. Logical syntax aims to provide a conceptual framework, a language, with whose help the results of logical analysis can be formulated exactly.
Three studies provided evidence that syntax influences intentionality judgments. In Experiment 1, participants made either speeded or unspeeded intentionality judgments about ambiguously intentional subjects or objects. Participants were more likely to judge grammatical subjects as acting intentionally in the speeded relative to the reflective condition (thus showing an intentionality bias), but grammatical objects revealed the opposite pattern of results (thus showing an unintentionality bias). In Experiment 2, participants made an intentionality judgment about one of the two actors in a partially symmetric sentence (e.g., "John exchanged products with Susan"). The results revealed a tendency to treat the grammatical subject as acting more intentionally than the grammatical object. In Experiment 3, participants were encouraged to think about the events that such sentences typically refer to, and the tendency was significantly reduced. These results suggest a privileged relationship between language and central theory-of-mind concepts. More specifically, there may be two ways of determining intentionality judgments: (1) an automatic verbal bias to treat grammatical subjects (but not objects) as intentional, and (2) a deeper, more careful consideration of the events typically described by a sentence.
A central debate in the cognitive sciences surrounds the nature of adult speakers' linguistic representations: Are they purely syntactic (a traditional and widely held view; e.g., Branigan & Pickering), or are they semantically structured? A recent study (Ambridge, Bidgood, Pine, Rowland, & Freudenthal) found support for the latter view, showing that adults' acceptability judgments of passive sentences were significantly predicted by independent semantic "affectedness" ratings designed to capture the putative semantics of the construction (e.g., Bob was pushed by Wendy is rated as more acceptable than Bob was liked by Wendy, as Bob is more affected in the former). However, because English lacks a separate topicalization construction which provides an alternative means of highlighting the patient (e.g., BOB, Wendy kicked), these findings have a possible alternative explanation: that highly affected entities are more likely to be topicalized, rather than passivized per se. Here we show that, in fact, Ambridge et al.'s finding replicates in Indonesian, a language with a topicalization construction. The present study therefore provides particularly compelling evidence that grammatical representations have semantic structure.
Language—often said to set human beings apart from other animals—has resisted explanation in terms of evolution. Language has, among other things, two fundamental and distinctive features: syntax and the ability to express non-present actions and events. We suggest that the relation between this representation (of non-present action) and syntax can be analyzed as a relation between a function and a structure to fulfill this function. The strategy of the paper is to ask if there is any evidence of pre-linguistic communication that fulfills the function of communicating an absent action. We identify a structural similarity between understanding indexes of past actions of conspecifics (who did what to whom) and one of the simplest and most paradigmatic linguistic syntactic patterns, that of the simple transitive sentence. When a human being infers past events from an index (i.e., a trace, the condition of a conspecific or an animal, a constellation or an object), the interpreter's comprehension must rely on concepts similar in structure and function to the 'thematic roles' believed to underpin the comprehension of linguistic syntax: in his or her mind the idea of a past action or event emerges along with thematic role-like concepts. In the case of the presentation of, e.g., a hunting trophy, the presenter could be understood to be an agent (subject) and the trophy a patient (direct object), while the past action killed is implied by the condition of the object and its possession by the presenter. We discuss whether both the presentation of a trophy and linguistic syntax might have emerged independently while having the same function (to represent a past action) or whether the presentation of an index of a deed could constitute a precursor of language. Both possibilities shed new light on early, and maybe first, language use.
We address the nature of representations during development with regard to language acquisition. The controversy over syntax as a process or operation for representation formation versus syntax as a representation in itself is discussed. Eliminating the cognitive unconscious does not warrant a simplified, more parsimonious approach to human cognition in general.
Split constructions are widespread in natural languages. The separation of the semantic restriction of a quantifier from that quantifier is a typical example of such a construction. This study addresses the problem that such discontinuous strings exhibit, namely a number of locality constraints, including intervention effects. These are shown to follow from the interaction of a minimalist syntax with a semantics that directly assigns a model-theoretic interpretation to syntactic logical forms. The approach is shown to have wide empirical coverage and a conceptual simplicity. The book will be of interest to scholars and advanced students of syntax and semantics.
This collection covers the fundamental concepts and analytic tools of generative transformational syntax of the last half century, from Chomsky's Morphophonemics of Modern Hebrew (1951) to the present day. It makes available, in one place, key published material on important areas such as phrase structure, transformations, and conditions on rules and representations. Presenting articles by leading contributors to the field such as Baltin, Bošković, Bresnan, Chomsky, Cinque, Emonds, Freidin, Hale, Higginbotham, Huang, Kayne, Lasnik, McCawley, Pollock, Postal, Reinhart, Rizzi, Ross, Stowell, Torrego, Travis, Vergnaud, and Williams, this fascinating collection also includes a general introduction by the editors and an index, thus providing a comprehensive single reference resource for students and researchers alike.
Linear Syntax makes a case for a critical reassessment of the widespread view that syntax can be reduced to tree structures. It argues that a crucial part of the description of German clausal syntax should instead be based on concepts that are defined in terms of linear order. By connecting the descriptive tools of modern phrase-structure grammar with traditional descriptive scholarship, Andreas Kathol offers a new perspective on many long-standing problems in syntactic theory.
P.M.S. Hacker has argued that there are numerous misconceptions in James Conant's account of Wittgenstein's views and of those of Carnap. I discuss only Hacker's treatment of Conant on logical syntax in the Tractatus. I try to show that passages in the Tractatus which Hacker takes to count strongly against Conant's view do no such thing, and that he himself has not explained how he can account for a significant passage which certainly appears to support Conant's reading.
The Marxian thesis about the role of violence in history, as enunciated in Capital, is investigated through an analysis of the Hegelian character of its syntax and the way Engels develops it; a non-teleological interpretation of the thesis is then defended, one that understands violence as presenting a plurality of forms, a pervasive character, and a heavy materiality.
This book is a collection of key readings on Minimalist Syntax, the most recent, and arguably most important, theoretical development within the Principles and Parameters approach to syntactic theory. Brings together in one volume the key readings on Minimalist Syntax. Includes an introduction and overview of the Minimalist Program written by two prominent researchers. Excerpts crucial pieces from the beginning of Minimalism to the most recent work and provides invaluable coverage of the most important topics.
The relation between linguistics and logic has been discussed in a recent paper by Bar-Hillel, where it is argued that a disregard for work in logical syntax and semantics has caused linguists to limit themselves too narrowly in their inquiries, and to fall into several errors. In particular, Bar-Hillel asserts, they have attempted to derive relations of synonymy and so-called 'rules of transformation', such as the active-passive relation, from distributional studies alone, and they have hesitated to rely on considerations of meaning in linguistic analysis. No one can quarrel with the suggestion that linguists interest themselves in meaning or transformation rules, but the relevance of logical syntax and semantics (at least as we now know them) to this study is very dubious. I think that a closer investigation of the assumptions and concerns of logical syntax and semantics will show that the hope of applying the results which have been achieved in these fields to the solution of linguistic problems is illusory.
The only obligatory temporal expression in English is tense, yet Hans Reichenbach (1947) has argued convincingly that the simplest sentence is understood in terms of three temporal notions. Additional possibilities for a simple sentence are limited: English sentences have one time adverbial each. It is not immediately clear how to resolve these matters, that is, how (if at all) Reichenbach's account can be reconciled with the facts of English. This paper attempts to show that they can be reconciled, and presents an analysis of temporal specification that is based directly on Reichenbach's account. Part I is devoted to a study of the way the three times—speech time, reference time, event time—are realized and interpreted. The relevant syntactic structures and their interaction and interpretation are examined in detail. Part II discusses how a grammar should deal with time specification, and proposes a set of interpretive rules. The study offers an analysis of simple sentences, sentences with complements, and habitual sentences. It is shown that tense and adverbials function differently, depending on the structure in which they appear. The temporal system is relational: the orientation and values of temporal expressions are not fixed, but their relational values are consistent. This consistency allows the statement of principles of interpretation.
The aim of this paper is to provide context for and historical exegesis of Carnap's alleged move from syntax to semantics. The Orthodox Received View states that there was a radical break, while the Unorthodox Received View holds that Carnap's syntactical period already had many significant semantical elements. I will argue that both of them are partly right, both of them contain a kernel of truth: it is true that Carnap's semantical period started after his Logical Syntax of Language — in one sense of semantics. But it is also true that Carnap had already included semantical ideas in LSL: though not in the sense that URV maintains. This latter sense of semantics is related to what is usually called inferentialism, and by getting a clearer picture of Carnap's original aims, context, and concept-usage, we might be in a better position to approach his alleged inferentialism.
This book presents and exemplifies the theory of grammar called Semantic Syntax. The grammar, which offers a syntactic theory closely connected with semantic analyses, is a direct continuation of Generative Semantics; it will re-ignite interest in that framework, which flourished and promised so much in the 1960s and 1970s.
What is the source of logical and mathematical truth? This book revitalizes conventionalism as an answer to this question. Conventionalism takes logical and mathematical truth to have their source in linguistic conventions. This was an extremely popular view in the early 20th century, but it was never worked out in detail and is now almost universally rejected in mainstream philosophical circles. Shadows of Syntax is the first book-length treatment and defense of a combined conventionalist theory of logic and mathematics. It argues that our conventions, in the form of syntactic rules of language use, are perfectly suited to explain the truth, necessity, and apriority of logical and mathematical claims, as well as our logical and mathematical knowledge.
It is widely assumed that the meaning of at least some types of expressions involves more than their reference to objects, and hence that there may be co-referential expressions which differ in meaning. It is also widely assumed that 'syntax does not suffice for semantics', i.e. that we cannot account for the fact that expressions have semantic properties in purely syntactical or computational terms. The main goal of the paper is to argue against a third related assumption, namely that what is responsible for a difference in meaning between co-referential expressions is the computational difference in the cognitive functioning of the expressions. 'Intentional aspects' of expressions (those features which their meanings involve in addition to reference) cannot be syntacticized, since they are individuated not in terms of any cognitive feature, but rather in terms of those properties of the referents through which the expressions refer to them, and cognitive features cannot determine such properties in exactly the same sense as they cannot determine reference.