W. Labov and T. Labov's findings concerning their child's acquisition of grammar ("Learning the Syntax of Questions," in Recent Advances in the Psychology of Language, R. Campbell & P. Smith, eds., New York: Plenum Press, 1978) are interpreted in terms of the differing semantics of why and other wh-questions. Z. Dubiel.
This is an introduction to the structure of sentences in human languages. It assumes no prior knowledge of linguistic theory and little knowledge of elementary grammar. It will suit students coming to syntactic theory for the first time, whether as undergraduates or graduates. It will also be useful for those in fields such as computer science, artificial intelligence, or cognitive psychology who need a sound knowledge of current syntactic theory.
Much of the best contemporary work in the philosophy of language and content makes appeal to the theories developed in generative syntax. In particular, there is a presumption that—at some level and in some way—the structures provided by syntactic theory mesh with or support our conception of content/linguistic meaning as grounded in our first-person understanding of our communicative speech acts. This paper will suggest that there is no such tight fit. Its claim will be that, if recent generative theories are on the right lines, syntactic structure provides both too much and too little to serve as the structural partner for content, at least as that notion is generally understood in philosophy. The paper will substantiate these claims by an assessment of the recent work of King, Stanley, and others.
Much has been said about the logical difference between rules and principles, yet few authors have focused on the distinct logical connectives linking the normative conditions of both norms. I intend to demonstrate that principles, unlike rules, are norms whose antecedents are linguistically formulated in a generic fashion, and thus logically described as inclusive disjunctions. This core feature incorporates the relevance criteria of normative antecedents into the world of principles and also explains their aptitude to conflict with opposing norms, namely that their consequents are fulfilled to varying extents more frequently than those of rules. I conclude that the property of genericity should be predicated of the norm antecedent of principles, more precisely of the hypothetical action. This is of paramount importance to explain, in terms of logical implication and exclusion, the expansibility of competing principles, in contrast with the exclusive character of conflicting rules.
Building on the success of the bestselling first edition, the second edition of this textbook provides a comprehensive and accessible introduction to the major issues in Principles and Parameters syntactic theory, including phrase structure, the lexicon, case theory, movement, and locality conditions. Includes new and extended problem sets in every chapter, all of which have been annotated for level and skill type. Features three new chapters on advanced topics, including vP shells, object shift, control, gapping and ellipsis, and an additional chapter on advanced topics in binding. Offers a brief survey of both Lexical-Functional Grammar and Head-Driven Phrase Structure Grammar. Succeeds in strengthening the reader's foundational knowledge, and prepares them for more advanced study. Supported by an instructor's manual and online resources for students and instructors, available at www.blackwellpublishing.com/carnie.
This groundbreaking book offers a new and compelling perspective on the structure of human language. The fundamental issue it addresses is the proper balance between syntax and semantics, between structure and derivation, and between rule systems and lexicon. It argues that the balance struck by mainstream generative grammar is wrong. It puts forward a new basis for syntactic theory, drawing on a wide range of frameworks, and charts new directions for research. In the past four decades, theories of syntactic structure have become more abstract, and syntactic derivations have become ever more complex. Peter Culicover and Ray Jackendoff trace this development through the history of contemporary syntactic theory, showing how much it has been driven by theory-internal rather than empirical considerations. They develop an alternative that is responsive to linguistic, cognitive, computational, and biological concerns. At the core of this alternative is the Simpler Syntax Hypothesis: the most explanatory syntactic theory is one that imputes the minimum structure necessary to mediate between phonology and meaning. A consequence of this hypothesis is a far richer mapping between syntax and semantics than is generally assumed. Through concrete analyses of numerous grammatical phenomena, some well studied and some new, the authors demonstrate the empirical and conceptual superiority of the Simpler Syntax approach. Simpler Syntax is addressed to linguists of all persuasions. It will also be of central interest to those concerned with language in psychology, human biology, evolution, computer science, and artificial intelligence.
Turner argues that computer programs must have purposes, that implementation is not a kind of semantics, and that computers might need to understand what they do. I respectfully disagree: Computer programs need not have purposes, implementation is a kind of semantic interpretation, and neither human computers nor computing machines need to understand what they do.
It is commonly argued that the rules of language, as distinct from its semantic features, are the characteristics which most clearly distinguish language from the communication systems of other species. A number of linguists (e.g., Chomsky 1972, 1980; Pinker 1994) have suggested that the universal features of grammar (UG) are unique human adaptations showing no evolutionary continuities with any other species. However, recent summaries of the substantive features of UG are quite remarkable in the very general nature of the features proposed. While the syntax of any given language can be quite complex, the specific rules vary so much between languages that the truly universal (i.e. innate) aspects of grammar are not complex at all. In fact, these features most closely resemble a set of general descriptions of our richly complex semantic cognition, and not a list of specific rules. General principles of the evolutionary process suggest that syntax is more properly understood as an emergent characteristic of the explosion of semantic complexity that occurred during hominid evolution. It is argued that grammatical rules used in given languages are likely to be simply conventionalized, invented features of language, and not the result of an innate, grammar-specific module. The grammatical and syntactic regularities that are found across languages occur simply because all languages attempt to communicate the same sorts of semantic information.
One of the most important discoveries of the last thirty years is the extent to which the pattern of anaphoric interpretations is determined by the geometry of syntactic structure. As our understanding of these phenomena has steadily grown, the theory of syntax has often been driven by discoveries in this domain, and it is no accident that Chomsky's Binding Theory was a centerpiece of the principles and parameters approach of the 1980s. However, what remained accidental in Chomsky's theory, and in most of the theories that have followed it, is the apparently complementary distribution of forms that support anaphora for a given antecedent. This book argues not only that the complementary distribution in question is robust empirically, but that its existence is derived by a competitive theory of anaphora. It is demonstrated in detail that the competitive theory provides a far better explanation of anti-locality, anti-subject orientation, and the range of apparently exceptional distributions that have long been problematic for other approaches, such as Chomsky's Binding Theory and the influential predication-based theory of Reinhart and Reuland.
What is the source of logical and mathematical truth? This book revitalizes conventionalism as an answer to this question. Conventionalism takes logical and mathematical truth to have their source in linguistic conventions. This was an extremely popular view in the early 20th century, but it was never worked out in detail and is now almost universally rejected in mainstream philosophical circles. Shadows of Syntax is the first book-length treatment and defense of a combined conventionalist theory of logic and mathematics. It argues that our conventions, in the form of syntactic rules of language use, are perfectly suited to explain the truth, necessity, and apriority of logical and mathematical claims, as well as our logical and mathematical knowledge.
Proponents of the language of thought (LOT) thesis are realists when it comes to syntactically structured representations, and must defend their view against instrumentalists, who would claim that syntactic structures may be useful in describing cognition, but have no more causal powers in governing cognition than do the equations of physics in guiding the planets. This paper explores what it will take to provide an argument for LOT that can defend its conclusion from instrumentalism. I illustrate a difficulty in this project by discussing arguments for LOT put forward by Horgan and Tienson. When their evidence is viewed in the light of results in connectionist research, it is hard to see how a realist conception of syntax can be formulated and defended.
Three studies provided evidence that syntax influences intentionality judgments. In Experiment 1, participants made either speeded or unspeeded intentionality judgments about ambiguously intentional subjects or objects. Participants were more likely to judge grammatical subjects as acting intentionally in the speeded relative to the reflective condition (thus showing an intentionality bias), but grammatical objects revealed the opposite pattern of results (thus showing an unintentionality bias). In Experiment 2, participants made an intentionality judgment about one of the two actors in a partially symmetric sentence (e.g., “John exchanged products with Susan”). The results revealed a tendency to treat the grammatical subject as acting more intentionally than the grammatical object. In Experiment 3, participants were encouraged to think about the events that such sentences typically refer to, and the tendency was significantly reduced. These results suggest a privileged relationship between language and central theory-of-mind concepts. More specifically, there may be two ways of determining intentionality judgments: (1) an automatic verbal bias to treat grammatical subjects (but not objects) as intentional, and (2) a deeper, more careful consideration of the events typically described by a sentence.
Language—often said to set human beings apart from other animals—has resisted explanation in terms of evolution. Language has, among others, two fundamental and distinctive features: syntax and the ability to express non-present actions and events. We suggest that the relation between this representation (of non-present action) and syntax can be analyzed as a relation between a function and a structure to fulfill this function. The strategy of the paper is to ask if there is any evidence of pre-linguistic communication that fulfills the function of communicating an absent action. We identify a structural similarity between understanding indexes of past actions of conspecifics (who did what to whom) and one of the simplest and most paradigmatic linguistic syntactic patterns – that of the simple transitive sentence. When a human being infers past events from an index (i.e., a trace, the condition of a conspecific or an animal, a constellation or an object), the interpreter's comprehension must rely on concepts similar in structure and function to the 'thematic roles' believed to underpin the comprehension of linguistic syntax: in his or her mind the idea of a past action or event emerges along with thematic role-like concepts. In the case of the presentation of, e.g., a hunting trophy, the presenter could be understood to be an agent (subject) and the trophy a patient (direct object), while the past action of killing is implied by the condition of the object and its possession by the presenter. We discuss whether both the presentation of a trophy and linguistic syntax might have emerged independently while having the same function (to represent a past action) or whether the presentation of an index of a deed could constitute a precursor of language. Both possibilities shed new light on early, and maybe first, language use.
P.M.S. Hacker has argued that there are numerous misconceptions in James Conant's account of Wittgenstein's views and of those of Carnap. I discuss only Hacker's treatment of Conant on logical syntax in the Tractatus. I try to show that passages in the Tractatus which Hacker takes to count strongly against Conant's view do no such thing, and that he himself has not explained how he can account for a significant passage which certainly appears to support Conant's reading.
Recently several philosophers of science have proposed what has come to be known as the semantic account of scientific theories. It is presented as an improvement on the positivist account, which is now called the syntactic account of scientific theories. Bas van Fraassen claims that the syntactic account does not give a satisfactory definition of "empirical adequacy" and "empirical equivalence". He contends that his own semantic account does define these notions acceptably, through the concept of "embeddability", a concept which he claims cannot be defined syntactically. Here, I define a syntactic relation which corresponds to the semantic relation of "embeddability". I suggest that the critical differences between the positivist account and van Fraassen's account have nothing to do with the distinction between semantics and syntax.
The discrepancy between syntax and semantics is a long-standing issue that hinders a better comprehension of the underlying neuronal processes in the human brain. In order to tackle the issue, we first describe a striking correlation between Wittgenstein's Tractatus, which assesses the syntactic relationships between language and world, and Perlovsky's joint language-cognitive computational model, which assesses the semantic relationships between emotions and the “knowledge instinct”. Once a correlation is established between a purely logical approach to language and computable psychological activities, we aim to find the neural correlates of syntax and semantics in the human brain. Starting from topological arguments, we suggest that the semantic properties of a proposition are processed in higher functional dimensions of the brain than the syntactic ones. In a fully reversible process, the syntactic elements embedded in Broca's area project into multiple scattered semantic cortical zones. The presence of higher functional dimensions gives rise to the increase in informational content that takes place in semantic expressions. Therefore, diverse features of human language and the cognitive world can be assessed in terms of both the logical armor described by the Tractatus and the neurocomputational techniques at hand. One of our motivations is to build a neuro-computational framework able to provide a feasible explanation of the brain's semantic processing, in preparation for novel computers with nodes built into higher dimensions.
I critically examine some provocative arguments that John Searle presents in his book The Rediscovery of Mind to support the claim that the syntactic states of a classical computational system are "observer relative" or "mind dependent" or otherwise less than fully and objectively real. I begin by explaining how this claim differs from Searle's earlier and better-known claim that the physical states of a machine, including the syntactic states, are insufficient to determine its semantics. In contrast, his more recent claim concerns the syntax, in particular, whether a machine actually has symbols to underlie its semantics. I then present and respond to a number of arguments that Searle offers to support this claim, including whether machine symbols are observer relative because the assignment of syntax is arbitrary, or linked to universal realizability, or linked to the sub-personal interpretive acts of a homunculus, or linked to a person's consciousness. I conclude that a realist about the computational model need not be troubled by such arguments. Their key premises need further support.
Dynamic Syntax is an action-based grammar formalism which models the process of natural language understanding as monotonic tree growth. This paper presents an introduction to the notions of incrementality, underspecification, and update, drawing on the assumptions made by DS. It lays out the tools of the theoretical framework that are necessary to understand the accounts developed in the other contributions to the Special Issue. It also represents an up-to-date account of the framework, combining developments that have previously remained distributed across a diverse body of literature.
This book presents a formal and philosophical analysis of language syntax. It draws on ideas of E. Husserl and G. Frege, on S. Leśniewski's theory of syntactic categories and K. Ajdukiewicz's conception of formal grammar, as well as on C. S. Peirce's distinction between tokens (concrete linguistic entities) and types (ideal linguistic entities) and A. A. Markov's theory of algorithms. The central aim of the book is, in the spirit of these ideas, to provide strict yet comprehensive lectures on two axiomatic theories of languages (grammars), irrespective of the specific structure of their expressions and the notation used in them. The main features of these theories are that their definitions of well-formed expression allow the formulation of algorithms for checking the syntactic correctness of expressions, and that their formalizations are bi-level, reflecting two opposed philosophical orientations: the nominalistic and the idealistic. The theoretical considerations in the book speak in favour of the former. The book contains a translation of the basic contents of the Polish book "Teorie języków syntaktycznie kategorialnych" ("Theories of Syntactically Categorial Languages"), PWN, Warszawa–Wrocław 1985, together with an extensive Introduction and Final Remarks. The Introduction discusses the main assumptions, objectives, and conditionings of the presented theories, as well as their intuitive foundations. The Final Remarks address the subject-matter of the book and the possibility of building theories of syntax in the opposite spirit, based on a Platonic approach to language syntax.
The ‘syntax’ and ‘combinatorics’ of my title are what Curry (1961) referred to as phenogrammatics and tectogrammatics respectively. Tectogrammatics is concerned with the abstract combinatorial structure of the grammar and directly informs semantics, while phenogrammatics deals with concrete operations on syntactic data structures such as trees or strings. In a series of previous papers (Muskens, 2001a; Muskens, 2001b; Muskens, 2003) I have argued for an architecture of the grammar in which finite sequences of lambda terms are the basic data structures, pairs of terms (syntax and semantics, for example). These sequences then combine with the help of simple generalizations of the usual abstraction and application operations. This theory, which I call Lambda Grammars and which is closely related to the independently formulated theory of Abstract Categorial Grammars (de Groote, 2001; de Groote, 2002), in fact is an implementation of Curry’s ideas: the level of tectogrammar is encoded by the sequences of lambda-terms and their ways of combination, while the syntactic terms in those sequences constitute the phenogrammatical level. In de Groote’s formulation of the theory, tectogrammar is the level of abstract terms, while phenogrammar is the level of object terms.
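The architecture sketched in this abstract — signs as finite sequences of terms, combined by pointwise generalizations of application — can be illustrated with a deliberately simplified toy. The sketch below is a hypothetical Python illustration, not Muskens' actual lambda-term formalism: the lexical entries, the string-building functions, and the semantic-representation strings are all invented for the example.

```python
# Toy illustration of the Lambda Grammars idea: a "sign" is a sequence
# (here, a pair) of terms -- a phenogrammatical (string-side) component and
# a tectogrammatical/semantic component -- and signs combine by applying
# each component of a functor sign to the matching component of its argument.
# All lexical items below are hypothetical examples, not from the paper.

# Lexical signs: (phenogrammatical term, semantic term)
john = ("John", "john'")
sleeps = (lambda s: s + " sleeps",        # builds the surface string
          lambda x: f"sleep'({x})")       # builds the semantic representation

def apply_sign(fun, arg):
    """Generalized application: apply each component of `fun` to the
    corresponding component of `arg`, pointwise across the sequence."""
    return tuple(f(a) for f, a in zip(fun, arg))

sentence = apply_sign(sleeps, john)
print(sentence)  # → ('John sleeps', "sleep'(john')")
```

The point of the sketch is only the shape of the combination: one operation acts on the whole sequence, so the phenogrammatical and tectogrammatical levels are built in lockstep, as the abstract describes.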
The relation between linguistics and logic has been discussed in a recent paper by Bar-Hillel, where it is argued that a disregard for work in logical syntax and semantics has caused linguists to limit themselves too narrowly in their inquiries, and to fall into several errors. In particular, Bar-Hillel asserts, they have attempted to derive relations of synonymy and so-called 'rules of transformation', such as the active–passive relation, from distributional studies alone, and they have hesitated to rely on considerations of meaning in linguistic analysis. No one can quarrel with the suggestion that linguists interest themselves in meaning or transformation rules, but the relevance of logical syntax and semantics (at least as we now know them) to this study is very dubious. I think that a closer investigation of the assumptions and concerns of logical syntax and semantics will show that the hope of applying the results which have been achieved in these fields to the solution of linguistic problems is illusory.
A new view of the functional role of the left anterior cortex in language use is proposed. The experimental record indicates that most human linguistic abilities are not localized in this region. In particular, most of syntax (long thought to be there) is not located in Broca's area and its vicinity (operculum, insula, and subjacent white matter). This cerebral region, implicated in Broca's aphasia, does have a role in syntactic processing, but a highly specific one: It is the neural home to receptive mechanisms involved in the computation of the relation between transformationally moved phrasal constituents and their extraction sites (in line with the Trace-Deletion Hypothesis). It is also involved in the construction of higher parts of the syntactic tree in speech production. By contrast, basic combinatorial capacities necessary for language processing – for example, structure-building operations, lexical insertion – are not supported by the neural tissue of this cerebral region, nor is lexical or combinatorial semantics. The dense body of empirical evidence supporting this restrictive view comes mainly from several angles on lesion studies of syntax in agrammatic Broca's aphasia. Five empirical arguments are presented: experiments in sentence comprehension, cross-linguistic considerations (where aphasia findings from several language types are pooled and scrutinized comparatively), grammaticality and plausibility judgments, real-time processing of complex sentences, and rehabilitation. Also discussed are recent results from functional neuroimaging and from structured observations on speech production of Broca's aphasics. Syntactic abilities are nonetheless distinct from other cognitive skills and are represented entirely and exclusively in the left cerebral hemisphere. Although more widespread in the left hemisphere than previously thought, they are clearly distinct from other human combinatorial and intellectual abilities.
The neurological record (based on functional imaging, split-brain and right-hemisphere-damaged patients, as well as patients suffering from a breakdown of mathematical skills) indicates that language is a distinct, modularly organized neurological entity. Combinatorial aspects of the language faculty reside in the human left cerebral hemisphere, but only the transformational component (or algorithms that implement it in use) is located in and around Broca's area. Key Words: agrammatism; aphasia; Broca's area; cerebral localization; dyscalculia; functional neuroanatomy; grammatical transformation; modularity; neuroimaging; syntax; trace deletion. (shrink)
The only obligatory temporal expression in English is tense, yet Hans Reichenbach (1947) has argued convincingly that the simplest sentence is understood in terms of three temporal notions. Additional possibilities for a simple sentence are limited: English sentences have one time adverbial each. It is not immediately clear how to resolve these matters, that is, how (if at all) Reichenbach's account can be reconciled with the facts of English. This paper attempts to show that they can be reconciled, and presents an analysis of temporal specification that is based directly on Reichenbach's account. Part I is devoted to a study of the way the three times—speech time, reference time, event time—are realized and interpreted. The relevant syntactic structures and their interaction and interpretation are examined in detail. Part II discusses how a grammar should deal with time specification, and proposes a set of interpretive rules. The study offers an analysis of simple sentences, sentences with complements, and habitual sentences. It is shown that tense and adverbials function differently, depending on the structure in which they appear. The temporal system is relational: the orientation and values of temporal expressions are not fixed, but their relational values are consistent. This consistency allows the statement of principles of interpretation.
A traditional view maintains that thought, while expressed in language, is non-linguistic in nature and occurs in non-linguistic beings as well. I assess this view against current theories of the evolutionary design of human grammar. I argue that even if some forms of human thought are shared with non-human animals, a residue remains that characterizes a unique way in which human thought is organized as a system. I explore the hypothesis that the cause of this difference is a grammatical way of structuring semantic information, and I present evidence that the organization of grammar precisely reflects the organization of a specific mode of thought apparently distinctive of humans. Since there appears to be no known non-grammatical structuring principle for the relevant mode of thought, I suggest that grammar is that principle, with no independent 'Language of Thought' needed.
Linear Syntax makes a case for a critical reassessment of the widespread view that syntax can be reduced to tree structures. It argues that a crucial part of the description of German clausal syntax should instead be based on concepts that are defined in terms of linear order. By connecting the descriptive tools of modern phrase-structure grammar with traditional descriptive scholarship, Andreas Kathol offers a new perspective on many long-standing problems in syntactic theory.
The Marxian thesis about the role of violence in history, as enunciated in Capital, is investigated through an analysis of the Hegelian character of its syntax and of the way Engels develops it. A non-teleological interpretation of the thesis is then defended, one on which violence presents a plurality of forms, a pervasive character, and a heavy materiality.