  • Natural Recursion Doesn’t Work That Way: Automata in Planning and Syntax. Cem Bozsahin - 2016 - In Vincent C. Müller (ed.), Fundamental Issues of Artificial Intelligence. Cham: Springer. pp. 95-112.
    Natural recursion in syntax is recursion by linguistic value, which is not syntactic in nature but semantic. Syntax-specific recursion is not recursion by name as the term is understood in theoretical computer science. Recursion by name is probably not natural because of its infinite typeability. Natural recursion, or recursion by value, is not species-specific. Human recursion is not syntax-specific. The values on which it operates are most likely domain-specific, including those for syntax. Syntax seems to require no more (and no (...)
  • Inessential features, ineliminable features, and modal logics for model theoretic syntax. Hans-Jörg Tiede - 2008 - Journal of Logic, Language and Information 17 (2):217-227.
    While monadic second-order logic (MSO) has played a prominent role in model theoretic syntax, modal logics have been used in this context since its inception. When comparing propositional dynamic logic (PDL) to MSO over trees, Kracht (1997) noted that there are tree languages that can be defined in MSO that can only be defined in PDL by adding new features whose distribution is predictable. He named such features “inessential features”. We show that Kracht’s observation can be extended to other modal (...)
  • The Construction of Meaning. Walter Kintsch & Praful Mangalath - 2011 - Topics in Cognitive Science 3 (2):346-370.
    We argue that word meanings are not stored in a mental lexicon but are generated in the context of working memory from long-term memory traces that record our experience with words. Current statistical models of semantics, such as latent semantic analysis and the Topic model, describe what is stored in long-term memory. The CI-2 model describes how this information is used to construct sentence meanings. This model is a dual-memory model, in that it distinguishes between a gist level and an (...)
  • Lexicalized Non-Local MCTAG with Dominance Links is NP-Complete. Lucas Champollion - 2011 - Journal of Logic, Language and Information 20 (3):343-359.
    An NP-hardness proof for non-local Multicomponent Tree Adjoining Grammar (MCTAG) by Rambow and Satta (1st International Workshop on Tree Adjoining Grammars, 1992), based on Dahlhaus and Warmuth (J Comput Syst Sci 33:456–472, 1986), is extended to some linguistically relevant restrictions of that formalism. It is found that there are NP-hard grammars among non-local MCTAGs even if any or all of the following restrictions are imposed: (i) lexicalization: every tree in the grammar contains a terminal; (ii) dominance links: (...)
  • Children’s Grammars Grow More Abstract with Age: Evidence from an Automatic Procedure for Identifying the Productive Units of Language. Gideon Borensztajn, Willem Zuidema & Rens Bod - 2009 - Topics in Cognitive Science 1 (1):175-188.
    We develop an approach to automatically identify the most probable multiword constructions used in children’s utterances, given syntactically annotated utterances from the Brown corpus of CHILDES. The constructions found cover many interesting linguistic phenomena from the language acquisition literature and show a progression from very concrete toward abstract constructions. We show quantitatively that, for all children of the Brown corpus, grammatical abstraction, defined as the relative number of variable slots in the productive units of their grammar, increases globally with age.
  • From Exemplar to Grammar: A Probabilistic Analogy‐Based Model of Language Learning. Rens Bod - 2009 - Cognitive Science 33 (5):752-793.
    While rules and exemplars are usually viewed as opposites, this paper argues that they form end points of the same distribution. By representing both rules and exemplars as (partial) trees, we can take into account the fluid middle ground between the two extremes. This insight is the starting point for a new theory of language learning that is based on the following idea: If a language learner does not know which phrase‐structure trees should be assigned to initial sentences, s/he allows (...)
  • Binding Theory in LTAG. Lucas Champollion - unknown
    This paper provides a unification-based implementation of Binding Theory (BT) for the English language in the framework of feature-based lexicalized tree-adjoining grammar (LTAG). The grammar presented here does not actually coindex any noun phrases; it merely outputs a set of constraints on co- and contraindexation that may later be processed by a separate anaphora resolution module. It improves on previous work by implementing the full BT rather than just Condition A. The main technical innovation consists in allowing lists to appear (...)
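
A note on the Bozsahin entry above: the contrast it draws between recursion by name (a definition that refers to itself by its own name, as in theoretical computer science) and recursion by value (recursion carried by the structure of the value being consumed) can be glossed with a small sketch. The Python fragment below is a hedged illustration of that general distinction only, not the paper’s formalism; the function names and the toy bracketing are invented for the example.

```python
def depth_by_name(tree):
    """Recursion by name: the definition invokes itself by its own name,
    i.e., general self-reference in the computer-science sense."""
    if not isinstance(tree, list):
        return 0
    return 1 + max((depth_by_name(child) for child in tree), default=0)

def depth_by_value(tree):
    """Recursion 'by value': the work is driven by the structure of the
    value being consumed; no definition refers to itself by name."""
    deepest, agenda = 0, [(tree, 0)]
    while agenda:
        node, depth = agenda.pop()
        if isinstance(node, list):
            agenda.extend((child, depth + 1) for child in node)
        else:
            deepest = max(deepest, depth)
    return deepest

# Both agree on a toy constituent-like bracketing:
sentence = [["the", "cat"], ["sat", ["on", "it"]]]
assert depth_by_name(sentence) == depth_by_value(sentence) == 3
```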