References

  • Geometric Representations for Minimalist Grammars. Peter Beim Graben & Sabrina Gerth - 2012 - Journal of Logic, Language and Information 21 (4):393-432.
    We reformulate minimalist grammars as partial functions on term algebras for strings and trees. Using filler/role bindings and tensor product representations, we construct homomorphisms for these data structures into geometric vector spaces. We prove that the structure-building functions as well as simple processors for minimalist languages can be realized by piecewise linear operators in representation space. We also propose harmony, i.e. the distance of an intermediate processing step from the final well-formed state in representation space, as a measure of processing (...) (A toy sketch of filler/role tensor binding appears after this list.)
  • Artificial Grammar Learning Capabilities in an Abstract Visual Task Match Requirements for Linguistic Syntax. Gesche Westphal-Fitch, Beatrice Giustolisi, Carlo Cecchetto, Jordan S. Martin & W. Tecumseh Fitch - 2018 - Frontiers in Psychology 9.
  • Evaluating models of robust word recognition with serial reproduction. Stephan C. Meylan, Sathvik Nair & Thomas L. Griffiths - 2021 - Cognition 210 (C):104553.
    Spoken communication occurs in a “noisy channel” characterized by high levels of environmental noise, variability within and between speakers, and lexical and syntactic ambiguity. Given these properties of the received linguistic input, robust spoken word recognition—and language processing more generally—relies heavily on listeners' prior knowledge to evaluate whether candidate interpretations of that input are more or less likely. Here we compare several broad-coverage probabilistic generative language models in their ability to capture human linguistic expectations. Serial reproduction, an experimental paradigm where (...) (A toy serial-reproduction chain is sketched after this list.)
  • Uncertainty About the Rest of the Sentence. John Hale - 2006 - Cognitive Science 30 (4):643-672.
    A word-by-word human sentence processing complexity metric is presented. This metric formalizes the intuition that comprehenders have more trouble on words contributing larger amounts of information about the syntactic structure of the sentence as a whole. The formalization is in terms of the conditional entropy of grammatical continuations, given the words that have been heard so far. To calculate the predictions of this metric, Wilson and Carroll's (1954) original entropy reduction idea is extended to infinite languages. This is demonstrated with (...) (A small worked example of the entropy-reduction metric follows this list.)
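
The beim Graben & Gerth entry builds on Smolensky-style filler/role bindings with tensor products. Below is a minimal sketch of that general technique, not the paper's actual construction: the fillers, roles, and dimensions are illustrative assumptions, and orthonormal role vectors are chosen so that exact unbinding by inner product works.

```python
import numpy as np

# Minimal sketch of filler/role binding via tensor products,
# in the spirit of beim Graben & Gerth (2012). All vectors and
# names here are illustrative, not the paper's encoding.

rng = np.random.default_rng(0)

# Fillers: distributed vectors for the symbols to be stored.
fillers = {s: rng.standard_normal(4) for s in ("the", "dog", "barks")}

# Roles: orthonormal basis vectors, one per structural position.
roles = np.eye(3)

# Bind each filler to its role with an outer (tensor) product and
# superimpose the bindings into one representation of the string.
structure = sum(np.outer(fillers[s], roles[i])
                for i, s in enumerate(("the", "dog", "barks")))

# Unbind: contracting with a role vector recovers that position's filler.
recovered = structure @ roles[1]
assert np.allclose(recovered, fillers["dog"])
print(recovered)
```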
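Serial reproduction, the paradigm in the Meylan, Nair & Griffiths entry, can be pictured as iterated noisy-channel inference: each generation hears a noisy rendition of the previous response and answers with its best guess under a prior. The sketch below is a toy model with an assumed lexicon, prior, and per-character noise, not the paper's experimental setup or language models.

```python
import numpy as np

# Toy serial-reproduction chain as iterated noisy-channel inference,
# in the spirit of Meylan, Nair & Griffiths (2021). Lexicon, prior,
# and noise model are assumptions for illustration only.

rng = np.random.default_rng(1)

lexicon = ["cat", "cap", "car", "bat"]
prior = np.array([0.5, 0.1, 0.2, 0.2])   # language-model probabilities
p_flip = 0.2                              # per-character noise rate
alphabet = sorted({c for w in lexicon for c in w})

def transmit(word):
    """Pass a word through the noisy channel (random letter flips)."""
    return "".join(c if rng.random() > p_flip else rng.choice(alphabet)
                   for c in word)

def likelihood(percept, word):
    """P(percept | word) under independent per-character flips."""
    per_char = [(1 - p_flip) + p_flip / len(alphabet) if p == c
                else p_flip / len(alphabet)
                for p, c in zip(percept, word)]
    return float(np.prod(per_char))

def reproduce(percept):
    """Listener picks the MAP word: posterior ∝ prior × likelihood."""
    post = prior * np.array([likelihood(percept, w) for w in lexicon])
    return lexicon[int(np.argmax(post))]

# Each generation hears a noisy rendition of the previous response;
# over iterations the chain drifts toward high-prior words.
word = "cap"
for generation in range(5):
    word = reproduce(transmit(word))
    print(generation, word)
```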
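Hale's metric quantifies difficulty at each word as the reduction in conditional entropy over grammatical continuations given the words heard so far. The worked example below uses an assumed three-sentence toy language; Hale's actual contribution is extending the entropy-reduction idea to infinite languages.

```python
from math import log2

# Worked example of Hale's (2006) entropy-reduction metric on a
# tiny finite language. Sentences and probabilities are toy
# assumptions for illustration.

language = {                      # sentence -> probability
    ("the", "dog", "barks"): 0.4,
    ("the", "dog", "sleeps"): 0.3,
    ("the", "cat", "sleeps"): 0.3,
}

def entropy(prefix):
    """Entropy over whole sentences consistent with the heard prefix."""
    consistent = {s: p for s, p in language.items()
                  if s[:len(prefix)] == prefix}
    total = sum(consistent.values())
    return -sum((p / total) * log2(p / total) for p in consistent.values())

sentence = ("the", "dog", "barks")
h_prev = entropy(())
for i, word in enumerate(sentence, start=1):
    h = entropy(sentence[:i])
    # Predicted processing difficulty: entropy reduction at this word.
    print(word, "ER =", max(0.0, h_prev - h))
    h_prev = h
```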