  1. Reconciling Embodied and Distributional Accounts of Meaning in Language. Mark Andrews, Stefan Frank & Gabriella Vigliocco - 2014 - Topics in Cognitive Science 6 (3):359-370.
    Over the past 15 years, there have been two increasingly popular approaches to the study of meaning in cognitive science. One, based on theories of embodied cognition, treats meaning as a simulation of perceptual and motor states. An alternative approach treats meaning as a consequence of the statistical distribution of words across spoken and written language. On the surface, these appear to be opposing scientific paradigms. In this review, we aim to show how recent cross-disciplinary developments have done much to (...)
  2. Uncertainty Reduction as a Measure of Cognitive Load in Sentence Comprehension. Stefan L. Frank - 2013 - Topics in Cognitive Science 5 (3):475-494.
    The entropy-reduction hypothesis claims that the cognitive processing difficulty on a word in sentence context is determined by the word's effect on the uncertainty about the sentence. Here, this hypothesis is tested more thoroughly than has been done before, using a recurrent neural network for estimating entropy and self-paced reading for obtaining measures of cognitive processing load. Results show a positive relation between reading time on a word and the reduction in entropy due to processing that word, supporting the entropy-reduction (...)
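    A worked toy illustration of the entropy-reduction measure described in the entry above. The probability distributions here are invented for the example; in the paper they are estimated with a recurrent neural network, which this sketch does not reproduce.

      import math

      def entropy(probs):
          """Shannon entropy (in bits) of a probability distribution."""
          return -sum(p * math.log2(p) for p in probs if p > 0)

      # Hypothetical probabilities of the possible sentence continuations,
      # before and after the next word is read (toy numbers, not model output).
      before = [0.5, 0.25, 0.125, 0.125]   # four continuations still plausible
      after = [0.8, 0.2]                   # the word ruled two of them out

      reduction = entropy(before) - entropy(after)
      print(f"H before = {entropy(before):.2f} bits, H after = {entropy(after):.2f} bits")
      print(f"Entropy reduction = {reduction:.2f} bits")  # larger reduction = higher predicted processing load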
  3. Cross‐Linguistic Differences in Processing Double‐Embedded Relative Clauses: Working‐Memory Constraints or Language Statistics? Stefan L. Frank, Thijs Trompenaars & Shravan Vasishth - 2016 - Cognitive Science 40 (3):554-578.
    An English double-embedded relative clause from which the middle verb is omitted can often be processed more easily than its grammatical counterpart, a phenomenon known as the grammaticality illusion. This effect has been found to be reversed in German, suggesting that the illusion is language specific rather than a consequence of universal working memory constraints. We present results from three self-paced reading experiments which show that Dutch native speakers also do not show the grammaticality illusion in Dutch, whereas both German (...)
  4. Connectionist semantic systematicity. Stefan L. Frank, Willem F. G. Haselager & Iris van Rooij - 2009 - Cognition 110 (3):358-379.
  5. Surprisal-based comparison between a symbolic and a connectionist model of sentence processing. Stefan L. Frank - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society. pp. 1139-1144.
  6. Modeling knowledge‐based inferences in story comprehension. Stefan L. Frank, Mathieu Koppen, Leo G. M. Noordman & Wietske Vonk - 2003 - Cognitive Science 27 (6):875-910.
    A computational model of inference during story comprehension is presented, in which story situations are represented distributively as points in a high‐dimensional “situation‐state space.” This state space organizes itself on the basis of a constructed microworld description. From the same description, causal/temporal world knowledge is extracted. The distributed representation of story situations is more flexible than Golden and Rumelhart's [Discourse Proc 16 (1993) 203] localist representation. A story taking place in the microworld corresponds to a trajectory through situation‐state space. During the (...)
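    A minimal sketch of the representational idea in the entry above: a story situation as a point in a high-dimensional space defined over microworld propositions, and a story as a trajectory through that space. The propositions and vectors below are hypothetical; in the paper the state space organizes itself from a microworld description rather than being hand-coded like this.

      import numpy as np

      # Hypothetical microworld propositions (illustrative only).
      PROPOSITIONS = ["sun_shines", "bob_outside", "bob_plays_soccer", "bob_tired"]

      def situation(true_props):
          """Encode a story situation as a point in situation-state space:
          one (here binary) value per microworld proposition."""
          return np.array([1.0 if p in true_props else 0.0 for p in PROPOSITIONS])

      # A story corresponds to a trajectory: a sequence of points in the space.
      story = [
          situation({"sun_shines"}),
          situation({"sun_shines", "bob_outside"}),
          situation({"sun_shines", "bob_outside", "bob_plays_soccer"}),
          situation({"bob_outside", "bob_tired"}),
      ]

      # Distance between consecutive points gives a crude measure of how much
      # the described situation changes from one story statement to the next.
      for t, (a, b) in enumerate(zip(story, story[1:]), start=1):
          print(f"step {t} -> {t + 1}: change {np.linalg.norm(a - b):.2f}")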
  7. Modeling the Structure and Dynamics of Semantic Processing. Armand S. Rotaru, Gabriella Vigliocco & Stefan L. Frank - 2018 - Cognitive Science 42 (8):2890-2917.
    The contents and structure of semantic memory have been the focus of much recent research, with major advances in the development of distributional models, which use word co‐occurrence information as a window into the semantics of language. In parallel, connectionist modeling has extended our knowledge of the processes engaged in semantic activation. However, these two lines of investigation have rarely been brought together. Here, we describe a processing model based on distributional semantics in which activation spreads throughout a semantic network, (...)
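    A minimal sketch of the general mechanism named in the entry above: activation spreading through a semantic network whose connection weights reflect distributional (co-occurrence-based) similarity. The words, weights, and decay parameter are invented for illustration and are not the model reported in the paper.

      import numpy as np

      # Hypothetical similarity-weighted semantic network (symmetric, toy values);
      # in a distributional model these weights would be derived from word co-occurrence.
      WORDS = ["dog", "cat", "bone", "piano"]
      W = np.array([
          [0.0, 0.6, 0.5, 0.0],   # dog
          [0.6, 0.0, 0.1, 0.0],   # cat
          [0.5, 0.1, 0.0, 0.0],   # bone
          [0.0, 0.0, 0.0, 0.0],   # piano (unrelated)
      ])

      def spread(seed, steps=3, decay=0.5):
          """Spread activation from a seed word through the network for a few steps."""
          a = np.zeros(len(WORDS))
          a[WORDS.index(seed)] = 1.0
          for _ in range(steps):
              a = decay * a + (1 - decay) * (W @ a)  # retain some activation, pass the rest along edges
              a /= a.max()                           # rescale so values stay readable
          return dict(zip(WORDS, a.round(3)))

      print(spread("dog"))  # related words ("cat", "bone") end up more active than "piano"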
  8. Reservoir computing and the Sooner-is-Better bottleneck. Stefan L. Frank & Hartmut Fitz - 2016 - Behavioral and Brain Sciences 39.
  9. Modeling the Influence of Language Input Statistics on Children's Speech Production. Ingeborg Roete, Stefan L. Frank, Paula Fikkert & Marisa Casillas - 2020 - Cognitive Science 44 (12):e12924.
    We trained a computational model (the Chunk-Based Learner; CBL) on a longitudinal corpus of child–caregiver interactions in English to test whether one proposed statistical learning mechanism—backward transitional probability—is able to predict children's speech productions with stable accuracy throughout the first few years of development. We predicted that the model less accurately reconstructs children's speech productions as they grow older because children gradually begin to generate speech using abstracted forms rather than specific “chunks” from their speech environment. To test this idea, (...)
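    A minimal sketch of the statistical learning mechanism named in the entry above, backward transitional probability, estimated from a toy corpus. The corpus is invented; the full Chunk-Based Learner additionally builds and reuses multiword chunks, which this sketch does not attempt to reproduce.

      from collections import Counter

      def backward_tp(corpus):
          """Estimate backward transitional probabilities P(previous word | current word)
          from a corpus given as a list of tokenized utterances."""
          unigrams, bigrams = Counter(), Counter()
          for utterance in corpus:
              unigrams.update(utterance)
              bigrams.update(zip(utterance, utterance[1:]))
          return {(prev, cur): count / unigrams[cur]
                  for (prev, cur), count in bigrams.items()}

      # Toy child-directed-speech-like corpus (illustrative only).
      corpus = [
          ["look", "at", "the", "dog"],
          ["the", "dog", "is", "big"],
          ["look", "at", "the", "ball"],
      ]

      btp = backward_tp(corpus)
      print(btp[("the", "dog")])  # P(prev="the" | cur="dog") = 2/2 = 1.0
      print(btp[("at", "the")])   # P(prev="at" | cur="the") = 2/3 ≈ 0.67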