References
  • Modeling Brain Representations of Words' Concreteness in Context Using GPT‐2 and Human Ratings. Andrea Bruera, Yuan Tao, Andrew Anderson, Derya Çokal, Janosch Haber & Massimo Poesio - 2023 - Cognitive Science 47 (12):e13388.
    The meaning of most words in language depends on their context. Understanding how the human brain extracts contextualized meaning, and identifying where in the brain this takes place, remain important scientific challenges. But technological and computational advances in neuroscience and artificial intelligence now provide unprecedented opportunities to study the human brain in action as language is read and understood. Recent contextualized language models seem to be able to capture homonymic meaning variation (“bat”, in a baseball vs. a vampire context), as (...)
  • Event Knowledge in Large Language Models: The Gap Between the Impossible and the Unlikely.Carina Kauf, Anna A. Ivanova, Giulia Rambelli, Emmanuele Chersoni, Jingyuan Selena She, Zawad Chowdhury, Evelina Fedorenko & Alessandro Lenci - 2023 - Cognitive Science 47 (11):e13386.
    Word co‐occurrence patterns in language corpora contain a surprising amount of conceptual knowledge. Large language models (LLMs), trained to predict words in context, leverage these patterns to achieve impressive performance on diverse semantic tasks requiring world knowledge. An important but understudied question about LLMs’ semantic abilities is whether they acquire generalized knowledge of common events. Here, we test whether five pretrained LLMs (from 2018's BERT to 2023's MPT) assign a higher likelihood to plausible descriptions of agent−patient interactions than to minimally (...)
  • Behavioral Signatures of Memory Resources for Language: Looking beyond the Lexicon/Grammar Divide.Dagmar Divjak, Petar Milin, Srdan Medimorec & Maciej Borowski - 2022 - Cognitive Science 46 (11):e13206.
    Although there is a broad consensus that both the procedural and declarative memory systems play a crucial role in language learning, use, and knowledge, the mapping between linguistic types and memory structures remains underspecified: by default, a dual-route mapping of language systems to memory systems is assumed, with declarative memory handling idiosyncratic lexical knowledge and procedural memory handling rule-governed knowledge of grammar. We experimentally contrast the processing of morphology (case and aspect), syntax (subordination), and lexical semantics (collocations) in a healthy L1 (...)