Disambiguations: John T. Hale [4], John Hale [3], John K. Hale [1], John Me Hale [1]
  1.
    Uncertainty About the Rest of the Sentence. John Hale - 2006 - Cognitive Science 30 (4):643-672.
    A word-by-word human sentence processing complexity metric is presented. This metric formalizes the intuition that comprehenders have more trouble on words contributing larger amounts of information about the syntactic structure of the sentence as a whole. The formalization is in terms of the conditional entropy of grammatical continuations, given the words that have been heard so far. To calculate the predictions of this metric, Wilson and Carroll's (1954) original entropy reduction idea is extended to infinite languages. This is demonstrated with (...)
    32 citations
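    As a rough illustration of the entropy-reduction metric described in the abstract above (a hypothetical sketch, not code from the paper), word-by-word difficulty can be read off as the drop in entropy over possible grammatical continuations; the distributions below are invented for the example.

        # Toy illustration of the entropy-reduction idea: the "difficulty" of a
        # word is how much it reduces uncertainty (Shannon entropy) about the
        # rest of the sentence. Distributions here are hypothetical.
        import math

        def entropy(dist):
            """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
            return -sum(p * math.log2(p) for p in dist.values() if p > 0)

        # Invented distributions over grammatical continuations before and after
        # hearing the next word of some sentence prefix.
        before = {"continuation A": 0.6, "continuation B": 0.3, "continuation C": 0.1}
        after = {"continuation A": 0.75, "continuation B": 0.25}

        # Entropy reduction: max(0, H_before - H_after); a larger drop in
        # uncertainty predicts greater processing difficulty at this word.
        print(max(0.0, entropy(before) - entropy(after)))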
  2.
    What a Rational Parser Would Do. John T. Hale - 2011 - Cognitive Science 35 (3):399-443.
    This article examines cognitive process models of human sentence comprehension based on the idea of informed search. These models are rational in the sense that they strive to find a good syntactic analysis quickly. Informed search derives a new account of garden pathing that handles traditional counterexamples. It supports a symbolic explanation for local coherence as well as an algorithmic account of entropy reduction. The models are expressed in a broad framework for theories of human sentence comprehension.
    16 citations
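    The "informed search" idea in the article above can be pictured as a best-first search over parser states. The sketch below is only a schematic illustration with invented states and step costs, not the article's model or grammar.

        # Best-first ("informed") search over hypothetical parser states for
        # "the horse raced past the barn fell". Lower accumulated cost is
        # expanded first, so the preferred main-verb analysis is tried before
        # the dispreferred reduced-relative one.
        import heapq

        SUCCESSORS = {
            "start":                   [("NP[the horse]", 1)],
            "NP[the horse]":           [("main-verb[raced]", 1),
                                        ("reduced-relative[raced]", 3)],
            "main-verb[raced]":        [("dead-end", 5)],     # cannot attach "fell"
            "reduced-relative[raced]": [("full-parse", 1)],   # "fell" is the main verb
        }

        def informed_search(start, goal):
            """Expand states in order of accumulated cost; return (cost, path) to goal."""
            frontier = [(0, start, [start])]
            while frontier:
                cost, state, path = heapq.heappop(frontier)
                if state == goal:
                    return cost, path
                for nxt, step in SUCCESSORS.get(state, []):
                    heapq.heappush(frontier, (cost + step, nxt, path + [nxt]))
            return None

        print(informed_search("start", "full-parse"))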
  3.
    Quantifying Structural and Non‐structural Expectations in Relative Clause Processing. Zhong Chen & John T. Hale - 2021 - Cognitive Science 45 (1):e12927.
    Information‐theoretic complexity metrics, such as Surprisal (Hale, 2001; Levy, 2008) and Entropy Reduction (Hale, 2003), are linking hypotheses that bridge theorized expectations about sentences and observed processing difficulty in comprehension. These expectations can be viewed as syntactic derivations constrained by a grammar. However, this expectation‐based view is not limited to syntactic information alone. The present study combines structural and non‐structural information in unified models of word‐by‐word sentence processing difficulty. Using probabilistic minimalist grammars (Stabler, 1997), we extend expectation‐based models to include (...)
    2 citations
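    To make one of the two linking hypotheses named above concrete, the snippet below computes surprisal for a toy next-word distribution. The prefix and probabilities are invented for illustration and are not drawn from the study.

        # Surprisal of an observed word: -log2 P(word | prefix). A low-probability
        # continuation yields high surprisal and hence predicted difficulty.
        # The distribution below is hypothetical.
        import math

        next_word = {"attacked": 0.55, "the": 0.40, "admitted": 0.05}

        def surprisal(p):
            return -math.log2(p)

        print(surprisal(next_word["admitted"]))   # about 4.32 bits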
  4.
    Modeling Structure‐Building in the Brain With CCG Parsing and Large Language Models. Miloš Stanojević, Jonathan R. Brennan, Donald Dunagan, Mark Steedman & John T. Hale - 2023 - Cognitive Science 47 (7):e13312.
    To model behavioral and neural correlates of language comprehension in naturalistic environments, researchers have turned to broad‐coverage tools from natural‐language processing and machine learning. Where syntactic structure is explicitly modeled, prior work has relied predominantly on context‐free grammars (CFGs), yet such formalisms are not sufficiently expressive for human languages. Combinatory categorial grammars (CCGs) are sufficiently expressive directly compositional models of grammar with flexible constituency that affords incremental interpretation. In this work, we evaluate whether a more expressive CCG provides a better (...)
    1 citation
  5.
    Automaton theories of human sentence comprehension. John T. Hale - 2014 - Stanford, California: CSLI Publications, Center for the Study of Language and Information.
    How could the kinds of grammars that linguists write actually be used in models of perceptual processing? This book relates grammars to cognitive architecture. It shows how incremental parsing works, step-by-step, and how specific learning rules might lead to frequency-sensitive preferences. Along the way, Hale reconsiders garden-pathing, the parallel/serial distinction, and information-theoretical complexity metrics such as surprisal. A "must" for cognitive scientists of language.
  6.
    Introduction to the Issue on Computational Models of Natural Language. John Hale & David Reitter - 2013 - Topics in Cognitive Science 5 (3):388-391.
  7.
    The timetable project. John Me Hale - 1972 - In Peter Albertson & Margery Barnett (eds.), Managing the Planet. Englewood Cliffs, N.J., Prentice-Hall.
  8.
    ANCIENT EPICS AND CHILDREN'S LITERATURE - (G.L.) Irby Epic Echoes in The Wind in the Willows. Pp. x + 140, ills. London and New York: Routledge, 2022. Cased, £44.99, US$59.95. ISBN: 978-1-03-210510-9. [REVIEW] Elizabeth Hale & John K. Hale - 2023 - The Classical Review 73 (2):703-705.
  9.
    Book review. [REVIEW] John Hale - 2007 - Journal of Logic, Language and Information 16 (2):217-220.
    This is a good book. Its main message is that a particular approach to natural language called type-logical grammar can, in-principle, be equipped with a learning theory. In this review, I first identify what type-logical grammar is, then outline what the learning theory is. Then I try to articulate why this message is important for the logical, linguistic and information-theoretic parts of cognitive science. Overall, I think the book’s main message is significant enough to warrant patience with its scientific limitations.