  • The influence of speaker gaze on listener comprehension: Contrasting visual versus intentional accounts. Maria Staudte, Matthew W. Crocker, Alexis Heloir & Michael Kipp - 2014 - Cognition 133 (1):317-328.
  • Eye’ll Help You Out! How the Gaze Cue Reduces the Cognitive Load Required for Reference Processing. Mirjana Sekicki & Maria Staudte - 2018 - Cognitive Science 42 (8):2418-2458.
    Referential gaze has been shown to benefit language processing in situated communication in terms of shifting visual attention and leading to shorter reaction times on subsequent tasks. The present study simultaneously assessed both visual attention and, importantly, the immediate cognitive load induced at different stages of sentence processing. We aimed to examine the dynamics of combining visual and linguistic information in creating anticipation for a specific object and the effect this has on language processing. We report evidence from three visual‐world (...)
  • Forward models and their implications for production, comprehension, and dialogue. Martin J. Pickering & Simon Garrod - 2013 - Behavioral and Brain Sciences 36 (4):377-392.
    Our target article proposed that language production and comprehension are interwoven, with speakers making predictions of their own utterances and comprehenders making predictions of other people's utterances at different linguistic levels. Here, we respond to comments about such issues as cognitive architecture and its neural basis, learning and development, monitoring, the nature of forward models, communicative intentions, and dialogue.
  • An integrated theory of language production and comprehension. Martin J. Pickering & Simon Garrod - 2013 - Behavioral and Brain Sciences 36 (4):329-347.
    Currently, production and comprehension are regarded as quite distinct in accounts of language processing. In rejecting this dichotomy, we instead assert that producing and understanding are interwoven, and that this interweaving is what enables people to predict themselves and each other. We start by noting that production and comprehension are forms of action and action perception. We then consider the evidence for interweaving in action, action perception, and joint action, and explain such evidence in terms of prediction. Specifically, we assume (...)
  • Eyes on the Mind: Investigating the Influence of Gaze Dynamics on the Perception of Others in Real-Time Social Interaction. Ulrich J. Pfeiffer, Leonhard Schilbach, Mathis Jording, Bert Timmermans, Gary Bente & Kai Vogeley - 2012 - Frontiers in Psychology 3.
  • Perceiving where another person is looking: the integration of head and body information in estimating another person’s gaze. Pieter Moors, Filip Germeys, Iwona Pomianowska & Karl Verfaillie - 2015 - Frontiers in Psychology 6.
  • Preparing to be punched: Prediction may not always require inference of intentions. Helene Kreysa - 2013 - Behavioral and Brain Sciences 36 (4):362-363.
    Pickering & Garrod's (P&G's) framework assumes an efference copy based on the interlocutor's intentions. Yet, elaborate attribution of intentions may not always be necessary for online prediction. Instead, contextual cues such as speaker gaze can provide similar information with a lower demand on processing resources.
  • Can Speaker Gaze Modulate Syntactic Structuring and Thematic Role Assignment during Spoken Sentence Comprehension? Pia Knoeferle & Helene Kreysa - 2012 - Frontiers in Psychology 3.
  • When a look is enough: Neurophysiological correlates of referential speaker gaze in situated comprehension. Torsten Kai Jachmann, Heiner Drenhaus, Maria Staudte & Matthew W. Crocker - 2023 - Cognition 236 (C):105449.
  • Social eye gaze modulates processing of speech and co-speech gesture. Judith Holler, Louise Schubotz, Spencer Kelly, Peter Hagoort, Manuela Schuetze & Aslı Özyürek - 2014 - Cognition 133 (3):692-697.
  • Exploiting Listener Gaze to Improve Situated Communication in Dynamic Virtual Environments. Konstantina Garoufi, Maria Staudte, Alexander Koller & Matthew W. Crocker - 2016 - Cognitive Science 40 (7):1671-1703.
    Beyond the observation that both speakers and listeners rapidly inspect the visual targets of referring expressions, it has been argued that such gaze may constitute part of the communicative signal. In this study, we investigate whether a speaker may, in principle, exploit listener gaze to improve communicative success. In the context of a virtual environment where listeners follow computer-generated instructions, we provide two kinds of support for this claim. First, we show that listener gaze provides a reliable real-time index of (...)
  • In the same boat: The influence of sharing the situational context on a speaker’s (a robot’s) persuasiveness. Kerstin Fischer, Lars Christian Jensen & Nadine Zitzmann - 2021 - Interaction Studies 22 (3):488-515.
    In this paper, we analyze what effects indicators of a shared situation have on a speaker’s persuasiveness by investigating how a robot’s advice is received when it indicates that it is sharing the situational context with its user. In our experiment, 80 participants interacted with a robot that referred to aspects of the shared context: Face tracking indicated that the robot saw the participant, incremental feedback suggested that the robot was following their actions, and comments about, and gestures towards, the (...)
  • Anticipation in Real‐World Scenes: The Role of Visual Context and Visual Memory. Moreno I. Coco, Frank Keller & George L. Malcolm - 2016 - Cognitive Science 40 (8):1995-2024.
    The human sentence processor is able to make rapid predictions about upcoming linguistic input. For example, upon hearing the verb eat, anticipatory eye-movements are launched toward edible objects in a visual scene. However, the cognitive mechanisms that underlie anticipation remain to be elucidated in ecologically valid contexts. Previous research has, in fact, mainly used clip-art scenes and object arrays, raising the possibility that anticipatory eye-movements are limited to displays containing a small number of objects in a visually impoverished context. In (...)
  • Theory of mind: mechanisms, methods, and new directions. Lindsey J. Byom & Bilge Mutlu - 2013 - Frontiers in Human Neuroscience 7.
  • You Look Human, But Act Like a Machine: Agent Appearance and Behavior Modulate Different Aspects of Human–Robot Interaction. Abdulaziz Abubshait & Eva Wiese - 2017 - Frontiers in Psychology 8:277299.
    Gaze following occurs automatically in social interactions, but the degree to which gaze is followed depends on whether an agent is perceived to have a mind, making its behavior socially more relevant for the interaction. Mind perception also modulates the attitudes we have towards others, and determines the degree of empathy, prosociality and morality invested in social interactions. Seeing mind in others is not exclusive to human agents, but mind can also be ascribed to nonhuman agents like robots, as long (...)