Results for 'visual choice RT, auditory input variations'

1000+ found
  1. Effects of some variations in auditory input upon visual choice reaction time. Ira H. Bernstein & Barry A. Edelstein - 1971 - Journal of Experimental Psychology 87 (2):241.
  2. Effects of an irrelevant auditory stimulus on visual choice reaction time. J. Richard Simon & John L. Craft - 1970 - Journal of Experimental Psychology 86 (2):272.
  3. A choice reaction time test of ideomotor theory. Anthony G. Greenwald - 1970 - Journal of Experimental Psychology 86 (1):20.
  4. Grip force as a functional window to somatosensory cognition. Birgitta Dresp-Langley - 2022 - Frontiers in Psychology 13:1026439.
    Analysis of grip force signals tailored to hand and finger movement evolution and changes in grip force control during task execution provide unprecedented functional insight into somatosensory cognition. Somatosensory cognition is a basis of our ability to manipulate, move, and transform objects of the physical world around us, to recognize them on the basis of touch alone, and to grasp them with the right amount of force for lifting and manipulating them. Recent technology has permitted the wireless monitoring of grip (...)
  5. Visual Influence on Auditory Perception of Vowels by French-Speaking Children and Adults. Paméla Trudeau-Fisette, Laureline Arnaud & Lucie Ménard - 2022 - Frontiers in Psychology 13.
    Audiovisual interaction in speech perception is well defined in adults. Despite the large body of evidence suggesting that children are also sensitive to visual input, very few empirical studies have been conducted. To further investigate whether visual inputs influence auditory perception of phonemes in preschoolers in the same way as in adults, we conducted an audiovisual identification test. The auditory stimuli were presented either in an auditory condition only or simultaneously with a visual (...)
  6. Auditory Target Detection Enhances Visual Processing and Hippocampal Functional Connectivity. Roy Moyal, Hamid B. Turker, Wen-Ming Luh & Khena M. Swallow - 2022 - Frontiers in Psychology 13.
    Though dividing one’s attention between two input streams typically impairs performance, detecting a behaviorally relevant stimulus can sometimes enhance the encoding of unrelated information presented at the same time. Previous research has shown that selection of this kind boosts visual cortical activity and memory for concurrent items. An important unanswered question is whether such effects are reflected in processing quality and functional connectivity in visual regions and in the hippocampus. In this fMRI study, participants were asked to (...)
  7. Time course of visual and auditory encoding. Marvin J. Dainoff - 1970 - Journal of Experimental Psychology 86 (2):214.
  8. Effects of Visual Information on Adults' and Infants' Auditory Statistical Learning. Erik D. Thiessen - 2010 - Cognitive Science 34 (6):1093-1106.
    Infant and adult learners are able to identify word boundaries in fluent speech using statistical information. Similarly, learners are able to use statistical information to identify word–object associations. Successful language learning requires both feats. In this series of experiments, we presented adults and infants with audio–visual input from which it was possible to identify both word boundaries and word–object relations. Adult learners were able to identify both kinds of statistical relations from the same input. Moreover, their learning (...)
  9. Visual Speech Perception Cues Constrain Patterns of Articulatory Variation and Sound Change. Jonathan Havenhill & Youngah Do - 2018 - Frontiers in Psychology 9:337534.
    What are the factors that contribute to (or inhibit) diachronic sound change? While acoustically motivated sound changes are well documented, research on the articulatory and audiovisual-perceptual aspects of sound change is limited. This paper investigates the interaction of articulatory variation and audiovisual speech perception in the Northern Cities Vowel Shift (NCVS), a pattern of sound change observed in the Great Lakes region of the United States. We focus specifically on the maintenance of the contrast between the vowels /ɑ/ and /ɔ/, (...)
  10. Statistically Induced Chunking Recall: A Memory‐Based Approach to Statistical Learning. Erin S. Isbilen, Stewart M. McCauley, Evan Kidd & Morten H. Christiansen - 2020 - Cognitive Science 44 (7):e12848.
    The computations involved in statistical learning have long been debated. Here, we build on work suggesting that a basic memory process, chunking, may account for the processing of statistical regularities into larger units. Drawing on methods from the memory literature, we developed a novel paradigm to test statistical learning by leveraging a robust phenomenon observed in serial recall tasks: that short‐term memory is fundamentally shaped by long‐term distributional learning. In the statistically induced chunking recall (SICR) task, participants are exposed to (...)
  11. The latency operating characteristic: II. Effects of visual stimulus intensity on choice reaction time. Joseph S. Lappin & Kenneth Disch - 1972 - Journal of Experimental Psychology 93 (2):367.
  12. Head Anticipation During Locomotion With Auditory Instruction in the Presence and Absence of Visual Input. Felix Dollack, Monica Perusquía-Hernández, Hideki Kadone & Kenji Suzuki - 2019 - Frontiers in Human Neuroscience 13.
  13. Spontaneous Eye Blinks Map the Probability of Perceptual Reinterpretation During Visual and Auditory Ambiguity. Supriya Murali & Barbara Händel - 2024 - Cognitive Science 48 (2):e13414.
    Spontaneous eye blinks are modulated around perceptual events. Our previous study, using a visual ambiguous stimulus, indicated that blink probability decreases before a reported perceptual switch. In the current study, we tested our hypothesis that an absence of blinks marks a time in which perceptual switches are facilitated in‐ and outside the visual domain. In three experiments, presenting either a visual motion quartet in light or darkness or a bistable auditory streaming stimulus, we found a co‐occurrence (...)
  14. Congruent aero-tactile stimuli bias perception of voicing continua. Dolly Goldenberg, Mark K. Tiede, Ryan T. Bennett & D. H. Whalen - 2022 - Frontiers in Human Neuroscience 16:879981.
    Multimodal integration is the formation of a coherent percept from different sensory inputs such as vision, audition, and somatosensation. Most research on multimodal integration in speech perception has focused on audio-visual integration. In recent years, audio-tactile integration has also been investigated, and it has been established that puffs of air applied to the skin and timed with listening tasks shift the perception of voicing by naive listeners. The current study has replicated and extended these findings by testing the effect (...)
  15. The Role of Words and Sounds in Infants' Visual Processing: From Overshadowing to Attentional Tuning. Vladimir M. Sloutsky & Christopher W. Robinson - 2008 - Cognitive Science 32 (2):342-365.
    Although it is well documented that language plays an important role in cognitive development, there are different views concerning the mechanisms underlying these effects. Some argue that even early in development, effects of words stem from top‐down knowledge, whereas others argue that these effects stem from auditory input affecting attention allocated to visual input. Previous research (e.g., Robinson & Sloutsky, 2004a) demonstrated that non‐speech sounds attenuate processing of corresponding visual input at 8, 12, and (...)
  16. Interhemispheric effects on choice reaction times to one-, two-, and three-letter displays. Carlo Umilta, Nancy Frost & Ray Hyman - 1972 - Journal of Experimental Psychology 93 (1):198.
  17. Information‐Theoretic Properties of Auditory Sequences Dynamically Influence Expectation and Memory. Kat Agres, Samer Abdallah & Marcus Pearce - 2018 - Cognitive Science 42 (1):43-76.
    A basic function of cognition is to detect regularities in sensory input to facilitate the prediction and recognition of future events. It has been proposed that these implicit expectations arise from an internal predictive coding model, based on knowledge acquired through processes such as statistical learning, but it is unclear how different types of statistical information affect listeners’ memory for auditory stimuli. We used a combination of behavioral and computational methods to investigate memory for non-linguistic auditory sequences. (...)
  18. Perceptual load influences auditory space perception in the ventriloquist aftereffect. Ranmalee Eramudugolla, Marc R. Kamke, Salvador Soto-Faraco & Jason B. Mattingley - 2011 - Cognition 118 (1):62-74.
    A period of exposure to trains of simultaneous but spatially offset auditory and visual stimuli can induce a temporary shift in the perception of sound location. This phenomenon, known as the 'ventriloquist aftereffect', reflects a realignment of auditory and visual spatial representations such that they approach perceptual alignment despite their physical spatial discordance. Such dynamic changes to sensory representations are likely to underlie the brain's ability to accommodate inter-sensory discordance produced by sensory errors (particularly in sound (...)
  19. The Brain Tracks Multiple Predictions About the Auditory Scene. Kelin M. Brace & Elyse S. Sussman - 2021 - Frontiers in Human Neuroscience 15:747769.
    The predictable rhythmic structure is important to most ecologically relevant sounds for humans, such as is found in the rhythm of speech or music. This study addressed the question of how rhythmic predictions are maintained in the auditory system when there are multiple perceptual interpretations occurring simultaneously and emanating from the same sound source. We recorded the electroencephalogram (EEG) while presenting participants with a tone sequence that had two different tone feature patterns, one based on the sequential rhythmic variation (...)
  20. Mode of input effects on subject-controlled processes. Carl E. McFarland & George Kellas - 1974 - Journal of Experimental Psychology 103 (2):343.
  21. Visual Classification of Music Style Transfer Based on PSO-BP Rating Prediction Model. Tianjiao Li - 2021 - Complexity 2021:1-9.
    In this paper, based on computer reading and processing of music frequency, amplitude, timbre, image pixel, color filling, and so forth, a method of image style transfer guided by music feature data is implemented in real-time playback, using existing music files and image files, processing and trying to reconstruct the fluent relationship between the two in terms of auditory and visual, generating dynamic, musical sound visualization with real-time changes in the visualization. Although recommendation systems have been well developed (...)
  22. Intermodality inconsistency of input and directed attention as determinants of the nature of adaptation. Lance K. Canon - 1970 - Journal of Experimental Psychology 84 (1):141.
  23. Visual Heuristics for Verb Production: Testing a Deep‐Learning Model With Experiments in Japanese. Franklin Chang, Tomoko Tatsumi, Yuna Hiranuma & Colin Bannard - 2023 - Cognitive Science 47 (8):e13324.
    Tense/aspect morphology on verbs is often thought to depend on event features like telicity, but it is not known how speakers identify these features in visual scenes. To examine this question, we asked Japanese speakers to describe computer‐generated animations of simple actions with variation in visual features related to telicity. Experiments with adults and children found that they could use goal information in the animations to select appropriate past and progressive verb forms. They also produced a large number (...)
  24. Exploring Variation Between Artificial Grammar Learning Experiments: Outlining a Meta‐Analysis Approach. Antony S. Trotter, Padraic Monaghan, Gabriël J. L. Beckers & Morten H. Christiansen - 2020 - Topics in Cognitive Science 12 (3):875-893.
    Studies of AGL have frequently used training and test stimuli that might provide multiple cues for learning, raising the question of what subjects have actually learned. Using a selected subset of studies on humans and non‐human animals, Trotter et al. demonstrate how a meta‐analysis can be used to identify relevant experimental variables, providing a first step in assessing the relative contribution of design features of grammars as well as of species‐specific effects on AGL.
  25. Top-down modulation of visual processing and knowledge after 250 ms supports object constancy of category decisions. Haline E. Schendan & Giorgio Ganis - 2015 - Frontiers in Psychology 6:79638.
    People categorize objects slowly when visual input is highly impoverished instead of optimal. While bottom-up models may explain a decision with optimal input, perceptual hypothesis testing (PHT) theories implicate top-down processes with impoverished input. Brain mechanisms and the time course of PHT are largely unknown. This event-related potential study used a neuroimaging paradigm that implicated prefrontal cortex in top-down modulation of occipitotemporal cortex. Subjects categorized more impoverished and less impoverished real and pseudo objects. PHT theories predict (...)
  26. Redefining “Learning” in Statistical Learning: What Does an Online Measure Reveal About the Assimilation of Visual Regularities? Noam Siegelman, Louisa Bogaerts, Ofer Kronenfeld & Ram Frost - 2018 - Cognitive Science 42 (S3):692-727.
    From a theoretical perspective, most discussions of statistical learning have focused on the possible “statistical” properties that are the object of learning. Much less attention has been given to defining what “learning” is in the context of “statistical learning.” One major difficulty is that SL research has been monitoring participants’ performance in laboratory settings with a strikingly narrow set of tasks, where learning is typically assessed offline, through a set of two-alternative-forced-choice questions, which follow a brief visual or (...)
  27. Seeing to hear better: evidence for early audio-visual interactions in speech identification. Jean-Luc Schwartz, Frédéric Berthommier & Christophe Savariaux - 2004 - Cognition 93 (2):69-78.
    Lip reading is the ability to partially understand speech by looking at the speaker's lips. It improves the intelligibility of speech in noise when audio-visual perception is compared with audio-only perception. A recent set of experiments showed that seeing the speaker's lips also enhances sensitivity to acoustic information, decreasing the auditory detection threshold of speech embedded in noise [J. Acoust. Soc. Am. 109 (2001) 2272; J. Acoust. Soc. Am. 108 (2000) 1197]. However, detection is different from comprehension, and (...)
  28. Cross-Linguistic Influence on L2 Before and After Extreme Reduction in Input: The Case of Japanese Returnee Children. Maki Kubota, Caroline Heycock, Antonella Sorace & Jason Rothman - 2020 - Frontiers in Psychology 11:560874.
    This study investigates the choice of genitive forms (the woman’s book vs. the book of the woman) in the English of Japanese-English bilingual returnees (i.e. children who returned from a second language dominant environment to their first language environment). The specific aim was to examine whether change in language dominance/exposure influences choice of genitive form in the bilingual children; the more general question was the extent to which observed behaviour can be explained by cross linguistic influence (CLI). First, (...)
  29. With No Attention Specifically Directed to It, Rhythmic Sound Does Not Automatically Facilitate Visual Task Performance. Jorg De Winne, Paul Devos, Marc Leman & Dick Botteldooren - 2022 - Frontiers in Psychology 13.
    In a century where humans and machines—powered by artificial intelligence or not—increasingly work together, it is of interest to understand human processing of multi-sensory stimuli in relation to attention and working memory. This paper explores whether and when supporting visual information with rhythmic auditory stimuli can optimize multi-sensory information processing. In turn, this can make the interaction between humans or between machines and humans more engaging, rewarding and activating. For this purpose a novel working memory paradigm was developed (...)
  30. Tracking Multiple Statistics: Simultaneous Learning of Object Names and Categories in English and Mandarin Speakers. Chi-Hsin Chen, Lisa Gershkoff-Stowe, Chih-Yi Wu, Hintat Cheung & Chen Yu - 2017 - Cognitive Science 41 (6):1485-1509.
    Two experiments were conducted to examine adult learners' ability to extract multiple statistics in simultaneously presented visual and auditory input. Experiment 1 used a cross‐situational learning paradigm to test whether English speakers were able to use co‐occurrences to learn word‐to‐object mappings and concurrently form object categories based on the commonalities across training stimuli. Experiment 2 replicated the first experiment and further examined whether speakers of Mandarin, a language in which final syllables of object names are more predictive (...)
  31. Effects of prosodically modulated sub-phonetic variation on lexical competition. Anne Pier Salverda, Delphine Dahan, Michael K. Tanenhaus, Katherine Crosswhite, Mikhail Masharov & Joyce McDonough - 2007 - Cognition 105 (2):466-476.
  32. Suggestion overrides automatic audiovisual integration. Catherine Déry, Natasha K. J. Campbell, Michael Lifshitz & Amir Raz - 2014 - Consciousness and Cognition 24:33-37.
    Cognitive scientists routinely distinguish between controlled and automatic mental processes. Through learning, practice, and exposure, controlled processes can become automatic; however, whether automatic processes can become deautomatized – recuperated under the purview of control – remains unclear. Here we show that a suggestion derails a deeply ingrained process involving involuntary audiovisual integration. We compared the performance of highly versus less hypnotically suggestible individuals in a classic McGurk paradigm – a perceptual illusion task demonstrating the influence of visual facial movements (...)
  33. Face-to-face contact during infancy: How the development of gaze to faces feeds into infants’ vocabulary outcomes. Zsofia Belteki, Carlijn van den Boomen & Caroline Junge - 2022 - Frontiers in Psychology 13.
    Infants acquire their first words through interactions with social partners. In the first year of life, infants receive a high frequency of visual and auditory input from faces, making faces a potential strong social cue in facilitating word-to-world mappings. In this position paper, we review how and when infant gaze to faces is likely to support their subsequent vocabulary outcomes. We assess the relevance of infant gaze to faces selectively, in three domains: infant gaze to different features (...)
  34. Confidence as a common currency between vision and audition. Vincent de Gardelle, Francois Le Corre & Pascal Mamassian - 2016 - PLoS ONE 11 (1).
    The idea of a common currency underlying our choice behaviour has played an important role in sciences of behaviour, from neurobiology to psychology and economics. However, while it has been mainly investigated in terms of values, with a common scale on which goods would be evaluated and compared, the question of a common scale for subjective probabilities and confidence in particular has received only little empirical investigation so far. The present study extends previous work addressing this question, by showing (...)
  35. Visual Endurance and Auditory Perdurance. Błażej Skrzypulec - 2020 - Erkenntnis 85 (2):467-488.
    Philosophers often state that the persistence of objects in vision is experienced differently than the persistence of sounds in audition. This difference is expressed by using metaphors from the metaphysical endurantism/perdurantism debate. For instance, it is claimed that only sounds are perceived as “temporally extended”. The paper investigates whether it is justified to characterize visually experienced objects and auditorily experienced sounds as different types of entities: endurants and perdurants respectively. This issue is analyzed from the perspective of major specifications of (...)
  36. Energy integration in intersensory facilitation. Ira H. Bernstein, Robert Rose & Victor M. Ashe - 1970 - Journal of Experimental Psychology 86 (2):196.
  37. The Temporal Structure of Olfactory Experience. Keith A. Wilson - 2023 - In Benjamin D. Young & Andreas Keller (eds.), Theoretical Perspectives on Smell. Routledge. pp. 111-130.
    Visual experience is often characterised as being essentially spatial, and auditory experience essentially temporal. But this contrast, which is based upon the temporal structure of the objects of sensory experience rather than the experiences to which they give rise, is somewhat superficial. By carefully examining the various sources of temporal variation in the chemical senses we can more clearly identify the temporal profile of the resulting smell and taste (aka flavour) experiences. This in turn suggests that at least (...)
  38. Language Separation in Bidialectal Speakers: Evidence From Eye Tracking. Björn Lundquist & Øystein A. Vangsnes - 2018 - Frontiers in Psychology 9:369862.
    The aim of this study was to find out how people process the dialectal variation encountered in the daily linguistic input. We conducted an eye tracking study (Visual Word Paradigm) that targeted the predictive processing of grammatical gender markers. Three different groups of Norwegian speakers took part in the experiment: one group of students from the capital Oslo, and two groups of dialect speakers from the Western Norwegian town Sogndal. One Sogndal group was defined as ``stable dialect speakers'', (...)
  39. Competition among visual, verbal, and auditory modalities: a socio-semiotic perspective. Nana Zhou - forthcoming - Semiotica.
    This article presents a fresh perspective on the interplay among visual, verbal, and auditory modalities, positing that these modalities, as semogenic resources, compete to express dynamic meanings. The theoretical paradigm emphasizes that whether a modality or an element within a modality gets or loses semantic status, it will elicit an additional layer of social meaning to depict a comprehensive picture of a story together with an explicit semiotic meaning. The article adopts a qualitative method to analyze the data, (...)
  40. A spatially oriented decision does not induce consciousness in a motor task. Bruce Bridgeman & Valerie Huemer - 1998 - Consciousness and Cognition 7 (3):454-464.
    Visual information follows at least two branches in the human nervous system, following a common input stage: a cognitive ''what'' branch governs perception and experience, while a sensorimotor ''how'' branch handles visually guided behavior though its outputs are unconscious. The sensorimotor system is probed with an isomorphic task, requiring a 1:1 relationship between target position and motor response. The cognitive system, in contrast, is probed with a forced qualitative decision, expressed verbally, about the location of a target. Normally, (...)
  41. Music and multimodal mental imagery. Bence Nanay - forthcoming - In Music and Mental Imagery. Routledge.
    Mental imagery is early perceptual processing that is not triggered by corresponding sensory stimulation in the relevant sense modality. Multimodal mental imagery is early perceptual processing that is triggered by sensory stimulation in a different sense modality. For example, when early visual or tactile processing is triggered by auditory sensory stimulation, this amounts to multimodal mental imagery. Pulling together philosophy, psychology and neuroscience, I will argue in this paper that multimodal mental imagery plays a crucial role in our (...)
  42. The Visual and the Auditory. Mikle D. Ledgerwood - 1994 - Semiotics:381-391.
  43. Modeling violations of the race model inequality in bimodal paradigms: co-activation from decision and non-decision components. Michael Zehetleitner, Emil Ratko-Dehnert & Hermann J. Müller - 2015 - Frontiers in Human Neuroscience 9:93369.
    The redundant-signals paradigm (RSP) is designed to investigate response behavior in perceptual tasks in which response-relevant targets are defined by either one or two features, or modalities. The common finding is that responses are speeded for redundantly compared to singly defined targets. This redundant-signals effect (RSE) can be accounted for by race models if the response times do not violate the race model inequality (RMI). When there are violations of the RMI, race models are effectively excluded as a viable account (...)
  44. A double stimulation test of ideomotor theory with implications for selective attention. Anthony G. Greenwald - 1970 - Journal of Experimental Psychology 84 (3):392.
  45. Learning low-dimensional representations via the usage of multiple-class labels. S. Edelman - unknown
    Learning to recognize visual objects from examples requires the ability to find meaningful patterns in spaces of very high dimensionality. We present a method for dimensionality reduction which effectively biases the learning system by combining multiple constraints via the use of class labels. The use of extensive class labels steers the resulting low-dimensional representation to become invariant to those directions of variation in the input space that are irrelevant to classification; this is done merely by making class labels (...)
  46. Absence of modulatory action on haptic height perception with musical pitch. Michele Geronazzo, Federico Avanzini & Massimo Grassi - 2015 - Frontiers in Psychology 6:139245.
    Although acoustic frequency is not a spatial property of physical objects, in common language, pitch, i.e., the psychological correlated of frequency, is often labeled spatially (i.e., “high in pitch” or “low in pitch”). Pitch-height is known to modulate (and interact with) the response of participants when they are asked to judge spatial properties of non-auditory stimuli (e.g., visual) in a variety of behavioral tasks. In the current study we investigated whether the modulatory action of pitch-height extended to the (...)
  47. Investigating Established EEG Parameter During Real-World Driving. Janna Protzak & Klaus Gramann - 2018 - Frontiers in Psychology 9:412837.
    In real life, behavior is influenced by dynamically changing contextual factors and is rarely limited to simple tasks and binary choices. For a meaningful interpretation of brain dynamics underlying more natural cognitive processing in active humans, ecologically valid test scenarios are essential. To understand whether brain dynamics in restricted artificial lab settings reflect the neural activity in complex natural environments, we systematically tested the auditory event-related P300 in both settings. We developed an integrative approach comprising an initial P300-study in (...)
  48. Laws of visual choice reaction time. Warren H. Teichner & Marjorie J. Krebs - 1974 - Psychological Review 81 (1):75-98.
  49. Patterns of eye blinks are modulated by auditory input in humans. Stefan E. Huber, Markus Martini & Pierre Sachse - 2022 - Cognition 221:104982.
  50. Simple reaction time as a function of stimulus intensity in decibels of light and sound. David L. Kohfeld - 1971 - Journal of Experimental Psychology 88 (2):251.
1 — 50 / 1000