On the effects of multimodal information integration in multitasking
Abstract
Recent years have seen considerable advances in our understanding of the neuronal mechanisms underlying multitasking, but the role of multimodal integration in this faculty has remained unclear. We examined this issue by comparing different modality combinations in a multitasking paradigm. In-depth neurophysiological analyses of event-related potentials (ERPs) were conducted to complement the behavioral data. Specifically, we applied signal decomposition using second-order blind identification (SOBI) to the multi-subject ERP data, followed by source localization. We found that both general multimodal information integration and modality-specific aspects modulate behavioral performance and the associated neurophysiological correlates. Simultaneous multimodal input generally increased early attentional processing of visual stimuli, as well as measures of cognitive effort and conflict. Yet tactile-visual input caused larger impairments in multitasking than audio-visual input. General aspects of multimodal information integration modulated activity in the premotor cortex as well as in several visual association areas concerned with integrating visual information with input from other modalities. Beyond this, the specific combination of modalities also affected performance and measures of conflict/effort originating in prefrontal regions.