Abstract
In this paper we review digital technologies that can be used to study what the experiences of past peoples might have been. We focus on the use of immersive virtual reality (VR) systems to frame hypotheses about the visual and auditory experiences of past individuals, based on available archaeological evidence. Reconstructions of past places and landscapes are often centered on visual data, and we argue that we should move beyond this ocularcentric focus by integrating sound and other modalities into VR. Yet even approaches that emphasize sound in archaeology, as in archaeoacoustics (Scarre & Lawson, 2006; Diaz-Andreu & Mattioli, 2015; Suárez et al., 2016), often retain a unimodal emphasis that limits how much we can understand of past peoples’ sensory experience. We therefore argue for the importance of seeing and hearing at the same time (i.e. multimodal sensory integration) in phenomenological archaeology. This is possible using immersive virtual reality systems that engage users with both sight and sound simultaneously.