Title:
Travelling without moving: Auditory scene cues for translational self-motion
Author(s)
Väljamäe, Aleksander
Larsson, Pontus
Västfjäll, Daniel
Kleiner, Mendel
Abstract
Creating a sense of illusory self-motion is crucial for many Virtual Reality applications, and the auditory modality is an essential, but often neglected, component of such simulations. In this paper, the perceptual optimization of auditory-induced, translational self-motion (vection) is studied using binaurally synthesized and reproduced sound fields. The results suggest that auditory scene consistency and ecological validity make a minimal set of acoustic cues sufficient for eliciting auditory-induced vection. Specifically, it was found that a focused attention task and the motion characteristics of sound objects (approaching or receding) play an important role in self-motion perception. In addition, stronger sensations for auditory-induced self-translation than for previously investigated self-rotation also suggest a strong ecological validity bias, as translation is the most common direction of self-movement.
Date Issued
2005-07
Resource Type
Text
Resource Subtype
Proceedings