Series
International Conference on Auditory Display (ICAD)

Series Type: Event Series
Publication Search Results

Now showing 1 - 5 of 5
  • Item
    Perceptual effects of auditory information about own and other movements
    (Georgia Institute of Technology, 2012-06) Schmitz, Gerd; Effenberg, Alfred O.
    In sport, accurate predictions of other persons' movements are essential. Previous studies have shown that predictions can be enhanced by mapping movements onto sound (sonification) and providing audiovisual feedback [1]. The present study investigated behavioral mechanisms of movement sonification and examined whether the effects of one's own movements and those of other persons can be predicted just by listening to them. Eight athletes heard sonifications of an indoor rower and quantified the resulting velocities of a virtual boat. Although boat velocity was not mapped onto sound directly, regression analysis showed that it explained subjects' quantifications (R² = 0.80) significantly better than the directly sonified amplitude and force parameters. Thus, a percept of boat velocity might have emerged from the sonifications. Predictions of the effects of unknown movements were above chance level and as good as predictions of own movements. Furthermore, athletes were able to identify their own technique among others (d' = 0.47 ± 0.43). The results confirm large perceptual effects of auditory feedback and, most importantly, suggest that movement sonification can address central motor representations just by listening. Therefore, not only predictability but also synchronization with other persons' movements might be supported.
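The regression analysis behind the reported R² = 0.80 can be illustrated with a minimal sketch. The data, the single-predictor setup, and the function name below are illustrative assumptions, not the study's actual analysis:

```python
# Hypothetical sketch: fit an ordinary least-squares line relating a
# predictor (e.g., boat velocity) to listeners' velocity estimates and
# report the coefficient of determination R². Data values are made up.

def r_squared(xs, ys):
    """R² of a simple linear regression of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# A perfectly linear relation gives R² = 1.0:
print(r_squared([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
```

An R² near 0.80, as reported above, would mean the predictor accounts for about 80% of the variance in the listeners' judgments.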
  • Item
    CircoSonic: a sonification of CIRCOS, a circular graph of pair wise table data
    (Georgia Institute of Technology, 2012-06) Nguyen, Vinh Xuan
    This paper presents, applies and evaluates "CircoSonic," an interactive sonification of "Circos." It outlines the development of modifying a gaming engine to replicate Circos, a circular graph for comparing pairwise relationships in a 2D data table, with the added capability of sonification through interaction. The developed prototype is evaluated using an insight-based methodology and a static dataset of historic, current and projected water availability of the Murray Darling Basin. A muted version of CircoSonic is used in the evaluation to establish a visualization-to-visualization baseline, against which the combined visualization-and-sonification condition can be compared. In general, Circos was found to outperform CircoSonic, except in generating complex insights that referred to contextual knowledge. It is concluded that a move from static to dynamic data may show different results and that further investigation is needed into the sonification of novel visualizations.
  • Item
    A Modular Computer Vision Sonification Model For The Visually Impaired
    (Georgia Institute of Technology, 2012-06) Banf, Michael; Blanz, Volker
    This paper presents a Modular Computer Vision Sonification Model, a general framework for the acquisition, exploration and sonification of visual information to support visually impaired people. The model exploits techniques from Computer Vision and aims to convey as much information as possible about the image to the user, including color, edges and what we refer to as Orientation Maps and Micro-Textures. We deliberately focus on low-level features to provide a very general image analysis tool. Our sonification approach relies on MIDI using "real-world" instead of synthetic instruments. The goal is to provide direct perceptual access to images or environments, actively and in real time. Our system is already in use, at an experimental stage, at a local residential school, helping congenitally blind children develop various cognitive abilities such as geometric understanding and spatial sense, as well as offering an intuitive approach to colors and textures.
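One mapping of the kind described above, from a low-level image feature to a MIDI event with a "real-world" instrument, can be sketched as follows. The color-to-instrument table, program numbers, and pitch range are illustrative assumptions, not the authors' design:

```python
# Hypothetical sketch: map a pixel patch's dominant color to a MIDI
# (program, note, velocity) triple using acoustic General MIDI
# instruments. The table and ranges are illustrative only.

# General MIDI program numbers (0-indexed) for a few acoustic instruments
COLOR_TO_INSTRUMENT = {
    "red": 40,     # violin
    "green": 73,   # flute
    "blue": 42,    # cello
    "yellow": 56,  # trumpet
}

def color_to_midi_event(color, brightness):
    """Return (program, note, velocity) for a color patch.

    brightness in [0, 1] controls pitch over two octaves (C3..C5);
    a real system would tune these mappings perceptually.
    """
    program = COLOR_TO_INSTRUMENT.get(color, 0)  # default: piano
    note = 48 + round(brightness * 24)
    velocity = 96
    return program, note, velocity

print(color_to_midi_event("red", 0.5))  # (40, 60, 96)
```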
  • Item
    Auditory Support for Situation Awareness in Video Surveillance
    (Georgia Institute of Technology, 2012-06) Höferlin, Benjamin; Höferlin, Markus; Goloubets, Boris; Heidemann, Gunther; Weiskopf, Daniel
    We introduce a parameter mapping sonification to support the situational awareness of surveillance operators during their task of monitoring video data. The presented auditory display produces a continuous ambient soundscape reflecting the changes in the video data. For this purpose, we use low-level computer vision techniques, such as optical-flow extraction and background subtraction, and rely on the capabilities of the human auditory system for high-level recognition. Special focus is put on the mapping between video features and sound parameters. We optimize this mapping to provide good interpretability of the sound pattern as well as an aesthetic, non-obtrusive sonification: precision of the conveyed information, psychoacoustic capabilities of the auditory system, and aesthetic guidelines of sound design are considered by optimally balancing the mapping parameters using gradient descent. A user study evaluates the capabilities and limitations of the presented sonification, as well as its applicability to supporting situational awareness in surveillance scenarios.
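The core of a parameter mapping sonification like the one described is a function from video features to sound parameters. A minimal sketch, with illustrative feature names, ranges, and a simple linear mapping standing in for the paper's optimized one:

```python
# Hypothetical sketch of one parameter-mapping step: normalized video
# features (e.g., mean optical-flow magnitude, foreground ratio from
# background subtraction) are linearly mapped to sound parameters.
# Names, ranges, and the linear form are assumptions, not the authors'
# gradient-descent-balanced mapping.

def map_features_to_sound(flow_magnitude, foreground_ratio,
                          pitch_range=(220.0, 880.0),
                          amp_range=(0.0, 1.0)):
    """Map features in [0, 1] to a pitch (Hz) and an amplitude."""
    clamp = lambda x: max(0.0, min(1.0, x))
    pitch = pitch_range[0] + clamp(flow_magnitude) * (pitch_range[1] - pitch_range[0])
    amp = amp_range[0] + clamp(foreground_ratio) * (amp_range[1] - amp_range[0])
    return pitch, amp

# A static scene (no motion, no foreground) maps to the quiet, low end:
print(map_features_to_sound(0.0, 0.0))  # (220.0, 0.0)
```

In the paper's approach, the weights of such a mapping are not fixed by hand but balanced by gradient descent against criteria of precision, psychoacoustics, and aesthetics.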
  • Item
    Multi-dimensional synchronization for rhythmic sonification
    (Georgia Institute of Technology, 2012-06) Boyd, Jeffrey E.; Godbout, Andrew
    Human locomotion is fundamentally periodic, so when sonifying gait, it is desirable to exploit this periodicity to produce rhythmic sonification synchronized to the motion. To achieve this rhythmic sonification, some mechanism is required to synchronize an oscillator to the period of the motion. This paper presents a method to synchronize to multidimensional signals such as those produced by a motion capture system. Using a subset of the joint-angle signals produced by motion capture, the method estimates the phase of a periodic, multidimensional model to match data observed from a moving subject. It does this using an optimization algorithm applied to a suitable objective function. We demonstrate the synchronization with data from a publicly available motion capture database, producing sonifications of drum beats synchronized to the footfalls of subjects. The method is robust and shares features with the phase-locked loops used for synchronizing one-dimensional sinusoidal signals. We foresee applications to sonification for athletics and the clinical treatment of gait disorders.
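The phase-estimation idea described above can be sketched minimally: given a periodic multidimensional model and one observed sample, pick the phase that minimizes an error objective. The sinusoidal toy model and the grid search below are stand-ins for the paper's motion-capture-derived model and optimization algorithm:

```python
import math

# Hypothetical sketch of multi-dimensional phase estimation: the model
# and the exhaustive grid search are illustrative assumptions, not the
# paper's method.

def model(phi, dims=3):
    """Toy periodic multidimensional model: phase-offset sinusoids."""
    return [math.sin(phi + d * 0.5) for d in range(dims)]

def estimate_phase(observation, steps=2000):
    """Minimize squared error between model(phi) and the observation."""
    best_phi, best_err = 0.0, float("inf")
    for i in range(steps):
        phi = 2 * math.pi * i / steps
        err = sum((m - o) ** 2 for m, o in zip(model(phi), observation))
        if err < best_err:
            best_phi, best_err = phi, err
    return best_phi

# Recover a known phase from a noiseless observation:
obs = model(1.2)
print(round(estimate_phase(obs), 2))  # close to 1.2
```

Using several channels at once is what makes the estimate unambiguous: a single sinusoid has two phases per period with the same value, but the phase-offset channels jointly pin down a unique phase, much as the quadrature signals in a phase-locked loop do.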