Series
International Conference on Auditory Display (ICAD)

Series Type: Event Series
Publications (3 items)
  • Item
    Sonifying the London Underground Real-Time-Disruption Map
    (Georgia Institute of Technology, 2007-06) Nickerson, Louise Valgerour ; Stockman, Tony ; Thiebaut, Jean-Baptiste
    In mobile computing, there is a need for interfaces better suited to the context of use. Auditory interfaces have the potential to address the limitations of small screens and to support eyes-free tasks. To realize this potential, we must develop more fluid and usable auditory interfaces, and a key aspect of this is understanding the process of designing overviews. In this work, we describe a conceptual strategy for providing an overview of disruptions in the London Underground; the approach adopted is based on what information is perceived as most crucial to the user.
  • Item
    Auditory graphs: A summary of current experience and towards a research agenda
    (Georgia Institute of Technology, 2005-07) Stockman, Tony ; Nickerson, Louise Valgerour ; Hind, Greg
    In this paper we shall briefly review previous work we have found directly relevant to our own research on the use of auditory graphs. We will then summarise previous unpublished experiences of using auditory graphs in the domain of medical signal analysis, and further recent work on the use of auditory graphs for analysing spreadsheet data. We conclude by outlining issues we believe to be relevant in the formation of a research agenda for the design and evaluation of the technology.
  • Item
    Sonically exposing the desktop space to non-visual users: an experiment in overview information presentation
    (Georgia Institute of Technology, 2005-07) Nickerson, Louise Valgerour ; Stockman, Tony
    The vast majority of computer interfaces do not translate well onto non-visual displays (e.g. for blind users, wearable/mobile computing, etc.). Screen readers are the most prevalent aural technology for exposing graphical user interfaces to the visually impaired; however, they eliminate many of the advantages of direct manipulation and WYSIWYG applications. While the use of sound in interfaces has become more prevalent due to advances in computer sound hardware, it is still used primarily for alerts and status reporting. The use of sound can be expanded to enhance or replace a GUI by providing a 3D auditory environment. However, users of such an environment would need a reliable and effective method of navigation, and little is known about the usability of a system based on sound identification and localisation. In this work, we describe an experiment that will examine users' ability to navigate a 3D auditory environment based on these concepts.