The 'GUIB' spatial auditory display - generation of an audio-based interface for blind computer users
Author(s)
Crispien, Kai
Petrie, Helen
Abstract
In order to provide blind computer users with access to graphical user interfaces (GUIs), a screen-reader system is under development which translates the GUI into auditory and/or tactile form. The spatial auditory display of the screen reader is based on a cost-effective binaural audio processing system using head-related transfer function (HRTF) synthesis technology. Non-individualised HRTFs are used to synthesise virtual spatial acoustic locations. Synthesised speech and non-verbal audio signals, referred to as 'auditory icons' and 'earcons', represent the graphical information content of the GUI display. Both speech and non-speech audio components are positioned in a virtual 3D acoustic space in order to aid orientation and navigation for non-visual users. For the presentation of continuous text, such as documents in word-processor or spreadsheet applications, a software mechanism was developed which synchronises text-to-speech synthesiser devices with the spatial processing system. This mechanism also provides procedures to combine non-verbal audio cues with the synthesised speech output. The spatial presentation of the auditory display can thus convey the spatial layout of the user interface and of text-based applications through direct perception of acoustic locations. Additionally, spatial audio presentation assists the use of pointing devices (e.g., tactile displays, mouse) and enables multiple auditory information streams to be presented simultaneously.
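The core rendering step the abstract describes, placing a mono sound at a virtual acoustic location via HRTF synthesis, can be sketched as a convolution of the source signal with a pair of head-related impulse responses (HRIRs), one per ear. The toy HRIRs below are not from the paper's system; they are hypothetical stand-ins that mimic only the interaural time and level differences of a source off to the listener's left.

```python
import numpy as np

def spatialize(mono, hrir_left, hrir_right):
    """Render a mono signal to binaural stereo by convolving it with
    one head-related impulse response (HRIR) per ear."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=-1)

# Toy, non-individual HRIRs approximating a source to the listener's left:
# the left ear hears the sound earlier (ITD) and louder (ILD) than the right.
fs = 8000                          # sample rate in Hz (illustrative)
hrir_left = np.zeros(16)
hrir_left[0] = 1.0                 # direct, full-level arrival
hrir_right = np.zeros(16)
hrir_right[5] = 0.6                # ~0.6 ms later and attenuated

t = np.arange(fs // 10) / fs
beep = np.sin(2 * np.pi * 440 * t)  # a 100 ms, 440 Hz "earcon"
stereo = spatialize(beep, hrir_left, hrir_right)
```

A real HRTF renderer would select measured HRIR pairs per target direction (and interpolate between them as the source or head moves), but the per-source signal path is this same filter-per-ear structure.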
Date
1994-11
Resource Type
Text
Resource Subtype
Proceedings