Person:
Walker, Bruce N.


Publication Search Results

Now showing 1 - 10 of 28
  • Item
    Measuring the use of sound in everyday software
    (Georgia Institute of Technology, 2009-05) Walker, Bruce N. ; Davison, Benjamin K.
    Members of the ICAD community might contend that auditory interfaces and even just well-designed sound in computer interfaces could be used more often than is currently the case. However, it is not entirely clear where, when, and how sound is actually being employed in everyday software. We discuss the development of a long-term research project aimed at identifying and categorizing sound use in software. Our mixed-methods approach explores software artifacts from three perspectives: detailed program behavior, source code word count of audio terms, and audio infrastructure. These complementary approaches could provide a deeper understanding of sound use today and, we hope, lead to predicting, guiding, and improving the future trajectory of its use.
  • Item
    Efficiency of Spearcon-Enhanced Navigation of One Dimensional Electronic Menus
    (International Community for Auditory Display, 2008-06) Palladino, Dianne K. ; Walker, Bruce N.
    This study investigated navigation through a cell phone menu in the presence of auditory cues (text-to-speech and spearcons), visual cues, or both. A total of 127 undergraduates navigated through a 50-item alphabetically listed menu to find a target name. Participants using visual cues (either alone or combined with auditory cues) responded faster than those using only auditory cues. Performance was not significantly different between the two auditory-only conditions. Although the difference was not significant, spearcons combined with visual cues improved navigational efficiency more than both text-to-speech cues and menus using no sound, providing evidence for the ability of sound to enhance visual menus. These results are applicable to the creation of efficient auditory menus.
  • Item
    Learnability of Sound Cues for Environmental Features: Auditory Icons, Earcons, Spearcons, and Speech
    (International Community for Auditory Display, 2008-06) Dingler, Tilman ; Lindsay, Jeffrey ; Walker, Bruce N.
    Awareness of features in our environment is essential for many daily activities. While often awareness of such features comes from vision, this modality is sometimes unavailable or undesirable. In these instances, auditory cues can be an excellent method of representing environmental features. The study reported here investigated the learnability of well known (auditory icons, earcons, and speech) and more novel (spearcons, earcon-icon hybrids, and sized hybrids) sonification techniques for representing common environmental features. Spearcons, which are speech stimuli that have been greatly sped up, were found to be as learnable as speech, while earcons unsurprisingly were much more difficult to learn. Practical implications are discussed.
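The abstract above defines spearcons as speech stimuli that have been greatly sped up. The published technique compresses speech in time without shifting its pitch; as an illustrative sketch (not the authors' actual implementation), a naive overlap-add (OLA) scheme shows the basic idea of pitch-preserving time compression:

```python
import numpy as np

def time_compress(signal, rate=2.5, frame=1024, hop=256):
    """Naive overlap-add (OLA) time compression: shorten a waveform by
    `rate` while roughly preserving pitch. A simplified stand-in for the
    pitch-preserving speed-up used to make spearcons; real systems use
    more careful time-scale modification (e.g. SOLA/WSOLA)."""
    window = np.hanning(frame)
    analysis_hop = int(hop * rate)  # stride through the input faster than the output
    n_frames = max(1, (len(signal) - frame) // analysis_hop + 1)
    out = np.zeros(hop * (n_frames - 1) + frame)
    norm = np.zeros_like(out)
    for i in range(n_frames):
        seg = signal[i * analysis_hop : i * analysis_hop + frame]
        if len(seg) < frame:  # zero-pad the final partial frame
            seg = np.pad(seg, (0, frame - len(seg)))
        out[i * hop : i * hop + frame] += seg * window
        norm[i * hop : i * hop + frame] += window
    return out / np.maximum(norm, 1e-8)  # normalize the overlapping windows
```

Feeding in a one-second recording with `rate=2.5` yields roughly 0.4 seconds of audio, compressed enough that, as with spearcons, the result may no longer be comprehensible as speech yet remains uniquely tied to the original phrase.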
  • Item
    AudioPlusWidgets: Bringing Sound to Software Widgets and Interface Components
    (International Community for Auditory Display, 2008-06) Davison, Benjamin K. ; Walker, Bruce N.
    Using sound as part of the user interface in a typical software application is still extremely rare, despite the technical capabilities of computers to support such usage. The ICAD community has developed several interface concepts, patterns, and toolkits, and yet the overall software scene has remained dominated by the visual-only user interface. AudioPlusWidgets is a software library offering scientifically grounded audio enhancements to the standard Java Swing API. Through metaphors and transparency, AudioPlusWidgets can be inserted into existing code with minimal changes, easily adding auditory capabilities to the interface components in the system. This library uses an event-based model and an audio manager to render speech, MIDI, and prerecorded sounds.
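The abstract describes an event-based model in which an audio manager renders speech, MIDI, and prerecorded sounds in response to interface events. The library itself targets Java Swing; the following is a language-neutral sketch of that dispatch pattern in Python, with all names (`AudioManager`, `UIEvent`, the renderer functions) invented for illustration rather than taken from the actual API:

```python
from dataclasses import dataclass, field

@dataclass
class UIEvent:
    widget: str   # e.g. "okButton"
    kind: str     # e.g. "focus", "click"

@dataclass
class AudioManager:
    """Event-driven audio dispatch: widgets emit events, and renderers
    registered per event kind turn them into sound. Here renderers just
    return strings standing in for speech/MIDI playback calls."""
    renderers: dict = field(default_factory=dict)
    log: list = field(default_factory=list)

    def register(self, kind, renderer):
        self.renderers.setdefault(kind, []).append(renderer)

    def dispatch(self, event):
        for render in self.renderers.get(event.kind, []):
            self.log.append(render(event))

def speech_renderer(event):
    # Stand-in for a text-to-speech call announcing the focused widget
    return f"speak:{event.widget}"

def earcon_renderer(event):
    # Stand-in for playing a short MIDI motif on a click
    return f"midi:click-{event.widget}"
```

Because the audio behavior attaches to events the toolkit already fires, a scheme like this can be layered onto existing interface code with minimal changes, which is the transparency the abstract highlights.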
  • Item
    Encoding and Representation of Information in Auditory Graphs: Descriptive Reports of Listener Strategies for Understanding Data
    (International Community for Auditory Display, 2008-06) Nees, Michael A. ; Walker, Bruce N.
    While a growing wealth of data have offered insights into the best practices for auditory display design and application, little is known about how listeners internally represent and use the information presented in auditory displays. At the conclusion of three separate studies, participants responded to an open-ended question about the strategies they used to perform auditory graphing tasks. We report a descriptive analysis of these qualitative responses. Participants' comments were coded by two raters along a number of dimensions that were chosen to represent a comprehensive set of encoding and task strategy possibilities. These descriptive analyses suggest that auditory graph listeners use a variety of strategies to cognitively represent the data in the display. Furthermore, these qualitative data offer a number of insights and questions for future research on information representation for auditory displays.
  • Item
    Individual Differences and the Field of Auditory Display: Past Research, A Present Study, and an Agenda for the Future
    (Georgia Institute of Technology, 2007-06) Mauney, Lisa M. ; Walker, Bruce N.
    There has been some interest in the study of individual differences in the field of auditory displays, but we argue that there is a much greater potential than has been realized, to date. Relevant types of individual differences that may be applicable to interpreting auditory information include perceptual abilities, cognitive abilities, musical abilities, and learning styles. There are many measures of these individual differences available; however, they have not been thoroughly utilized in the auditory display arena. We discuss several types of individual differences relevant to auditory displays. We then present some examples of past research, along with the results of a current investigation of individual differences in auditory displays. Finally, we propose an agenda as to what research and tests should be used to further study this area.
  • Item
    Learning Rates for Auditory Menus Enhanced with Spearcons Versus Earcons
    (Georgia Institute of Technology, 2007-06) Palladino, Dianne K. ; Walker, Bruce N.
    Increasing the usability of menus on small electronic devices is essential due to their increasing proliferation and decreasing physical sizes in the marketplace. Auditory menus are being studied as an enhancement to the menus on these devices. This study compared the learning rates for earcons (hierarchical representations of menu locations using musical tones) and spearcons (compressed speech) as potential candidates for auditory menu enhancement. We found that spearcons outperformed earcons significantly in rate of learning. We also found evidence that spearcon comprehension was enhanced by a brief training cycle, and that participants considered the process of learning spearcons much easier than the same process using earcons. Since the efficiency of learning and the perceived ease of use of auditory menus will increase the likelihood they are embraced by those who need them, this paper presents compelling evidence that spearcons may be the superior choice for such applications.
  • Item
    Sonification Sandbox Reconstruction: Software Standard for Auditory Graphs
    (Georgia Institute of Technology, 2007-06) Davison, Benjamin K. ; Walker, Bruce N.
    We report on an overhaul to the Sonification Sandbox. The Sonification Sandbox provides a cross-platform, flexible tool for converting tabular information into a descriptive auditory graph. It is implemented in Java, using the Java Sound API to generate MIDI output. An improved modular code structure provides a strong user interface and model framework for auditory graph representation and manipulation. A researcher can integrate part or all of the program into a different experimental implementation. The upgraded Sonification Sandbox provides a rich description of the auditory graph representation that can be saved or exported into various file formats. This description includes data representations of pitch, timbre, polarity, pan, and volume, along with graph contexts analogous to visual graph axes. Applications for the Sonification Sandbox include experimentation with various sonification techniques, data analytics beyond visualization, science education, auditory display for the blind, and musical interpretation of data.
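The core mapping the abstract describes, from tabular data values to MIDI output, can be sketched minimally. This is an illustrative linear data-to-pitch scaling, not the Sandbox's actual code (which also maps timbre, polarity, pan, and volume, and is written in Java); the function name and note range are assumptions:

```python
def sonify(values, low_note=48, high_note=84):
    """Map a data series to MIDI note numbers (C3-C6 by default) by
    linear scaling: the basic pitch mapping behind an auditory graph.
    Higher data values sound as higher pitches (positive polarity)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [round(low_note + (v - lo) / span * (high_note - low_note))
            for v in values]
```

For example, `sonify([0, 5, 10])` spreads the series across the range as notes 48, 66, and 84, which a MIDI synthesizer can then play in sequence as the rising contour of the graph.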
  • Item
    Listener, Task, and Auditory Graph: Toward a Conceptual Model of Auditory Graph Comprehension
    (Georgia Institute of Technology, 2007-06) Nees, Michael A. ; Walker, Bruce N.
    Auditory graph design and implementation have often been subject to criticisms of arbitrary or atheoretical decision-making processes in both research and application. Despite increasing interest in auditory displays coupled with more than two decades of auditory graph research, no theoretical models of how a listener processes an auditory graph have been proposed. The current paper seeks to present a conceptual level account of the factors relevant to the comprehension of auditory graphs by human listeners. We attempt to make links to the relevant literature on basic auditory perception, and we offer explicit justification for, or discussion of, a number of common design practices that are often justified only implicitly or by intuition in the auditory graph literature. Finally, we take initial steps toward a qualitative, conceptual level model of auditory graph comprehension that will help to organize the available data on auditory graph comprehension and make predictions for future research and applications with auditory graphs.
  • Item
    Aquarium sonification: Soundscapes for accessible dynamic informal learning environments
    (Georgia Institute of Technology, 2006-06) Walker, Bruce N. ; Godfrey, Mark T. ; Orlosky, Jason E. ; Bruce, Carrie ; Sanford, Jon
    Museums, science centers, zoos and aquaria are faced with educating and entertaining an increasingly diverse visitor population with varying physical and sensory needs. There are very few guidelines to help these facilities develop non-visual exhibit information, especially for dynamic exhibits. In an effort to make such informal learning environments (ILEs) more accessible to visually impaired visitors, the Georgia Tech Accessible Aquarium Project is studying auditory display and sonification methods for use in exhibit interpretation. The work presented here represents the initial tool building stage. We discuss the sonification system we are developing, and present some examples of the soundscape implementations that have been produced so far.