Person:
Walker, Bruce N.


Publication Search Results

Now showing 1 - 10 of 66

Mastery of Assistive Technology in K-12 Education

2024-08; Satterfield, Richard (Ben); Walker, Bruce N.; Milchus, Karen; LaForce, Salimah; Griffiths, Patricia; DeStefano, Lizanne; Blake, Matthew

This article describes the evaluation of a prototype of the Continuum of AT Mastery (CATM), an instrument developed for measuring individual progress toward mastery of assistive technology (AT). In this second of two one-year studies, we examined the applicability of the CATM in K-12 educational settings. This manuscript includes results of field testing of the CATM in K-12 schools and presents results of inter-rater and test-retest reliability.

Item description: This is the compilation of data collected from focus groups, field testing, interviews, and analysis from a study of the instrument known as the Continuum of AT Mastery (CATM) in K-12 settings.
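
As a rough illustration of the two reliability analyses named above, the sketch below computes inter-rater agreement and test-retest stability for a CATM-like instrument. The ratings, item, and sample size are invented for illustration; they are not data from the study.

    # Hypothetical illustration of inter-rater agreement (Cohen's kappa) and
    # test-retest stability (Pearson r). The ratings below are invented
    # placeholders, not data from the CATM study.
    from sklearn.metrics import cohen_kappa_score
    from scipy.stats import pearsonr

    # Two raters scoring the same 8 students on one ordinal CATM-style item (1-4)
    rater_a = [1, 2, 2, 3, 4, 3, 2, 1]
    rater_b = [1, 2, 3, 3, 4, 3, 2, 2]
    kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")

    # The same students' total scores at time 1 and again two weeks later
    time1 = [12, 18, 20, 25, 31, 27, 19, 14]
    time2 = [13, 17, 21, 26, 30, 26, 20, 15]
    r, p = pearsonr(time1, time2)

    print(f"Weighted kappa (inter-rater): {kappa:.2f}")
    print(f"Pearson r (test-retest): {r:.2f} (p = {p:.3f})")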


The Development of a Measurement Tool for Mastery of Assistive Technology

2021-06-30; Satterfield, Richard (Ben); Walker, Bruce N.; Milchus, Karen

This report describes the development of a survey tool used to measure and assess “mastery of assistive technology”. A Delphi panel composed of experts in the area of Assistive Technology (AT) was gathered to explore the question of “What is mastery of AT?” For the purposes of this study, mastery was defined as becoming a “power user” of AT. Panelists were asked to identify what characteristics are associated with being a power user of AT. The panel gave these characteristics Likert-scale rankings as to their applicability as a predictor of becoming a power user and as an indicator of having become a power user. The rankings were compared, and the panel was asked to revisit them to identify the most important factors. The panel identified 12 predictors and 14 indicators that they felt were highly predictive of becoming a power user or indicative of being one. These factors were analyzed and found to coalesce around four constructs or areas of mastery: (1) Experience (Usage) with AT; (2) Proficiency with AT; (3) Knowledge of AT; and (4) Personal Connection with AT. An online survey-based tool for measuring AT mastery was developed based on these constructs and presented to the panel for feedback and critique.
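
As a rough sketch of the ranking-and-winnowing step described above, the example below averages panelists' Likert ratings and keeps the highest-rated candidate factors for a further round. The factor names, ratings, and cutoff are invented for illustration; they are not the panel's actual items or data.

    # Hypothetical sketch of aggregating Delphi-panel Likert ratings (1-5) and
    # retaining the highest-rated candidate "power user" factors.
    # Factor names, ratings, and the cutoff are invented for illustration.
    from statistics import mean

    ratings = {
        "Uses AT daily across settings":      [5, 4, 5, 5, 4],
        "Troubleshoots AT problems unaided":  [4, 5, 4, 4, 5],
        "Can explain AT features to others":  [3, 4, 4, 3, 4],
        "Tried AT once in a workshop":        [2, 2, 3, 2, 1],
    }

    CUTOFF = 4.0  # mean rating required to keep a factor for the next round
    kept = {name: round(mean(scores), 2)
            for name, scores in ratings.items()
            if mean(scores) >= CUTOFF}

    for name, avg in sorted(kept.items(), key=lambda kv: -kv[1]):
        print(f"{avg:.2f}  {name}")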


Can You Hear My Heartbeat?: Hearing an Expressive Biosignal Elicits Empathy - Supplementary Data

2021-05-07; Winters, R. Michael; Leslie, Grace; Walker, Bruce N.

Interfaces designed to elicit empathy provide an opportunity for HCI with important pro-social outcomes. Recent research has demonstrated that perceiving expressive biosignals can facilitate emotional understanding and connection with others, but this work has been largely limited to visual approaches. We propose that hearing these signals will also elicit empathy, and test this hypothesis with sounding heartbeats. In a lab-based within-subjects study, participants (N = 27) completed an emotion recognition task in different heartbeat conditions. We found that hearing heartbeats changed participants’ emotional perspective and increased their reported ability to “feel what the other was feeling.” From these results, we argue that auditory heartbeats are well-suited as an empathic intervention, and might be particularly useful for certain groups and use contexts because of their musical and non-visual nature. This work establishes a baseline for empathic auditory interfaces, and offers a method to evaluate the effects of future designs.


Auditory displays to facilitate object targeting in 3D space

2019-06; May, Keenan R.; Sobel, Briana; Wilson, Jeff; Walker, Bruce N.

In both extreme and everyday situations, humans need to find nearby objects that cannot be located visually. In such situations, auditory display technology could be used to display information supporting object targeting. Unfortunately, spatial audio inadequately conveys sound source elevation, which is crucial for locating objects in 3D space. To address this, three auditory display concepts were developed and evaluated in the context of finding objects within a virtual room, in either low or no visibility conditions: (1) a one-time height-denoting “area cue,” (2) ongoing “proximity feedback,” or (3) both. All three led to improvements in performance and subjective workload compared to no sound. Displays (2) and (3) led to the largest improvements. This pattern was smaller, but still present, when visibility was low, compared to no visibility. These results indicate that persons who need to locate nearby objects in limited visibility conditions could benefit from the types of auditory displays considered here.
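
The ongoing “proximity feedback” concept, continuously mapping listener-to-target distance onto sound parameters, can be sketched as below. The distance range, pitch range, and beep rates are assumptions chosen for illustration, not the parameters used in the study.

    # Hypothetical sketch of "proximity feedback": map distance to a beep pitch
    # and repetition rate so the sound rises and speeds up as the target nears.
    # Ranges and the exponential pitch mapping are illustrative assumptions.
    import math

    def proximity_feedback(distance_m, max_distance_m=5.0,
                           pitch_range_hz=(220.0, 880.0),
                           rate_range_hz=(1.0, 8.0)):
        """Return (beep_pitch_hz, beeps_per_second) for a given distance."""
        # closeness is 0.0 at max_distance_m and 1.0 when the target is reached
        closeness = 1.0 - min(max(distance_m / max_distance_m, 0.0), 1.0)
        lo, hi = pitch_range_hz
        pitch = lo * math.exp(closeness * math.log(hi / lo))  # even-sounding steps
        rate = rate_range_hz[0] + closeness * (rate_range_hz[1] - rate_range_hz[0])
        return pitch, rate

    for d in (5.0, 2.5, 1.0, 0.25):
        pitch, rate = proximity_feedback(d)
        print(f"{d:4.2f} m -> {pitch:6.1f} Hz at {rate:.1f} beeps/s")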


Accessible Sonification of Total Solar Eclipse 2024: Accessible Map and Multimodal Virtual Reality Experience

2024-06; Walmer, Auralee; Cash, Nicolette; Yin, Wenqing; Majors, Teairis; Biggs, Brandon; Walker, Bruce N.

April 8, 2024 marks a rare astronomical event affecting a notable portion of North America: a total solar eclipse. While the event can be witnessed by those along the path of totality, many people cannot experience it because of their geographical location or a visual impairment. We aim to make the solar eclipse more accessible by creating an audio-visual representation of the event that can be both educational and awe-inspiring. There are three primary components of this project, each designed to be accessible before, during, and after the 2024 eclipse: (1) accessible audio-visual maps; (2) a virtual reality simulation with educational and immersive features; and (3) a soundscape to accompany the real-time total solar eclipse event. Each component introduces intentionally designed sonic parameters, as well as narrative elements, to guide listeners through each feature of the project. We explore the challenges and benefits of expressing the total solar eclipse auditorily, and argue that an audio-visual format is both educational and engaging. Our ultimate goal is to provide an experience that is illuminating, enriching, and, most importantly, accessible to anyone regardless of visual impairment or geographical location.
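
One of the intentionally designed sonic parameters can be pictured as a simple mapping from solar obscuration to a drone's pitch and loudness over the course of the eclipse. The ranges, timeline, and mapping below are invented for illustration and are not the project's actual design.

    # Hypothetical sketch of one eclipse sonification mapping: as the obscured
    # fraction of the Sun rises toward totality and falls again, a drone's
    # pitch descends and its loudness dims. All values are illustrative.

    def eclipse_tone(obscuration, pitch_hz=(880.0, 220.0), gain_db=(0.0, -24.0)):
        """Map obscuration in [0, 1] (1.0 = totality) to (pitch_hz, gain_db)."""
        obscuration = min(max(obscuration, 0.0), 1.0)
        pitch = pitch_hz[0] + obscuration * (pitch_hz[1] - pitch_hz[0])
        gain = gain_db[0] + obscuration * (gain_db[1] - gain_db[0])
        return pitch, gain

    # A toy timeline: partial phases ramp up to totality and back down
    timeline = [0.0, 0.25, 0.5, 0.75, 1.0, 0.75, 0.5, 0.25, 0.0]
    for minute, obs in enumerate(timeline):
        pitch, gain = eclipse_tone(obs)
        print(f"t={minute}  obscuration={obs:.2f}  {pitch:6.1f} Hz  {gain:6.1f} dB")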


Highcharts Sonification Studio: an online, open-source, extensible, and accessible data sonification tool

2021-06; Cantrell, Stanley J.; Walker, Bruce N.; Moseng, Øystein

The Highcharts Sonification Studio is the culmination of a multi-year collaboration between Highsoft, the creators of Highcharts, and the Georgia Tech Sonification Lab to develop an extensible, accessible, online spreadsheet and multimodal graphing platform for the auditory display, assistive technology, and STEM education communities. The Highcharts Sonification Studio leverages advances in auditory display and sonification research, as well as over 20 years of experience gained through research and development of the original Sonification Sandbox. We discuss the iterative design and evaluation process used to ensure the tool's usability and accessibility, highlight opportunities for its growth, and describe its use for research, art, and education within the ICAD community and beyond.
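
The basic operation such a tool performs, turning a spreadsheet column into pitches played in order, can be sketched independently of the Highcharts API. The note range and data below are assumptions for illustration; this is not the Sonification Studio's actual interface or defaults.

    # Hypothetical sketch of the core mapping a sonification tool performs:
    # scale a data column onto MIDI note numbers to be played left to right.
    # Not the Highcharts Sonification Studio API; values are invented.

    def column_to_midi_notes(values, note_range=(48, 84)):
        """Linearly map a list of numbers onto a MIDI note range."""
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0          # avoid dividing by zero on flat data
        n_lo, n_hi = note_range
        return [round(n_lo + (v - lo) / span * (n_hi - n_lo)) for v in values]

    temperatures = [12.1, 13.4, 15.0, 18.2, 21.5, 24.0, 23.1, 19.8]
    print(column_to_midi_notes(temperatures))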


Perceived Relational Risk and Perceived Situational Risk Scales

2020-10-10; Stuck, Rachel E.; Walker, Bruce N.

This technical report provides an overview of how to use scales that were developed for perceived relational risk and perceived situational risk.


Evaluation of a non-visual auditory choropleth and travel map viewer

2022-06; Biggs, Brandon; Toth, Christopher; Stockman, Tony; Coughlan, James M.; Walker, Bruce N.

The auditory virtual reality interface of Audiom, a web-based map viewer, was evaluated by thirteen blind participants. In Audiom, the user is an avatar who navigates through geographic data using the arrow keys, as if playing a first-person, egocentric game. The research questions were: What will make blind users want to use Audiom maps? and Can participants demonstrate basic acquisition of spatial knowledge after viewing an auditory map? A dynamic choropleth map of state-level US COVID-19 data and a detailed OpenStreetMap-powered travel map were evaluated. All participants agreed that they wanted more maps of all kinds, in particular county-level COVID data, and that they would use Audiom once some bugs were fixed and the few features they recommended were added. Everyone wanted to see Audiom embedded in their existing travel and mapping applications. All participants were able to answer a question evaluating spatial knowledge, and agreed that this spatial information was not available in existing applications.
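
The interaction model described above, an avatar stepping through geographic cells and hearing what lies under it, can be sketched as a tiny grid walk. The map contents, key names, and printed announcements are invented; this is not Audiom's data or code, and a real viewer would speak or sonify each cell rather than print it.

    # Hypothetical sketch of an egocentric auditory map: the avatar moves one
    # cell per arrow-key press and the feature under it is announced.
    # The map and announcements are invented for illustration.

    MAP = [
        ["park", "road",  "cafe"],
        ["road", "plaza", "road"],
        ["shop", "road",  "library"],
    ]

    MOVES = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

    def step(pos, key):
        """Move the avatar if the key is a known arrow and the cell exists."""
        dr, dc = MOVES.get(key, (0, 0))
        r, c = pos[0] + dr, pos[1] + dc
        if 0 <= r < len(MAP) and 0 <= c < len(MAP[0]):
            pos = (r, c)
        print(f"avatar at {pos}: {MAP[pos[0]][pos[1]]}")
        return pos

    pos = (1, 1)                      # start on the plaza
    for key in ("up", "right", "down", "down"):
        pos = step(pos, key)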


To sonify or not to sonify? Educator perceptions of auditory display in interactive simulations

2021-06; Fiedler, Brett L.; Walker, Bruce N.; Moore, Emily B.

With the growing presence of auditory display in popular learning tools, it is beneficial for researchers to consider not only the perceptions of the students who use the tools, but also those of the educators who include the tools in their curricula. We surveyed over 4000 educators to investigate educator perceptions and preferences across four interactive physics simulations for the presence and qualities of non-speech auditory display, and also collected users' self-rated musical sophistication as a potential predictor of auditory display preference. We find that the majority of teachers preferred the simulations with auditory display and consistently rated aspects of the experience of using simulations with sound more positively than the without-sound variants. We also identify simulation design features that align with trends in educator ratings. We did not find the measured musical sophistication to be a predictor of auditory display preference.


Hearing artificial intelligence: Sonification guidelines & results from a case-study in melanoma diagnosis

2019-06; Winters, R. Michael; Kalra, Ankur; Walker, Bruce N.

The applications of artificial intelligence are becoming increasingly prevalent in everyday life. Although many AI systems can operate autonomously, their goal is often assisting humans. Knowledge from the AI system must somehow be perceptualized. Towards this goal, we present a case study in the application of data-driven non-speech audio for melanoma diagnosis. A physician photographs a suspicious skin lesion, triggering a sonification of the system's penultimate classification layer. We iterated on sonification strategies and coalesced around designs representing three general approaches. We tested each in a group of novice listeners (n=7) for mean sensitivity, specificity, and learning effects. The mean accuracy was greatest for a simple model, but a trained dermatologist preferred a perceptually compressed model of the full classification layer. We discovered that training the AI on sonifications from this model improved accuracy further. We argue for perceptual compression as a general technique and for a comprehensible number of simultaneous streams.
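
One way to picture the "perceptually compressed" approach is to reduce the classification layer to a few streams before mapping it to sound. The sketch below keeps the top entries of a softmax output and assigns each surviving class a pitch and a loudness proportional to its probability; the class labels, ranges, and compression rule are assumptions for illustration, not the paper's actual design.

    # Hypothetical sketch of perceptual compression before sonification:
    # keep only the top-k classes of a softmax output and map each one to a
    # single audio stream (pitch fixed per class, loudness follows probability).
    # Labels, probabilities, and ranges are invented for illustration.

    CLASS_PITCH_HZ = {"melanoma": 660.0, "nevus": 440.0, "keratosis": 330.0,
                      "dermatofibroma": 275.0, "vascular": 220.0}

    def compress_and_map(probs, k=3, gain_db_range=(-30.0, 0.0)):
        """Return (label, pitch_hz, gain_db) for the k most probable classes."""
        top = sorted(probs.items(), key=lambda kv: -kv[1])[:k]
        lo, hi = gain_db_range
        return [(label, CLASS_PITCH_HZ[label], lo + p * (hi - lo))
                for label, p in top]

    softmax = {"melanoma": 0.62, "nevus": 0.21, "keratosis": 0.09,
               "dermatofibroma": 0.05, "vascular": 0.03}
    for label, pitch, gain in compress_and_map(softmax):
        print(f"{label:15s} {pitch:6.1f} Hz  {gain:6.1f} dB")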