Title:
"Spindex" (speech index) enhances menu navigation user experience of touch screen devices in various input gestures: tapping, wheeling, and flicking

dc.contributor.advisor Walker, Bruce N.
dc.contributor.author Jeon, Myounghoon en_US
dc.contributor.committeeMember Durso, Frank
dc.contributor.committeeMember Corso, Gregory M.
dc.contributor.department Psychology en_US
dc.date.accessioned 2011-03-04T20:09:28Z
dc.date.available 2011-03-04T20:09:28Z
dc.date.issued 2010-11-11 en_US
dc.description.abstract In a large number of electronic devices, users interact with the system by navigating through various menus. Auditory menus can complement or even replace visual menus, so research on auditory menus has recently increased for mobile devices as well as desktop computers. Despite the potential importance of auditory displays on touch screen devices, little research has attempted to enhance the effectiveness of auditory menus for those devices. In the present study, I investigated how advanced auditory cues enhance auditory menu navigation on a touch screen smartphone, especially for new input gestures such as tapping, wheeling, and flicking for navigating a one-dimensional menu. Moreover, I examined whether advanced auditory cues improve user experience, not only in visuals-off situations, but also in visuals-on contexts. To this end, I used a novel auditory menu enhancement called a "spindex" (i.e., speech index), in which brief audio cues inform users of where they are in a long menu. In this study, each item in a menu was preceded by a sound based on the item's initial letter. One hundred twenty-two undergraduates navigated through an alphabetized list of 150 song titles. The study used a split-plot design, manipulating auditory cue type (text-to-speech (TTS) alone vs. TTS plus spindex), visual mode (on vs. off), and input gesture style (tapping, wheeling, and flicking). Target search time and subjective workload for TTS + spindex were lower than for TTS alone across all input gesture types, regardless of visual mode. On subjective rating scales, participants also rated the TTS + spindex condition higher than plain TTS on being 'effective' and 'functionally helpful'. The interaction between input methods and output modes (i.e., auditory cue types) and its effects on navigation behavior were also analyzed based on the two-stage navigation strategy model for auditory menus. Results are discussed in analogy with visual search theory and in terms of practical applications of spindex cues. en_US
dc.description.degree M.S. en_US
dc.identifier.uri http://hdl.handle.net/1853/37101
dc.publisher Georgia Institute of Technology en_US
dc.subject Sonification en_US
dc.subject Auditory menu en_US
dc.subject Auditory displays en_US
dc.subject.lcsh Multimodal user interfaces (Computer systems)
dc.subject.lcsh Haptic devices
dc.subject.lcsh User interfaces
dc.subject.lcsh Human-computer interaction
dc.subject.lcsh Computer sound processing
dc.subject.lcsh Smartphones
dc.title "Spindex" (speech index) enhances menu navigation user experience of touch screen devices in various input gestures: tapping, wheeling, and flicking en_US
dc.type Text
dc.type.genre Thesis
dspace.entity.type Publication
local.contributor.advisor Walker, Bruce N.
local.contributor.corporatename College of Sciences
local.contributor.corporatename School of Psychology
relation.isAdvisorOfPublication 5bedf397-416e-498e-aa60-c67c0ee43473
relation.isOrgUnitOfPublication 85042be6-2d68-4e07-b384-e1f908fae48a
relation.isOrgUnitOfPublication 768a3cd1-8d73-4d47-b418-0fc859ce897d
Files
Original bundle
Name: jeon_myounghoon_201012_mast.pdf
Size: 841.84 KB
Format: Adobe Portable Document Format