Title:
Empirically Informed Sound Synthesis Application for Enhancing the Perception of Expressive Robotic Movement

dc.contributor.author Bellona, Jon
dc.contributor.author Bai, Lin
dc.contributor.author Dahl, Luke
dc.contributor.author LaViers, Amy
dc.contributor.corporatename International Community for Auditory Display
dc.contributor.corporatename University of Virginia. Department of Music
dc.contributor.corporatename University of Virginia. Department of Electrical and Computer Engineering
dc.contributor.corporatename University of Illinois. Department of Mechanical Science and Engineering
dc.date.accessioned 2017-06-15T17:52:10Z
dc.date.available 2017-06-15T17:52:10Z
dc.date.issued 2017-06
dc.description Presented at the 23rd International Conference on Auditory Display (ICAD 2017) in Pennsylvania, USA.
dc.description.abstract Since people often communicate internal states and intentions through movement, robots can better interact with humans if they too can modify their movements to communicate changing state. These movements, which may be seen as supplementary to those required for workspace tasks, may be termed “expressive.” However, robot hardware, which cannot recreate the same range of dynamics as human limbs, often limits expressive capacity. One solution is to augment expressive robotic movement with expressive sound. To that end, this paper presents an application for synthesizing sounds that match various movement qualities. Its design is based on an empirical study analyzing sound and movement qualities, where movement qualities are parametrized according to Laban’s Effort System. Our results suggest a number of correspondences between movement qualities and sound qualities. These correspondences are presented here and discussed within the context of designing movement-quality-to-sound-quality mappings in our sound synthesis application. This application will be used in future work testing user perceptions of expressive movements with synchronous sounds. en_US
dc.identifier.citation Bellona, J., et al. "Empirically Informed Sound Synthesis Application for Enhancing the Perception of Expressive Robotic Movement" Presented at the 23rd International Conference on Auditory Display (ICAD2017), June 20-23, 2017, Pennsylvania State University, State College, PA, USA. en_US
dc.identifier.doi https://doi.org/10.21785/icad2017.049 en_US
dc.identifier.uri http://hdl.handle.net/1853/58369
dc.publisher Georgia Institute of Technology en_US
dc.publisher.original International Community for Auditory Display (ICAD)
dc.relation.ispartofseries International Conference on Auditory Display (ICAD)
dc.rights This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License. en_US
dc.rights.uri http://creativecommons.org/licenses/by-nc/4.0/
dc.subject Auditory display en_US
dc.subject Robot-human interaction en_US
dc.subject Sound synthesis en_US
dc.title Empirically Informed Sound Synthesis Application for Enhancing the Perception of Expressive Robotic Movement en_US
dc.type Text
dc.type.genre Proceedings
dspace.entity.type Publication
local.contributor.corporatename Sonification Lab
local.relation.ispartofseries International Conference on Auditory Display (ICAD)
relation.isOrgUnitOfPublication 2727c3e6-abb7-4df0-877f-9f218987b22a
relation.isSeriesOfPublication 6cb90d00-3311-4767-954d-415c9341a358
Files
Original bundle:
ICAD2017_paper_49.pdf (940.93 KB, Adobe Portable Document Format)
License bundle:
license.txt (3.13 KB, item-specific license agreed upon at submission)