Title:
Haptic Simulation for Robot-Assisted Dressing

dc.contributor.author Yu, Wenhao
dc.contributor.author Kapusta, Ariel
dc.contributor.author Tan, Jie
dc.contributor.author Kemp, Charles C.
dc.contributor.author Turk, Greg
dc.contributor.author Liu, C. Karen
dc.contributor.corporatename Georgia Institute of Technology. Rehabilitation Engineering Research Center on Technologies to Support Successful Aging With Disability en_US
dc.contributor.corporatename Georgia Institute of Technology. Institute for Robotics and Intelligent Machines en_US
dc.contributor.corporatename Georgia Institute of Technology. Healthcare Robotics Lab en_US
dc.date.accessioned 2017-08-01T16:09:45Z
dc.date.available 2017-08-01T16:09:45Z
dc.date.issued 2017
dc.description © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works. en_US
dc.description 2017 IEEE International Conference on Robotics and Automation (ICRA) Singapore, May 29 - June 3, 2017. en_US
dc.description DOI: 10.1109/ICRA.2017.7989716 en_US
dc.description.abstract There is a considerable need for assistive dressing among people with disabilities, and robots have the potential to fulfill this need. However, training such a robot would require extensive trials in order to learn the skills of assistive dressing. Such training would be time-consuming and would require considerable effort to recruit participants and conduct trials. In addition, for some cases that might cause injury to the person being dressed, it is impractical and unethical to perform such trials. In this work, we focus on a representative dressing task: pulling the sleeve of a hospital gown onto a person’s arm. We present a system that learns a haptic classifier for the outcome of the task given a few (2–3) real-world trials with one person. Our system first optimizes the parameters of a physics simulator using real-world data. Using the optimized simulator, the system then simulates additional haptic sensory data with noise models that account for randomness in the experiment. We then train hidden Markov models (HMMs) on the simulated haptic data. The trained HMMs can then classify and predict the outcome of the assistive dressing task based on haptic signals measured by a real robot’s end effector. This system achieves 92.83% accuracy in classifying the outcome of the robot-assisted dressing task with people not included in simulation optimization. We compare our classifiers to those trained on real-world data and show that the classifiers from our system categorize the dressing task outcomes more accurately than classifiers trained on ten times more real data. en_US
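The final step of the pipeline in the abstract scores a haptic signal under per-outcome HMMs and picks the most likely outcome. The sketch below is purely illustrative and is not the authors' implementation: it assumes discrete (quantized) observations and made-up two-state model parameters (`pi`, `A_success`, `A_failure`, `B` are all hypothetical), and classifies a sequence by comparing forward-algorithm log-likelihoods.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm in log space for stability."""
    log_alpha = np.log(pi) + np.log(B[:, obs[0]])
    for o in obs[1:]:
        m = log_alpha.max()  # log-sum-exp trick over previous states
        log_alpha = np.log(np.exp(log_alpha - m) @ A) + m + np.log(B[:, o])
    m = log_alpha.max()
    return m + np.log(np.exp(log_alpha - m).sum())

# Hypothetical 2-state models: a "sticky" success model vs. an
# erratic failure model, with a shared emission matrix for brevity.
pi = np.array([0.6, 0.4])
A_success = np.array([[0.9, 0.1], [0.2, 0.8]])
A_failure = np.array([[0.5, 0.5], [0.5, 0.5]])
B = np.array([[0.8, 0.2], [0.3, 0.7]])  # P(observation | state)

def classify(obs):
    """Label a quantized haptic sequence with the higher-likelihood model."""
    ll_s = forward_log_likelihood(obs, pi, A_success, B)
    ll_f = forward_log_likelihood(obs, pi, A_failure, B)
    return "success" if ll_s > ll_f else "failure"
```

In the paper's setting the HMMs would instead be trained (e.g. via Baum-Welch) on the simulated haptic data, with one model per task outcome; the classification rule, however, is the same likelihood comparison shown here.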
dc.identifier.citation Yu, W., Kapusta, A., Tan, J., Kemp, C. C., Turk, G., & Liu, C. K. (2017). Haptic simulation for robot-assisted dressing. 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, pp. 6044-6051. en_US
dc.identifier.doi 10.1109/ICRA.2017.7989716 en_US
dc.identifier.isbn 978-1-5090-4633-1 (Electronic)
dc.identifier.uri http://hdl.handle.net/1853/58512
dc.language.iso en_US en_US
dc.publisher Georgia Institute of Technology en_US
dc.publisher.original Institute of Electrical and Electronics Engineers
dc.subject Assistive dressing en_US
dc.subject Assistive technology en_US
dc.subject Haptic data en_US
dc.subject Hidden Markov models en_US
dc.subject Robot sensing en_US
dc.title Haptic Simulation for Robot-Assisted Dressing en_US
dc.type Text
dc.type.genre Proceedings
dspace.entity.type Publication
local.contributor.author Turk, Greg
local.contributor.author Kemp, Charles C.
local.contributor.corporatename Healthcare Robotics Lab
local.contributor.corporatename Institute for Robotics and Intelligent Machines (IRIM)
local.contributor.corporatename Rehabilitation Engineering Research Center on Technologies to Support Aging-in-Place for People with Long-Term Disabilities
relation.isAuthorOfPublication 1361247d-c446-453b-8b4a-8e87c3d4210b
relation.isAuthorOfPublication e4f743b9-0557-4889-a16e-00afe0715f4c
relation.isOrgUnitOfPublication c6394b0e-6e8b-42dc-aeed-0e22560bd6f1
relation.isOrgUnitOfPublication 66259949-abfd-45c2-9dcc-5a6f2c013bcf
relation.isOrgUnitOfPublication beb39be5-dd4e-4cbd-810d-8b5f852ba609
Files
Original bundle
Name:
yu-kapusta-tan_et_al_2017.pdf
Size:
4.3 MB
Format:
Adobe Portable Document Format
License bundle
Name:
license.txt
Size:
3.13 KB
Format:
Item-specific license agreed to upon submission