Title:
Data-Driven Haptic Perception for Robot-Assisted Dressing
Author(s)
Kapusta, Ariel
Yu, Wenhao
Bhattacharjee, Tapomayukh
Liu, C. Karen
Turk, Greg
Kemp, Charles C.
Abstract
Dressing is an important activity of daily living (ADL) with which many people require assistance due to impairments. Robots have the potential to provide dressing assistance, but physical interactions between clothing and the human body can be complex and difficult to visually observe. We provide evidence that data-driven haptic perception can be used to infer relationships between clothing and the human body during robot-assisted dressing. We conducted a carefully controlled experiment with 12 human participants during which a robot pulled a hospital gown along the length of each person’s forearm 30 times. This representative task resulted in one of the following three outcomes: the hand missed the opening to the sleeve; the hand or forearm became caught on the sleeve; or the full forearm successfully entered the sleeve. We found that hidden Markov models (HMMs) using only forces measured at the robot’s end effector classified these outcomes with high accuracy. The HMMs’ performance generalized well to participants (98.61% accuracy) and velocities (98.61% accuracy) outside of the training data. They also performed well when we limited the force applied by the robot (95.8% accuracy with a 2 N threshold), and could predict the outcome early in the process. Despite the lightweight hospital gown, HMMs that used forces in the direction of gravity substantially outperformed those that did not. The best performing HMMs used forces in the direction of motion and the direction of gravity.
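
The abstract describes classifying dressing outcomes by fitting HMMs to sequences of end-effector forces and selecting the most likely outcome. The sketch below illustrates that general technique only; it is not the authors' implementation. The use of the hmmlearn library, the number of hidden states, and the two-dimensional feature (force along the direction of motion and along gravity) are assumptions for illustration, with only the feature choice suggested by the abstract.

```python
# Minimal sketch: one Gaussian-emission HMM per dressing outcome, classification
# by maximum log-likelihood. Library (hmmlearn), state count, and data layout
# are illustrative assumptions, not the paper's actual configuration.
import numpy as np
from hmmlearn.hmm import GaussianHMM

OUTCOMES = ["missed_sleeve", "caught_on_sleeve", "entered_sleeve"]

def train_outcome_models(sequences_by_outcome, n_states=10):
    """Fit one HMM per outcome.

    sequences_by_outcome: dict mapping outcome name -> list of (T_i, 2) arrays,
    where each row holds force along the direction of motion and along gravity.
    """
    models = {}
    for outcome, seqs in sequences_by_outcome.items():
        X = np.vstack(seqs)               # stack all training sequences
        lengths = [len(s) for s in seqs]  # lengths mark sequence boundaries
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag",
                          n_iter=100)
        hmm.fit(X, lengths)
        models[outcome] = hmm
    return models

def classify(models, force_sequence):
    """Return the outcome whose HMM assigns the highest log-likelihood."""
    scores = {o: m.score(force_sequence) for o, m in models.items()}
    return max(scores, key=scores.get)
```

Scoring a partial force sequence with the same `classify` call is one way such a classifier could predict the outcome before the motion completes, consistent with the early-prediction result reported in the abstract.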
Date Issued
2016-08
Resource Type
Text
Resource Subtype
Proceedings