Person:
Turk, Greg

Publication Search Results

  • Item
    Multidimensional Capacitive Sensing for Robot-Assisted Dressing and Bathing
    (Georgia Institute of Technology, 2019-05-24) Erickson, Zackory; Clever, Henry M.; Gangaram, Vamsee; Turk, Greg; Liu, C. Karen; Kemp, Charles C.
    Robotic assistance presents an opportunity to benefit the lives of many people with physical disabilities, yet accurately sensing the human body and tracking human motion remain difficult for robots. We present a multidimensional capacitive sensing technique that estimates the local pose of a human limb in real time. A key benefit of this sensing method is that it can sense the limb through opaque materials, including fabrics and wet cloth. Our method uses a multielectrode capacitive sensor mounted to a robot’s end effector. A neural network model estimates the position of the closest point on a person’s limb and the orientation of the limb’s central axis relative to the sensor’s frame of reference. These pose estimates enable the robot to move its end effector with respect to the limb using feedback control. We demonstrate that a PR2 robot can use this approach with a custom six-electrode capacitive sensor to assist with two activities of daily living: dressing and bathing. The robot pulled the sleeve of a hospital gown onto able-bodied participants’ right arms while tracking their motion. When assisting with bathing, the robot moved a soft wet washcloth to follow the contours of able-bodied participants’ limbs, cleaning their surfaces. Overall, we found that multidimensional capacitive sensing presents a promising approach for robots to sense and track the human body during assistive tasks that require physical human-robot interaction.
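
    A minimal sketch of the pose-estimation and feedback idea described above (not the authors' implementation; the network shape, the four-value pose parameterization, and the control gain are assumptions for illustration):

        import torch
        import torch.nn as nn

        # Hypothetical pose regressor: six capacitance readings -> four pose
        # values (lateral/vertical offset of the closest point on the limb and
        # pitch/yaw of the limb's central axis, all in the sensor's frame).
        pose_net = nn.Sequential(
            nn.Linear(6, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, 4),
        )

        def control_step(capacitance, target_pose, gain=0.5):
            """One proportional feedback step: move the end effector so the
            estimated limb pose approaches a target pose (e.g., a fixed
            offset above the limb's central axis)."""
            with torch.no_grad():
                pose = pose_net(capacitance)   # estimated local limb pose
            error = target_pose - pose         # pose error in the sensor frame
            return gain * error                # commanded end-effector motion

        # Example: one (normalized) reading from a six-electrode sensor.
        reading = torch.tensor([0.21, 0.35, 0.47, 0.44, 0.30, 0.18])
        delta = control_step(reading, torch.tensor([0.0, 0.05, 0.0, 0.0]))
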
  • Item
    Haptic Simulation for Robot-Assisted Dressing
    (Georgia Institute of Technology, 2017) Yu, Wenhao; Kapusta, Ariel; Tan, Jie; Kemp, Charles C.; Turk, Greg; Liu, C. Karen
    There is a considerable need for assistive dressing among people with disabilities, and robots have the potential to fulfill this need. However, training such a robot would require extensive trials to learn the skills of assistive dressing. Such training would be time-consuming and require considerable effort to recruit participants and conduct trials. In addition, it is impractical and unethical to perform trials for cases that might injure the person being dressed. In this work, we focus on a representative dressing task of pulling the sleeve of a hospital gown onto a person’s arm. We present a system that learns a haptic classifier for the outcome of the task given only a few (2-3) real-world trials with one person. Our system first optimizes the parameters of a physics simulator using real-world data. Using the optimized simulator, the system then simulates more haptic sensory data with noise models that account for randomness in the experiment. We then train hidden Markov models (HMMs) on the simulated haptic data. The trained HMMs can then classify and predict the outcome of the assistive dressing task based on haptic signals measured by a real robot’s end effector. This system achieves 92.83% accuracy in classifying the outcome of the robot-assisted dressing task with people not included in simulation optimization. We compare our classifiers to those trained on real-world data. We show that the classifiers from our system can categorize the dressing task outcomes more accurately than classifiers trained on ten times more real data.
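
    The simulator-calibration step can be sketched as an ordinary least-squares fit: adjust a few simulator parameters until the simulated end-effector force trace matches the handful of real trials, then generate noisy synthetic trials for HMM training. The sketch below is purely illustrative; simulate_forces stands in for the physics simulator, and the two-parameter model is an assumption:

        import numpy as np
        from scipy.optimize import minimize

        def simulate_forces(params, horizon=200):
            """Stand-in for the physics simulator: return an end-effector
            force trace for given parameters (e.g., stiffness, friction)."""
            stiffness, friction = params
            t = np.linspace(0.0, 1.0, horizon)
            return stiffness * t + friction * np.sin(6.0 * t)  # toy model

        def fit_simulator(real_traces):
            """Find parameters whose simulated trace best matches the few
            (2-3) real-world trials, in a least-squares sense."""
            def loss(params):
                sim = simulate_forces(params)
                return sum(np.mean((sim - r) ** 2) for r in real_traces)
            return minimize(loss, x0=np.array([1.0, 0.1]),
                            method="Nelder-Mead").x

        # With calibrated parameters, many noisy synthetic trials can be
        # generated for HMM training by perturbing the simulated trace.
        rng = np.random.default_rng(0)
        real = [simulate_forces([1.4, 0.25]) + 0.02 * rng.standard_normal(200)]
        params = fit_simulator(real)
        synthetic = [simulate_forces(params) + 0.05 * rng.standard_normal(200)
                     for _ in range(100)]
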
  • Item
    Data-Driven Haptic Perception for Robot-Assisted Dressing
    (Georgia Institute of Technology, 2016-08) Kapusta, Ariel; Yu, Wenhao; Bhattacharjee, Tapomayukh; Liu, C. Karen; Turk, Greg; Kemp, Charles C.
    Dressing is an important activity of daily living (ADL) with which many people require assistance due to impairments. Robots have the potential to provide dressing assistance, but physical interactions between clothing and the human body can be complex and difficult to visually observe. We provide evidence that data-driven haptic perception can be used to infer relationships between clothing and the human body during robot-assisted dressing. We conducted a carefully controlled experiment with 12 human participants during which a robot pulled a hospital gown along the length of each person’s forearm 30 times. This representative task resulted in one of the following three outcomes: the hand missed the opening to the sleeve; the hand or forearm became caught on the sleeve; or the full forearm successfully entered the sleeve. We found that hidden Markov models (HMMs) using only forces measured at the robot’s end effector classified these outcomes with high accuracy. The HMMs’ performance generalized well to participants (98.61% accuracy) and velocities (98.61% accuracy) outside of the training data. They also performed well when we limited the force applied by the robot (95.8% accuracy with a 2 N threshold), and could predict the outcome early in the process. Despite the lightweight hospital gown, HMMs that used forces in the direction of gravity substantially outperformed those that did not. The best performing HMMs used forces in the direction of motion and the direction of gravity.
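
    Both haptics papers above share a classification pattern: train one HMM per task outcome on force sequences, then label a new trial by whichever model assigns it the highest log-likelihood. A minimal sketch with hmmlearn follows; the two-feature layout (force along the direction of motion and along gravity) mirrors the abstract, while the number of hidden states and the toy data are assumptions:

        import numpy as np
        from hmmlearn import hmm

        OUTCOMES = ["missed sleeve", "caught on sleeve", "success"]

        def train_outcome_models(sequences_by_outcome, n_states=8):
            """Fit one Gaussian HMM per outcome. Each training sequence is a
            (T, 2) array: force along the direction of motion and gravity."""
            models = {}
            for outcome, seqs in sequences_by_outcome.items():
                X = np.vstack(seqs)                 # stacked observations
                lengths = [len(s) for s in seqs]    # per-sequence lengths
                m = hmm.GaussianHMM(n_components=n_states,
                                    covariance_type="diag")
                m.fit(X, lengths)
                models[outcome] = m
            return models

        def classify(models, trace):
            """Label a force trace by maximum log-likelihood across models."""
            return max(models, key=lambda k: models[k].score(trace))

        # Toy usage with random data standing in for recorded forces.
        rng = np.random.default_rng(0)
        data = {o: [rng.normal(i, 1.0, size=(50, 2)) for _ in range(10)]
                for i, o in enumerate(OUTCOMES)}
        models = train_outcome_models(data)
        print(classify(models, rng.normal(2.0, 1.0, size=(50, 2))))
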