Organizational Unit:
Rehabilitation Engineering Research Center on Technologies to Support Aging-in-Place for People with Long-Term Disabilities


Publication Search Results

  • Item
    Haptic Simulation for Robot-Assisted Dressing
    (Georgia Institute of Technology, 2017) Yu, Wenhao ; Kapusta, Ariel ; Tan, Jie ; Kemp, Charles C. ; Turk, Greg ; Liu, C. Karen
There is a considerable need for assistive dressing among people with disabilities, and robots have the potential to fulfill this need. However, training such a robot would require extensive trials in order to learn the skills of assistive dressing. Such training would be time-consuming and require considerable effort to recruit participants and conduct trials. In addition, in cases that might cause injury to the person being dressed, performing such trials would be impractical and unethical. In this work, we focus on a representative dressing task: pulling the sleeve of a hospital gown onto a person’s arm. We present a system that learns a haptic classifier for the outcome of the task given only a few (2-3) real-world trials with one person. Our system first optimizes the parameters of a physics simulator using real-world data. Using the optimized simulator, the system then simulates additional haptic sensory data with noise models that account for randomness in the experiment. We then train hidden Markov models (HMMs) on the simulated haptic data. The trained HMMs can then be used to classify and predict the outcome of the assistive dressing task based on haptic signals measured at a real robot’s end effector. This system achieves 92.83% accuracy in classifying the outcome of the robot-assisted dressing task with people not included in simulator optimization. We compare our classifiers to those trained on real-world data, and show that the classifiers from our system categorize the dressing task outcomes more accurately than classifiers trained on ten times more real data.
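The first stage of the pipeline above, calibrating simulator parameters against a few real trials, can be illustrated with a small, runnable sketch. The spring-damper "simulator", the stiffness/damping parameters, and the synthetic "real" traces below are all stand-ins for the paper's cloth simulator and measured data, not the authors' actual interface.

```python
# Toy analogue of the simulator-calibration stage: fit simulator parameters
# so simulated force traces match a handful of real trials. Everything here
# (the spring-damper model, parameter names, data) is an illustrative
# assumption, not the paper's cloth simulator.
import numpy as np
from scipy.optimize import minimize

T = 100
t = np.linspace(0.0, 1.0, T)

def toy_simulator(stiffness, damping):
    # Stand-in for the physics simulator: force rises with stretch, then decays.
    return stiffness * t * np.exp(-damping * t)

# Pretend these are 3 real-world sleeve-pulling trials.
rng = np.random.default_rng(0)
real_traces = toy_simulator(4.0, 2.0) + 0.05 * rng.standard_normal((3, T))

def mismatch(params):
    # Mean squared error between simulated and real force traces.
    return np.mean((real_traces - toy_simulator(*params)) ** 2)

result = minimize(mismatch, x0=[1.0, 1.0], method="Nelder-Mead")
print("calibrated (stiffness, damping):", result.x)  # ~ (4.0, 2.0)
```

Once calibrated, the simulator can be run many times under the noise models to generate the large synthetic haptic dataset on which the HMMs are trained.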
  • Item
    Data-Driven Haptic Perception for Robot-Assisted Dressing
    (Georgia Institute of Technology, 2016-08) Kapusta, Ariel ; Yu, Wenhao ; Bhattacharjee, Tapomayukh ; Liu, C. Karen ; Turk, Greg ; Kemp, Charles C.
Dressing is an important activity of daily living (ADL) with which many people require assistance due to impairments. Robots have the potential to provide dressing assistance, but physical interactions between clothing and the human body can be complex and difficult to visually observe. We provide evidence that data-driven haptic perception can be used to infer relationships between clothing and the human body during robot-assisted dressing. We conducted a carefully controlled experiment with 12 human participants during which a robot pulled a hospital gown along the length of each person’s forearm 30 times. This representative task resulted in one of the following three outcomes: the hand missed the opening to the sleeve; the hand or forearm became caught on the sleeve; or the full forearm successfully entered the sleeve. We found that hidden Markov models (HMMs) using only forces measured at the robot’s end effector classified these outcomes with high accuracy. The HMMs’ performance generalized well to participants (98.61% accuracy) and velocities (98.61% accuracy) outside of the training data. They also performed well when we limited the force applied by the robot (95.8% accuracy with a 2 N threshold), and could predict the outcome early in the process. Despite the lightweight hospital gown, HMMs that used forces in the direction of gravity substantially outperformed those that did not. The best performing HMMs used forces in the direction of motion and the direction of gravity.
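As a concrete, hedged illustration of this classification scheme, the sketch below fits one Gaussian-emission HMM per outcome on force sequences and labels a new trial by maximum log-likelihood, using the hmmlearn library. The two features echo the force directions named above (motion and gravity), but the data is synthetic and the model sizes are arbitrary assumptions.

```python
# One HMM per outcome (missed / caught / success), trained on force
# sequences; a new trial is labeled by whichever model scores it highest.
# Data here is synthetic; the real features were measured forces.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(1)

def synthetic_trials(mean, n_trials=10, length=50):
    # Stand-in for recorded (f_motion, f_gravity) force sequences.
    return [mean + 0.1 * rng.standard_normal((length, 2)) for _ in range(n_trials)]

training = {
    "missed":  synthetic_trials(np.array([0.0, 0.1])),
    "caught":  synthetic_trials(np.array([1.0, 0.5])),
    "success": synthetic_trials(np.array([0.4, 0.2])),
}

models = {}
for outcome, trials in training.items():
    X = np.vstack(trials)                  # concatenated sequences
    lengths = [len(tr) for tr in trials]   # per-sequence lengths
    models[outcome] = GaussianHMM(
        n_components=3, covariance_type="diag", random_state=0
    ).fit(X, lengths)

new_trial = synthetic_trials(np.array([1.0, 0.5]), n_trials=1)[0]
prediction = max(models, key=lambda k: models[k].score(new_trial))
print("predicted outcome:", prediction)    # expect "caught"
```

Because each outcome has its own model, early prediction (mentioned above) falls out naturally: score the partial force sequence observed so far under every HMM and take the running maximum.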
  • Item
    Autobed: Open Hardware for Accessible Web-based Control of an Electric Bed
    (Georgia Institute of Technology, 2016) Grice, Phillip M. ; Chitalia, Yash ; Rich, Megan ; Clever, Henry M. ; Kemp, Charles C.
Individuals with severe motor impairments often have difficulty operating the standard controls of electric beds and so require a caregiver to adjust their position for utility or comfort, or to prevent pressure ulcers. Assistive human-computer interaction devices allow many such individuals to operate a computer and web browser. Here, we present the Autobed, a Wi-Fi-connected device that enables control of an Invacare Full-Electric Homecare Bed, a Medicare-approved device in the US, from any modern web browser, without modification of existing hardware. We detail the design and operation of the Autobed. We also examine its usage by one individual with severe motor impairments and his primary caregiver in their own home, including usage logs from a period of 102 days and detailed questionnaires. Finally, we make the entire system, including hardware design and components, software, and build instructions, available under permissive open-source licenses.
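The general pattern described here, a small embedded web server exposing bed motions as HTTP endpoints, can be sketched as follows. This is not the released Autobed software: the Flask/gpiozero stack, pin numbers, and half-second pulse are illustrative assumptions.

```python
# Illustrative sketch (not the released Autobed code): a web server pulses
# relays wired in parallel with the bed's hand control, so the existing
# hardware stays unmodified. Pins and pulse length are placeholders.
import time
from flask import Flask
from gpiozero import OutputDevice  # assumes a Raspberry Pi-style board

app = Flask(__name__)

# One relay per hand-control button we emulate (hypothetical pin mapping).
RELAYS = {
    "head_up":   OutputDevice(17),
    "head_down": OutputDevice(27),
    "legs_up":   OutputDevice(22),
    "legs_down": OutputDevice(23),
}

@app.route("/move/<command>", methods=["POST"])
def move(command):
    relay = RELAYS.get(command)
    if relay is None:
        return "unknown command", 404
    relay.on()       # close the relay, as if the button were pressed...
    time.sleep(0.5)  # ...held for a short pulse...
    relay.off()      # ...and released
    return "ok", 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)  # reachable from any browser on the LAN
```

Any accessible browser-based input method the user already operates then doubles as a bed control, which is the crux of the design.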
  • Item
    A Robotic System for Reaching in Dense Clutter that Integrates Model Predictive Control, Learning, Haptic Mapping, and Planning
    (Georgia Institute of Technology, 2014-09) Bhattacharjee, Tapomayukh ; Grice, Phillip M. ; Kapusta, Ariel ; Killpack, Marc D. ; Park, Daehyung ; Kemp, Charles C.
We present a system that enables a robot to reach locations in dense clutter using only haptic sensing. Our system integrates model predictive control [1], learned initial conditions [2], tactile recognition of object types [3], haptic mapping, and geometric planning to efficiently reach locations using whole-arm tactile sensing [4]. We motivate our work, present a system architecture, summarize each component of the system, and present results from our evaluation of the system reaching to target locations in dense artificial foliage.
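One plausible control flow for composing these components is sketched below. Every helper is a toy stand-in for the corresponding subsystem named in the abstract ([1]-[4]); none of it is the authors' code.

```python
# Schematic composition: pick a learned initial configuration, reach with a
# contact-regulating controller, fold felt contacts into a haptic map, and
# replan around mapped obstacles. All helpers are toy placeholders.
import numpy as np

rng = np.random.default_rng(2)

def select_initial_configuration(goal):
    # Stand-in for the learned initial conditions [2].
    return goal + rng.normal(scale=0.1, size=3)

def classify_contact():
    # Stand-in for tactile recognition of object types [3].
    return rng.choice(["leaf", "trunk"])

def mpc_reach(config, goal):
    # Stand-in for the contact-regulating MPC controller [1].
    contact = (config + 0.5 * (goal - config), classify_contact())
    reached = rng.random() > 0.3
    return reached, [contact]

def plan_around(haptic_map, goal):
    # Stand-in for geometric planning over the haptic map [4].
    return goal + rng.normal(scale=0.05, size=3)

goal = np.array([0.6, 0.0, 0.3])
haptic_map = []                            # accumulated (position, type) contacts
config = select_initial_configuration(goal)
for attempt in range(5):
    reached, contacts = mpc_reach(config, goal)
    haptic_map.extend(contacts)            # haptic mapping: remember what was felt
    if reached:
        print(f"reached goal on attempt {attempt + 1}")
        break
    config = plan_around(haptic_map, goal) # replan using the map
```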
  • Item
    Learning to Reach into the Unknown: Selecting Initial Conditions When Reaching in Clutter
    (Georgia Institute of Technology, 2014-09) Park, Daehyung ; Kapusta, Ariel ; Kim, You Keun ; Rehg, James M. ; Kemp, Charles C.
Often in highly cluttered environments, a robot can observe the exterior of the environment with ease, but can neither directly view nor easily infer its detailed internal structure (e.g., dense foliage or a full refrigerator shelf). We present a data-driven approach that greatly improves a robot’s success at reaching to a goal location in the unknown interior of an environment based on observable external properties, such as the category of the clutter and the locations of openings into the clutter (i.e., apertures). We focus on the problem of selecting a good initial configuration for a manipulator when reaching with a greedy controller. We use density estimation to model the probability of a successful reach given an initial condition, and then perform constrained optimization to find an initial condition with the highest estimated probability of success. We evaluate our approach with two simulated robots reaching in clutter, and provide a demonstration with a real PR2 robot reaching to locations through random apertures. In our evaluations, our approach significantly outperformed two alternative approaches when making two consecutive reach attempts to goals in distinct categories of unknown clutter. Notably, our approach only uses sparse, readily apparent features.
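A minimal sketch of the selection step follows: estimate a density over initial configurations that previously led to successful reaches, then optimize within joint limits for the configuration with the highest estimated density. The 2-D configuration space and data are synthetic, and the paper additionally conditions on observable clutter features, which this toy omits.

```python
# Density estimation over successful initial configurations, then
# constrained optimization for the most promising start. Configuration
# space, data, and joint limits are illustrative placeholders.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Pretend log of initial configurations that led to successful reaches.
successful_configs = rng.normal(loc=[0.3, -0.2], scale=0.15, size=(200, 2))

density = gaussian_kde(successful_configs.T)   # KDE expects (dims, samples)

def neg_success_density(q):
    return -density(q)[0]

bounds = [(-1.0, 1.0), (-1.0, 1.0)]            # stand-in joint limits
best = min(
    (minimize(neg_success_density, x0=rng.uniform(-1, 1, 2),
              bounds=bounds, method="L-BFGS-B") for _ in range(10)),
    key=lambda r: r.fun,                       # keep the best of 10 restarts
)
print("chosen initial configuration:", best.x)  # near (0.3, -0.2)
```

Multiple random restarts guard against the optimizer settling into a minor mode of the estimated density.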
  • Item
    Interleaving Planning and Control for Efficient Haptically-guided Reaching in Unknown Environments
    (Georgia Institute of Technology, 2014) Park, Daehyung ; Kapusta, Ariel ; Hawke, Jeffrey ; Kemp, Charles C.
We present a new method for reaching in an initially unknown environment with only haptic sensing. In this paper, we propose a haptically-guided interleaving planning and control (HIPC) method with a haptic mapping framework. HIPC runs two planning methods, interleaving a task-space and a joint-space planner, to provide fast reaching performance. It continually replans a valid trajectory, alternating between the planners and quickly incorporating tactile information collected from the unknown environment. One key idea is that tactile sensing can be used to directly map the immediate cause of interference while reaching. The mapping framework efficiently assigns raw tactile information from whole-arm tactile sensors to a 3D voxel-based collision map. Our method uses a previously published contact-regulating controller based on model predictive control (MPC). In our evaluation with a physics simulation of a humanoid robot, the interleaving approach was superior at reaching in the 9 types of environments we used.
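The haptic-mapping step lends itself to a short sketch: contact points sensed by the whole-arm tactile skin are binned into a 3D voxel grid that both planners query as a collision map. Resolution, extents, and the function names below are placeholders, not the paper's values.

```python
# Toy voxel-based collision map built from sensed contacts. The 5 cm
# resolution, grid extents, and API are illustrative assumptions.
import numpy as np

VOXEL = 0.05                          # voxel edge length in meters
ORIGIN = np.array([-1.0, -1.0, 0.0])  # world position of the grid corner
SHAPE = (40, 40, 30)                  # grid dimensions

occupied = np.zeros(SHAPE, dtype=bool)

def to_voxel(point):
    return tuple(np.floor((np.asarray(point) - ORIGIN) / VOXEL).astype(int))

def add_contact(point):
    """Mark the voxel containing a sensed contact as an obstacle."""
    idx = to_voxel(point)
    if all(0 <= i < s for i, s in zip(idx, SHAPE)):
        occupied[idx] = True

def in_collision(point):
    """Collision query shared by the task-space and joint-space planners."""
    idx = to_voxel(point)
    return all(0 <= i < s for i, s in zip(idx, SHAPE)) and bool(occupied[idx])

add_contact([0.12, 0.40, 0.55])           # a contact felt mid-reach
print(in_collision([0.13, 0.41, 0.56]))   # True: same voxel
```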