Person:
Kemp, Charles C.


Publication Search Results

Now showing 1 - 10 of 63
  • Item
    Visually Estimating Contact Pressure for Humans and Robots
    (Georgia Institute of Technology, 2022-08-24) Kemp, Charles C.
    Each semester, IRIM hosts a symposium featuring presentations from faculty and of research funded by our IRIM seed grant program in the past year. The symposium is a chance for faculty to meet new PhD students on campus, as well as a chance to get a better idea of what IRIM colleagues are up to these days. The goal of the symposium is to spark new ideas, new collaborations, and even new friendships!
  • Item
    From One to Many: My Personal Quest for Meaningful Mobile Manipulation
    (Georgia Institute of Technology, 2021-08-25) Kemp, Charles C.
  • Item
    Haptic Simulation for Robot-Assisted Dressing
    (Georgia Institute of Technology, 2017) Yu, Wenhao ; Kapusta, Ariel ; Tan, Jie ; Kemp, Charles C. ; Turk, Greg ; Liu, C. Karen
    There is a considerable need for assistive dressing among people with disabilities, and robots have the potential to fulfill this need. However, training such a robot would require extensive trials in order to learn the skills of assistive dressing. Such training would be time-consuming and require considerable effort to recruit participants and conduct trials. In addition, for some cases that might cause injury to the person being dressed, it is impractical and unethical to perform such trials. In this work, we focus on a representative dressing task of pulling the sleeve of a hospital gown onto a person’s arm. We present a system that learns a haptic classifier for the outcome of the task given only a few (2-3) real-world trials with one person. Our system first optimizes the parameters of a physics simulator using real-world data. Using the optimized simulator, the system then simulates more haptic sensory data with noise models that account for randomness in the experiment. We then train hidden Markov models (HMMs) on the simulated haptic data. The trained HMMs can then be used to classify and predict the outcome of the assistive dressing task based on haptic signals measured by a real robot’s end effector. This system achieves 92.83% accuracy in classifying the outcome of the robot-assisted dressing task with people not included in simulation optimization. We compare our classifiers to those trained on real-world data. We show that the classifiers from our system can categorize the dressing task outcomes more accurately than classifiers trained on ten times more real data.
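To illustrate the classification step the abstract describes, here is a minimal sketch (not the authors' implementation) of likelihood-based HMM classification: one HMM is trained per task outcome, and a new haptic sequence is labeled by whichever model assigns it the higher likelihood. The tiny discrete HMMs and symbol alphabet below are purely illustrative stand-ins for the real-valued haptic signals.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm.
    obs: sequence of symbol indices; pi: initial state probs (S,);
    A: state transition matrix (S, S); B: emission matrix (S, K)."""
    alpha = pi * B[:, obs[0]]
    log_lik = 0.0
    for t in range(1, len(obs)):
        c = alpha.sum()          # scaling factor to avoid underflow
        log_lik += np.log(c)
        alpha = (alpha / c) @ A * B[:, obs[t]]
    return log_lik + np.log(alpha.sum())

def classify(obs, hmm_success, hmm_failure):
    """Label a haptic sequence by the higher-likelihood outcome model.
    Each model is a (pi, A, B) tuple."""
    ll_s = forward_log_likelihood(obs, *hmm_success)
    ll_f = forward_log_likelihood(obs, *hmm_failure)
    return "success" if ll_s > ll_f else "failure"
```

In the paper's pipeline the outcome models would be fit to simulated haptic data from the optimized physics simulator rather than hand-specified as here.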
  • Item
    Multimodal Execution Monitoring for Anomaly Detection During Robot Manipulation
    (Georgia Institute of Technology, 2016-05) Park, Daehyung ; Erickson, Zackory ; Bhattacharjee, Tapomayukh ; Kemp, Charles C.
    Online detection of anomalous execution can be valuable for robot manipulation, enabling robots to operate more safely, determine when a behavior is inappropriate, and otherwise exhibit more common sense. By using multiple complementary sensory modalities, robots could potentially detect a wider variety of anomalies, such as anomalous contact or a loud utterance by a human. However, task variability and the potential for false positives make online anomaly detection challenging, especially for long-duration manipulation behaviors. In this paper, we provide evidence for the value of multimodal execution monitoring and the use of a detection threshold that varies based on the progress of execution. Using a data-driven approach, we train an execution monitor that runs in parallel to a manipulation behavior. Like previous methods for anomaly detection, our method trains a hidden Markov model (HMM) using multimodal observations from non-anomalous executions. In contrast to prior work, our system also uses a detection threshold that changes based on the execution progress. We evaluated our approach with haptic, visual, auditory, and kinematic sensing during a variety of manipulation tasks performed by a PR2 robot. The tasks included pushing doors closed, operating switches, and assisting able-bodied participants with eating yogurt. In our evaluations, our anomaly detection method performed substantially better with multimodal monitoring than single modality monitoring. It also resulted in more desirable ROC curves when compared with other detection threshold methods from the literature, obtaining higher true positive rates for comparable false positive rates.
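The key idea of a progress-varying detection threshold can be sketched as follows. This is a simplified illustration, not the paper's method: the threshold at each stage of execution is set from the statistics of the monitor's log-likelihood over non-anomalous training executions at that same stage, so a dip that is normal early in a task is not flagged late in it (and vice versa).

```python
import numpy as np

def fit_threshold(train_ll, n_bins=10, c=2.0):
    """Estimate a progress-varying detection threshold from
    log-likelihood traces of non-anomalous executions.
    train_ll: array (n_trials, T) of monitor log-likelihoods over time;
    c scales the sensitivity (larger c = fewer false positives)."""
    bins = np.array_split(np.arange(train_ll.shape[1]), n_bins)
    thresh = np.empty(n_bins)
    for b, idx in enumerate(bins):
        vals = train_ll[:, idx].ravel()
        thresh[b] = vals.mean() - c * vals.std()
    return thresh

def detect(ll_trace, thresh):
    """Return the first time step at which the log-likelihood falls
    below the progress-dependent threshold, or -1 if none does."""
    n_bins, T = len(thresh), len(ll_trace)
    for t, ll in enumerate(ll_trace):
        b = min(t * n_bins // T, n_bins - 1)  # map time step to progress bin
        if ll < thresh[b]:
            return t
    return -1
```

A fixed (progress-independent) threshold would instead use a single mean and standard deviation over the whole trace, which is the baseline the paper argues against.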
  • Item
    Towards Assistive Feeding with a General-Purpose Mobile Manipulator
    (Georgia Institute of Technology, 2016-05) Park, Daehyung ; Kim, You Keun ; Erickson, Zackory ; Kemp, Charles C.
    General-purpose mobile manipulators have the potential to serve as a versatile form of assistive technology. However, their complexity creates challenges, including the risk of being too difficult to use. We present a proof-of-concept robotic system for assistive feeding that consists of a Willow Garage PR2, a high-level web-based interface, and specialized autonomous behaviors for scooping and feeding yogurt. As a step towards use by people with disabilities, we evaluated our system with 5 able-bodied participants. All 5 successfully ate yogurt using the system and reported high rates of success for the system’s autonomous behaviors. Also, Henry Evans, a person with severe quadriplegia, operated the system remotely to feed an able-bodied person. In general, people who operated the system reported that it was easy to use, including Henry. The feeding system also incorporates corrective actions designed to be triggered either autonomously or by the user. In an offline evaluation using data collected with the feeding system, a new version of our multimodal anomaly detection system outperformed prior versions.
  • Item
    Autobed: A Web-Controlled Robotic Bed
    (Georgia Institute of Technology, 2016-02) Grice, Phillip M. ; Chitalia, Yash ; Rich, Megan ; Clever, Henry ; Evans, Henry ; Evans, Jane ; Kemp, Charles C.
    We (the Healthcare Robotics Lab at Georgia Tech) have developed an additional module for an Invacare fully electric hospital bed (Model 5410IVC) so that the bed can be controlled from a web-based interface. This module can be easily plugged between the hand control and the Invacare bed, without having to modify any existing hardware on the bed. We call a bed so modified an 'Autobed.' With this feature, users who are unable to operate the standard bed controls, but can access a web browser, are able to position the bed by themselves without having to rely on a caregiver (for example, patients with quadriplegia). This page describes how to make the Autobed module using relatively inexpensive, commercially available hardware. This document is a representation of the content provided at http://hsi.gatech.edu/hrl/project_autobed_v2.shtml as of February 15th, 2016, and is intended to create a lasting, citable, and archival copy of this material, which details the design and instructions for building the 'Autobed' device.
  • Item
    Optimization of Robot Configurations for Assistive Tasks
    (Georgia Institute of Technology, 2016) Kapusta, Ariel ; Kemp, Charles C.
    Robots can provide assistance with activities of daily living (ADLs) to humans with motor impairments. Specialized robots, such as desktop robotic feeding systems, have been successful for specific assistive tasks when placed in fixed and designated positions with respect to the user. General-purpose mobile manipulators could act as a more versatile form of assistive technology, able to perform many tasks, but selecting a configuration for the robots from which to perform a task can be challenging due to the high number of degrees of freedom of the robots and the complexity of the tasks. As with the specialized, fixed robots, once in a good configuration, another system or the user can provide the fine control to perform the details of the task. In this short paper, we present Task-centric Optimization of robot Configurations (TOC), a method for selecting configurations for a PR2 and a robotic bed to allow the PR2 to provide effective assistance with ADLs. TOC builds upon previous work, Task-centric initial Configuration Selection (TCS), addressing some of the limitations of TCS. Notable alterations are selecting configurations from the continuous configuration space using a Covariance Matrix Adaptation Evolution Strategy (CMA-ES) optimization, introducing a joint-limit-weighted manipulability term, and changing the framework to move all optimization offline and using function approximation at run-time. To evaluate TOC, we created models of 13 ADLs and compared TOC’s and TCS’s performance with these 13 assistive tasks in a computer simulation of a PR2, a robotic bed, and a model of a human body. TOC performed as well or better than TCS in most of our tests against state estimation error. We also implemented TOC on a real PR2 and a real robotic bed and found that from the TOC-selected configuration the PR2 could reach all task-relevant goals on a mannequin on the bed.
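To give a feel for the kind of continuous-configuration search TOC performs, here is a minimal evolution-strategy sketch. It is a deliberately simplified stand-in for CMA-ES (sample around a mean, rank by cost, recombine the elite) that omits CMA's covariance and step-size path adaptation; the cost function and configuration vector are placeholders for the task-centric objective the paper actually optimizes.

```python
import numpy as np

def simple_es(cost, x0, sigma=0.5, n_pop=20, n_elite=5, n_iter=50, seed=0):
    """Minimize `cost` over a continuous configuration space with a
    simplified evolution strategy. Not CMA-ES itself: the sampling
    distribution stays isotropic, and the step size just decays."""
    rng = np.random.default_rng(seed)
    mean = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        # Sample candidate configurations around the current mean.
        pop = mean + sigma * rng.standard_normal((n_pop, mean.size))
        # Keep the n_elite lowest-cost candidates and recombine them.
        elite = pop[np.argsort([cost(x) for x in pop])[:n_elite]]
        mean = elite.mean(axis=0)
        sigma *= 0.95  # crude decay instead of CMA's adaptive step size
    return mean
```

In TOC this kind of search would run entirely offline over robot-base and bed configurations, with the learned results approximated by a function evaluated at run-time.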
  • Item
    Autobed: Open Hardware for Accessible Web-based Control of an Electric Bed
    (Georgia Institute of Technology, 2016) Grice, Phillip M. ; Chitalia, Yash ; Rich, Megan ; Clever, Henry M. ; Kemp, Charles C.
    Individuals with severe motor impairments often have difficulty operating the standard controls of electric beds and so require a caregiver to adjust their position for utility, comfort, or to prevent pressure ulcers. Assistive human-computer interaction devices allow many such individuals to operate a computer and web browser. Here, we present the Autobed, a Wi-Fi-connected device that enables control of an Invacare Full-Electric Homecare Bed, a Medicare-approved device in the US, from any modern web browser, without modification of existing hardware. We detail the design and operation of the Autobed. We also examine its usage by one individual with severe motor impairments and his primary caregiver in their own home, including usage logs from a period of 102 days and detailed questionnaires. Finally, we make the entire system, including hardware design and components, software, and build instructions, available under permissive open-source licenses.
  • Item
    Material Recognition from Heat Transfer given Varying Initial Conditions and Short-Duration Contact
    (Georgia Institute of Technology, 2015) Bhattacharjee, Tapomayukh ; Wade, Joshua ; Kemp, Charles C.
    When making contact with an object, a robot can use a tactile sensor consisting of a heating element and a temperature sensor to recognize the object’s material based on conductive heat transfer from the tactile sensor to the object. When this type of tactile sensor has time to fully reheat prior to contact and the duration of contact is long enough to achieve a thermal steady state, numerous methods have been shown to perform well. In order to enable robots to more efficiently sense their environments and take advantage of brief contact events over which they lack control, we focus on the problem of material recognition from heat transfer given varying initial conditions and short-duration contact. We present both model-based and data-driven methods. For the model-based method, we modeled the thermodynamics of the sensor in contact with a material as contact between two semi-infinite solids. For the data-driven methods, we used three machine learning algorithms (SVM+PCA, k-NN+PCA, HMMs) with time series of raw temperature measurements and temperature change estimates. When recognizing 11 materials with varying initial conditions and 3-fold cross-validation, SVM+PCA outperformed all other methods, achieving 84% accuracy.
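The semi-infinite-solid model the abstract mentions has a well-known closed form worth stating: when two semi-infinite solids at different temperatures touch, the interface temperature jumps to an effusivity-weighted average and stays there, so the size of the sensor's temperature drop encodes the object's thermal effusivity e = sqrt(k·ρ·c_p). The sketch below illustrates that relationship; the material property values used in the usage note are rough textbook figures, and the sensor properties are hypothetical, not taken from the paper.

```python
import math

def effusivity(k, rho, cp):
    """Thermal effusivity e = sqrt(k * rho * cp) of a material,
    given conductivity k (W/m·K), density rho (kg/m^3), and
    specific heat cp (J/kg·K)."""
    return math.sqrt(k * rho * cp)

def contact_temperature(T_sensor, e_sensor, T_obj, e_obj):
    """Interface temperature when two semi-infinite solids come into
    contact: an effusivity-weighted average of the two initial
    temperatures, constant in time under this idealized model."""
    return (e_sensor * T_sensor + e_obj * T_obj) / (e_sensor + e_obj)
```

For example, a heated sensor touching aluminum (high effusivity) sees the interface pulled close to the object's temperature, while touching wood (low effusivity) leaves it near the sensor's own temperature, which is exactly the signal the classifiers exploit.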
  • Item
    Inferring Object Properties from Incidental Contact with a Tactile-Sensing Forearm
    (Georgia Institute of Technology, 2014-09) Bhattacharjee, Tapomayukh ; Rehg, James M. ; Kemp, Charles C.
    Whole-arm tactile sensing enables a robot to sense properties of contact across its entire arm. By using this large sensing area, a robot has the potential to acquire useful information from incidental contact that occurs while performing a task. Within this paper, we demonstrate that data-driven methods can be used to infer mechanical properties of objects from incidental contact with a robot’s forearm. We collected data from a tactile-sensing forearm as it made contact with various objects during a simple reaching motion. We then used hidden Markov models (HMMs) to infer two object properties (rigid vs. soft and fixed vs. movable) based on low-dimensional features of time-varying tactile sensor data (maximum force, contact area, and contact motion). A key issue is the extent to which data-driven methods can generalize to robot actions that differ from those used during training. To investigate this issue, we developed an idealized mechanical model of a robot with a compliant joint making contact with an object. This model provides intuition for the classification problem. We also conducted tests in which we varied the robot arm’s velocity and joint stiffness. We found that, in contrast to our previous methods [1], multivariate HMMs achieved high cross-validation accuracy and successfully generalized what they had learned to new robot motions with distinct velocities and joint stiffnesses.