Organizational Unit:
Healthcare Robotics Lab


Publication Search Results

Now showing 1 - 7 of 7
  • Item
    Multimodal Execution Monitoring for Anomaly Detection During Robot Manipulation
    (Georgia Institute of Technology, 2016-05) Park, Daehyung ; Erickson, Zackory ; Bhattacharjee, Tapomayukh ; Kemp, Charles C.
Online detection of anomalous execution can be valuable for robot manipulation, enabling robots to operate more safely, determine when a behavior is inappropriate, and otherwise exhibit more common sense. By using multiple complementary sensory modalities, robots could potentially detect a wider variety of anomalies, such as anomalous contact or a loud utterance by a human. However, task variability and the potential for false positives make online anomaly detection challenging, especially for long-duration manipulation behaviors. In this paper, we provide evidence for the value of multimodal execution monitoring and the use of a detection threshold that varies based on the progress of execution. Using a data-driven approach, we train an execution monitor that runs in parallel to a manipulation behavior. Like previous methods for anomaly detection, our method trains a hidden Markov model (HMM) using multimodal observations from non-anomalous executions. In contrast to prior work, our system also uses a detection threshold that changes based on the execution progress. We evaluated our approach with haptic, visual, auditory, and kinematic sensing during a variety of manipulation tasks performed by a PR2 robot. The tasks included pushing doors closed, operating switches, and assisting able-bodied participants with eating yogurt. In our evaluations, our anomaly detection method performed substantially better with multimodal monitoring than single modality monitoring. It also resulted in more desirable ROC curves when compared with other detection threshold methods from the literature, obtaining higher true positive rates for comparable false positive rates.
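The progress-varying threshold idea can be sketched in a deliberately simplified form: record the likelihood trajectory of non-anomalous training executions at each progress step, then set the detection threshold at each step to the training mean minus a multiple of the training standard deviation. In this sketch a fixed 1-D Gaussian observation model stands in for the paper's multimodal HMM, and all function names and numbers are illustrative, not the authors' implementation.

```python
import math

def loglik(x, mu, sigma):
    """Log-likelihood of one observation under a 1-D Gaussian
    (a stand-in for the paper's HMM log-likelihood)."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def fit_progress_thresholds(train_runs, mu, sigma, c=3.0):
    """At each progress step t, compute the mean and standard deviation of
    training log-likelihoods and set threshold(t) = mean(t) - c * std(t),
    so the bar adapts to how variable non-anomalous executions are at t."""
    steps = len(train_runs[0])
    thresholds = []
    for t in range(steps):
        lls = [loglik(run[t], mu, sigma) for run in train_runs]
        m = sum(lls) / len(lls)
        sd = math.sqrt(sum((l - m) ** 2 for l in lls) / len(lls))
        thresholds.append(m - c * sd)
    return thresholds

def detect(run, mu, sigma, thresholds):
    """Flag the first progress step whose log-likelihood falls below the
    progress-dependent threshold; return None if no anomaly is detected."""
    for t, x in enumerate(run):
        if loglik(x, mu, sigma) < thresholds[t]:
            return t
    return None
```

A run that tracks the training executions stays above the threshold everywhere; a run with a large deviation at some step is flagged at that step, while a fixed global threshold would have to trade sensitivity at one phase of the task against false positives at another.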
  • Item
    Material Recognition from Heat Transfer given Varying Initial Conditions and Short-Duration Contact
    (Georgia Institute of Technology, 2015) Bhattacharjee, Tapomayukh ; Wade, Joshua ; Kemp, Charles C.
When making contact with an object, a robot can use a tactile sensor consisting of a heating element and a temperature sensor to recognize the object’s material based on conductive heat transfer from the tactile sensor to the object. When this type of tactile sensor has time to fully reheat prior to contact and the duration of contact is long enough to achieve a thermal steady state, numerous methods have been shown to perform well. In order to enable robots to more efficiently sense their environments and take advantage of brief contact events over which they lack control, we focus on the problem of material recognition from heat transfer given varying initial conditions and short-duration contact. We present both model-based and data-driven methods. For the model-based method, we modeled the thermodynamics of the sensor in contact with a material as contact between two semi-infinite solids. For the data-driven methods, we used three machine learning algorithms (SVM+PCA, k-NN+PCA, HMMs) with time series of raw temperature measurements and temperature change estimates. When recognizing 11 materials with varying initial conditions and 3-fold cross-validation, SVM+PCA outperformed all other methods, achieving 84% accuracy.
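The two-semi-infinite-solids model rests on a standard result: when two semi-infinite bodies at different temperatures are brought into contact, the interface settles at T_c = (e1·T1 + e2·T2)/(e1 + e2), where e = √(k·ρ·c_p) is the thermal effusivity of each body. The sketch below illustrates why the early temperature drop of a heated sensor encodes the touched material; the material property numbers are rough textbook values and the sensor effusivity is a made-up polymer-like value, not parameters from the paper.

```python
import math

def effusivity(k, rho, cp):
    """Thermal effusivity e = sqrt(k * rho * cp), where k is thermal
    conductivity [W/(m*K)], rho density [kg/m^3], cp specific heat [J/(kg*K)]."""
    return math.sqrt(k * rho * cp)

def contact_temperature(T1, e1, T2, e2):
    """Interface temperature of two semi-infinite solids in contact:
    T_c = (e1*T1 + e2*T2) / (e1 + e2).
    A high-effusivity material (e.g. metal) pulls the interface temperature
    close to its own, which is why metal feels colder than wood at the
    same ambient temperature."""
    return (e1 * T1 + e2 * T2) / (e1 + e2)

# Illustrative values: a heated sensor at 35 C touching 25 C objects.
e_sensor = effusivity(0.2, 1200, 1500)    # polymer-like sensor surface (assumed)
e_wood = effusivity(0.15, 700, 1600)      # rough value for wood
e_aluminum = effusivity(237, 2700, 900)   # rough value for aluminum
```

With these numbers the sensor-wood interface sits near 31 C while the sensor-aluminum interface drops to just above 25 C, so even a brief contact separates the two materials, provided the varying initial sensor temperature is accounted for.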
  • Item
    Inferring Object Properties from Incidental Contact with a Tactile-Sensing Forearm
    (Georgia Institute of Technology, 2014-09) Bhattacharjee, Tapomayukh ; Rehg, James M. ; Kemp, Charles C.
    Whole-arm tactile sensing enables a robot to sense properties of contact across its entire arm. By using this large sensing area, a robot has the potential to acquire useful information from incidental contact that occurs while performing a task. Within this paper, we demonstrate that data-driven methods can be used to infer mechanical properties of objects from incidental contact with a robot’s forearm. We collected data from a tactile-sensing forearm as it made contact with various objects during a simple reaching motion. We then used hidden Markov models (HMMs) to infer two object properties (rigid vs. soft and fixed vs. movable) based on low-dimensional features of time-varying tactile sensor data (maximum force, contact area, and contact motion). A key issue is the extent to which data-driven methods can generalize to robot actions that differ from those used during training. To investigate this issue, we developed an idealized mechanical model of a robot with a compliant joint making contact with an object. This model provides intuition for the classification problem. We also conducted tests in which we varied the robot arm’s velocity and joint stiffness. We found that, in contrast to our previous methods [1], multivariate HMMs achieved high cross-validation accuracy and successfully generalized what they had learned to new robot motions with distinct velocities and joint stiffnesses.
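The HMM-likelihood classification scheme described above can be sketched in miniature: train one HMM per class, then label a new contact with the class whose model assigns the observation sequence the highest likelihood. This sketch uses discrete observations and the scaled forward algorithm; the paper's models are multivariate HMMs over continuous force/area/motion features, and all numbers here are toy values.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm.
    pi[i]: initial state probabilities, A[i][j]: transition probabilities,
    B[i][o]: probability of emitting symbol o in state i."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    ll = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][obs[t]]
                     for j in range(n)]
        s = sum(alpha)          # scaling factor; log P(obs) = sum of log(s)
        ll += math.log(s)
        alpha = [a / s for a in alpha]
    return ll

def classify(obs, models):
    """Pick the class whose HMM assigns the highest log-likelihood."""
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))
```

For instance, with a "rigid" model biased toward emitting symbol 0 and a "soft" model biased toward symbol 1, a sequence dominated by 0s is labeled rigid. Generalization across arm velocities and stiffnesses, the paper's key question, then hinges on how much the feature time series change under new motions.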
  • Item
    A Robotic System for Reaching in Dense Clutter that Integrates Model Predictive Control, Learning, Haptic Mapping, and Planning
    (Georgia Institute of Technology, 2014-09) Bhattacharjee, Tapomayukh ; Grice, Phillip M. ; Kapusta, Ariel ; Killpack, Marc D. ; Park, Daehyung ; Kemp, Charles C.
We present a system that enables a robot to reach locations in dense clutter using only haptic sensing. Our system integrates model predictive control [1], learned initial conditions [2], tactile recognition of object types [3], haptic mapping, and geometric planning to efficiently reach locations using whole-arm tactile sensing [4]. We motivate our work, present a system architecture, summarize each component of the system, and present results from our evaluation of the system reaching to target locations in dense artificial foliage.
  • Item
Rapid Categorization of Object Properties from Incidental Contact with a Tactile Sensing Robot Arm
    (Georgia Institute of Technology, 2013-10) Bhattacharjee, Tapomayukh ; Kapusta, Ariel ; Rehg, James M. ; Kemp, Charles C.
We demonstrate that data-driven methods can be used to rapidly categorize objects encountered through incidental contact on a robot arm. Allowing incidental contact with surrounding objects has benefits during manipulation, such as increasing the workspace during reaching tasks. The information obtained from such contact, if available online, can potentially be used to map the environment and help in manipulation tasks. In this paper, we address this problem of online categorization using incidental contact during goal-oriented motion. In cluttered environments, the detailed internal structure of clutter can be difficult to infer, but the environment type is often apparent. In a randomized cluttered environment of known object types and “outliers”, our approach uses hidden Markov models to capture the dynamic robot-environment interactions and to categorize objects based on the interactions. We combined leaf and trunk objects to create artificial foliage as a test environment. We collected data using a skin sensor on the robot’s forearm while it reached into clutter. Our algorithm classifies the objects rapidly with low computation time and few data samples. Using a taxel-by-taxel classification approach, we can successfully categorize simultaneous contacts with multiple objects and can also identify outlier objects in the environment based on the prior associated with an object’s likelihood in the given environment.
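The taxel-by-taxel idea with outlier rejection can be sketched as follows: label each taxel independently from per-class likelihood scores weighted by environment priors, and report an outlier when no trained model fits the contact well. The scoring interface and threshold below are illustrative stand-ins, not the paper's HMM-based pipeline.

```python
def classify_taxel(scores, priors, min_likelihood=0.05):
    """Classify one taxel from per-class likelihood scores.
    If no class likelihood clears min_likelihood, no trained model explains
    the contact, so report it as an outlier object; otherwise pick the class
    maximizing likelihood * prior (the prior encodes how probable each object
    type is in this environment, e.g. mostly leaves in foliage)."""
    if max(scores.values()) < min_likelihood:
        return "outlier"
    return max(scores, key=lambda c: scores[c] * priors[c])

def classify_taxels(taxel_scores, priors, min_likelihood=0.05):
    """Label each taxel independently, so simultaneous contacts with
    different objects on different parts of the arm get different labels."""
    return [classify_taxel(s, priors, min_likelihood) for s in taxel_scores]
```

Because each taxel is decided on its own, brushing a leaf and a trunk at the same time simply yields "leaf" on some taxels and "trunk" on others, with poorly explained contacts flagged as outliers.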
  • Item
    Tactile Sensing over Articulated Joints with Stretchable Sensors
    (Georgia Institute of Technology, 2013-04) Bhattacharjee, Tapomayukh ; Jain, Advait ; Vaish, Sarvagya ; Killpack, Marc D. ; Kemp, Charles C.
    Biological organisms benefit from tactile sensing across the entire surfaces of their bodies. Robots may also be able to benefit from this type of sensing, but fully covering a robot with robust and capable tactile sensors entails numerous challenges. To date, most tactile sensors for robots have been used to cover rigid surfaces. In this paper, we focus on the challenge of tactile sensing across articulated joints, which requires sensing across a surface whose geometry varies over time. We first demonstrate the importance of sensing across joints by simulating a planar arm reaching in clutter and finding the frequency of contact at the joints. We then present a simple model of how much a tactile sensor would need to stretch in order to cover a 2 degree-of-freedom (DoF) wrist joint. Next, we describe and characterize a new tactile sensor made with stretchable fabrics. Finally, we present results for a stretchable sleeve with 25 tactile sensors that covers the forearm, 2 DoF wrist, and end effector of a humanoid robot. This sleeve enabled the robot to reach a target in instrumented clutter and reduce contact forces.
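An idealized back-of-the-envelope version of the stretch requirement, in the spirit of (but not necessarily identical to) the model in the paper: treat the joint as a bend over a cylinder of radius r, so fabric on the outside of a bend of angle θ must lengthen by roughly r·θ relative to the neutral axis. All numbers here are illustrative assumptions.

```python
import math

def required_stretch(r, theta, L0):
    """Idealized bend over a cylindrical joint of radius r [m]: the fabric on
    the outside of a bend of angle theta [rad] must lengthen by about r*theta
    relative to the neutral axis. Returns the fractional stretch over an
    initial covered length L0 [m]. For a 2-DoF wrist, each bend axis imposes
    its own requirement of this form on the sleeve."""
    return (r * theta) / L0
```

For example, an assumed 4 cm joint radius, a 90-degree bend, and 10 cm of fabric spanning the joint give a required stretch of roughly 63%, which motivates fabric-based sensors over stiff sensor arrays at articulated joints.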
  • Item
    Haptic Classification and Recognition of Objects Using a Tactile Sensing Forearm
    (Georgia Institute of Technology, 2012-10) Bhattacharjee, Tapomayukh ; Rehg, James M. ; Kemp, Charles C.
    In this paper, we demonstrate data-driven inference of mechanical properties of objects using a tactile sensor array (skin) covering a robot's forearm. We focus on the mobility (sliding vs. fixed), compliance (soft vs. hard), and identity of objects in the environment, as this information could be useful for efficient manipulation and search. By using the large surface area of the forearm, a robot could potentially search and map a cluttered volume more efficiently, and be informed by incidental contact during other manipulation tasks. Our approach tracks a contact region on the forearm over time in order to generate time series of select features, such as the maximum force, contact area, and contact motion. We then process and reduce the dimensionality of these time series to generate a feature vector to characterize the contact. Finally, we use the k-nearest neighbor algorithm (k-NN) to classify a new feature vector based on a set of previously collected feature vectors. Our results show a high cross-validation accuracy in both classification of mechanical properties and object recognition. In addition, we analyze the effect of taxel resolution, duration of observation, feature selection, and feature scaling on the classification accuracy.
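The feature-vector-plus-k-NN pipeline described above can be sketched compactly: reduce each contact's time series to a few summary features, scale the feature dimensions (an effect the paper analyzes), and classify new contacts by majority vote among the nearest training examples. The specific feature reductions and numbers below are illustrative, not the paper's exact choices.

```python
import math
from collections import Counter

def extract_features(forces, areas, motions):
    """Reduce per-contact time series to a small feature vector, e.g.
    maximum force, mean contact area, and total contact motion."""
    return [max(forces), sum(areas) / len(areas), sum(motions)]

def scale(vecs):
    """Min-max scale each feature dimension to [0, 1] so that no single
    feature (e.g. force in newtons vs. motion in meters) dominates the
    Euclidean distance. Returns scaled vectors plus per-dimension bounds."""
    dims = range(len(vecs[0]))
    lo = [min(v[d] for v in vecs) for d in dims]
    hi = [max(v[d] for v in vecs) for d in dims]
    span = [hi[d] - lo[d] or 1.0 for d in dims]   # guard constant dimensions
    scaled = [[(v[d] - lo[d]) / span[d] for d in dims] for v in vecs]
    return scaled, lo, hi

def knn(query, data, labels, k=3):
    """Classify by majority vote among the k nearest training vectors."""
    nearest = sorted(range(len(data)), key=lambda i: math.dist(query, data[i]))
    return Counter(labels[i] for i in nearest[:k]).most_common(1)[0][0]
```

A query near the "soft" training cluster is voted soft, and the same machinery handles both the property labels (sliding vs. fixed, soft vs. hard) and object identity, with accuracy then depending on taxel resolution, observation duration, and feature choice as analyzed in the paper.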