Person:
Kemp, Charles C.

Publication Search Results

Now showing 1 - 3 of 3
  • Item
    3D Human Pose Estimation on a Configurable Bed from a Pressure Image
    (2018) Clever, Henry M.; Kapusta, Ariel; Park, Daehyung; Erickson, Zackory; Chitalia, Yash; Kemp, Charles C.
    Robots have the potential to assist people in bed, such as in healthcare settings, yet bedding materials like sheets and blankets can make observation of the human body difficult for robots. A pressure-sensing mat on a bed can provide pressure images that are relatively insensitive to bedding materials. However, prior work on estimating human pose from pressure images has been restricted to 2D pose estimates and flat beds. In this work, we present two convolutional neural networks to estimate the 3D joint positions of a person in a configurable bed from a single pressure image. The first network directly outputs 3D joint positions, while the second outputs a kinematic model that includes estimated joint angles and limb lengths. We evaluated our networks on data from 17 human participants with two bed configurations: supine and seated. Our networks achieved a mean joint position error of 77 mm when tested with data from people outside the training set, outperforming several baselines. We also present a simple mechanical model that provides insight into ambiguity associated with limbs raised off of the pressure mat, and demonstrate that Monte Carlo dropout can be used to estimate pose confidence in these situations. Finally, we provide a demonstration in which a mobile manipulator uses our network’s estimated kinematic model to reach a location on a person’s body in spite of the person being seated in a bed and covered by a blanket. (A sketch of Monte Carlo dropout for pose uncertainty appears after this list.)
  • Item
    A Model that Predicts the Material Recognition Performance of Thermal Tactile Sensing
    (2016) Bhattacharjee, Tapomayukh; Bai, Haoping; Chen, Haofeng; Kemp, Charles C.
    Tactile sensing can enable a robot to infer properties of its surroundings, such as the material of an object. Heat-transfer-based sensing can be used for material recognition due to differences in the thermal properties of materials. While data-driven methods have shown promise for this recognition problem, many factors can influence performance, including sensor noise, the initial temperatures of the sensor and the object, the thermal effusivities of the materials, and the duration of contact. We present a physics-based mathematical model that predicts material recognition performance given these factors. Our model uses semi-infinite solids and a statistical method to calculate an F1 score for binary material recognition. We evaluated our method using simulated contact with 69 materials and data collected by a real robot with 12 materials. Our model predicted the material recognition performance of a support vector machine (SVM) with 96% accuracy for the simulated data, with 92% accuracy for real-world data with constant initial sensor temperatures, and with 91% accuracy for real-world data with varied initial sensor temperatures. Using our model, we also provide insight into the influence of various factors on recognition performance, such as the temperature difference between the sensor and the object. Overall, our results suggest that our model could be used to help design better thermal sensors for robots and enable robots to use them more effectively. (A sketch of the semi-infinite-solid contact-temperature relation appears after this list.)
  • Item
    Inferring Object Properties from Incidental Contact with a Tactile-Sensing Forearm
    (Georgia Institute of Technology, 2014-09) Bhattacharjee, Tapomayukh; Rehg, James M.; Kemp, Charles C.
    Whole-arm tactile sensing enables a robot to sense properties of contact across its entire arm. By using this large sensing area, a robot has the potential to acquire useful information from incidental contact that occurs while performing a task. Within this paper, we demonstrate that data-driven methods can be used to infer mechanical properties of objects from incidental contact with a robot’s forearm. We collected data from a tactile-sensing forearm as it made contact with various objects during a simple reaching motion. We then used hidden Markov models (HMMs) to infer two object properties (rigid vs. soft and fixed vs. movable) based on low-dimensional features of time-varying tactile sensor data (maximum force, contact area, and contact motion). A key issue is the extent to which data-driven methods can generalize to robot actions that differ from those used during training. To investigate this issue, we developed an idealized mechanical model of a robot with a compliant joint making contact with an object. This model provides intuition for the classification problem. We also conducted tests in which we varied the robot arm’s velocity and joint stiffness. We found that, in contrast to our previous methods [1], multivariate HMMs achieved high cross-validation accuracy and successfully generalized what they had learned to new robot motions with distinct velocities and joint stiffnesses. (A sketch of per-class HMM classification appears after this list.)
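
For the first item above (3D pose from a pressure image), the abstract mentions Monte Carlo dropout as a way to estimate pose confidence when limbs are raised off the mat. The following is a minimal sketch of that general technique, not the authors' network: a placeholder CNN with a dropout layer is kept in stochastic mode at inference time, and the spread across repeated forward passes serves as an uncertainty estimate. The architecture, layer sizes, input resolution, and joint count are all assumptions for illustration.

```python
# Minimal sketch of Monte Carlo dropout for uncertainty on regressed 3D joint positions.
# The tiny CNN below is a placeholder, not the network described in the paper.
import torch
import torch.nn as nn

class TinyPoseCNN(nn.Module):
    def __init__(self, n_joints=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p=0.2),                     # kept stochastic at test time for MC dropout
            nn.Linear(32 * 4 * 4, 3 * n_joints),   # (x, y, z) per joint
        )

    def forward(self, x):
        return self.head(self.features(x))

def mc_dropout_pose(model, pressure_image, n_samples=25):
    """Run repeated stochastic forward passes; return mean pose and per-output spread."""
    model.train()  # keep dropout active during inference
    with torch.no_grad():
        samples = torch.stack([model(pressure_image) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)

# Example with a fake single-channel pressure image (assumed 64x27 taxel resolution).
model = TinyPoseCNN()
mean_pose, pose_std = mc_dropout_pose(model, torch.rand(1, 1, 64, 27))
print(mean_pose.shape, pose_std.shape)  # torch.Size([1, 30]) torch.Size([1, 30])
```

Calling model.train() before the stochastic passes is what keeps dropout active; in a fuller pipeline one would typically switch only the dropout modules to training mode so that any batch-norm statistics stay frozen.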
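
For the second item (thermal tactile sensing), the model rests on heat conduction between semi-infinite solids, where each material's thermal effusivity e = sqrt(k·ρ·c_p) sets the interface temperature reached when a warm sensor touches a cooler object. The sketch below evaluates that standard contact-temperature relation for a few example materials to show why effusivity differences make materials distinguishable; the property values are rough textbook figures, and the paper's statistical step that predicts an F1 score is not reproduced here.

```python
# Contact temperature between two semi-infinite solids:
#   T_contact = (e_sensor * T_sensor + e_object * T_object) / (e_sensor + e_object)
# with effusivity e = sqrt(thermal_conductivity * density * specific_heat).
import math

def effusivity(k, rho, cp):
    """Thermal effusivity in W*s^0.5 / (m^2*K)."""
    return math.sqrt(k * rho * cp)

def contact_temperature(e_sensor, t_sensor, e_object, t_object):
    """Interface temperature when two semi-infinite solids come into contact."""
    return (e_sensor * t_sensor + e_object * t_object) / (e_sensor + e_object)

# Approximate room-temperature properties: (conductivity W/m/K, density kg/m^3, specific heat J/kg/K).
materials = {
    "wood":     (0.15, 700.0, 1700.0),
    "glass":    (1.0, 2500.0, 840.0),
    "aluminum": (205.0, 2700.0, 900.0),
}

# A heated, rubber-like sensor skin (assumed properties) at 35 C touching 25 C objects.
e_sensor = effusivity(0.5, 1100.0, 1700.0)
for name, (k, rho, cp) in materials.items():
    e_obj = effusivity(k, rho, cp)
    t_c = contact_temperature(e_sensor, 35.0, e_obj, 25.0)
    print(f"{name:9s} effusivity={e_obj:8.0f}  contact T ~ {t_c:5.2f} C")
```

With these rough values, wood, glass, and aluminum settle at noticeably different contact temperatures against the same warm sensor, which is the separation a recognition method can exploit; materials with similar effusivities would be correspondingly harder to tell apart.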
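
For the third item (incidental contact with a tactile-sensing forearm), classification can be framed as one hidden Markov model per object class over multivariate tactile features, picking the class whose HMM assigns the highest likelihood to an observed sequence. The sketch below shows that generic per-class Gaussian HMM pattern using the hmmlearn package on synthetic sequences; the feature statistics, sequence lengths, number of hidden states, and class labels are placeholders rather than the paper's actual setup.

```python
# Per-class Gaussian HMM classification over multivariate feature sequences.
# Synthetic data stands in for (max force, contact area, contact motion) time series.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

def make_sequences(offset, n_seq=20, length=30, n_features=3):
    """Generate toy feature sequences for one class, shifted by `offset`."""
    return [offset + rng.normal(size=(length, n_features)) for _ in range(n_seq)]

def fit_class_hmm(sequences, n_states=3):
    """Fit one HMM to all training sequences belonging to a single class."""
    X = np.concatenate(sequences)
    lengths = [len(s) for s in sequences]
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def classify(models, sequence):
    """Pick the class whose HMM gives the observed sequence the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(sequence))

train = {"rigid-fixed": make_sequences(0.0), "soft-movable": make_sequences(2.0)}
models = {label: fit_class_hmm(seqs) for label, seqs in train.items()}

test_seq = 2.0 + rng.normal(size=(30, 3))  # should resemble the "soft-movable" class
print(classify(models, test_seq))
```

Fitting one HMM per class and scoring by log-likelihood is the usual generative approach to sequence classification; with real tactile data the sequences would be the per-timestep features described in the abstract rather than Gaussian noise.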