Person:
Rehg, James M.


Publication Search Results

Showing 5 publications
  • Item
    Inferring Object Properties from Incidental Contact with a Tactile-Sensing Forearm
    (Georgia Institute of Technology, 2014-09) Bhattacharjee, Tapomayukh ; Rehg, James M. ; Kemp, Charles C.
    Whole-arm tactile sensing enables a robot to sense properties of contact across its entire arm. By using this large sensing area, a robot has the potential to acquire useful information from incidental contact that occurs while performing a task. Within this paper, we demonstrate that data-driven methods can be used to infer mechanical properties of objects from incidental contact with a robot’s forearm. We collected data from a tactile-sensing forearm as it made contact with various objects during a simple reaching motion. We then used hidden Markov models (HMMs) to infer two object properties (rigid vs. soft and fixed vs. movable) based on low-dimensional features of time-varying tactile sensor data (maximum force, contact area, and contact motion). A key issue is the extent to which data-driven methods can generalize to robot actions that differ from those used during training. To investigate this issue, we developed an idealized mechanical model of a robot with a compliant joint making contact with an object. This model provides intuition for the classification problem. We also conducted tests in which we varied the robot arm’s velocity and joint stiffness. We found that, in contrast to our previous methods [1], multivariate HMMs achieved high cross-validation accuracy and successfully generalized what they had learned to new robot motions with distinct velocities and joint stiffnesses.
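The core classification step described in this abstract can be illustrated with a toy sketch: score a quantized tactile time series under one HMM per object class and pick the class whose model assigns the higher likelihood. The two-state models and observation symbols below are invented for illustration and are not the paper's actual parameters or features.

```python
import math

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the standard forward algorithm."""
    # Initialize with the first observation.
    alpha = [start[s] * emit[s][obs[0]] for s in range(len(start))]
    for o in obs[1:]:
        alpha = [
            sum(alpha[q] * trans[q][s] for q in range(len(alpha))) * emit[s][o]
            for s in range(len(alpha))
        ]
    return math.log(sum(alpha))

# Two toy 2-state HMMs over 2 observation symbols
# (e.g., 0 = low contact force, 1 = high contact force).
rigid_hmm = dict(start=[0.5, 0.5],
                 trans=[[0.9, 0.1], [0.1, 0.9]],
                 emit=[[0.2, 0.8], [0.1, 0.9]])   # tends to emit high force
soft_hmm = dict(start=[0.5, 0.5],
                trans=[[0.9, 0.1], [0.1, 0.9]],
                emit=[[0.8, 0.2], [0.9, 0.1]])    # tends to emit low force

def classify(obs):
    """Pick the class whose HMM best explains the observed sequence."""
    scores = {name: forward_log_likelihood(obs, **m)
              for name, m in [("rigid", rigid_hmm), ("soft", soft_hmm)]}
    return max(scores, key=scores.get)

print(classify([1, 1, 1, 0, 1]))  # mostly high force -> "rigid"
print(classify([0, 0, 0, 1, 0]))  # mostly low force  -> "soft"
```

The multivariate case in the paper extends this idea to vector-valued features (maximum force, contact area, contact motion) rather than a single discrete symbol per time step.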
  • Item
    Learning to Reach into the Unknown: Selecting Initial Conditions When Reaching in Clutter
    (Georgia Institute of Technology, 2014-09) Park, Daehyung ; Kapusta, Ariel ; Kim, You Keun ; Rehg, James M. ; Kemp, Charles C.
Often in highly cluttered environments, a robot can observe the exterior of the environment with ease, but cannot directly view nor easily infer its detailed internal structure (e.g., dense foliage or a full refrigerator shelf). We present a data-driven approach that greatly improves a robot’s success at reaching to a goal location in the unknown interior of an environment based on observable external properties, such as the category of the clutter and the locations of openings into the clutter (i.e., apertures). We focus on the problem of selecting a good initial configuration for a manipulator when reaching with a greedy controller. We use density estimation to model the probability of a successful reach given an initial condition and then perform constrained optimization to find an initial condition with the highest estimated probability of success. We evaluate our approach with two simulated robots reaching in clutter, and provide a demonstration with a real PR2 robot reaching to locations through random apertures. In our evaluations, our approach significantly outperformed two alternative approaches when making two consecutive reach attempts to goals in distinct categories of unknown clutter. Notably, our approach uses only sparse, readily-apparent features.
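The density-estimation-then-optimization pattern described above can be sketched in miniature: build a kernel density estimate over initial conditions that led to successful reaches, then search a constrained candidate set for the condition with the highest estimated density. The single scalar parameter, sample values, and bandwidth below are assumptions for the example, not the authors' actual formulation.

```python
import math

def gaussian_kde(samples, bandwidth):
    """Return a 1-D Gaussian kernel density estimate built from samples."""
    norm = 1.0 / (len(samples) * bandwidth * math.sqrt(2 * math.pi))
    def density(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return density

# Hypothetical initial-condition parameter (e.g., approach height in
# meters) recorded from successful reaches in training data.
successful_heights = [0.10, 0.12, 0.11, 0.13, 0.12, 0.30]
success_density = gaussian_kde(successful_heights, bandwidth=0.02)

# "Constrained optimization" done here by exhaustive search over the
# feasible range of candidate initial conditions (0.05 m .. 0.30 m).
feasible = [0.05 + 0.01 * i for i in range(26)]
best = max(feasible, key=success_density)
print(round(best, 2))  # densest cluster of successes is near 0.12 m
```

A real manipulator configuration is high-dimensional, so the paper's version estimates density over a multivariate initial condition and optimizes subject to the robot's kinematic constraints, but the selection principle is the same.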
  • Item
Rapid Categorization of Object Properties from Incidental Contact with a Tactile Sensing Robot Arm
    (Georgia Institute of Technology, 2013-10) Bhattacharjee, Tapomayukh ; Kapusta, Ariel ; Rehg, James M. ; Kemp, Charles C.
We demonstrate that data-driven methods can be used to rapidly categorize objects encountered through incidental contact on a robot arm. Allowing incidental contact with surrounding objects has benefits during manipulation, such as increasing the workspace during reaching tasks. The information obtained from such contact, if available online, can potentially be used to map the environment and help in manipulation tasks. In this paper, we address this problem of online categorization using incidental contact during goal-oriented motion. In cluttered environments, the detailed internal structure of clutter can be difficult to infer, but the environment type is often apparent. In a randomized cluttered environment of known object types and “outliers”, our approach uses hidden Markov models to capture the dynamic robot-environment interactions and to categorize objects based on the interactions. We combined leaf and trunk objects to create artificial foliage as a test environment. We collected data using a skin sensor on the robot’s forearm while it reached into clutter. Our algorithm classifies objects rapidly, with low computation time and few data samples. Using a taxel-by-taxel classification approach, we can successfully categorize simultaneous contacts with multiple objects and can also identify outlier objects in the environment based on the prior associated with an object’s likelihood in the given environment.
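The taxel-by-taxel idea with outlier rejection can be sketched as follows: each tactile element (taxel) in contact is scored independently against per-category models, and a contact whose best score falls below a likelihood threshold is flagged as an outlier. The scores here come from a toy Gaussian model over a single force feature rather than the paper's HMMs, and the category parameters and threshold are invented for illustration.

```python
import math

def log_score(x, mean, std):
    """Log-likelihood of a scalar taxel feature under a Gaussian model."""
    return (-0.5 * ((x - mean) / std) ** 2
            - math.log(std * math.sqrt(2 * math.pi)))

# Hypothetical per-category models: (mean normalized force, std dev).
models = {"leaf": (0.2, 0.1), "trunk": (0.8, 0.1)}
OUTLIER_THRESHOLD = -3.0  # reject contacts no model explains well

def classify_taxel(force):
    """Assign a taxel's contact to the best-scoring category, or flag
    it as an outlier when every category scores below the threshold."""
    scores = {cat: log_score(force, m, s) for cat, (m, s) in models.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > OUTLIER_THRESHOLD else "outlier"

# Three simultaneous contacts on different taxels.
print([classify_taxel(f) for f in (0.22, 0.79, 0.5)])
```

Because each taxel is classified independently, simultaneous contacts with a leaf, a trunk, and an unexpected object can receive different labels in the same reach, which mirrors the multi-contact behavior the abstract describes.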
  • Item
    Haptic Classification and Recognition of Objects Using a Tactile Sensing Forearm
    (Georgia Institute of Technology, 2012-10) Bhattacharjee, Tapomayukh ; Rehg, James M. ; Kemp, Charles C.
    In this paper, we demonstrate data-driven inference of mechanical properties of objects using a tactile sensor array (skin) covering a robot's forearm. We focus on the mobility (sliding vs. fixed), compliance (soft vs. hard), and identity of objects in the environment, as this information could be useful for efficient manipulation and search. By using the large surface area of the forearm, a robot could potentially search and map a cluttered volume more efficiently, and be informed by incidental contact during other manipulation tasks. Our approach tracks a contact region on the forearm over time in order to generate time series of select features, such as the maximum force, contact area, and contact motion. We then process and reduce the dimensionality of these time series to generate a feature vector to characterize the contact. Finally, we use the k-nearest neighbor algorithm (k-NN) to classify a new feature vector based on a set of previously collected feature vectors. Our results show a high cross-validation accuracy in both classification of mechanical properties and object recognition. In addition, we analyze the effect of taxel resolution, duration of observation, feature selection, and feature scaling on the classification accuracy.
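The final classification step this abstract describes, k-NN over per-contact feature vectors, is simple enough to sketch directly: summarize each contact as a feature vector and label a new contact by majority vote among its k nearest training vectors. The feature values and labels below are invented placeholders, not the paper's data.

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """k-nearest-neighbor vote. train: list of (feature_vector, label)."""
    nearest = sorted(train, key=lambda fv: math.dist(fv[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Toy features: (max force, contact area), pre-scaled to similar ranges,
# echoing the paper's note that feature scaling affects accuracy.
train = [((0.90, 0.20), "hard"), ((0.80, 0.30), "hard"), ((0.85, 0.25), "hard"),
         ((0.20, 0.80), "soft"), ((0.30, 0.70), "soft"), ((0.25, 0.75), "soft")]

print(knn_classify(train, (0.82, 0.28)))  # near the "hard" cluster
print(knn_classify(train, (0.28, 0.72)))  # near the "soft" cluster
```

In the paper, the feature vectors are dimensionality-reduced summaries of time series (maximum force, contact area, contact motion); the same k-NN machinery then serves both property classification and object recognition.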
  • Item
    Perceiving Clutter and Surfaces for Object Placement in Indoor Environments
    (Georgia Institute of Technology, 2010-12) Schuster, Martin J. ; Okerman, Jason ; Nguyen, Hai ; Rehg, James M. ; Kemp, Charles C.
Handheld manipulable objects can often be found on flat surfaces within human environments. Researchers have previously demonstrated that perceptually segmenting a flat surface from the objects resting on it can enable robots to pick and place objects. However, methods for performing this segmentation can fail when applied to scenes with natural clutter. For example, low-profile objects and dense clutter that obscures the underlying surface can complicate the interpretation of the scene. As a first step towards characterizing the statistics of real-world clutter in human environments, we have collected and hand-labeled 104 scans of cluttered tables using a tilting laser range finder (LIDAR) and a camera. Within this paper, we describe our method of data collection, present notable statistics from the dataset, and introduce a perceptual algorithm that uses machine learning to discriminate surface from clutter. We also present a method that enables a humanoid robot to place objects on uncluttered parts of flat surfaces using this perceptual algorithm. In cross-validation tests, the perceptual algorithm achieved a correct classification rate of 78.70% for surface and 90.66% for clutter, and outperformed our previously published algorithm. Our humanoid robot succeeded in 16 out of 20 object-placing trials on 9 different unaltered tables, and performed successfully in several high-clutter situations. Three of the four failures resulted from placing objects too close to the edge of the table.
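A drastically simplified sketch of the surface-vs-clutter discrimination task: estimate the tabletop height from the scan, then label each point by its height above that plane. The real method learns a classifier over richer features (which is exactly why it can handle low-profile objects better than a threshold rule), so the height rule and the scan points below are illustrative assumptions only.

```python
def classify_points(points, surface_height, margin=0.01):
    """Label each (x, y, z) point as 'surface' or 'clutter' by whether
    its height lies within `margin` meters of the estimated tabletop."""
    return ["surface" if abs(z - surface_height) <= margin else "clutter"
            for (_, _, z) in points]

# Toy scan: a flat table at about z = 0.70 m with one object on it.
scan = [(0.10, 0.20, 0.700), (0.30, 0.10, 0.702),   # bare tabletop
        (0.20, 0.20, 0.780), (0.20, 0.25, 0.760)]   # object points
surface_height = min(z for _, _, z in scan)  # lowest return ~ tabletop
print(classify_points(scan, surface_height))
```

A pure height threshold misclassifies low-profile objects whose tops sit within the margin, which is one of the failure modes the abstract highlights as motivation for the learned approach.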