Organizational Unit: Healthcare Robotics Lab

Publication Search Results

Now showing 1 - 10 of 42
  • Item
    A Robotic System for Reaching in Dense Clutter that Integrates Model Predictive Control, Learning, Haptic Mapping, and Planning
    (Georgia Institute of Technology, 2014-09) Bhattacharjee, Tapomayukh ; Grice, Phillip M. ; Kapusta, Ariel ; Killpack, Marc D. ; Park, Daehyung ; Kemp, Charles C.
    We present a system that enables a robot to reach locations in dense clutter using only haptic sensing. Our system integrates model predictive control [1], learned initial conditions [2], tactile recognition of object types [3], haptic mapping, and geometric planning to efficiently reach locations using whole-arm tactile sensing [4]. We motivate our work, present a system architecture, summarize each component of the system, and present results from our evaluation of the system reaching to target locations in dense artificial foliage.
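    The sketch below (illustrative only) shows one way such a system could be orchestrated as a single haptic reaching loop; the component functions passed in are hypothetical stand-ins for the paper's modules, not its implementation.

    ```python
    # Illustrative control loop for haptic reaching in clutter.  All component
    # functions are hypothetical stubs standing in for the paper's modules
    # (learned initial conditions, MPC, tactile recognition, haptic mapping,
    # geometric planning).

    def reach_with_haptics(goal, select_initial_config, mpc_step,
                           classify_contacts, update_haptic_map, plan_to_goal,
                           max_steps=500):
        """Drive the arm toward `goal` using only haptic feedback."""
        q = select_initial_config(goal)          # learned initial condition
        haptic_map = {}                          # voxel -> estimated object type
        for _ in range(max_steps):
            q, contacts, at_goal = mpc_step(q, goal)   # contact-regulating MPC
            labels = classify_contacts(contacts)       # tactile recognition
            update_haptic_map(haptic_map, contacts, labels)
            if at_goal:
                return True, haptic_map
            path = plan_to_goal(q, goal, haptic_map)   # geometric planning
            if path is None:                     # mapped obstacles block the goal
                return False, haptic_map
        return False, haptic_map
    ```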
  • Item
    Learning to Reach into the Unknown: Selecting Initial Conditions When Reaching in Clutter
    (Georgia Institute of Technology, 2014-09) Park, Daehyung ; Kapusta, Ariel ; Kim, You Keun ; Rehg, James M. ; Kemp, Charles C.
    Often in highly-cluttered environments, a robot can observe the exterior of the environment with ease, but cannot directly view nor easily infer its detailed internal structure (e.g., dense foliage or a full refrigerator shelf). We present a data-driven approach that greatly improves a robot’s success at reaching to a goal location in the unknown interior of an environment based on observable external properties, such as the category of the clutter and the locations of openings into the clutter (i.e., apertures). We focus on the problem of selecting a good initial configuration for a manipulator when reaching with a greedy controller. We use density estimation to model the probability of a successful reach given an initial condition and then perform constrained optimization to find an initial condition with the highest estimated probability of success. We evaluate our approach with two simulated robots reaching in clutter, and provide a demonstration with a real PR2 robot reaching to locations through random apertures. In our evaluations, our approach significantly outperformed two alternative approaches when making two consecutive reach attempts to goals in distinct categories of unknown clutter. Notably, our approach only uses sparse readily-apparent features.
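    As an illustration of the select-then-reach idea, the sketch below fits a kernel density estimate to previously successful initial conditions and then runs a bounded optimization over it. The feature ranges, data, and use of scipy's gaussian_kde are assumptions for the example, and modeling the density of successes is a simplification of the paper's conditional success probability.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde
    from scipy.optimize import minimize

    # Hypothetical training data: initial conditions (e.g., base pose, approach
    # angle) recorded from reach attempts that succeeded in similar clutter.
    successful_ics = np.random.rand(200, 3)          # placeholder for logged data

    # Kernel density estimate over success-producing initial conditions.
    kde = gaussian_kde(successful_ics.T)

    def neg_density(x):
        """Objective: minimize the negative estimated density."""
        return -kde(x.reshape(-1, 1))[0]

    # Constrained search over the feasible range of each initial-condition variable.
    bounds = [(0.0, 1.0)] * 3
    result = minimize(neg_density, x0=successful_ics.mean(axis=0),
                      bounds=bounds, method="L-BFGS-B")
    best_initial_condition = result.x
    ```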
  • Item
    Finding and Navigating to Household Objects with UHF RFID Tags by Optimizing RF Signal Strength
    (Georgia Institute of Technology, 2014-09) Deyle, Travis ; Reynolds, Matthew S. ; Kemp, Charles C.
    We address the challenge of finding and navigating to an object with an attached ultra-high frequency radio-frequency identification (UHF RFID) tag. With current off-the-shelf technology, one can affix inexpensive self-adhesive UHF RFID tags to hundreds of objects, thereby enabling a robot to sense the RF signal strength it receives from each uniquely identified object. The received signal strength indicator (RSSI) associated with a tagged object varies widely and depends on many factors, including the object’s pose, material properties and surroundings. This complexity creates challenges for methods that attempt to explicitly estimate the object’s pose. We present an alternative approach that formulates finding and navigating to a tagged object as an optimization problem where the robot must find a pose of a directional antenna that maximizes the RSSI associated with the target tag. We then present three autonomous robot behaviors that together perform this optimization by combining global and local search. The first behavior uses sparse sampling of RSSI across the entire environment to move the robot to a location near the tag; the second samples RSSI over orientation to point the robot toward the tag; and the third samples RSSI from two antennas pointing in different directions to enable the robot to approach the tag. We justify our formulation using the radar equation and associated literature. We also demonstrate that it has good performance in practice via tests with a PR2 robot from Willow Garage in a house with a variety of tagged household objects.
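    A rough sketch of the coarse-to-fine RSSI optimization described above appears below; read_rssi_at, read_rssi_pair, rotate_to, and drive_forward are hypothetical robot interfaces, and the step sizes are arbitrary.

    ```python
    # Illustrative coarse-to-fine search that treats navigation to a tagged
    # object as maximizing RSSI.  All robot interfaces here are hypothetical.

    def find_tagged_object(tag_id, candidate_poses, read_rssi_at,
                           rotate_to, drive_forward, read_rssi_pair,
                           approach_steps=20):
        # 1) Global search: sparsely sample RSSI over the environment and move
        #    to the candidate pose with the strongest reading.
        best_pose = max(candidate_poses, key=lambda p: read_rssi_at(p, tag_id))

        # 2) Orientation search: sweep the directional antenna and face the
        #    heading that maximizes RSSI.
        headings = range(0, 360, 15)
        heading = max(headings,
                      key=lambda h: read_rssi_at(best_pose, tag_id, heading=h))
        rotate_to(heading)

        # 3) Local approach: compare two antennas pointing left and right and
        #    steer toward the stronger signal while moving forward.
        for _ in range(approach_steps):
            left, right = read_rssi_pair(tag_id)
            heading += 5 if right > left else -5
            rotate_to(heading)
            drive_forward(0.1)
    ```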
  • Item
    Interleaving Planning and Control for Efficient Haptically-guided Reaching in Unknown Environments
    (Georgia Institute of Technology, 2014) Park, Daehyung ; Kapusta, Ariel ; Hawke, Jeffrey ; Kemp, Charles C.
    We present a new method for reaching in an initially unknown environment with only haptic sensing. In this paper, we propose a haptically-guided interleaving planning and control (HIPC) method with a haptic mapping framework. HIPC runs two planning methods, interleaving a task-space and a joint-space planner, to provide fast reaching performance. It continually replans a valid trajectory, alternating between planners and quickly reflecting collected tactile information from an unknown environment. One key idea is that tactile sensing can be used to directly map an immediate cause of interference when reaching. The mapping framework efficiently assigns raw tactile information from whole-arm tactile sensors into a 3D voxel-based collision map. Our method uses a previously published contact-regulating controller based on model predictive control (MPC). In our evaluation with a physics simulation of a humanoid robot, interleaving was superior at reaching in the 9 types of environments we used.
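    The sketch below illustrates the voxel-mapping idea in isolation: quantize whole-arm contact points into a sparse occupancy set that planners can query. The 2 cm resolution and the interface are assumptions, not the paper's implementation.

    ```python
    import numpy as np

    # Illustrative voxel map built from whole-arm tactile contacts.  Contact
    # points are assumed to already be expressed in the world frame.

    VOXEL_SIZE = 0.02  # meters per voxel edge (assumed resolution)

    def to_voxel(point):
        """Quantize a 3D contact point to integer voxel coordinates."""
        return tuple(np.floor(np.asarray(point) / VOXEL_SIZE).astype(int))

    class HapticCollisionMap:
        def __init__(self):
            self.occupied = set()          # voxels with observed contact

        def add_contacts(self, contact_points):
            for p in contact_points:
                self.occupied.add(to_voxel(p))

        def in_collision(self, point):
            """Query used by task-space or joint-space planners."""
            return to_voxel(point) in self.occupied

    # Example: a taxel reports contact at (0.431, 0.101, 0.871) m.
    m = HapticCollisionMap()
    m.add_contacts([(0.431, 0.101, 0.871)])
    print(m.in_collision((0.435, 0.105, 0.875)))   # True: same 2 cm voxel
    ```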
  • Item
    Rapid Categorization of Object Properties from Incidental Contact with a Tactile Sensing Robot Arm
    (Georgia Institute of Technology, 2013-10) Bhattacharjee, Tapomayukh ; Kapusta, Ariel ; Rehg, James M. ; Kemp, Charles C.
    We demonstrate that data-driven methods can be used to rapidly categorize objects encountered through incidental contact on a robot arm. Allowing incidental contact with surrounding objects has benefits during manipulation such as increasing the workspace during reaching tasks. The information obtained from such contact, if available online, can potentially be used to map the environment and help in manipulation tasks. In this paper, we address this problem of online categorization using incidental contact during goal-oriented motion. In cluttered environments, the detailed internal structure of clutter can be difficult to infer, but the environment type is often apparent. In a randomized cluttered environment of known object types and “outliers”, our approach uses Hidden Markov Models to capture the dynamic robot-environment interactions and to categorize objects based on the interactions. We combined leaf and trunk objects to create artificial foliage as a test environment. We collected data using a skin sensor on the robot’s forearm while it reached into clutter. Our algorithm classifies the objects rapidly with low computation time and few data samples. Using a taxel-by-taxel classification approach, we can successfully categorize simultaneous contacts with multiple objects and can also identify outlier objects in the environment based on the prior associated with an object’s likelihood in the given environment.
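    To make the taxel-by-taxel idea concrete, the sketch below scores a short quantized contact sequence under one hand-rolled discrete HMM per category and labels a taxel as an outlier when no model explains it well. All parameters are toy placeholders, and the fixed threshold simplifies the prior-based outlier test described in the abstract.

    ```python
    import numpy as np

    # Illustrative taxel-by-taxel HMM classifier.  Each object category (e.g.,
    # "leaf", "trunk") gets an HMM over quantized contact-force observations;
    # a taxel's sequence is assigned to the category with the highest
    # log-likelihood.  All parameters are placeholders, not learned values.

    def log_forward(obs, log_pi, log_A, log_B):
        """Log-likelihood of a discrete observation sequence under one HMM."""
        alpha = log_pi + log_B[:, obs[0]]
        for o in obs[1:]:
            alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, o]
        return np.logaddexp.reduce(alpha)

    def classify_taxel(obs, models, outlier_threshold=-50.0):
        """Return the best category, or 'outlier' if no model explains the data."""
        scores = {name: log_forward(obs, *params) for name, params in models.items()}
        best = max(scores, key=scores.get)
        return best if scores[best] > outlier_threshold else "outlier"

    # Two toy 2-state HMMs over 3 quantized force levels.
    def make_model(emission):
        pi = np.log(np.array([0.6, 0.4]))
        A = np.log(np.array([[0.8, 0.2], [0.3, 0.7]]))
        B = np.log(np.asarray(emission))
        return pi, A, B

    models = {"leaf":  make_model([[0.7, 0.2, 0.1], [0.5, 0.4, 0.1]]),
              "trunk": make_model([[0.1, 0.3, 0.6], [0.1, 0.2, 0.7]])}
    print(classify_taxel([2, 2, 1, 2, 2], models))   # prints "trunk"
    ```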
  • Item
    Fast Reaching in Clutter While Regulating Forces Using Model Predictive Control
    (Georgia Institute of Technology, 2013-10) Killpack, Marc D. ; Kemp, Charles C.
    Moving a robot arm quickly in cluttered and unmodeled workspaces can be difficult because of the inherent risk of high impact forces. Additionally, compliance by itself is not enough to limit contact forces due to multi-contact phenomena (jamming, etc.). The work in this paper extends our previous research on manipulation in cluttered environments by explicitly modeling robot arm dynamics and using model predictive control (MPC) with whole-arm tactile sensing to improve the speed and force control. We first derive discrete-time dynamic equations of motion that we use for MPC. Then we formulate a multi-time step model predictive controller that uses this dynamic model. These changes allow us to control contact forces while increasing overall end effector speed. We also describe a constraint that regulates joint velocities in order to mitigate unexpected impact forces while reaching to a goal. We present results using tests from a simulated three-link planar arm that is representative of the kinematics and mass of an average male’s torso, shoulder and elbow joints reaching in high and low clutter scenarios. These results show that our controller allows the arm to reach a goal up to twice as fast as our previous work, while still controlling the contact forces to be near a user-defined threshold.
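    The sketch below is a generic multi-step MPC over a linear discrete-time model with a joint-velocity penalty, solved by nonlinear optimization; the dynamics, weights, and absence of an explicit contact-force model are simplifications relative to the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Illustrative multi-step MPC for a linear model x_{k+1} = A x_k + B u_k.
    # The state stacks joint angles and velocities; A, B, and the weights below
    # are placeholders, not the paper's identified arm dynamics.

    n_x, n_u, horizon, dt = 4, 2, 10, 0.02
    A = np.eye(n_x) + dt * np.eye(n_x, k=2)          # double-integrator-like model
    B = np.vstack([np.zeros((2, 2)), dt * np.eye(2)])
    x_goal = np.array([0.8, 0.4, 0.0, 0.0])
    vel_limit = 1.0                                  # rad/s cap to soften impacts

    def rollout(x0, u_seq):
        xs, x = [], x0
        for u in u_seq.reshape(horizon, n_u):
            x = A @ x + B @ u
            xs.append(x)
        return np.array(xs)

    def cost(u_seq, x0):
        xs = rollout(x0, u_seq)
        goal_err = np.sum((xs[:, :2] - x_goal[:2]) ** 2)          # track goal angles
        vel_pen = np.sum(np.maximum(np.abs(xs[:, 2:]) - vel_limit, 0.0) ** 2)
        effort = 1e-3 * np.sum(u_seq ** 2)
        return goal_err + 10.0 * vel_pen + effort

    x0 = np.zeros(n_x)
    res = minimize(cost, np.zeros(horizon * n_u), args=(x0,), method="L-BFGS-B")
    u_now = res.x[:n_u]        # apply only the first control, then re-plan
    ```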
  • Item
    Whole-arm Tactile Sensing for Beneficial and Acceptable Contact During Robotic Assistance
    (Georgia Institute of Technology, 2013-06) Grice, Phillip M. ; Killpack, Marc D. ; Jain, Advait ; Vaish, Sarvagya ; Hawke, Jeffrey ; Kemp, Charles C.
    Many assistive tasks involve manipulation near the care-receiver's body, including self-care tasks such as dressing, feeding, and personal hygiene. A robot can provide assistance with these tasks by moving its end effector to poses near the care-receiver's body. However, perceiving and maneuvering around the care-receiver's body can be challenging due to a variety of issues, including convoluted geometry, compliant materials, body motion, hidden surfaces, and the object upon which the body is resting (e.g., a wheelchair or bed). Using geometric simulations, we first show that an assistive robot can achieve a much larger percentage of end-effector poses near the care-receiver's body if its arm is allowed to make contact. Second, we present a novel system with a custom controller and whole-arm tactile sensor array that enables a Willow Garage PR2 to regulate contact forces across its entire arm while moving its end effector to a commanded pose. We then describe tests with two people with motor impairments, one of whom used the system to grasp and pull a blanket over himself and to grab a cloth and wipe his face, all while in bed at his home. Finally, we describe a study with eight able-bodied users in which they used the system to place objects near their bodies. On average, users perceived the system to be safe and comfortable, even though substantial contact occurred between the robot's arm and the user's body.
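    As a toy illustration of regulating whole-arm contact forces, the sketch below scales back a commanded end-effector step and retreats slightly when any taxel exceeds a force threshold; the threshold, gain, and interface are assumptions, and the paper's actual controller is MPC-based.

    ```python
    import numpy as np

    # Toy whole-arm force regulation: shrink the commanded end-effector step
    # when any taxel exceeds a force threshold, and nudge away from the
    # strongest contact.  Simplified stand-in, not the paper's controller.

    FORCE_THRESHOLD = 5.0   # newtons (assumed comfort/safety limit)

    def regulate_step(desired_step, taxel_forces, contact_normals, gain=0.01):
        """desired_step: 3-vector toward the commanded pose.
        taxel_forces: per-taxel force magnitudes (N).
        contact_normals: per-taxel unit normals pointing away from the contact."""
        excess = np.maximum(taxel_forces - FORCE_THRESHOLD, 0.0)
        if excess.max() == 0.0:
            return desired_step                     # no contact above threshold
        worst = int(np.argmax(excess))
        scale = FORCE_THRESHOLD / taxel_forces[worst]
        retreat = gain * excess[worst] * contact_normals[worst]
        return scale * desired_step + retreat       # slow down and back off

    step = regulate_step(np.array([0.0, 0.02, 0.0]),
                         np.array([1.2, 7.5, 0.3]),
                         np.array([[0, -1.0, 0], [0, -1.0, 0], [0, -1.0, 0]]))
    ```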
  • Item
    In-Hand Radio Frequency Identification (RFID) for Robotic Manipulation
    (Georgia Institute of Technology, 2013-05) Deyle, Travis ; Tralie, Christopher J. ; Reynolds, Matthew S. ; Kemp, Charles C.
    We present a unique multi-antenna RFID reader (a sensor) embedded in a robot's manipulator that is designed to operate with ordinary UHF RFID tags in a short-range, near-field electromagnetic regime. Using specially designed near-field antennas enables our sensor to obtain spatial information from tags at ranges of less than 1 meter. In this work, we characterize the near-field sensor's ability to detect tagged objects in the robot's manipulator, present robot behaviors to determine the identity of a grasped object, and investigate how additional RF signal properties can be used for “pre-touch” capabilities such as servoing to grasp an object. The future combination of long-range (far-field) and short-range (near-field) UHF RFID sensing has the potential to enable roboticists to jump-start applications by obviating or supplementing false-positive-prone visual object recognition. These techniques may be especially useful in the healthcare and service sectors, where mis-identification of an object (for example, a medication bottle) could have catastrophic consequences.
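    One simple way to decide which tagged object is actually in the hand is to keep the tag that dominates repeated in-hand near-field reads, as sketched below; read_tags_once and the thresholds are hypothetical.

    ```python
    from collections import Counter

    # Illustrative grasped-object identification from in-hand near-field reads.
    # read_tags_once() is a hypothetical interface returning the tag IDs seen
    # in one inventory cycle of the in-manipulator antennas.

    def identify_grasped_object(read_tags_once, cycles=50, min_fraction=0.6):
        counts = Counter()
        for _ in range(cycles):
            counts.update(read_tags_once())
        if not counts:
            return None
        tag, n = counts.most_common(1)[0]
        # Require the dominant tag to appear in most cycles; otherwise the
        # read is ambiguous (e.g., a nearby but ungrasped object).
        return tag if n >= min_fraction * cycles else None
    ```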
  • Item
    ROS Commander (ROSCo): Behavior Creation for Home Robots
    (Georgia Institute of Technology, 2013-05) Nguyen, Hai ; Ciocarlie, Matei ; Hsiao, Kaijen ; Kemp, Charles C.
    We introduce ROS Commander (ROSCo), an open source system that enables expert users to construct, share, and deploy robot behaviors for home robots. A user builds a behavior in the form of a Hierarchical Finite State Machine (HFSM) out of generic, parameterized building blocks, with a real robot in the develop and test loop. Once constructed, users save behaviors in an open format for direct use with robots, or for use as parts of new behaviors. When the system is deployed, a user can show the robot where to apply behaviors relative to fiducial markers (AR Tags), which allows the robot to quickly become operational in a new environment. We show evidence that the underlying state machine representation and current building blocks are capable of spanning a variety of desirable behaviors for home robots, such as opening a refrigerator door with two arms, flipping a light switch, unlocking a door, and handing an object to someone. Our experiments show that sensor-driven behaviors constructed with ROSCo can be executed in realistic home environments with success rates between 80% and 100%. We conclude by describing a test in the home of a person with quadriplegia, in which the person was able to automate parts of his home using previously-built behaviors.
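    The sketch below is a minimal plain-Python hierarchical finite state machine with parameterized states, meant only to illustrate the HFSM-of-building-blocks idea; it is not the ROSCo or SMACH implementation, and the light-switch example is invented.

    ```python
    # Minimal hierarchical finite state machine.  State names, parameters, and
    # the light-switch example are illustrative.

    class State:
        def __init__(self, name, action, **params):
            self.name, self.action, self.params = name, action, params

        def execute(self):
            return self.action(**self.params)       # returns an outcome string

    class StateMachine(State):
        """A state machine is itself a State, so machines nest hierarchically."""
        def __init__(self, name, start):
            self.name, self.start = name, start
            self.transitions = {}                   # (state, outcome) -> next state

        def add(self, state, on):
            self.transitions.update({(state.name, k): v for k, v in on.items()})
            return state

        def execute(self):
            state = self.start
            while isinstance(state, State):
                outcome = state.execute()
                state = self.transitions.get((state.name, outcome), outcome)
            return state                            # terminal outcome string

    # Example: reuse a parameterized "move relative to an AR tag" block.
    def move_to_tag(tag_id, offset):
        print(f"moving to tag {tag_id} with offset {offset}")
        return "succeeded"

    approach = State("approach", move_to_tag, tag_id=3, offset=(0.1, 0.0, 0.2))
    flip = State("flip", lambda: "succeeded")
    sm = StateMachine("flip_light_switch", start=approach)
    sm.add(approach, on={"succeeded": flip, "failed": "aborted"})
    sm.add(flip, on={"succeeded": "succeeded"})
    print(sm.execute())
    ```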
  • Item
    Tactile Sensing over Articulated Joints with Stretchable Sensors
    (Georgia Institute of Technology, 2013-04) Bhattacharjee, Tapomayukh ; Jain, Advait ; Vaish, Sarvagya ; Killpack, Marc D. ; Kemp, Charles C.
    Biological organisms benefit from tactile sensing across the entire surfaces of their bodies. Robots may also be able to benefit from this type of sensing, but fully covering a robot with robust and capable tactile sensors entails numerous challenges. To date, most tactile sensors for robots have been used to cover rigid surfaces. In this paper, we focus on the challenge of tactile sensing across articulated joints, which requires sensing across a surface whose geometry varies over time. We first demonstrate the importance of sensing across joints by simulating a planar arm reaching in clutter and finding the frequency of contact at the joints. We then present a simple model of how much a tactile sensor would need to stretch in order to cover a 2 degree-of-freedom (DoF) wrist joint. Next, we describe and characterize a new tactile sensor made with stretchable fabrics. Finally, we present results for a stretchable sleeve with 25 tactile sensors that covers the forearm, 2 DoF wrist, and end effector of a humanoid robot. This sleeve enabled the robot to reach a target in instrumented clutter and reduce contact forces.
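    A back-of-envelope version of this kind of stretch estimate is sketched below: bending a sleeve of radius r by a combined angle theta lengthens its convex side by roughly r*theta relative to the fabric's rest length over the joint. The model and numbers are assumptions for illustration, not the paper's.

    ```python
    import numpy as np

    # Back-of-envelope stretch estimate for a fabric sleeve spanning a 2-DoF
    # wrist.  Model (an assumption): the convex side of a sleeve of radius r
    # lengthens by about r * theta, so required stretch = r * theta / L, where
    # L is the fabric's rest length over the joint.

    def required_stretch(radius_m, rest_length_m, pitch_rad, yaw_rad):
        # Approximate combined bend angle about the two orthogonal wrist axes.
        theta = np.hypot(pitch_rad, yaw_rad)
        return radius_m * theta / rest_length_m

    # Example: 4 cm sleeve radius, 8 cm of fabric over the joint,
    # 60 deg of pitch combined with 45 deg of yaw.
    stretch = required_stretch(0.04, 0.08, np.radians(60), np.radians(45))
    print(f"required stretch: {stretch:.0%} of rest length")   # about 65%
    ```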