Organizational Unit: Healthcare Robotics Lab


Publication Search Results

Now showing 1 - 10 of 21
  • Item
    Rapid Categorization of Object Properties from Incidental Contact with a Tactile Sensing Robot Arm
    (Georgia Institute of Technology, 2013-10) Bhattacharjee, Tapomayukh ; Kapusta, Ariel ; Rehg, James M. ; Kemp, Charles C.
    We demonstrate that data-driven methods can be used to rapidly categorize objects encountered through incidental contact on a robot arm. Allowing incidental contact with surrounding objects has benefits during manipulation, such as increasing the workspace during reaching tasks. The information obtained from such contact, if available online, can potentially be used to map the environment and help in manipulation tasks. In this paper, we address this problem of online categorization using incidental contact during goal-oriented motion. In cluttered environments, the detailed internal structure of clutter can be difficult to infer, but the environment type is often apparent. In a randomized cluttered environment of known object types and “outliers”, our approach uses Hidden Markov Models to capture the dynamic robot-environment interactions and to categorize objects based on the interactions. We combined leaf and trunk objects to create artificial foliage as a test environment. We collected data using a skin sensor on the robot’s forearm while it reached into clutter. Our algorithm classifies the objects rapidly, with low computation time and few data samples. Using a taxel-by-taxel classification approach, we can successfully categorize simultaneous contacts with multiple objects and can also identify outlier objects in the environment based on the prior associated with an object’s likelihood in the given environment.
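    As a rough illustration of the taxel-level approach described above, the sketch below trains one Gaussian HMM per object category on force time series and assigns a new contact to the category with the highest log-likelihood plus log prior. It uses the hmmlearn library and synthetic data as stand-ins; the categories, priors, and sequence shapes are illustrative assumptions, not the authors' implementation.

    ```python
    # Minimal sketch: per-category Gaussian HMMs over taxel force sequences,
    # classified by maximum log-likelihood plus a log prior over categories.
    # Synthetic ramps stand in for real skin-sensor recordings.
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(0)

    def synth_sequences(mean_force, n_seq=20, length=50):
        """Generate noisy 1-D force sequences for one object category."""
        return [mean_force * np.linspace(0, 1, length)[:, None]
                + 0.1 * rng.standard_normal((length, 1)) for _ in range(n_seq)]

    train = {"leaf": synth_sequences(0.5), "trunk": synth_sequences(3.0)}
    prior = {"leaf": 0.7, "trunk": 0.3}   # environment-dependent prior (illustrative)

    models = {}
    for label, seqs in train.items():
        X = np.vstack(seqs)
        models[label] = GaussianHMM(n_components=3, covariance_type="diag",
                                    n_iter=50, random_state=0).fit(X, [len(s) for s in seqs])

    def categorize(seq):
        """Return the category maximizing log-likelihood + log prior for one taxel."""
        return max(models, key=lambda c: models[c].score(seq) + np.log(prior[c]))

    print(categorize(synth_sequences(3.0, n_seq=1)[0]))   # expected: "trunk"
    ```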
  • Item
    Fast Reaching in Clutter While Regulating Forces Using Model Predictive Control
    (Georgia Institute of Technology, 2013-10) Killpack, Marc D. ; Kemp, Charles C.
    Moving a robot arm quickly in cluttered and unmodeled workspaces can be difficult because of the inherent risk of high impact forces. Additionally, compliance by itself is not enough to limit contact forces due to multi-contact phenomena (jamming, etc.). The work in this paper extends our previous research on manipulation in cluttered environments by explicitly modeling robot arm dynamics and using model predictive control (MPC) with whole-arm tactile sensing to improve speed and force control. We first derive discrete-time dynamic equations of motion that we use for MPC. Then we formulate a multi-time-step model predictive controller that uses this dynamic model. These changes allow us to control contact forces while increasing overall end effector speed. We also describe a constraint that regulates joint velocities in order to mitigate unexpected impact forces while reaching to a goal. We present results using tests from a simulated three-link planar arm that is representative of the kinematics and mass of an average male’s torso, shoulder and elbow joints reaching in high and low clutter scenarios. These results show that our controller allows the arm to reach a goal up to twice as fast as our previous work, while still controlling the contact forces to be near a user-defined threshold.
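    To make the multi-step formulation above concrete, here is a minimal MPC sketch in cvxpy, assuming a linear discrete-time joint-space model x_{k+1} = A x_k + B u_k with state [q; qdot], a joint-velocity limit, and a stiffness-based linearization of contact force around the current configuration. The model, gains, and limits (K_c, f_max, and so on) are illustrative placeholders, not the controller from the paper.

    ```python
    # Minimal multi-step MPC sketch (not the authors' implementation): linear
    # joint-space dynamics, joint-velocity limits, and a linearized contact-force
    # bound, solved as a quadratic program with cvxpy.
    import numpy as np
    import cvxpy as cp

    n, H, dt = 3, 4, 0.01                      # joints, horizon steps, time step
    A = np.block([[np.eye(n), dt * np.eye(n)],
                  [np.zeros((n, n)), np.eye(n)]])
    B = np.vstack([0.5 * dt**2 * np.eye(n), dt * np.eye(n)])

    x0 = np.zeros(2 * n)                       # current [q; qdot]
    q_goal = np.array([0.6, -0.3, 0.4])        # illustrative joint-space goal
    f_meas = np.array([2.0])                   # measured contact force (N)
    J_c = np.array([[0.8, 0.5, 0.2]])          # contact Jacobian row (illustrative)
    K_c, f_max, qd_max, u_max = 300.0, 5.0, 0.6, 20.0

    x = cp.Variable((H + 1, 2 * n))
    u = cp.Variable((H, n))
    cost, cons = 0, [x[0] == x0]
    for k in range(H):
        cons += [x[k + 1] == A @ x[k] + B @ u[k],
                 cp.abs(x[k + 1, n:]) <= qd_max,          # joint-velocity limit
                 cp.abs(u[k]) <= u_max,
                 f_meas + K_c * (J_c @ (x[k + 1, :n] - x0[:n])) <= f_max]
        cost += cp.sum_squares(x[k + 1, :n] - q_goal) + 1e-3 * cp.sum_squares(u[k])
    cp.Problem(cp.Minimize(cost), cons).solve()
    print("first torque command:", u.value[0])
    ```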
  • Item
    Whole-arm Tactile Sensing for Beneficial and Acceptable Contact During Robotic Assistance
    (Georgia Institute of Technology, 2013-06) Grice, Phillip M. ; Killpack, Marc D. ; Jain, Advait ; Vaish, Sarvagya ; Hawke, Jeffrey ; Kemp, Charles C.
    Many assistive tasks involve manipulation near the care-receiver's body, including self-care tasks such as dressing, feeding, and personal hygiene. A robot can provide assistance with these tasks by moving its end effector to poses near the care-receiver's body. However, perceiving and maneuvering around the care-receiver's body can be challenging due to a variety of issues, including convoluted geometry, compliant materials, body motion, hidden surfaces, and the object upon which the body is resting (e.g., a wheelchair or bed). Using geometric simulations, we first show that an assistive robot can achieve a much larger percentage of end-effector poses near the care-receiver's body if its arm is allowed to make contact. Second, we present a novel system with a custom controller and whole-arm tactile sensor array that enables a Willow Garage PR2 to regulate contact forces across its entire arm while moving its end effector to a commanded pose. We then describe tests with two people with motor impairments, one of whom used the system to grasp and pull a blanket over himself and to grab a cloth and wipe his face, all while in bed at his home. Finally, we describe a study with eight able-bodied users in which they used the system to place objects near their bodies. On average, users perceived the system to be safe and comfortable, even though substantial contact occurred between the robot's arm and the user's body.
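    The contact-force regulation idea can be sketched very simply: reduce the commanded joint velocities along the contact normal of any taxel whose force exceeds a user-set threshold. The function below is a minimal illustration under that assumption (Jacobian-transpose corrections with made-up gains), not the custom controller used on the PR2.

    ```python
    # Minimal sketch of whole-arm force regulation: nominal motion is backed off
    # per taxel whenever that taxel's normal force exceeds a user-set threshold.
    # J_taxel rows, forces, and gains are illustrative placeholders.
    import numpy as np

    def regulate(qdot_nominal, taxel_forces, taxel_jacobians,
                 f_thresh=2.0, gain=0.05):
        """Return a joint-velocity command that backs off over-threshold contacts."""
        qdot = np.asarray(qdot_nominal, dtype=float).copy()
        for f, J in zip(taxel_forces, taxel_jacobians):
            excess = f - f_thresh
            if excess > 0.0:
                # Back off along the taxel's contact normal via a
                # Jacobian-transpose mapping into joint space.
                qdot -= gain * excess * np.asarray(J)
        return qdot

    qdot_cmd = regulate(qdot_nominal=[0.2, 0.0, -0.1],
                        taxel_forces=[1.1, 3.4],
                        taxel_jacobians=[[0.3, 0.1, 0.0], [0.5, 0.4, 0.2]])
    print(qdot_cmd)
    ```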
  • Item
    In-Hand Radio Frequency Identification (RFID) for Robotic Manipulation
    (Georgia Institute of Technology, 2013-05) Deyle, Travis ; Tralie, Christopher J. ; Reynolds, Matthew S. ; Kemp, Charles C.
    We present a unique multi-antenna RFID reader (a sensor) embedded in a robot's manipulator that is designed to operate with ordinary UHF RFID tags in a short-range, near-field electromagnetic regime. Using specially designed near-field antennas enables our sensor to obtain spatial information from tags at ranges of less than 1 meter. In this work, we characterize the near-field sensor's ability to detect tagged objects in the robot's manipulator, present robot behaviors to determine the identity of a grasped object, and investigate how additional RF signal properties can be used for “pre-touch” capabilities such as servoing to grasp an object. The future combination of long-range (far-field) and short-range (near-field) UHF RFID sensing has the potential to enable roboticists to jump-start applications by obviating or supplementing false-positive-prone visual object recognition. These techniques may be especially useful in the healthcare and service sectors, where misidentification of an object (for example, a medication bottle) could have catastrophic consequences.
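    One way the “pre-touch” servoing idea could look in code is a simple RSSI-differential controller: yaw the gripper toward whichever in-hand antenna reports the stronger return from the target tag. The reader interface below (read_rssi) is a hypothetical stand-in for real reader hardware; the gain and canned readings are illustrative.

    ```python
    # Minimal sketch (not the authors' system) of RSSI-differential servoing:
    # the gripper yaws toward the in-hand antenna with the stronger tag return.
    def servo_step(read_rssi, target_tag, gain=0.002):
        """Return a yaw-rate command (rad/s) from left/right antenna RSSI (dBm)."""
        rssi_left = read_rssi(antenna="left", tag_id=target_tag)
        rssi_right = read_rssi(antenna="right", tag_id=target_tag)
        return gain * (rssi_right - rssi_left)   # positive: yaw toward the right antenna

    # Example with canned readings standing in for reader hardware.
    fake = {("left", "med-bottle"): -62.0, ("right", "med-bottle"): -55.0}
    print(servo_step(lambda antenna, tag_id: fake[(antenna, tag_id)], "med-bottle"))
    ```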
  • Item
    ROS Commander (ROSCo): Behavior Creation for Home Robots
    (Georgia Institute of Technology, 2013-05) Nguyen, Hai ; Ciocarlie, Matei ; Hsiao, Kaijen ; Kemp, Charles C.
    We introduce ROS Commander (ROSCo), an open source system that enables expert users to construct, share, and deploy robot behaviors for home robots. A user builds a behavior in the form of a Hierarchical Finite State Machine (HFSM) out of generic, parameterized building blocks, with a real robot in the develop-and-test loop. Once constructed, users save behaviors in an open format for direct use with robots, or for use as parts of new behaviors. When the system is deployed, a user can show the robot where to apply behaviors relative to fiducial markers (AR Tags), which allows the robot to quickly become operational in a new environment. We show evidence that the underlying state machine representation and current building blocks are capable of spanning a variety of desirable behaviors for home robots, such as opening a refrigerator door with two arms, flipping a light switch, unlocking a door, and handing an object to someone. Our experiments show that sensor-driven behaviors constructed with ROSCo can be executed in realistic home environments with success rates between 80% and 100%. We conclude by describing a test in the home of a person with quadriplegia, in which the person was able to automate parts of his home using previously built behaviors.
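    The sketch below shows the flavor of a hierarchical finite state machine built from parameterized states, using SMACH, a ROS library for HFSMs. The states, parameters, and fiducial frame are illustrative placeholders rather than ROSCo's actual building blocks.

    ```python
    # Minimal hierarchical state machine in SMACH: a reusable "flip the light
    # switch" behavior nested inside a top-level behavior. States are stubs.
    import smach

    class MoveToTag(smach.State):
        def __init__(self, tag_frame):
            smach.State.__init__(self, outcomes=["succeeded", "aborted"])
            self.tag_frame = tag_frame
        def execute(self, userdata):
            print("navigating to fiducial", self.tag_frame)
            return "succeeded"

    class FlipSwitch(smach.State):
        def __init__(self):
            smach.State.__init__(self, outcomes=["succeeded", "aborted"])
        def execute(self, userdata):
            print("executing switch-flipping primitive")
            return "succeeded"

    # Inner behavior, reusable as a single state inside larger behaviors.
    flip_light = smach.StateMachine(outcomes=["succeeded", "aborted"])
    with flip_light:
        smach.StateMachine.add("APPROACH", MoveToTag("ar_marker_4"),
                               transitions={"succeeded": "FLIP", "aborted": "aborted"})
        smach.StateMachine.add("FLIP", FlipSwitch(),
                               transitions={"succeeded": "succeeded", "aborted": "aborted"})

    top = smach.StateMachine(outcomes=["succeeded", "aborted"])
    with top:
        smach.StateMachine.add("FLIP_LIGHT", flip_light,
                               transitions={"succeeded": "succeeded", "aborted": "aborted"})

    print("outcome:", top.execute())
    ```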
  • Item
    Tactile Sensing over Articulated Joints with Stretchable Sensors
    (Georgia Institute of Technology, 2013-04) Bhattacharjee, Tapomayukh ; Jain, Advait ; Vaish, Sarvagya ; Killpack, Marc D. ; Kemp, Charles C.
    Biological organisms benefit from tactile sensing across the entire surfaces of their bodies. Robots may also be able to benefit from this type of sensing, but fully covering a robot with robust and capable tactile sensors entails numerous challenges. To date, most tactile sensors for robots have been used to cover rigid surfaces. In this paper, we focus on the challenge of tactile sensing across articulated joints, which requires sensing across a surface whose geometry varies over time. We first demonstrate the importance of sensing across joints by simulating a planar arm reaching in clutter and finding the frequency of contact at the joints. We then present a simple model of how much a tactile sensor would need to stretch in order to cover a 2 degree-of-freedom (DoF) wrist joint. Next, we describe and characterize a new tactile sensor made with stretchable fabrics. Finally, we present results for a stretchable sleeve with 25 tactile sensors that covers the forearm, 2 DoF wrist, and end effector of a humanoid robot. This sleeve enabled the robot to reach a target in instrumented clutter and reduce contact forces.
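    As a rough, hedged stand-in for the stretch model mentioned above (the paper's exact model may differ), one can treat the covered wrist as a cylinder of radius r, so that bending by an angle theta lengthens the fabric on the outside of the bend by roughly r * theta, with the two wrist axes combined in quadrature:

    ```python
    # One simple geometric model (our assumption, not necessarily the paper's):
    # outer-surface lengthening of a fabric sleeve over a bending 2-DoF joint.
    import math

    def required_stretch(r, L0, theta1, theta2):
        """Fractional stretch of a sleeve of rest length L0 over a 2-DoF joint."""
        extra = r * math.hypot(theta1, theta2)   # combined outer-surface lengthening
        return extra / L0

    # e.g. a 4 cm radius wrist, 10 cm sleeve, both axes flexed to 60 degrees
    print(f"{100 * required_stretch(0.04, 0.10, math.radians(60), math.radians(60)):.0f}% stretch")
    ```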
  • Item
    Older Adults’ Medication Management in the Home: How Can Robots Help?
    (Georgia Institute of Technology, 2013-03) Prakash, Akanksha ; Beer, Jenay M. ; Deyle, Travis ; Smarr, Cory-Ann ; Chen, Tiffany L. ; Mitzner, Tracy L. ; Kemp, Charles C. ; Rogers, Wendy A.
    Successful management of medications is critical to maintaining healthy and independent living for older adults. However, medication non-adherence is a common problem with a high risk for severe consequences [5], which can jeopardize older adults’ chances to age in place [1]. Well-designed robots assisting with medication management tasks could support older adults’ independence. Design of successful robots will be enhanced through understanding concerns, attitudes, and preferences for medication assistance tasks. We assessed older adults’ reactions to medication hand-off from a mobile manipulator with 12 participants (68-79 years). We identified factors that affected their attitudes toward a mobile manipulator for supporting general medication management tasks in the home. The older adults were open to robot assistance; however, their preferences varied depending on the nature of the medication management task. For instance, they preferred a robot (over a human) to remind them to take medications, but preferred human assistance for deciding what medication to take and for administering the medication. Factors such as perceptions of one’s own capability and robot reliability influenced their attitudes.
  • Item
    Haptic Classification and Recognition of Objects Using a Tactile Sensing Forearm
    (Georgia Institute of Technology, 2012-10) Bhattacharjee, Tapomayukh ; Rehg, James M. ; Kemp, Charles C.
    In this paper, we demonstrate data-driven inference of mechanical properties of objects using a tactile sensor array (skin) covering a robot's forearm. We focus on the mobility (sliding vs. fixed), compliance (soft vs. hard), and identity of objects in the environment, as this information could be useful for efficient manipulation and search. By using the large surface area of the forearm, a robot could potentially search and map a cluttered volume more efficiently, and be informed by incidental contact during other manipulation tasks. Our approach tracks a contact region on the forearm over time in order to generate time series of select features, such as the maximum force, contact area, and contact motion. We then process and reduce the dimensionality of these time series to generate a feature vector to characterize the contact. Finally, we use the k-nearest neighbor algorithm (k-NN) to classify a new feature vector based on a set of previously collected feature vectors. Our results show a high cross-validation accuracy in both classification of mechanical properties and object recognition. In addition, we analyze the effect of taxel resolution, duration of observation, feature selection, and feature scaling on the classification accuracy.
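    The classification pipeline described above reduces each contact's time series to a fixed-length feature vector and classifies it with k-NN. The sketch below mirrors that pipeline with scikit-learn on synthetic data; the feature choices and class structure are illustrative placeholders for real taxel recordings.

    ```python
    # Minimal sketch: summarize per-contact time series into features, scale them,
    # and classify mobility/compliance categories with k-nearest neighbors.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    def features(force, area, motion):
        """Summarize one contact's time series (max force, contact area, motion)."""
        return [force.max(), force.mean(), area.max(), motion.sum()]

    def synthetic_contact(soft, fixed):
        t = np.linspace(0, 1, 100)
        force = (2.0 if soft else 6.0) * t + 0.2 * rng.standard_normal(100)
        area = (3.0 if soft else 1.5) * t
        motion = (0.0 if fixed else 0.05) * np.ones(100)
        return features(force, area, motion)

    X = [synthetic_contact(soft, fixed) for soft in (0, 1) for fixed in (0, 1) for _ in range(20)]
    y = [f"{'soft' if soft else 'hard'}-{'fixed' if fixed else 'sliding'}"
         for soft in (0, 1) for fixed in (0, 1) for _ in range(20)]

    scaler = StandardScaler().fit(X)                      # feature scaling, as discussed above
    clf = KNeighborsClassifier(n_neighbors=3).fit(scaler.transform(X), y)
    print(clf.predict(scaler.transform([synthetic_contact(soft=1, fixed=0)])))
    ```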
  • Item
    Informing Assistive Robots with Models of Contact Forces from Able-Bodied Face Wiping and Shaving
    (Georgia Institute of Technology, 2012-09) Hawkins, Kelsey P. ; King, Chih-Hung ; Chen, Tiffany L. ; Kemp, Charles C.
    Hygiene and feeding are activities of daily living (ADLs) that often involve contact with a person's face. Robots can assist people with motor impairments to perform these tasks by holding a tool that makes contact with the care receiver's face. By sensing the forces applied to the face with the tool, robots could potentially provide assistance that is more comfortable, safe, and effective. In order to inform the design of robotic controllers and assistive robots, we investigated the forces able-bodied people apply to themselves when wiping and shaving their faces. We present our methods for capturing and modeling these forces, results from a study with 9 participants, and recommendations for assistive robots. Our contributions include a trapezoidal force model that assumes participants have a target force they attempt to achieve for each stroke of the tool. We discuss advantages of this 3-parameter model and show that it fits our data well relative to other candidate models. We also provide statistics of the models' rise rates, fall rates, and target forces for the 9 participants in our study. In addition, we illustrate how the target forces varied based on the task, participant, and location on the face.
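    The trapezoidal model can be written down directly: within each stroke, the applied force ramps up at a rise rate, holds a target force, and ramps back down at a fall rate. The parameter values in this sketch are illustrative, not the statistics reported in the study.

    ```python
    # Sketch of the 3-parameter trapezoidal stroke model: rise rate, target force,
    # fall rate, evaluated over one stroke of a given duration.
    import numpy as np

    def trapezoidal_force(t, duration, f_target, rise_rate, fall_rate):
        """Modeled tool-on-face force at time t within a stroke of given duration."""
        t_rise = f_target / rise_rate
        t_fall_start = duration - f_target / fall_rate
        if t < 0 or t > duration:
            return 0.0
        if t < t_rise:
            return rise_rate * t
        if t > t_fall_start:
            return max(0.0, f_target - fall_rate * (t - t_fall_start))
        return f_target

    ts = np.linspace(0.0, 2.0, 9)
    print([round(trapezoidal_force(t, 2.0, f_target=3.0, rise_rate=12.0, fall_rate=9.0), 2)
           for t in ts])
    ```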
  • Item
    The Wouse: A Wearable Wince Detector to Stop Assistive Robots
    (Georgia Institute of Technology, 2012-09) Grice, Phillip M. ; Lee, Andy ; Evans, Henry ; Kemp, Charles C.
    Persons with severe motor impairments depend heavily upon caregivers for the performance of everyday tasks. Ongoing work is exploring the potential of giving motor-impaired users control of semi-autonomous assistive mobile manipulators to enable them to perform some self-care tasks such as scratching or shaving. Because these users are less able to escape a robot malfunction, or operate a traditional run-stop, physical human-robot interaction poses safety risks. We review approaches to safety in assistive robotics with a focus on accessible run-stops, and propose wincing as an accessible gesture for activating a run-stop device. We also present the wouse, a novel device for detecting wincing from skin movement near the eye, consisting of optical mouse components mounted near a user's temple via safety goggles. Using this device, we demonstrate a complete system to run-stop a Willow Garage PR2 robot, and perform two preliminary user studies. The first study examines discrimination of wincing from self-produced facial expressions. The results indicate the possibility for discrimination, though variability between users and inconsistent detection of skin movement remain significant challenges. The second experiment examines discrimination of wincing from external mechanical manipulations of the face during self-care tasks. The results indicate that the wouse, using a classifier trained with data from the first experiment, can be used during face-manipulation tasks. The device produced no false positives, but succeeded in correctly identifying wincing events in only two of four subjects.
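    A minimal wince detector in the spirit of the wouse could threshold the speed and direction of the optical-mouse displacement measured near the temple. The thresholds and axis convention below are illustrative assumptions, not the classifier trained in the paper.

    ```python
    # Minimal sketch: flag a wince when skin near the temple moves quickly in a
    # roughly "downward" direction (per this sketch's axis convention).
    import math

    def is_wince(dx, dy, dt, speed_thresh=40.0, direction_deg=90.0, tol_deg=45.0):
        """Classify one (dx, dy) displacement sample (sensor counts) over dt seconds."""
        speed = math.hypot(dx, dy) / dt
        if speed < speed_thresh:
            return False
        angle = math.degrees(math.atan2(dy, dx)) % 360.0
        diff = min(abs(angle - direction_deg), 360.0 - abs(angle - direction_deg))
        return diff <= tol_deg

    print(is_wince(dx=2, dy=9, dt=0.1))    # fast, mostly "downward" motion -> True
    print(is_wince(dx=1, dy=1, dt=0.1))    # slow drift -> False
    ```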