Organizational Unit: Healthcare Robotics Lab


Publication Search Results

Now showing 1 - 10 of 16
  • Item
    Pulling Open Novel Doors and Drawers with Equilibrium Point Control
    (Georgia Institute of Technology, 2009-12) Jain, Advait ; Kemp, Charles C.
    A large variety of doors and drawers can be found within human environments. Humans regularly operate these mechanisms without difficulty, even if they have not previously interacted with a particular door or drawer. In this paper, we empirically demonstrate that equilibrium point control can enable a humanoid robot to pull open a variety of doors and drawers without detailed prior models, and infer their kinematics in the process. Our implementation uses a 7 DoF anthropomorphic arm with series elastic actuators (SEAs) at each joint, a hook as an end effector, and low mechanical impedance. For our control scheme, each SEA applies a gravity compensating torque plus a torque from a simulated, torsional, viscoelastic spring. Each virtual spring has constant stiffness and damping, and a variable equilibrium angle. These equilibrium angles form a joint space equilibrium point (JEP), which has a corresponding Cartesian space equilibrium point (CEP) for the arm's end effector. We present two controllers that generate a CEP at each time step (ca. 100 ms) and use inverse kinematics to command the arm with the corresponding JEP. One controller produces a linear CEP trajectory and the other alters its CEP trajectory based on real-time estimates of the mechanism's kinematics. We also present results from empirical evaluations of their performance (108 trials). In these trials, both controllers were robust with respect to variations in the mechanism, the pose of the base, the stiffness of the arm, and the way the handle was hooked. We also tested the more successful controller with 12 distinct mechanisms. In these tests, it was able to open 11 of the 12 mechanisms in a single trial, and successfully categorized the 11 mechanisms as having a rotary or prismatic joint, and opening to the right or left. Additionally, in 7 out of the 8 trials with rotary joints, the robot accurately estimated the location of the axis of rotation.
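The per-joint control law summarized above (gravity compensation plus a simulated torsional, viscoelastic spring with a commanded equilibrium angle) can be sketched as follows. This is a minimal illustration only: the stiffness and damping values, function names, and the omission of the inverse-kinematics step are all assumptions, not details from the paper.

```python
# Sketch of the SEA control scheme described in the abstract: each joint
# applies a gravity-compensating torque plus the torque of a virtual
# torsional, viscoelastic spring with constant stiffness and damping and
# a variable equilibrium angle. Gains here are illustrative assumptions.

K_STIFF = 30.0  # assumed torsional stiffness (N*m/rad)
B_DAMP = 1.5    # assumed damping (N*m*s/rad)

def sea_torque(theta, theta_dot, theta_eq, tau_gravity,
               k=K_STIFF, b=B_DAMP):
    """Torque for one SEA joint: gravity compensation plus the virtual
    spring torque; theta_eq is the commanded equilibrium angle."""
    return tau_gravity + k * (theta_eq - theta) - b * theta_dot

def arm_torques(thetas, theta_dots, jep, tau_gravities):
    """Apply the per-joint law across the 7 DoF arm. The joint space
    equilibrium point (jep) would be obtained from the Cartesian space
    equilibrium point via inverse kinematics (not shown here)."""
    return [sea_torque(q, qd, qe, tg)
            for q, qd, qe, tg in zip(thetas, theta_dots, jep, tau_gravities)]

# At rest exactly at the equilibrium point, only the gravity term remains;
# displacing a joint produces a restoring torque toward the JEP.
tau = arm_torques([0.0] * 7, [0.0] * 7, [0.0] * 7, [0.0] * 7)
```

Because each virtual spring has fixed gains, commanding motion reduces to streaming new equilibrium angles, which is what makes the controllers robust to contact without force sensing at the end effector.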
  • Item
    RF vision: RFID receive signal strength indicator (RSSI) images for sensor fusion and mobile manipulation
    (Georgia Institute of Technology, 2009-10) Deyle, Travis ; Nguyen, Hai ; Reynolds, Matt S. ; Kemp, Charles C.
    In this work we present a set of integrated methods that enable an RFID-enabled mobile manipulator to approach and grasp an object to which a self-adhesive passive (battery-free) UHF RFID tag has been affixed. Our primary contribution is a new mode of perception that produces images of the spatial distribution of received signal strength indication (RSSI) for each of the tagged objects in an environment. The intensity of each pixel in the 'RSSI image' is the measured RF signal strength for a particular tag in the corresponding direction. We construct these RSSI images by panning and tilting an RFID reader antenna while measuring the RSSI value at each bearing. Additionally, we present a framework for estimating a tagged object's 3D location using fused ID-specific features derived from an RSSI image, a camera image, and a laser range finder scan. We evaluate these methods using a robot with actuated, long-range RFID antennas and finger-mounted short-range antennas. The robot first scans its environment to discover which tagged objects are within range, creates a user interface, orients toward the user-selected object using RF signal strength, estimates the 3D location of the object using an RSSI image with sensor fusion, approaches and grasps the object, and uses its finger-mounted antennas to confirm that the desired object has been grasped. In our tests, the sensor fusion system with an RSSI image correctly located the requested object in 17 out of 18 trials (94.4%), an 11.1% improvement over the system's performance when not using an RSSI image. The robot correctly oriented to the requested object in 8 out of 9 trials (88.9%), and in 3 out of 3 trials the entire system successfully grasped the object selected by the user.
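The RSSI-image construction described above can be sketched as a pan/tilt scan where each pixel stores the signal strength measured for one tag at one bearing. The scan range, angular step, and `read_rssi` interface below are hypothetical stand-ins, not the paper's actual hardware interface.

```python
# Hypothetical sketch of building an 'RSSI image': pan and tilt the RFID
# reader antenna over a grid of bearings, recording the received signal
# strength (RSSI) for a given tag at each bearing. Scan ranges and the
# read_rssi callable are illustrative assumptions.

PAN_DEG = range(-60, 61, 5)    # assumed pan scan range (degrees)
TILT_DEG = range(-30, 31, 5)   # assumed tilt scan range (degrees)

def build_rssi_image(read_rssi, tag_id):
    """Return rows of RSSI values, one row per tilt angle; the pixel at
    (tilt, pan) is the measured signal strength for tag_id there."""
    return [[read_rssi(pan, tilt, tag_id) for pan in PAN_DEG]
            for tilt in TILT_DEG]

def bearing_of_peak(image):
    """Pan/tilt bearing with the strongest return, i.e. the direction
    the robot would orient toward for the tagged object."""
    best = max(
        ((rssi, pan, tilt)
         for tilt, row in zip(TILT_DEG, image)
         for pan, rssi in zip(PAN_DEG, row)),
        key=lambda entry: entry[0])
    return best[1], best[2]
```

In the paper's pipeline such an image is one input among several: its ID-specific features are fused with camera and laser range finder data to estimate the tagged object's 3D location.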
  • Item
    PPS-Tags: Physical, Perceptual and Semantic Tags for Autonomous Mobile Manipulation
    (Georgia Institute of Technology, 2009-10) Nguyen, Hai ; Deyle, Travis ; Reynolds, Matt S. ; Kemp, Charles C.
    For many promising application areas, autonomous mobile manipulators do not yet exhibit sufficiently robust performance. We propose the use of tags applied to task-relevant locations in human environments in order to help autonomous mobile manipulators physically interact with the location, perceive the location, and understand the location’s semantics. We call these tags physical, perceptual and semantic tags (PPS-tags). We present three examples of PPS-tags, each of which combines compliant and colorful material with a UHF RFID tag. The RFID tag provides a unique identifier that indexes into a semantic database that holds information such as the following: what actions can be performed at the location, how can these actions be performed, and what state changes should be observed upon task success? We also present performance results for our robot operating on a PPS-tagged light switch, rocker light switch, lamp, drawer, and trash can. We tested the robot performing the available actions from 4 distinct locations with each of these 5 tagged devices. For the light switch, rocker light switch, lamp, and trash can, the robot succeeded in all trials (24/24). The robot failed to open the drawer when starting from an oblique angle, and thus succeeded in 6 out of 8 trials. We also tested the ability of the robot to detect failure in unusual circumstances, such as the lamp being unplugged and the drawer being stuck.
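The semantic half of a PPS-tag, as described, is a unique RFID identifier indexing into a database of actions, procedures, and expected state changes. A minimal sketch might look like the following; every ID and record here is made up for illustration.

```python
# Hypothetical sketch of the semantic database keyed by a PPS-tag's
# unique RFID identifier: each record lists which actions can be
# performed at the tagged location, how to perform them, and what state
# change should be observed on success. All entries are illustrative.

SEMANTIC_DB = {
    "tag-lamp-01": {
        "actions": ["turn_on", "turn_off"],
        "how": {"turn_on": "press the compliant pad downward",
                "turn_off": "press the compliant pad downward"},
        "success": {"turn_on": "ambient brightness increases",
                    "turn_off": "ambient brightness decreases"},
    },
}

def lookup_tag(tag_id):
    """Resolve a scanned RFID tag ID to its semantic record, or None
    for an unknown tag."""
    return SEMANTIC_DB.get(tag_id)

record = lookup_tag("tag-lamp-01")
```

The "success" entries are what let the robot detect failures such as an unplugged lamp: after acting, it checks whether the expected state change actually occurred.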
  • Item
    Behavior-Based Door Opening with Equilibrium Point Control
    (Georgia Institute of Technology, 2009-06) Jain, Advait ; Kemp, Charles C.
    Within this paper we present a set of behaviors that enable a mobile manipulator to reliably open a variety of doors. After a user designates a location within 20cm of the door handle, the robot autonomously locates the door handle using a tilting laser range finder, approaches the handle using its omnidirectional base, reaches out to haptically find the door, makes contact with the handle, twists it, and pushes open the door. The robot uses equilibrium point control for all arm motions. Our implementation uses a 7 DoF anthropomorphic arm with series elastic actuators (SEAs). For our control scheme, each SEA applies a gravity compensating torque plus a torque from a simulated, torsional, viscoelastic spring. Each virtual spring has constant stiffness and damping, and a variable equilibrium point. The behaviors use inverse kinematics to generate trajectories for these joint-space equilibrium points that correspond with Cartesian equilibrium point trajectories for the end effector. With 43 trials and 8 different doors, we show that these compliant trajectories enable the robot to robustly reach out to make contact with doors (100%), operate door handles (96.9%), and push doors open (100%). The complete system including perception and navigation succeeded with unlocked doors in 28 out of 32 trials (87.5%) and locked doors in 8 out of 8 trials (100%). Through 157 trials with a single door, we empirically show that our method for door handle twisting reduces interaction forces and is robust to variations in arm stiffness, the end effector trajectory, and the friction between the end effector and the handle.
  • Item
    1000 Trials: An empirically validated end effector that robustly grasps objects from the floor
    (Georgia Institute of Technology, 2009-05) Xu, Zhe ; Deyle, Travis ; Kemp, Charles C.
    Unstructured, human environments present great challenges and opportunities for robotic manipulation and grasping. Robots that reliably grasp household objects with unknown or uncertain properties would be especially useful, since these robots could better generalize their capabilities across the wide variety of objects found within domestic environments. Within this paper, we address the problem of picking up an object sitting on a plane in isolation, as can occur when someone drops an object on the floor - a common problem for motor-impaired individuals. We assume that the robot has the ability to coarsely position itself in front of the object, but otherwise grasps the object with an open-loop strategy that does not vary from object to object. We present a novel end effector that is capable of robustly picking up a diverse array of everyday handheld objects given these conditions. This straightforward, inexpensive, nonprehensile end effector combines a compliant finger with a thin planar component with a leading wedge that slides underneath the object. We empirically validated the efficacy of this design through a set of 1096 trials over which we systematically varied the object location, object type, object configuration, and floor characteristics. Our implementation, which we mounted on an iRobot Create, had a success rate of 94.71% on 680 trials, which used 4 floor types with 34 objects of particular relevance to assistive applications in 5 different poses each (4x34x5=680). The robot also had strong performance with objects that would be difficult to grasp using a traditional end effector, such as a dollar bill, a pill, a cloth, a credit card, a coin, keys, and a watch. Prior to this test, we performed 416 trials in order to assess the performance of the end effector with respect to variations in object position.
  • Item
    Human-Robot Interaction Studies for Autonomous Mobile Manipulation for the Motor Impaired
    (Georgia Institute of Technology, 2009-03) Choi, Young Sang ; Anderson, Cressel D. ; Deyle, Travis ; Kemp, Charles C.
    We are developing an autonomous mobile assistive robot named El-E to help individuals with severe motor impairments by performing various object manipulation tasks such as fetching, transporting, placing, and delivering. El-E can autonomously approach a location specified by the user through an interface such as a standard laser pointer and pick up a nearby object. The initial target user population of the robot is individuals suffering from amyotrophic lateral sclerosis (ALS). ALS, also known as Lou Gehrig’s disease, is a progressive neurodegenerative disease resulting in motor impairments throughout the entire body. Due to the severity and progressive nature of ALS, the results from developing robotic technologies to assist ALS patients could be applied to wider motor-impaired populations. To accomplish successful development and real-world application of assistive robot technology, we have to acquire familiarity with the needs and everyday living conditions of these individuals. We also believe the participation of prospective users throughout the design and development process is essential in improving the usability and accessibility of the robot for the target user population. To assess the needs of prospective users and to evaluate the technology being developed, we applied various methodologies of human studies including interviewing, photographing, and conducting controlled experiments. We present an overview of research from the Healthcare Robotics Lab related to patient needs assessment and human experiments, with emphasis on the methods of our human-centered approach.
  • Item
    Hand It Over or Set It Down: A User Study of Object Delivery with an Assistive Mobile Manipulator
    (Georgia Institute of Technology, 2009) Choi, Young Sang ; Chen, Tiffany L. ; Jain, Advait ; Anderson, Cressel D. ; Glass, Jonathan D. ; Kemp, Charles C.
    Delivering an object to a user would be a generally useful capability for service robots. Within this paper, we look at this capability in the context of assistive object retrieval for motor-impaired users. We first describe a behavior-based system that enables our mobile robot El-E to autonomously deliver an object to a motor-impaired user. We then present our evaluation of this system with 8 motor-impaired patients from the Emory ALS Center. As part of this study, we compared handing the object to the user (direct delivery) with placing the object on a nearby table (indirect delivery). We tested the robot delivering a cordless phone, a medicine bottle, and a TV remote, which were ranked as three of the top four most important objects for robotic delivery by ALS patients in a previous study. Overall, the robot successfully delivered these objects in 126 out of 144 trials (88%) with a success rate of 97% for indirect delivery and 78% for direct delivery. In an accompanying survey, participants showed high satisfaction with the robot, with 4 people preferring direct delivery and 4 people preferring indirect delivery. Our results indicate that indirect delivery to a surface can be a robust and reliable delivery method with high user satisfaction, and that robust direct delivery will require methods that handle diverse postures and body types.
  • Item
    Bio-inspired Assistive Robotics: Service Dogs as a Model for Human-Robot Interaction and Mobile Manipulation
    (Georgia Institute of Technology, 2008-10) Nguyen, Hai ; Kemp, Charles C.
    Service dogs have successfully provided assistance to thousands of motor-impaired people worldwide. As a step towards the creation of robots that provide comparable assistance, we present a biologically inspired robot capable of obeying many of the same commands and exploiting the same environmental modifications as service dogs. The robot responds to a subset of the 71 verbal commands listed in the service dog training manual used by Georgia Canines for Independence. In our implementation, the human directs the robot by giving a verbal command and illuminating a task-relevant location with an off-the-shelf green laser pointer. We also describe a novel and inexpensive way to engineer the environment in order to help assistive robots perform useful tasks with generality and robustness. In particular, we show that by tying or otherwise affixing colored towels to doors and drawers an assistive robot can robustly open these doors and drawers in a manner similar to a service dog. This is analogous to the common practice of tying bandannas or handkerchiefs to door handles and drawer handles in order to enable service dogs to operate them. This method has the advantage of simplifying both the perception and physical interaction required to perform the task. It also enables the robot to use the same small set of behaviors to perform a variety of tasks across distinct doors and drawers. We report quantitative results for our assistive robot when performing assistive tasks in response to user commands in a modified environment. In our tests, the robot successfully opened two different drawers in 18 out of 20 trials (90%), closed a drawer in 9 out of 10 trials (90%), and opened a door that required first operating a handle and then pushing it open in 8 out of 10 trials (80%). Additionally, the robot succeeded in single trial tests of opening a microwave, grasping an object, placing an object, delivering an object, and responding to various other commands, such as staying quiet.
  • Item
    Laser Pointers and a Touch Screen: Intuitive Interfaces for Autonomous Mobile Manipulation for the Motor Impaired
    (Georgia Institute of Technology, 2008-10) Choi, Young Sang ; Anderson, Cressel D. ; Glass, Jonathan D. ; Kemp, Charles C.
    El-E (“Ellie”) is a prototype assistive robot designed to help people with severe motor impairments manipulate everyday objects. When given a 3D location, El-E can autonomously approach the location and pick up a nearby object. Based on interviews of patients with amyotrophic lateral sclerosis (ALS), we have developed and tested three distinct interfaces that enable a user to provide a 3D location to El-E and thereby select an object to be manipulated: an ear-mounted laser pointer, a hand-held laser pointer, and a touch screen interface. Within this paper, we present the results from a user study comparing these three user interfaces with a total of 134 trials involving eight patients with varying levels of impairment recruited from the Emory ALS Clinic. During this study, participants used the three interfaces to select everyday objects to be approached, grasped, and lifted off of the ground. The three interfaces enabled motor-impaired users to command a robot to pick up an object with a 94.8% success rate overall after less than 10 minutes of learning to use each interface. On average, users selected objects 69% more quickly with the laser pointer interfaces than with the touch screen interface. We also found substantial variation in user preference. With respect to the Revised ALS Functional Rating Scale (ALSFRS-R), users with greater upper-limb mobility tended to prefer the hand-held laser pointer, while those with less upper-limb mobility tended to prefer the ear-mounted laser pointer. Despite the extra efficiency of the laser pointer interfaces, three patients preferred the touch screen interface, which has unique potential for manipulating remote objects out of the user’s line of sight. In summary, these results indicate that robots can enhance accessibility by supporting multiple interfaces. Furthermore, this work demonstrates that the communication of 3D locations during human-robot interaction can serve as a powerful abstraction barrier that supports distinct interfaces to assistive robots while using identical, underlying robotic functionality.
  • Item
    A foveated passive UHF RFID system for mobile manipulation
    (Georgia Institute of Technology, 2008-09) Deyle, Travis ; Anderson, Cressel D. ; Kemp, Charles C. ; Reynolds, Matt S.
    We present a novel antenna and system architecture for mobile manipulation based on passive RFID technology operating in the 850 MHz - 950 MHz ultra-high-frequency (UHF) spectrum. This system exploits the electromagnetic properties of UHF radio signals to present a mobile robot with both wide-angle ‘peripheral vision’, sensing multiple tagged objects in the area in front of the robot, and focused, high-acuity ‘central vision’, sensing only tagged objects close to the end effector of the manipulator. These disparate tasks are performed using the same UHF RFID tag, coupled in two different electromagnetic modes. Wide-angle sensing is performed with an antenna designed for far-field electromagnetic wave propagation, while focused sensing is performed with a specially designed antenna mounted on the end effector that optimizes near-field magnetic coupling. We refer to this RFID system as ‘foveated’, by analogy with the anatomy of the human eye. We report a series of experiments on an untethered autonomous mobile manipulator in a 2.5D environment that demonstrate the features of this architecture using two novel behaviors, one in which data from the far-field antenna is used to determine if a specific tagged object is present in the robot’s working area and to navigate to that object, and a second using data from the near-field antenna to grasp a specified object from a collection of visually identical objects. The same UHF RFID tag is used to facilitate both the navigation and grasping tasks.