Organizational Unit:
Healthcare Robotics Lab

Publication Search Results

  • Item
    Pulling Open Novel Doors and Drawers with Equilibrium Point Control
    (Georgia Institute of Technology, 2009-12) Jain, Advait ; Kemp, Charles C.
    A large variety of doors and drawers can be found within human environments. Humans regularly operate these mechanisms without difficulty, even if they have not previously interacted with a particular door or drawer. In this paper, we empirically demonstrate that equilibrium point control can enable a humanoid robot to pull open a variety of doors and drawers without detailed prior models, and infer their kinematics in the process. Our implementation uses a 7 DoF anthropomorphic arm with series elastic actuators (SEAs) at each joint, a hook as an end effector, and low mechanical impedance. For our control scheme, each SEA applies a gravity compensating torque plus a torque from a simulated, torsional, viscoelastic spring. Each virtual spring has constant stiffness and damping, and a variable equilibrium angle. These equilibrium angles form a joint space equilibrium point (JEP), which has a corresponding Cartesian space equilibrium point (CEP) for the arm's end effector. We present two controllers that generate a CEP at each time step (ca. 100 ms) and use inverse kinematics to command the arm with the corresponding JEP. One controller produces a linear CEP trajectory and the other alters its CEP trajectory based on real-time estimates of the mechanism's kinematics. We also present results from empirical evaluations of their performance (108 trials). In these trials, both controllers were robust with respect to variations in the mechanism, the pose of the base, the stiffness of the arm, and the way the handle was hooked. We also tested the more successful controller with 12 distinct mechanisms. In these tests, it was able to open 11 of the 12 mechanisms in a single trial, and successfully categorized the 11 mechanisms as having a rotary or prismatic joint, and opening to the right or left. Additionally, in the 7 out of 8 trials with rotary joints, the robot accurately estimated the location of the axis of rotation.
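    A minimal sketch of the control scheme described above, assuming hypothetical gains, function names, and a small CEP step per ~100 ms cycle (none of these values come from the paper): each SEA applies gravity compensation plus the torque of a simulated torsional, viscoelastic spring, and the simpler controller advances the Cartesian equilibrium point along a straight line before converting it to a joint-space equilibrium point with inverse kinematics.

      import numpy as np

      # Hypothetical constant gains; the paper's actual values are not given here.
      K_P = np.diag([30.0] * 7)   # constant torsional spring stiffness per joint (N*m/rad)
      K_D = np.diag([5.0] * 7)    # constant damping per joint (N*m*s/rad)

      def sea_torques(q, q_dot, q_eq, gravity_torque):
          """Per-joint SEA torque: gravity compensation plus a simulated torsional,
          viscoelastic spring pulling each joint toward its equilibrium angle q_eq,
          one component of the joint-space equilibrium point (JEP)."""
          return gravity_torque + K_P @ (q_eq - q) - K_D @ q_dot

      def step_linear_cep(cep, goal, inverse_kinematics, step_m=0.01):
          """One ~100 ms step of the simpler controller: move the Cartesian
          equilibrium point (CEP) along a straight line toward the goal, then
          convert it to a JEP with inverse kinematics and command the arm."""
          direction = goal - cep
          dist = np.linalg.norm(direction)
          if dist > 1e-9:
              cep = cep + direction / dist * min(step_m, dist)
          return cep, inverse_kinematics(cep)   # new CEP and commanded JEP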
  • Item
    Behavior-Based Door Opening with Equilibrium Point Control
    (Georgia Institute of Technology, 2009-06) Jain, Advait ; Kemp, Charles C.
    Within this paper we present a set of behaviors that enable a mobile manipulator to reliably open a variety of doors. After a user designates a location within 20cm of the door handle, the robot autonomously locates the door handle using a tilting laser range finder, approaches the handle using its omnidirectional base, reaches out to haptically find the door, makes contact with the handle, twists it, and pushes open the door. The robot uses equilibrium point control for all arm motions. Our implementation uses a 7 DoF anthropomorphic arm with series elastic actuators (SEAs). For our control scheme, each SEA applies a gravity compensating torque plus a torque from a simulated, torsional, viscoelastic spring. Each virtual spring has constant stiffness and damping, and a variable equilibrium point. The behaviors use inverse kinematics to generate trajectories for these joint-space equilibrium points that correspond with Cartesian equilibrium point trajectories for the end effector. With 43 trials and 8 different doors, we show that these compliant trajectories enable the robot to robustly reach out to make contact with doors (100%), operate door handles (96.9%), and push doors open (100%). The complete system including perception and navigation succeeded with unlocked doors in 28 out of 32 trials (87.5%) and locked doors in 8 out of 8 trials (100%). Through 157 trials with a single door, we empirically show that our method for door handle twisting reduces interaction forces and is robust to variations in arm stiffness, the end effector trajectory, and the friction between the end effector and the handle.
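    For the arm motions above, one way to realize a Cartesian equilibrium point trajectory for handle twisting is to sweep the CEP along an arc about the handle's rotation axis and map each point to a joint-space equilibrium point with inverse kinematics. This is only an illustrative sketch; the axis, twist angle, and step count are assumptions, not the authors' parameters.

      import numpy as np

      def handle_twist_cep_arc(handle_tip, axis_point, axis=np.array([1.0, 0.0, 0.0]),
                               twist_rad=np.radians(60.0), steps=20):
          """Yield Cartesian equilibrium points along an arc about the handle's
          rotation axis; each CEP would then be converted to a joint-space
          equilibrium point via inverse kinematics. Geometry is illustrative."""
          axis = axis / np.linalg.norm(axis)
          r = handle_tip - axis_point                 # lever arm from the axis to the contact point
          for theta in np.linspace(0.0, twist_rad, steps):
              # Rodrigues' rotation of r about the handle axis by angle theta
              r_rot = (r * np.cos(theta)
                       + np.cross(axis, r) * np.sin(theta)
                       + axis * np.dot(axis, r) * (1.0 - np.cos(theta)))
              yield axis_point + r_rot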
  • Item
    Hand It Over or Set It Down: A User Study of Object Delivery with an Assistive Mobile Manipulator
    (Georgia Institute of Technology, 2009) Choi, Young Sang ; Chen, Tiffany L. ; Jain, Advait ; Anderson, Cressel D. ; Glass, Jonathan D. ; Kemp, Charles C.
    Delivering an object to a user would be a generally useful capability for service robots. Within this paper, we look at this capability in the context of assistive object retrieval for motor-impaired users. We first describe a behavior-based system that enables our mobile robot EL-E to autonomously deliver an object to a motor-impaired user. We then present our evaluation of this system with 8 motor-impaired patients from the Emory ALS Center. As part of this study, we compared handing the object to the user (direct delivery) with placing the object on a nearby table (indirect delivery). We tested the robot delivering a cordless phone, a medicine bottle, and a TV remote, which were ranked as three of the top four most important objects for robotic delivery by ALS patients in a previous study. Overall, the robot successfully delivered these objects in 126 out of 144 trials (88%) with a success rate of 97% for indirect delivery and 78% for direct delivery. In an accompanying survey, participants showed high satisfaction with the robot with 4 people preferring direct delivery and 4 people preferring indirect delivery. Our results indicate that indirect delivery to a surface can be a robust and reliable delivery method with high user satisfaction, and that robust direct delivery will require methods that handle diverse postures and body types.
  • Item
    A Clickable World: Behavior Selection Through Pointing and Context for Mobile Manipulation
    (Georgia Institute of Technology, 2008-09) Nguyen, Hai ; Jain, Advait ; Anderson, Cressel D. ; Kemp, Charles C.
    We present a new behavior selection system for human-robot interaction that maps virtual buttons overlaid on the physical environment to the robot's behaviors, thereby creating a clickable world. The user clicks on a virtual button and activates the associated behavior by briefly illuminating a corresponding 3D location with an off-the-shelf green laser pointer. As we have described in previous work, the robot can detect this click and estimate its 3D location using an omnidirectional camera and a pan/tilt stereo camera. In this paper, we show that the robot can select the appropriate behavior to execute using the 3D location of the click, the context around this 3D location, and its own state. For this work, the robot performs this selection process using a cascade of classifiers. We demonstrate the efficacy of this approach with an assistive object-fetching application. Through empirical evaluation, we show that the 3D location of the click, the state of the robot, and the surrounding context are sufficient for the robot to choose the correct behavior from a set of behaviors and perform the following tasks: pick up a designated object from a floor or table, deliver an object to a designated person, place an object on a designated table, go to a designated location, and touch a designated location with its end effector.
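    One way to picture the cascade of classifiers is as an ordered sequence of tests over the click's 3D location, the surrounding context, and the robot's own state, where the first test that fires selects the behavior. The features, ordering, and thresholds below are hypothetical stand-ins, not the classifiers used in the paper.

      from dataclasses import dataclass
      from typing import Callable, List, Tuple

      @dataclass
      class ClickFeatures:
          """Hypothetical inputs to behavior selection."""
          click_height_m: float      # height of the illuminated 3D location
          near_person: bool          # context: click is close to a detected person
          on_flat_surface: bool      # context: click lies on the floor or a table
          holding_object: bool       # robot state: gripper currently holds an object

      def build_cascade() -> List[Tuple[str, Callable[[ClickFeatures], bool]]]:
          """Ordered (behavior, test) pairs; the first test that fires wins."""
          return [
              ("deliver_to_person", lambda f: f.holding_object and f.near_person),
              ("place_on_table",    lambda f: f.holding_object and f.on_flat_surface),
              ("pick_up_object",    lambda f: not f.holding_object and f.on_flat_surface),
              ("touch_location",    lambda f: f.click_height_m > 1.0),
              ("go_to_location",    lambda f: True),   # default behavior
          ]

      def select_behavior(features: ClickFeatures) -> str:
          for behavior, test in build_cascade():
              if test(features):
                  return behavior
          return "go_to_location"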
  • Item
    Behaviors for Robust Door Opening and Doorway Traversal with a Force-Sensing Mobile Manipulator
    (Georgia Institute of Technology, 2008-06) Jain, Advait ; Kemp, Charles C.
    Fully autonomous robots will often need to open doors and traverse doorways in order to freely operate within human environments, and assistive robots that open doors on command would potentially benefit the motor impaired. In spite of these opportunities, autonomous manipulation of doors remains a challenging problem after more than a decade of research. Until recently, published research has focused on one or two aspects of door opening, and included results from only a small number of tests on a single door. Within this paper we present a set of behaviors that enable a mobile manipulator to reliably open a variety of doors and traverse doorways using force-sensing fingers and a laser range finder. With this system, a user only needs to briefly illuminate a door handle using a green laser pointer, after which the robot autonomously locates the door handle, finds the manipulable end of the door handle, twists the door handle, and pushes the door open while traversing the doorway. The behaviors use sensory feedback to continuously monitor task-relevant aspects of the world and respond to common forms of variation in the task, such as whether the door is locked or unlocked, is blocked or unblocked, opens to the right or left, or has a handle that twists down clockwise or counterclockwise. We tested the robot in 30 trials with 6 different doors from an initial position over 1.6 meters away from the door handle. For the 24 trials with unlocked doors, the robot succeeded at the entire task in 21 trials (87.5% success rate). In the 6 trials with locked doors, the robot successfully detected that the door was locked in all 6 trials (100.0% success rate). For all 30 trials, the robot stopped in a safe manner without requiring human intervention after detecting failure or success at the task. We conclude with a discussion of how this work relates to several broader issues for intelligent manipulation within human environments, including the use of 3D locations to select behaviors, the generality of serialized sub-tasks, task-relevant features, active perception, force sensing, and methods for scaling systems to handle more tasks of greater complexity.
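    As one illustration of how force sensing supports the monitoring described above, a behavior could watch fingertip force against door motion while pushing: if the door swings, the push is succeeding; if force climbs without motion, the robot stops in a safe manner (for example, because the door is locked or blocked). The sensor interface, thresholds, and timing below are assumptions, not values from the paper.

      import time

      def monitor_door_push(fingertip_force_n, door_swing_rad,
                            force_limit_n=40.0, min_swing_rad=0.05, timeout_s=5.0):
          """Hypothetical monitor run while pushing the door after operating the
          handle. Returns 'opening' once the door swings, 'stopped_safely' if
          force climbs without motion, or 'timed_out' otherwise."""
          t0 = time.time()
          while time.time() - t0 < timeout_s:
              if door_swing_rad() > min_swing_rad:
                  return "opening"              # the door is giving way
              if fingertip_force_n() > force_limit_n:
                  return "stopped_safely"       # high force, no motion: locked or blocked
              time.sleep(0.05)
          return "timed_out"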
  • Item
    EL-E: An Assistive Mobile Manipulator that Autonomously Fetches Objects from Flat Surfaces
    (Georgia Institute of Technology, 2008-03-12) Nguyen, Hai ; Anderson, Cressel D. ; Trevor, Alexander J. B. ; Jain, Advait ; Xu, Zhe ; Kemp, Charles C.
    Objects within human environments are usually found on flat surfaces that are orthogonal to gravity, such as floors, tables, and shelves. We first present a new assistive robot that is explicitly designed to take advantage of this common structure in order to retrieve unmodeled, everyday objects for people with motor impairments. This compact, statically stable mobile manipulator has a novel kinematic and sensory configuration that facilitates autonomy and human-robot interaction within indoor human environments. Second, we present a behavior system that enables this robot to fetch objects selected with a laser pointer from the floor and tables. The robot can approach an object selected with the laser pointer interface, detect if the object is on an elevated surface, raise or lower its arm and sensors to this surface, and visually and tactilely grasp the object. Once the object is acquired, the robot can place the object on a laser designated surface above the floor, follow the laser pointer on the floor, or deliver the object to a seated person selected with the laser pointer. Within this paper we present initial results for object acquisition and delivery to a seated, able-bodied individual. For this test, the robot succeeded in 6 out of 7 trials (86%).
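    The fetch-and-deliver sequence described above could be organized roughly as follows; the ordering follows the abstract, while the robot interface and function names are hypothetical.

      def fetch_object(robot, laser_click):
          """Illustrative pipeline for EL-E's laser-pointer-driven object fetching."""
          robot.approach(laser_click)                       # drive toward the selected 3D point
          height = robot.detect_surface_height()            # floor, table, or shelf top
          robot.move_arm_and_sensors_to(height)             # raise or lower to the flat surface
          if not robot.grasp_with_vision_and_touch():       # visual and tactile grasping
              return False
          destination = robot.wait_for_laser_designation()  # surface, floor point, or seated person
          return robot.deliver(destination)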