Person:
Kemp, Charles C.

Publication Search Results

Now showing 1 - 10 of 15
  • Item
    Whole-arm Tactile Sensing for Beneficial and Acceptable Contact During Robotic Assistance
    (Georgia Institute of Technology, 2013-06) Grice, Phillip M. ; Killpack, Marc D. ; Jain, Advait ; Vaish, Sarvagya ; Hawke, Jeffrey ; Kemp, Charles C.
    Many assistive tasks involve manipulation near the care-receiver's body, including self-care tasks such as dressing, feeding, and personal hygiene. A robot can provide assistance with these tasks by moving its end effector to poses near the care-receiver's body. However, perceiving and maneuvering around the care-receiver's body can be challenging due to a variety of issues, including convoluted geometry, compliant materials, body motion, hidden surfaces, and the object upon which the body is resting (e.g., a wheelchair or bed). Using geometric simulations, we first show that an assistive robot can achieve a much larger percentage of end-effector poses near the care-receiver's body if its arm is allowed to make contact. Second, we present a novel system with a custom controller and whole-arm tactile sensor array that enables a Willow Garage PR2 to regulate contact forces across its entire arm while moving its end effector to a commanded pose. We then describe tests with two people with motor impairments, one of whom used the system to grasp and pull a blanket over himself and to grab a cloth and wipe his face, all while in bed at his home. Finally, we describe a study with eight able-bodied users in which they used the system to place objects near their bodies. On average, users perceived the system to be safe and comfortable, even though substantial contact occurred between the robot's arm and the user's body.
  • Item
    Improving robot manipulation with data-driven object-centric models of everyday forces
    (Georgia Institute of Technology, 2013-06) Jain, Advait ; Kemp, Charles C.
Based on a lifetime of experience, people anticipate the forces associated with performing a manipulation task. In contrast, most robots lack common sense about the forces involved in everyday manipulation tasks. In this paper, we present data-driven methods to inform robots about the forces that they are likely to encounter when performing specific tasks. In the context of door opening, we demonstrate that data-driven object-centric models can be used to haptically recognize specific doors, haptically recognize classes of door (e.g., refrigerator vs. kitchen cabinet), and haptically detect anomalous forces while opening a door, even when opening a specific door for the first time. We also demonstrate that two distinct robots can use forces captured from people opening doors to better detect anomalous forces. These results illustrate the potential for robots to use shared databases of forces to better manipulate the world and attain common sense about everyday forces.
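As a rough illustration of the kind of data-driven haptic model described above, the sketch below builds per-angle force statistics from prior door-opening trials and flags measurements that deviate strongly from them. The function names, angle range, and threshold are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch: detect anomalous opening forces by comparing the
# measured force at each door angle against statistics built from prior trials.
# Names, angle range, and the n-sigma threshold are illustrative assumptions.
import numpy as np

ANGLE_BINS = np.linspace(0.0, np.radians(60.0), 31)  # door-angle bin edges (rad)

def build_force_model(trials):
    """trials: list of (angles, forces) arrays from prior door openings.
    Returns per-bin mean and std of the applied force."""
    binned = [[] for _ in range(len(ANGLE_BINS) - 1)]
    for angles, forces in trials:
        idx = np.digitize(angles, ANGLE_BINS) - 1
        for i, f in zip(idx, forces):
            if 0 <= i < len(binned):
                binned[i].append(f)
    mean = np.array([np.mean(b) if b else np.nan for b in binned])
    std = np.array([np.std(b) if b else np.nan for b in binned])
    return mean, std

def is_anomalous(angle, force, mean, std, n_sigma=3.0):
    """Flag a force measurement that deviates strongly from the model."""
    i = np.digitize(angle, ANGLE_BINS) - 1
    if i < 0 or i >= len(mean) or np.isnan(mean[i]):
        return False  # no reference data for this angle yet
    return abs(force - mean[i]) > n_sigma * std[i]
```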
  • Item
    Reaching in clutter with whole-arm tactile sensing
    (Georgia Institute of Technology, 2013-04) Jain, Advait ; Killpack, Marc D. ; Edsinger, Aaron ; Kemp, Charles C.
    Clutter creates challenges for robot manipulation, including a lack of non-contact trajectories and reduced visibility for line-of-sight sensors. We demonstrate that robots can use whole-arm tactile sensing to perceive clutter and maneuver within it, while keeping contact forces low. We first present our approach to manipulation, which emphasizes the benefits of making contact across the entire manipulator and assumes the manipulator has low-stiffness actuation and tactile sensing across its entire surface. We then present a novel controller that exploits these assumptions. The controller only requires haptic sensing, handles multiple contacts, and does not need an explicit model of the environment prior to contact. It uses model predictive control with a time horizon of length one and a linear quasi-static mechanical model. In our experiments, the controller enabled a real robot and a simulated robot to reach goal locations in a variety of environments, including artificial foliage, a cinder block, and randomly generated clutter, while keeping contact forces low. While reaching, the robots performed maneuvers that included bending objects, compressing objects, sliding objects, and pivoting around objects. In simulation, whole-arm tactile sensing also outperformed per-link force–torque sensing in moderate clutter, with the relative benefits increasing with the amount of clutter.
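The abstract above describes model predictive control with a time horizon of length one and a linear quasi-static mechanical model. The sketch below illustrates that idea under simplifying assumptions: a single optimization step that moves the end effector toward its goal while predicted contact forces stay under a threshold. The matrices, stiffness estimates, and limits are placeholders, not the paper's controller.

```python
# Illustrative single-step MPC in the spirit of the controller described above:
# choose a small joint-angle change that moves the end effector toward a goal
# while a linear quasi-static model predicts contact forces stay below a limit.
# All quantities here are made-up placeholders.
import numpy as np
from scipy.optimize import minimize

def one_step_mpc(J_ee, dx_des, contacts, f_max=5.0, dq_max=0.02):
    """J_ee: end-effector Jacobian (3 x n); dx_des: desired end-effector motion.
    contacts: list of (f_now, J_c, K_c) with current force magnitude, a contact
    Jacobian row (1 x n), and a scalar stiffness estimate."""
    n = J_ee.shape[1]

    def objective(dq):
        return np.sum((J_ee @ dq - dx_des) ** 2)

    cons = []
    for f_now, J_c, K_c in contacts:
        # Quasi-static prediction: f_next ~= f_now + K_c * (J_c . dq) <= f_max
        cons.append({'type': 'ineq',
                     'fun': lambda dq, f=f_now, J=J_c, K=K_c:
                         f_max - (f + K * float(J @ dq))})
    bounds = [(-dq_max, dq_max)] * n
    res = minimize(objective, np.zeros(n), bounds=bounds,
                   constraints=cons, method='SLSQP')
    return res.x  # joint-angle step to command this control cycle
```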
  • Item
    Tactile Sensing over Articulated Joints with Stretchable Sensors
    (Georgia Institute of Technology, 2013-04) Bhattacharjee, Tapomayukh ; Jain, Advait ; Vaish, Sarvagya ; Killpack, Marc D. ; Kemp, Charles C.
    Biological organisms benefit from tactile sensing across the entire surfaces of their bodies. Robots may also be able to benefit from this type of sensing, but fully covering a robot with robust and capable tactile sensors entails numerous challenges. To date, most tactile sensors for robots have been used to cover rigid surfaces. In this paper, we focus on the challenge of tactile sensing across articulated joints, which requires sensing across a surface whose geometry varies over time. We first demonstrate the importance of sensing across joints by simulating a planar arm reaching in clutter and finding the frequency of contact at the joints. We then present a simple model of how much a tactile sensor would need to stretch in order to cover a 2 degree-of-freedom (DoF) wrist joint. Next, we describe and characterize a new tactile sensor made with stretchable fabrics. Finally, we present results for a stretchable sleeve with 25 tactile sensors that covers the forearm, 2 DoF wrist, and end effector of a humanoid robot. This sleeve enabled the robot to reach a target in instrumented clutter and reduce contact forces.
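As a back-of-the-envelope illustration of why covering an articulated joint requires stretch, the sketch below estimates the stretch ratio for a sleeve over a bending wrist using a simple cylinder approximation. This is an assumed geometric model for illustration only; the paper's own 2-DoF wrist model may differ.

```python
# Back-of-the-envelope estimate (not necessarily the paper's model): treat the
# wrist as a cylinder of radius r covered by a sleeve of resting length L over
# the joint. Bending by angle theta lengthens the outside of the bend by
# roughly r * theta, so the sleeve there must stretch by about 1 + r*theta/L.
import math

def required_stretch(radius_m, sleeve_len_m, pitch_rad, yaw_rad):
    """Estimate the worst-case stretch ratio for a sleeve over a 2-DoF wrist.
    The two bend angles are combined into a single equivalent bend."""
    bend = math.hypot(pitch_rad, yaw_rad)        # combined bend angle
    return 1.0 + radius_m * bend / sleeve_len_m  # stretch ratio on the outside

# Example: 4 cm wrist radius, 8 cm of sleeve over the joint, 60 deg of bend in
# each DoF -> the fabric on the outside must stretch by roughly 75%.
print(required_stretch(0.04, 0.08, math.radians(60), math.radians(60)))
```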
  • Item
    Towards an Assistive Robot that Autonomously Performs Bed Baths for Patient Hygiene
    (Georgia Institute of Technology, 2010-10) King, Chih-Hung ; Chen, Tiffany L. ; Jain, Advait ; Kemp, Charles C.
This paper describes the design and implementation of a behavior that allows a robot with a compliant arm to perform wiping motions that are involved in bed baths. A laser-based operator-selection interface enables an operator to select an area to clean, and the robot autonomously performs a wiping motion using equilibrium point control. We evaluated the performance of the system by measuring the ability of the robot to remove an area of debris on human skin. We tested the performance of the behavior algorithm by commanding the robot to wipe off a 1-inch square area of debris placed on the surface of the upper arm, forearm, thigh, and shank of a human subject. Using image processing, we determined the hue content of the debris and used this representation to determine the percentage of debris that remained on the arm after the robot completed the task. In our experiments, the robot removed most of the debris (>96%) on four parts of the limbs. In addition, the robot performed the wiping task using relatively low force (<3 N).
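The debris-removal metric described above could be computed along the following lines: threshold the before and after images by hue and compare debris pixel counts. The hue band and saturation/value limits below are placeholders, not the paper's exact image-processing pipeline.

```python
# Illustrative computation of a "percentage of debris removed" metric from
# before/after images via hue thresholding; the color band is a placeholder.
import cv2
import numpy as np

def debris_pixel_count(image_bgr, hue_lo=20, hue_hi=35):
    """Count pixels whose hue falls inside the assumed debris color band."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (hue_lo, 60, 60), (hue_hi, 255, 255))
    return int(np.count_nonzero(mask))

def percent_removed(before_bgr, after_bgr):
    """Fraction of debris pixels that disappeared between the two images."""
    before = debris_pixel_count(before_bgr)
    after = debris_pixel_count(after_bgr)
    if before == 0:
        return 0.0
    return 100.0 * (before - after) / before
```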
  • Item
    Operating articulated objects based on experience
    (Georgia Institute of Technology, 2010-10) Sturm, Jürgen ; Jain, Advait ; Stachniss, Cyrill ; Kemp, Charles C. ; Burgard, Wolfram
    Many tasks that would be of benefit to users in domestic environments require that robots manipulate articulated objects such as doors and drawers. In this paper, we present a novel approach that simultaneously estimates the kinematic model of an articulated object based on the trajectory described by the robot's end effector, and uses this model to predict the future trajectory of the end effector. One advantage of our approach is that the robot can directly use these predictions to generate an equilibrium point control path for operating the mechanism. Additionally, our approach can improve these predictions based on previously learned articulation models. We have implemented and tested our approach on a real mobile manipulator. Through 40 trials, we show that the robot can reliably open various household objects, including cabinet doors, sliding doors, office drawers, and a dishwasher. Furthermore, we demonstrate that using the information from previous interactions as a prior significantly improves the prediction accuracy.
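A simplified, planar sketch of the kind of model selection described above: fit both a prismatic (line) model and a rotary (circle) model to the end-effector trajectory observed so far and keep whichever fits better. The fitting routines below are standard least-squares fits chosen for illustration, not the paper's estimator.

```python
# Illustrative planar model selection: fit a line (prismatic joint) and a
# circle (rotary joint) to the handle trajectory and pick the better fit.
import numpy as np

def fit_prismatic(pts):
    """Least-squares line fit; returns (origin, direction, residual)."""
    c = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - c)
    d = vt[0]
    along = np.outer((pts - c) @ d, d)
    resid = np.sqrt(np.mean(np.sum(((pts - c) - along) ** 2, axis=1)))
    return c, d, resid

def fit_rotary(pts):
    """Algebraic (Kasa) circle fit; returns (center, radius, residual)."""
    A = np.c_[2 * pts, np.ones(len(pts))]
    b = np.sum(pts ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:2]
    radius = np.sqrt(sol[2] + center @ center)
    resid = np.sqrt(np.mean((np.linalg.norm(pts - center, axis=1) - radius) ** 2))
    return center, radius, resid

def select_model(pts):
    """Return 'prismatic' or 'rotary' depending on which model fits better."""
    _, _, r_line = fit_prismatic(pts)
    _, _, r_circ = fit_rotary(pts)
    return 'prismatic' if r_line <= r_circ else 'rotary'
```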
  • Item
    The complex structure of simple devices: A survey of trajectories and forces that open doors and drawers
    (Georgia Institute of Technology, 2010-09) Jain, Advait ; Nguyen, Hai ; Rath, Mrinal ; Okerman, Jason ; Kemp, Charles C.
    Instrumental activities of daily living (IADLs) involve physical interactions with diverse mechanical systems found within human environments. In this paper, we describe our efforts to capture the everyday mechanics of doors and drawers, which form an important sub-class of mechanical systems for IADLs. We also discuss the implications of our results for the design of assistive robots. By answering questions such as “How high are the handles of most doors and drawers?” and “What forces are necessary to open most doors and drawers?”, our approach can inform robot designers as they make tradeoffs between competing requirements for assistive robots, such as cost, workspace, and power. Using a custom motion/force capture system, we captured kinematic trajectories and forces while operating 29 doors and 15 drawers in 6 homes and 1 office building in Atlanta, GA, USA. We also hand-measured the kinematics of 299 doors and 152 drawers in 11 area homes. We show that operation of these seemingly simple mechanisms involves significant complexities, including non-linear forces and large kinematic variation. We also show that the data exhibit significant structure. For example, 91.8% of the variation in the force sequences used to open doors can be represented using a 2-dimensional linear subspace. This complexity and structure suggests that capturing everyday mechanics may be a useful approach for improving the design of assistive robots.
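A subspace result like the one reported above (91.8% of the variation in opening-force sequences captured by a 2-dimensional linear subspace) can be checked with PCA over time-normalized force sequences, as in the sketch below. The resampling length and data handling are assumptions for illustration.

```python
# Illustrative PCA check for a claim like "a 2-D linear subspace captures X%
# of the variation": resample each opening-force sequence to a fixed length,
# stack them, and measure the variance explained by the top two components.
import numpy as np

def variance_explained(force_sequences, n_components=2, n_samples=100):
    """force_sequences: list of 1-D arrays of forces recorded while opening.
    Returns the fraction of variance captured by the top n_components."""
    X = np.stack([
        np.interp(np.linspace(0, 1, n_samples),
                  np.linspace(0, 1, len(f)), f)
        for f in force_sequences
    ])
    Xc = X - X.mean(axis=0)              # center the sequences
    _, s, _ = np.linalg.svd(Xc, full_matrices=False)
    var = s ** 2                         # variance along each component
    return var[:n_components].sum() / var.sum()
```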
  • Item
    Pulling Open Doors and Drawers: Coordinating an Omni-directional Base and a Compliant Arm with Equilibrium Point Control
    (Georgia Institute of Technology, 2010-05) Jain, Advait ; Kemp, Charles C.
    Previously, we have presented an implementation of impedance control inspired by the Equilibrium Point Hypothesis that we refer to as equilibrium point control (EPC). We have demonstrated that EPC can enable a robot in a fixed position to robustly pull open a variety of doors and drawers, and infer their kinematics without detailed prior models. In this paper, we extend this framework to support autonomous motion of the robot's omni-directional base both before and during pulling. With our new methods, we show that the robot can autonomously approach and open doors and drawers for which only the location and orientation of the handle have been provided. We also demonstrate that EPC can coordinate the movement of the robot's omni-directional base and compliant arm while pulling open a door or drawer, which leads to significantly improved performance. Through 40 trials with 10 different doors and drawers, we empirically demonstrated the robustness of the system. The robot succeeded in 37 out of 40 trials, and had no more than a single failure for any particular door or drawer.
  • Item
    EL-E: An Assistive Mobile Manipulator that Autonomously Fetches Objects from Flat Surfaces
    (Georgia Institute of Technology, 2010-01-01) Jain, Advait ; Kemp, Charles C.
Assistive mobile robots that autonomously manipulate objects within everyday settings have the potential to improve the lives of the elderly, injured, and disabled. Within this paper, we present the most recent version of the assistive mobile manipulator EL-E with a focus on the subsystem that enables the robot to retrieve objects from and deliver objects to flat surfaces. Once provided with a 3D location via brief illumination with a laser pointer, the robot autonomously approaches the location and then either grasps the nearest object or places an object. We describe our implementation in detail, while highlighting design principles and themes, including the use of specialized behaviors, task-relevant features, and low-dimensional representations. We also present evaluations of EL-E’s performance relative to common forms of variation. We tested EL-E’s ability to approach and grasp objects from the 25 object categories that were ranked most important for robotic retrieval by motor-impaired patients from the Emory ALS Center. Although reliability varied, EL-E succeeded at least once with objects from 21 out of 25 of these categories. EL-E also approached and grasped a cordless telephone on 12 different surfaces including floors, tables, and counter tops with 100% success. The same test using a vitamin pill (ca. 15 mm × 5 mm × 5 mm) resulted in 58% success.
  • Item
    Pulling Open Novel Doors and Drawers with Equilibrium Point Control
    (Georgia Institute of Technology, 2009-12) Jain, Advait ; Kemp, Charles C.
    A large variety of doors and drawers can be found within human environments. Humans regularly operate these mechanisms without difficulty, even if they have not previously interacted with a particular door or drawer. In this paper, we empirically demonstrate that equilibrium point control can enable a humanoid robot to pull open a variety of doors and drawers without detailed prior models, and infer their kinematics in the process. Our implementation uses a 7 DoF anthropomorphic arm with series elastic actuators (SEAs) at each joint, a hook as an end effector, and low mechanical impedance. For our control scheme, each SEA applies a gravity compensating torque plus a torque from a simulated, torsional, viscoelastic spring. Each virtual spring has constant stiffness and damping, and a variable equilibrium angle. These equilibrium angles form a joint space equilibrium point (JEP), which has a corresponding Cartesian space equilibrium point (CEP) for the arm's end effector. We present two controllers that generate a CEP at each time step (ca. 100 ms) and use inverse kinematics to command the arm with the corresponding JEP. One controller produces a linear CEP trajectory and the other alters its CEP trajectory based on real-time estimates of the mechanism's kinematics. We also present results from empirical evaluations of their performance (108 trials). In these trials, both controllers were robust with respect to variations in the mechanism, the pose of the base, the stiffness of the arm, and the way the handle was hooked. We also tested the more successful controller with 12 distinct mechanisms. In these tests, it was able to open 11 of the 12 mechanisms in a single trial, and successfully categorized the 11 mechanisms as having a rotary or prismatic joint, and opening to the right or left. Additionally, in the 7 out of 8 trials with rotary joints, the robot accurately estimated the location of the axis of rotation.