Organizational Unit:
Healthcare Robotics Lab

Publication Search Results

  • Item
    Improving robot manipulation with data-driven object-centric models of everyday forces
    (Georgia Institute of Technology, 2013-06) Jain, Advait; Kemp, Charles C.
    Based on a lifetime of experience, people anticipate the forces associated with performing a manipulation task. In contrast, most robots lack common sense about the forces involved in everyday manipulation tasks. In this paper, we present data-driven methods to inform robots about the forces that they are likely to encounter when performing specific tasks. In the context of door opening, we demonstrate that data-driven object-centric models can be used to haptically recognize specific doors, haptically recognize classes of doors (e.g., refrigerator vs. kitchen cabinet), and haptically detect anomalous forces while opening a door, even when opening a specific door for the first time. We also demonstrate that two distinct robots can use forces captured from people opening doors to better detect anomalous forces. These results illustrate the potential for robots to use shared databases of forces to better manipulate the world and attain common sense about everyday forces. (A toy sketch of this haptic recognition and anomaly-detection idea appears after this list.)
  • Item
    Reaching in clutter with whole-arm tactile sensing
    (Georgia Institute of Technology, 2013-04) Jain, Advait; Killpack, Marc D.; Edsinger, Aaron; Kemp, Charles C.
    Clutter creates challenges for robot manipulation, including a lack of non-contact trajectories and reduced visibility for line-of-sight sensors. We demonstrate that robots can use whole-arm tactile sensing to perceive clutter and maneuver within it, while keeping contact forces low. We first present our approach to manipulation, which emphasizes the benefits of making contact across the entire manipulator and assumes the manipulator has low-stiffness actuation and tactile sensing across its entire surface. We then present a novel controller that exploits these assumptions. The controller only requires haptic sensing, handles multiple contacts, and does not need an explicit model of the environment prior to contact. It uses model predictive control with a time horizon of length one and a linear quasi-static mechanical model. In our experiments, the controller enabled a real robot and a simulated robot to reach goal locations in a variety of environments, including artificial foliage, a cinder block, and randomly generated clutter, while keeping contact forces low. While reaching, the robots performed maneuvers that included bending objects, compressing objects, sliding objects, and pivoting around objects. In simulation, whole-arm tactile sensing also outperformed per-link force–torque sensing in moderate clutter, with the relative benefits increasing with the amount of clutter. (A simplified sketch of this one-step controller appears after this list.)
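
The first abstract's haptic recognition and anomaly detection lend themselves to a toy illustration. The sketch below is not the paper's method, which learns data-driven object-centric models from captured forces; it merely assumes a hypothetical database of per-class mean and standard-deviation force profiles (FORCE_DB, classify_door, anomalous_bins, and every threshold are invented names) and flags measurements that stray several standard deviations from the class mean.

    import numpy as np

    # Hypothetical database of door-opening force profiles: per door class,
    # the mean and standard deviation of handle force sampled at 50 fixed
    # door-angle bins (as if estimated from many recorded trials).
    FORCE_DB = {
        "refrigerator":    {"mean": np.linspace(30.0, 12.0, 50), "std": np.full(50, 3.0)},
        "kitchen_cabinet": {"mean": np.linspace(8.0, 4.0, 50),   "std": np.full(50, 1.5)},
    }

    def classify_door(forces):
        """Haptic class recognition: nearest mean force profile in the L2 sense."""
        return min(FORCE_DB, key=lambda c: np.linalg.norm(forces - FORCE_DB[c]["mean"]))

    def anomalous_bins(forces, door_class, n_sigma=3.0):
        """Flag door-angle bins whose force leaves the expected band for the class."""
        model = FORCE_DB[door_class]
        return np.abs(forces - model["mean"]) > n_sigma * model["std"]

    # Example: a cabinet-like trial with unexpected resistance mid-swing.
    trial = np.linspace(8.0, 4.0, 50)
    trial[25:28] += 20.0
    door_class = classify_door(trial)
    print(door_class, np.flatnonzero(anomalous_bins(trial, door_class)))

Pooling profiles across many doors and many trials is what would let a robot apply the same check to a door it has never opened, in the spirit of the shared force databases the abstract describes.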
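
The second abstract pins its controller down precisely enough to sketch: model predictive control with a time horizon of length one and a linear quasi-static mechanical model. The code below is an assumption-laden stand-in rather than the authors' controller; it swaps their optimization for a soft-penalty regularized least-squares step, and every name and parameter (greedy_mpc_step, J_c, f_max, the weights) is hypothetical.

    import numpy as np

    def greedy_mpc_step(x_ee, x_goal, J_ee, J_c, f_c, f_max=5.0,
                        w_goal=1.0, w_force=10.0, damping=0.1, dq_max=0.02):
        """One-step MPC under a linear quasi-static model (hypothetical API).

        x_ee   : current end-effector position, shape (3,)
        x_goal : goal position, shape (3,)
        J_ee   : end-effector Jacobian, shape (3, n)
        J_c    : stacked contact Jacobians; predicted force change is J_c @ dq, shape (m, n)
        f_c    : currently sensed contact-force magnitudes, shape (m,)
        """
        n = J_ee.shape[1]
        # Under the quasi-static model, both the end-effector displacement
        # (J_ee @ dq) and the contact-force change (J_c @ dq) are linear in
        # the joint displacement dq, so one step reduces to regularized
        # least squares: move toward the goal, push any force that exceeds
        # f_max back toward the limit, and keep dq small.
        excess = np.maximum(f_c - f_max, 0.0)
        A = np.vstack([w_goal * J_ee,
                       w_force * J_c,
                       damping * np.eye(n)])
        b = np.concatenate([w_goal * (x_goal - x_ee),
                            -w_force * excess,
                            np.zeros(n)])
        dq, *_ = np.linalg.lstsq(A, b, rcond=None)
        return np.clip(dq, -dq_max, dq_max)  # conservative per-step joint limit

    # Example with a 4-DOF arm and two active contacts (all values made up).
    rng = np.random.default_rng(0)
    dq = greedy_mpc_step(x_ee=np.zeros(3), x_goal=np.array([0.3, 0.0, 0.1]),
                         J_ee=rng.standard_normal((3, 4)),
                         J_c=rng.standard_normal((2, 4)),
                         f_c=np.array([2.0, 7.5]))
    print(dq)

The horizon of length one keeps each step cheap: only the immediate joint displacement dq is chosen, and under the quasi-static assumption no explicit environment model is needed beyond the currently sensed contacts, matching the abstract's claim that the controller requires only haptic sensing.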