Title:
Reaching in clutter with whole-arm tactile sensing

Author(s)
Jain, Advait
Killpack, Marc D.
Edsinger, Aaron
Kemp, Charles C.
Abstract
Clutter creates challenges for robot manipulation, including a lack of non-contact trajectories and reduced visibility for line-of-sight sensors. We demonstrate that robots can use whole-arm tactile sensing to perceive clutter and maneuver within it, while keeping contact forces low. We first present our approach to manipulation, which emphasizes the benefits of making contact across the entire manipulator and assumes the manipulator has low-stiffness actuation and tactile sensing across its entire surface. We then present a novel controller that exploits these assumptions. The controller only requires haptic sensing, handles multiple contacts, and does not need an explicit model of the environment prior to contact. It uses model predictive control with a time horizon of length one and a linear quasi-static mechanical model. In our experiments, the controller enabled a real robot and a simulated robot to reach goal locations in a variety of environments, including artificial foliage, a cinder block, and randomly generated clutter, while keeping contact forces low. While reaching, the robots performed maneuvers that included bending objects, compressing objects, sliding objects, and pivoting around objects. In simulation, whole-arm tactile sensing also outperformed per-link force–torque sensing in moderate clutter, with the relative benefits increasing with the amount of clutter.
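To make the controller description in the abstract concrete, the following Python sketch illustrates one plausible reading of "model predictive control with a time horizon of length one and a linear quasi-static mechanical model." It is an assumption-laden illustration, not the authors' implementation: the function name, arguments, stiffness estimates, and penalty-based formulation are hypothetical. The idea shown is that a small joint-angle step dq linearly perturbs each sensed contact force through an estimated contact stiffness and Jacobian, and the controller chooses dq to move toward the goal while pulling any over-limit contact force back below a threshold.

    import numpy as np

    def one_step_quasistatic_mpc(J_ee, x_err, contact_jacobians, contact_stiffnesses,
                                 contact_forces, f_max=5.0, force_weight=100.0,
                                 reg=1e-3, step_limit=0.02):
        # Illustrative sketch only; names and the penalty formulation are
        # assumptions, not the controller from the paper.
        #
        # Linearized quasi-static model: a joint step dq changes contact force i by
        #   df_i ~= K_i @ J_i @ dq
        # We minimize
        #   || J_ee @ dq - x_err ||^2
        #   + force_weight * sum_i || K_i @ J_i @ dq - d_i ||^2
        #   + reg * || dq ||^2
        # where d_i pulls any over-limit contact force back toward f_max.
        n = J_ee.shape[1]
        H = J_ee.T @ J_ee + reg * np.eye(n)
        g = J_ee.T @ x_err
        for J_c, K_c, f in zip(contact_jacobians, contact_stiffnesses, contact_forces):
            A = K_c @ J_c                      # maps dq to predicted force change
            mag = np.linalg.norm(f)
            if mag > f_max:                    # desired change: shrink force toward the limit
                d = (f_max / mag - 1.0) * f
            else:
                d = np.zeros(3)
            H += force_weight * (A.T @ A)
            g += force_weight * (A.T @ d)
        dq = np.linalg.solve(H, g)             # optimum for the single-step (horizon one) cost
        norm = np.linalg.norm(dq)
        if norm > step_limit:                  # keep each commanded step small
            dq *= step_limit / norm
        return dq

Under this reading, the optimization is re-solved at every control cycle from the latest whole-arm tactile measurements, which is what a time horizon of length one amounts to: the controller never plans farther ahead than the next small joint motion.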
Date Issued
2013-04
Resource Type
Text
Resource Subtype
Article