Sensing Touch from Vision for Humans and Robots

Author(s)
Grady, Patrick
Abstract
To affect their environment, humans and robots use their hands and grippers to push, pick up, and manipulate the world around them. At the core of this interaction is physical contact, which determines the underlying mechanics of the grasp. While contact is useful for understanding manipulation, it is difficult to measure. In this thesis, we explore methods to estimate contact between humans, robots, and objects using easy-to-collect imagery. First, we demonstrate a method that leverages subtle visual changes to infer the pressure between a human hand and a surface from RGB images. We initially explore this work in a constrained laboratory setting, but also develop a weakly supervised data collection technique to estimate hand pressure in less constrained settings. A parallel approach allows us to estimate the pressure and force that soft robotic grippers apply to their environments, enabling precise closed-loop control of a robot. Finally, we develop a joint pose and contact estimator that may generalize to internet-scale images. Our model leverages multiple heterogeneously labeled datasets as well as images with contact labeled by human annotators. Overall, this thesis makes progress toward understanding human and robot manipulation from visual sensing alone.
Date
2023-12-06
Resource Type
Text
Resource Subtype
Dissertation