Title:
A Point-and-Click Interface for the Real World: Laser Designation of Objects for Mobile Manipulation

Author(s)
Kemp, Charles C.
Anderson, Cressel D.
Nguyen, Hai
Trevor, Alexander J. B.
Xu, Zhe
Abstract
We present a novel interface for human-robot interaction that enables a human to intuitively and unambiguously select a 3D location in the world and communicate it to a mobile robot. The human points at a location of interest and illuminates it (“clicks it”) with an unaltered, off-the-shelf green laser pointer. The robot detects the resulting laser spot with an omnidirectional, catadioptric camera fitted with a narrow-band green filter. After detection, the robot moves its stereo pan/tilt camera to look at this location and estimates the location’s 3D position with respect to the robot’s frame of reference. Unlike previous approaches, this interface for gesture-based pointing requires no instrumentation of the environment, makes use of a non-instrumented everyday pointing device, has low spatial error out to 3 meters, is fully mobile, and is robust enough for use in real-world applications. We demonstrate that this human-robot interface enables a person to designate a wide variety of everyday objects placed throughout a room. In 99.4% of these tests, the robot successfully looked at the designated object and estimated its 3D position with low average error. We also show that this interface can support object acquisition by a mobile manipulator. For this application, the user selects an object to be picked up from the floor by “clicking” on it with the laser pointer interface. In 90% of these trials, the robot successfully moved to the designated object and picked it up off the floor.
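
To make the detection step concrete, the sketch below (not from the paper; a minimal illustration assuming a standard BGR camera frame and OpenCV) locates a bright, green-dominant spot in a single image by color thresholding. This is only a rough software stand-in for the hardware narrow-band green filter and omnidirectional camera described in the abstract; the function name and thresholds are hypothetical.

    # Minimal sketch: find a green laser spot in one camera frame.
    # Assumptions: BGR uint8 input, bright spot with strong green dominance.
    import cv2
    import numpy as np

    def find_laser_spot(frame_bgr, min_brightness=200, dominance=40):
        """Return (x, y) pixel coordinates of the brightest green blob, or None."""
        b, g, r = cv2.split(frame_bgr)
        # Keep pixels where green clearly dominates blue/red and is very bright.
        green_dominant = (g.astype(np.int16) - np.maximum(b, r).astype(np.int16)) > dominance
        bright = g > min_brightness
        mask = (green_dominant & bright).astype(np.uint8) * 255
        # Remove isolated noise pixels before computing the spot centroid.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        moments = cv2.moments(mask, binaryImage=True)
        if moments["m00"] == 0:
            return None
        return (moments["m10"] / moments["m00"], moments["m01"] / moments["m00"])

In the full system, the detected spot would then drive the stereo pan/tilt camera toward the illuminated location so that its 3D position can be estimated in the robot's frame of reference.
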
Date Issued
2008-03
Resource Type
Text
Resource Subtype
Proceedings
Post-print