Title:
Laser Pointers and a Touch Screen: Intuitive Interfaces for Autonomous Mobile Manipulation for the Motor Impaired

Author(s)
Choi, Young Sang
Anderson, Cressel D.
Glass, Jonathan D.
Kemp, Charles C.
Abstract
El-E (“Ellie”) is a prototype assistive robot designed to help people with severe motor impairments manipulate everyday objects. When given a 3D location, El-E can autonomously approach the location and pick up a nearby object. Based on interviews with patients with amyotrophic lateral sclerosis (ALS), we have developed and tested three distinct interfaces that enable a user to provide a 3D location to El-E and thereby select an object to be manipulated: an ear-mounted laser pointer, a hand-held laser pointer, and a touch screen interface. Within this paper, we present the results of a user study comparing these three interfaces across a total of 134 trials involving eight patients with varying levels of impairment recruited from the Emory ALS Clinic. During this study, participants used the three interfaces to select everyday objects to be approached, grasped, and lifted off the ground. The three interfaces enabled motor impaired users to command a robot to pick up an object with a 94.8% success rate overall after less than 10 minutes of learning to use each interface. On average, users selected objects 69% more quickly with the laser pointer interfaces than with the touch screen interface. We also found substantial variation in user preference. With respect to the Revised ALS Functional Rating Scale (ALSFRS-R), users with greater upper-limb mobility tended to prefer the hand-held laser pointer, while those with less upper-limb mobility tended to prefer the ear-mounted laser pointer. Despite the extra efficiency of the laser pointer interfaces, three patients preferred the touch screen interface, which has unique potential for manipulating remote objects out of the user’s line of sight. In summary, these results indicate that robots can enhance accessibility by supporting multiple interfaces. Furthermore, this work demonstrates that the communication of 3D locations during human-robot interaction can serve as a powerful abstraction barrier that supports distinct interfaces to assistive robots while using identical, underlying robotic functionality.
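
The abstraction barrier described in the abstract can be illustrated with a short sketch: whichever interface the user chooses, its only job is to produce a 3D location, which a single, unchanged fetch routine then consumes. The sketch below is hypothetical and not the authors' code; the names (Point3D, SelectionInterface, fetch_object) and the placeholder coordinates are illustrative assumptions.

from dataclasses import dataclass
from typing import Protocol


@dataclass
class Point3D:
    """A 3D target location in the robot's frame, in meters."""
    x: float
    y: float
    z: float


class SelectionInterface(Protocol):
    """Any user interface that can yield a 3D target location."""
    def get_target(self) -> Point3D: ...


class HandHeldLaserPointer:
    def get_target(self) -> Point3D:
        # In the real system the laser spot would be detected and
        # triangulated by the robot's cameras; placeholder value here.
        return Point3D(1.2, 0.4, 0.75)


class TouchScreenInterface:
    def get_target(self) -> Point3D:
        # A touch on the screen would be mapped to a 3D point via the
        # robot's sensing; placeholder value for illustration.
        return Point3D(0.9, -0.3, 0.8)


def fetch_object(interface: SelectionInterface) -> None:
    """Identical underlying robotic functionality, regardless of interface."""
    target = interface.get_target()
    # approach(target); grasp_near(target); lift()
    print(f"Approaching and grasping near ({target.x}, {target.y}, {target.z})")


if __name__ == "__main__":
    for ui in (HandHeldLaserPointer(), TouchScreenInterface()):
        fetch_object(ui)

Because every interface reduces to the same Point3D output, new selection methods can be added without touching the approach-and-grasp behavior, which is the design point the abstract makes.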
Date Issued
2008-10
Resource Type
Text
Resource Subtype
Proceedings
Post-print