Title:
Generating Human-like Motion for Robots
Author(s)
Gielniak, Michael J.
Liu, C. Karen
Thomaz, Andrea L.
Abstract
Action prediction and fluidity are key elements of human-robot teamwork. If a robot's actions are hard to understand, they can impede fluid human-robot interaction (HRI). Our goal is to improve the clarity of robot motion by making it more human-like. We present an algorithm that autonomously synthesizes human-like variants of an input motion, structured as a three-stage pipeline. First, we optimize motion with respect to spatio-temporal correspondence (STC), which emulates the coordinated effects of human joints that are connected by muscles. Three experiments validate that our STC optimization increases human-likeness and recognition accuracy for human social partners. Second, we avoid repetitive motion by adding variance: exploiting redundant and underutilized spaces of the input motion creates multiple distinct motions from a single input. Two experiments validate that our variance approach maintains the human-likeness achieved in the previous step and that a social partner can still accurately recognize the motion's intent. Third, we preserve the robot's ability to interact with its world by enabling it to satisfy constraints, and we provide an experimental analysis of how constraints affect the synthesized human-like motion variants.
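To make the pipeline concrete, the following is a minimal Python sketch of the three stages, assuming a motion represented as a (T, J) NumPy array of joint angles over T timesteps. Every function name and numerical choice here is an illustrative assumption, not the paper's method: a neighbor-coupling smoother stands in for STC optimization, a PCA-based perturbation stands in for the exploitation of underutilized spaces, and a joint-limit clamp stands in for constraint satisfaction.

import numpy as np

def optimize_stc(trajectory, coupling_weight=0.5):
    """Stage 1 (sketch): couple neighboring joints and smooth over time,
    a crude stand-in for spatio-temporal correspondence (STC)
    optimization. `trajectory` is a (T, J) array of joint angles."""
    smoothed = trajectory.copy()
    # Spatial coupling: pull each joint toward its index-neighbors,
    # loosely emulating muscle-linked coordination along the chain.
    smoothed[:, 1:-1] += coupling_weight * 0.5 * (
        trajectory[:, :-2] + trajectory[:, 2:] - 2.0 * trajectory[:, 1:-1]
    )
    # Temporal smoothing: average each frame with its neighbors.
    smoothed[1:-1] = 0.5 * smoothed[1:-1] + 0.25 * (smoothed[:-2] + smoothed[2:])
    return smoothed

def add_variance(trajectory, rng, scale=0.05):
    """Stage 2 (sketch): perturb the motion along its least-used
    principal direction so repeated calls yield distinct variants."""
    centered = trajectory - trajectory.mean(axis=0)
    _, _, components = np.linalg.svd(centered, full_matrices=False)
    # components[-1] is the direction the input motion uses least.
    noise = rng.normal(0.0, scale, size=trajectory.shape[0])
    return trajectory + np.outer(noise, components[-1])

def enforce_constraints(trajectory, lower, upper):
    """Stage 3 (sketch): clamp joint angles to limits, a placeholder
    for richer task constraints (e.g., end effector on target)."""
    return np.clip(trajectory, lower, upper)

# Usage: generate one constrained, human-like variant of an input motion.
rng = np.random.default_rng(seed=0)
motion = np.sin(np.linspace(0.0, 2.0 * np.pi, 100))[:, None] * np.ones(7)
variant = enforce_constraints(add_variance(optimize_stc(motion), rng), -2.0, 2.0)

The structural point the sketch preserves is the ordering: human-likeness is optimized first, variance is injected second so each variant inherits that quality, and constraints are enforced last so every variant remains usable for the task.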
Date Issued
2013-07
Resource Type
Text
Resource Subtype
Article
Post-print