Title:
Generating Human-like Motion for Robots

dc.contributor.author Gielniak, Michael J. en_US
dc.contributor.author Liu, C. Karen en_US
dc.contributor.author Thomaz, Andrea L. en_US
dc.contributor.corporatename Georgia Institute of Technology. Center for Robotics and Intelligent Machines en_US
dc.contributor.corporatename Georgia Institute of Technology. College of Computing en_US
dc.contributor.corporatename Georgia Institute of Technology. Socially Intelligent Machines Lab en_US
dc.date.accessioned 2013-07-19T15:22:33Z
dc.date.available 2013-07-19T15:22:33Z
dc.date.issued 2013-07
dc.description © The Author(s) 2013 en_US
dc.description The definitive article is published in the International Journal of Robotics Research © Sage Publications. Located at: http://ijr.sagepub.com/content/early/2013/07/12/0278364913490533 en_US
dc.description DOI: 10.1177/0278364913490533 en_US
dc.description.abstract Action prediction and fluidity are key elements of human-robot teamwork. If a robot’s actions are hard to understand, it can impede fluid HRI. Our goal is to improve the clarity of robot motion by making it more human-like. We present an algorithm that autonomously synthesizes human-like variants of an input motion. Our approach is a three-stage pipeline. First, we optimize motion with respect to spatio-temporal correspondence (STC), which emulates the coordinated effects of human joints that are connected by muscles. We present three experiments validating that our STC optimization approach increases human-likeness and recognition accuracy for human social partners. Next in the pipeline, we avoid repetitive motion by adding variance, exploiting redundant and underutilized spaces of the input motion to create multiple motions from a single input. In two experiments we validate that our variance approach maintains the human-likeness from the previous step, and that a social partner can still accurately recognize the motion’s intent. As a final step, we maintain the robot’s ability to interact with its world by providing it the ability to satisfy constraints. We provide experimental analysis of the effects of constraints on the synthesized human-like robot motion variants. en_US
dc.identifier.citation Michael J. Gielniak, C. Karen Liu, and Andrea L. Thomaz, "Generating Human-like Motion for Robots," The International Journal of Robotics Research, 0278364913490533, first published on July 15, 2013. en_US
dc.identifier.doi 10.1177/0278364913490533
dc.identifier.issn 0278-3649
dc.identifier.uri http://hdl.handle.net/1853/48472
dc.language.iso en_US en_US
dc.publisher Georgia Institute of Technology en_US
dc.publisher.original Sage Publications en_US
dc.subject Action prediction en_US
dc.subject Fluidity en_US
dc.subject Spatio-temporal correspondence en_US
dc.subject Human-like motion for robots en_US
dc.title Generating Human-like Motion for Robots en_US
dc.title.alternative Generating Life-like Motion for Robots en_US
dc.type Text
dc.type.genre Article
dc.type.genre Post-print
dspace.entity.type Publication
local.contributor.corporatename College of Computing
local.contributor.corporatename Socially Intelligent Machines Lab
local.contributor.corporatename Institute for Robotics and Intelligent Machines (IRIM)
relation.isOrgUnitOfPublication c8892b3c-8db6-4b7b-a33a-1b67f7db2021
relation.isOrgUnitOfPublication 57e47d4b-8e04-4c68-a99e-2cb4580b4844
relation.isOrgUnitOfPublication 66259949-abfd-45c2-9dcc-5a6f2c013bcf
Files
Original bundle
Name: SIM_Gielniak_2013.001.pdf
Size: 773 KB
Format: Adobe Portable Document Format