Organizational Unit:
Socially Intelligent Machines Lab


Publication Search Results

  • Item
    Generating Human-like Motion for Robots
    (Georgia Institute of Technology, 2013-07) Gielniak, Michael J. ; Liu, C. Karen ; Thomaz, Andrea L.
    Action prediction and fluidity are key elements of human-robot teamwork. If a robot’s actions are hard to understand, they can impede fluid HRI. Our goal is to improve the clarity of robot motion by making it more human-like. We present an algorithm that autonomously synthesizes human-like variants of an input motion. Our approach is a three-stage pipeline. First, we optimize motion with respect to spatio-temporal correspondence (STC), which emulates the coordinated effects of human joints that are connected by muscles. We present three experiments validating that our STC optimization approach increases human-likeness and recognition accuracy for human social partners. Next in the pipeline, we avoid repetitive motion by adding variance, exploiting redundant and underutilized spaces of the input motion to create multiple motions from a single input. In two experiments we validate that our variance approach maintains the human-likeness from the previous step, and that a social partner can still accurately recognize the motion’s intent. As a final step, we maintain the robot’s ability to interact with its world by providing it the ability to satisfy constraints. We provide experimental analysis of the effects of constraints on the synthesized human-like robot motion variants.
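    As a rough illustration, the three stages of this pipeline can be viewed as composable steps. The functions below are simplified placeholders (smoothing standing in for STC optimization, random perturbation for variance, joint-limit clamping for constraint satisfaction), not the thesis's actual algorithms:

    ```python
    import numpy as np

    def stc_optimize(motion):
        """Stage 1 (placeholder): smooth each joint trajectory to emulate
        coordinated joint movement; stands in for the STC optimization."""
        kernel = np.array([0.25, 0.5, 0.25])
        return np.apply_along_axis(
            lambda col: np.convolve(col, kernel, mode="same"), 0, motion)

    def add_variance(motion, scale=0.05, rng=None):
        """Stage 2 (placeholder): perturb the motion to produce a
        non-repetitive variant of the same input."""
        rng = np.random.default_rng(rng)
        return motion + scale * rng.standard_normal(motion.shape)

    def apply_constraints(motion, limits):
        """Stage 3 (placeholder): clamp joints to limits so the
        variant remains executable on hardware."""
        lo, hi = limits
        return np.clip(motion, lo, hi)

    motion = np.linspace(0.0, 1.0, 50)[:, None] * np.ones((1, 4))  # 50 frames, 4 joints
    variant = apply_constraints(add_variance(stc_optimize(motion), rng=0), (-1.0, 1.0))
    print(variant.shape)  # (50, 4)
    ```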
  • Item
    Enhancing Interaction Through Exaggerated Motion Synthesis
    (Georgia Institute of Technology, 2012-03) Gielniak, Michael J. ; Thomaz, Andrea L.
    Other than eye gaze and referential gestures (e.g. pointing), the relationship between robot motion and observer attention is not well understood. We explore this relationship to achieve social goals, such as influencing human partner behavior or directing attention. We present an algorithm that creates exaggerated variants of a motion in real-time. Through two experiments we confirm that exaggerated motion is perceptibly different from the input motion, provided that the motion is sufficiently exaggerated. We found that different levels of exaggeration correlate to human expectations of robot-like, human-like, and cartoon-like motion. We present empirical evidence that the use of exaggerated motion in experiments enhances the interaction through the benefits of increased engagement and perceived entertainment value. Finally, we provide statistical evidence that exaggerated motion causes a human partner to have better retention of interaction details and predictable gaze direction.
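    A minimal sketch of the underlying idea, assuming exaggeration amounts to amplifying each frame's deviation from a neutral pose (the paper's real-time algorithm is more sophisticated; the poses below are made up):

    ```python
    import numpy as np

    def exaggerate(motion, neutral, gain):
        """Amplify each frame's deviation from a neutral pose.
        gain = 1.0 reproduces the input; larger gains exaggerate it."""
        return neutral + gain * (motion - neutral)

    neutral = np.zeros(3)                      # neutral joint angles (rad)
    motion = np.array([[0.1, -0.2, 0.3],
                       [0.2, -0.1, 0.4]])      # two frames, three joints
    print(exaggerate(motion, neutral, 1.0))    # unchanged input
    print(exaggerate(motion, neutral, 1.5))    # cartoon-like variant
    ```

    Sweeping `gain` from near 1 upward would correspond to the robot-like, human-like, and cartoon-like levels the experiments compare.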
  • Item
    Task-Aware Variations in Robot Motion
    (Georgia Institute of Technology, 2011-05) Gielniak, Michael J. ; Liu, C. Karen ; Thomaz, Andrea L.
    Social robots can benefit from motion variance because non-repetitive gestures will be more natural and intuitive for human partners. We introduce a new approach for synthesizing variance, both with and without constraints, using a stochastic process. Based on optimal control theory and operational space control, our method can generate an infinite number of variations in real-time that resemble the kinematic and dynamic characteristics from the single input motion sequence. We also introduce a stochastic method to generate smooth but nondeterministic transitions between arbitrary motion variants. Furthermore, we quantitatively evaluate task-aware variance against random white torque noise, operational space control, style-based inverse kinematics, and retargeted human motion to prove that task-aware variance generates human-like motion. Finally, we demonstrate the ability of task-aware variance to maintain velocity and time-dependent features that exist in the input motion.
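    The constraint-preserving part of this idea can be sketched with a null-space projection from operational space control. The Jacobian and noise below are toy values, and the paper's method additionally shapes the noise by the input motion's dynamics:

    ```python
    import numpy as np

    def task_aware_noise(J, tau_noise):
        """Project random torque noise into the null space of the task
        Jacobian J, so the variation never disturbs the task itself.
        (Sketch of the operational-space idea only.)"""
        n = J.shape[1]
        N = np.eye(n) - np.linalg.pinv(J) @ J   # null-space projector
        return N @ tau_noise

    J = np.array([[1.0, 0.0, 1.0]])             # 1-D task, 3 joints (toy numbers)
    rng = np.random.default_rng(0)
    tau = task_aware_noise(J, rng.standard_normal(3))
    print(np.allclose(J @ tau, 0.0))            # True: task acceleration unaffected
    ```

    Because the noise lives entirely in the redundant space, every sampled variant still satisfies the task constraint, which is what lets the method generate unlimited variations in real-time.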
  • Item
    Anticipation in Robot Motion
    (Georgia Institute of Technology, 2011) Gielniak, Michael J. ; Thomaz, Andrea L.
    Robots that display anticipatory motion provide their human partners with greater time to respond in interactive tasks because human partners are aware of robot intent earlier. We create anticipatory motion autonomously from a single motion exemplar by extracting hand and body symbols that communicate motion intent and moving them earlier in the motion. We validate that our algorithm extracts the most salient frame (i.e. the correct symbol) which is the most informative about motion intent to human observers. Furthermore, we show that anticipatory variants allow humans to discern motion intent sooner than motions without anticipation, and that humans are able to reliably predict motion intent prior to the symbol frame when motion is anticipatory. Finally, we quantified the time range for robot motion when humans can perceive intent more accurately and the collaborative social benefits of anticipatory motion are greatest.
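    A toy sketch of the idea, assuming the symbol frame is simply the most distinctive pose and anticipation is a time-warp that reaches it earlier (the paper's saliency extraction is more involved; the pulse motion below is fabricated):

    ```python
    import numpy as np

    def symbol_index(motion):
        """Pick the most salient frame: here, the frame farthest from the
        mean pose (a stand-in for the paper's saliency analysis)."""
        return int(np.argmax(np.linalg.norm(motion - motion.mean(axis=0), axis=1)))

    def anticipate(motion, advance):
        """Time-warp the approach so the symbol frame occurs `advance`
        frames earlier; the end is held so total length is unchanged."""
        k = symbol_index(motion)
        new_k = max(k - advance, 1)
        idx = np.linspace(0, k, new_k + 1).round().astype(int)  # resample approach
        head = motion[idx]
        tail = motion[k + 1:]
        pad = np.repeat(motion[-1:], len(motion) - len(head) - len(tail), axis=0)
        return np.vstack([head, tail, pad])

    motion = np.zeros((20, 1))
    motion[12] = 1.0                 # the "symbol" pose at frame 12
    early = anticipate(motion, 4)
    print(symbol_index(motion), symbol_index(early))  # 12 8
    ```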
  • Item
    Spatiotemporal Correspondence as a Metric for Human-like Robot Motion
    (Georgia Institute of Technology, 2011) Gielniak, Michael J. ; Thomaz, Andrea L.
    Coupled degrees-of-freedom exhibit correspondence, in that their trajectories influence each other. In this paper we add evidence to the hypothesis that spatiotemporal correspondence (STC) of distributed actuators is a component of human-like motion. We demonstrate a method for making robot motion more human-like, by optimizing with respect to a nonlinear STC metric. Quantitative evaluation of STC between coordinated robot motion, human motion capture data, and retargeted human motion capture data projected onto an anthropomorphic robot suggests that coordinating robot motion with respect to the STC metric makes the motion more human-like. A user study based on mimicking shows that STC-optimized motion is (1) more often recognized as a common human motion, (2) more accurately identified as the originally intended motion, and (3) mimicked more accurately than a non-optimized version. We conclude that coordinating robot motion with respect to the STC metric makes the motion more human-like. Finally, we present and discuss data on potential reasons why coordinating motion increases recognition and ability to mimic.
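    The paper's STC metric is nonlinear; purely as a loose illustration, a crude correspondence proxy can be computed from correlations between joint velocity profiles, where coupled joints score higher than independent ones:

    ```python
    import numpy as np

    def stc_score(traj):
        """Crude correspondence proxy: mean absolute correlation between
        joint velocity profiles. Coordinated motion scores near 1.
        (Illustrative stand-in only, not the paper's STC metric.)"""
        vel = np.diff(traj, axis=0)
        C = np.corrcoef(vel.T)
        off_diagonal = C[~np.eye(C.shape[0], dtype=bool)]
        return float(np.mean(np.abs(off_diagonal)))

    t = np.linspace(0, 2 * np.pi, 100)
    coordinated = np.column_stack([np.sin(t), 0.5 * np.sin(t)])   # coupled joints
    rng = np.random.default_rng(0)
    uncoordinated = rng.standard_normal((100, 2))                 # independent joints
    print(stc_score(coordinated) > stc_score(uncoordinated))      # True
    ```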
  • Item
    Stylized Motion Generalization Through Adaptation of Velocity Profiles
    (Georgia Institute of Technology, 2010) Gielniak, Michael J. ; Liu, C. Karen ; Thomaz, Andrea L.
    Stylized motion is prevalent in the field of Human-Robot Interaction (HRI). Robot designers typically hand-craft behaviors, or work with professional animators to design them, so that a robot will be communicative or life-like when interacting with a human partner. A challenge is to apply this stylized trajectory in varied contexts (e.g. performing a stylized gesture with different end-effector constraints). The goal of this research is to create useful, task-based motion with variance that spans the reachable space of the robot and satisfies constraints, while preserving the “style” of the original motion. We claim the appropriate representation for adapting and generalizing a trajectory is not in Cartesian or joint angle space, but rather in joint velocity space, which allows for unspecified initial conditions to be supplied by interaction with the dynamic environment. The benefit of this representation is that a single trajectory can be extended to accomplish similar tasks in the world given constraints in the environment. We present quantitative data using a continuity metric to prove that, given a stylized initial trajectory, we can create smoother generalized motion than with traditional techniques such as cyclic-coordinate descent.
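    The core representational claim can be sketched as rebuilding a trajectory from the input's velocity profile under a new initial condition. The `generalize` helper and trajectory below are illustrative assumptions, not the paper's algorithm (which additionally adapts the profile to constraints):

    ```python
    import numpy as np

    def generalize(qdot, q0_new, dt):
        """Reconstruct a trajectory from a joint-velocity profile starting
        at a new initial pose: q[t] = q0_new + integral of qdot.
        The velocity profile, not the positions, carries the 'style'."""
        disp = np.vstack([np.zeros((1, qdot.shape[1])),
                          np.cumsum(qdot[:-1] * dt, axis=0)])
        return q0_new + disp

    dt = 0.01
    t = np.arange(0, 1, dt)
    q = np.sin(2 * np.pi * t)[:, None]        # original stylized trajectory
    qdot = np.gradient(q, dt, axis=0)         # its velocity profile
    q_new = generalize(qdot, q0_new=0.5, dt=dt)
    print(q_new[0])                           # starts at the new initial pose, 0.5
    ```

    Because only velocities are retargeted, the reconstructed motion keeps the original's velocity-space shape while starting wherever the environment dictates.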
  • Item
    Secondary Action in Robot Motion
    (Georgia Institute of Technology, 2010) Gielniak, Michael J. ; Liu, C. Karen ; Thomaz, Andrea L.
    Secondary action, a concept borrowed from character animation, improves animation realism by augmenting primary action with natural, passive motion. We use dynamic simulation to induce three techniques of secondary motion on robot hardware, which exploit actuation passivity to overcome hardware constraints and change the dynamic perception of the robot and its motion characteristics. We present results of secondary motion due to internal and external forces, including a discussion of how to choose the appropriate technique for a particular application.
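    As an illustrative sketch of passively induced secondary motion, an unactuated appendage can be modeled as a spring-damper driven by the base link's acceleration. All constants below are made up for demonstration and are not from the paper:

    ```python
    import numpy as np

    def simulate_passive_joint(base_acc, dt=0.01, k=40.0, c=4.0):
        """Simulate an unactuated (passive) appendage as a spring-damper
        reacting to the base link's acceleration: the appendage lags and
        overshoots, yielding follow-through style secondary motion."""
        theta, omega = 0.0, 0.0
        out = []
        for a in base_acc:
            alpha = -k * theta - c * omega - a   # reaction to base acceleration
            omega += alpha * dt                  # semi-implicit Euler step
            theta += omega * dt
            out.append(theta)
        return np.array(out)

    # a sudden stop of the base produces a decaying swing in the appendage
    base_acc = np.concatenate([np.full(10, 5.0), np.zeros(190)])
    theta = simulate_passive_joint(base_acc)
    print(theta.min() < 0.0)   # appendage swings away, oscillates, settles
    ```

    No torque is ever commanded to the passive joint; all of its motion is induced by the primary action, which is the sense in which actuation passivity is exploited.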