Organizational Unit:
Humanoid Robotics Laboratory

Publication Search Results

  • Item
    Correct Software Synthesis for Stable Speed-Controlled Robotic Walking
    (Georgia Institute of Technology, 2013-06) Dantam, Neil; Hereid, Ayonga; Ames, Aaron; Stilman, Mike
    We present a software synthesis method for speed-controlled robot walking based on supervisory control of a context-free Motion Grammar. First, we use Human-Inspired control to identify parameters for fixed speed walking and for transitions between fixed speeds, guaranteeing dynamic stability. Next, we build a Motion Grammar representing the discrete-time control for this set of speeds. Then, we synthesize C code from this grammar and generate supervisors online to achieve desired walking speeds, guaranteeing correctness of discrete computation. Finally, we demonstrate this approach on the Aldebaran NAO, showing stable walking transitions with dynamically selected speeds.
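    As an illustration of the supervisory idea in this abstract, the sketch below supervises transitions over a small, hypothetical set of walking speeds; the speed labels and transition table are assumptions for the example, not the grammar or parameters identified in the paper.
    ```python
    # Illustrative sketch only: a toy supervisor over a discrete set of walking
    # speeds, loosely modeled on supervising a grammar of speed transitions.
    # The speed values and transition rule are hypothetical, not the paper's,
    # which identifies its parameters via Human-Inspired control.

    ALLOWED_TRANSITIONS = {
        "slow":   {"slow", "medium"},
        "medium": {"slow", "medium", "fast"},
        "fast":   {"medium", "fast"},
    }

    def supervise(current: str, requested: str) -> str:
        """Return the next speed, permitting only transitions the grammar allows."""
        if requested in ALLOWED_TRANSITIONS[current]:
            return requested
        # Disallowed request: stay at the current speed (a conservative policy).
        return current

    # Example: the robot cannot jump directly from "slow" to "fast".
    state = "slow"
    for goal in ["fast", "medium", "fast"]:
        state = supervise(state, goal)
        print(state)   # -> slow, medium, fast
    ```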
  • Item
    Path Planning with Uncertainty: Voronoi Uncertainty Fields
    (Georgia Institute of Technology, 2013-05) Ok, Kyel; Ansari, Sameer; Gallagher, Billy; Sica, William; Dellaert, Frank; Stilman, Mike
    In this paper, a two-level path planning algorithm that deals with map uncertainty is proposed. The higher level planner uses modified generalized Voronoi diagrams to guarantee finding a connected path from the start to the goal if a collision-free path exists. The lower level planner considers uncertainty of the observed obstacles in the environment and assigns repulsive forces based on their distance to the robot and their positional uncertainty. The attractive forces from the Voronoi nodes and the repulsive forces from the uncertainty-biased potential fields form a hybrid planner we call Voronoi Uncertainty Fields (VUF). The proposed planner has two strong properties: (1) bias against uncertain obstacles, and (2) completeness. We analytically prove the properties and run simulations to validate our method in a forest-like environment.
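    The sketch below illustrates the force combination described above with a hypothetical force law: an attractive pull toward a Voronoi waypoint plus repulsion that grows with an obstacle's positional uncertainty. The gains, the (1 + sigma) weighting, and the vuf_force name are assumptions for the example, not the paper's formulation.
    ```python
    # Illustrative sketch only: combining an attractive pull toward a Voronoi
    # waypoint with repulsive pushes from uncertain obstacles.
    import numpy as np

    def vuf_force(robot, waypoint, obstacles, k_att=1.0, k_rep=1.0):
        """robot, waypoint: (x, y); obstacles: list of (position, sigma)."""
        force = k_att * (np.asarray(waypoint) - np.asarray(robot))
        for pos, sigma in obstacles:
            diff = np.asarray(robot) - np.asarray(pos)
            dist = np.linalg.norm(diff) + 1e-9
            # Higher positional uncertainty (sigma) -> stronger repulsion,
            # biasing the planner away from poorly observed obstacles.
            force += k_rep * (1.0 + sigma) / dist**2 * (diff / dist)
        return force

    step = vuf_force(robot=(0.0, 0.0),
                     waypoint=(5.0, 0.0),
                     obstacles=[((2.0, 1.0), 0.5)])
    print(step)
    ```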
  • Item
    Linguistic Transfer of Human Assembly Tasks to Robots
    (Georgia Institute of Technology, 2012-10) Dantam, Neil; Essa, Irfan; Stilman, Mike
    We demonstrate the automatic transfer of an assembly task from human to robot. This work extends efforts showing the utility of linguistic models in verifiable robot control policies by now performing real visual analysis of human demonstrations to automatically extract a policy for the task. This method tokenizes each human demonstration into a sequence of object connection symbols, then transforms the set of sequences from all demonstrations into an automaton, which represents the task-language for assembling a desired object. Finally, we combine this assembly automaton with a kinematic model of a robot arm to reproduce the demonstrated task.
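    The sketch below illustrates one way the set of tokenized demonstrations could be merged into an acceptor, here a simple prefix-tree automaton; the connection symbols and the build_automaton helper are hypothetical and stand in for the paper's task-language construction.
    ```python
    # Illustrative sketch only: merge token sequences from several demonstrations
    # into a prefix-tree acceptor (a trie of transitions with accepting states).

    def build_automaton(demonstrations):
        """Return (transitions, accepting) for the union of token sequences."""
        transitions = {}     # (state, symbol) -> next state
        accepting = set()
        next_state = 1       # state 0 is the start state
        for demo in demonstrations:
            state = 0
            for symbol in demo:
                key = (state, symbol)
                if key not in transitions:
                    transitions[key] = next_state
                    next_state += 1
                state = transitions[key]
            accepting.add(state)
        return transitions, accepting

    demos = [
        ["connect(A,B)", "connect(B,C)"],   # hypothetical connection symbols
        ["connect(B,C)", "connect(A,B)"],
    ]
    transitions, accepting = build_automaton(demos)
    print(len(transitions), "transitions,", len(accepting), "accepting states")
    ```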
  • Item
    Navigation Among Movable Obstacles in Unknown Environments
    (Georgia Institute of Technology, 2010-10) Wu, Hai-Ning; Levihn, Martin; Stilman, Mike
    This paper explores the Navigation Among Movable Obstacles (NAMO) problem in an unknown environment. We consider the realistic scenario in which the robot has to navigate to a goal position in an unknown environment consisting of static and movable objects. The robot may move objects if the goal cannot be reached otherwise or if moving an object would significantly shorten the path to the goal. We consider real situations in which the robot has only limited sensing information, so action selection must be based on the partial knowledge of the environment acquired up to that point. This paper introduces an algorithm that significantly reduces the calculations necessary to accomplish this task compared to a direct approach. We present an efficient implementation for the case of planar, axis-aligned environments and report experimental results on challenging scenarios with more than 50 objects.
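    The sketch below illustrates the high-level decision rule stated in the abstract, moving an object only when the goal is otherwise unreachable or when the estimated savings justify it; the cost model, threshold, and choose_action helper are assumptions for the example, not the paper's algorithm.
    ```python
    # Illustrative sketch only: decide between navigating around an obstacle
    # and moving it, based on hypothetical path-cost estimates.

    def choose_action(path_without_moving, path_if_moved, move_cost, threshold=0.0):
        """Return 'move' or 'navigate' from estimated path costs.

        path_without_moving: cost of the best path treating the obstacle as
                             static, or None if no such path exists.
        path_if_moved:       cost of the best path after moving the obstacle.
        move_cost:           estimated cost of manipulating the obstacle.
        """
        if path_without_moving is None:
            return "move"                     # goal unreachable otherwise
        savings = path_without_moving - (path_if_moved + move_cost)
        return "move" if savings > threshold else "navigate"

    print(choose_action(path_without_moving=20.0, path_if_moved=8.0, move_cost=3.0))  # move
    print(choose_action(path_without_moving=10.0, path_if_moved=8.0, move_cost=5.0))  # navigate
    ```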