Organizational Unit:
Healthcare Robotics Lab

Publication Search Results

  • Item
    Autonomously learning to visually detect where manipulation will succeed
    (Georgia Institute of Technology, 2013-09) Nguyen, Hai; Kemp, Charles C.
    Visual features can help predict if a manipulation behavior will succeed at a given location. For example, the success of a behavior that flips light switches depends on the location of the switch. We present methods that enable a mobile manipulator to autonomously learn a function that takes an RGB image and a registered 3D point cloud as input and returns a 3D location at which a manipulation behavior is likely to succeed. With our methods, robots autonomously train a pair of support vector machine (SVM) classifiers by trying behaviors at locations in the world and observing the results. Our methods require a pair of manipulation behaviors that can change the state of the world between two sets (e.g., light switch up and light switch down), classifiers that detect when each behavior has been successful, and an initial hint as to where one of the behaviors will be successful. When given an image feature vector associated with a 3D location, a trained SVM predicts if the associated manipulation behavior will be successful at the 3D location. To evaluate our approach, we performed experiments with a PR2 robot from Willow Garage in a simulated home using behaviors that flip a light switch, push a rocker-type light switch, and operate a drawer. By using active learning, the robot efficiently learned SVMs that enabled it to consistently succeed at these tasks. After training, the robot also continued to learn in order to adapt in the event of failure. (A minimal illustrative code sketch of this learning loop appears after the results list below.)
  • Item
    Robots for Humanity: A Case Study in Assistive Mobile Manipulation
    (Georgia Institute of Technology, 2013-03) Chen, Tiffany L.; Ciocarlie, Matei; Cousins, Steve; Grice, Phillip M.; Hawkins, Kelsey; Hsiao, Kaijen; Kemp, Charles C.; King, Chih-Hung; Lazewatsky, Daniel A.; Nguyen, Hai; Paepcke, Andreas; Pantofaru, Caroline; Smart, William D.; Takayama, Leila
    Assistive mobile manipulators have the potential to one day serve as surrogates and helpers for people with disabilities, giving them the freedom to perform tasks such as scratching an itch, picking up a cup, or socializing with their families. This article introduces a collaborative project with the goal of putting assistive mobile manipulators into real homes to work with people with disabilities. Through a participatory design process in which users have been actively involved from day one, we are identifying and developing assistive capabilities for the PR2 robot. Our approach is to develop a diverse suite of open source software tools that blend the capabilities of the user and the robot. Within this article, we introduce the project, describe our progress, and discuss lessons we have learned.
  • Item
    RFID-Guided Robots for Pervasive Automation
    (Georgia Institute of Technology, 2010-01-15) Deyle, Travis; Nguyen, Hai; Reynolds, Matt S.; Kemp, Charles C.
    Passive UHF RFID tags are well matched to robots' needs. Unlike low-frequency (LF) and high-frequency (HF) RFID tags, passive UHF RFID tags are readable from across a room, enabling a mobile robot to efficiently discover and locate them. This paper shows how a robot, using tags' unique IDs, a semantic database, and RF perception via actuated antennas, can reliably interact with people and manipulate labeled objects. (An illustrative sketch of such a tag-to-semantics lookup also appears below.)
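The learning loop described in the first abstract above can be illustrated with a short, self-contained Python sketch. This is a minimal reading of the idea under stated assumptions, not the authors' implementation: image_features_at and try_behavior are hypothetical stand-ins for the paper's feature extraction and behavior execution, the data is synthetic, and a single SVM stands in for the paper's pair of classifiers.

import numpy as np
from sklearn.svm import SVC

def image_features_at(location):
    # Hypothetical stand-in for the paper's feature extraction: return an
    # image feature vector for the patch around a 3D location.
    rng = np.random.default_rng(abs(hash(location)) % (2**32))
    return rng.normal(size=16)

def try_behavior(location):
    # Hypothetical stand-in for executing the behavior (e.g., flipping a
    # switch) at a 3D location and judging success with a success detector.
    return bool(np.linalg.norm(image_features_at(location)) < 4.0)

# Candidate 3D locations along a wall; index 20 is the initial hint of
# where the behavior should succeed.
candidates = [(float(x), 0.5, 1.2) for x in np.linspace(-0.5, 0.5, 40)]
X = [image_features_at(candidates[20])]
y = [try_behavior(candidates[20])]

svm = SVC(kernel="rbf", gamma="scale")
probe = np.random.default_rng(0)
for _ in range(15):
    if len(set(y)) < 2:
        # Need at least one success and one failure before the SVM can fit.
        loc = candidates[int(probe.integers(len(candidates)))]
    else:
        svm.fit(X, y)
        # Active learning: try the candidate nearest the decision boundary,
        # i.e., the location about which the current SVM is least certain.
        margins = svm.decision_function([image_features_at(c) for c in candidates])
        loc = candidates[int(np.argmin(np.abs(margins)))]
    X.append(image_features_at(loc))
    y.append(try_behavior(loc))

if len(set(y)) == 2:
    svm.fit(X, y)
    # After training, the most confidently positive location is where the
    # manipulation behavior is predicted most likely to succeed.
    margins = svm.decision_function([image_features_at(c) for c in candidates])
    print("predicted best location:", candidates[int(np.argmax(margins))])

The active-learning step mirrors the abstract's description: rather than probing locations at random, the robot tries the location where the current classifier is least certain, which concentrates trials near the decision boundary and reduces the number of physical attempts needed.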
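The RFID abstract's pairing of unique tag IDs with a semantic database can likewise be sketched in a few lines. The EPC values, database fields, and read callback below are illustrative assumptions, not the paper's actual system.

from dataclasses import dataclass

@dataclass
class ObjectInfo:
    name: str
    grasp_hint: str      # how the robot should grasp the object
    home_location: str   # where the object belongs

# Hypothetical semantic database keyed by each passive UHF tag's
# unique 96-bit EPC identifier.
SEMANTIC_DB = {
    "3034F8A1B2C3D4E5F6A7B8C9": ObjectInfo("medication bottle", "top grasp", "nightstand"),
    "3034A0B1C2D3E4F5A6B7C8D9": ObjectInfo("TV remote", "side grasp", "coffee table"),
}

def on_tag_read(epc: str, rssi_dbm: float) -> None:
    # Hypothetical callback for one tag read from an actuated antenna.
    info = SEMANTIC_DB.get(epc)
    if info is None:
        print(f"unknown tag {epc} (RSSI {rssi_dbm:.1f} dBm)")
        return
    # As the antenna pans, stronger RSSI suggests the tag's bearing,
    # guiding the mobile robot toward the labeled object.
    print(f"found {info.name}: {info.grasp_hint}, belongs on {info.home_location}")

on_tag_read("3034F8A1B2C3D4E5F6A7B8C9", rssi_dbm=-52.3)

Because each tag's ID is globally unique, the lookup resolves an otherwise ambiguous RF detection into object-specific semantics the robot can act on.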