Organizational Unit: Healthcare Robotics Lab


Publication Search Results

  • Item
    Touched By a Robot: An Investigation of Subjective Responses to Robot-initiated Touch
    (Georgia Institute of Technology, 2011-03) Chen, Tiffany L.; King, Chih-Hung; Thomaz, Andrea L.; Kemp, Charles C.
    By initiating physical contact with people, robots can be more useful. For example, a robotic caregiver might make contact to provide physical assistance or facilitate communication. To better understand how people respond to robot-initiated touch, we conducted a 2x2 between-subjects experiment with 56 people in which a robotic nurse autonomously touched and wiped each participant's forearm. Our independent variables were whether or not the robot verbally warned the person before contact, and whether the robot verbally indicated that the touch was intended to clean the person's skin (instrumental touch) or to provide comfort (affective touch). On average, regardless of treatment, participants had a generally positive subjective response. However, people responded significantly more favorably to instrumental touch. Since the physical behavior of the robot was identical across all trials, our results demonstrate that the perceived intent of the robot can significantly influence a person's subjective response to robot-initiated touch, and they suggest that roboticists should consider this factor in addition to the mechanics of physical interaction. Unexpectedly, we found that participants tended to respond more favorably without a verbal warning. Although inconclusive, our results suggest that verbal warnings prior to contact should be carefully designed, if used at all. (An illustrative analysis sketch for this 2x2 design appears after this list.)
  • Item
    Autonomous Active Learning of Task-Relevant Features for Mobile Manipulation
    (Georgia Institute of Technology, 2011) Nguyen, Hai; Kemp, Charles C.
    We present an active learning approach that enables a mobile manipulator to autonomously learn task-relevant features. For a given behavior, our system trains a Support Vector Machine (SVM) that predicts the 3D locations at which the behavior will succeed, based on visual features surrounding each 3D location. After a quick initialization by the user, the robot efficiently collects and labels positive and negative examples fully autonomously. To demonstrate the efficacy of our approach, we present results for behaviors that flip a light switch up and down, push the top or bottom of a rocker-type light switch, and open or close a drawer. Our implementation uses a Willow Garage PR2 robot. We show that our approach produces classifiers that predict the success of these behaviors, and that the robot can continuously learn from its experience. In our initial evaluation of 6 behaviors with learned classifiers, each behavior succeeded in 5 out of 5 trials with at most one retry. (A minimal active-learning sketch appears after this list.)
  • Item
    Older Adults' Acceptance of Assistive Robots for the Home
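The touch study above uses a 2x2 between-subjects design with two verbal manipulations (warning vs. no warning, instrumental vs. affective framing). As a rough illustration of how responses from such a design are commonly analyzed, the sketch below runs a two-way ANOVA with statsmodels on purely synthetic placeholder ratings; the factor names, rating scale, and data are assumptions for illustration only and do not reflect the paper's actual data or analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Synthetic placeholder data for a 2x2 between-subjects design
# (56 participants -> 14 per cell); NOT the study's actual responses.
rng = np.random.default_rng(0)
rows = []
for warning in ("warning", "no_warning"):
    for intent in ("instrumental", "affective"):
        ratings = np.clip(rng.normal(4.5, 1.0, 14), 1, 7)  # 7-point-style ratings
        rows += [{"warning": warning, "intent": intent, "response": r} for r in ratings]
df = pd.DataFrame(rows)

# Two-way ANOVA: main effect of each verbal manipulation plus their interaction.
model = smf.ols("response ~ C(warning) * C(intent)", data=df).fit()
print(anova_lm(model, typ=2))
```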
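The active-learning item describes training an SVM to predict, from visual features around candidate 3D locations, where a behavior will succeed, with the robot collecting and labeling its own examples. The sketch below shows one generic way such a loop could look, using uncertainty sampling with scikit-learn; the featurize and try_behavior interfaces, the kernel choice, and the query strategy are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVC

def active_learn(featurize, try_behavior, candidates, seed_points, seed_labels, n_rounds=20):
    """Generic uncertainty-sampling loop: query the candidate 3D location whose
    features lie closest to the current SVM decision boundary, let the robot
    execute the behavior there, and add the self-labeled result."""
    X = [featurize(p) for p in seed_points]
    y = list(seed_labels)
    clf = SVC(kernel="rbf", gamma="scale")
    for _ in range(min(n_rounds, len(candidates))):
        clf.fit(np.asarray(X), np.asarray(y))
        feats = np.asarray([featurize(p) for p in candidates])
        margins = np.abs(clf.decision_function(feats))
        idx = int(np.argmin(margins))          # most uncertain candidate
        point = candidates.pop(idx)
        success = try_behavior(point)          # robot executes and self-labels
        X.append(featurize(point))
        y.append(int(success))
    return clf

if __name__ == "__main__":
    # Toy stand-ins: 3D points serve as their own "visual features",
    # and a fake rule decides success; both are placeholders.
    rng = np.random.default_rng(0)
    points = [p for p in rng.uniform(-1.0, 1.0, size=(200, 3))]
    featurize = lambda p: p
    try_behavior = lambda p: p[2] > 0.0
    pos = next(p for p in points if try_behavior(p))
    neg = next(p for p in points if not try_behavior(p))
    clf = active_learn(featurize, try_behavior, points, [pos, neg], [1, 0])
    print("predicted success rate on random points:",
          clf.predict(rng.uniform(-1, 1, (10, 3))).mean())
```

The query strategy here simply picks the candidate with the smallest decision-function margin, which keeps the loop short; any other acquisition rule could be swapped in without changing the surrounding structure.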
    (Georgia Institute of Technology, 2011) Mitzner, Tracy L. ; Smarr, Cory-Ann ; Beer, Jenay M. ; Chen, Tiffany L. ; Springman, Jennifer Megan ; Prakash, Akanksha ; Kemp, Charles C. ; Rogers, Wendy A.