Organizational Unit: Healthcare Robotics Lab


Publication Search Results

Showing 6 of 6 results
  • Item
    Autobed: Open Hardware for Accessible Web-based Control of an Electric Bed
    (Georgia Institute of Technology, 2016) Grice, Phillip M. ; Chitalia, Yash ; Rich, Megan ; Clever, Henry M. ; Kemp, Charles C.
    Individuals with severe motor impairments often have difficulty operating the standard controls of electric beds and so require a caregiver to adjust their position for utility, comfort, or to prevent pressure ulcers. Assistive human-computer interaction devices allow many such individuals to operate a computer and web browser. Here, we present the Autobed, a Wi-Fi-connected device that enables control of an Invacare Full-Electric Homecare Bed, a Medicare-approved device in the US, from any modern web browser, without modification of existing hardware. We detail the design and operation of the Autobed. We also examine its usage by one individual with severe motor impairments and his primary caregiver in their own home, including usage logs from a period of 102 days and detailed questionnaires. Finally, we make the entire system, including hardware design and components, software, and build instructions, available under permissive open-source licenses.
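The abstract describes a web-to-hardware relay: browser requests are translated into the button presses the bed's pendant would normally produce. The following is a rough sketch of that dispatch idea only; the names (COMMANDS, pulse_relay, handle_request) and the relay channel numbers are invented for illustration and are not taken from the Autobed's released code.

```python
# Illustrative sketch, not the Autobed's actual code: a dispatcher that
# translates web request paths into brief relay pulses, where each relay
# is wired in parallel with a button on the bed's handheld pendant.
# All names and channel assignments here are hypothetical.

COMMANDS = {
    "head_up":   (0, 0.5),   # (relay channel, pulse duration in seconds)
    "head_down": (1, 0.5),
    "bed_up":    (2, 0.5),
    "bed_down":  (3, 0.5),
    "legs_up":   (4, 0.5),
    "legs_down": (5, 0.5),
}

def pulse_relay(channel, seconds, log):
    """Stand-in for hardware access; a real implementation would drive a
    GPIO pin. Here the pulse is just recorded for inspection."""
    log.append((channel, seconds))

def handle_request(path, log):
    """Map an HTTP path such as '/head_up' to a relay pulse.
    Returns an HTTP-style status code."""
    name = path.strip("/")
    if name not in COMMANDS:
        return 404
    channel, seconds = COMMANDS[name]
    pulse_relay(channel, seconds, log)
    return 200
```

In a real deployment the dispatcher would sit behind a small HTTP server on the Wi-Fi device; the pulse-based design means the existing pendant keeps working unmodified, as the paper emphasizes.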
  • Item
    A Robotic System for Reaching in Dense Clutter that Integrates Model Predictive Control, Learning, Haptic Mapping, and Planning
    (Georgia Institute of Technology, 2014-09) Bhattacharjee, Tapomayukh ; Grice, Phillip M. ; Kapusta, Ariel ; Killpack, Marc D. ; Park, Daehyung ; Kemp, Charles C.
We present a system that enables a robot to reach locations in dense clutter using only haptic sensing. Our system integrates model predictive control [1], learned initial conditions [2], tactile recognition of object types [3], haptic mapping, and geometric planning to efficiently reach locations using whole-arm tactile sensing [4]. We motivate our work, present a system architecture, summarize each component of the system, and present results from our evaluation of the system reaching to target locations in dense artificial foliage.
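One component named above, the haptic map, can be illustrated in miniature: contact events sensed anywhere on the arm mark regions of space as occupied, and the geometric planner queries that map. This sketch is an assumption-laden illustration (the voxel representation, size, and interface are invented, not the authors' implementation).

```python
# Illustrative sketch, not the authors' implementation: a sparse voxel
# "haptic map" that records where the arm has felt contact, which a
# geometric planner could then treat as occupied space.
import math

class HapticMap:
    def __init__(self, voxel_size=0.05):
        # 5 cm voxels; the value is an invented placeholder.
        self.voxel_size = voxel_size
        self.occupied = set()

    def _key(self, point):
        # Discretize a 3D point (meters) into integer voxel coordinates.
        return tuple(int(math.floor(c / self.voxel_size)) for c in point)

    def add_contact(self, point):
        """Record a contact event sensed somewhere on the arm."""
        self.occupied.add(self._key(point))

    def is_free(self, point):
        """Planner query: has contact ever been felt in this voxel?"""
        return self._key(point) not in self.occupied
```

The appeal of this kind of map in dense foliage is that it needs no vision: every brush of the arm against a leaf adds information the planner can reuse.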
  • Item
    Assistive Mobile Manipulation for Self-Care Tasks Around the Head
    (Georgia Institute of Technology, 2014) Hawkins, Kelsey P. ; Grice, Phillip M. ; Chen, Tiffany L. ; King, Chih-Hung ; Kemp, Charles C.
Human-scale mobile robots with arms have the potential to assist people with a variety of tasks. We present a proof-of-concept system that has enabled a person with severe quadriplegia named Henry Evans to shave himself in his own home using a general purpose mobile manipulator (PR2 from Willow Garage). The robot primarily provides assistance by holding a tool (e.g., an electric shaver) at user-specified locations around the user’s head, while he/she moves his/her head against it. If the robot detects forces inappropriate for the task (e.g., shaving), it withdraws the tool. The robot also holds a mirror with its other arm, so that the user can see what he/she is doing. For all aspects of the task, the robot and the human work together. The robot uses a series of distinct semi-autonomous subsystems during the task to navigate to poses next to the wheelchair, attain initial arm configurations, register a 3D model of the person’s head, move the tool to coarse semantically-labeled tool poses (e.g., “Cheek”), and finely position the tool via incremental movements. Notably, while moving the tool near the user’s head, the robot uses an ellipsoidal coordinate system attached to the 3D head model. In addition to describing the complete robotic system, we report results from Henry Evans using it to shave both sides of his face while sitting in his wheelchair at home. He found the process to be long (54 minutes) and the interface unintuitive. Yet, he also found the system to be comfortable to use, felt safe while using it, was satisfied with it, and preferred it to a human caregiver.
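The ellipsoidal coordinate system mentioned above is a natural fit for tool motion around a head: angular coordinates slide the tool over the surface, and a radial coordinate sets standoff distance. The parameterization and the semi-axis values below are assumptions for illustration, not taken from the paper.

```python
import math

def ellipsoid_point(theta, phi, height, axes=(0.09, 0.12, 0.11)):
    """Cartesian point for angular coordinates (theta, phi) on an
    ellipsoid with semi-axes `axes` (meters), pushed outward by scaling
    the radius with `height`. The semi-axis values are invented,
    head-sized placeholders, and the radial offset rule is a crude
    illustration rather than the paper's formulation."""
    a, b, c = axes
    scale = 1.0 + height / min(axes)  # simple radial offset for tool standoff
    return (scale * a * math.sin(theta) * math.cos(phi),
            scale * b * math.sin(theta) * math.sin(phi),
            scale * c * math.cos(theta))
```

With coordinates like these, "move along the cheek" becomes a change in (theta, phi) at fixed standoff, while "withdraw the tool" is simply an increase in height, which matches the incremental fine-positioning interface the abstract describes.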
  • Item
    Whole-arm Tactile Sensing for Beneficial and Acceptable Contact During Robotic Assistance
    (Georgia Institute of Technology, 2013-06) Grice, Phillip M. ; Killpack, Marc D. ; Jain, Advait ; Vaish, Sarvagya ; Hawke, Jeffrey ; Kemp, Charles C.
    Many assistive tasks involve manipulation near the care-receiver's body, including self-care tasks such as dressing, feeding, and personal hygiene. A robot can provide assistance with these tasks by moving its end effector to poses near the care-receiver's body. However, perceiving and maneuvering around the care-receiver's body can be challenging due to a variety of issues, including convoluted geometry, compliant materials, body motion, hidden surfaces, and the object upon which the body is resting (e.g., a wheelchair or bed). Using geometric simulations, we first show that an assistive robot can achieve a much larger percentage of end-effector poses near the care-receiver's body if its arm is allowed to make contact. Second, we present a novel system with a custom controller and whole-arm tactile sensor array that enables a Willow Garage PR2 to regulate contact forces across its entire arm while moving its end effector to a commanded pose. We then describe tests with two people with motor impairments, one of whom used the system to grasp and pull a blanket over himself and to grab a cloth and wipe his face, all while in bed at his home. Finally, we describe a study with eight able-bodied users in which they used the system to place objects near their bodies. On average, users perceived the system to be safe and comfortable, even though substantial contact occurred between the robot's arm and the user's body.
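The actual system regulates contact with a custom controller over a whole-arm taxel array; as a much simpler stand-in that conveys the core idea (slow the commanded motion as sensed contact force approaches a limit, and stop above it), consider the following sketch. The threshold value and linear gain schedule are invented.

```python
def scale_motion(delta_x, taxel_forces, f_thresh=5.0):
    """Simplified stand-in for whole-arm contact-force regulation (the
    actual system uses a custom controller over a tactile sensor array):
    scale the commanded end-effector step down as the largest sensed
    contact force approaches f_thresh, and halt at or above it.
    The 5 N threshold is an invented placeholder."""
    f_max = max(taxel_forces, default=0.0)
    if f_max >= f_thresh:
        return [0.0 for _ in delta_x]   # halt; a higher level can back off
    gain = 1.0 - f_max / f_thresh       # linear slowdown as force grows
    return [gain * d for d in delta_x]
```

Even this toy version captures why incidental contact can be acceptable rather than forbidden: the arm is allowed to touch, but never to push hard.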
  • Item
    Robots for Humanity: A Case Study in Assistive Mobile Manipulation
    (Georgia Institute of Technology, 2013-03) Chen, Tiffany L. ; Ciocarlie, Matei ; Cousins, Steve ; Grice, Phillip M. ; Hawkins, Kelsey ; Hsiao, Kaijen ; Kemp, Charles C. ; King, Chih-Hung ; Lazewatsky, Daniel A. ; Nguyen, Hai ; Paepcke, Andreas ; Pantofaru, Caroline ; Smart, William D. ; Takayama, Leila
    Assistive mobile manipulators have the potential to one day serve as surrogates and helpers for people with disabilities, giving them the freedom to perform tasks such as scratching an itch, picking up a cup, or socializing with their families. This article introduces a collaborative project with the goal of putting assistive mobile manipulators into real homes to work with people with disabilities. Through a participatory design process in which users have been actively involved from day one, we are identifying and developing assistive capabilities for the PR2 robot. Our approach is to develop a diverse suite of open source software tools that blend the capabilities of the user and the robot. Within this article, we introduce the project, describe our progress, and discuss lessons we have learned.
  • Item
    The Wouse: A Wearable Wince Detector to Stop Assistive Robots
    (Georgia Institute of Technology, 2012-09) Grice, Phillip M. ; Lee, Andy ; Evans, Henry ; Kemp, Charles C.
    Persons with severe motor impairments depend heavily upon caregivers for the performance of everyday tasks. Ongoing work is exploring the potential of giving motor-impaired users control of semi-autonomous assistive mobile manipulators to enable them to perform some self-care tasks such as scratching or shaving. Because these users are less able to escape a robot malfunction, or operate a traditional run-stop, physical human-robot interaction poses safety risks. We review approaches to safety in assistive robotics with a focus on accessible run-stops, and propose wincing as an accessible gesture for activating a run-stop device. We also present the wouse, a novel device for detecting wincing from skin movement near the eye, consisting of optical mouse components mounted near a user's temple via safety goggles. Using this device, we demonstrate a complete system to run-stop a Willow Garage PR2 robot, and perform two preliminary user studies. The first study examines discrimination of wincing from self-produced facial expressions. The results indicate the possibility for discrimination, though variability between users and inconsistent detection of skin movement remain significant challenges. The second experiment examines discrimination of wincing from external mechanical manipulations of the face during self-care tasks. The results indicate that the wouse, using a classifier trained with data from the first experiment, can be used during face-manipulation tasks. The device produced no false positives, but succeeded in correctly identifying wincing events in only two of four subjects.
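The wouse itself classifies wincing with a trained classifier, as the abstract notes; a toy threshold rule on the optical sensor's displacement deltas can nonetheless illustrate the underlying signal. The speed threshold and the sign convention below are invented for illustration.

```python
import math

def detect_wince(dx, dy, speed_thresh=8.0):
    """Toy stand-in for the wouse's trained classifier: treat a fast,
    predominantly downward skin slip at the temple as a wince.
    The threshold value and the sign convention (negative dy = downward)
    are invented, not taken from the paper."""
    speed = math.hypot(dx, dy)
    return speed > speed_thresh and dy < 0
```

A rule this simple would suffer exactly the problems the study reports, inter-user variability and inconsistent skin tracking, which is why the real system trains a classifier per the collected wince data instead.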