Person:
Kemp, Charles C.

Publication Search Results

Now showing 1 - 4 of 4
  • Item
    Haptic Classification and Recognition of Objects Using a Tactile Sensing Forearm
    (Georgia Institute of Technology, 2012-10) Bhattacharjee, Tapomayukh ; Rehg, James M. ; Kemp, Charles C.
    In this paper, we demonstrate data-driven inference of mechanical properties of objects using a tactile sensor array (skin) covering a robot's forearm. We focus on the mobility (sliding vs. fixed), compliance (soft vs. hard), and identity of objects in the environment, as this information could be useful for efficient manipulation and search. By using the large surface area of the forearm, a robot could potentially search and map a cluttered volume more efficiently, and be informed by incidental contact during other manipulation tasks. Our approach tracks a contact region on the forearm over time in order to generate time series of select features, such as the maximum force, contact area, and contact motion. We then process and reduce the dimensionality of these time series to generate a feature vector to characterize the contact. Finally, we use the k-nearest neighbor algorithm (k-NN) to classify a new feature vector based on a set of previously collected feature vectors. Our results show a high cross-validation accuracy in both classification of mechanical properties and object recognition. In addition, we analyze the effect of taxel resolution, duration of observation, feature selection, and feature scaling on the classification accuracy.
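    (A sketch of this feature-extraction and k-NN pipeline appears after this list.)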
  • Item
    Informing Assistive Robots with Models of Contact Forces from Able-Bodied Face Wiping and Shaving
    (Georgia Institute of Technology, 2012-09) Hawkins, Kelsey P. ; King, Chih-Hung ; Chen, Tiffany L. ; Kemp, Charles C.
    Hygiene and feeding are activities of daily living (ADLs) that often involve contact with a person's face. Robots can assist people with motor impairments to perform these tasks by holding a tool that makes contact with the care receiver's face. By sensing the forces applied to the face with the tool, robots could potentially provide assistance that is more comfortable, safe, and effective. In order to inform the design of robotic controllers and assistive robots, we investigated the forces able-bodied people apply to themselves when wiping and shaving their faces. We present our methods for capturing and modeling these forces, results from a study with 9 participants, and recommendations for assistive robots. Our contributions include a trapezoidal force model that assumes participants have a target force they attempt to achieve for each stroke of the tool. We discuss advantages of this three-parameter model and show that it fits our data well relative to other candidate models. We also provide statistics of the models' rise rates, fall rates, and target forces for the 9 participants in our study. In addition, we illustrate how the target forces varied based on the task, participant, and location on the face.
    (A sketch of this trapezoidal force profile appears after this list.)
  • Item
    The Wouse: A Wearable Wince Detector to Stop Assistive Robots
    (Georgia Institute of Technology, 2012-09) Grice, Phillip M. ; Lee, Andy ; Evans, Henry ; Kemp, Charles C.
    Persons with severe motor impairments depend heavily upon caregivers for the performance of everyday tasks. Ongoing work is exploring the potential of giving motor-impaired users control of semi-autonomous assistive mobile manipulators to enable them to perform some self-care tasks such as scratching or shaving. Because these users are less able to escape a robot malfunction, or operate a traditional run-stop, physical human-robot interaction poses safety risks. We review approaches to safety in assistive robotics with a focus on accessible run-stops, and propose wincing as an accessible gesture for activating a run-stop device. We also present the wouse, a novel device for detecting wincing from skin movement near the eye, consisting of optical mouse components mounted near a user's temple via safety goggles. Using this device, we demonstrate a complete system to run-stop a Willow Garage PR2 robot, and perform two preliminary user studies. The first study examines discrimination of wincing from self-produced facial expressions. The results indicate the possibility for discrimination, though variability between users and inconsistent detection of skin movement remain significant challenges. The second experiment examines discrimination of wincing from external mechanical manipulations of the face during self-care tasks. The results indicate that the wouse, using a classifier trained with data from the first experiment, can be used during face-manipulation tasks. The device produced no false positives, but succeeded in correctly identifying wincing events in only two of four subjects.
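    (A sketch of a simplified wince-triggered run-stop loop appears after this list.)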
  • Item
    A Robotic System for Autonomous Medication and Water Delivery
    (Georgia Institute of Technology, 2012) Emeli, Victor ; Wagner, Alan R. ; Kemp, Charles C.
    Poor medication adherence and dehydration are well-documented challenges for older adults living independently that lead to reduced quality of life. Robotic delivery of pills and water in the home could potentially improve medication adherence and hydration for older adults by providing timely, reliable, and convenient delivery. In this technical report, we present a prototype multi-robot system that can autonomously deliver pills and water to a person in a realistic home environment. The system consists of a mobile robot with a tray, a stationary dispensing robot, and a smartphone carried by the user. Within this paper, we discuss the opportunity to improve quality of life, describe our robotic system, and convey results from an experimental evaluation of the system's delivery performance.
    (A sketch of a possible delivery coordination loop appears after this list.)
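
For "Haptic Classification and Recognition of Objects Using a Tactile Sensing Forearm": the abstract describes turning per-contact time series (maximum force, contact area, contact motion) into a fixed-length feature vector and classifying it with k-NN against previously collected vectors. The sketch below illustrates that kind of pipeline on synthetic data; the resampling-based feature construction, the scikit-learn classifier, the synthetic feature values, and the choice of k are assumptions for illustration, not details from the paper.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def to_feature_vector(max_force, contact_area, contact_motion, n_samples=20):
        """Resample each feature time series to a fixed length and concatenate them,
        a simple stand-in for the paper's processing and dimensionality reduction."""
        def resample(series):
            series = np.asarray(series, dtype=float)
            idx = np.linspace(0.0, len(series) - 1, n_samples)
            return np.interp(idx, np.arange(len(series)), series)
        return np.concatenate([resample(max_force),
                               resample(contact_area),
                               resample(contact_motion)])

    rng = np.random.default_rng(0)

    def synthetic_contact(soft):
        """Toy contact: 'soft' ramps to a lower peak force and a larger area than 'hard'."""
        t = np.linspace(0.0, 1.0, rng.integers(30, 60))
        force = (2.0 if soft else 8.0) * t + 0.1 * rng.standard_normal(t.size)
        area = (5.0 if soft else 2.0) * t + 0.1 * rng.standard_normal(t.size)
        motion = 0.05 * rng.standard_normal(t.size)
        return force, area, motion

    # Previously collected, labeled feature vectors ...
    X = np.vstack([to_feature_vector(*synthetic_contact(soft))
                   for soft in [True, False] * 20])
    y = ["soft", "hard"] * 20

    clf = KNeighborsClassifier(n_neighbors=3)      # k-NN over the stored vectors
    clf.fit(X, y)

    # ... used to classify a newly observed contact.
    print(clf.predict([to_feature_vector(*synthetic_contact(True))]))   # -> ['soft']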
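
For "Informing Assistive Robots with Models of Contact Forces from Able-Bodied Face Wiping and Shaving": the trapezoidal model can be read as a force profile that ramps up at a rise rate, holds at a target force, and ramps back down at a fall rate by the end of the stroke. The function below renders that shape; the exact parameterization and notation are assumptions, not taken from the paper.

    import numpy as np

    def trapezoidal_force(t, stroke_duration, target_force, rise_rate, fall_rate):
        """Illustrative trapezoidal stroke-force profile: ramp up at rise_rate,
        hold at target_force, ramp down at fall_rate so the force returns to
        zero by the end of the stroke."""
        t = np.asarray(t, dtype=float)
        rising = rise_rate * t                          # initial ramp toward the target
        falling = fall_rate * (stroke_duration - t)     # final ramp back to zero
        return np.clip(np.minimum(np.minimum(rising, falling), target_force), 0.0, None)

    # Example: a 2-second wiping stroke with a 3 N target force.
    t = np.linspace(0.0, 2.0, 9)
    print(trapezoidal_force(t, stroke_duration=2.0, target_force=3.0,
                            rise_rate=10.0, fall_rate=10.0))
    # trapezoid: 0, 2.5, 3, 3, 3, 3, 3, 2.5, 0

Fitting such a profile to a measured force trace (e.g., by least squares over the three parameters) would recover per-stroke rise rate, fall rate, and target force, which is the kind of per-participant statistic the paper reports.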
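
For "The Wouse: A Wearable Wince Detector to Stop Assistive Robots": the paper trains a classifier on skin-movement data from optical mouse components; the sketch below substitutes a much simpler windowed-magnitude threshold to show the overall sense-decide-stop loop. The sample values, window size, threshold, and run_stop placeholder are hypothetical.

    from collections import deque

    class WinceDetector:
        """Threshold stand-in for the paper's trained classifier: if the summed
        skin-movement magnitude over a short window exceeds a threshold, report a wince."""
        def __init__(self, window_size=5, threshold=40.0):
            self.window = deque(maxlen=window_size)
            self.threshold = threshold

        def update(self, dx, dy):
            """Feed one (dx, dy) displacement sample; return True when a wince is detected."""
            self.window.append((dx * dx + dy * dy) ** 0.5)
            return sum(self.window) >= self.threshold

    def run_stop(robot):
        """Placeholder for whatever command halts the robot (a PR2 run-stop in the paper)."""
        print("RUN-STOP requested")

    # Simulated sensor stream: small drift, then a sudden movement consistent with a wince.
    detector = WinceDetector()
    for dx, dy in [(1, 0), (0, 1), (1, 1), (2, 1), (15, 20), (18, 22)]:
        if detector.update(dx, dy):
            run_stop(robot=None)
            break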
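
For "A Robotic System for Autonomous Medication and Water Delivery": the abstract outlines a stationary dispensing robot, a tray-carrying mobile robot, and a smartphone carried by the user. The sketch below is one way such components could be coordinated around a delivery schedule; every class and method name here is an assumption for illustration, not the report's software interface.

    import datetime as dt
    from dataclasses import dataclass

    @dataclass
    class DeliveryTask:
        when: dt.time    # scheduled delivery time
        item: str        # e.g. "pills" or "water"

    class StubDispenser:
        location = "kitchen counter"
        def dispense(self, item):
            print(f"dispenser: loaded {item} onto the tray")

    class StubMobileBase:
        def navigate_to(self, place):
            print(f"mobile robot: driving to {place}")

    class StubPhone:
        def locate_user(self):
            return "living room"
        def notify(self, message):
            print(f"phone: {message}")

    def run_due_deliveries(schedule, dispenser, base, phone, now):
        """Carry out every delivery whose scheduled time has passed."""
        for task in [t for t in schedule if t.when <= now]:
            base.navigate_to(dispenser.location)     # go to the stationary robot
            dispenser.dispense(task.item)            # load the payload onto the tray
            base.navigate_to(phone.locate_user())    # bring it to the user
            phone.notify(f"Your {task.item} delivery has arrived.")
            schedule.remove(task)

    schedule = [DeliveryTask(dt.time(8, 0), "pills"), DeliveryTask(dt.time(14, 0), "water")]
    run_due_deliveries(schedule, StubDispenser(), StubMobileBase(), StubPhone(),
                       now=dt.time(8, 30))           # delivers only the 8:00 pills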