Organizational Unit: Contextual Computing Group


Publication Search Results

  • The Gesture Watch: A Wireless Contact-Free Gesture Based Wrist Interface
    (Georgia Institute of Technology, 2007-10) Kim, Jungsoo; He, Jiasheng; Lyons, Kent; Starner, Thad
    We introduce the Gesture Watch, a mobile wireless device worn on the user's wrist that allows hand gesture control of other devices. The Gesture Watch uses an array of infrared proximity sensors to sense hand gestures made over the device and interprets them using hidden Markov models. It maps intuitive gross hand gestures to control signals, such as the play and pause commands commonly found on mobile media players. We present an evaluation of the Gesture Watch designed to determine its accuracy and usability. In our study, 10 participants used the Gesture Watch in both mobile and stationary conditions, indoors and outdoors. Overall, we attained a recognition accuracy of 95.5% and found that the Gesture Watch worked well in indoor and outdoor environments and while the user was mobile.
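
    A rough sketch of the recognition approach described above: one hidden Markov model per gesture, scored against a sequence of proximity-sensor readings. The four-sensor feature vector, model parameters, and the use of hmmlearn are assumptions for illustration, not details from the paper.

      # Sketch: per-gesture HMMs over IR proximity readings; classify a new
      # sequence by the model with the highest log-likelihood. Sensor count
      # and model sizes are assumed, not taken from the paper.
      import numpy as np
      from hmmlearn import hmm

      def train_gesture_models(examples_by_gesture, n_states=5):
          """examples_by_gesture: {name: list of (T_i, 4) sensor arrays}."""
          models = {}
          for name, seqs in examples_by_gesture.items():
              X = np.vstack(seqs)               # all frames, stacked
              lengths = [len(s) for s in seqs]  # per-sequence boundaries
              m = hmm.GaussianHMM(n_components=n_states,
                                  covariance_type="diag", n_iter=20)
              m.fit(X, lengths)
              models[name] = m
          return models

      def classify(models, seq):
          """seq: one (T, 4) array; returns the best-scoring gesture name."""
          return max(models, key=lambda name: models[name].score(seq))
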
  • GART: The Gesture and Activity Recognition Toolkit
    (Georgia Institute of Technology, 2007-07) Brashear, Helene; Kim, Jung Soo; Lyons, Kent; Starner, Thad; Westeyn, Tracy
    The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction over machine learning algorithms suitable for modeling and recognizing different types of gestures. The toolkit also provides support for data collection and the training process. In this paper, we present GART and its machine learning abstractions. Furthermore, we detail the components of the toolkit and present two example gesture recognition applications.
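
    To give a feel for the kind of abstraction such a toolkit offers, here is a minimal sketch of a GART-like interface. The class and method names are invented for illustration and are not GART's actual API.

      # Hypothetical GART-like abstraction: the application collects labeled
      # examples and asks for recognition; the toolkit hides the underlying
      # machine learning model behind a small interface.
      class GestureLibrary:
          def __init__(self, model_factory):
              self._model_factory = model_factory  # e.g. an HMM trainer
              self._examples = {}                  # label -> list of sequences
              self._model = None

          def add_example(self, label, sequence):
              """Data collection: store one labeled sensor sequence."""
              self._examples.setdefault(label, []).append(sequence)

          def train(self):
              """Training: fit the underlying model on all collected data."""
              self._model = self._model_factory(self._examples)

          def recognize(self, sequence):
              """Recognition: return the most likely gesture label."""
              return self._model.classify(sequence)
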
  • Towards a One-Way American Sign Language Translator
    (Georgia Institute of Technology, 2004-05) Brashear, Helene; Henderson, Valerie; Hernandez-Rebollar, Jose; McGuire, R. Martin; Ross, Danielle S.; Starner, Thad
    Inspired by the Defense Advanced Research Projects Agency's (DARPA) recent successes in speech recognition, we introduce a new task for sign language recognition research: a mobile one-way American Sign Language translator. We argue that such a device should be feasible in the next few years, may provide immediate practical benefits for the Deaf community, and would lead to a sustainable program of research comparable to early speech recognition efforts. We ground our efforts in a particular scenario, that of a Deaf individual seeking an apartment, and discuss the system requirements and our interface for this scenario. Finally, we describe initial recognition results of 94% accuracy on a 141-sign vocabulary, signed in phrases of four signs, using a one-handed glove-based system and hidden Markov models (HMMs).
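
    In the speech recognition tradition the paper invokes, such an accuracy figure is conventionally derived from the edit distance (substitutions, deletions, insertions) between the recognized and reference sign sequences. A generic sketch of that metric, not code from the paper:

      # Sign accuracy in the speech-recognition sense:
      # accuracy = 1 - edit_distance(reference, hypothesis) / len(reference),
      # where the edit distance counts substitutions, deletions, insertions.
      def sign_accuracy(reference, hypothesis):
          n, m = len(reference), len(hypothesis)
          d = [[0] * (m + 1) for _ in range(n + 1)]
          for i in range(n + 1):
              d[i][0] = i
          for j in range(m + 1):
              d[0][j] = j
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
                  d[i][j] = min(d[i - 1][j] + 1,         # deletion
                                d[i][j - 1] + 1,         # insertion
                                d[i - 1][j - 1] + cost)  # substitution/match
          return 1.0 - d[n][m] / n

      # One substituted sign in a four-sign phrase -> 75% accuracy.
      print(sign_accuracy(list("ABCD"), list("ABXD")))  # 0.75
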
  • Using Multiple Sensors for Mobile Sign Language Recognition
    (Georgia Institute of Technology, 2003-10) Brashear, Helene; Starner, Thad; Lukowicz, Paul; Junker, Holger
    We build upon a constrained, lab-based sign language recognition system with the goal of making it a mobile assistive technology. We examine the use of multiple sensors to disambiguate noisy data and improve recognition accuracy. Our experiment compares the results of training a small gesture vocabulary on noisy vision data, on accelerometer data, and on both data sets combined.
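
    One plausible reading of "both data sets combined" is feature-level fusion: concatenating per-frame vision and accelerometer features before training a single recognizer. The feature dimensions below are assumptions for illustration; the paper may fuse the streams differently.

      # Feature-level fusion sketch: align the two streams to a common frame
      # rate, then concatenate per-frame feature vectors so one recognizer
      # is trained on the combined representation.
      import numpy as np

      def fuse(vision_frames, accel_frames):
          """vision_frames: (T, Dv); accel_frames: (T, Da), already aligned."""
          assert len(vision_frames) == len(accel_frames), "align streams first"
          return np.hstack([vision_frames, accel_frames])  # shape (T, Dv + Da)

      combined = fuse(np.random.rand(100, 8), np.random.rand(100, 3))
      print(combined.shape)  # (100, 11)
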
  • Mobile Capture for Wearable Computer Usability Testing
    (Georgia Institute of Technology, 2001-10) Lyons, Kent; Starner, Thad
    The mobility of wearable computers makes usability testing difficult. To fully understand how a user interacts with the wearable, the researcher must examine both the user's direct interactions with the computer and the external context the user perceives during that interaction. We present a tool that augments a wearable computer with additional hardware and software to capture the information needed to perform a usability study in the field under realistic conditions. We examine the challenges of performing this capture and present our implementation. We also describe VizWear, a tool for examining the captured data. Finally, we present our experiences using the system in a sample user study.
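
    At its core, such capture amounts to timestamped, synchronized logging of interaction events alongside contextual streams, so that they can be replayed together during analysis. A minimal sketch, with invented stream names and file format:

      # Minimal capture sketch for a field study: every record shares one
      # clock, so keystrokes and context (e.g., camera frames) can later be
      # replayed in sync. Stream names and JSON-lines format are assumed.
      import json, time

      class CaptureLog:
          def __init__(self, path):
              self._f = open(path, "a")

          def record(self, stream, payload):
              entry = {"t": time.monotonic(), "stream": stream, "data": payload}
              self._f.write(json.dumps(entry) + "\n")
              self._f.flush()  # keep data if the wearable crashes mid-study

      log = CaptureLog("session.jsonl")
      log.record("keystroke", {"key": "a"})
      log.record("camera_frame", {"index": 1042})
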
  • The Gesture Pendant: A Self-illuminating, Wearable, Infrared Computer Vision System for Home Automation Control and Medical Monitoring
    (Georgia Institute of Technology, 2000-10) Starner, Thad; Auxier, Jake; Ashbrook, Daniel; Gandy, Maribeth
    In this paper, we present a wearable device for control of home automation systems via hand gestures. This solution has many advantages over traditional home automation interfaces in that it can be used by those with impaired vision, motor skills, or mobility. By combining other sources of context with the pendant, we can reduce the number and complexity of gestures while maintaining functionality. As users input gestures, the system can also analyze their movements for pathological tremors. This information can then be used for medical diagnosis, therapy, and emergency services. Currently, the Gesture Pendant can recognize control gestures with an accuracy of 95% and user-defined gestures with an accuracy of 97%. It can detect tremors above 2 Hz within ±0.1 Hz.
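
    The ±0.1 Hz tremor figure suggests spectral analysis: a 0.1 Hz frequency resolution implies roughly a 10-second analysis window, since resolution is the reciprocal of window length. A generic sketch with an assumed frame rate, not the pendant's actual pipeline:

      # Spectral tremor-detection sketch: find the dominant oscillation in a
      # motion trace and flag it if it exceeds 2 Hz. With fs samples/s and a
      # 10 s window, FFT bins are fs / N = 0.1 Hz apart.
      import numpy as np

      def dominant_frequency(signal, fs):
          """signal: 1-D hand-motion trace; fs: sampling rate in Hz."""
          signal = signal - np.mean(signal)  # drop the DC component
          spectrum = np.abs(np.fft.rfft(signal))
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
          return freqs[np.argmax(spectrum)]

      fs = 30.0                              # assumed camera frame rate
      t = np.arange(0, 10, 1 / fs)           # 10 s window -> 0.1 Hz resolution
      tremor = np.sin(2 * np.pi * 4.2 * t)   # synthetic 4.2 Hz tremor
      f = dominant_frequency(tremor, fs)
      print(f, f > 2.0)                      # ~4.2 True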