EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments

Author(s)
Li, Richard
Abstract
Food journaling is the primary recommendation of physicians for many health concerns, including weight loss and a variety of diseases. The state-of-the-art approach to food journaling is to ask patients to record and self-report their own dietary activities for the day. However, the adherence to and accuracy of such self-reports have been shown to be very low, resulting in little benefit for the patient. As a result, determining when someone is eating has been a point of interest to the ubiquitous computing community for several years. While many wearable approaches have been proposed and assessed in lab settings, no practical solution has yet been evaluated in the real world. In our work, we present a hearing-aid form factor device that tracks the motion of the jaw from the ear. We highlight three main contributions to its effectiveness and practicality: 1) we assess how well three sensing modalities perform and how their corresponding form factors are perceived; 2) we introduce a novel approach to collecting training data for a generalizable machine learning model using a semi-controlled home environment; and 3) we evaluate the system in unconstrained environments, obtaining state-of-the-art results validated with video footage from a wearable camera.
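To make the eating-detection idea concrete, the following is a minimal sketch, assuming a generic sliding-window pipeline over a jaw-motion signal: segment the stream into overlapping windows, compute simple time-domain features, and classify each window as eating or not eating. The window length, features, and random-forest classifier are illustrative assumptions for demonstration and are not the model described in the EarBit work.

# Minimal sketch (assumed pipeline, not the EarBit implementation):
# windowed time-domain features + a generic classifier over a jaw-motion stream.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 250  # samples per window, e.g. 5 s at an assumed 50 Hz sampling rate
STEP = 125    # 50% overlap between consecutive windows

def windows(signal, window=WINDOW, step=STEP):
    """Yield overlapping windows of a 1-D sensor stream."""
    for start in range(0, len(signal) - window + 1, step):
        yield signal[start:start + window]

def features(win):
    """Simple time-domain features of jaw motion within one window."""
    return [win.mean(), win.std(), np.abs(np.diff(win)).mean(), win.max() - win.min()]

def train(streams, window_labels):
    """Fit a classifier; window_labels gives one 0/1 eating label per extracted window."""
    X = [features(w) for s in streams for w in windows(s)]
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, window_labels)
    return clf

def detect(clf, stream):
    """Predict an eating/non-eating label for each window of a new stream."""
    X = [features(w) for w in windows(stream)]
    return clf.predict(X)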
Date
2017-11-13
Extent
03:42 minutes
Resource Type
Moving Image
Resource Subtype
Presentation