Title:
Automatic eating detection in real-world settings with commodity sensing

Author(s)
Thomaz, Edison
Advisor(s)
Abowd, Gregory D.
Essa, Irfan
Abstract
Motivated by challenges and opportunities in nutritional epidemiology and food journaling, ubiquitous computing researchers have proposed numerous techniques for automated dietary monitoring (ADM) over the years. Although progress has been made, a truly practical system that can automatically recognize what people eat in real-world settings remains elusive. This dissertation addresses the problem of ADM by focusing on practical eating moment detection. Eating detection is a foundational element of ADM, since automatically recognizing when a person is eating is required before identifying what and how much is being consumed. Additionally, eating detection can serve as the basis for new types of dietary self-monitoring practices, such as semi-automated food journaling. In this thesis, I show that everyday eating moments such as breakfast, lunch, and dinner can be automatically detected in real-world settings by opportunistically leveraging sensors in practical, off-the-shelf wearable devices. I refer to this instrumentation approach as "commodity sensing". The work covered by this thesis encompasses a series of experiments I conducted with a total of 106 participants, in which I explored a variety of sensing modalities for automatic eating moment detection. The modalities studied include first-person images taken with wearable cameras, ambient sounds, and on-body inertial sensors. I discuss the extent to which first-person images reflecting everyday experiences can be used to identify eating moments using two approaches: human computation, and a combination of state-of-the-art machine learning and computer vision techniques. Furthermore, I describe privacy challenges that arise with first-person photographs. Next, I present results showing how certain sounds associated with eating can be recognized and used to infer eating activities.
Finally, I elaborate on findings from three studies focused on the use of on-body inertial sensors (head and wrists) to recognize eating moments both in a semi-controlled laboratory setting and in real-world conditions. I conclude by relating findings and insights to practical applications, and highlighting opportunities for future work.
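The inertial-sensing approach described above is commonly structured as a sliding-window activity-recognition pipeline: segment the sensor stream into windows, compute per-window statistics, and classify each window. The sketch below illustrates that general pattern only; the window sizes, thresholds, and the toy rule standing in for a trained classifier are all hypothetical and are not the dissertation's implementation.

```python
# Illustrative sliding-window pipeline over wrist accelerometer
# magnitudes. Window size, step, and thresholds are hypothetical.

import math

def window_features(accel, size=30, step=15):
    """Slide a fixed-size window over a list of accelerometer
    magnitudes and compute simple per-window statistics
    (mean and standard deviation)."""
    feats = []
    for start in range(0, len(accel) - size + 1, step):
        w = accel[start:start + size]
        mean = sum(w) / size
        var = sum((x - mean) ** 2 for x in w) / size
        feats.append((mean, math.sqrt(var)))
    return feats

def label_windows(feats, std_lo=0.05, std_hi=0.6):
    """Toy rule in place of a trained classifier: flag windows whose
    motion energy sits between the near-still and high-activity
    bands, as repetitive hand-to-mouth gestures might."""
    return [std_lo <= std <= std_hi for _, std in feats]

# Example: a still signal vs. a gently oscillating "eating-like" one.
still = [1.0] * 60
eating_like = [1.0, 1.3] * 30
still_labels = label_windows(window_features(still))
eating_labels = label_windows(window_features(eating_like))
```

In a real system, the hand-tuned rule would be replaced by a classifier trained on labeled windows, and the feature set would be richer than mean and standard deviation.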
Date Issued
2016-01-07
Resource Type
Text
Resource Subtype
Dissertation