Ph.D. Defense by Edison Thomaz

Ph.D. Dissertation Defense Announcement

Title: Automatic Eating Detection in Real-World Settings with Commodity Sensing

Edison Thomaz

Human-Centered Computing Ph.D. Candidate

School of Interactive Computing

College of Computing

Georgia Institute of Technology

Date: Thursday, November 19th, 2015

Time: 2:00 PM EST

Location: GVU Cafe, 2nd Floor, Technology Square Research Building (TSRB)

COMMITTEE:

Dr. Gregory Abowd, School of Interactive Computing, Georgia Tech (Advisor)
Dr. Irfan Essa, School of Interactive Computing, Georgia Tech (Advisor)
Dr. Thad Starner, School of Interactive Computing, Georgia Tech
Dr. Elizabeth Mynatt, School of Interactive Computing, Georgia Tech
Dr. Tanzeem Choudhury, Department of Information Science, Cornell University
Dr. David Conroy, College of Health and Human Development, Penn State University

ABSTRACT

Motivated by challenges and opportunities in nutritional epidemiology and food journaling, ubiquitous computing researchers have proposed numerous techniques for automated dietary monitoring (ADM) over the years. Although progress has been made, a truly practical system that can automatically recognize what people eat in real-world settings remains elusive. This dissertation addresses the problem of ADM by focusing on practical eating moment detection. Eating detection is a foundational element of ADM since automatically recognizing when a person is eating is required before identifying what and how much is being consumed. Additionally, eating detection can serve as the basis for new types of dietary self-monitoring practices such as semi-automated food journaling.

In this thesis, I show that everyday human eating activities such as breakfast, lunch, and dinner can be automatically detected in real-world settings by opportunistically leveraging sensors in practical, off-the-shelf wearable devices. I refer to this instrumentation approach as "commodity sensing". The work covered by this thesis encompasses a series of experiments, conducted with a total of 106 participants, in which I explored a variety of sensing modalities for automatic eating moment detection. The modalities studied include first-person images taken with wearable cameras, ambient sounds, and on-body inertial sensors.

I discuss the extent to which first-person images reflecting everyday experiences can be used to identify eating moments using two approaches: human computation and a combination of state-of-the-art machine learning and computer vision techniques. I also describe privacy challenges that arise with first-person photographs. Next, I present results showing how certain sounds associated with eating can be recognized and used to infer eating activities. Finally, I elaborate on findings from three studies focused on the use of on-body inertial sensors (head and wrists) to recognize eating moments, both in a semi-controlled laboratory setting and in real-world conditions. I conclude by relating these findings and insights to practical applications and by highlighting opportunities for future work.
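To make the inertial-sensing idea concrete, the sketch below shows a generic sliding-window pipeline for detecting eating moments from wrist accelerometer data. It is illustrative only: the sampling rate, window length, feature set, and classifier are assumptions chosen for the example, not the specific method evaluated in the dissertation.

```python
# Illustrative sketch: sliding-window classification of wrist
# accelerometer data for eating moment detection. All parameters
# (25 Hz rate, 6 s windows, features, random forest) are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

SAMPLE_RATE = 25            # assumed sampling rate in Hz
WINDOW = 6 * SAMPLE_RATE    # 6-second windows
STEP = WINDOW // 2          # 50% overlap between windows

def sliding_windows(arr):
    """Yield overlapping windows along the first axis of arr."""
    for start in range(0, len(arr) - WINDOW + 1, STEP):
        yield arr[start:start + WINDOW]

def window_features(w):
    """Simple per-axis statistics; real systems use richer features."""
    return np.concatenate([
        w.mean(axis=0),                            # mean acceleration
        w.std(axis=0),                             # variability
        np.abs(np.diff(w, axis=0)).mean(axis=0),   # mean jerk magnitude
    ])

def train_detector(accel, labels):
    """accel: (n, 3) accelerometer samples; labels: (n,) 0/1 eating flags."""
    X = np.array([window_features(w) for w in sliding_windows(accel)])
    # A window counts as 'eating' if most of its samples are labeled eating.
    y = np.array([w.mean() > 0.5 for w in sliding_windows(labels)])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return clf.fit(X, y)
```

At inference time, the same windowing and features would be applied to the live sensor stream, with per-window predictions smoothed into contiguous eating episodes.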
