
Auracle: Detecting Eating Episodes with an Ear-mounted Sensor

Published: 18 September 2018

Abstract

In this paper, we propose Auracle, a wearable earpiece that automatically recognizes eating behavior: in free-living conditions, it detects when and for how long a person is eating. Using an off-the-shelf contact microphone placed behind the ear, Auracle captures the sound of chewing as it propagates through the bone and tissue of the head. This audio data is then processed by a custom analog/digital circuit board. To ensure reliable (yet comfortable) contact between microphone and skin, all hardware components are incorporated into a 3D-printed behind-the-head framework. We collected 32 hours of field data from 14 participants in free-living conditions and 2 additional hours of eating data from 10 participants in a laboratory setting. For eating detection, we achieved accuracy exceeding 92.8% and an F1 score exceeding 77.5%. Moreover, Auracle successfully detected 20 to 24 of 26 eating episodes (depending on the metric) in free-living conditions. We demonstrate that our custom device can sense, process, and classify audio data in real time, and we estimate that Auracle can run for 28.1 hours on a 110 mAh battery while communicating its observations of eating behavior to a smartphone over Bluetooth.




Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 2, Issue 3
September 2018
1536 pages
EISSN: 2474-9567
DOI: 10.1145/3279953

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 18 September 2018
Accepted: 01 September 2018
Revised: 01 July 2018
Received: 01 May 2018
Published in IMWUT Volume 2, Issue 3


Author Tags

  1. Acoustic sensing
  2. Activity recognition
  3. Automated dietary monitoring
  4. Earables
  5. Eating detection
  6. Eating episodes
  7. Field studies
  8. Unconstrained environment
  9. Wearable computing

Qualifiers

  • Research-article
  • Research
  • Refereed

Cited By

View all
  • (2024)Continuous glucose monitoring for automatic real-time assessment of eating events and nutrition: a scoping reviewFrontiers in Nutrition10.3389/fnut.2023.130834810Online publication date: 8-Jan-2024
  • (2024)HabitSense: A Privacy-Aware, AI-Enhanced Multimodal Wearable Platform for mHealth ApplicationsProceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies10.1145/36785918:3(1-48)Online publication date: 9-Sep-2024
  • (2024)MunchSonic: Tracking Fine-grained Dietary Actions through Active Acoustic Sensing on EyeglassesProceedings of the 2024 ACM International Symposium on Wearable Computers10.1145/3675095.3676619(96-103)Online publication date: 5-Oct-2024
  • (2024)EchoGuide: Active Acoustic Guidance for LLM-Based Eating Event Analysis from Egocentric VideosProceedings of the 2024 ACM International Symposium on Wearable Computers10.1145/3675095.3676611(40-47)Online publication date: 5-Oct-2024
  • (2024)NIR-sighted: A Programmable Streaming Architecture for Low-Energy Human-Centric Vision ApplicationsACM Transactions on Embedded Computing Systems10.1145/367207623:6(1-26)Online publication date: 11-Sep-2024
  • (2024)GustosonicSense: Towards understanding the design of playful gustosonic eating experiencesProceedings of the 2024 CHI Conference on Human Factors in Computing Systems10.1145/3613904.3642182(1-12)Online publication date: 11-May-2024
  • (2024)Egocentric Image Captioning for Privacy-Preserved Passive Dietary Intake MonitoringIEEE Transactions on Cybernetics10.1109/TCYB.2023.324399954:2(679-692)Online publication date: Feb-2024
  • (2024)EarPrint: Earphone-Based Implicit User Authentication With Behavioral and Physiological AcousticsIEEE Internet of Things Journal10.1109/JIOT.2024.341762211:19(31128-31143)Online publication date: 1-Oct-2024
  • (2023)Technology to Automatically Record Eating Behavior in Real Life: A Systematic ReviewSensors10.3390/s2318775723:18(7757)Online publication date: 8-Sep-2023
  • (2023)Ear canal pressure sensor for food intake detectionFrontiers in Electronics10.3389/felec.2023.11736074Online publication date: 18-Jul-2023
  • Show More Cited By
