Abstract
Chronic and widespread diseases such as obesity, diabetes, and hypercholesterolemia require patients to monitor their food intake, and food journaling is currently the most common method for doing so. However, food journaling is subject to self-bias and recall errors, and is poorly adhered to by patients. In this paper, we propose an alternative by introducing EarBit, a wearable system that detects eating moments. We evaluate the performance of inertial, optical, and acoustic sensing modalities and focus on inertial sensing, by virtue of its recognition and usability performance. Using data collected in a simulated home setting with minimal restrictions on participants' behavior, we build our models and evaluate them with an unconstrained outside-the-lab study. For both studies, we obtained video footage as ground truth for participants' activities. Using leave-one-user-out validation, EarBit recognized all the eating episodes in the semi-controlled lab study, and achieved an accuracy of 90.1% and an F1-score of 90.9% in detecting chewing instances. In the unconstrained, outside-the-lab evaluation, EarBit obtained an accuracy of 93% and an F1-score of 80.1% in detecting chewing instances. It also accurately recognized all but one of the recorded eating episodes. These episodes ranged from a 2-minute snack to a 30-minute meal.
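The evaluation protocol above (leave-one-user-out validation, with accuracy and F1-score computed over chewing predictions) can be sketched as follows. This is an illustrative outline only, not EarBit's actual pipeline; the data layout (a dict mapping user IDs to feature/label pairs) is an assumption made for the example.

```python
# Illustrative sketch of leave-one-user-out evaluation with accuracy and
# F1-score, as used to report the chewing-detection results. Not the
# authors' implementation; the `data` layout is a hypothetical example.

def accuracy_f1(y_true, y_pred):
    """Compute accuracy and F1 over binary chewing labels (1 = chewing)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, f1

def leave_one_user_out(data):
    """data: dict of user_id -> (features, labels).

    Yields (train, test) splits; each user is held out exactly once,
    so no user's data appears in both training and test sets.
    """
    for held_out in data:
        train = {u: d for u, d in data.items() if u != held_out}
        test = {held_out: data[held_out]}
        yield train, test
```

Holding out each user in turn ensures the reported numbers reflect generalization to people the model has never seen, which is the appropriate measure for a wearable intended for new users out of the box.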
EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments