Understanding the Social Context of Eating with Multimodal Smartphone Sensing: The Role of Country Diversity

Authors: Nathan Kammoun, Lakmal Meegahapola, Daniel Gatica-Perez

Abstract: Understanding the social context of eating is crucial for promoting healthy eating behaviors through timely interventions. Multimodal smartphone sensing data has the potential to provide valuable insights into eating behavior, particularly in mobile food diaries and mobile health applications. However, research on the social context of eating with smartphone sensor data is limited, despite extensive study in nutrition and behavioral science. Moreover, the impact of country differences on the social context of eating, as measured by multimodal phone sensor data and self-reports, remains under-explored. To address this research gap, we present a study using a smartphone sensing dataset from eight countries (China, Denmark, India, Italy, Mexico, Mongolia, Paraguay, and the UK). Our study focuses on a set of approximately 24K self-reports on eating events provided by 678 college students to investigate the country diversity that emerges from smartphone sensor data during eating events in different social contexts (alone or with others). Our analysis revealed that while some smartphone usage features during eating events were similar across countries, others exhibited behaviors unique to each country. We further studied how user- and country-specific factors impact social context inference by developing machine learning models with population-level (non-personalized) and hybrid (partially personalized) experimental setups. We showed that models based on the hybrid approach achieve AUC scores of up to 0.75 with XGBoost. These findings have implications for future research on mobile food diaries and mobile health sensing systems, emphasizing the importance of considering country differences when building and deploying machine learning models to minimize biases and improve generalization across different populations.
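
The abstract contrasts a population-level (non-personalized) setup with a hybrid (partially personalized) one for inferring whether a user eats alone or with others. The sketch below illustrates that distinction under stated assumptions: it is not the authors' code, and the synthetic features, the number of users/reports, and the model hyperparameters are hypothetical placeholders. The key difference is how the data is split: by whole users for the population-level setup, versus mixing each user's self-reports across train and test for the hybrid setup, with an XGBoost classifier scored by AUC in both cases.

```python
# Minimal sketch of the two evaluation setups described in the abstract.
# Synthetic data stands in for multimodal sensor features per eating-event
# self-report; only the splitting strategies are meant to be illustrative.
import numpy as np
from sklearn.model_selection import GroupShuffleSplit, train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
n_users, reports_per_user, n_features = 50, 40, 10  # hypothetical sizes

# Hypothetical feature matrix: one row per eating-event self-report.
X = rng.normal(size=(n_users * reports_per_user, n_features))
user_id = np.repeat(np.arange(n_users), reports_per_user)
# Binary label: 1 = eating with others, 0 = eating alone (synthetic).
y = (X[:, 0] + 0.5 * rng.normal(size=len(X)) > 0).astype(int)

def train_and_score(train_idx, test_idx):
    """Fit an XGBoost classifier on the train split and report AUC on the test split."""
    model = XGBClassifier(n_estimators=200, max_depth=4)
    model.fit(X[train_idx], y[train_idx])
    scores = model.predict_proba(X[test_idx])[:, 1]
    return roc_auc_score(y[test_idx], scores)

# Population-level (non-personalized): hold out entire users,
# so no test user contributes any data at training time.
gss = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
pop_train, pop_test = next(gss.split(X, y, groups=user_id))

# Hybrid (partially personalized): every user contributes some
# reports to training and the rest to testing.
hyb_train, hyb_test = train_test_split(
    np.arange(len(X)), test_size=0.2, stratify=user_id, random_state=0
)

print(f"population-level AUC: {train_and_score(pop_train, pop_test):.2f}")
print(f"hybrid AUC:           {train_and_score(hyb_train, hyb_test):.2f}")
```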

Submitted to arXiv on 01 Jun. 2023
