1. Introduction
As stated by the World Health Organization (WHO), “Nutrition is coming to the fore as a major modifiable determinant of chronic disease, with scientific evidence increasingly supporting the view that alterations in diet have strong effects, both positive and negative, on health throughout life” [
1]. It is therefore of key importance to find efficient and robust methodologies to study eating behavior and food intake, in order to help reduce potential long-term health problems caused by unhealthy diets. Past research on eating behaviors and attitudes has relied heavily on self-report tools, such as 24-h recalls, food records (food diaries), and food frequency questionnaires (FFQ; [
2,
3,
4]). However, the limitations of this classical approach to studying eating behaviors and attitudes are increasingly recognized. A major limitation is that self-report tools rely on participants’ recall, which may be inaccurate or biased (especially regarding the actual amount of food or liquid consumed [
5]). Recall biases can be caused by demand characteristics, which are cues that may indicate the study aims to participants, leading them to change their behaviors or responses based on what they think the research is about [
6], or more generally by the desire to comply with social norms and expectations when it comes to food intake [
7,
Additionally, the majority of studies investigating eating behavior are performed in the lab, which does not allow a realistic replication of the many influences on eating behavior that occur in real life (e.g., [
9]). Hence, to overcome these limitations, it is crucial to examine eating behavior and the effect of interventions in daily life, at home, or at institutions such as schools and hospitals. In these settings, unlike in the lab, people typically behave naturally. It is also important that measuring real-life eating behavior in naturalistic settings relies on implicit, non-obtrusive measures [
10], which are objective and able to overcome potential biases.
There is a growing interest in identifying technologies able to improve the quality and validity of data collected to advance nutrition science. Such technologies should enable measuring eating behavior passively (i.e., without requiring action or mental effort on the part of the users), objectively, and reliably in realistic contexts. Importantly, to maximize the efficiency of real-life measurement, it is vital to develop technologies that capture eating behavior patterns in a low-cost, unobtrusive, and easy-to-analyze way. For real-world practicality, the technologies should be comfortable and acceptable, so that they can be used in naturalistic settings for extended periods while respecting the users’ privacy.
To gain insight into the state of the art in this field, we performed a search for published papers using technologies to measure eating behavior in real-life settings. In addition to papers describing specific systems and technologies, this search returned many review papers, some of which were systematic reviews.
Evaluating these systematic reviews, we found that an up-to-date overview encompassing all (close-to) available technologies to automatically record eating behavior in real-life settings is still missing. To fill this gap, we here provide such an overview of technologies, categorized by the type of eating behavior they measure and the type of sensor technology they use. We indicate to what extent these technologies are readily available for use. With this review, we aim to (1) help researchers identify the most suitable technology to measure eating behavior in real life, and to provide a basis for determining next steps in (2) research on measuring eating behavior in real life and (3) technology development.
2. Methods and Procedure
Literature Review
Our literature search reporting adheres to the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) checklist [
11,
12]. The PRISMA guidelines ensure that the literature is reviewed in a standard and systematic manner. This process comprises four phases: identification, screening, eligibility, and inclusion. The PRISMA diagram showing the search flow and the inclusion/exclusion of records and reports in this study is shown in
Figure 1.
Eligibility Criteria
Our literature search aimed to identify mature and practical (i.e., not too complex or restrictive) technologies that can be used to unobtrusively assess food or drink intake in real-life conditions. The inclusion criterion was a sensor-based approach to the detection of eating or drinking. Studies not describing a sensor-based device to detect eating or drinking were excluded.
Screening Strategy
Figure 1 presents an overview of the screening strategy. In the first round, the titles and abstracts returned (n = 1241, after the elimination of 68 duplicates) were reviewed against the eligibility criterion. If the title and/or abstract mentioned a sensor-based approach to the detection of eating or drinking, the paper was retained for the full-text screening stage; papers that did not describe a sensor-based device to detect eating or drinking were excluded. Full-text screening was conducted on the remaining articles (n = 126), leading to a final sample of 73 included papers from the initial automatic search. Papers focusing on animal studies (n = 4), food recognition (n = 7), nutrient estimation (n = 6), system design (n = 2), or other unrelated topics (n = 10) were excluded. While review papers (n = 20) were also excluded from our technology overview (
Table 1), they were evaluated (
Table A1 and
Table A2 in
Appendix A) and used to define the scope of this study. Additional papers were identified by manually searching the reference lists of the screened full texts (n = 87). Full-text screening of these additional papers led to a final sample of 49 included papers from the manual search. Papers about dietary recall (n = 7), food recognition (n = 11), nutrient estimation (n = 7), systems already described in papers from the initial automatic search (n = 2), or other unrelated topics (n = 6) were excluded. Again, review papers (n = 5) were excluded from our technology overview (
Table 1) but evaluated and used to define the scope of this study. The total number of papers included in this review amounts to 122.
All screening rounds were conducted by two of the authors. Each record’s eligibility was decided by two reviewers based on its title and abstract, taking the exclusion criteria into consideration. When a record was rejected by one reviewer and accepted by the other, it was further evaluated by all authors and retained when a majority voted in favor.
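For transparency, the screening counts reported above can be tallied as follows (a minimal sketch using only the numbers given in this section; the variable names are ours):

```python
# Tally of the screening counts reported in this section.
automatic_records = 1241   # titles/abstracts screened after removing 68 duplicates
automatic_included = 73    # papers included from the initial automatic search
manual_records = 87        # records identified via reference lists (manual search)
manual_included = 49       # papers included from the manual search

total_screened = automatic_records + manual_records
total_included = automatic_included + manual_included

print(total_screened)  # 1328, the number of screened studies reported in the Discussion
print(total_included)  # 122, the final number of included papers
```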
Reporting
We evaluated and summarized the review papers that our search returned in two tables (
Appendix A).
Table A1 includes systematic reviews, while
Table A2 includes non-systematic reviews. We defined systematic reviews as reviews following the PRISMA methodology. For all reviews we reported the year of publication and the general scope of the review. For systematic reviews we also reported the years of inclusion, the number of papers included, and the specific requirements for inclusion.
We summarized the core information about the devices and technologies for measuring eating and drinking behaviors from our search results in
Table 1. This table categorizes the studies retrieved in our literature search in terms of their measurement objectives, target measures, the devices and algorithms that were used as well as their (commercial or public) availability, and the way they were applied (method). In the column ‘Objective’, the purposes of the measurements are described. The three objectives we distinguish are ‘Eating/drinking activity detection’, ‘Bite/chewing/swallowing detection’, and ‘Portion size estimation’. Note that the second and third objectives can be considered subcategories of the first; technologies are included in the first only if they could not be grouped under the second or third. The objectives are further specified in the column ‘Measurement targets’. In the column ‘Device’, we itemize the measurement tools or sensors used in the different systems. For each type of device, one or more representative papers were selected, bearing in mind the TRL (Technology Readiness Level [
15]), the availability (off-the-shelf) of the device and algorithm that were used, the year of publication (recent), and the number of times it was cited. The minimum TRL was 2, and the paper with the highest TRL among papers using similar techniques was selected as the representative paper. A concise description of each representative example is given in the column ‘Method’. The commercial availability of the example devices and algorithms is indicated in the columns ‘Off-the-shelf device’ and ‘Ready-to-use algorithm’. Lastly, other studies using similar systems are listed in the column ‘Similar papers’. Systems combining devices for several different measurement targets can appear in different table rows. To indicate this, they are labeled with successive letters for each row they appear in (e.g., [25]a and [25]b).
For each of the three objectives we counted the number of papers that described sensors that are designed (1) to be attached to the body, (2) to be attached to an object, (3) to be placed in the environment, or (4) to be held in the hand. Sensors attached to the body were further subdivided by body location. The results are visualized using bar graphs.
Figure 1.
PRISMA flow diagram describing the different phases of the procedure used to identify tools for food and drink intake assessment.
3. Results
Table 1 summarizes the core information of devices and technologies for measuring eating and drinking behaviors from our search results.
Eating and Drinking Activity Detection
For ‘eating/drinking activity detection’, many systems have been reported that measure eating and drinking-related motions. In particular, many papers reported measuring these actions using motion sensors such as inertial sensor modules (i.e., Inertial Measurement Units or IMUs). IMUs typically consist of several sensors, such as an accelerometer, gyroscope, and magnetometer, and are embedded in smartphones and wearable devices such as smartwatches. In [
16] researchers collected IMU signals with off-the-shelf smartwatches to identify hand-based eating and drinking-related activities. In this case, participants wore smartwatches on their preferred wrists. Other studies have employed IMUs worn on the wrist, head, neck, and combinations thereof [
17,
18,
19,
20]. Besides IMUs, proximity sensors, piezoelectric sensors, and radar sensors are also used to detect hand-mouth gestures or jawbone movements [
21,
22,
23]. Pressure sensors are used to measure eating activity as well. For instance, in [
24] eating activities and the amount of consumed food are measured by a pressure-sensitive tablecloth and tray. These devices provide information on food intake-related actions, such as cutting, scooping, stirring, the identification of the plate or container on which the action is executed, and allow the tracking of weight changes of plates and containers. Microphones, RGB-D images, and video cameras are also used to detect eating and drinking-related motions. In [
25], eating actions are detected by a ready-to-use algorithm as the 3D overlap of the mouth and food, using RGB-D images taken with a commercially available smartphone. Ear-worn sensors can measure in-body glucose levels [
26] and tooth-mounted dielectric sensors can measure impedance changes in the mouth signaling the presence of food [
27]. Although these latter methods can directly detect eating activity, the associated devices and data processing algorithms are still in the research phase.
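To illustrate how such IMU-based detection pipelines are commonly structured, the sketch below windows a wrist accelerometer signal and applies a crude variability rule (this is a simplified illustration, not the implementation of any system cited above; the window length, threshold, and features are our own assumptions, and published systems replace the rule with a trained classifier):

```python
import numpy as np

def sliding_windows(signal, window, step):
    """Split an (N, 3) accelerometer array into overlapping windows."""
    return [signal[i:i + window] for i in range(0, len(signal) - window + 1, step)]

def window_features(w):
    """Simple hand-crafted features per window: mean, std, and mean absolute value."""
    mag = np.linalg.norm(w, axis=1)
    return np.array([mag.mean(), mag.std(), np.abs(w).mean()])

def detect_eating_windows(accel, window=150, step=75, std_threshold=0.6):
    """Flag windows whose movement variability resembles repeated hand-to-mouth gestures."""
    flags = []
    for w in sliding_windows(accel, window, step):
        feats = window_features(w)
        flags.append(feats[1] > std_threshold)  # crude variability rule as a placeholder
    return flags

# Synthetic example: 30 s of 50 Hz wrist accelerometer data with one active segment
rng = np.random.default_rng(0)
accel = rng.normal(0.0, 0.3, size=(1500, 3))
accel[500:650] += rng.normal(0.0, 1.0, size=(150, 3))  # simulated eating-like movement
print(sum(detect_eating_windows(accel)), "candidate eating windows")
```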
Bite, Chewing, or Swallowing Detection
In the category of ‘bite/chewing/swallowing detection’, we grouped studies in which the number of bites (bite count), bite weight, and chewing or swallowing actions are measured. Motion sensors and video are used to detect bites (count). For instance, bite counts have been derived from video using the off-the-shelf pose estimation software OpenPose [
28]. To assess bite weight, weight sensors and acoustic sensors have been used [
29,
30].
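A hedged sketch of how a bite count can be derived from pose-estimation output such as OpenPose keypoints is given below (the keypoint arrays, distance threshold, and minimum gap between bites are our assumptions; the cited study's exact rules may differ):

```python
import numpy as np

def count_bites(wrist_xy, mouth_xy, near_threshold=40.0, min_gap_frames=25):
    """Count hand-to-mouth events from per-frame 2D keypoints (in pixels).

    A 'bite' is counted whenever the wrist comes within `near_threshold` pixels
    of the mouth, with at least `min_gap_frames` frames between counted events.
    """
    dist = np.linalg.norm(wrist_xy - mouth_xy, axis=1)
    bites, last_bite = 0, -min_gap_frames
    for frame, d in enumerate(dist):
        if d < near_threshold and frame - last_bite >= min_gap_frames:
            bites += 1
            last_bite = frame
    return bites

# Toy example: the wrist approaches the mouth twice over 200 frames
frames = 200
mouth = np.tile([320.0, 180.0], (frames, 1))
wrist = np.tile([320.0, 400.0], (frames, 1))
wrist[50:60, 1] = 190.0    # first approach
wrist[140:150, 1] = 190.0  # second approach
print(count_bites(wrist, mouth))  # 2
```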
Chewing and swallowing are the most extensively studied eating- and drinking-related activities, as reflected by the number of papers focusing on them (31 papers). Motion sensors and microphones are typically employed for this purpose. For instance, in [
31], a gyroscope is used for chewing detection, an accelerometer for swallowing detection, and a proximity sensor to detect hand-to-mouth gestures. Microphones are typically used to register chewing and swallowing sounds. In most cases, commercially available microphones are applied, while the detection algorithms are custom-made. Video, electroglottograph (EGG), and electromyography (EMG) devices are also used to detect chewing and swallowing. EGG detects the variations in electrical impedance caused by the passage of food during swallowing, while EMG in these studies monitors the activation of the masseter and temporalis muscles to record chewing strokes.
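The general shape of acoustic chewing detection can be illustrated with a minimal energy-based sketch (thresholds and frame settings are invented; the custom algorithms in the cited studies use richer spectral features and trained classifiers):

```python
import numpy as np

def frame_energies(audio, sr, frame_ms=50):
    """Short-time energy of a mono audio signal, one value per frame."""
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(audio) // frame_len
    frames = audio[:n_frames * frame_len].reshape(n_frames, frame_len)
    return (frames ** 2).mean(axis=1)

def chewing_segments(audio, sr, frame_ms=50, energy_threshold=0.01, min_frames=4):
    """Return (start_s, end_s) spans where frame energy stays above a threshold,
    a crude stand-in for chew-burst detection."""
    energy = frame_energies(audio, sr, frame_ms)
    frame_s = frame_ms / 1000.0
    segments, start = [], None
    for i, above in enumerate(energy > energy_threshold):
        if above and start is None:
            start = i
        elif not above and start is not None:
            if i - start >= min_frames:
                segments.append((start * frame_s, i * frame_s))
            start = None
    if start is not None and len(energy) - start >= min_frames:
        segments.append((start * frame_s, len(energy) * frame_s))
    return segments

# Synthetic example: 5 s of quiet signal with one louder 1-second burst
sr = 8000
rng = np.random.default_rng(1)
audio = rng.normal(0.0, 0.02, sr * 5)
audio[2 * sr:3 * sr] += rng.normal(0.0, 0.3, sr)
print(chewing_segments(audio, sr))  # roughly [(2.0, 3.0)]
```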
Portion Size Estimation
Portion size is estimated mainly by using weight sensors and food image analysis. Regarding weight sensors, the amount of food consumed is calculated by comparing the weights of plates before and after eating. An open-source system consisting of a wireless pocket-sized kitchen scale connected to a mobile application has been reported in [
32]. A system turning an everyday smartphone into a weighing scale is also available [
33]. The vibration intensity of the smartphone’s vibration motor, as measured by its built-in accelerometer, is used to estimate the weight of food placed on the smartphone (Figure 2). Off-the-shelf smartphone cameras are typically used for volume estimation from food images. Several studies additionally use RGB-D images to obtain more accurate volume estimates from information on the height of the target food. For image-based approaches, AI-based algorithms are often employed to calculate portion size. Some studies made prototype systems applicable to real-life situations. In [
34], acoustic data from a microphone was collected along with food images to measure the distance from the camera to the food. This enables scaling the size of the food in the image to its actual size without training images or reference objects. In most other cases, image-based approaches need a reference object to scale the food size. Besides image analysis, in [
35], researchers took a 360-degree scanned video obtained with a laser module and a diffraction lens and applied their volume estimation algorithm to the data. In addition to the above devices, a method to estimate portion size using EMG has been reported [
36]. In this study, an EMG sensor embedded in an armband detects different signal patterns depending on the weight a user is holding.
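To make the smartphone-as-scale idea above concrete, the sketch below fits a linear calibration between vibration amplitude and known weights (the calibration values and the linear model are invented for illustration and are not taken from the cited VibroScale paper):

```python
import numpy as np

# Invented calibration data: heavier loads damp the vibration amplitude
# measured by the phone's accelerometer while the vibration motor runs.
calib_weights_g = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
calib_vibration = np.array([1.00, 0.86, 0.73, 0.61, 0.50])  # relative amplitude

# Simple linear calibration: weight ≈ a * vibration + b
a, b = np.polyfit(calib_vibration, calib_weights_g, 1)

def estimate_weight(vibration_amplitude):
    """Estimate the weight (g) of an item on the phone from the relative
    vibration amplitude recorded by the built-in accelerometer."""
    return a * vibration_amplitude + b

print(round(estimate_weight(0.67), 1))  # about 128 g with this toy calibration
```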
For estimating portion size in drinks, several kinds of sensors have been tested. An IMU in a smartwatch was used to estimate drink intake volume from sip duration [
37]. Also, in [
38], liquid sensors such as a capacitive sensor and a conductivity sensor were used to monitor the filling levels in a cup. Some research groups developed so-called smart fridges that automatically register food items and quantities. In [
39], image analysis of a thermal image taken by an infrared (IR) sensor embedded in a fridge provides an estimate of the drink volume. Another study proposed the Playful Bottle system [
40], which consists of a smartphone attached to a drinking mug. Drinking motions such as picking up the mug, tilting it back, and placing it on the desk are detected by the phone’s accelerometer. After the drinking action is completed and the water line becomes steady, the phone’s camera captures an image of the amount of liquid remaining in the mug (
Figure 3).
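A minimal sketch of the sip-duration idea is shown below (the flow-rate coefficient and minimum sip duration are invented values; the cited smartwatch system uses its own, more elaborate model):

```python
# Toy model: intake volume per sip grows roughly linearly with sip duration.
ML_PER_SECOND = 18.0   # assumed flow rate while sipping (invented value)
MIN_SIP_S = 0.5        # ignore very short wrist-tilt events

def estimate_drink_volume(sip_durations_s):
    """Estimate total intake (ml) from sip durations detected by a wrist IMU."""
    return sum(ML_PER_SECOND * d for d in sip_durations_s if d >= MIN_SIP_S)

print(estimate_drink_volume([1.2, 0.3, 2.0, 1.5]))  # 84.6 ml from three valid sips
```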
Sensor Location
Figure 4 indicates where sensors are typically located per objective. The locations of the sensors are classified as attached to the body (e.g., ear, neck, head, glasses), embedded in objects (e.g., plates, cutlery), placed in the environment (e.g., distant camera, magnetic trackers), or held in the hand. For eating/drinking activity detection, sensors are mostly worn on the body, followed by sensors embedded in objects. Body-worn sensors are also used for bite/chewing/swallowing detection. For portion size estimation, in contrast, object-embedded and handheld sensors are mainly chosen, depending on the measurement target.
Figure 5 shows the locations of wearable body sensors used in the reviewed studies. Sensors attached to the wrist are used most frequently (32 cases), followed by sensors embedded in glasses (19 cases) and attached to the ear (14 cases).
Table 1.
Summary of core information of devices and technologies for measuring eating and drinking behaviors.
| Objective | Measurement Target | Device | Representative paper | Method | Off-the-shelf device | Ready-to-use algorithm | Similar papers |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Eating/drinking activity detection | eating/drinking motion | motion sensor | [16] | eating and drinking detection from smartwatch IMU signal | Y | N | [41]a, [42]a, [43,44], [45], [46]a, [40]a, [21,47], [48]a, [22,49,50], [51]a, [37]a, [52]a, [17,18,53,54], [36]a, [55], [19]a, [56,57,58], [59]a, [60], [61]a, [62], [20]a, [63]a, [64,65,66,67] |
| | | | [23] | detecting eating and drinking gestures from FMCW radar signal | N | N | |
| | | | [24]a | eating activities and amount consumed measured by pressure-sensitive tablecloth and tray | N | N | |
| | | microphone | [46]b | eating detection from fused inertial-acoustic sensing using smartwatch with embedded IMU and microphone | Y | N | [26]a, [68], [59]b |
| | | RGB-D image | [25]a | eating action detected from smartphone RGB-D image as 3D overlap between mouth and food | Y | Y | |
| | | video | [69] | eating detection from cap-mounted video camera | Y | N | [54]a |
| | liquid level | liquid sensor | [70] | capacitive liquid level sensor | N | N | [71] |
| | in-body glucose level | glucose sensor | [26]b | glucose level measured by ear-worn sensor | N | N | |
| | impedance change in mouth | dielectric sensor | [27] | RF-coupled tooth-mounted dielectric sensor measures impedance changes due to food in mouth | N | N | |
| | user identification | PPG (photoplethysmography) sensor | [52]b | identify the user from heart rate | N | N | |
| Bite/chewing/swallowing detection | bites (count) | motion sensor | [72] | a gyroscope mounted on a finger to detect motions of picking up food and delivering it to the mouth | Y | N | [73,74] |
| | | video | [28] | bite count by video analysis using OpenPose pose estimation software | Y | Y | |
| | bite weight | weight sensor | [29]a | plate-type base station with embedded weight sensors to measure amount and location of bites | N | N | [54]a |
| | | acoustic sensor | [30] | commercial ear buds, estimation model based on non-audio and audio features | Y | N | |
| | chewing/swallowing | motion sensor | [31]a | chewing detection from gyroscope, swallowing detection from accelerometer, hand-to-mouth gestures from proximity sensor | Y | Y | [75,76,77,78], [48]b, [79,80], [81]a, [82], [83]a, [84,85,86], [61]b, [20]b |
| | | microphone | [87] | wearable microphone with minicomputer to detect chewing/swallowing sounds | Y | N | [42]b, [88,89,90], [81]b, [19]b, [91], [83]b, [92] |
| | | video | [93] | classification of facial action units from video | Y | N | [54]b |
| | | EGG | [94] | swallowing detected by larynx-mounted EGG device | Y | N | |
| | | EMG | [95] | eyeglasses equipped with EMG to monitor temporalis muscles' activity | N | N | [42]c, [96] |
| Portion size estimation | portion size food | motion sensor | [33] | acceleration sensor of smartphone, measuring vibration intensity | Y | Y | [97]a |
| | | weight sensor | [32] | wireless pocket-sized kitchen scale connected to app | Y | Y | [98,99,100,101,102], [54]b, [103], [29]b, [97]b, [104], [105]a, [106], [63]b, [24]b |
| | | image | [107] | AI-based system to calculate food leftovers | Y | Y | [31]b, [34,108,109,110,111,112,113,114,115], [116]b, [105]b, [105,117,118,119,120,121,122] |
| | | | [34] | measuring the distance from the camera to the food using smartphone images combined with microphone data | Y | N | [123] |
| | | | [124] | RGB-D image and AI-based system to estimate consumed food volume using before- and after-meal images | Y | Y | [25]b, [125,126,127,128,129,130,131] |
| | | laser | [35] | 360-degree scanned video; the system design includes a volume estimation algorithm and a hardware add-on consisting of a laser module and a diffraction lens | N | N | [132] |
| | | EMG | [36]b | weight of food consumed from EMG data | N | N | |
| | portion size drink | motion sensor | [37]b | volume from sip duration from IMU in smartwatch | Y | N | [41]b, [51]b |
| | | infrared (IR) sensor | [39] | thermal image by IR sensor embedded in smart fridge | N | N | |
| | | liquid sensor | [38] | capacitive sensor, conductivity sensor, flow sensor, pressure sensor, force sensors embedded in different mug prototypes | N | N | |
| | | image | [40]b | smartphone camera attached to mug | N | N | [133] |
4. Discussion
This systematic review provides an up-to-date overview of all (close-to) available technologies to automatically record eating behavior in real life. Technologies included in this review should enable measuring eating behavior passively (i.e., without users’ active input), objectively, and reliably in realistic contexts, to avoid relying on subjective user recall. We performed our review to help researchers identify the most suitable technology to measure eating behavior in real-life settings, and to provide a basis for determining next steps in both technology development and measuring eating behavior in real life. In total, 1328 studies were screened, and 122 studies were included after application of objective inclusion and exclusion criteria; 25 of these studies contained more than one technology. We found that relatively simple sensors are often used to measure eating behaviors. Motion sensors are commonly used for eating/drinking activity detection and bite/chewing/swallowing detection; in addition, microphones are often used in studies focusing on chewing/swallowing. These sensors are usually attached to the body, in particular to the wrist for eating/drinking activity detection and to areas close to the face for detecting bite/chewing/swallowing. For portion size estimation, weight sensors and images from photo cameras are mostly used.
Concerning next steps in technology development, the information in the columns ‘Off-the-shelf device’ and ‘Ready-to-use algorithm’ in the technology overview table indicates which devices and algorithms are not ready for use yet and would benefit from further development. The category ‘portion size estimation’ seems most mature with respect to off-the-shelf availability and ready-to-use algorithms. Overall, what is mostly missing is ready-to-use algorithms. It is an enormous challenge to build fixed algorithms that accurately recognize eating behavior under varying conditions of sensor noise, types of food, and individuals’ behavior and appearance. By algorithms, we here typically refer to machine learning or AI algorithms. These are trained using annotated (correctly labelled) data and only work well in conditions similar to the ones they were trained in. In most reviewed studies, demonstrations of algorithms are limited to controlled conditions and a small number of participants. Therefore, these algorithms still need to be tested and evaluated for accuracy and generalizability outside the laboratory, such as in homes, restaurants, and hospitals.
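To make the generalizability point concrete, the sketch below evaluates a toy eating/not-eating classifier with a leave-one-setting-out scheme (the data, features, and classifier are entirely synthetic and illustrative, not taken from any cited study):

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Fit a trivial nearest-centroid classifier (stand-in for a real model)."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def nearest_centroid_predict(centroids, X):
    labels = list(centroids)
    dists = np.stack([np.linalg.norm(X - centroids[l], axis=1) for l in labels], axis=1)
    return np.array(labels)[dists.argmin(axis=1)]

def leave_one_setting_out(X, y, settings):
    """Train on all but one recording setting (e.g., lab, home, restaurant) and
    test on the held-out setting, probing how well the model generalizes."""
    scores = {}
    for held_out in np.unique(settings):
        train, test = settings != held_out, settings == held_out
        model = nearest_centroid_fit(X[train], y[train])
        pred = nearest_centroid_predict(model, X[test])
        scores[held_out] = float((pred == y[test]).mean())
    return scores

# Synthetic sensor features from three settings; labels: 1 = eating, 0 = not eating
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 4))
y = rng.integers(0, 2, size=300)
X[y == 1] += 1.0                               # make the classes separable in this toy data
settings = np.repeat(["lab", "home", "restaurant"], 100)
print(leave_one_setting_out(X, y, settings))   # per-setting accuracy
```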
When it comes to real-life studies, the obtrusiveness of the devices is an important factor. Devices should minimally interfere with the natural behavior of participants. Devices worn on the body with wires connected to a battery or other devices may restrict eating motions and constantly remind participants that they are being recorded. Wireless devices are preferable in that respect, but battery duration may then be a limitation for long-term studies. Devices such as tray-embedded sensors and cameras that are not attached to the participant’s body are advantageous in terms of both obtrusiveness and battery duration.
Although video cameras can provide holistic data on participants’ eating behaviors, they present privacy concerns. When a camera is used to film the course of a meal, the footage reveals the participant’s physical characteristics and enables identification of the participant. Also, when the experiments are done at home, participants cannot avoid showing their private environment. Ideally, experiments should allow data to be collected anonymously when identifiable data are not needed for a specific purpose such as clinical data collection. This could be achieved by only storing features extracted from the camera data rather than the images themselves, though this prohibits later validation and improvement of the feature extraction [
134]. Systems using weight sensors do not suffer from privacy issues in the way that camera images of the face do. The authors of [
105] used a weight sensor in combination with a camera pointing downward at the scales to keep track of the consumption of various types of seasonings.
For future research, we think it will be powerful to combine methods and sensor technologies. While most studies rely on single types of technologies, there are successful examples of combinations that illustrate a number of ways in which system and data quality can be improved. For instance, a novel and robust device called SnackBox [
135] consists of three containers for different types of snacks embedded on weight sensors (
Figure 6) and can be used to monitor snacking behavior at home. It can be connected to wearables and smartphones, thereby allowing for contextualized interpretation of signals recorded from the participant and for targeted Ecological Momentary Assessment (EMA [
136]). With EMA, individuals are prompted to report current behavior and experiences in their natural environment, thereby avoiding reliance on memory. For instance, when the SnackBox detects snacking behavior, EMA can assess the individual’s current mood state through a short questionnaire. This affords the collection of more detailed and more accurate information compared to asking for this information at a later moment in time. Combining different sensor technologies can also have other benefits. Some studies used a motion detector or an audio sensor as a switch to turn on other devices such as a chewing counter or a camera [
61,
90]. These systems collect data only during meals, thereby limiting superfluous data collection, which is undesirable with respect to privacy and to the battery life of devices worn the whole day. In a study imitating a restaurant setting, a system consisting of custom-made table-embedded weight sensors and passive RFID (radio-frequency identification) antennas was used [
98]. This system detects weight changes in the food served on the table and recognizes what the food is via RFID tags, thereby complementing the information that would have been obtained with either sensor alone and facilitating the interpretation of the data. Other studies used an IMU in combination with a microphone to detect eating behaviors [
59,
81]. It was concluded that the acoustic sensor in combination with motion-related sensors improved the detection accuracy significantly compared to motion-related sensors alone.
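As an illustration of the event-triggered EMA idea, the sketch below flags weight drops in a snack container and triggers a prompt (a schematic example; the threshold, sampling, and questionnaire content are placeholders and not part of the SnackBox system as published):

```python
from dataclasses import dataclass

@dataclass
class WeightSample:
    timestamp_s: float
    weight_g: float

def detect_snack_events(samples, drop_threshold_g=5.0):
    """Flag timestamps where the container weight drops, suggesting a snack was taken."""
    events = []
    for prev, cur in zip(samples, samples[1:]):
        if prev.weight_g - cur.weight_g >= drop_threshold_g:
            events.append(cur.timestamp_s)
    return events

def trigger_ema(timestamp_s):
    """Placeholder for pushing a short mood questionnaire to the participant's phone."""
    print(f"EMA prompt sent at t={timestamp_s:.0f}s: 'How are you feeling right now?'")

samples = [WeightSample(0, 250.0), WeightSample(60, 249.8),
           WeightSample(120, 238.0), WeightSample(180, 238.1)]
for t in detect_snack_events(samples):
    trigger_ema(t)  # one prompt, triggered by the ~12 g drop at t = 120 s
```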
Besides investing in research on combining methods and sensor technologies, research applying and validating these technologies in out-of-the-lab studies is essential. Generalizability between lab and real-life studies should be examined, as well as generalization across situations and user groups, and user experience. Such studies will lead to further improvements and/or insight into the contexts in which a technology can or cannot be used.
The current review has some limitations. First, we did not include a measure of accuracy or reliability of the technologies in our table. Some of the reviews listed in our reviews’ table (
Table A1 in Appendix A, e.g., [
138], [
144]) included the presence of evaluation metrics indicating the performance of the technologies (e.g., accuracy, sensitivity, and precision) as an inclusion criterion. We decided not to apply this inclusion criterion because we think that, in our case, comparable measures across studies are hard to obtain. Also, whether accuracy is ‘good’ very much depends on the specific research question and study design. Second, our classification of whether an algorithm is ready to use could often not be based on information directly provided in the papers and should therefore be considered a somewhat subjective estimate by the authors of this review.
In conclusion, there are some promising devices to measure eating behavior in naturalistic settings. However, it will take some time before some of these devices and algorithms become commercially available, owing to the lack of evaluations with large numbers of test users and under varied conditions. Until then, research in and outside the lab needs to be carried out using custom-made devices and algorithms, and/or with combinations of existing devices. Combining different technologies is recommended, as it can lead to multimodal datasets covering different aspects of eating behavior (e.g., when people are eating and at what rate), dietary intake (e.g., what people are eating and how much), and contextual factors (e.g., why people are eating and with whom). We expect this to result in a much fuller understanding of individual eating patterns and dynamics, in real time and in context, which can be used to develop adaptive, personalized interventions. New technologies measuring individual eating behaviors will be beneficial not only in consumer behavior studies but also in the food and medical industries. New insights on eating patterns and traits discovered using these technologies may help clarify how food products are used by a wide range of consumers or guide improvements in patients’ diets.
Author Contributions
Conceptualization, HH, PP, AT, AB; Data curation, HH, AT; Formal analysis, HH, AT; Funding acquisition, AB; Investigation, HH, PP, AT; Methodology, AT; Supervision, AB; Visualization, AT; Writing – original draft, HH, PP, AT, AB; Writing – review & editing, HH, PP, AT, AB, GC.
Funding
This study was funded by Kikkoman Europe R&D Laboratory B.V.
Conflicts of Interest
This study was funded by Kikkoman Europe R&D Laboratory B.V. Haruka Hiraguchi is employed by Kikkoman Europe R&D Laboratory B.V. and reports no potential conflicts with the study. All other authors declare no conflict of interest.
Appendix A
Table A1.
Systematic review papers that were evaluated and used to define the scope of this study (see Introduction). Listed for all systematic reviews are the year of publication, the focus of the review, the years of inclusion, the number of papers included, and the specific requirements for inclusion. Text in italic represents literal quotes.
Reference |
Year of publication |
Focus of review |
Years of inclusion |
Number of papers included |
Specific requirements for inclusion |
[137] |
2020 |
In this review paper [they] provide an overview about automatic food intake monitoring, by focusing on technical aspects and Computer Vision works which solve the main involved tasks (i.e., classification, recognitions, segmentation, etc.). |
2010-2020 |
23 papers that present systems for automatic food intake monitoring + 46 papers that address Computer Vision tasks related to food images analysis
|
Method should apply Computer Vision techniques. |
[138] |
2020 |
[This] scoping review was conducted in order to: 1. catalog the current use of wearable devices and sensors that automatically detect eating activity (dietary intake and/or eating behavior) specifically in free-living research settings; 2. and identify the sample size, sensor types, ground-truth measures, eating outcomes, and evaluation metrics used to evaluate these sensors.
|
prior to December 22, 2019 |
33 |
I - description of any wearable device or sensor (i.e., worn on the body) that was used to automatically (i.e., no required actions by the user) detect any form of eating (e.g., content of food consumed, quantity of food consumed, eating event, etc.). Proxies for “eating” measures, such as glucose levels or energy expenditure, were not included. II - “In-field” (non-lab) testing of the sensor(s), in which eating and activities were performed at-will with no restrictions (i.e., what, where, with whom, when, and how the user ate could not be restricted). III - At least one evaluation metric (e.g., Accuracy, Sensitivity, Precision, F1-score) that indicated the performance of the sensor on detecting its respective form of eating. |
[139] |
2019 |
The goal of this review was to identify unique technology-based tools for dietary intake assessment, including smartphone applications, those that captured digital images of foods and beverages for the purpose of dietary intake assessment, and dietary assessment tools available from the Web or that were accessed from a personal computer (PC). |
January 2011 -September 2017 |
43 |
(1) publications were in English, (2) articles were published from January 2011 to September 2017, and (3) sufficient information was available to evaluate tool features, functions, and uses.
|
[140] |
2017 |
This article reviews the most relevant and recent researches on automatic diet monitoring, discussing their strengths and weaknesses. In particular, the article reviews two approaches to this problem, accounting for most of the work in the area. The first approach is based on image analysis and aims at extracting information about food content automatically from food images. The second one relies on wearable sensors and has the detection of eating behaviours as its main goal. |
not specified |
not specified |
n/a |
[141] |
2019 |
The aim of this review is to synthesise research to date that utilises upper limb motion tracking sensors, either individually or in combination with other technologies (e.g., cameras, microphones), to objectively assess eating behaviour. |
2005-2018 |
69 |
(1) used at least one wearable motion sensor, (2) that was mounted to the wrist, lower arm, or upper arm (referred to as the upper limb in this review), (3) for eating behaviour assessment or human activity detection, where one of the classified activities is eating or drinking. We explicitly also included studies that additionally employed other sensors on other parts of the body (e.g., cameras, microphones, scales).
|
[142] |
2022 |
This paper consists of a systematic review of sensors and machine learning approaches for detecting food intake episodes. [...] The main questions of this systematic review were as follows: (RQ1) What sensors can be used to access food intake moments effectively? (RQ2) What can be done to integrate such sensors into daily lives seamlessly? (RQ3) What processing must be done to achieve good accuracy?
|
2010-2021 |
30 |
(1) research work that performs food intake detection; (2) research work that uses sensors to detect food with the help of sensors; (3) research work that presents some processing of food detection to propose diet; (4) research work that use wearable biosensors to detect food intake; (5) research work that use the methodology of deep learning, Support Vector Machines or Convolutional Neural Networks related to food intake; (6) research work that is not directly related to image processing techniques; (7) research work that is original; (8) papers published between 2010 and 2021; and (9) papers written in English
|
[143] |
2021 |
This article presents a comprehensive review of the use of sensor methodologies for portion size estimation. [...] Three research questions were chosen to guide this systematic review:RQ1) What are the available state-of-the-art SB-FPSE methodologies? [...] RQ2) What methods are employed for portion size estimation from sensor data and how accurate are these methods? [...] RQ3) Which sensor modalities are more suitable for use in the free-living conditions?
|
since 2000 |
67 |
Articles published in peer-reviewed venues; […] Papers that describe methods for estimation of portion size; FPSE methods that are either automatic or semi-automatic; written in English. |
[134] |
2022 |
[They] reviewed the current methods to automatically detect eating behavior events from video recordings.
|
2010–2021 |
13 |
Original research articles [...] published in the English language and containing findings on video analysis for human eating behavior from January 2010 to December 2021. [...] Conference papers were included. [...] Articles concerning non-human studies were excluded. We excluded research articles on eating behavior with video electroencephalogram monitoring, verbal interaction analysis, or sensors, as well as research studies not focusing on automated measures as they are beyond the scope of video analysis.
|
[144] |
2022 |
The aim of this study was to identify and collate sensor-based technologies that are feasible for dietitians to use to assist with performing dietary assessments in real-world practice settings. |
2016-2021 |
54 |
Any scientific paper published between January 2016 and December 2021 that used sensor-based devices to passively detect and record the initiation of eating in real-time. Studies were further excluded during the full text screening stage if they did not evaluate device performance or if the same research group conducted a more recent study describing a device that superseded previous studies of the same device. Studies evaluating a device that did not have the capacity to detect and record the start time of food intake, did not use sensors, were not applicable for use in free-living settings, or were discontinued at the time of the search were also excluded. |
[145] |
2021 |
This paper reviews the most recent solutions to automatic fluid intake monitoring both commercially and in the literature. The available technologies are divided into four categories: wearables, surfaces with embedded sensors, vision- and environmental-based solutions, and smart containers. |
2010-2020 |
115 |
Papers that did not study liquid intake and only studied food intake or other unrelated activities were excluded. Since this review is focused on the elderly population, in the wearable section, we only included literature that used wristbands and textile technology which could be easily worn without affecting the normal daily activity of the subjects. We have excluded devices that were not watch/band or textile based such as throat and ear microphones or ear inertial devices as they are not practical for everyday use. [...] Although this review is focused on the elderly population, studies that used adult subjects were not excluded, as there are too few that only used seniors. |
Table A2.
Non-systematic review papers that were evaluated and used to define the scope of this study (see Introduction). Listed for all non-systematic reviews are the year of publication, and the focus of the review. Text in italic represents literal quotes.
Reference |
Year of publication |
Focus of review |
[146] |
2019 |
A group of 30 experts got together to discuss the state of evidence with regard to monitoring calorie intake and eating behaviors [...] characterized into 3 domains: (1) image-based sensing (e.g, wearable and smartphone-based cameras combined with machine learning algorithms); (2) eating action unit (EAU) sensors (eg, to measure feeding gesture and chewing rate); and (3) biochemical measures (e.g, serum and plasma metabolite concentrations). They discussed how each domain functions, provided examples of promising solutions, and highlighted potential challenges and opportunities in each domain.
|
[147] |
2022 |
This paper concerns the validity of new consumer research technologies, as applied in a food behaviour context. Therefore, [they] introduce three validity criteria based on psychological theory concerning biases resulting from the awareness a consumer has of a measurement situation. [...] The three criteria addressing validity are: 1. Reflection: the research method requires the ‘person(a)’ of the consumer, i.e., he/she needs to think about his-/herself or his/her behaviour, 2. Awareness: the method requires the consumer to know he or she is being tested, 3. Informed: the method requires the consumer to know the underlying research question.
|
[148] |
2022 |
They present a high-level overview of [their] recent work on intake monitoring using a smartwatch, as well as methods using an in-ear microphone. [...] [This paper's] goal is to inform researchers and users of intake monitoring methods regarding (i) the development of new methods based on commercially available devices, (ii) what to expect in terms of effectiveness, and (iii) how these methods can be used in research as well as in practical applications.
|
[149] |
2021 |
A review of the state of the art of wearable sensors and methodologies proposed for monitoring ingestive behavior in humans |
[150] |
2017 |
This article evaluates the potential of various approaches to dietary monitoring with respect to convenience, accuracy, and applicability to real-world environments. [They] emphasize the application of technology and sensor-based solutions to the health-monitoring domain, and [they] evaluate various form factors to provide a comprehensive survey of the prior art in the field. |
[151] |
2022 |
The original ultimate goal of the studies reviewed in this paper was to use the laboratory test meal, measured with the UEM [Universal Eating Monitor], to translate animal models of ingestion to humans for the study of the physiological controls of food intake under standardized conditions.
|
[152] |
2022 |
This paper describes many food weight detection systems which includes sensor systems consisting of a load cell, manual food waste method, wearable sensors. |
[153] |
2018 |
This paper summarizes recent technological advancements, such as remote sensing devices, digital photography, and multisensor devices, which have the potential to improve the assessment of dietary intake and physical activity in free-living adults. |
[154] |
2022 |
Focusing on non-invasive solutions, we categorised identified technologies according to five study domains: 1) detecting food-related emotions, 2) monitoring food choices, 3) detecting eating actions, 4) identifying the type of food consumed, and 5) estimating the amount of food consumed. Additionally, [they] considered technologies not yet applied in the targeted research disciplines but worth considering in future research. |
[155] |
2020 |
In this article [they] describe how wrist-worn wearables, on-body cameras, and body-mounted biosensors can be used to capture data about when, what, and how much people eat and drink. [They] illustrate how these new techniques can be integrated to provide complete solutions for the passive, objective assessment of a wide range of traditional dietary factors, as well as novel measures of eating architecture, within person variation in intakes, and food/nutrient combinations within meals. |
[156] |
2021 |
This survey discusses the best-performing methodologies that have been developed so far for automatic food recognition and volume estimation. |
[157] |
2020 |
This paper reviews various novel digital methods for food volume estimation and explores the potential for adopting such technology in the Southeast Asian context. |
[158] |
2017 |
This paper presents a meticulous review of the latest sensing platforms and data analytic approaches to solve the challenges of food-intake monitoring, ranging from ear-based chewing and swallowing detection systems that capture eating gestures to wearable cameras that identify food types and caloric content through image processing techniques. This paper focuses on the comparison of different technologies and approaches that relate to user comfort, body location, and applications for medical research. |
[159] |
2020 |
In this survey, a wide range of chewing activity detection explored to outline the sensing design, classification methods, performances, chewing parameters, chewing data analysis as well as the challenges and limitations associated with them.
|
References
- World Health Organization. Diet, nutrition, and the prevention of chronic diseases: Report of a joint WHO/FAO expert consultation; World Health Organization: 2003; Vol. 916.
- Magarey, A.; Watson, J.; Golley, R.K.; Burrows, T.; Sutherland, R.; McNaughton, S.A.; Denney-Wilson, E.; Campbell, K.; Collins, C. Assessing dietary intake in children and adolescents: Considerations and recommendations for obesity research. International Journal of Pediatric Obesity 2011, 6, 2–11. [Google Scholar] [CrossRef] [PubMed]
- Shim, J.-S.; Oh, K.; Kim, H.C. Dietary assessment methods in epidemiologic studies. Epidemiology and Health 2014, 36, e2014009. [Google Scholar] [CrossRef] [PubMed]
- Thompson, F.E.; Subar, A.F.; Loria, C.M.; Reedy, J.L.; Baranowski, T. Need for technological innovation in dietary assessment. Journal of the American Dietetic Association 2010, 110, 48–51. [Google Scholar] [CrossRef] [PubMed]
- Almiron-Roig, E.; Solis-Trapala, I.; Dodd, J.; Jebb, S.A. Estimating food portions. Influence of unit number, meal type and energy density. Appetite 2013, 71, 95–103. [Google Scholar] [CrossRef]
- Sharpe, D.; Whelton, W.J. Frightened by an old scarecrow: The remarkable resilience of demand characteristics. Review of General Psychology 2016, 20, 349–368. [Google Scholar] [CrossRef]
- Nix, E.; Wengreen, H.J. Social approval bias in self-reported fruit and vegetable intake after presentation of a normative message in college students. Appetite 2017, 116, 552–558. [Google Scholar] [CrossRef]
- Robinson, E.; Kersbergen, I.; Brunstrom, J.M.; Field, M. I'm watching you. Awareness that food consumption is being monitored is a demand characteristic in eating-behaviour experiments. Appetite 2014, 83, 19–25. [Google Scholar] [CrossRef]
- O'Connor, D.B.; Jones, F.; Conner, M.; McMillan, B.; Ferguson, E. Effects of daily hassles and eating style on eating behavior. Health Psychology 2008, 27, S20–S31. [Google Scholar] [CrossRef]
- Spruijt-Metz, D.; Wen, C.K.F.; Bell, B.M.; Intille, S.; Huang, J.S.; Baranowski, T. Advances and controversies in diet and physical activity measurement in youth. American Journal of Preventive Medicine 2018, 55, e81–e91. [Google Scholar] [CrossRef]
- Page, M.J.; Moher, D.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. PRISMA 2020 explanation and elaboration: updated guidance and exemplars for reporting systematic reviews. BMJ 2021, 372, n160. [Google Scholar] [CrossRef]
- Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Annals of Internal Medicine 2009, 151, 264–269. [Google Scholar] [CrossRef] [PubMed]
- Wohlin, C.; Kalinowski, M.; Romero Felizardo, K.; Mendes, E. Successful combination of database search and snowballing for identification of primary studies in systematic literature studies. Information and Software Technology 2022, 147, 106908. [Google Scholar] [CrossRef]
- Wohlin, C. Guidelines for snowballing in systematic literature studies and a replication in software engineering. In Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering, London, England, United Kingdom; 2014; p. Article 38. [Google Scholar] [CrossRef]
- EU. Technology readiness levels (TRL). In Proceedings of the Horizon 2020 - Work Programme 2014-2015: General Annexes, Brussels, 2014.
- Diamantidou, E.; Giakoumis, D.; Votis, K.; Tzovaras, D.; Likothanassis, S. Comparing deep learning and human crafted features for recognising hand activities of daily living from wearables. In Proceedings of the 2022 23rd IEEE International Conference on Mobile Data Management (MDM), 6-9 June 2022; pp. 381–384. [Google Scholar] [CrossRef]
- Kyritsis, K.; Diou, C.; Delopoulos, A. A data driven end-to-end approach for in-the-wild monitoring of eating behavior using smartwatches. IEEE Journal of Biomedical and Health Informatics 2021, 25, 22–34. [Google Scholar] [CrossRef]
- Kyritsis, K.; Tatli, C.L.; Diou, C.; Delopoulos, A. Automated analysis of in meal eating behavior using a commercial wristband IMU sensor. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jeju, Korea (South), 11-15 July 2017; pp. 2843–2846. [Google Scholar] [CrossRef]
- Mirtchouk, M.; Lustig, D.; Smith, A.; Ching, I.; Zheng, M.; Kleinberg, S. Recognizing eating from body-worn sensors: combining free-living and laboratory data. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2017, 1, Article 85. [Google Scholar] [CrossRef]
- Zhang, S.; Zhao, Y.; Nguyen, D.T.; Xu, R.; Sen, S.; Hester, J.; Alshurafa, N. NeckSense: A multi-sensor necklace for detecting eating activities in free-living conditions. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2020, 4, Article 72. [Google Scholar] [CrossRef] [PubMed]
- Chun, K.S.; Bhattacharya, S.; Thomaz, E. Detecting eating episodes by tracking jawbone movements with a non-contact wearable sensor. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2018, 2, Article 4. [Google Scholar] [CrossRef]
- Farooq, M.; Doulah, A.; Parton, J.; McCrory, M.A.; Higgins, J.A.; Sazonov, E. Validation of sensor-based food intake detection by multicamera video observation in an unconstrained environment. Nutrients 2019, 11, 609. [Google Scholar] [CrossRef] [PubMed]
- Wang, C.; Kumar, T.S.; De Raedt, W.; Camps, G.; Hallez, H.; Vanrumste, B. Eat-Radar: Continuous fine-grained eating gesture detection using FMCW radar and 3D temporal convolutional network. arXiv 2022, arXiv:2211.04253. [Google Scholar] [CrossRef]
- Zhou, B.; Cheng, J.; Sundholm, M.; Reiss, A.; Huang, W.; Amft, O.; Lukowicz, P. Smart table surface: A novel approach to pervasive dining monitoring. In Proceedings of the 2015 IEEE International Conference on Pervasive Computing and Communications (PerCom), 23-27 March 2015; pp. 155–162. [Google Scholar] [CrossRef]
- Adachi, K.; Yanai, K. DepthGrillCam: A mobile application for real-time eating action recording using RGB-D images. In Proceedings of the 7th International Workshop on Multimedia Assisted Dietary Management, Lisboa, Portugal; 2022; pp. 55–59. [Google Scholar] [CrossRef]
- Rosa, B.G.; Anastasova-Ivanova, S.; Lo, B.; Yang, G.Z. Towards a fully automatic food intake recognition system using acoustic, image capturing and glucose measurements. In Proceedings of the 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), 2019. [CrossRef]
- Tseng, P.; Napier, B.; Garbarini, L.; Kaplan, D.L.; Omenetto, F.G. Functional, RF-trilayer sensors for tooth-mounted, wireless monitoring of the oral cavity and food consumption. Advanced Materials 2018, 30, 1703257. [Google Scholar] [CrossRef]
- Qiu, J.; Lo, F.P.W.; Lo, B. Assessing individual dietary intake in food sharing scenarios with a 360 camera and deep learning. In Proceedings of the 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), 19-22 May 2019; pp. 1–4. [Google Scholar] [CrossRef]
- Mertes, G.; Ding, L.; Chen, W.; Hallez, H.; Jia, J.; Vanrumste, B. Measuring and localizing individual bites using a sensor augmented plate during unrestricted eating for the aging population. IEEE Journal of Biomedical and Health Informatics 2020, 24, 1509–1518. [Google Scholar] [CrossRef]
- Papapanagiotou, V.; Ganotakis, S.; Delopoulos, A. Bite-weight estimation using commercial ear buds. In Proceedings of the 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Guadalajara, Jalisco, Mexico, 1-5 November 2021; pp. 7182–7185. [Google Scholar] [CrossRef]
- Bedri, A.; Li, D.; Khurana, R.; Bhuwalka, K.; Goel, M. FitByte: Automatic diet monitoring in unconstrained situations using multimodal sensing on eyeglasses. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA; 2020; pp. 1–12. [Google Scholar] [CrossRef]
- Biasizzo, A.; Koroušić Seljak, B.; Valenčič, E.; Pavlin, M.; Santo Zarnik, M.; Blažica, B.; O’Kelly, D.; Papa, G. An open-source approach to solving the problem of accurate food-intake monitoring. IEEE Access 2021, 9, 162835–162846. [Google Scholar] [CrossRef]
- Zhang, S.; Xu, Q.; Sen, S.; Alshurafa, N. VibroScale: Turning your smartphone into a weighing scale. In Proceedings of the Adjunct Proceedings of the 2020 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2020 ACM International Symposium on Wearable Computers, Virtual Event, Mexico, 2020; pp. 176–179. [Google Scholar] [CrossRef]
- Gao, J.; Tan, W.; Ma, L.; Wang, Y.; Tang, W. MUSEFood: Multi-Sensor-Based Food Volume Estimation on Smartphones. In Proceedings of the 2019 IEEE SmartWorld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computing, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI), 19-23 Aug. 2019; pp. 899–906. [CrossRef]
- Makhsous, S.; Mohammad, H.M.; Schenk, J.M.; Mamishev, A.V.; Kristal, A.R. A novel mobile structured light system in food 3d reconstruction and volume estimation. Sensors 2019, 19. [Google Scholar] [CrossRef] [PubMed]
- Lee, J.; Paudyal, P.; Banerjee, A.; Gupta, S.K.S. FIT-EVE&ADAM: Estimation of velocity & energy for automated diet activity monitoring. In Proceedings of the 2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA), Cancun, Mexico, 18-21 December 2017; pp. 1071–1074. [Google Scholar]
- Hamatani, T.; Elhamshary, M.; Uchiyama, A.; Higashino, T. FluidMeter: Gauging the human daily fluid intake using smartwatches. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2018, 2, Article 113. [Google Scholar] [CrossRef]
- Kreutzer, J.F.; Deist, J.; Hein, C.M.; Lueth, T.C. Sensor systems for monitoring fluid intake indirectly and directly. In Proceedings of the 2016 IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), San Francisco, CA, USA, 14-17 June 2016; pp. 1–6. [Google Scholar] [CrossRef]
- Sharma, A.; Misra, A.; Subramaniam, V.; Lee, Y. SmrtFridge: IoT-based, user interaction-driven food item & quantity sensing. In Proceedings of the 17th Conference on Embedded Networked Sensor Systems, New York, NY, USA; 2019; pp. 245–257. [Google Scholar] [CrossRef]
- Chiu, M.-C.; Chang, S.-P.; Chang, Y.-C.; Chu, H.-H.; Chen, C.C.-H.; Hsiao, F.-H.; Ko, J.-C. Playful bottle: a mobile social persuasion system to motivate healthy water intake. In Proceedings of the 11th International Conference on Ubiquitous Computing, Orlando, Florida, USA; 2009; pp. 185–194. [Google Scholar] [CrossRef]
- Amft, O.; Bannach, D.; Pirkl, G.; Kreil, M.; Lukowicz, P. Towards wearable sensing-based assessment of fluid intake. In Proceedings of the 8th IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops), 29 March-2 April 2010, 2010; pp. 298–303. [CrossRef]
- Amft, O.; Troster, G. On-body sensing solutions for automatic dietary monitoring. IEEE Pervasive Computing 2009, 8, 62–70. [Google Scholar] [CrossRef]
- Ortega Anderez, D.; Lotfi, A.; Langensiepen, C. A hierarchical approach in food and drink intake recognition using wearable inertial sensors. In Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference, Corfu, Greece; 2018; pp. 552–557. [Google Scholar] [CrossRef]
- Ortega Anderez, D.; Lotfi, A.; Pourabdollah, A. Temporal convolution neural network for food and drink intake recognition. In Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes, Greece; 2019; pp. 580–586. [Google Scholar] [CrossRef]
- Bai, Y.; Jia, W.; Mao, Z.H.; Sun, M. Automatic eating detection using a proximity sensor. In Proceedings of the 40th Annual Northeast Bioengineering Conference (NEBEC), 25-27 April 2014; pp. 1–2. [Google Scholar] [CrossRef]
- Bhattacharya, S.; Adaimi, R.; Thomaz, E. Leveraging sound and wrist motion to detect activities of daily living with commodity smartwatches. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2022, 6, Article 42. [Google Scholar] [CrossRef]
- Chun, K.S.; Sanders, A.B.; Adaimi, R.; Streeper, N.; Conroy, D.E.; Thomaz, E. Towards a generalizable method for detecting fluid intake with wrist-mounted sensors and adaptive segmentation. In Proceedings of the 24th International Conference on Intelligent User Interfaces, Marina del Rey, California; 2019; pp. 80–85. [Google Scholar] [CrossRef]
- Doulah, A.; Ghosh, T.; Hossain, D.; Imtiaz, M.H.; Sazonov, E. “Automatic Ingestion Monitor Version 2” – A novel wearable device for automatic food intake detection and passive capture of food images. IEEE Journal of Biomedical and Health Informatics 2021, 25, 568–576. [Google Scholar] [CrossRef] [PubMed]
- Du, B.; Lu, C.X.; Kan, X.; Wu, K.; Luo, M.; Hou, J.; Li, K.; Kanhere, S.; Shen, Y.; Wen, H. HydraDoctor: real-time liquids intake monitoring by collaborative sensing. In Proceedings of the 20th International Conference on Distributed Computing and Networking, Bangalore, India; 2019; pp. 213–217. [Google Scholar] [CrossRef]
- Gomes, D.; Sousa, I. Real-time drink trigger detection in free-living conditions using inertial sensors. Sensors 2019, 19, 2145. [Google Scholar] [CrossRef]
- Griffith, H.; Shi, Y.; Biswas, S. A container-attachable inertial sensor for real-time hydration tracking. Sensors 2019, 19, 4008. [Google Scholar] [CrossRef]
- Jovanov, E.; Nallathimmareddygari, V.R.; Pryor, J.E. SmartStuff: A case study of a smart water bottle. In Proceedings of the 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 16-20 August 2016; pp. 6307–6310. [Google Scholar] [CrossRef]
- Kadomura, A.; Li, C.-Y.; Tsukada, K.; Chu, H.-H.; Siio, I. Persuasive technology to improve eating behavior using a sensor-embedded fork. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Seattle, Washington; 2014; pp. 319–329. [Google Scholar] [CrossRef]
- Lasschuijt, M.P.; Brouwer-Brolsma, E.; Mars, M.; Siebelink, E.; Feskens, E.; de Graaf, K.; Camps, G. Concept development and use of an automated food intake and eating behavior assessment method. Journal of Visualized Experiments 2021, 168, e62144. [Google Scholar]
- Lee, J.; Paudyal, P.; Banerjee, A.; Gupta, S.K.S. A user-adaptive modeling for eating action identification from wristband time series. ACM Transactions on Interactive Intelligent Systems 2019, 9, Article 22. [Google Scholar] [CrossRef]
- Rahman, S.A.; Merck, C.; Huang, Y.; Kleinberg, S. Unintrusive eating recognition using Google Glass. In Proceedings of the 9th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth), Istanbul, Turkey, 20-23 May 2015; pp. 108–111. [Google Scholar] [CrossRef]
- Schiboni, G.; Amft, O. Sparse natural gesture spotting in free living to monitor drinking with wrist-worn inertial sensors. In Proceedings of the 2018 ACM International Symposium on Wearable Computers, Singapore, Singapore; 2018; pp. 140–147. [Google Scholar] [CrossRef]
- Sen, S.; Subbaraju, V.; Misra, A.; Balan, R.; Lee, Y. Annapurna: An automated smartwatch-based eating detection and food journaling system. Pervasive and Mobile Computing 2020, 68, 101259. [Google Scholar] [CrossRef]
- Staab, S.; Bröning, L.; Luderschmidt, J.; Martin, L. Performance comparison of finger and wrist motion tracking to detect bites during food consumption. In Proceedings of the 2022 ACM Conference on Information Technology for Social Good, Limassol, Cyprus; 2022; pp. 198–204. [Google Scholar] [CrossRef]
- Wellnitz, A.; Wolff, J.-P.; Haubelt, C.; Kirste, T. Fluid intake recognition using inertial sensors. In Proceedings of the 6th International Workshop on Sensor-based Activity Recognition and Interaction, Rostock, Germany, 2020; p. Article 4. [Google Scholar] [CrossRef]
- Xu, Y.; Chen, G.; Cao, Y. Automatic eating detection using head-mount and wrist-worn accelerometers. In Proceedings of the 17th International Conference on E-health Networking, Application & Services (HealthCom); pp. 578–581. [CrossRef]
- Zhang, R.; Zhang, J.; Gade, N.; Cao, P.; Kim, S.; Yan, J.; Zhang, C. EatingTrak: Detecting fine-grained eating moments in the wild using a wrist-mounted IMU. Proceedings of the ACM on Human-Computer Interaction 2022, 6, Article 214. [Google Scholar] [CrossRef]
- Zhang, Z.; Zheng, H.; Rempel, S.; Hong, K.; Han, T.; Sakamoto, Y.; Irani, P. A smart utensil for detecting food pick-up gesture and amount while eating. In Proceedings of the 11th Augmented Human International Conference, Winnipeg, Manitoba, Canada, 2020; p. Article 2. [Google Scholar] [CrossRef]
- Amemiya, H.; Yamagishi, Y.; Kaneda, S. Automatic recording of meal patterns using conductive chopsticks. In Proceedings of the 2013 IEEE 2nd Global Conference on Consumer Electronics (GCCE), 1-4 October 2013; pp. 350–351. [Google Scholar] [CrossRef]
- Nakamura, Y.; Arakawa, Y.; Kanehira, T.; Fujiwara, M.; Yasumoto, K. SenStick: Comprehensive sensing platform with an ultra tiny all-in-one sensor board for IoT research. Journal of Sensors 2017, 2017, 6308302. [Google Scholar] [CrossRef]
- Zuckerman, O.; Gal, T.; Keren-Capelovitch, T.; Karsovsky, T.; Gal-Oz, A.; Weiss, P.L.T. DataSpoon: Overcoming design challenges in tangible and embedded assistive technologies. In Proceedings of the TEI '16: Tenth International Conference on Tangible, Embedded, and Embodied Interaction, Eindhoven, Netherlands; 2016; pp. 30–37. [Google Scholar] [CrossRef]
- Liu, K.C.; Hsieh, C.Y.; Huang, H.Y.; Chiu, L.T.; Hsu, S.J.P.; Chan, C.T. Drinking event detection and episode identification using 3D-printed smart cup. IEEE Sensors Journal 2020, 20, 13743–13751. [Google Scholar] [CrossRef]
- Shin, J.; Lee, S.; Lee, S.-J. Accurate eating detection on a daily wearable necklace (demo). In Proceedings of the 17th Annual International Conference on Mobile Systems, Applications, and Services; 2019; pp. 649–650. [Google Scholar] [CrossRef]
- Bi, S.; Kotz, D. Eating detection with a head-mounted video camera. In Proceedings of the 2022 IEEE 10th International Conference on Healthcare Informatics (ICHI), 11-14 June 2022; pp. 60–66. [Google Scholar] [CrossRef]
- Kaner, G.; Genç, H.U.; Dinçer, S.B.; Erdoğan, D.; Coşkun, A. GROW: A smart bottle that uses its surface as an ambient display to motivate daily water intake. In Proceedings of the Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 2018; p. Paper LBW077. [Google Scholar] [CrossRef]
- Lester, J.; Tan, D.; Patel, S.; Brush, A.J.B. Automatic classification of daily fluid intake. In Proceedings of the 2010 4th International Conference on Pervasive Computing Technologies for Healthcare, Munich, Germany, 22-25 March 2010; pp. 1–8. [Google Scholar] [CrossRef]
- Lin, B.; Hoover, A. A comparison of finger and wrist motion tracking to detect bites during food consumption. In Proceedings of the 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN); pp. 1–4. [CrossRef]
- Dong, Y.; Hoover, A.; Scisco, J.; Muth, E. A new method for measuring meal intake in humans via automated wrist motion tracking. Applied Psychophysiology and Biofeedback 2012, 37, 205–215. [Google Scholar] [CrossRef] [PubMed]
- Jasper, P.W.; James, M.T.; Hoover, A.W.; Muth, E.R. Effects of bite count feedback from a wearable device and goal setting on consumption in young adults. Journal of the Academy of Nutrition and Dietetics 2016, 116, 1785–1793. [Google Scholar] [CrossRef]
- Arun, A.; Bhadra, S. An accelerometer based eyeglass to monitor food intake in free-living and lab environment. In Proceedings of the 2020 IEEE SENSORS, 25-28 October 2020; pp. 1–4. [Google Scholar] [CrossRef]
- Bedri, A.; Li, R.; Haynes, M.; Kosaraju, R.P.; Grover, I.; Prioleau, T.; Beh, M.Y.; Goel, M.; Starner, T.; Abowd, G. EarBit: Using wearable sensors to detect eating episodes in unconstrained environments. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2017, 1, Article 37. [Google Scholar] [CrossRef]
- Cheng, J.; Zhou, B.; Kunze, K.; Rheinländer, C.C.; Wille, S.; Wehn, N.; Weppner, J.; Lukowicz, P. Activity recognition and nutrition monitoring in every day situations with a textile capacitive neckband. In Proceedings of the 2013 ACM Conference on Pervasive and Ubiquitous Computing Adjunct Publication, Zurich, Switzerland, 2013; pp. 155–158. [Google Scholar] [CrossRef]
- Chun, K.S.; Jeong, H.; Adaimi, R.; Thomaz, E. Eating episode detection with jawbone-mounted inertial sensing. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), 20-24 July 2020; pp. 4361–4364. [Google Scholar] [CrossRef]
- Ghosh, T.; Hossain, D.; Imtiaz, M.; McCrory, M.A.; Sazonov, E. Implementing real-time food intake detection in a wearable system using accelerometer. In Proceedings of the 2020 IEEE-EMBS Conference on Biomedical Engineering and Sciences (IECBES), 1-3 March 2021; pp. 439–443. [Google Scholar] [CrossRef]
- Kalantarian, H.; Alshurafa, N.; Le, T.; Sarrafzadeh, M. Monitoring eating habits using a piezoelectric sensor-based necklace. Computers in Biology and Medicine 2015, 58, 46–55. [Google Scholar] [CrossRef] [PubMed]
- Lotfi, R.; Tzanetakis, G.; Eskicioglu, R.; Irani, P. A comparison between audio and IMU data to detect chewing events based on an earable device. In Proceedings of the 11th Augmented Human International Conference, Winnipeg, Manitoba, Canada, 2020; p. Article 11. [Google Scholar] [CrossRef]
- Bin Morshed, M.; Haresamudram, H.K.; Bandaru, D.; Abowd, G.D.; Ploetz, T. A personalized approach for developing a snacking detection system using earbuds in a semi-naturalistic setting. In Proceedings of the 2022 ACM International Symposium on Wearable Computers, Cambridge, United Kingdom; 2022; pp. 11–16. [Google Scholar] [CrossRef]
- Sazonov, E.; Schuckers, S.; Lopez-Meyer, P.; Makeyev, O.; Sazonova, N.; Melanson, E.L.; Neuman, M. Non-invasive monitoring of chewing and swallowing for objective quantification of ingestive behavior. Physiological Measurement 2008, 29, 525–541. [Google Scholar] [CrossRef]
- Sazonov, E.S.; Fontana, J.M. A sensor system for automatic detection of food intake through non-invasive monitoring of chewing. IEEE Sensors Journal 2012, 12, 1340–1348. [Google Scholar] [CrossRef]
- Shin, J.; Lee, S.; Gong, T.; Yoon, H.; Roh, H.; Bianchi, A.; Lee, S.-J. MyDJ: Sensing food intakes with an attachable on your eyeglass frame. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 2022; p. Article 341. [Google Scholar] [CrossRef]
- Wang, S.; Zhou, G.; Ma, Y.; Hu, L.; Chen, Z.; Chen, Y.; Zhao, H.; Jung, W. Eating detection and chews counting through sensing mastication muscle contraction. Smart Health 2018, 9-10, 179–191. [Google Scholar] [CrossRef]
- Kamachi, H.; Kondo, T.; Hossain, T.; Yokokubo, A.; Lopez, G. Automatic segmentation method of bone conduction sound for eating activity detailed detection. In Proceedings of the Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers, Virtual, USA, 2021; pp. 310–315. [Google Scholar] [CrossRef]
- Bi, S.; Wang, T.; Tobias, N.; Nordrum, J.; Wang, S.; Halvorsen, G.; Sen, S.; Peterson, R.; Odame, K.; Caine, K.; et al. Auracle: Detecting eating episodes with an ear-mounted sensor. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 2018, 2, Article 92. [Google Scholar] [CrossRef]
- Bi, Y.; Lv, M.; Song, C.; Xu, W.; Guan, N.; Yi, W. AutoDietary: A wearable acoustic sensor system for food intake recognition in daily life. IEEE Sensors Journal 2016, 16, 806–816. [Google Scholar] [CrossRef]
- Liu, J.; Johns, E.; Atallah, L.; Pettitt, C.; Lo, B.; Frost, G.; Yang, G.Z. An intelligent food-intake monitoring system using wearable sensors. In Proceedings of the 2012 Ninth International Conference on Wearable and Implantable Body Sensor Networks, London, UK, 9-12 May 2012; pp. 154–160. [Google Scholar] [CrossRef]
- Papapanagiotou, V.; Liapi, A.; Delopoulos, A. Chewing Detection from Commercial Smart-glasses. In Proceedings of the 7th International Workshop on Multimedia Assisted Dietary Management, Lisboa, Portugal; 2022; pp. 11–16. [Google Scholar] [CrossRef]
- Kalantarian, H.; Sarrafzadeh, M. Audio-based detection and evaluation of eating behavior using the smartwatch platform. Computers in Biology and Medicine 2015, 65, 1–9. [Google Scholar] [CrossRef]
- Niewiadomski, R.; De Lucia, G.; Grazzi, G.; Mancini, M. Towards Commensal Activities Recognition. In Proceedings of the 2022 International Conference on Multimodal Interaction, Bengaluru, India; 2022; pp. 549–557. [Google Scholar] [CrossRef]
- Farooq, M.; Fontana, J.M.; Sazonov, E. A novel approach for food intake detection using electroglottography. Physiological Measurement 2014, 35, 739. [Google Scholar] [CrossRef]
- Zhang, R.; Amft, O. Monitoring chewing and eating in free-living using smart eyeglasses. IEEE Journal of Biomedical and Health Informatics 2018, 22, 23–32. [Google Scholar] [CrossRef] [PubMed]
- Huang, Q.; Wang, W.; Zhang, Q. Your glasses know your diet: dietary monitoring using electromyography sensors. IEEE Internet of Things Journal 2017, 4, 705–712. [Google Scholar] [CrossRef]
- Mirtchouk, M.; Merck, C.; Kleinberg, S. Automated estimation of food type and amount consumed from body-worn audio and motion sensors. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '16), Heidelberg, Germany; 2016; pp. 451–462. [Google Scholar] [CrossRef]
- Chang, K.-h.; Liu, S.-y.; Chu, H.-h.; Hsu, J.Y.-j.; Chen, C.; Lin, T.-y.; Chen, C.-y.; Huang, P. The diet-aware dining table: observing dietary behaviors over a tabletop surface. In Proceedings of the PERVASIVE 2006: Pervasive Computing, Dublin, Ireland, 7-10 May 2006; pp. 366–382. [Google Scholar] [CrossRef]
- Fujiwara, M.; Moriya, K.; Sasaki, W.; Fujimoto, M.; Arakawa, Y.; Yasumoto, K. A Smart Fridge for efficient foodstuff management with weight sensor and voice interface. In Proceedings of the Workshop Proceedings of the 47th International Conference on Parallel Processing, Eugene, OR, USA, 2018; p. Article 2. [Google Scholar] [CrossRef]
- Guss, J.L.; Kissileff, H.R. Microstructural analyses of human ingestive patterns: from description to mechanistic hypotheses. Neuroscience and Biobehavioral Reviews 2000, 24, 261–268. [Google Scholar] [CrossRef]
- Kim, J.; Park, J.; Lee, U. EcoMeal: a smart tray for promoting healthy dietary habits. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, California, USA; 2016; pp. 2165–2170. [Google Scholar] [CrossRef]
- Guru, H.S.; Weng, A.D.; Pitla, S.; Dev, D.A. SensiTray: An integrated measuring device for monitoring children's mealtime dietary intake. In Proceedings of the 2021 IEEE International Instrumentation and Measurement Technology Conference (I2MTC); pp. 1–6.
- Mattfeld, R.S.; Muth, E.R.; Hoover, A. Measuring the consumption of individual solid and liquid bites using a table-embedded scale during unrestricted eating. IEEE Journal of Biomedical and Health Informatics 2017, 21, 1711–1718. [Google Scholar] [CrossRef]
- Papapanagiotou, V.; Diou, C.; Langlet, B.; Ioakimidis, I.; Delopoulos, A. A parametric probabilistic context-free grammar for food intake analysis based on continuous meal weight measurements. In Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25-29 August 2015; pp. 7853–7856. [Google Scholar] [CrossRef]
- Ueda, M.; Funatomi, T.; Hashimoto, A.; Watanabe, T.; Minoh, M. Developing a real-time system for measuring the consumption of seasoning. In Proceedings of the 2011 IEEE International Symposium on Multimedia, 5-7 December 2011; pp. 393–398. [CrossRef]
- Vries, R.A.J.d.; Keizers, G.H.J.; Arum, S.R.v.; Haarman, J.A.M.; Klaassen, R.; Delden, R.W.v.; Beijnum, B.-J.F.v.; Boer, J.H.W.v.d. Multimodal interactive dining with the sensory interactive table: two use cases. In Proceedings of the Companion Publication of the 2020 International Conference on Multimodal Interaction, Virtual Event, Netherlands; 2021; pp. 332–340. [Google Scholar] [CrossRef]
- Van Wymelbeke-Delannoy, V.; Juhel, C.; Bole, H.; Sow, A.K.; Guyot, C.; Belbaghdadi, F.; Brousse, O.; Paindavoine, M. A cross-sectional reproducibility study of a standard camera sensor using artificial intelligence to assess food items: The FoodIntech project. Nutrients 2022, 14, 221. [Google Scholar] [CrossRef]
- Fang, S.; Zhu, F.; Boushey, C.J.; Delp, E.J. The use of co-occurrence patterns in single image based food portion estimation. In Proceedings of the 2017 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 14-16 November 2017; pp. 462–466. [CrossRef]
- Fang, S.; Zhu, F.; Jiang, C.; Zhang, S.; Boushey, C.J.; Delp, E.J. A comparison of food portion size estimation using geometric models and depth images. In Proceedings of the 2016 IEEE International Conference on Image Processing (ICIP), Phoenix, AZ, USA, 25-28 September 2016; pp. 26–30. [Google Scholar] [CrossRef]
- Hippocrate, E.A.A.; Suwa, H.; Arakawa, Y.; Yasumoto, K. Food weight estimation using smartphone and cutlery. In Proceedings of the First Workshop on IoT-enabled Healthcare and Wellness Technologies and Systems, Singapore, Singapore; 2016; pp. 9–14. [Google Scholar] [CrossRef]
- Jia, W.; Chen, H.C.; Yue, Y.; Li, Z.; Fernstrom, J.; Bai, Y.; Li, C.; Sun, M. Accuracy of food portion size estimation from digital pictures acquired by a chest-worn camera. Public Health Nutrition 2014, 17, 1671–1681. [Google Scholar] [CrossRef] [PubMed]
- Jia, W.; Yue, Y.; Fernstrom, J.D.; Zhang, Z.; Yang, Y.; Sun, M. 3D localization of circular feature in 2D image and application to food volume estimation. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA; 2012; pp. 4545–4548. [Google Scholar] [CrossRef]
- Puri, M.; Zhu, Z.; Yu, Q.; Divakaran, A.; Sawhney, H. Recognition and volume estimation of food intake using a mobile device. In Proceedings of the 2009 Workshop on Applications of Computer Vision (WACV), Snowbird, UT, USA, 7-8 December 2009; pp. 1–8. [Google Scholar] [CrossRef]
- Sitparoopan, T.; Chellapillai, V.; Arulmoli, J.; Chandrasiri, S.; Kugathasan, A. Home Bridge - Smart elderly care system. In Proceedings of the 2nd International Informatics and Software Engineering Conference (IISEC); pp. 1–5. [CrossRef]
- Smith, S.P.; Adam, M.T.P.; Manning, G.; Burrows, T.; Collins, C.; Rollo, M.E. Food volume estimation by integrating 3D image projection and manual wire mesh transformations. IEEE Access 2022, 10, 48367–48378. [Google Scholar] [CrossRef]
- Sundaravadivel, P.; Kesavan, K.; Kesavan, L.; Mohanty, S.P.; Kougianos, E.; Ganapathiraju, M. Smart-Log: An automated, predictive nutrition monitoring system for infants through the IoT. In Proceedings of the 2018 IEEE International Conference on Consumer Electronics (ICCE), 12-14 January 2018; pp. 1–4. [CrossRef]
- Woo, I.; Otsmo, K.; Kim, S.; Ebert, D.S.; Delp, E.J.; Boushey, C.J. Automatic portion estimation and visual refinement in mobile dietary assessment. In Proceedings of the Computational Imaging VIII, San Jose, California, United States, Jan 1, 2010; p. 75330. [Google Scholar] [CrossRef]
- Xu, C.; He, Y.; Khanna, N.; Boushey, C.J.; Delp, E.J. Model-based food volume estimation using 3D pose. In Proceedings of the 2013 IEEE International Conference on Image Processing, Melbourne, VIC, Australia, 15-18 September 2013; pp. 2534–2538. [Google Scholar] [CrossRef]
- Yang, Z.; Yu, H.; Cao, S.; Xu, Q.; Yuan, D.; Zhang, H.; Jia, W.; Mao, Z.H.; Sun, M. Human-mimetic estimation of food volume from a single-view RGB image using an AI system. Electronics 2021, 10, 1556. [Google Scholar] [CrossRef]
- Zhang, Q.; He, C.; Qin, W.; Liu, D.; Yin, J.; Long, Z.; He, H.; Sun, H.C.; Xu, H. Eliminate the hardware: Mobile terminals-oriented food recognition and weight estimation system. Frontiers in Nutrition 2022, 9, 965801. [Google Scholar] [CrossRef] [PubMed]
- Zhang, W.; Yu, Q.; Siddiquie, B.; Divakaran, A.; Sawhney, H. "Snap-n-Eat": Food recognition and nutrition estimation on a smartphone. Journal of Diabetes Science and Technology 2015, 9, 525–533. [Google Scholar] [CrossRef]
- Zhu, F.; Bosch, M.; Woo, I.; Kim, S.; Boushey, C.J.; Ebert, D.S.; Delp, E.J. The use of mobile devices in aiding dietary assessment and evaluation. IEEE Journal of Selected Topics in Signal Processing 2010, 4, 756–766. [Google Scholar] [CrossRef] [PubMed]
- Raju, V.B.; Hossain, D.; Sazonov, E. Estimation of plate and bowl dimensions for food portion size assessment in a wearable sensor system. IEEE Sensors Journal 2023, 23, 5391–5400. [Google Scholar] [CrossRef]
- Papathanail, I.; Brühlmann, J.; Vasiloglou, M.F.; Stathopoulou, T.; Exadaktylos, A.K.; Stanga, Z.; Münzer, T.; Mougiakakou, S. Evaluation of a novel artificial intelligence system to monitor and assess energy and macronutrient intake in hospitalised older patients. Nutrients 2021, 13, 4539. [Google Scholar] [CrossRef]
- Iizuka, K.; Morimoto, M. A nutrient content estimation system of buffet menu using RGB-D sensor. In Proceedings of the 2018 Joint 7th International Conference on Informatics, Electronics & Vision (ICIEV) and 2018 2nd International Conference on Imaging, Vision & Pattern Recognition (icIVPR); pp. 165–168. [CrossRef]
- Liao, H.C.; Lim, Z.Y.; Lin, H.W. Food intake estimation method using short-range depth camera. In Proceedings of the 2016 IEEE International Conference on Signal and Image Processing (ICSIP), Beijing, China, 13-15 August 2016; pp. 198–204. [Google Scholar] [CrossRef]
- Lu, Y.; Stathopoulou, T.; Vasiloglou, M.F.; Christodoulidis, S.; Stanga, Z.; Mougiakakou, S. An artificial intelligence-based system to assess nutrient intake for hospitalised patients. IEEE Transactions on Multimedia 2021, 23, 1136–1147. [Google Scholar] [CrossRef]
- Pfisterer, K.J.; Amelard, R.; Chung, A.G.; Syrnyk, B.; MacLean, A.; Keller, H.H.; Wong, A. Automated food intake tracking requires depth-refined semantic segmentation to rectify visual-volume discordance in long-term care homes. Scientific Reports 2022, 12, 83. [Google Scholar] [CrossRef]
- Suzuki, T.; Futatsuishi, K.; Yokoyama, K.; Amaki, N. Point cloud processing method for food volume estimation based on dish space. In Proceedings of the 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20-24 July 2020; pp. 5665–5668. [Google Scholar] [CrossRef]
- Weesepoel, Y.J.A.; Alewijn, M.; Daniels, F.M.J.; Baart, A.M.; Müller-Maatsch, J.T.L.; Simsek-Senel, G.; Rijgersberg, H.; Top, J.L.; Feskens, E.J.M. Towards the universal assessment of dietary intake using spectral imaging solutions. In Proceedings of the OCM 2021-Optical Characterization of Materials: Conference Proceedings; 2021; pp. 31–43. [Google Scholar] [CrossRef]
- Yue, Y.; Jia, W.; Sun, M. Measurement of food volume based on single 2-D image without conventional camera calibration. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August-1 September 2012; pp. 2166–2169. [Google Scholar] [CrossRef]
- Shang, J.; Duong, M.; Pepin, E.; Zhang, X.; Sandara-Rajan, K.; Mamishev, A.; Kristal, A. A mobile structured light system for food volume estimation. In Proceedings of the 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), 6-13 November 2011; pp. 100–101. [CrossRef]
- Cobo, M.; Heredia, I.; Aguilar, F.; Lloret Iglesias, L.; García, D.; Bartolomé, B.; Moreno-Arribas, M.V.; Yuste, S.; Pérez-Matute, P.; Motilva, M.J. Artificial intelligence to estimate wine volume from single-view images. Heliyon 2022, 8, e10557. [Google Scholar] [CrossRef]
- Tufano, M.; Lasschuijt, M.; Chauhan, A.; Feskens, E.J.M.; Camps, G. Capturing eating behavior from video analysis: a systematic review. Nutrients 2022, 14, 4847. [Google Scholar] [CrossRef] [PubMed]
- Kraaij, A.v.; Fabius, J.; Hermsen, S.; Feskens, E.; Camps, G. Assessing snacking and drinking behavior in real-life settings: Validation of the SnackBox technology. Food Quality and Preference 2023, In review.
- Shiffman, S.; Stone, A.A.; Hufford, M.R. Ecological Momentary Assessment. Annual Review of Clinical Psychology 2008, 4, 1–32. [Google Scholar] [CrossRef]
- Allegra, D.; Battiato, S.; Ortis, A.; Urso, S.; Polosa, R. A review on food recognition technology for health applications. Health Psychology Research 2020, 8, 9297. [Google Scholar] [CrossRef] [PubMed]
- Bell, B.M.; Alam, R.; Alshurafa, N.; Thomaz, E.; Mondol, A.S.; de la Haye, K.; Stankovic, J.A.; Lach, J.; Spruijt-Metz, D. Automatic, wearable-based, in-field eating detection approaches for public health research: a scoping review. npj Digital Medicine 2020, 3, 38. [Google Scholar] [CrossRef] [PubMed]
- Eldridge, A.L.; Piernas, C.; Illner, A.-K.; Gibney, M.J.; Gurinović, M.A.; De Vries, J.H.M.; Cade, J.E. Evaluation of new technology-based tools for dietary intake assessment—an ILSI Europe Dietary Intake and Exposure Task Force evaluation. Nutrients 2019, 11, 55. [Google Scholar] [CrossRef]
- Hassannejad, H.; Matrella, G.; Ciampolini, P.; De Munari, I.; Mordonini, M.; Cagnoni, S. Automatic diet monitoring: a review of computer vision and wearable sensor-based methods. International Journal of Food Sciences and Nutrition 2017, 68, 656–670. [Google Scholar] [CrossRef]
- Heydarian, H.; Adam, M.; Burrows, T.; Collins, C.; Rollo, M.E. Assessing eating behaviour using upper limb mounted motion sensors: A systematic review. Nutrients 2019, 11, 1168. [Google Scholar] [CrossRef]
- Neves, P.A.; Simões, J.; Costa, R.; Pimenta, L.; Gonçalves, N.J.; Albuquerque, C.; Cunha, C.; Zdravevski, E.; Lameski, P.; Garcia, N.M.; et al. Thought on food: A systematic review of current approaches and challenges for food intake detection. Sensors 2022, 22, 6443. [Google Scholar] [CrossRef]
- Raju, V.B.; Sazonov, E. A systematic review of sensor-based methodologies for food portion size estimation. IEEE Sensors Journal 2021, 21, 12882–12899. [Google Scholar] [CrossRef]
- Wang, L.; Allman-Farinelli, M.; Yang, J.-A.; Taylor, J.C.; Gemming, L.; Hekler, E.; Rangan, A. Enhancing nutrition care through real-time, sensor-based capture of eating occasions: a scoping review. Frontiers in Nutrition 2022, 9, 852984. [Google Scholar] [CrossRef]
- Cohen, R.; Fernie, G.; Roshan Fekr, A. Fluid intake monitoring systems for the elderly: A review of the literature. Nutrients 2021, 13, 2092. [Google Scholar] [CrossRef] [PubMed]
- Alshurafa, N.; Lin, A.W.; Zhu, F.; Ghaffari, R.; Hester, J.; Delp, E.; Rogers, J.; Spring, B. Counting bites with bits: Expert workshop addressing calorie and macronutrient intake monitoring. Journal of Medical Internet Research 2019, 21, e14904. [Google Scholar] [CrossRef] [PubMed]
- Dijksterhuis, G.; de Wijk, R.; Onwezen, M. New consumer research technology for food behaviour: overview and validity. Foods 2022, 11, 767. [Google Scholar] [CrossRef]
- Diou, C.; Kyritsis, K.; Papapanagiotou, V.; Sarafis, I. Intake monitoring in free-living conditions: Overview and lessons we have learned. Appetite 2022, 176, 106096. [Google Scholar] [CrossRef]
- Fontana, J.M.; Farooq, M.; Sazonov, E. Detection and characterization of food intake by wearable sensors. In Wearable Sensors (Second Edition), Sazonov, E., Ed.; Academic Press: Oxford, 2021; pp. 541–574. [Google Scholar]
- Kalantarian, H.; Alshurafa, N.; Sarrafzadeh, M. A survey of diet monitoring technology. IEEE Pervasive Computing 2017, 16, 57–65. [Google Scholar] [CrossRef]
- Kissileff, H.R. The Universal Eating Monitor (UEM): objective assessment of food intake behavior in the laboratory setting. International Journal of Obesity 2022, 46, 1114–1121. [Google Scholar] [CrossRef]
- Krishna, N.; Iliger, B.B.; Jyothi, G.; Hemamai, M.; Priya, K.R. A review on sensors based quantifying device to oversee the mealtime dietary intake. In Proceedings of the 2022 6th International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 28-30 April 2022; pp. 1606–1609. [Google Scholar] [CrossRef]
- McClung, H.L.; Ptomey, L.T.; Shook, R.P.; Aggarwal, A.; Gorczyca, A.M.; Sazonov, E.S.; Becofsky, K.; Weiss, R.; Das, S.K. Dietary intake and physical activity assessment: current tools, techniques, and technologies for use in adult populations. American Journal of Preventive Medicine 2018, 55, e93–e104. [Google Scholar] [CrossRef]
- Rantala, E.; Balatsas-Lekkas, A.; Sozer, N.; Pennanen, K. Overview of objective measurement technologies for nutrition research, food-related consumer and marketing research. Trends in Food Science & Technology 2022, 125, 100–113. [Google Scholar] [CrossRef]
- Skinner, A.; Toumpakari, Z.; Stone, C.; Johnson, L. Future directions for integrative objective assessment of eating using wearable sensing technology. Frontiers in Nutrition 2020, 80. [Google Scholar] [CrossRef]
- Tahir, G.A.; Loo, C.K. A comprehensive survey of image-based food recognition and volume estimation methods for dietary assessment. Healthcare 2021, 9, 1676. [Google Scholar] [CrossRef] [PubMed]
- Tay, W.; Kaur, B.; Quek, R.; Lim, J.; Henry, C.J. Current developments in digital quantitative volume estimation for the optimisation of dietary assessment. Nutrients 2020, 12, 1167. [Google Scholar] [CrossRef] [PubMed]
- Vu, T.; Lin, F.; Alshurafa, N.; Xu, W. Wearable food intake monitoring technologies: a comprehensive review. Computers 2017, 6, 4. [Google Scholar] [CrossRef]
- Selamat, N.A.; Ali, S.H.M. Automatic food intake monitoring based on chewing activity: A survey. IEEE Access 2020, 8, 48846–48869. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).