1. Introduction
Gait evaluation is a well-established clinical tool used to assess general health, detect and differentiate diseases, monitor disease progression, and predict adverse events such as recurrent falls [1,2,3]. Traditionally, gait evaluation is performed in supervised, standardized clinical settings using qualitative or semi-structured assessments (clinical scores) or, in the best cases, apparatus-based measurements [4,5,6]. However, these assessments can only provide a snapshot of a patient’s mobility, potentially missing episodic or rare events like freezing of gait or falls [7]. Additionally, the ecological validity of these assessments is often questioned, as walking in a clinical setting can differ significantly from walking in daily life [8,9]. Consequently, there is growing interest in mobile health technologies that enable continuous, quantitative assessment of mobility and gait in real-world environments [10]. Previous research indicates that these continuous, unsupervised assessments may offer complementary, or even more accurate, insights for predicting disease progression or the risk of future falls [11,12].
Health technologies for continuous long-term monitoring of gait and mobility primarily rely on mobile sensors, which typically integrate accelerometers, often augmented with gyroscopes and magnetometers. These sensors’ small size and low energy consumption allow them to be attached to various body locations or to be integrated into wearable devices, such as smartwatches. Selecting the number and placement of these sensors is crucial to balancing the accuracy of gait-related motion detection with the need for unobtrusiveness to ensure long-term user engagement and adherence. Previous research suggests that attaching two or three motion sensors to the legs and lower trunk provides the most accurate estimation of spatiotemporal stepping events [13,14]. In contrast, motion sensors on the wrist, where wearables are commonly worn, provide only imprecise gait-related information, often limited to basic step counting [15].
Recently, the head, particularly the ear, has emerged as a promising alternative site for mounting a single motion sensor to monitor mobility and gait [16,17]. Unlike the lower extremities or trunk, the ear offers several unique advantages for implementing an accurate and unobtrusive mobility monitor. First, the head, which houses crucial sensory systems for vision, hearing, and balance, remains exceptionally stable during various movements [18,19]. This stability makes it a reliable location for low-noise identification and differentiation of different bodily activities. Research has demonstrated that a single ear-worn motion sensor can reliably distinguish between a wide range of daily activities and provide detailed insights into the number and temporal sequence of stepping events [16]. Additionally, the ear is a location where users, especially the elderly, often already use assistive devices like hearing aids or eyeglass frames, which can easily be combined with a miniature motion sensor. This integration could significantly reduce barriers to user-friendly long-term monitoring. Finally, the ear is a promising site for comprehensive vital status monitoring using optical sensors that can measure pulse rate, blood pressure, oxygen saturation, and body temperature [20]. Combined with monitoring of gait, which is often referred to as the sixth vital sign, a single ear sensor would offer a comprehensive overview of a patient’s physical health and motor condition.
In this study, we explore the potential of a mobile ear-based gait identification algorithm (mEar) to comprehensively identify spatiotemporal gait characteristics. For this purpose, we collected 3D acceleration profiles from the ear-worn sensor in a large cohort of healthy individuals walking at various paces, from slow to fast. These motion profiles were synchronized with a pressure-sensitive gait mat, which served as the ground truth for the temporal stepping sequence and the spatial characteristics of the gait pattern. We employed state-of-the-art deep learning architectures to identify the spatiotemporal step sequence from the sensor data and evaluated the accuracy of these identifications at different levels of granularity, from walking bouts to individual ground contacts. Finally, we highlight the potential of the motion sensor integrated into a commercial, wearable in-ear vital sign monitor for comprehensive and continuous monitoring of mobility, gait, and vital signs in patients’ daily lives.
2. Materials and Methods
2.1. Participants
Fifty-three healthy individuals, aged between 20 and 47 years (mean age: 29.9 ± 8.4 years; height: 1.73 ± 0.10 m; weight: 70.1 ± 15.9 kg; 28 females), participated in the study. All participants provided written informed consent prior to inclusion and were screened for any neurological or orthopedic conditions that could affect balance or locomotion.
2.2. Ear-Worn Motion Sensor
The motion sensor (Figure 1A) consisted of a triaxial accelerometer (range: ±16 g; accuracy: 0.0002 g; sampling rate: 100 Hz) integrated into a commercial, wearable in-ear vital sign monitor (c-med° alpha, Cosinuss GmbH, Munich, Germany; dimensions: 55.2 mm × 58.6 mm × 10.0 mm; weight: 7 g). The vital sign monitor includes a silicone earplug that contacts the outer ear canal skin and contains an infrared thermometer for recording body temperature and an optical sensor for measuring pulse rate and blood oxygen saturation. The earplug is connected to an earpiece hooked around the ear conch, where the motion sensor is located (Figure 1A). The wearable device transmits the acquired motion and vital signals in real time via Bluetooth Low Energy to a gateway, which then streams this information to the cosinuss° Health platform.
2.3. Experimental Procedures
Participants walked across a 6.7 m pressure-sensitive gait mat (GAITRite®, CIR Systems, Sparta, NJ, USA) that recorded the spatiotemporal stepping sequence at 120 Hz and was synchronized with the ear-worn motion sensor (Figure 1B). They were instructed to walk at three different paces: preferred speed (“please walk at your normal walking pace”), slow speed (“please walk as slowly as possible while maintaining a fluid pace”), and fast walking speed (“please walk as quickly as you can without transitioning into jogging or running”). Each speed condition was repeated multiple times (slow: 6 times; preferred: 8 times; fast: 10 times) to gather sufficient gait cycles for estimating variability and asymmetry characteristics of the gait pattern [21].
The gait mat provided the spatiotemporal gait characteristics used as ground truth for sensor training. The temporal stepping sequence was quantified by the timing of initial contacts (IC) and final contacts (FC) of the left and right foot, while the spatial stepping sequence was defined by the x- and y-coordinates of the left and right foot at successive ICs. From these metrics, various gait cycle measures were obtained, including stride time (s), swing time (s), double support time (s), stride length (cm), and stride width (cm) (Figure 1C). These measures were evaluated at the walking-bout level by quantifying the mean, variability, and asymmetry across all strides collected at a particular speed. Variability was assessed using the coefficient of variation (CV = SD/mean, in %), and asymmetry was quantified as $100\% \times \left(1 - \frac{\text{mean(smaller foot value)}}{\text{mean(larger foot value)}}\right)$.
2.4. Gait Identification Model
2.4.1. Model Architecture
A modified temporal convolutional network (TCN) architecture, previously demonstrated to be effective for sensor-based gait segmentation, was selected as the generic model scheme (Figure 2) [13,14,22].
A TCN is composed of a sequence of residual blocks with dilation factors that can increase exponentially. Each residual block typically includes dilated convolutional, batch normalization, ReLU activation, and dropout layers. The implemented TCN block for temporal gait event detection (tTCN) contains a single 1D convolutional layer, followed by batch normalization, a ReLU activation function, and a dropout layer. This convolutional layer uses a fixed kernel size, a stride of 1, and an adjustable dilation factor. Padding is applied so that the output sequence length matches the input, thereby preserving the temporal dimension throughout the time series. In contrast, the implemented TCN block for the determination of spatial gait characteristics (sTCN) consists of two consecutive 1D convolutional layers, each followed by ReLU activation and dropout. This block also includes padding removal (chomping) to maintain temporal causality and features a residual connection that allows the input to bypass the convolutional layers. This residual connection aids gradient flow and thereby improves the training of deeper networks. Additionally, the sTCN block supports downsampling when the number of input and output channels differs, which is necessary, for instance, when reducing a series of acceleration data points to a single stride length value.
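The following PyTorch sketch illustrates the two block types described above; kernel size, channel counts, and dropout rate are illustrative assumptions, not the authors’ exact hyperparameters:

```python
import torch
import torch.nn as nn

class Chomp1d(nn.Module):
    """Remove trailing padding to keep the convolution causal."""
    def __init__(self, chomp_size):
        super().__init__()
        self.chomp_size = chomp_size

    def forward(self, x):
        return x[:, :, :-self.chomp_size] if self.chomp_size > 0 else x

class TemporalBlock(nn.Module):
    """tTCN-style block: one dilated 1D convolution with 'same' padding so the
    output length matches the input, followed by BN, ReLU, and dropout."""
    def __init__(self, in_ch, out_ch, kernel_size=5, dilation=1, dropout=0.2):
        super().__init__()
        padding = (kernel_size - 1) // 2 * dilation  # preserves sequence length
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size, stride=1,
                      padding=padding, dilation=dilation),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(),
            nn.Dropout(dropout),
        )

    def forward(self, x):
        return self.net(x)

class SpatialBlock(nn.Module):
    """sTCN-style block: two causal dilated convolutions with chomping and a
    residual connection; a 1x1 convolution aligns differing channel counts."""
    def __init__(self, in_ch, out_ch, kernel_size=5, dilation=1, dropout=0.2):
        super().__init__()
        padding = (kernel_size - 1) * dilation  # causal padding, chomped below
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size, padding=padding, dilation=dilation),
            Chomp1d(padding), nn.ReLU(), nn.Dropout(dropout),
            nn.Conv1d(out_ch, out_ch, kernel_size, padding=padding, dilation=dilation),
            Chomp1d(padding), nn.ReLU(), nn.Dropout(dropout),
        )
        self.downsample = nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):
        return torch.relu(self.net(x) + self.downsample(x))

# A 3-axis accelerometer window of 200 samples (batch of 1)
x = torch.randn(1, 3, 200)
print(TemporalBlock(3, 16)(x).shape)  # torch.Size([1, 16, 200])
print(SpatialBlock(3, 16)(x).shape)   # torch.Size([1, 16, 200])
```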
2.4.2. Model Training
We trained a total of three neural networks: one for the detection of temporal gait events (temporal mEar) and two for the determination of spatial gait characteristics, namely step length and width (spatial mEar). The temporal mEar received raw triaxial accelerometer data (input size 3 × 200 samples) to identify the temporal gait events IC and FC. From the identified IC events, accelerometer values spanning a complete gait cycle were extracted and used as input for the regression of spatial stride characteristics with the two spatial mEar networks. All inputs were standardized using a robust scaler, which removes the median and scales the data according to the interquartile range. For training and validation of the models, the dataset was initially split into a training (80%) and a test set (20%) using a group-stratified split to prevent data from the same individual from appearing in multiple sets, thus avoiding potential bias or information leakage. The training set was further split into training and validation sets using group-stratified k-fold cross-validation (k = 5). Hyperparameter tuning (batch size, learning rate, and number of epochs) was performed using a common grid search strategy. Model training used binary cross entropy (BCE) with logits or mean squared error (MSE) as the loss function for the temporal and spatial mEar models, respectively. The Adam optimizer was used to iteratively learn the model weights. All models were built using Python 3.9 and PyTorch 2.3.1.
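A minimal sketch of such a training pipeline on toy data is given below; for brevity it uses plain grouped (rather than group-stratified) splits from scikit-learn, and a simple convolutional stack stands in for the TCNs described above:

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit, GroupKFold
from sklearn.preprocessing import RobustScaler
import torch
import torch.nn as nn

# Toy stand-ins: windows of ear-worn accelerometer data (N, 3, 200),
# per-sample binary event labels (N, 200), and a subject ID per window.
X = np.random.randn(320, 3, 200).astype(np.float32)
y = (np.random.rand(320, 200) > 0.95).astype(np.float32)
groups = np.repeat(np.arange(40), 8)

# 80/20 train/test split grouped by subject to avoid information leakage
train_idx, test_idx = next(GroupShuffleSplit(n_splits=1, test_size=0.2,
                                             random_state=0).split(X, y, groups))

# Robust scaling: remove the median, scale by the interquartile range
scaler = RobustScaler().fit(X[train_idx].reshape(len(train_idx), -1))
X_train = (scaler.transform(X[train_idx].reshape(len(train_idx), -1))
           .reshape(-1, 3, 200).astype(np.float32))

# Grouped 5-fold cross-validation within the training set
for fold, (tr, va) in enumerate(GroupKFold(n_splits=5)
                                .split(X_train, groups=groups[train_idx])):
    model = nn.Sequential(nn.Conv1d(3, 16, 5, padding=2), nn.ReLU(),
                          nn.Conv1d(16, 1, 5, padding=2))  # placeholder for the TCN
    criterion = nn.BCEWithLogitsLoss()      # temporal model; nn.MSELoss for spatial models
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    for epoch in range(2):  # shortened; tuned via grid search in the study
        optimizer.zero_grad()
        logits = model(torch.from_numpy(X_train[tr])).squeeze(1)
        loss = criterion(logits, torch.from_numpy(y[train_idx][tr]))
        loss.backward()
        optimizer.step()
    print(f"fold {fold}: loss {loss.item():.4f}")
```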
2.5. Performance Evaluation
The model performance was evaluated with respect to (1) detection performance and time agreement of predicted temporal gait events (i.e., IC and FC) with the ground truth (gait mat) and (2) the agreement of derived temporal and spatial stride parameters with the ground truth.
The overall detection performance measured the number of annotated events detected by the model (true positives, TP), the number of annotated events missed by the model (false negatives, FN), and the number of detected events that were not annotated (false positives, FP). Using these metrics, detection performance was primarily evaluated by the weighted F1-score, which accounts for both precision and recall while adjusting for class distribution imbalances. It is the harmonic mean of precision and recall and ranges from 1 (best) to 0 (worst performance):

$$F_1 = 2 \cdot \frac{\text{precision} \cdot \text{recall}}{\text{precision} + \text{recall}}, \qquad \text{precision} = \frac{TP}{TP + FP}, \qquad \text{recall} = \frac{TP}{TP + FN}.$$
A detected event (either IC or FC) was considered a TP if the absolute time difference from the corresponding annotated event was < 250 ms [13]. For all TP, the time agreement with the ground truth was quantified by the time difference between the predicted and the annotated event:

$$\Delta t = t_{\text{predicted}} - t_{\text{annotated}}.$$
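The following sketch shows how detected events can be matched to annotated events under this tolerance to obtain TP/FP/FN, the F1-score, and the time errors (our own illustrative implementation, not the authors’ code):

```python
import numpy as np

def match_events(detected, annotated, tol=0.250):
    """Greedily match detected to annotated event times (s) within ±tol.
    Returns TP/FP/FN counts, F1-score, and signed time errors of matched events."""
    detected, annotated = np.sort(detected), np.sort(annotated)
    used = np.zeros(len(annotated), dtype=bool)
    errors = []
    for t in detected:
        diffs = np.where(used, np.inf, np.abs(annotated - t))
        j = int(np.argmin(diffs)) if len(annotated) else -1
        if j >= 0 and diffs[j] < tol:
            used[j] = True
            errors.append(t - annotated[j])
    tp = len(errors)
    fp = len(detected) - tp
    fn = len(annotated) - tp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return tp, fp, fn, f1, np.array(errors)

# Example: initial contacts predicted by a model vs. gait-mat annotations
tp, fp, fn, f1, err = match_events(
    detected=np.array([0.52, 1.55, 2.61, 3.90]),
    annotated=np.array([0.50, 1.53, 2.58, 3.62]))
print(tp, fp, fn, round(f1, 2), np.round(err * 1000))  # time errors in ms
```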
We employed multiple statistical techniques to assess the agreement of the derived temporal and spatial stride parameters with the ground truth, including the absolute and relative root mean square error (RMSE_ABS and RMSE_REL), Pearson’s correlation coefficient, and the intraclass correlation coefficient for absolute agreement (ICC(3,1); two-way mixed model). ICC outcomes were interpreted according to established categories [23]: poor agreement (<0.5), moderate agreement (0.5–0.75), good agreement (0.75–0.9), and excellent agreement (>0.9). All the above-mentioned metrics were calculated for the various gait parameters across all subjects and gait speeds. To examine potential differences in agreement at different speeds, the relative RMSE for each subject and speed was also calculated for all mean spatiotemporal gait parameters. Differences in RMSE results between speeds were tested using a repeated-measures analysis of variance (ANOVA). All statistical analyses were conducted using Python 3.9.
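These agreement metrics can be sketched as follows; the ICC(3,1) is computed from the two-way ANOVA mean squares (an illustrative implementation under the stated definitions, not the authors’ code):

```python
import numpy as np
from scipy import stats

def agreement_stats(reference, predicted):
    """Agreement of predicted gait parameters with the ground truth:
    absolute RMSE, relative RMSE (%), Pearson's r, and ICC(3,1)."""
    reference, predicted = np.asarray(reference), np.asarray(predicted)
    rmse = np.sqrt(np.mean((predicted - reference) ** 2))
    rmse_rel = 100.0 * rmse / np.mean(reference)
    r, _ = stats.pearsonr(reference, predicted)

    # ICC(3,1) from two-way ANOVA mean squares (n subjects x k=2 raters)
    Y = np.column_stack([reference, predicted])
    n, k = Y.shape
    grand = Y.mean()
    ms_rows = k * np.sum((Y.mean(axis=1) - grand) ** 2) / (n - 1)
    ss_err = np.sum((Y - Y.mean(axis=1, keepdims=True)
                       - Y.mean(axis=0, keepdims=True) + grand) ** 2)
    ms_err = ss_err / ((n - 1) * (k - 1))
    icc31 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
    return rmse, rmse_rel, r, icc31

# Example: stride times (s) from the gait mat vs. the ear sensor for ten bouts
mat = np.array([1.00, 1.05, 1.10, 0.98, 1.20, 1.15, 1.02, 1.08, 1.12, 1.04])
ear = np.array([1.01, 1.04, 1.12, 0.97, 1.18, 1.16, 1.03, 1.07, 1.10, 1.05])
print(agreement_stats(mat, ear))
```

For the repeated-measures ANOVA across speeds, statsmodels’ AnovaRM provides a standard implementation.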
3. Results
3.1. Dataset Characteristics
A total of 2.59 h (left-worn sensors: 1.26 h; right-worn sensors: 1.33 h) of walking activity was recorded from the 53 participants. The collected dataset included a total of 2434 walks on the gait mat (563 at a slow walking speed; 864 at a preferred walking speed; 1007 at a fast walking speed), with a total of 11895 recorded gait cycles (3851 at a slow walking speed; 4193 at a preferred walking speed; 3851 at a fast walking speed).
3.2. Step Detection Performance
Table 1 displays the overall performance of the trained temporal mEar network in detecting IC and FC events. Both types of events were identified with high accuracy, with the detection rate for IC being nearly perfect and slightly lower for FC (F1-score 99% vs. 91%). The time agreement between predicted and ground truth events was close to zero for both IC (3 ms) and FC events (2 ms).
3.3. Accuracy of Temporal Gait Cycle Parameters
Based on the detected temporal gait events, various temporal gait cycle parameters (i.e., stride time, swing time, double support time) were calculated. For each temporal gait cycle aspect, the mean, variability (i.e., CV), and side asymmetry were computed.
Table 2 provides an overview of the agreement of these mEar-derived gait metrics with the gold standard. All mean temporal gait cycle parameters exhibited excellent agreement with the gold standard. Agreement for variability parameters was excellent in the case of stride time and dropped to good or moderate for swing and double support time, respectively. Agreement for asymmetry parameters was only moderate in the case of stride time and poor for the remaining parameters.
3.4. Accuracy of Spatial Gait Cycle Parameters
Based on the temporally segmented gait cycles, two spatial mEar networks were trained to additionally estimate the spatial characteristics of walking (i.e., stride length and stride width). For each spatial gait cycle aspect, the mean, variability (i.e., CV), and side asymmetry were computed.
Table 3 provides an overview of the agreement of these mEar-derived gait metrics with the gold standard. The mean and variability of stride length yielded good to moderate agreement, while all other spatial gait cycle characteristics, in particular all parameters related to stride width, did not reach sufficient agreement with the gold standard.
3.5. Speed Dependence of Gait Segmentation
The relative errors (RMSE_REL) of the mEar-derived mean spatiotemporal gait parameters were further analyzed separately for each subject and walking speed (slow, preferred, fast) to investigate whether the performance of the gait identification algorithm depends on walking speed (Figure 3). Agreement of the mEar-derived gait measures was largely comparable across the three gait speeds, except for swing time, which showed a slightly larger deviation from the ground truth at slow walking speeds (F(2, 20) = 3.533, p = 0.042).
4. Discussion
This study aimed to explore mobile spatiotemporal gait characterization using a single ear-worn motion sensor. We developed and trained a deep-learning-based algorithm (mEar) on gait measurements across a wide range of speeds in a large healthy cohort. mEar demonstrates high accuracy and good to excellent validity in characterizing a broad range of not only temporal but also spatial aspects of walking. This characterization remains largely consistent across slow to fast walking speeds. mEar’s performance is comparable to leading algorithms that utilize multiple motion sensors on the lower extremities, showing only marginal differences in accuracy [13,14,24,25]. Compared to other body locations, the ear offers practical advantages for mobile health assessment due to its natural suitability for integrating wearable sensors with existing assistive devices such as glasses frames or hearing aids. Moreover, the ear’s anatomical location facilitates simultaneous monitoring of vital signs like pulse rate, blood pressure, oxygen saturation, and body temperature, complementing gait analysis [16,20]. This integration enables comprehensive health monitoring in daily life and has the potential to enhance telemedicine applications, highlighting the importance of gait as a general indicator of overall health.
Various methods have been proposed in previous studies for segmenting gait patterns and characterizing spatiotemporal gait features using one or multiple body-fixed motion sensors. These methods vary in accuracy and practicality, particularly in their ability to generalize across diverse measurement conditions. Signal processing techniques such as peak detection, template matching, and feature identification are commonly employed in these approaches [17,24,26]. However, they often rely on precise knowledge of sensor positioning and orientation, necessitating careful calibration that can impede usability. Recently, deep-learning approaches have emerged as a promising alternative for mobile gait detection, offering enhanced robustness in segmenting gait patterns across different (noisy) measurement conditions compared to traditional methods [13,14,22,27]. In line with this, our deep-learning-based algorithm mEar is indifferent to which ear the sensor is worn on and operates effectively without initial calibration, accommodating variations in sensor orientation due to different ear anatomies. This renders the algorithm especially user-friendly and lowers the barriers to future clinical applications.
In terms of accuracy, methodologies employing foot-mounted motion sensors and deep-learning-based detection algorithms currently offer the most precise spatiotemporal gait characterization in everyday scenarios [13,22]. mEar achieves comparable accuracy in temporal characterization (step detection, stride time, gait phases), demonstrating nearly flawless and highly precise temporal step detection across a broad spectrum of walking speeds. Despite the anatomical distance between the head and feet, our algorithm also provides accurate spatial characterization of stride length, albeit less precise than results obtained with foot-mounted multi-sensor systems [28]. Notably, mEar encounters challenges in accurately characterizing lateral gait, specifically stride width, similar to limitations observed in prior studies using multiple foot-mounted motion sensors [25]. Accurately characterizing spatial gait characteristics in the frontal plane therefore remains challenging when relying solely on body-worn motion sensors.
It is established practice to evaluate the quality and impairments of walking across five gait domains [29]: pace (e.g., walking speed and stride length), rhythm (e.g., swing and double support phases), variability (e.g., spatiotemporal stride variability), asymmetry (e.g., spatiotemporal stride asymmetry), and postural control (e.g., stride width). We have shown that mEar allows for the characterization of a range of spatiotemporal gait parameters with moderate to excellent concurrent validity (e.g., stride time, swing time, stride length, stride time variability, stride time asymmetry), reliably covering all these essential dimensions of gait assessment except for the domain of postural control. The demonstrated accuracy of the spatiotemporal gait readouts from mEar may also allow gait changes in patients to be monitored with clinically meaningful precision. For instance, the minimal clinically important difference (MCID) for gait speed across various clinical populations has been estimated to fall between 10 and 20 cm/s [30]. Assuming an average stride cycle lasts about 1 s, this translates into a stride length MCID of approximately 10 cm, which is slightly above the observed RMSE_ABS for mEar-derived stride length of 9.7 cm. Beyond gait speed, changes in gait variability have been shown to provide important insights into fall risk and disease progression in conditions like cerebellar gait ataxia or Parkinson’s disease [31,32,33]. An MCID of 0.7% for temporal gait variability has recently been estimated for patients with Parkinson’s disease [34], which lies considerably above the precision of mEar-derived stride time variability (RMSE_ABS of 0.3%). Finally, the MCID for gait asymmetry, a crucial metric for evaluating rehabilitation outcomes in stroke patients, has recently been estimated at 10 to 20%, which is well met by the precision of mEar-derived stride time asymmetry (RMSE_ABS of 0.3%) and stride length asymmetry (RMSE_ABS of 1.4%).
Mobile ear-based mobility and gait analysis can be seamlessly integrated into existing ear-mounted wearable technology for monitoring vital functions (Figure 4). The motion sensor used in this study is part of a wearable in-ear vital sign monitor that allows continuous measurement of pulse rate, body temperature, and oxygen saturation. The integrated monitoring of activity (including gait) and vital signs holds significant potential for telemedicine applications in healthcare, as the data from both modalities can complement and enhance each other. Activity-aware vital sign monitoring, on the one hand, enables the contextualization of patients’ vital functions based on their current physical activity (e.g., resting heart rate during inactivity, increased heart rate during physical exertion) [16,35,36,37]. This approach may help establish individual baselines for vital functions and improve the sensitivity of detecting anomalies that could indicate health issues. Conversely, vital-status-aware monitoring of gait allows the assessment of walking in the context of associated vital functions, providing crucial insights into the energy efficiency and economy of walking [38]. Continuous assessment of gait efficiency is particularly valuable in rehabilitation settings, as it can be used to adjust personalized recovery plans, thereby optimizing rehabilitation outcomes [39,40].
The gait identification algorithm mEar proposed here has inherent limitations that need to be addressed in follow-up studies. Unlike stationary optical gait analysis methods, gait segmentation algorithms based on body-fixed motion sensors are not agnostic to the underlying motion sequence: their detection accuracy can depend critically on the quality of the measured gait sequence and may fail in cases of significantly altered gait patterns, as observed in certain clinical populations. Therefore, the accuracies reported in this study, based on a healthy cohort and laboratory assessment settings with straight walking paths, may not directly translate to clinical populations. Moreover, the algorithm’s performance in unrestricted real-world environments remains uncertain, as it was trained under controlled laboratory conditions. However, prior research indicates that deep learning algorithms for motion-sensor-based gait identification can be successfully adapted to clinical settings and real-world walking scenarios.
5. Conclusions
In this study, we introduced mEar, an algorithm for mobile gait analysis based on an ear-worn motion sensor. mEar allows for the precise determination of essential gait characteristics, including not only the average spatiotemporal gait pattern but also stride-to-stride variability and gait asymmetries. Thanks to its deep-learning-based architecture, mEar functions independently of which ear the sensor is worn on and requires no initial calibration regarding natural variations in sensor orientation. Combined with an in-ear vital sign monitor, mEar enables parallel and complementary monitoring of activity and vital functions, presenting promising applications in telemedicine and rehabilitation. Further studies are, however, necessary to validate the algorithm’s effectiveness in clinical cohorts and real-life conditions.
Author Contributions
Conceptualization, R.S. and M.W.; methodology, J.D. and M.W.; software, J.D. and M.W.; validation, J.D., L.B., K.J., and M.W.; formal analysis, J.D., L.B., and M.W.; investigation, L.B., J.D., K.J., and M.W.; resources, J.D., R.S., K.J., and M.W.; data curation, L.B. and M.W.; writing—original draft preparation, J.D. and M.W.; writing—review and editing, L.B., R.S., and K.J.; visualization, M.W.; supervision, M.W.; project administration, M.W.; funding acquisition, K.J. and M.W. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the German Federal Ministry of Education and Research (BMBF), grant numbers 01EO1401 and 13GW0490B.
Institutional Review Board Statement
The study protocol was approved by the ethics committee of the medical faculty of the University of Munich (LMU, 34-16) and the study was conducted in conformity with the Declaration of Helsinki.
Informed Consent Statement
Informed consent was obtained from all subjects involved in the study.
Data Availability Statement
The datasets used and/or analyzed during the current study will be available from the corresponding author upon reasonable request.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Wren, T.A.; Gorton, G.E., 3rd; Ounpuu, S.; Tucker, C.A. Efficacy of clinical gait analysis: A systematic review. Gait Posture 2011, 34, 149–153.
- Snijders, A.H.; van de Warrenburg, B.P.; Giladi, N.; Bloem, B.R. Neurological gait disorders in elderly people: clinical approach and classification. Lancet Neurol. 2007, 6, 63–74.
- Jahn, K.; Zwergal, A.; Schniepp, R. Gait disturbances in old age: classification, diagnosis, and treatment from a neurological perspective. Deutsches Arzteblatt International 2010, 107, 306–316.
- Goetz, C.G.; Tilley, B.C.; Shaftman, S.R.; Stebbins, G.T.; Fahn, S.; Martinez-Martin, P.; Poewe, W.; Sampaio, C.; Stern, M.B.; Dodel, R.; Dubois, B.; Holloway, R.; Jankovic, J.; Kulisevsky, J.; Lang, A.E.; Lees, A.; Leurgans, S.; LeWitt, P.A.; Nyenhuis, D.; Olanow, C.W.; Rascol, O.; Schrag, A.; Teresi, J.A.; van Hilten, J.J.; LaPelle, N. Movement Disorder Society-sponsored revision of the Unified Parkinson’s Disease Rating Scale (MDS-UPDRS): Scale presentation and clinimetric testing results. Mov. Disord. 2008, 23, 2129–2170.
- Schmitz-Hübsch, T.; du Montcel, S.T.; Baliko, L.; Berciano, J.; Boesch, S.; Depondt, C.; Giunti, P.; Globas, C.; Infante, J.; Kang, J.S.; Kremer, B.; Mariotti, C.; Melegh, B.; Pandolfo, M.; Rakowicz, M.; Ribai, P.; Rola, R.; Schols, L.; Szymanski, S.; van de Warrenburg, B.P.; Durr, A.; Klockgether, T.; Fancellu, R. Scale for the assessment and rating of ataxia: development of a new clinical scale. Neurology 2006, 66, 1717–1720.
- Kurtzke, J.F. Rating neurologic impairment in multiple sclerosis. Neurology 1983, 33, 1444–1452.
- Warmerdam, E.; Hausdorff, J.M.; Atrsaei, A.; Zhou, Y.; Mirelman, A.; Aminian, K.; Espay, A.J.; Hansen, C.; Evers, L.J.W.; Keller, A.; Lamoth, C.; Pilotto, A.; Rochester, L.; Schmidt, G.; Bloem, B.R.; Maetzler, W. Long-term unsupervised mobility assessment in movement disorders. Lancet Neurol. 2020, 19, 462–470.
- Hillel, I.; Gazit, E.; Nieuwboer, A.; Avanzino, L.; Rochester, L.; Cereatti, A.; Croce, U.D.; Rikkert, M.O.; Bloem, B.R.; Pelosin, E.; Del Din, S.; Ginis, P.; Giladi, N.; Mirelman, A.; Hausdorff, J.M. Is every-day walking in older adults more analogous to dual-task walking or to usual walking? Elucidating the gaps between gait performance in the lab and during 24/7 monitoring. Eur. Rev. Aging Phys. Act. 2019, 16, 6.
- Hausdorff, J.M.; Hillel, I.; Shustak, S.; Del Din, S.; Bekkers, E.M.J.; Pelosin, E.; Nieuwhof, F.; Rochester, L.; Mirelman, A. Everyday Stepping Quantity and Quality Among Older Adult Fallers With and Without Mild Cognitive Impairment: Initial Evidence for New Motor Markers of Cognitive Deficits? J. Gerontol. A Biol. Sci. Med. Sci. 2018, 73, 1078–1082.
- Mico-Amigo, M.E.; Bonci, T.; Paraschiv-Ionescu, A.; Ullrich, M.; Kirk, C.; Soltani, A.; Kuderle, A.; Gazit, E.; Salis, F.; Alcock, L.; Aminian, K.; Becker, C.; Bertuletti, S.; Brown, P.; Buckley, E.; Cantu, A.; Carsin, A.E.; Caruso, M.; Caulfield, B.; Cereatti, A.; Chiari, L.; D’Ascanio, I.; Eskofier, B.; Fernstad, S.; Froehlich, M.; Garcia-Aymerich, J.; Hansen, C.; Hausdorff, J.M.; Hiden, H.; Hume, E.; Keogh, A.; Kluge, F.; Koch, S.; Maetzler, W.; Megaritis, D.; Mueller, A.; Niessen, M.; Palmerini, L.; Schwickert, L.; Scott, K.; Sharrack, B.; Sillen, H.; Singleton, D.; Vereijken, B.; Vogiatzis, I.; Yarnall, A.J.; Rochester, L.; Mazza, C.; Del Din, S.; on behalf of the Mobilise-D consortium. Assessing real-world gait with digital technology? Validation, insights and recommendations from the Mobilise-D consortium. J. Neuroeng. Rehabil. 2023, 20, 78.
- Schniepp, R.; Huppert, A.; Decker, J.; Schenkel, F.; Schlick, C.; Rasoul, A.; Dieterich, M.; Brandt, T.; Jahn, K.; Wuehr, M. Fall prediction in neurological gait disorders: differential contributions from clinical assessment, gait analysis, and daily-life mobility monitoring. J. Neurol. 2021, 268, 3421–3434.
- Ilg, W.; Müller, B.; Faber, J.; van Gaalen, J.; Hengel, H.; Vogt, I.R.; Hennes, G.; van de Warrenburg, B.; Klockgether, T.; Schöls, L.; Synofzik, M. Digital Gait Biomarkers Allow to Capture 1-Year Longitudinal Change in Spinocerebellar Ataxia Type 3. Mov. Disord. 2022.
- Romijnders, R.; Warmerdam, E.; Hansen, C.; Schmidt, G.; Maetzler, W. A Deep Learning Approach for Gait Event Detection from a Single Shank-Worn IMU: Validation in Healthy and Neurological Cohorts. Sensors 2022, 22.
- Romijnders, R.; Salis, F.; Hansen, C.; Kuderle, A.; Paraschiv-Ionescu, A.; Cereatti, A.; Alcock, L.; Aminian, K.; Becker, C.; Bertuletti, S.; Bonci, T.; Brown, P.; Buckley, E.; Cantu, A.; Carsin, A.E.; Caruso, M.; Caulfield, B.; Chiari, L.; D’Ascanio, I.; Del Din, S.; Eskofier, B.; Fernstad, S.J.; Frohlich, M.S.; Garcia Aymerich, J.; Gazit, E.; Hausdorff, J.M.; Hiden, H.; Hume, E.; Keogh, A.; Kirk, C.; Kluge, F.; Koch, S.; Mazza, C.; Megaritis, D.; Mico-Amigo, E.; Muller, A.; Palmerini, L.; Rochester, L.; Schwickert, L.; Scott, K.; Sharrack, B.; Singleton, D.; Soltani, A.; Ullrich, M.; Vereijken, B.; Vogiatzis, I.; Yarnall, A.; Schmidt, G.; Maetzler, W. Ecological validity of a deep learning algorithm to detect gait events from real-life walking bouts in mobility-limiting diseases. Front. Neurol. 2023, 14, 1247532.
- Kluge, F.; Brand, Y.E.; Micó-Amigo, M.E.; Bertuletti, S.; D’Ascanio, I.; Gazit, E.; Bonci, T.; Kirk, C.; Küderle, A.; Palmerini, L.; Paraschiv-Ionescu, A.; Salis, F.; Soltani, A.; Ullrich, M.; Alcock, L.; Aminian, K.; Becker, C.; Brown, P.; Buekers, J.; Carsin, A.-E.; Caruso, M.; Caulfield, B.; Cereatti, A.; Chiari, L.; Echevarria, C.; Eskofier, B.; Evers, J.; Garcia-Aymerich, J.; Hache, T.; Hansen, C.; Hausdorff, J.M.; Hiden, H.; Hume, E.; Keogh, A.; Koch, S.; Maetzler, W.; Megaritis, D.; Niessen, M.; Perlman, O.; Schwickert, L.; Scott, K.; Sharrack, B.; Singleton, D.; Vereijken, B.; Vogiatzis, I.; Yarnall, A.; Rochester, L.; Mazzà, C.; Del Din, S.; Mueller, A. Real-World Gait Detection Using a Wrist-Worn Inertial Sensor: Validation Study. JMIR Form. Res. 2024, 8, e50035.
- Boborzi, L.; Decker, J.; Rezaei, R.; Schniepp, R.; Wuehr, M. Human Activity Recognition in a Free-Living Environment Using an Ear-Worn Motion Sensor. Sensors 2024, 24, 2665.
- Seifer, A.-K.; Dorschky, E.; Küderle, A.; Moradi, H.; Hannemann, R.; Eskofier, B.M. EarGait: Estimation of Temporal Gait Parameters from Hearing Aid Integrated Inertial Sensors. Sensors 2023, 23.
- Kavanagh, J.J.; Morrison, S.; Barrett, R.S. Coordination of head and trunk accelerations during walking. Eur. J. Appl. Physiol. 2005, 94, 468–475.
- Winter, D.A.; Ruder, G.K.; MacKinnon, C.D. Control of Balance of Upper Body During Gait. In Multiple Muscle Systems: Biomechanics and Movement Organization; Winters, J.M., Woo, S.L.Y., Eds.; Springer: New York, NY, USA, 1990; pp. 534–541.
- Röddiger, T.; Clarke, C.; Breitling, P.; Schneegans, T.; Zhao, H.; Gellersen, H.; Beigl, M. Sensing with Earables. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2022, 6, 1–57.
- Kroneberg, D.; Elshehabi, M.; Meyer, A.C.; Otte, K.; Doss, S.; Paul, F.; Nussbaum, S.; Berg, D.; Kuhn, A.A.; Maetzler, W.; Schmitz-Hubsch, T. Less Is More—Estimation of the Number of Strides Required to Assess Gait Variability in Spatially Confined Settings. Front. Aging Neurosci. 2018, 10, 435.
- Gadaleta, M.; Cisotto, G.; Rossi, M.; Rehman, R.Z.U.; Rochester, L.; Del Din, S. Deep learning techniques for improving digital gait segmentation. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); IEEE: 2019; pp. 1834–1837.
- Koo, T.K.; Li, M.Y. A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research. J. Chiropr. Med. 2016, 15, 155–163.
- Kluge, F.; Gassner, H.; Hannink, J.; Pasluosta, C.; Klucken, J.; Eskofier, B.M. Towards Mobile Gait Analysis: Concurrent Validity and Test-Retest Reliability of an Inertial Measurement System for the Assessment of Spatio-Temporal Gait Parameters. Sensors 2017, 17.
- Teufl, W.; Lorenz, M.; Miezal, M.; Taetz, B.; Fröhlich, M.; Bleser, G. Towards Inertial Sensor Based Mobile Gait Analysis: Event-Detection and Spatio-Temporal Parameters. Sensors 2018, 19.
- Godfrey, A.; Del Din, S.; Barry, G.; Mathers, J.C.; Rochester, L. Instrumenting gait with an accelerometer: a system and algorithm examination. Med. Eng. Phys. 2015, 37, 400–407.
- Zadka, A.; Rabin, N.; Gazit, E.; Mirelman, A.; Nieuwboer, A.; Rochester, L.; Del Din, S.; Pelosin, E.; Avanzino, L.; Bloem, B.R.; Della Croce, U.; Cereatti, A.; Hausdorff, J.M. A wearable sensor and machine learning estimate step length in older adults and patients with neurological disorders. NPJ Digit. Med. 2024, 7, 142.
- Hannink, J.; Kautz, T.; Pasluosta, C.F.; Barth, J.; Schülein, S.; Gaßmann, K.G.; Klucken, J.; Eskofier, B.M. Mobile Stride Length Estimation With Deep Convolutional Neural Networks. IEEE J. Biomed. Health Inform. 2018, 22, 354–362.
- Lord, S.; Galna, B.; Rochester, L. Moving forward on gait measurement: toward a more refined approach. Mov. Disord. 2013, 28, 1534–1543.
- Bohannon, R.W.; Glenney, S.S. Minimal clinically important difference for change in comfortable gait speed of adults with pathology: a systematic review. J. Eval. Clin. Pract. 2014, 20, 295–300.
- Schaafsma, J.D.; Giladi, N.; Balash, Y.; Bartels, A.L.; Gurevich, T.; Hausdorff, J.M. Gait dynamics in Parkinson’s disease: relationship to Parkinsonian features, falls and response to levodopa. J. Neurol. Sci. 2003, 212, 47–53.
- Schniepp, R.; Mohwald, K.; Wuehr, M. Gait ataxia in humans: vestibular and cerebellar control of dynamic stability. J. Neurol. 2017, 264 (Suppl. 1), 87–92.
- Ilg, W.; Seemann, J.; Giese, M.; Traschutz, A.; Schols, L.; Timmann, D.; Synofzik, M. Real-life gait assessment in degenerative cerebellar ataxia: Toward ecologically valid biomarkers. Neurology 2020, 95, e1199–e1210.
- Baudendistel, S.T.; Haussler, A.M.; Rawson, K.S.; Earhart, G.M. Minimal clinically important differences of spatiotemporal gait variables in Parkinson disease. Gait Posture 2024, 108, 257–263.
- Lokare, N.; Zhong, B.; Lobaton, E. Activity-Aware Physiological Response Prediction Using Wearable Sensors. Inventions 2017, 2.
- Wu, K.; Chen, E.H.; Hao, X.; Wirth, F.; Vitanova, K.; Lange, R.; Burschka, D. Adaptable Action-Aware Vital Models for Personalized Intelligent Patient Monitoring. In Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), 23–27 May 2022; pp. 826–832.
- Sun, F.-T.; Kuo, C.; Cheng, H.-T.; Buthpitiya, S.; Collins, P.; Griss, M. Activity-Aware Mental Stress Detection Using Physiological Sensors; Springer: Berlin/Heidelberg, Germany, 2012.
- Waters, R.L.; Mulroy, S. The energy expenditure of normal and pathologic gait. Gait Posture 1999, 9, 207–231.
- Slater, L.; Gilbertson, N.M.; Hyngstrom, A.S. Improving gait efficiency to increase movement and physical activity—The impact of abnormal gait patterns and strategies to correct. Prog. Cardiovasc. Dis. 2021, 64, 83–87.
- Moore, J.L.; Nordvik, J.E.; Erichsen, A.; Rosseland, I.; Bø, E.; Hornby, T.G. Implementation of High-Intensity Stepping Training During Inpatient Stroke Rehabilitation Improves Functional Outcomes. Stroke 2020, 51, 563–570.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).