Preprint
Review

Evaluating Mobility in Parkinson’s Disease through Wearable Sensors: A Systematic Review of Digital Biomarkers


Submitted: 03 September 2024
Posted: 04 September 2024

Abstract
Parkinson's disease (PD) is the second most common neurodegenerative disorder, entailing several motor-related symptoms that contribute to a reduced quality of life in affected subjects. Recent advances in wearable technologies and computing resources have shown great potential for the assessment of PD-related symptoms. However, the potential applications (e.g., early diagnosis, prognosis and monitoring) and key features of digital biomarkers for motor symptoms of PD (DB-MS-PD) have not been comprehensively studied. This study aims to provide a state-of-the-art review of current digital biomarker definitions for PD, focusing on the use of wearable devices. This review systematically examines research articles from 2012 to 2024, focusing on key features and recent technologies in PD research. A total of 22 studies were included and thoroughly analyzed. Results indicate that DB-MS-PD can accurately distinguish patients with PD (PwPD) from healthy controls (HC), assess disease severity or treatment response, and detect motor symptoms. Large sample sizes, proper validation, non-invasive devices, and ecological monitoring make DB-MS-PD promising for improving PD management. Challenges include sample and method heterogeneity and lack of public datasets. Future studies can leverage evidence of the current literature to provide more effective and ready-to-use digital tools for monitoring PD.
Keywords: 
Subject: Engineering - Bioengineering

1. Introduction

Parkinson’s disease (PD) is a chronic neurological disorder caused by the progressive loss of dopamine-producing cells. Dopamine is a neurotransmitter involved, among other things, in the organisation of movement, and its effect is to strengthen muscular activation [1,2]. PD is the second most common neurodegenerative disease after Alzheimer’s dementia and mainly affects elderly people. Population ageing has led to an increase in the number of subjects affected by PD, reaching 8.5 million patients in 2019 [3]. Moreover, projections indicate a substantial increase, with the global patient population anticipated to reach 12 million by 2040 [4].
PD includes motor symptoms such as tremor, bradykinesia, rigidity, and postural instability, and non-motor symptoms such as loss of smell, constipation, fatigue, anxiety, depression, and REM sleep behavior disorder, which can occur years before the diagnosis [5,6,7]. These symptoms vary from patient to patient, as well as over the course of the disease [8], gradually reducing patients’ quality of life [9].
The diagnosis of PD is complex, as there is no pathognomonic sign or biomarker to confirm the disease [2]. Instead, the diagnosis is based on a combination of medical history, clinical examination and the presence of specific motor and non-motor symptoms [10]. The diagnostic process involves evaluation by neurologists and may be difficult, especially in the early stages of the disease, when symptoms may be mild or non-specific. Therefore, it is important to monitor symptoms regularly to appropriately adjust the treatment plan [11].
Currently, there is no cure that can stop or reverse disease progression. Instead, several symptomatic therapies are available, which can ensure good control of the disease for several years. The most common options are drug treatment or, in severe cases, surgical interventions (high-intensity focused ultrasound (HIFU), intracranial implants or duodenal pumps, among others) [12]. The most widely used drug is levodopa, a metabolic precursor of dopamine [13], which is effective in controlling symptoms during the first years of treatment. However, after several years of levodopa treatment, motor complications such as dyskinesias (i.e., involuntary movements) and motor fluctuations may occur, and the patient may switch between periods of adequate and poor control of the disease [14].
At present, the most commonly used scale for evaluating PD progression is the Movement Disorder Society-Sponsored Revision of the Unified Parkinson’s Disease Rating Scale (MDS-UPDRS) [15]. This scale is designed to evaluate both motor and non-motor symptoms during a clinical examination, using a series of questions and guided exercises performed by the patient.
The fluctuating nature of PD symptoms during the day [16] and the infrequent clinical consultations (6-9 months apart [17]) make it difficult to accurately assess the disease. Furthermore, the correct application and interpretation of this scale depend on the doctor’s experience. The combination of these factors reduces the accuracy of therapeutic and pharmacological adjustments, often leading to complications derived from over- or under-medication [18].
In recent years, the emergence of new technologies, such as wearable sensors, has gained significant attention in the management of PD. These devices offer the potential to monitor various aspects of PD symptoms and motor function [19]. Wearable sensors enable continuous, objective, and long-term data collection and monitoring in a variety of contexts, including clinical and free-living environments [20].
Wearable devices can collect large amounts of data, which could give rise to digital biomarkers (i.e., quantitative and sensitive measures of disease progression) for PD assessment, capturing information that traditional clinical assessments cannot. This extensive data collection can enable better understanding and management of the disease, facilitate personalized treatment, and support large-scale studies that contribute to the development of new therapies and interventions [21].

1.1. Review Objectives

This article aims to provide the reader with a summary of the trends and techniques used by the scientific community when employing wearable technologies to acquire data and define digital biomarkers for motor symptoms, thereby reviewing the current status of these biomarkers.
Recent reviews have investigated the potential of digital biomarkers in PD, providing useful insights from different perspectives. A summary of biomarkers (speech, gait, handwriting) with potential clinical applications in PD was provided in [22], with a particular focus on validation, regulatory approval, ethical, legal, and social aspects. In [23], the potential of biomarkers extracted from facial expression analysis and eye tracking was investigated for early diagnosis and assessment of cognitive decline. Nevertheless, to the best of our knowledge, no review studies have focused on digital biomarkers for motor symptoms of PD (DB-MS-PD) obtained through wearable devices, as well as their key features, encompassing sensor settings, experimental procedures, and data analysis methods.
This study reviews relevant research articles published between 2012 and 2024, focusing on digital biomarkers defined for PD through motion analysis using data collected through wearable devices. Specifically, the objective of this study is to provide a comprehensive overview of the sensor settings, experimental procedures and data analysis methods that allow the extraction of relevant DB-MS-PD. Ultimately, comparative analysis of various studies can facilitate the identification of promising and minimally invasive systems (both hardware and software) for the supervised and unsupervised assessment of PD.
The remainder of the document is organized as follows. Section 2 presents an overview of the wearable devices and digital biomarkers. Section 3 describes the methodology used for the systematic review, including the search strategy used to find relevant articles. Section 4 presents the results of this literature review. Section 5 provides a discussion of the main findings in the analyzed articles. Finally, the conclusions of this work are given in Section 6.

2. Background

2.1. Wearable Devices

The term wearable devices (wearables) refers to compact electronic devices as well as wireless-enabled computers that are seamlessly integrated into gadgets, accessories, or clothing designed to be worn on the human body. It also includes more invasive devices such as microchips or smart tattoos, or commercial and widespread devices including smart glasses, smartphones, smartwatches, smart clothing, and smart shoes, among others [24,25].
In recent years, wearable devices have shown the potential to overcome certain limitations of the traditional healthcare and medical assessments by harnessing digital and mobile health (m-health) technologies to progress towards efficient and personalized healthcare [26]. Moreover, these devices can facilitate long-term monitoring outside clinical settings, offering a discreet and comfortable solution [27].
Wearable devices can collect comprehensive health information while continuing to serve their original purpose, for example as fashion or productivity devices. The collected data can be analysed using standard protocols by artificial intelligence, with the aim of predicting possible health problems [28], as well as recognising activities and detecting the context in which these activities are performed [29,30].
In the realm of monitoring and management of neurological disorders, wearable devices have emerged as a highly effective tool [31]. These devices allow for continuous and non-invasive monitoring of a range of physiological and behavioral parameters, both in medical consultations and in free-living conditions [32,33,34]. Wearables can provide an abundance of data to support tasks such as early detection, optimization of treatment, and management of neurological disorders [35]. However, it is important to acknowledge that, despite the potential advantages of this technology, its application requires professional medical supervision and should complement, rather than replace, traditional clinical evaluations [36].
In PD, wearable devices have been used to assess various symptoms. In [37], commercial devices used for PD management were evaluated, considering their validation process and clinical applications, along with the strengths and weaknesses of each. In [38], the authors explored the wearable devices used for PD care in hospitals, concluding that the most common types include inertial measurement units and smartwatches. In [39], a comprehensive review examined how wearable sensors can support tasks such as early diagnosis, human motion analysis, assessment of motor fluctuations, and home and long-term monitoring for PwPD. However, although wearable devices hold promise for improving PD management, their adoption in the clinical setting is limited by problems such as inadequate technical and clinical validation [40].

2.2. Digital Biomarkers

The term biomarker refers to objective medical signs that can be accurately and reproducibly measured outside the patient. In contrast, medical symptoms are health or disease indications perceived by patients [41]. Biomarkers can be defined as characteristics that are objectively measured and evaluated as indicators of normal biological processes, pathogenic processes, or pharmacological responses to therapeutic interventions [42]. Biomarkers can include any substance, structure, or process that can be measured from the body and can influence or predict the incidence of the outcome or disease. They can be diverse, from gait cadence and pulse measurements to complex laboratory tests of blood and other tissues [43]. However, the main objective of a biomarker remains to establish the connection between measurable parameters and relevant clinical endpoints [41].
The type of biomarkers studied in this work are known as digital biomarkers. According to the U.S. Food and Drug Administration (FDA), a digital biomarker is a feature or group of features obtained from digital health technologies that are measured to indicate normal biological processes, pathogenic processes, or reactions to exposure or intervention, such as therapeutic interventions [44,45]. Digital biomarkers are a rapidly developing frontier enabled by the availability of sensors and personal devices that can assimilate information about an individual’s psychological state, exercise level, cognitive abilities, eating patterns, movement, and tremor [46]. These data are largely derived from sources such as smartphones and portable electronic devices [47].
Although many additional studies are needed to link digital phenotypes and endpoints with traditional measurements, digital biomarkers have the potential to introduce novel measurements for phenomena that are already assessed with established instruments [48]. Depending on their application, several types of biomarkers can be identified, such as diagnostic biomarkers (to confirm the presence of a certain disease), monitoring biomarkers (to determine the progression of a disease), pharmacodynamic/response biomarkers (to check the response to certain therapies), predictive biomarkers (to identify individuals likely to respond to certain medical products) and prognostic biomarkers (to identify the likelihood of a clinical event or disease progression) [49].
Extracting digital biomarkers involves collecting and analyzing data from digital devices or platforms that can provide insights into physiological, behavioral and cognitive states. These biomarkers can be derived from various sources, such as smartphones, wearables, social media, and other digital interactions. The information that defines the biomarker must be meticulously processed and extracted, as raw data are influenced by numerous processes that obscure the underlying signals. To achieve this, various techniques in digital signal processing, statistical analysis, and artificial intelligence are employed to extract and refine the data into a coherent biomarker. Ultimately, these biomarkers undergo validation through clinical studies to ensure their accuracy and reliability in reflecting the intended physiological, behavioral, or cognitive states.
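To make this pipeline concrete, the following sketch illustrates one possible chain from raw wearable signals to a candidate digital biomarker: basic signal conditioning, extraction of a single feature, and a simple statistical comparison between groups. It is a minimal illustration only; the 100 Hz sampling rate, the 3.5-7.5 Hz tremor band, the filter settings, and the synthetic data are assumptions and do not reproduce any specific study reviewed here.

import numpy as np
from scipy import signal, stats

FS = 100.0  # assumed sampling rate (Hz) of a wrist-worn accelerometer

def tremor_band_power(acc_xyz: np.ndarray, fs: float = FS) -> float:
    # Candidate digital biomarker: relative power of the accelerometer magnitude
    # in the 3.5-7.5 Hz band, a range typically associated with rest tremor.
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    magnitude = magnitude - magnitude.mean()            # remove gravity/DC offset
    b, a = signal.butter(4, [0.5, 15.0], btype="bandpass", fs=fs)
    filtered = signal.filtfilt(b, a, magnitude)         # basic signal conditioning
    freqs, psd = signal.welch(filtered, fs=fs, nperseg=int(4 * fs))
    band = (freqs >= 3.5) & (freqs <= 7.5)
    return psd[band].sum() / psd.sum()                  # relative tremor-band power

# Toy validation step: compare the feature between two synthetic groups
pd_features = [tremor_band_power(np.random.randn(6000, 3)) for _ in range(20)]
hc_features = [tremor_band_power(np.random.randn(6000, 3)) for _ in range(20)]
t, p = stats.ttest_ind(pd_features, hc_features)
print(f"group difference: t={t:.2f}, p={p:.3f}")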

3. Methods

3.1. Research Questions

The aim of this review is to collect, analyze, and evaluate studies concerning the identification of DB-MS-PD. To achieve this objective, the following research questions were formulated:
  • RQ-1 What types of wearable devices are commonly used to capture DB-MS-PD?
  • RQ-2 Are there specific digital biomarkers that are commonly measured or tracked using wearables in PD?
  • RQ-3 How reliable and accurate are the digital biomarkers captured by these wearables?
  • RQ-4 What are the main challenges or limitations associated with using wearables for capturing DB-MS-PD?

3.2. Search Strategy

On 14 February 2024, a literature search was conducted on the PubMed, Scopus, IEEE and Web of Science databases, and all returned results were considered. The search string included keywords related to the disease under investigation, the type of biomarker sought and the devices used to collect the data. In more detail, the following Boolean search string was used:
((Parkinson) OR (Motor symptoms)) AND (Biomarker) AND (Digital) AND (Wearable).
No additional filters were applied in the literature search. All retrieved studies were systematically identified and screened, and the data were extracted for relevant information following the PRISMA guidelines [50].
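Such a search can also be reproduced programmatically. The sketch below runs the same Boolean string against PubMed through the NCBI E-utilities interface wrapped by Biopython; it is an illustrative reproduction only (the e-mail address and retmax value are placeholders), and the Scopus, IEEE and Web of Science searches would need their respective interfaces.

from Bio import Entrez  # Biopython wrapper around NCBI E-utilities

Entrez.email = "reviewer@example.org"  # placeholder address required by NCBI

query = "((Parkinson) OR (Motor symptoms)) AND (Biomarker) AND (Digital) AND (Wearable)"
handle = Entrez.esearch(db="pubmed", term=query, retmax=500)
record = Entrez.read(handle)
handle.close()

print(f"PubMed hits: {record['Count']}")
print(record["IdList"][:10])  # first PMIDs returned for screening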

3.3. Inclusion and Exclusion Criteria

The topic of this review concerns the definition of digital biomarkers of PD obtained from wearable devices. Journal articles published between January 2012 and February 2024 and written in English were included. Furthermore, the exclusion criteria were as follows:
  • Papers without peer review, books, book chapters, or papers published as “letters”, “comments”, “perspectives”, “case reports”, “surveys” or “reviews”.
  • Literature not written in English.
  • Studies related to diseases other than PD.
  • Studies that did not use any wearable devices or portable sensors for data acquisition.
  • Studies showing the results of a challenge, competition or programme.
  • Studies primarily focused on activities not related to motor symptoms in PD.
  • Studies that did not include human participants.

3.4. Data Extraction

Four authors (C.PF, L.S., L.B., and I.P.) independently selected candidate studies by reviewing the title and abstract and repeated the process until they reached a consensus. The same procedure was performed for the selection based on the full-text evaluation. Finally, candidate studies that met the eligibility criteria were selected for inclusion in the review. The following information was included in the data extraction procedure:
  • Identification of study data, including authors, title and citation.
  • Type of test performed.
  • Characteristics of the participants in the study.
  • Type, number, and location of the wearable sensors and devices used for data acquisition.
  • Objective of the study.
  • End points.

4. Results

4.1. Systematic Review

Based on the search criteria, 59 articles were retrieved from PubMed, 172 from Scopus, 16 from Web of Science and 8 from IEEE, for a total of 255 publications. After removing duplicates (57), 198 publications were examined for titles and abstracts. Then 143 articles were excluded according to the exclusion criteria. Of the 55 remaining records, 9 studies were excluded due to the unavailability of the full text. Consequently, a total of 46 full texts were screened for eligibility, and 24 records were excluded according to the exclusion criteria. Finally, 22 research studies were included and reviewed. The PRISMA flow chart used for the literature search and selection is shown in Figure 1.

4.2. Study Characteristics

A summary of the findings of the 22 reviewed articles is given in Table 1. The information in each article was harmonized to facilitate comparability and analysis among studies.

4.3. Study Design

The studies under investigation analyzed different sets of activities and tasks. These can be classified into 11 different categories: activities of daily living, balance task, finger tapping, MDS-UPDRS task, postural tremor, pronation-supination movements of the hands, reaction time, rest tremor, timed up and go (TUG) test, gait task, and others.
Due to the diversity of exercises performed within a category, similar activities were grouped into a more general category. Specifically, the finger tapping activity includes four types of sub-activities: index and middle finger tapping, alternate index finger tapping, thumb-index finger tapping, and two-target finger tapping test. Similarly, the gait category includes: 2-min walk at convenient speed, 20-m straight walk at convenient speed, 20-m straight walk at fast speed, 10-m at convenient speed, 20-m circular walk at fast speed, 20-steps straight walk, normal walking, rhythmic auditory cued walking, treadmill walking, and 2-min circular walk at convenient speed.
Figure 2 shows the distribution of the different activities evaluated in the selected studies.
A total of 52 different activities were investigated in the 22 articles analyzed. The most frequent activity is gait (N=16, 31%), followed by finger tapping (N=8, 15%), balance task (N=5, 10%), and TUG test (N=4, 8%). Other activities that appear less frequently include postural tremor, pronation-supination, rest tremor, MDS-UPDRS tasks and reaction time.
Less than one third of the studies (N=6, 27%) used public datasets. In more detail, 3 studies [57,60,72] used the mPower dataset [73] and 3 studies [51,58,59] used data from the Physionet vertical ground reaction force (VGRF) dataset [74]. The Physionet database contains measures of gait from 93 PwPD and 73 HC. The database includes the VGRF records of subjects as they walked at their usual, self-selected pace for approximately 2 minutes on level ground. The mPower database includes data from more than 1000 PwPD and 4000 HC. Different activities (i.e., memory, tapping, voice and walking) were recorded with the built-in smartphone sensors (i.e., triaxial accelerometer, microphone and touchscreen) in unsupervised environments.
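As an illustration of how such public data can be reused, the sketch below loads one record of the PhysioNet gait dataset and derives stride times from the total force under one foot. It assumes the plain-text layout described on PhysioNet (column 1: time in seconds; columns 2-9 and 10-17: the eight VGRF sensors under the left and right foot; columns 18-19: total force per foot) and a hypothetical file name; the 20 N contact threshold is likewise an assumption.

import numpy as np
import pandas as pd

def load_vgrf_record(path: str) -> pd.DataFrame:
    # One PhysioNet gait record: whitespace-separated text with 19 columns
    cols = (["time_s"]
            + [f"left_s{i}" for i in range(1, 9)]
            + [f"right_s{i}" for i in range(1, 9)]
            + ["left_total_N", "right_total_N"])
    return pd.read_csv(path, sep=r"\s+", header=None, names=cols)

def stride_times(total_force: np.ndarray, time_s: np.ndarray,
                 threshold_n: float = 20.0) -> np.ndarray:
    # Heel strikes: instants at which total force rises above the contact threshold
    contact = total_force > threshold_n
    heel_strikes = time_s[1:][~contact[:-1] & contact[1:]]
    return np.diff(heel_strikes)  # successive same-foot contacts = stride times

record = load_vgrf_record("GaPt03_01.txt")  # hypothetical record file name
strides = stride_times(record["left_total_N"].values, record["time_s"].values)
print(f"mean stride time: {strides.mean():.2f} s, "
      f"CV: {100 * strides.std() / strides.mean():.1f} %")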

4.4. Participant Characteristics

Of the 22 studies reviewed, 5 of them evaluated only PwPD [53,55,62,66,68], while the other 17 papers included PwPD and HC in their study. A single study [63] included people with idiopathic REM sleep behavior disorder, while [66] also evaluated subjects with progressive supranuclear palsy. Furthermore, in most studies (53%) the number of PwPD was higher than the number of HC.
The distribution of the number of articles according to the number of participants is shown in Figure 3. In terms of the number of PwPD, the mean and median values are 44 and 33 subjects, respectively. However, it should be noted that there is a high variability, with a minimum of 5 and a maximum of more than 1000 PwPD. More than half of the articles (n=14, 63%) presented a sample size of 1 to 50 patients, while 14% (n=3) enrolled more than 1000 patients.
This variability can also be observed in the distribution of HC, which presents mean and median values of 57 and 50 participants, respectively. In this case, 59% (n=13) of the studies enrolled fewer than 50 controls, while 18% (n=4) used a sample size of more than 1000 controls.
Regarding the gender distribution (Figure 4), 65% of PwPD were male and 35% were female, reflecting the actual gender distribution in the PD population. This imbalance increases in HC, where 71% of the participants were male and 29% were female.

4.5. Device, Sensor and Body Location

Regarding the type of devices used, 20 studies (91%) used a single type of device for data recording, while [53] employed two different devices and [69] used three types of devices. In total, 25 devices were used in the 22 studies. As shown in Figure 5a, the most commonly used device is the inertial measurement unit (IMU) (n=11 of the 25 devices, 44%) [52,54,55,56,62,64,65,66,67,68,69], followed by the smartphone (n=7, 28%) [57,60,61,63,69,71,72]; pressure insoles were used in 3 cases (12%) [51,58,59], while other devices such as tablets were used in 4 studies (16%) [53,69,70].
In terms of sensor types (Figure 5b), of the 55 total sensors used, the most used sensor is the triaxial accelerometer (n=17, 31%), followed by the triaxial gyroscope (n=11, 20%), the touch screen (n=7, 13%), triaxial magnetometer (n=6, 11%), VGRF sensors (n=3, 5%), and barometer (n=2, 4%). Other sensors (n=9, 16%) include the goniometer, GPS or galvanic skin response.
Figure 6a presents the distribution of the number of devices used. The most common option was a single device (n=14, 64%). The remaining studies used multiple devices. Specifically, 6 papers (27%) [52,53,54,55,64,66] used between 2 and 3 devices, while the 2 remaining papers (10%) [62,69] used more than 8 devices. Figure 6b presents the number of sensors used. The most popular choice was 2 sensors (n=8, 36%). A single sensor was used in 3 articles [56,65,70] (14%), 3 sensors in 6 articles [52,54,56,66,67,71] (27%), while 5 articles (23%) used more than 10 sensors.
Figure 7 shows the most common positions of the sensors on the human body. The device was placed in front of the person (e.g., smartphone, tablet), on the foot, or on the lower back in 8 articles each (18%). In 5 cases (11%), the wearable device was placed in the pocket, in the hand in 4 papers (9%), and on the wrist in 2 cases (5%). The remaining studies positioned the sensors in other locations, such as the ankle or the waist. Half of the included articles used a single location, while the other half used more than one body position.

4.6. Aim

In terms of the study aims, the reviewed articles were categorized into the following primary areas: diagnosis (i.e., classification between PwPD and HC), monitoring, and prognosis, as outlined in [20].
Figure 8 presents a summary of the principal aims of the selected studies. Most of these studies focused on the classification between PwPD and HC (n=12, 55%), followed by monitoring (n=6, 27%), prognosis (n=2, 9%) and other types of classification (n=2, 9%). The six studies on PD monitoring included the assessment of gait impairment (n=5, 38%), therapy response (n=3, 23%), ON-OFF state estimation (n=3, 23%), and automatic detection of motor symptoms (bradykinesia, tremor, and postural instability) (n=2, 15%). Regarding prognosis, the studies aimed to predict the risk of falls (number of falls) and the evolution of motor symptoms such as abnormalities in lower-limb movement.
Most of the studies addressed multiple aims, as shown in Figure 9. Specifically, 59% (n=13) of the studies addressed more than one aim, while the remaining 41% (n=9) addressed a single primary area (i.e., diagnosis, monitoring, or prognosis). Three studies (15%) [61,69,70] combined ON-OFF state monitoring with PwPD/HC classification, while three studies (15%) [62,67,72] combined therapy response evaluation with PwPD/HC classification and prognosis. Two studies (10%) [64,71] combined motor symptom monitoring with PwPD/HC classification and with prognosis, respectively. Two studies (10%) [58,59] combined PwPD/HC classification with gait monitoring. Two studies (10%) [53,63] combined other aims, namely the classification of therapy/placebo subjects with therapy response monitoring, and the classification of PD/REM sleep behaviour disorder with the classification of PwPD/HC. In [68], the classification of motor condition was addressed, together with the assessment of the quality of life of PwPD.

4.7. Endpoints

DB-MS-PD were extracted using different methods (Figure 10). Seven studies (32%) assessed the potential of single features as biomarkers. Most studies (n=14, 64%) combined multiple features and used machine learning (ML) models to provide a robust result. Finally, only one study used raw data as input for a deep learning (DL) model.
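As a concrete illustration of the second (and most common) approach, the sketch below combines several hand-crafted features in a cross-validated classifier for PwPD/HC discrimination. The feature names, the synthetic data, and the choice of a logistic-regression pipeline are illustrative assumptions, not a reconstruction of any reviewed study.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical per-subject feature table: each row aggregates candidate DB-MS-PD
rng = np.random.default_rng(0)
features = pd.DataFrame({
    "gait_speed_ms": rng.normal(1.1, 0.2, 60),
    "stride_time_cv": rng.normal(4.0, 1.5, 60),
    "tap_interval_var": rng.normal(0.05, 0.02, 60),
    "tremor_band_power": rng.normal(0.3, 0.1, 60),
})
labels = np.array([1] * 30 + [0] * 30)  # 1 = PwPD, 0 = HC

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(model, features, labels, cv=cv, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f} ± {auc.std():.2f}")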
In 11 studies (50%), different motor tasks were evaluated. The distribution of the most relevant tasks in studies addressing multiple tasks is shown in Figure 11. In 7 studies (32%), finger tapping was evaluated alongside other tasks, including balance, gait, rest tremor, postural tremor, pronation-supination, leg agility and reaction time. In 6 of these studies (27%), finger tapping performed better than the other tasks for PD diagnosis, ON/OFF detection and PD progression. In a single study [63], rest and postural tremor provided better results than the other tasks (including finger tapping) for the diagnosis of PD. Gait gave better results than the TUG test in 2 cases (9%) for PD detection and progression. A single study [62] (which included neither finger tapping nor gait) found that toe tapping and TUG tasks gave better results than leg agility and hand movements for PD progression estimation. Finally, a single study [63] concluded that postural and rest tremor yielded better results than balance, gait, finger tapping, and reaction time tasks for the detection of sleep behavior disorders.
In half of the studies, DB-MS-PD were extracted in the laboratory/clinic, whereas in the other half, remote home monitoring was used. In the laboratory-based studies, mostly supervised active tasks (participants were asked to perform different tasks) were used, whereas in one study both active and passive monitoring were used. When data were recorded at home (11 studies), active tasks were used in 6 and passive monitoring (extraction of DB-MS-PD during daily life) in the other 5.
Focusing on the tapping task (Figure 12), of the 7 studies in which tapping was the most relevant task, statistical metrics calculated on the intervals between taps (i.e., mean value, variability, percentiles) and metrics of spatial accuracy (i.e., two-dimensional distance between tap position and target point) were each identified as the most significant DB-MS-PD in 5 cases (71%). These were followed by the total number of taps (n=4, 57%) and the number of on-target taps (n=2, 29%). Other characteristics include tap duration and fatigue, the latter calculated as the difference in tap speed between the first and last tap. These features were highly relevant for the diagnosis of PD, the estimation of motor fluctuations, the assessment of disease progression and the prediction of clinical motor scores.
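For illustration, the sketch below computes these tapping-derived metrics (inter-tap interval statistics, two-dimensional tap-to-target distance, total taps, and a simple fatigue proxy) from hypothetical touchscreen logs; the input format, units, and the fatigue definition used here are assumptions rather than a protocol from the reviewed studies.

import numpy as np

def tapping_features(tap_times_s: np.ndarray,
                     tap_xy: np.ndarray,
                     target_xy: np.ndarray) -> dict:
    # Inter-tap interval statistics and spatial accuracy, as commonly reported DB-MS-PD
    intervals = np.diff(tap_times_s)                        # time between taps
    distances = np.linalg.norm(tap_xy - target_xy, axis=1)  # 2-D tap-to-target error
    return {
        "total_taps": len(tap_times_s),
        "interval_mean_s": intervals.mean(),
        "interval_cv_pct": 100 * intervals.std() / intervals.mean(),
        "interval_p95_s": np.percentile(intervals, 95),
        "spatial_error_mean_px": distances.mean(),
        # fatigue proxy: change in inter-tap interval between first and last taps
        "fatigue_s": intervals[-1] - intervals[0],
    }

# Hypothetical 30-second tapping session (timestamps in s, positions in pixels)
times = np.cumsum(np.random.normal(0.35, 0.05, 80))
taps = np.random.normal(0, 12, (80, 2))
targets = np.zeros((80, 2))
print(tapping_features(times, taps, targets))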
Focusing on the gait task (Figure 13), of the 11 studies where gait was the primary task, most (n=7, 63%) found gait speed and its variability to be among the most significant DB-MS-PD in both supervised and unsupervised contexts. This was followed by average step length and its variability (n=4, 36%), stride length (n=3, 27%), step time (n=3, 27%), swing time (n=2, 18%), stance time (n=2, 18%), and cadence (n=2, 18%). Other characteristics include total power, center of pressure, force variations, power spectral features and entropy. These features were found to be highly relevant for the diagnosis of PD, the estimation of motor fluctuations and the assessment of disease progression.
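Analogously, once gait events have been detected from the VGRF or inertial signals, several of the gait metrics listed above reduce to simple spatiotemporal statistics. The sketch below is a minimal version that assumes heel-strike times of both feet (in chronological order) and a known 20-m walkway; step detection itself is outside its scope.

import numpy as np

def gait_features(heel_strikes_s: np.ndarray, walkway_m: float = 20.0) -> dict:
    # Basic spatiotemporal gait metrics from heel-strike times of both feet,
    # assuming a straight walk over a known distance.
    step_times = np.diff(heel_strikes_s)            # left-right alternating steps
    duration = heel_strikes_s[-1] - heel_strikes_s[0]
    n_steps = len(step_times)
    return {
        "gait_speed_ms": walkway_m / duration,
        "cadence_steps_min": 60.0 * n_steps / duration,
        "step_time_mean_s": step_times.mean(),
        "step_time_cv_pct": 100.0 * step_times.std() / step_times.mean(),
        "step_length_mean_m": walkway_m / n_steps,
    }

# Hypothetical heel strikes for a 20-m walk at a comfortable pace
heel_strikes = np.cumsum(np.random.normal(0.55, 0.04, 36))
print(gait_features(heel_strikes))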
Overall, the supervised and unsupervised finger tapping task (assessed through position and acceleration analysis on a smartphone or tablet) and the supervised and unsupervised gait task (assessed through inertial or force sensors) represented the most explored tasks. When comparing the results obtained in different studies, it is evident that these two tasks provide similar performance for PD diagnosis and treatment response assessment, as shown in Table 2. In addition, gait was also explored for severity estimation, while finger tapping was explored for predicting the clinical motor score. The classification and regression results are reported in Table 2, where the results of the different studies are summarized using performance ranges.
It is worth noting that all finger tapping tasks represent active tests, in which subjects have to actively participate in the activity. Nevertheless, tapping on a touch screen is an easy activity that can be performed at home. Very good performance was obtained both in supervised laboratory settings and in unsupervised remote settings (i.e., at home), demonstrating that finger tapping represents an easy and effective task for the extraction of DB-MS-PD. With regard to gait, data recorded in supervised laboratory environments performed better for PD diagnosis (accuracy 0.94-0.98) and severity estimation (accuracy 0.85-0.98), compared to unsupervised data recorded continuously in the home environment, with an area under the curve (AUC) between 0.76 and 0.95.
Analyzing tremor, a sensitivity of 0.85 and a specificity of 0.88 were obtained in distinguishing PwPD from HC; a sensitivity of 0.88 and a specificity of 0.90 were obtained for the classification of PD versus idiopathic REM sleep behavior disorder (iRBD) [63]. Tremor detection achieved an accuracy of 0.83 and a correlation of r = 0.97 (p<0.001) with the UPDRS tremor score (item 3.18). A correlation of r = 0.67 (p<0.001, RMSE = 2) was obtained between the sensor measurements (amplitude of hand movement) and the clinical bradykinesia score (sum of items 3.4-3.6) [67]. For toe tapping, a correlation of r = 0.74 (p<0.001) was found in predicting motor progression at 30 months [62]. Finally, the TUG test analysis provided an r = 0.55 and an RMSE of 0.33 in predicting the number of falls in PD [64].

5. Discussion

This article provides an updated review of the existing literature on digital biomarkers for motor symptoms assessment in PD. This disorder represents the second most common neurodegenerative disease in the world [3], and new digital technologies promise to significantly support the approach to the diagnosis, prognosis and monitoring of PD.
In this review, a total of 22 articles were selected and thoroughly analyzed to provide a summary of the current DB-MS-PD. As the term biomarker is very broad and the use of wearable devices can be extended to different body locations, a large heterogeneity was found among the studies examined. Nevertheless, it was possible to identify trends and patterns in the definition of the experimental protocols, the number of participants, the number and type of devices and sensors used, the location of the devices on the human body, the objectives pursued and the types of DB-MS-PD proposed.
Overall, the results indicate that wearable devices have the potential to be used to define DB-MS-PD. These can provide measurable and objective evaluations in clinical or hospital settings. Furthermore, the indicators collected through wearable devices could contribute to the development of remote and continuous patient monitoring systems to follow the evolution of different symptoms, especially in unsupervised settings such as the patient’s home.
The vast majority of the studies included in this review employed commercial IMUs, mostly embedding at least an accelerometer and a gyroscope, or the touchscreen of smartphones and tablets. These sensors allowed the assessment and monitoring of a wide spectrum of motor symptoms in PD for diagnosis, prognosis, and monitoring purposes. Importantly, more than half of the studies used a single device embedding multiple sensors. Among these, IMUs seem to represent the most promising solution (11 of 22 studies). IMUs are widespread and used in various applications due to their versatility, accuracy, and relatively compact form factor, and can thus be used for data collection and analysis without requiring additional dedicated hardware. They feature built-in sensors such as an accelerometer, gyroscope, and magnetometer for comprehensive motor symptom evaluation. The results are promising, with high diagnostic accuracy in discriminating PwPD from HC and high correlation with clinical scores. Smartphones (7 of 22 studies), in turn, are used on a daily basis and are equipped with a large number of built-in sensors such as an accelerometer, gyroscope, magnetometer, touchscreen, camera, and microphone. Smartphones were mostly used in the execution of scripted active tasks (e.g., finger tapping, memory, walking). None of the studies evaluated the potential of smartphones for long-term, continuous monitoring in unsupervised environments. While smartphones represent a technology that subjects are familiar with, their dimensions and weight are not comparable to those of small inertial modules, which can even be embedded in smart clothing or accessories. Passive continuous monitoring through IMUs or smartphones (i.e., carried in the pocket) was not addressed and needs further evaluation.
Gait and finger tapping represent the most frequent activities in the studies analyzed. Other tasks and activities, such as the TUG test, balance tasks, and reaction time, were addressed less frequently and thus do not allow robust conclusions to be drawn. As previously discussed, gait and finger tapping provided similar results in terms of the discrimination of PwPD from HC and the assessment of therapy condition. These tasks were analysed both in supervised clinical settings and in unsupervised home environments. Despite the similar results, gait has the potential to provide a continuous, passive evaluation of disease severity, the presence of motor symptoms, and the effect of therapy. On the other hand, finger tapping is an active task and should be performed several times a day to accurately estimate motor fluctuations.
In general, this review emphasizes the potential of DB-MS-PD for PD diagnosis, prognosis, and monitoring. Data can be collected in unsupervised environments for long periods of time using widespread commercial devices such as smartphones. Individual features or a combination of multiple features and ML models can be used to detect symptoms, predict severity, and evaluate therapy response. The evidence is convincing, as large sample sizes, correct validation procedures, and robust methods were used. Ultimately, this review article is intended to provide the reader with a comprehensive set of information that demonstrates the potential of DB-MS-PD, critically discussing current limitations and providing recommendations for future work.

5.1. Challenges

The careful analysis of the articles included in the review highlights several challenges, which are reported below. The identification of these challenges can help design and conduct future studies related to digital biomarkers for motor symptoms assessment in PwPD.
Most studies focused on the diagnosis of PD, alone or in combination with other objectives (e.g., estimation of disease severity or treatment condition). The proper evaluation of digital tools for computerised early diagnosis should be conducted on newly diagnosed PwPD. However, some studies did not report any measure of disease duration and/or severity, which makes it difficult to assess the potential of the proposed solution. When reported, the average disease duration was in the range of 3.5-13 years and the H&Y stage was mostly equal to or greater than 2. This means that the recruited PwPD were mostly at an advanced stage of the disease, when motor symptoms were fully visible. This raises doubts about the usefulness of the solutions developed and their real potential for early diagnosis. It is essential to define appropriate standardised criteria that provide guidelines for the recruitment process and the reporting of participant characteristics. For example, in order to enable a proper evaluation of the PwPD, at least disease duration, H&Y stage and MDS-UPDRS scores must be reported. Strict inclusion criteria with regard to disease duration (e.g., less than 2 years) and H&Y stage (e.g., less than or equal to 1) must be adopted when evaluating digital systems for early diagnosis.
Regarding the reporting of the activities and tasks performed, in some cases the activity is described in detail, while in others it is described only in general terms. For example, the distance travelled, the gait trajectories to be followed, and the walking speeds are indicated in a limited number of studies. Similarly, the addressed tasks are sometimes referred to as MDS-UPDRS tasks, without a precise indication of the specific MDS-UPDRS item. A similar phenomenon is observed in the description of the positioning of wearable devices on the body. Some studies explicitly describe the position of the device through photographs or diagrams, while others merely refer to general anatomical regions, such as the upper limb or lower limb.
Regarding the evaluation of therapy condition (ON/OFF), half of the studies were conducted in clinical, supervised settings. This raises questions about the applicability of the developed digital tools to unsupervised remote environments. When at-home monitoring was performed, passive monitoring during activities of daily living lasted from 1 hour to 1 day [55,69]. This is a rather short period of time for fully evaluating the performance of automatic motor condition assessment. On the other hand, long-term monitoring of 2 weeks to 6 months [71,72] provides robust performance estimates. However, this was achieved using active tests, requiring patients to complete a scripted set of tasks. Ideally, passive monitoring over a long period of time would be desirable, so that the evolution of biomarkers over time can be followed without forcing patients to engage in sustained activities.
A comprehensive reporting of performance metrics is essential to fully evaluate the potential of the prediction system and to fairly compare similar studies. Focusing on diagnosis, almost half of the studies reported classification performance in terms of area under the curve (AUC). This allows the diagnostic ability of DB-MS-PD to be assessed regardless of the selected classification threshold and serves as a summary of the overall model performance. However, AUC was not reported in the other half of the studies, where accuracy or the combination of sensitivity and specificity were preferred. With regard to regression metrics, the performance evaluation is rather heterogeneous across studies. A substantial number of studies reported the correlation coefficient alone or a single error measure (i.e., MAE, RMSE). Furthermore, test-retest reliability was assessed in a single study using the intra-class correlation coefficient. Finally, inter-rater variability was not studied.
Overall, the heterogeneity of the performance metrics hinders a fair comparison with similar work and does not allow a comprehensive evaluation of performance. Again, guidelines are needed to suggest the minimum set of performance metrics to be reported. These may include at least the receiver operating characteristic (ROC) curve and AUC value for classification problems, and the correlation coefficient, the associated p-value and a measure of error (i.e., MAE, RMSE or MSE) for regression tasks. In the latter case, test-retest reliability and inter-rater variability are essential to assess the consistency of multiple measurements and of multiple clinical raters, respectively [75,76].
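A minimal reporting set along these lines could be computed as sketched below: ROC curve and AUC for classification, Pearson correlation with p-value plus MAE and RMSE for regression, and a two-way ICC for test-retest reliability. The synthetic data are placeholders, and the use of the pingouin package for the ICC is one option among several, not a tool prescribed by the reviewed studies.

import numpy as np
import pandas as pd
import pingouin as pg
from scipy import stats
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             roc_auc_score, roc_curve)

rng = np.random.default_rng(1)

# Classification (PwPD vs HC): report ROC curve and AUC
y_true = rng.integers(0, 2, 100)
y_score = y_true * 0.5 + rng.normal(0, 0.4, 100)   # hypothetical biomarker scores
fpr, tpr, _ = roc_curve(y_true, y_score)           # ROC points (for plotting)
print(f"AUC = {roc_auc_score(y_true, y_score):.2f}")

# Regression (e.g., predicted vs clinical motor score): r, p-value, MAE, RMSE
clinical = rng.normal(30, 10, 100)
predicted = clinical + rng.normal(0, 5, 100)
r, p = stats.pearsonr(predicted, clinical)
mae = mean_absolute_error(clinical, predicted)
rmse = np.sqrt(mean_squared_error(clinical, predicted))
print(f"r = {r:.2f} (p = {p:.1e}), MAE = {mae:.1f}, RMSE = {rmse:.1f}")

# Test-retest reliability: intra-class correlation over repeated sessions
long = pd.DataFrame({
    "subject": np.repeat(np.arange(30), 2),
    "session": np.tile(["test", "retest"], 30),
    "score": np.repeat(rng.normal(30, 10, 30), 2) + rng.normal(0, 2, 60),
})
icc = pg.intraclass_corr(data=long, targets="subject",
                         raters="session", ratings="score")
print(icc.loc[icc["Type"] == "ICC3", ["Type", "ICC", "CI95%"]])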
Above all, the definition of biomarkers is heterogeneous across the evaluated studies, with no common reporting structure. Some studies used single features as DB-MS-PD and assessed their classification or regression performance. Other studies extracted multiple features from the data and combined them using an ML model. In this case, if adequately described, the aggregation of features and the ML model can itself represent a DB-MS-PD. In most of these studies, the contribution of each feature to the final performance was assessed using regression metrics or statistical tests between groups of subjects, conditions, or treatments. This is useful, as it makes it possible to identify the most important features for the specific purpose.
Reproducibility is a key aspect in enabling repeatable experiments and ready-to-use solutions. However, methods and implementation details were not always exhaustively reported. Relevant information such as sensor characteristics and their specific positioning on the body, signal pre-processing and conditioning steps, and ML model parameters should be carefully described. In addition, most of the evaluated studies used proprietary datasets that were not made publicly available. Although the diversity between datasets provides new perspectives and enriches the current body of knowledge, proprietary datasets hinder the reproducibility of experiments and results and limit the advancement of scientific research.
Recent significant advances in wearable technology allow for a wide range of data measurement modalities. This is obviously a great advantage, as a large spectrum of physical and physiological parameters can be easily extracted from on-body sensors. However, it is worth considering that the number of devices and their wearability and comfort are of utmost importance when designing systems for long-term unsupervised monitoring. PwPD often suffer from non-motor symptoms such as sleep disorders, anxiety, and depression. Thus, cognitive load and patient compliance are essential aspects to consider when developing digital solutions.
Additionally, the sample size varies significantly across studies. Two clear trends are observed: 14 studies with a number of PwPD between 5 and 42, and 8 works with more than 80 participants (3 studies with more than 1000). Furthermore, studies tend to have a higher number of HC than PwPD (in the articles that clearly specify this, 77% of the total participants are HC). For a correct definition of DB-MS-PD, it would be desirable that these are derived from a significant proportion of PwPD records, close to 50% of the total. Another factor that should be taken into account is the gender balance of the participants, since, for both HC and PwPD, the percentage of male participants is high compared to that of female participants (71% HC, 65% PD).

5.2. Limitations of this Study

This review has some limitations. First, despite an extensive search across several databases, only 22 studies were finally included and fully evaluated. Considering the heterogeneity in sample size, objectives, methods and results, a direct and comprehensive comparison of similar studies is not always possible. This is particularly evident for studies focusing on activities other than gait and finger tapping, for which it is not possible to draw robust conclusions. In addition, the distribution of study objectives is very unbalanced, with most studies focusing on diagnosis and therapy condition estimation, while the severity of motor symptoms and the overall motor condition were poorly addressed. Finally, this review focused specifically on DB-MS-PD. This led to the consideration of motor symptoms such as gait impairment, postural instability, tremor, bradykinesia, and dyskinesia. Other motor symptoms such as rigidity, fatigue, and hypomimia were not addressed in the investigated studies.
Another limitation is replicability, as each study uses a different wearable device with its own sampling frequency and filtering techniques, not all of which are reported.
A high percentage of the assessed studies (84.8%, n=168 of the 198 articles screened) are from 2020 or later. Furthermore, the number of articles published each year has been increasing, suggesting that the definition of biomarkers using wearable technology is the subject of numerous ongoing research projects and that new results and DB-MS-PD will be published in the coming years. Therefore, this work can serve as a reference on current trends, and future literature reviews on this topic can expand on the information presented here and examine the definition of DB-MS-PD in further detail.
In addition, 6 of the 22 articles use public databases; therefore, the variability of the results is conditioned to a certain extent, since results obtained on the same databases tend to be similar. It would be interesting to extend the results obtained by using other databases or by proposing new activities or measures.

6. Conclusions

This article presents the biomarkers proposed for the assessment of motor symptoms related to PD, extracted through wearable sensors. As shown in the review, wearable devices have the potential to assess the motor symptoms of PD. A wide variety of devices and models have been shown to be able to extract useful information about a patient’s condition.
The use of these types of devices can bring many advantages over traditional clinical monitoring and assessment, as they can generate a high volume of data in a relatively short time, provide objective assessments of the patient’s actual condition, and capture aspects that traditionally could not be assessed, such as falls or freezing of gait. In addition, they introduce a largely unexploited possibility: monitoring a patient in the home environment without the pressure of feeling observed by the neurologist.
However, in order to develop remote patient monitoring systems, several issues identified in the review need to be addressed. One of the most interesting challenges may be the standardization of data collection, analysis, processing, evaluation, and reporting of DB-MS-PD, in order to facilitate the comparability and replicability of results. In the absence of general guidelines for experimental design, each trial may be conducted and managed in a different way. This idea of standardization has been pursued in other fields, such as neuroimaging in Alzheimer’s disease [77] or the use of public tools for database publication [78].
The review highlighted multiple issues and constraints within the included studies, offering suggestions for future research to overcome these limitations and improve PD assessment. Rapid progress in sensing and data analysis technologies is expected to significantly expedite the integration of wearable devices in this field. These DB-MS-PD can be used to obtain objective and measurable information on the status of certain disease symptoms and could be used in continuous patient monitoring systems.

Author Contributions

Conceptualization: C.PF, L.S., I.P.; Data curation: C.PF; Formal analysis: C.PF, L.S., L.B., I.P.; Funding acquisition: C.A., JM.L., G.A., I.P.; Investigation: C.PF, L.S., L.B., I.P.; Methodology: C.PF; Project administration: I.P.; Resources: C.A., JM.L., G.A., I.P.; Supervision: G.O., I.P.; Validation: C.PF; Visualization: C.PF, L.S., L.B., I.P.;Writing – original draft: C.PF, L.S., L.B., I.P.;Writing – review & editing: C.PF, L.S., L.B., G.O., C.A., JM.L., G.A., I.P.

Funding

This research has been possible thanks to the financing of the project BIOCLITE PID2021-123708OB-I00, funded by MCIN/AEI/10.13039/501100011033/FEDER, EU.

Acknowledgments

The authors acknowledge to the Instrumentation and Applied Acoustics Research Group (I2A2) at Universidad Politécnica de Madrid and to the Physical Education and Sports Science (PESS) department, the Health Research Institute (HRI), and the Data-Driven Computer Engineering (D2iCE) Group at University of Limerick.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Abbreviations

The following abbreviations are used in this manuscript:
DB-MS-PD Digital Biomarkers for Motor Symptoms of Parkinson’s Disease
PD Parkinson’s Disease
PwPD Patients with Parkinson’s Disease
HC Healthy Control
ML Machine Learning
DL Deep Learning
AUC Area Under the Curve
MAE Mean Absolute Error
IMU Inertial Measurement Unit
VGRF Vertical Ground Reaction Force
TUG Timed Up and Go
MDS-UPDRS Movement Disorder Society-Sponsored Revision of the Unified Parkinson’s Disease Rating Scale

References

  1. Goetz, C.G. The history of Parkinson’s disease: early clinical descriptions and neurological therapies. Cold Spring Harb Perspect Med 2011, 1, a008862. [Google Scholar] [CrossRef] [PubMed]
  2. Armstrong, M.J.; Okun, M.S. Diagnosis and Treatment of Parkinson Disease: A Review. JAMA 2020, 323, 548–560. [Google Scholar] [CrossRef] [PubMed]
  3. World Health Organization. Parkinson disease. https://www.who.int/news-room/fact-sheets/detail/parkinson-disease, 2022.
  4. Dorsey, E.R.; Sherer, T.; Okun, M.S.; Bloem, B.R. The Emerging Evidence of the Parkinson Pandemic. J Parkinsons Dis 2018, 8, S3–S8. [Google Scholar] [CrossRef]
  5. Sveinbjornsdottir, S. The clinical symptoms of Parkinson’s disease. Journal of Neurochemistry 2016, 139, 318–324. [Google Scholar] [CrossRef]
  6. Chaudhuri, K.R.; Healy, D.G.; Schapira, A.H. Non-motor symptoms of Parkinson’s disease: diagnosis and management. The Lancet Neurology 2006, 5, 235–245. [Google Scholar] [CrossRef]
  7. Tolosa, E.; Wenning, G.; Poewe, W. The diagnosis of Parkinson’s disease. The Lancet Neurology 2006, 5, 75–86. [Google Scholar] [CrossRef]
  8. Xia, R.; Mao, Z.H. Progression of motor symptoms in Parkinson’s disease. Neuroscience Bulletin 2012, 28, 39–48. [Google Scholar] [CrossRef]
  9. Zhao, N.; Yang, Y.; Zhang, L.; Zhang, Q.; Balbuena, L.; Ungvari, G.S.; Zang, Y.F.; Xiang, Y.T. Quality of life in Parkinson’s disease: A systematic review and meta-analysis of comparative studies. CNS Neurosci Ther 2021, 27, 270–279. [Google Scholar] [CrossRef] [PubMed]
  10. Jankovic, J. Parkinson’s disease: clinical features and diagnosis. J Neurol Neurosurg Psychiatry 2008, 79, 368–376. [Google Scholar] [CrossRef]
  11. Jankovic, J. The Evolution of Diagnosis in Early Parkinson Disease. Archives of Neurology 2000, 57, 369. [Google Scholar] [CrossRef]
  12. Singh, N.; Pillay, V.; Choonara, Y.E. Advances in the treatment of Parkinson’s disease. Progress in Neurobiology 2007, 81, 29–44. [Google Scholar] [CrossRef] [PubMed]
  13. Levodopa and the Progression of Parkinson’s Disease. New England Journal of Medicine 2004, 351, 2498–2508. [CrossRef] [PubMed]
  14. Jankovic, J. Motor fluctuations and dyskinesias in Parkinson’s disease: clinical manifestations. Movement disorders: official journal of the Movement Disorder Society 2005, 20, S11–S16. [Google Scholar] [CrossRef] [PubMed]
  15. Goetz, C.G.; Tilley, B.C.; Shaftman, S.R.; Stebbins, G.T.; Fahn, S.; et al. Movement Disorder Society-sponsored revision of the Unified Parkinson’s Disease Rating Scale (MDS-UPDRS): scale presentation and clinimetric testing results. Mov Disord 2008, 23, 2129–2170. [Google Scholar] [CrossRef]
  16. Schrag, A. How valid is the clinical diagnosis of Parkinson’s disease in the community? Journal of Neurology, Neurosurgery & Psychiatry 2002, 73, 529–534. [Google Scholar] [CrossRef]
  17. Albanese, A. Standard strategies for diagnosis and treatment of patients with newly diagnosed Parkinson disease: ITALY. Neurol Clin Pract 2013, 3, 476–477. [Google Scholar] [CrossRef]
  18. Davidson, M.B.; McGhee, D.J.; Counsell, C.E. Comparison of patient rated treatment response with measured improvement in Parkinson’s disease. J Neurol Neurosurg Psychiatry 2012, 83, 1001–1005. [Google Scholar] [CrossRef]
  19. Luis-Martínez, R.; Monje, M.H.G.; Antonini, A.; Sánchez-Ferro, Á.; Mestre, T.A. Technology-Enabled Care: Integrating Multidisciplinary Care in Parkinson’s Disease Through Digital Technology. Front Neurol 2020, 11, 575975. [Google Scholar] [CrossRef] [PubMed]
  20. Sigcha, L.; Borzì, L.; Amato, F.; Rechichi, I.; Ramos-Romero, C.; Cárdenas, A.; Gascó, L.; Olmo, G. Deep learning and wearable sensors for the diagnosis and monitoring of Parkinson’s disease: a systematic review. Expert Systems with Applications 2023, 120541. [Google Scholar] [CrossRef]
  21. Shah, V.V.; McNames, J.; Mancini, M.; Carlson-Kuhta, P.; Nutt, J.G.; El-Gohary, M.; Lapidus, J.A.; Horak, F.B.; Curtze, C. Digital biomarkers of mobility in Parkinson’s disease during daily living. Journal of Parkinson’s disease 2020, 10, 1099–1111. [Google Scholar] [CrossRef]
  22. Fröhlich, H.; Bontridder, N.; Petrovska-Delacréta, D.; Glaab, E.; et al. Leveraging the Potential of Digital Technology for Better Individualized Treatment of Parkinson’s Disease. Frontiers in Neurology 2022, 13. [Google Scholar] [CrossRef] [PubMed]
  23. Chudzik, A.; Śledzianowski, A.; Przybyszewski, A.W. Machine Learning and Digital Biomarkers Can Detect Early Stages of Neurodegenerative Diseases. Sensors 2024, 24. [Google Scholar] [CrossRef] [PubMed]
  24. Bonato, P. Wearable sensors and systems. From enabling technology to clinical applications. IEEE Eng Med Biol Mag 2010, 29, 25–36. [Google Scholar] [CrossRef]
  25. Ometov, A.; Shubina, V.; Klus, L.; Skibińska, J.; Saafi, S.; Pascacio, P.; Flueratoru, L.; Gaibor, D.Q.; Chukhno, N.; Chukhno, O.; et al. A Survey on Wearable Technology: History, State-of-the-Art and Current Challenges. Computer Networks 2021, 193, 108074. [Google Scholar] [CrossRef]
  26. Iqbal, S.M.A.; Mahgoub, I.; Du, E.; Leavitt, M.A.; Asghar, W. Advances in healthcare wearable devices. npj Flexible Electronics 2021, 5. [Google Scholar] [CrossRef]
  27. Dunn, J.; Runge, R.; Snyder, M. Wearables and the medical revolution. Per Med 2018, 15, 429–448. [Google Scholar] [CrossRef]
  28. Martin, T.; Healey, J. 2006’s Wearable Computing Advances and Fashions. IEEE Pervasive Computing 2007, 6, 14–16. [Google Scholar] [CrossRef]
  29. Jiang, W.; Yin, Z. Human Activity Recognition Using Wearable Sensors by Deep Convolutional Neural Networks. 2015, pp. 1307–1310.
  30. Borzì, L.; Sigcha, L.; Olmo, G. Context Recognition Algorithms for Energy-Efficient Freezing-of-Gait Detection in Parkinsons Disease. Sensors 2023, 23. [Google Scholar] [CrossRef] [PubMed]
  31. Son, D.; Lee, J.; Qiao, S.; Ghaffari, R.; Kim, J.; Lee, J.E.; Song, C.; Kim, S.J.; Lee, D.J.; Jun, S.W.; et al. Multifunctional wearable devices for diagnosis and therapy of movement disorders. Nature Nanotechnology 2014, 9, 397–404. [Google Scholar] [CrossRef]
  32. Din, S.D.; Godfrey, A.; Mazzà, C.; Lord, S.; Rochester, L. Free-living monitoring of Parkinson’s disease: Lessons from the field. Movement Disorders 2016, 31, 1293–1313. [Google Scholar] [CrossRef]
  33. Wen, D.; Zhang, X.; Liu, X.; Lei, J. Evaluating the Consistency of Current Mainstream Wearable Devices in Health Monitoring: A Comparison Under Free-Living Conditions. Journal of Medical Internet Research 2017, 19, e68. [Google Scholar] [CrossRef] [PubMed]
  34. Guk, K.; Han, G.; Lim, J.; Jeong, K.; Kang, T.; Lim, E.K.; Jung, J. Evolution of Wearable Devices with Real-Time Disease Monitoring for Personalized Healthcare. Nanomaterials 2019, 9, 813. [Google Scholar] [CrossRef] [PubMed]
  35. Erdmier, C.; Hatcher, J.; Lee, M. Wearable device implications in the healthcare industry. Journal of Medical Engineering & Technology 2016, 40, 141–148. [Google Scholar] [CrossRef]
  36. Patel, M.S.; Asch, D.A.; Volpp, K.G. Wearable Devices as Facilitators, Not Drivers, of Health Behavior Change. JAMA 2015, 313, 459. [Google Scholar] [CrossRef]
  37. Moreau, C.; Rouaud, T.; Grabli, D.; Benatru, I.; Remy, P.; Marques, A.R.; Drapier, S.; Mariani, L.L.; Roze, E.; Devos, D.; et al. Overview on wearable sensors for the management of Parkinson’s disease. npj Parkinson’s Disease 2023, 9. [Google Scholar] [CrossRef] [PubMed]
  38. Tam, W.; Alajlani, M.; Abd-alrazaq, A. An Exploration of Wearable Device Features Used in UK Hospital Parkinson Disease Care: Scoping Review. Journal of Medical Internet Research 2023, 25, e42950. [Google Scholar] [CrossRef]
  39. Rovini, E.; Maremmani, C.; Cavallo, F. How Wearable Sensors Can Support Parkinson’s Disease Diagnosis and Treatment: A Systematic Review. Front Neurosci 2017, 11, 555. [Google Scholar] [CrossRef]
  40. Del Din, S.; Kirk, C.; Yarnall, A.J.; Rochester, L.; Hausdorff, J.M. Body-Worn Sensors for Remote Monitoring of Parkinson’s Disease Motor Symptoms: Vision, State of the Art, and Challenges Ahead. J Parkinsons Dis 2021, 11, S35–S47. [Google Scholar] [CrossRef]
  41. Strimbu, K.; Tavel, J.A. What are biomarkers? Curr Opin HIV AIDS 2010, 5, 463–466. [Google Scholar] [CrossRef]
  42. Biomarkers Definitions Working Group. Biomarkers and surrogate endpoints: preferred definitions and conceptual framework. Clin Pharmacol Ther 2001, 69, 89–95. [CrossRef]
  43. Park, J.E.; Gunasekaran, T.I.; Cho, Y.H.; Choi, S.M.; Song, M.K.; Cho, S.H.; Kim, J.; Song, H.C.; Choi, K.Y.; Lee, J.J.; et al. Diagnostic Blood Biomarkers in Alzheimer’s Disease. Biomedicines 2022, 10. [Google Scholar] [CrossRef] [PubMed]
  44. Patient-Focused Drug Development: Collecting Comprehensive and Representative Input. NPJ Digit Med 2020, 46.
  45. Vasudevan, S.; Saha, A.; Tarver, M.E.; Patel, B. Digital biomarkers: Convergence of digital health technologies and biomarkers. NPJ Digit Med 2022, 5, 36. [Google Scholar] [CrossRef] [PubMed]
  46. Manta, C.; Patrick-Lake, B.; Goldsack, J.C. Digital Measures That Matter to Patients: A Framework to Guide the Selection and Development of Digital Measures of Health. Digit Biomark 2020, 4, 69–77. [Google Scholar] [CrossRef] [PubMed]
  47. Insel, T.R. Digital Phenotyping: Technology for a New Science of Behavior. JAMA 2017, 318, 1215–1216. [Google Scholar] [CrossRef]
  48. Babrak, L.M.; Menetski, J.; Rebhan, M.; Nisato, G.; Zinggeler, M.; Brasier, N.; Baerenfaller, K.; Brenzikofer, T.; Baltzer, L.; Vogler, C.; et al. Traditional and Digital Biomarkers: Two Worlds Apart? Digit Biomark 2019, 3, 92–102. [Google Scholar] [CrossRef]
  49. Califf, R.M. Biomarker definitions and their applications. Exp Biol Med (Maywood) 2018, 243, 213–221. [Google Scholar] [CrossRef]
  50. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. International Journal of Surgery 2010, 8, 336–341. [Google Scholar] [CrossRef]
  51. Hao, T.; Yamada, Y.; Rogers, J.L.; Shinkawa, K.; Nemoto, M.; Nemoto, K.; Arai, T. An Automated Digital Biomarker of Mobility. In Proceedings of the 2023 IEEE International Conference on Digital Health (ICDH); IEEE, 2023. [Google Scholar] [CrossRef]
  52. Shah, V.V.; McNames, J.; Mancini, M.; Carlson-Kuhta, P.; Nutt, J.G.; El-Gohary, M.; Lapidus, J.A.; Horak, F.B.; Curtze, C. Digital Biomarkers of Mobility in Parkinson’s Disease During Daily Living. Journal of Parkinson’s Disease 2020, 10, 1099–1111. [Google Scholar] [CrossRef]
  53. ZhuParris, A.; Thijssen, E.; Elzinga, W.O.; Makai-Bölöni, S.; Kraaij, W.; Groeneveld, G.J.; Doll, R.J. Treatment Detection and Movement Disorder Society-Unified Parkinson’s Disease Rating Scale, Part III Estimation Using Finger Tapping Tasks. Movement Disorders 2023, 38, 1795–1805. [Google Scholar] [CrossRef]
  54. Shah, V.V.; McNames, J.; Harker, G.; Mancini, M.; Carlson-Kuhta, P.; Nutt, J.G.; El-Gohary, M.; Curtze, C.; Horak, F.B. Effect of Bout Length on Gait Measures in People with and without Parkinson’s Disease during Daily Life. Sensors 2020, 20, 5769. [Google Scholar] [CrossRef] [PubMed]
  55. Atrsaei, A.; Corrà, M.F.; Dadashi, F.; Vila-Chã, N.; Maia, L.; Mariani, B.; Maetzler, W.; Aminian, K. Gait speed in clinical and daily living assessments in Parkinson’s disease patients: performance versus capacity. npj Parkinson’s Disease 2021, 7. [Google Scholar] [CrossRef] [PubMed]
  56. Coates, L.; Shi, J.; Rochester, L.; Del Din, S.; Pantall, A. Entropy of Real-World Gait in Parkinson’s Disease Determined from Wearable Sensors as a Digital Marker of Altered Ambulatory Behavior. Sensors 2020, 20, 2631. [Google Scholar] [CrossRef]
  57. Deng, K.; Li, Y.; Zhang, H.; Wang, J.; Albin, R.L.; Guan, Y. Heterogeneous digital biomarker integration out-performs patient self-reports in predicting Parkinson’s disease. Communications Biology 2022, 5. [Google Scholar] [CrossRef] [PubMed]
  58. Vidya, B.; P, S. Gait based Parkinson’s disease diagnosis and severity rating using multi-class support vector machine. Applied Soft Computing 2021, 113, 107939. [Google Scholar] [CrossRef]
  59. Khera, P.; Kumar, N. Novel machine learning-based hybrid strategy for severity assessment of Parkinson’s disorders. Medical & Biological Engineering & Computing 2022, 60, 811–828. [Google Scholar] [CrossRef]
  60. Goni, M.; Eickhoff, S.B.; Far, M.S.; Patil, K.R.; Dukart, J. Smartphone-Based Digital Biomarkers for Parkinson’s Disease in a Remotely-Administered Setting. IEEE Access 2022, 10, 28361–28384. [Google Scholar] [CrossRef]
  61. Wissel, B.D.; Mitsi, G.; Dwivedi, A.K.; Papapetropoulos, S.; Larkin, S.; López Castellanos, J.R.; Shanks, E.; Duker, A.P.; Rodriguez-Porcel, F.; Vaughan, J.E.; et al. Tablet-Based Application for Objective Measurement of Motor Fluctuations in Parkinson Disease. Digital Biomarkers 2018, 1, 126–135. [Google Scholar] [CrossRef]
  62. Di Lazzaro, G.; Ricci, M.; Saggio, G.; Costantini, G.; Schirinzi, T.; Alwardat, M.; Pietrosanti, L.; Patera, M.; Scalise, S.; Giannini, F.; et al. Technology-based therapy-response and prognostic biomarkers in a prospective study of a de novo Parkinson’s disease cohort. npj Parkinson’s Disease 2021, 7. [Google Scholar] [CrossRef]
  63. Arora, S.; Baig, F.; Lo, C.; Barber, T.R.; Lawton, M.A.; Zhan, A.; Rolinski, M.; Ruffmann, C.; Klein, J.C.; Rumbold, J.; et al. Smartphone motor testing to distinguish idiopathic REM sleep behavior disorder, controls, and PD. Neurology 2018, 91. [Google Scholar] [CrossRef]
  64. Greene, B.R.; Premoli, I.; McManus, K.; McGrath, D.; Caulfield, B. Predicting Fall Counts Using Wearable Sensors: A Novel Digital Biomarker for Parkinson’s Disease. Sensors 2021, 22, 54. [Google Scholar] [CrossRef] [PubMed]
  65. Rehman, R.Z.U.; Buckley, C.; Mico-Amigo, M.E.; Kirk, C.; Dunne-Willows, M.; Mazza, C.; Shi, J.Q.; Alcock, L.; Rochester, L.; Del Din, S. Accelerometry-Based Digital Gait Characteristics for Classification of Parkinson’s Disease: What Counts? IEEE Open Journal of Engineering in Medicine and Biology 2020, 1, 65–73. [Google Scholar] [CrossRef] [PubMed]
  66. Sharma, M.; Mishra, R.k.; Hall, A.J.; Casado, J.; Cole, R.; Nunes, A.S.; Barchard, G.; Vaziri, A.; Pantelyat, A.; Wills, A.M. Remote at-home wearable-based gait assessments in Progressive Supranuclear Palsy compared to Parkinson’s Disease. BMC Neurology 2023, 23. [Google Scholar] [CrossRef] [PubMed]
  67. Mahadevan, N.; Demanuele, C.; Zhang, H.; Volfson, D.; Ho, B.; Erb, M.K.; Patel, S. Development of digital biomarkers for resting tremor and bradykinesia using a wrist-worn wearable device. NPJ digital medicine 2020, 3, 1–12. [Google Scholar] [CrossRef]
  68. Gonçalves, H.R.; Branquinho, A.; Pinto, J.; Rodrigues, A.M.; Santos, C.P. Digital biomarkers of mobility and quality of life in Parkinson’s disease based on a wearable motion analysis LAB. Computer Methods and Programs in Biomedicine 2024, 244, 107967. [Google Scholar] [CrossRef]
  69. Evers, L.J.; Raykov, Y.P.; Krijthe, J.H.; Silva de Lima, A.L.; Badawy, R.; Claes, K.; Heskes, T.M.; Little, M.A.; Meinders, M.J.; Bloem, B.R. Real-Life Gait Performance as a Digital Biomarker for Motor Fluctuations: The Parkinson@Home Validation Study. Journal of Medical Internet Research 2020, 22, e19068. [Google Scholar] [CrossRef]
  70. Tsoulos, I.G.; Mitsi, G.; Stavrakoudis, A.; Papapetropoulos, S. Application of Machine Learning in a Parkinson’s Disease Digital Biomarker Dataset Using Neural Network Construction (NNC) Methodology Discriminates Patient Motor Status. Frontiers in ICT 2019, 6. [Google Scholar] [CrossRef]
  71. Lipsmeier, F.; Taylor, K.I.; Kilchenmann, T.; Wolf, D.; Scotland, A.; Schjodt-Eriksen, J.; Cheng, W.; Fernandez-Garcia, I.; Siebourg-Polster, J.; Jin, L.; et al. Evaluation of smartphone-based testing to generate exploratory outcome measures in a phase 1 Parkinson’s disease clinical trial. Movement Disorders 2018, 33, 1287–1297. [Google Scholar] [CrossRef]
  72. Sahandi Far, M.; Eickhoff, S.B.; Goni, M.; Dukart, J. Exploring Test-Retest Reliability and Longitudinal Stability of Digital Biomarkers for Parkinson Disease in the m-Power Data Set: Cohort Study. Journal of Medical Internet Research 2021, 23, e26608. [Google Scholar] [CrossRef]
  73. mPower Public Researcher Portal. Mobile Parkinson Disease Study, 2015. [CrossRef]
  74. Hausdorff, J.M. Gait in Parkinson’s Disease, 2008.
  75. Adams, J.; Kangarloo, T.; Gong, Y.; et al. Using a smartwatch and smartphone to assess early Parkinson’s disease in the WATCH-PD study over 12 months. npj Parkinson’s Disease 2024, 10, 112. [Google Scholar] [CrossRef]
  76. Borzì, L.; Varrecchia, M.; Sibille, S.; Olmo, G.; Artusi, C.A.; Fabbri, M.; Rizzone, M.G.; Romagnolo, A.; Zibetti, M.; Lopiano, L. Smartphone-Based Estimation of Item 3.8 of the MDS-UPDRS-III for Assessing Leg Agility in People With Parkinson’s Disease. IEEE Open Journal of Engineering in Medicine and Biology 2020, 1, 140–147. [Google Scholar] [CrossRef] [PubMed]
  77. Jack, C.R.; Bernstein, M.A.; Fox, N.C.; Thompson, P.; Alexander, G.; Harvey, D.; Borowski, B.; Britson, P.J.; L. Whitwell, J.; Ward, C.; et al. The Alzheimer’s disease neuroimaging initiative (ADNI): MRI methods. Journal of Magnetic Resonance Imaging 2008, 27, 685–691. [Google Scholar] [CrossRef] [PubMed]
  78. Siirtola, P.; Koskimäki, H.; Röning, J. OpenHAR: A Matlab Toolbox for Easy Access to Publicly Open Human Activity Data Sets. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers (UbiComp ’18); ACM, 2018. [CrossRef]
Figure 1. PRISMA flow diagram of literature search and selection process showing the number of studies identified, screened, and included in the review.
Figure 2. Distribution of the activities performed in the different studies.
Figure 3. Distribution of the number of articles according to the number of participants.
Figure 4. Gender distribution in the selected papers.
Figure 5. Distribution of types of devices and sensors used.
Figure 6. Number of devices and sensors used.
Figure 7. Distribution of the location of the sensor on the human body.
Figure 8. Principal aims of the selected studies.
Figure 9. Distribution of the aims addressed in the selected studies.
Figure 10. Biomarker extraction methods. ML: machine learning; DL: deep learning.
Figure 11. Most relevant tasks in studies addressing multiple tasks. TUG: timed up and go.
Figure 12. Most relevant features in the finger tapping task.
Figure 13. Most relevant features in the gait task.
Table 1. Summary of included papers.
Reference | Study design | Participants | Device, sensors, N (devices, sensors), body location | Aim | End point
[51] | Gait (PhysioNet database) | PwPD: N = 90 (34F; 56M); HC: N = 62 (34F; 28M) | Pressure insoles, VGRF sensors, N = (1,16), foot (8 each) | Gait monitoring | Gait features that impact the predicted TUG scores are gait speed-based features (percentiles, mean, and kurtosis), with 84.8% accuracy.
[52] | Gait | PwPD: N = 29 (12F; 17M); HC: N = 27 (14F; 13M) | IMU (Opals by APDM), 3-axial accelerometer, 3-axial gyroscope and 3-axial magnetometer, N = (3,3), foot (1 each) and lower back | Classification PwPD-HC | Turning and gait indicators discriminate PwPD from HC (turn angle, swing time variability and stride length, with AUC = 0.87–0.89).
[53] | Finger tapping: index and middle finger tapping (IMFT), alternate index finger tapping (IFT), thumb-index finger tapping (TIFT) | PwPD: N = 20 (6F; 14M) | Tablet (IMFT and IFT) and Biometrics (TIFT), pixel coordinates (IMFT and IFT) and goniometer (TIFT), N = (2,2), front of participant (IMFT and IFT) and hand (TIFT) | Therapy response monitoring and classification of subjects with therapy and placebo | The IFT features (total taps, bivariate contour ellipse area, spatial error, velocity changes, inter-tap intervals) provide the best performance in estimating the MDS-UPDRS III, with p < 0.001.
[54] | Gait | PwPD: N = 29 (12F; 17M); HC: N = 20 (8F; 2M) | IMU (Opals by APDM), 3-axial accelerometer, 3-axial gyroscope and 3-axial magnetometer, N = (3,3), foot (1 each) and lower back | Classification PwPD-HC | Gait measures (gait speed, stride length) could be used to classify PwPD vs. HC, with AUC > 0.8.
[55] | Gait; activities of daily living | PwPD: N = 27 (11F; 16M) | IMU (RehaGait) (clinical assessment) and IMU (Physilog® 5) (home assessment); 3-axial accelerometer and 3-axial gyroscope (clinical assessment), and 3-axial accelerometer, 3-axial gyroscope and barometer (home assessment); N = (3,3); foot (1 each in clinical assessment, only 1 in home assessment) | Gait monitoring and treatment detection | Gait speed could be used to monitor medication intake in PD.
[56] | Gait | PwPD: N = 5; HC: N = 5 | IMU (Axivity AX3), 3-axial accelerometer, N = (1,1), lower back | Classification PwPD-HC | The sample entropy of the gait signal is higher in PwPD than in HC participants.
[57] | Gait, balance task, finger tapping (mPower database) | PwPD: N = 1057 (359F; 698M); HC: N = 5343 (1014F; 4329M) | Smartphone, 3-axial accelerometer (gait and balance) and pixel coordinates (tapping), N = (1,2), pocket (gait and balance) and front of participant (tapping) | Classification PwPD-HC | Tapping positions (centered tapping coordinates) are the most relevant data (AUC = 0.935) for PD detection.
[58] | Gait (PhysioNet database) | PwPD: N = 93 (35F; 58M); HC: N = 73 (33F; 40M) | Pressure insoles, VGRF sensors, N = (1,16), foot (8 each) | Gait monitoring and classification PwPD-HC | Gait parameters (stride time, step time, stance time, swing time, cadence, step length, stride length, gait speed) differentiate PD severity and HC with 98.65% accuracy.
[59] | Gait (PhysioNet database) | PwPD: N = 93 (35F; 58M); HC: N = 72 (32F; 40M) | Pressure insoles, VGRF sensors, N = (1,16), foot (8 each) | Gait monitoring and classification PwPD-HC | Gait parameters (step length, force variations at heel strike, centre of pressure variability, swing-stance ratio, and double support phase) detect PwPD with 99.9% accuracy, and severity estimation yields R² = 98.7%.
[60] | Gait, balance task, finger tapping (mPower database) | PwPD: N = 610 (211F; 399M) (gait), 612 (211F; 401M) (balance), 970 (340F; 630M) (tapping); HC: N = 787 (147F; 640M) (gait), 803 (150F; 653M) (balance), 1257 (239F; 1018M) (tapping) | Smartphone, 3-axial accelerometer (gait and balance) and pixel coordinates (tapping), N = (1,2), pocket (gait and balance) and front of participant (tapping) | Classification PwPD-HC | Tapping features (inter-tap interval range, maximum value and Teager-Kaiser energy operator) detect PwPD with AUC = 0.74.
[61] | Finger tapping, pronation-supination | PwPD: N = 11 (3F; 8M); HC: N = 11 (6F; 5M) | Smartphone, pixel coordinates, N = (1,1), front of participant | Classification PwPD-HC and ON-OFF state monitoring | Tapping features (total taps, tap interval, and tap accuracy) can detect PwPD with p < 0.0005 and detect the ON/OFF state with AUC = 0.82.
[62] | Pronation-supination, leg agility, toe tapping, TUG test, postural stability, postural tremor, rest tremor | PwPD: N = 36 (9F; 27M) | IMU (Movit G1), 3-axial accelerometer and 3-axial gyroscope, N = (14,2), lower back, upper back, forearm (1 each), arm (1 each), upper leg (1 each), lower leg (1 each), hand (1 each), foot (1 each) | Prognosis (motor symptoms) and therapy response monitoring | A correlation was found between motor symptom progression and some features (toe tapping amplitude decrement, velocity of arms and legs, sit-to-stand time; p < 0.01).
[63] | Balance task, gait, finger tapping, reaction time, rest tremor, postural tremor | PwPD: N = 334 (125F; 209M); HC: N = 84 (17F; 67M); iRBD (idiopathic REM sleep behavior disorder): N = 104 (88F; 16M) | Smartphone, 3-axial accelerometer (balance, gait, rest tremor and postural tremor) and pixel coordinates (tapping and reaction time), N = (1,2), pocket (balance and gait), front of participant (tapping and reaction time) and hand (postural and rest tremor) | Classification PwPD-HC and classification PwPD-iRBD | Postural tremor (mean squared energy, azimuth, 25th quartile, mode, radius) and rest tremor (entropy, root mean square) were the most discriminative tasks between PD, HC and iRBD, with 85–88% sensitivity.
[64] | TUG test | PwPD dataset 1: N = 15 (5F; 10M); PwPD dataset 2: N = 27 (9F; 17M); HC: N = 1015 (671F; 344M) | IMU (Kinesis QTUG), 3-axial accelerometer and 3-axial gyroscope, N = (1,2), shin | Fall risk prognosis and gait monitoring | Mobility parameters (speed, turn, transfers, symmetry, variability) could be used to predict the number of falls in PwPD (R² = 43%).
[65] | Gait | PwPD: N = 81 (28F; 53M); HC: N = 61 (27F; 34M) | IMU (Axivity AX3), 3-axial accelerometer, N = (1,1), lower back | Classification PwPD-HC | Gait features (root mean square values, power spectral density, gait speed, step length, step time and age) classify PwPD with AUC = 0.94.
[66] | Gait, TUG test, sit-to-stand test | PwPD: N = 10 (4F; 6M); PSP (progressive supranuclear palsy): N = 10 (4F; 6M) | IMU (LEGSys), 3-axial accelerometer, 3-axial gyroscope, 3-axial magnetometer, N = (3,3), shin (1 each) and lower back | Classification PwPD-PSP | Gait speed was significantly slower in PSP (p < 0.001).
[67] | Activities of daily living, MDS-UPDRS tasks | PwPD: N = 31 (11F; 20M); HC: N = 50 (27F; 23M) | IMU (Opals by APDM), 3-axial accelerometer, 3-axial gyroscope and 3-axial magnetometer, N = (1,3), wrist | Motor symptom monitoring; therapy response monitoring | RMS (amplitude) of the magnitude vector for resting tremor (p < 0.0004) and RMS (amplitude) and jerk (smoothness) of the magnitude vector for bradykinesia (p < 0.0001) agree with clinical assessment of symptom severity and treatment-related changes in motor states (an illustrative computation of such features is sketched after this table).
[68] | Gait | PwPD: N = 40 (19F; 21M) | IMU (+sMotion), 3-axial accelerometer and 3-axial gyroscope, N = (1,2), lower back | Classification of motor condition and quality of life | Gait features (velocity pace, SD of swing time variability, antero-posterior center-of-mass angle of postural control) classify UPDRS-III severity with p < 0.001; gait features (gait speed, step time rhythm, stance time, step length) correlate with the PDQ-39 with p < 0.001.
[69] | Activities of daily living, TUG test, Abnormal Involuntary Movement Scale, MDS-UPDRS tasks, gait | PwPD: N = 18 (7F; 11M); HC: N = 24 (11F; 13M) | IMU (Physilog® 4), Android smartwatch, Android smartphone, Empatica E4 smartwatch; 3-axial accelerometer, 3-axial gyroscope, 3-axial magnetometer and barometer (IMU), 3-axial accelerometer, 3-axial gyroscope, barometer and light (Android smartwatch), 3-axial accelerometer, 3-axial magnetometer, light, proximity, GPS, WiFi and cellular networks (Android smartphone), and galvanic skin response, photoplethysmogram, skin temperature and 3-axial accelerometer (Empatica); N = (8,12); ankles (1 each), wrists (1 each), lower back (IMU), wrist (Android smartwatch), pocket (Android smartphone) and wrist (Empatica) | Classification PwPD-HC; ON-OFF state monitoring | The total power in the 0.5–10 Hz band was the most discriminative feature for classifying PwPD vs. HC (AUC = 0.76) and for ON-OFF detection (AUC = 0.84).
[70] | Finger tapping (two-target finger tapping test), reaction time, pronation-supination | PwPD: N = 19; HC: N = 17 | Tablet, pixel coordinates, N = (1,1), front of participant | Classification PwPD-HC and ON-OFF state monitoring | All tests combined classify PwPD vs. HC with 93.11% accuracy; the most discriminative test is reaction time (inter-tap interval, tap accuracy), with 83.90% accuracy; the ON-OFF state is classified with 76.50% accuracy.
[71] | Activities of daily living, rest tremor, postural tremor, finger tapping, balance task, gait | PwPD: N = 43 (8F; 35M); HC: N = 35 (8F; 27M) | Smartphone, 3-axial accelerometer, 3-axial gyroscope and 3-axial magnetometer, N = (1,3), waist (balance and gait), hand (tremor) and front of participant (tapping) | Classification PwPD-HC and motor symptom monitoring | Tapping (inter-tap variability), rest tremor (acceleration skewness), postural tremor (total power of the accelerometer), balance (mean velocity) and gait (turn speed) differentiated HC from PwPD and PD abnormalities (p < 0.005).
[72] | Gait, balance task, finger tapping (mPower database) | PwPD: N = 610 (211F; 399M) (gait), 612 (211F; 401M) (balance), 970 (340F; 630M) (tapping); HC: N = 807 (152F; 655M) (gait), 823 (155F; 668M) (balance), 1674 (304F; 1370M) (tapping) | Smartphone, 3-axial accelerometer (gait and balance) and pixel coordinates (tapping), N = (1,2), pocket (gait and balance) and front of participant (tapping) | Classification PwPD-HC and therapy response monitoring | Tapping features (total taps, inter-tap intervals, median/standard deviation absolute deviations, correlation of X-Y tap positions) displayed the best performance in classifying PwPD vs. HC (p < 0.05).
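Several of the end points in Table 1 are built from simple signal-level quantities, such as the RMS amplitude and jerk of the accelerometer magnitude vector ([67]) or the total spectral power in the 0.5–10 Hz band ([69]). The following minimal Python sketch illustrates how such features can be computed from a 3-axial accelerometer recording; the 100 Hz sampling rate, function names, and synthetic input are illustrative assumptions and do not reproduce the processing pipelines of the reviewed studies.

import numpy as np
from scipy.signal import welch

FS = 100  # assumed sampling rate (Hz); the reviewed studies use various rates

def magnitude(acc_xyz):
    """Magnitude vector of a 3-axial accelerometer signal (N x 3 array)."""
    return np.linalg.norm(acc_xyz, axis=1)

def rms_amplitude(signal):
    """Root-mean-square amplitude, a common proxy for tremor amplitude."""
    return np.sqrt(np.mean(np.square(signal)))

def jerk_rms(signal, fs=FS):
    """RMS of the first derivative (jerk), a simple smoothness proxy."""
    return np.sqrt(np.mean(np.square(np.diff(signal) * fs)))

def band_power(signal, fs=FS, fmin=0.5, fmax=10.0):
    """Total spectral power in the fmin-fmax band (Welch periodogram)."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 4 * fs))
    mask = (freqs >= fmin) & (freqs <= fmax)
    return float(np.sum(psd[mask]) * (freqs[1] - freqs[0]))

# Synthetic 10-s recording standing in for wrist-worn accelerometer data
rng = np.random.default_rng(0)
acc = rng.normal(0.0, 0.1, size=(10 * FS, 3))
mag = magnitude(acc)
print({"rms": rms_amplitude(mag), "jerk_rms": jerk_rms(mag), "power_0.5_10Hz": band_power(mag)})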
Table 2. Performance comparison of tasks for disease diagnosis, prognosis, and monitoring (an illustrative AUC computation is sketched after the table). Acc: accuracy; r: Pearson’s correlation coefficient; MAE: mean absolute error.
Task | Diagnosis | Treatment | Severity | UPDRS-III
Finger tapping | AUC 0.74–0.95 | Acc 0.75–0.84 | - | r = 0.51–0.69, MAE = 8
Gait | AUC 0.76–0.98 | AUC 0.82 | AUC 0.85–0.98 | -
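The diagnostic AUC values summarised above are typically obtained by training a classifier on the extracted features and evaluating it with cross-validation. The sketch below shows that evaluation step on synthetic data using a scikit-learn logistic regression; it is an illustrative example under assumed placeholder features and labels, not the pipeline of any specific study reviewed here.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder feature matrix (e.g., gait or tapping features) and labels (1 = PwPD, 0 = HC)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = rng.integers(0, 2, size=200)

# Cross-validated class probabilities, scored with the area under the ROC curve
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
probs = cross_val_predict(model, X, y, cv=cv, method="predict_proba")[:, 1]
print(f"Cross-validated AUC: {roc_auc_score(y, probs):.2f}")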
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.