Results 1 - 4 of 4
1.
J Biomech; 166: 112049, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38493576

ABSTRACT

Markerless motion capture has recently attracted significant interest in clinical gait analysis and human movement science. Its ease of use and ability to streamline motion capture recordings make it well suited for out-of-the-laboratory measurements in large cohorts. While previous studies have shown that markerless systems can achieve acceptable accuracy and reliability for kinematic parameters of gait, they also noted higher inter-trial variability of markerless data. Since increased inter-trial variability can have important implications for data post-processing and analysis, this study compared the inter-trial variability of simultaneously recorded markerless and marker-based data. For this purpose, we used data from 18 healthy volunteers who were instructed to simulate four different gait patterns: physiological, crouch, circumduction, and equinus gait. Gait analysis was performed using the smartphone-based markerless system OpenCap and a marker-based motion capture system. We compared the inter-trial variability of both systems and also evaluated whether changes in inter-trial variability depend on the analyzed gait pattern. Compared to the marker-based data, we observed an increase in inter-trial variability for the markerless system ranging from 6.6% to 22.0% across the different gait patterns. Our findings demonstrate that markerless pose estimation pipelines can introduce additional variability in the kinematic data across different gait patterns and levels of natural variability. We recommend using averaged waveforms rather than single ones to mitigate this problem. Further, caution is advised when using variability-based metrics in gait and human movement analysis based on markerless data, as increased inter-trial variability can lead to misleading results.
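A minimal Python sketch of how such an inter-trial variability comparison could be set up is shown below; the 101-point time normalization, the mean point-wise standard deviation as the variability measure, and the synthetic data are assumptions for illustration and not the study's actual processing pipeline.

    import numpy as np

    def inter_trial_variability(trials: np.ndarray) -> float:
        # trials: (n_trials, 101) array, one time-normalized joint angle
        # waveform (101 points of the gait cycle) per trial.
        # Inter-trial variability is taken here as the mean point-wise
        # standard deviation across trials.
        return float(np.mean(np.std(trials, axis=0, ddof=1)))

    # Illustrative comparison on synthetic data standing in for one joint angle.
    rng = np.random.default_rng(0)
    marker_based = rng.normal(0.0, 1.0, size=(5, 101))
    markerless = marker_based + rng.normal(0.0, 0.3, size=(5, 101))

    itv_mb = inter_trial_variability(marker_based)
    itv_ml = inter_trial_variability(markerless)
    print(f"Relative increase: {100 * (itv_ml - itv_mb) / itv_mb:.1f}%")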


Subject(s)
Motion Capture, Movement, Humans, Reproducibility of Results, Movement/physiology, Gait/physiology, Gait Analysis, Biomechanical Phenomena, Motion
2.
J Biomech; 159: 111801, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37738945

ABSTRACT

Markerless motion capture has the potential to provide a low-cost and accessible alternative to traditional marker-based systems for real-world biomechanical assessment. However, before these systems can be put into practice, their accuracy in estimating joint kinematics must be rigorously evaluated for various gait patterns. This study evaluated the accuracy of a low-cost, open-source, smartphone-based markerless motion capture system, namely OpenCap, for measuring 3D joint kinematics in healthy and pathological gait compared to a marker-based system. Twenty-one healthy volunteers were instructed to walk with four different gait patterns: physiological, crouch, circumduction, and equinus gait. Three-dimensional kinematic data were simultaneously recorded using the markerless and a marker-based motion capture system. The root mean square error (RMSE) and the peak error were calculated for every joint kinematic variable obtained by both systems. We found an overall RMSE of 5.8 degrees (SD: 1.8) and a peak error of 11.3 degrees (SD: 3.9). A repeated measures ANOVA with post hoc tests indicated significant differences in RMSE and peak errors between the four gait patterns (p < 0.05). Physiological gait presented the lowest errors; crouch and circumduction gait presented the highest. Our findings indicate an accuracy roughly comparable to IMU-based approaches and commercial markerless multi-camera solutions. However, errors are still above the clinically desirable thresholds of two to five degrees. While our findings highlight the potential of markerless systems for assessing gait kinematics, they also underscore the need to further improve the underlying deep learning algorithms to make markerless pose estimation a valuable tool in clinical settings.
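The two error measures reported above can be illustrated with a short Python sketch; the function names and the assumption that both waveforms are already time-synchronized and expressed in degrees are illustrative choices, not taken from the paper.

    import numpy as np

    def rmse(markerless: np.ndarray, marker_based: np.ndarray) -> float:
        # Root mean square error between two joint angle waveforms (degrees).
        return float(np.sqrt(np.mean((markerless - marker_based) ** 2)))

    def peak_error(markerless: np.ndarray, marker_based: np.ndarray) -> float:
        # Largest absolute deviation between the two waveforms (degrees).
        return float(np.max(np.abs(markerless - marker_based)))

    # Illustrative call on synthetic, time-synchronized waveforms.
    rng = np.random.default_rng(0)
    marker_based = 20 * np.sin(np.linspace(0, 2 * np.pi, 101))
    markerless = marker_based + rng.normal(0.0, 3.0, size=101)
    print(rmse(markerless, marker_based), peak_error(markerless, marker_based))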

3.
PLoS One; 18(8): e0288555, 2023.
Article in English | MEDLINE | ID: mdl-37566568

ABSTRACT

The correct estimation of gait events is essential for the interpretation and calculation of 3D gait analysis (3DGA) data. Depending on the severity of the underlying pathology and the availability of force plates, gait events can be set either manually by trained clinicians or detected by automated event detection algorithms. The downside of manual event estimation is that it is tedious, time-intensive, and leads to subjective assessments. For automated event detection algorithms, the drawback is that no standardized method is available. Algorithms show varying robustness and accuracy on different pathologies and often depend on setup- or pathology-specific thresholds. In this paper, we aim to close this gap by introducing a novel deep learning-based gait event detection algorithm called IntellEvent, which proves to be accurate and robust across multiple pathologies. For this study, we utilized a retrospective clinical 3DGA dataset of 1211 patients with four different pathologies (malrotation deformities of the lower limbs, club foot, infantile cerebral palsy (ICP), and ICP with only drop foot characteristics) and 61 healthy controls. We propose a recurrent neural network architecture based on long short-term memory (LSTM) and trained it with 3D position and velocity information to predict initial contact (IC) and foot off (FO) events. We compared IntellEvent to a state-of-the-art heuristic approach and a machine learning method called DeepEvent. IntellEvent outperforms both methods and detects IC events on average within 5.4 ms and FO events within 11.3 ms, with detection rates of ≥ 99% and ≥ 95%, respectively. Our investigation of generalizability across laboratories suggests that models trained on data from a different laboratory must be applied with care owing to setup variations or differences in capture frequencies.
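The abstract does not spell out the IntellEvent architecture in detail; the PyTorch sketch below only illustrates the general idea of an LSTM-based detector that maps per-frame 3D position and velocity features to per-frame IC and FO probabilities. All class names, layer sizes, and feature counts are assumptions, not the published configuration.

    import torch
    import torch.nn as nn

    class GaitEventLSTM(nn.Module):
        # Illustrative LSTM-based gait event detector: per-frame marker
        # positions and velocities in, per-frame IC/FO probabilities out.
        def __init__(self, n_features: int, hidden_size: int = 128):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden_size, num_layers=2,
                                batch_first=True, bidirectional=True)
            self.head = nn.Linear(2 * hidden_size, 2)  # channels: IC, FO

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, n_frames, n_features)
            out, _ = self.lstm(x)
            return torch.sigmoid(self.head(out))  # (batch, n_frames, 2)

    # Hypothetical usage: 8 markers x (3 position + 3 velocity) = 48 features.
    model = GaitEventLSTM(n_features=48)
    trial = torch.randn(1, 300, 48)   # one trial with 300 frames
    probs = model(trial)              # per-frame IC and FO probabilities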


Subject(s)
Cerebral Palsy, Deep Learning, Humans, Retrospective Studies, Biomechanical Phenomena, Gait, Algorithms
4.
Front Bioeng Biotechnol; 9: 780314, 2021.
Article in English | MEDLINE | ID: mdl-34957075

ABSTRACT

Virtual reality (VR) is an emerging technology offering tremendous opportunities to aid gait rehabilitation. To date, real walking with users immersed in virtual environments through head-mounted displays (HMDs) is possible either on treadmills or in room-scale (overground) VR setups. Especially for the latter, there is growing interest in applications for interactive gait training, as they could allow for more self-paced and natural walking. This study investigated whether walking in an overground VR environment has relevant effects on 3D gait biomechanics. A convenience sample of 21 healthy individuals underwent standard 3D gait analysis during four randomly assigned walking conditions: the real laboratory (RLab), a virtual laboratory resembling the real world (VRLab), a smaller version of the VRLab (VRLab-), and a version twice as long as the VRLab (VRLab+). To immerse the participants in the virtual environment, we used a VR-HMD that was operated wirelessly and calibrated so that the virtual labs matched the real world. Walking speed and a single measure of gait kinematic variability (GaitSD) served as primary outcomes, alongside standard spatio-temporal parameters, their coefficients of variation (CV%), kinematics, and kinetics. In brief, participants demonstrated a slower walking pattern (-0.09 ± 0.06 m/s) and small accompanying kinematic and kinetic changes. Participants also showed markedly increased gait variability in lower extremity kinematics and spatio-temporal parameters. No differences were found between walking in VRLab+ and VRLab-. Most of the kinematic and kinetic differences were too small to be regarded as relevant, but increased kinematic variability (+57%), increased percent double support time (+4%), and increased step width variability (+38%) indicate adaptations toward a more conservative or cautious gait due to instability induced by the VR environment. We suggest considering these effects in the design of VR-based overground training devices. Our study lays the foundation for upcoming developments in VR-assisted gait rehabilitation by describing how VR in overground walking scenarios impacts gait patterns, information that is highly relevant for developing purposeful rehabilitation tools.
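As a rough sketch of the variability measures mentioned above, the Python snippet below computes a coefficient of variation (CV%) for a spatio-temporal parameter and a GaitSD-like summary as the mean point-wise standard deviation of kinematic waveforms across strides; the exact GaitSD definition used in the paper may differ, and the array shapes and synthetic data are assumptions.

    import numpy as np

    def coefficient_of_variation(values: np.ndarray) -> float:
        # CV% of a spatio-temporal parameter (e.g., step width) across strides.
        return float(100.0 * np.std(values, ddof=1) / np.mean(values))

    def gait_sd(waveforms: np.ndarray) -> float:
        # waveforms: (n_strides, n_angles, 101) time-normalized joint angles.
        # Point-wise SD across strides, averaged over points and angles.
        return float(np.mean(np.std(waveforms, axis=0, ddof=1)))

    # Illustrative usage with synthetic data.
    rng = np.random.default_rng(0)
    step_width = rng.normal(0.12, 0.02, size=30)        # metres, 30 strides
    angles = rng.normal(0.0, 5.0, size=(30, 9, 101))    # 9 joint angles
    print(coefficient_of_variation(step_width), gait_sd(angles))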
