Results 1 - 9 of 9
1.
PLoS One ; 18(8): e0283895, 2023.
Article in English | MEDLINE | ID: mdl-37561695

ABSTRACT

When judging the quality of a computational system for a pathological screening task, several factors are important, such as sensitivity, specificity, and accuracy. With machine learning based approaches showing promise in the multi-label paradigm, they are being widely adopted for diagnostics and digital therapeutics. Metrics are usually borrowed from the machine learning literature, and the current consensus is to report results on a diverse set of metrics. This makes it infeasible to compare the efficacy of computational systems that have been evaluated on different sets of metrics. From a diagnostic utility standpoint, the current metrics themselves are far from perfect: they are often biased by the prevalence of negative samples or other statistical factors and, importantly, they are designed to evaluate general purpose machine learning tasks. In this paper we outline the parameters that are important in constructing a clinical metric aligned with diagnostic practice, and demonstrate their incompatibility with existing metrics. We propose a new metric, MedTric, that takes into account several factors of clinical importance. MedTric is built from the ground up with the unique context of computational diagnostics and the principle of risk minimization in mind, penalizing missed diagnoses more harshly than over-diagnosis. MedTric is a unified metric for evaluating medical or pathological screening systems. We compare this metric against other widely used metrics and demonstrate how it outperforms them in key areas of medical relevance.
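To make the asymmetric-penalty idea concrete, the following minimal Python sketch scores a screening system while weighting missed diagnoses more heavily than over-diagnoses. The function name, weights and normalization below are assumptions for illustration only; they are not the published MedTric definition, which the abstract does not specify.

```python
import numpy as np

def asymmetric_screening_score(y_true, y_pred, miss_penalty=2.0, overcall_penalty=1.0):
    """Illustrative asymmetric score: false negatives (missed diagnoses) are
    penalized more harshly than false positives (over-diagnoses).
    NOTE: this is NOT the published MedTric formula, only a sketch of the idea."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    fn = np.sum(y_true & ~y_pred)          # missed diagnoses
    fp = np.sum(~y_true & y_pred)          # over-diagnoses
    worst = miss_penalty * np.sum(y_true) + overcall_penalty * np.sum(~y_true)
    penalty = miss_penalty * fn + overcall_penalty * fp
    return 1.0 - penalty / worst           # 1.0 = perfect, 0.0 = worst case

# Missing both positives scores lower than raising two false alarms.
print(asymmetric_screening_score([1, 1, 0, 0], [0, 0, 0, 0]))  # ~0.33
print(asymmetric_screening_score([1, 1, 0, 0], [1, 1, 1, 1]))  # ~0.67
```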


Subject(s)
Algorithms , Machine Learning , Benchmarking
2.
PLoS One ; 17(11): e0277975, 2022.
Article in English | MEDLINE | ID: mdl-36417477

ABSTRACT

Time series sensor data classification tasks often suffer from training data scarcity because expert-driven annotation is expensive. For example, classifying electrocardiogram (ECG) data for cardiovascular disease (CVD) detection requires costly labeling by cardiologists. Current state-of-the-art algorithms such as deep learning models show outstanding performance under the general requirement that a large set of training examples is available. In this paper, we propose Shapley Attributed Ablation with Augmented Learning (ShapAAL), which demonstrates that a deep learning algorithm trained on a suitably selected subset of the seen examples, i.e., after ablating the unimportant ones from the given limited training dataset, can achieve consistently better classification performance under augmented training. In ShapAAL, additive perturbed training augments the input space to compensate for the scarcity of training examples using a Residual Network (ResNet) architecture with perturbation-induced inputs, while Shapley attribution selects the subset of the augmented training space that offers better learnability, with the goal of better general predictive performance, thanks to the "efficiency" and "null player" axioms of the transferable utility games on which the Shapley value is formulated. In ShapAAL, the subset of training examples that contribute positively to a supervised learning setup is derived from the notion of coalition games, using the Shapley value associated with each input's contribution to the model prediction. ShapAAL is a novel push-pull deep architecture in which subset selection through Shapley value attribution pushes the model to a lower dimension while augmented training expands the learning capability of the model over unseen data. We perform an ablation study to provide empirical evidence for our claim and show that the proposed ShapAAL method consistently outperforms current baselines and state-of-the-art algorithms on time series sensor data classification tasks from the publicly available UCR time series archive, which includes practically important problems such as the detection of CVDs from ECG data.
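A minimal sketch of the underlying idea of Shapley-valued training examples follows: Monte Carlo permutation sampling estimates each training example's marginal contribution to validation performance, and examples with non-positive contribution are ablated. This uses a toy logistic regression on synthetic data rather than the paper's ResNet on ECG with perturbation augmentation; the data, model and sampling budget are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy features standing in for time-series inputs; ShapAAL uses a ResNet on ECG.
X, y = make_classification(n_samples=100, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

def coalition_value(indices):
    """Utility of a coalition of training examples = validation accuracy."""
    if len(set(y_tr[indices])) < 2:            # need both classes to fit a model
        return 0.0
    clf = LogisticRegression(max_iter=200).fit(X_tr[indices], y_tr[indices])
    return clf.score(X_val, y_val)

# Monte Carlo (permutation-sampling) estimate of each example's Shapley value.
rng = np.random.default_rng(0)
n, n_perm = len(X_tr), 10
shap = np.zeros(n)
for _ in range(n_perm):
    perm = rng.permutation(n)
    prev = 0.0
    for i, idx in enumerate(perm):
        cur = coalition_value(perm[: i + 1])
        shap[idx] += cur - prev                # marginal contribution of example idx
        prev = cur
shap /= n_perm

# Ablate examples with non-positive contribution, analogous to subset selection.
keep = shap > 0
print("kept", keep.sum(), "of", n, "training examples")
```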


Subject(s)
Algorithms , Electrocardiography , Time Factors , Electrocardiography/methods
3.
Annu Int Conf IEEE Eng Med Biol Soc ; 2022: 1655-1658, 2022 07.
Article in English | MEDLINE | ID: mdl-36085683

ABSTRACT

Atrial fibrillation (AF) is a type of arrhythmia and a major morbidity factor that can lead to stroke, heart failure and other cardiovascular complications. The electrocardiogram (ECG) is the basic marker of cardiac condition and can effectively detect AF. Single lead ECG has the practical advantages of a small form factor and easy deployment. With the sophistication of current deep learning (DL) models, researchers have been able to construct cardiologist-level models that detect different arrhythmias, including AF, from single lead short-time ECG signals. However, such models are computationally expensive and require a large memory footprint for deployment (more than 100 MB for a state-of-the-art 34-layer convolutional neural network-based ECG classification model). Such models need to be trimmed significantly, with insignificant loss of classification performance, for deployment in practical applications like single lead ECG classification on wearable and implantable devices. We have found that classical deep learning model compression techniques like pruning and quantization cannot achieve substantial model size reduction without compromising model performance. In this paper, we propose LTH-ECG, a novel goal-driven winning lottery ticket discovery method in which lottery ticket hypothesis (LTH)-based iterative model pruning is used with the aim of avoiding over-pruning. LTH-ECG reduces the model size by 142x with insignificant loss of classification performance (less than 1% test F1-score penalty). Clinical Relevance: LTH-ECG enables practical deployment for remote screening of AF from single lead short-time ECG recordings, so that patients can monitor their AF condition on demand through wearable ECG sensing devices and report cardiological abnormalities to the concerned physician. LTH-ECG acts as an early warning system for effective AF screening.
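For readers unfamiliar with lottery-ticket-style pruning, the sketch below shows the generic iterative magnitude pruning loop with weight rewinding in PyTorch on a tiny stand-in model. The network, pruning fraction, number of rounds and stopping point are assumptions; LTH-ECG's goal-driven over-pruning avoidance and its ECG model are not reproduced here.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Tiny stand-in model; LTH-ECG prunes a much deeper CNN ECG classifier.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 4))
init_state = copy.deepcopy(model.state_dict())   # initialization to rewind to

x = torch.randn(256, 128)                        # toy data
y = torch.randint(0, 4, (256,))
loss_fn = nn.CrossEntropyLoss()

def train(m, steps=200):
    opt = torch.optim.Adam(m.parameters(), lr=1e-3)
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(m(x), y).backward()
        opt.step()

linear_layers = [m for m in model if isinstance(m, nn.Linear)]
for round_idx in range(5):                       # iterative pruning rounds
    train(model)
    for layer in linear_layers:                  # prune 20% of remaining weights
        prune.l1_unstructured(layer, name="weight", amount=0.2)
    # Rewind surviving weights to their original initialization (lottery ticket step).
    with torch.no_grad():
        for layer, key in zip(linear_layers, ["0", "2"]):
            layer.weight_orig.copy_(init_state[f"{key}.weight"])
            layer.bias.copy_(init_state[f"{key}.bias"])
    zeros = sum((l.weight == 0).sum().item() for l in linear_layers)
    total = sum(l.weight.numel() for l in linear_layers)
    print(f"round {round_idx}: sparsity {zeros / total:.2f}")
```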


Subject(s)
Atrial Fibrillation , Data Compression , Deep Learning , Wearable Electronic Devices , Atrial Fibrillation/diagnosis , Electrocardiography , Humans
4.
Annu Int Conf IEEE Eng Med Biol Soc ; 2021: 886-889, 2021 11.
Article in English | MEDLINE | ID: mdl-34891432

ABSTRACT

The electrocardiogram (ECG) is one of the fundamental markers for detecting different cardiovascular diseases (CVDs). Owing to the widespread availability of (single lead) ECG sensors as well as smartwatches with ECG recording capability, ECG classification on wearable devices for detecting different CVDs has become a basic requirement for a smart healthcare ecosystem. In this paper, we propose a novel model compression method with robust CVD detection capability from ECG signals, such that a sophisticated and effective baseline deep neural network model can be optimized for the resource constrained microcontroller platforms typical of wearable devices while minimizing the performance loss. We employ a knowledge distillation-based model compression approach in which the baseline (teacher) deep neural network model is compressed to a TinyML (student) model using piecewise linear approximation. Our proposed ECG TinyML achieves a ~156x compression factor to meet the requirement of 100 KB memory availability for model deployment on wearable devices. The proposed model requires ~5782 times (estimated) less computational load than the state-of-the-art residual neural network (ResNet) model with negligible performance loss (less than 1% loss in test accuracy, test sensitivity, test precision and test F1-score). We further believe that the small footprint of ECG TinyML (62.3 KB) makes it suitable for deployment in implantable devices, including the implantable loop recorder (ILR).
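The sketch below shows the standard knowledge distillation training loop (temperature-softened teacher targets plus hard-label cross-entropy) in PyTorch. The stand-in teacher and student networks, temperature and mixing weight are assumptions for illustration; the paper's piecewise linear approximation step and actual ECG models are not shown.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in teacher (large) and student (tiny) models; the paper distills a deep
# ECG classifier into a TinyML student suitable for microcontrollers.
teacher = nn.Sequential(nn.Linear(256, 512), nn.ReLU(), nn.Linear(512, 5))
student = nn.Sequential(nn.Linear(256, 16), nn.ReLU(), nn.Linear(16, 5))

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Soft-target KL term (scaled by T^2) plus hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

x = torch.randn(32, 256)                 # toy ECG feature windows
labels = torch.randint(0, 5, (32,))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
for _ in range(100):
    with torch.no_grad():
        t_logits = teacher(x)            # teacher stays frozen during distillation
    loss = distillation_loss(student(x), t_logits, labels)
    opt.zero_grad()
    loss.backward()
    opt.step()
```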


Subject(s)
Cardiovascular Diseases , Data Compression , Wearable Electronic Devices , Ecosystem , Electrocardiography , Humans
5.
Sensors (Basel) ; 19(12)2019 Jun 18.
Article in English | MEDLINE | ID: mdl-31216659

ABSTRACT

Remote and automated healthcare management has shown the potential to significantly impact future human prognosis rates. The Internet of Things (IoT) enables a development and implementation ecosystem that caters to the needs of a large number of relevant stakeholders. In this paper, we consider a cardiac health management system to demonstrate that data-driven techniques yield substantial performance merits in terms of clinical efficacy by employing robust machine learning methods with relevant, selected signal processing features. We consider the phonocardiogram (PCG), or heart sound, as the exemplary physiological signal: PCG carries a substantial cardiac health signature, which supports our claim of data-centric superior clinical utility. Our method achieves close to 85% accuracy on the publicly available MIT-Physionet PCG datasets and outperforms the relevant state-of-the-art algorithm. Owing to its simple computational architecture, a shallow classifier with just three features, the proposed analytics method can run at the edge gateway. However, healthcare analytics deals with sensitive data and subsequent inferences, which need privacy protection. Accordingly, the problem of healthcare data privacy is addressed by de-risking sensitive data management using differential privacy, such that controlled privacy protection of sensitive healthcare data can be enabled. When a user opts for privacy protection, appropriate privacy preservation is guaranteed as a defense against privacy-breaching knowledge mining attacks. In this era of IoT and machine intelligence, this work is of practical importance: it enables on-demand automated screening of cardiac health while minimizing the risk of privacy breaches.
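The abstract does not state which differential privacy mechanism is used, so the following sketch simply illustrates the classical Laplace mechanism on an aggregate screening statistic. The query, sensitivity and epsilon values are assumptions made for the example, not the paper's configuration.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release a query answer with epsilon-differential privacy by adding
    Laplace noise with scale sensitivity / epsilon. Illustrative only."""
    rng = rng or np.random.default_rng()
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privately report how many screened users were flagged as abnormal.
flags = np.array([0, 1, 1, 0, 1, 0, 0, 1])   # hypothetical screening outcomes
true_count = flags.sum()                      # counting query has sensitivity 1
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(true_count, round(private_count, 2))
```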


Subject(s)
Biosensing Techniques , Heart/physiology , Machine Learning , Monitoring, Physiologic/methods , Algorithms , Computer Security , Humans , Prospective Studies , Signal Processing, Computer-Assisted
6.
Physiol Meas ; 40(5): 054006, 2019 06 04.
Article in English | MEDLINE | ID: mdl-30650387

ABSTRACT

OBJECTIVE: Atrial fibrillation (AF) and other types of abnormal heart rhythm are related to multiple fatal cardiovascular diseases that affect the quality of human life. Hence the development of an automated, robust method that can reliably detect AF, in addition to other non-sinus and sinus rhythms, would be a valuable addition to medicine. The present study focuses on developing an algorithm for the classification of short, single-lead electrocardiogram (ECG) recordings into normal, AF, other abnormal rhythm and noisy classes. APPROACH: The proposed classification framework presents a two-layer, three-node architecture comprising binary classifiers. PQRST markers are detected on each ECG recording, followed by noise removal using a novel spectrogram-power-based adaptive thresholding scheme. Next, a feature pool comprising time, frequency, morphological and statistical domain ECG features is extracted for the classification task. At each node of the classification framework, suitable feature subsets, identified through feature ranking and dimension reduction, are selected for use. Adaptive boosting is selected as the classifier for the present case. The training data comprise 8528 ECG recordings provided under the PhysioNet 2017 Challenge. F1 scores averaged across the three non-noisy classes are taken as the performance metric. MAIN RESULT: The final five-fold cross-validation score achieved by the proposed framework on the training data has high accuracy with low variance (0.8254 ± 0.0043). SIGNIFICANCE: The proposed algorithm achieved joint first place in the PhysioNet/Computing in Cardiology Challenge 2017, with a score of 0.83 computed on a hidden test dataset.
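One plausible way to wire a two-layer cascade of binary AdaBoost classifiers is sketched below with placeholder features and a guessed node ordering (noisy vs. non-noisy, then normal vs. arrhythmic, then AF vs. other). The feature matrix, node assignments and class routing are assumptions for illustration; the paper's actual node structure, features and per-node feature selection are not reproduced.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

# Placeholder feature matrix; the paper extracts time/frequency/morphological/
# statistical features from each ECG recording after PQRST detection.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 30))
labels = rng.choice(["N", "A", "O", "~"], size=1000)   # normal, AF, other, noisy

# Assumed node layout: node 1 separates noisy recordings, node 2 separates
# normal from arrhythmic, node 3 separates AF from other rhythms.
node1 = AdaBoostClassifier().fit(X, labels == "~")
m2 = labels != "~"
node2 = AdaBoostClassifier().fit(X[m2], labels[m2] == "N")
m3 = m2 & (labels != "N")
node3 = AdaBoostClassifier().fit(X[m3], labels[m3] == "A")

def classify(x):
    x = x.reshape(1, -1)
    if node1.predict(x)[0]:
        return "~"
    if node2.predict(x)[0]:
        return "N"
    return "A" if node3.predict(x)[0] else "O"

print(classify(X[0]))
```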


Subject(s)
Algorithms , Atrial Fibrillation/diagnostic imaging , Atrial Fibrillation/diagnosis , Electrocardiography , Humans , Probability , Reproducibility of Results , Signal Processing, Computer-Assisted , Time Factors
7.
Annu Int Conf IEEE Eng Med Biol Soc ; 2018: 482-485, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30440439

ABSTRACT

We present a system that analyzes pulsatile signals and discovers repetitions within them. We measure the dominance of these repetitions from the morphology and the discrete nature of the signals by exploiting machine learning and information theoretic concepts. Patterns are represented as combinations of basic and derived features. The consistency of discovered patterns identifies a state of physiological stability, which varies from one individual to another; hence it has a strong impact on deriving accurate physiological parameters for personalized health analytics. The proposed mechanism discovers regular and irregular patterns, as shown by extensive analysis on several real-life cardiac data sets. We achieve more than 90% accuracy in identifying irregular patterns using our proposed method.
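As a rough intuition for repetition discovery in pulsatile signals (the paper's own feature-combination method is not detailed in the abstract), the sketch below uses autocorrelation to find a dominant repetition period in a synthetic pulsatile trace; the sampling rate, threshold and signal are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100                                   # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
signal = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

# Autocorrelation: a strong peak at lag L means the waveform repeats every L samples.
x = signal - signal.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:]
acf /= acf[0]
peaks, _ = find_peaks(acf, height=0.3)
if peaks.size:
    period = peaks[0] / fs
    print(f"dominant repetition period ~ {period:.2f} s ({60 / period:.0f} bpm)")
else:
    print("no consistent repetition found (irregular pattern)")
```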


Subject(s)
Machine Learning , Monitoring, Physiologic/methods , Pattern Recognition, Automated , Algorithms , Humans , Photoplethysmography
8.
Annu Int Conf IEEE Eng Med Biol Soc ; 2017: 2753-2756, 2017 Jul.
Article in English | MEDLINE | ID: mdl-29060468

ABSTRACT

The phonocardiogram (PCG) records heart sounds and murmurs, which contain significant information about cardiac health. Analysis of the PCG signal has the potential to detect abnormal cardiac conditions. However, the presence of noise and motion artifacts in PCG hinders the accuracy of clinical event detection, so noise detection and elimination are crucial to ensure accurate clinical analysis. In this paper, we present a robust denoising technique, Proclean, that precisely detects noisy PCG signals through pattern recognition and statistical learning. We propose a novel self-discriminant learner that obtains a distinct feature set to distinguish clean and noisy PCG signals without a human in the loop. We demonstrate that the proposed denoising leads to higher accuracy in subsequent clinical analytics for medical investigation. Extensive experiments with the publicly available MIT-Physionet datasets show that we achieve more than 85% accuracy in detecting noisy PCG signals. Further, we establish that physiological abnormality detection improves by more than 20% when our proposed denoising mechanism is applied.
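To illustrate label-free separation of clean and noisy heart-sound segments, the sketch below clusters two simple spectral features and labels the cluster with higher heart-sound-band energy as "clean". The features, sampling rate and synthetic data are assumptions; this is not the paper's self-discriminant learner.

```python
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

fs = 2000                                  # Hz, assumed PCG sampling rate
rng = np.random.default_rng(0)

def features(segment):
    """Two illustrative spectral features (band energy ratio, spectral flatness)."""
    f, pxx = welch(segment, fs=fs, nperseg=256)
    band = pxx[(f >= 20) & (f <= 200)].sum() / pxx.sum()
    flatness = np.exp(np.mean(np.log(pxx + 1e-12))) / (np.mean(pxx) + 1e-12)
    return [band, flatness]

# Synthetic stand-ins: low-frequency gated pulses ("clean") vs. broadband noise.
t = np.arange(0, 2, 1 / fs)
clean = [np.sin(2 * np.pi * 40 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.9)
         + 0.02 * rng.normal(size=t.size) for _ in range(20)]
noisy = [rng.normal(size=t.size) for _ in range(20)]
X = np.array([features(s) for s in clean + noisy])

# Unsupervised two-way split; no human-annotated labels are used.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
clean_cluster = np.argmax([X[km.labels_ == c, 0].mean() for c in (0, 1)])
pred_clean = km.labels_ == clean_cluster
print("clean flagged as clean:", pred_clean[:20].mean(),
      "| noisy flagged as clean:", pred_clean[20:].mean())
```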


Subject(s)
Phonocardiography , Algorithms , Heart , Heart Murmurs , Heart Sounds , Humans , Signal Processing, Computer-Assisted
9.
Annu Int Conf IEEE Eng Med Biol Soc ; 2016: 740-743, 2016 Aug.
Article in English | MEDLINE | ID: mdl-28268434

ABSTRACT

We propose algorithms for robustly deriving physiological parameters, such as the beat start point, systolic peak, pulse duration, peak-to-peak distance related to heart rate, dicrotic minima and diastolic peak, from photoplethysmogram (PPG) signals. Our methods are based on unsupervised learning, mainly following the morphology as well as the discrete nature of the signal. Statistical learning is used as a special aid to infer the most probable feature values, mainly to cope with the presence of noise, which is assumed to be insignificant compared to the signal values in each investigation window. The performance of the proposed method is found to be better than other standard methods, yielding precision and sensitivity above 97% on three real-life data sets.
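A minimal sketch of fiducial-point extraction from a PPG-like waveform is given below, using simple peak detection to locate systolic peaks and preceding troughs (beat start points) and derive heart rate. The sampling rate, prominence and refractory-period values, and the synthetic waveform are assumptions; the paper's unsupervised morphology-based method is not reproduced.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 125                                   # Hz, assumed PPG sampling rate
t = np.arange(0, 10, 1 / fs)
# Synthetic PPG-like waveform: fundamental plus a weaker, phase-shifted harmonic.
ppg = (np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 2.4 * t + 0.8)
       + 0.02 * np.random.default_rng(0).normal(size=t.size))

# Systolic peaks: prominent maxima at least ~0.4 s apart (assumed refractory period).
sys_peaks, _ = find_peaks(ppg, distance=int(0.4 * fs), prominence=0.5)
# Beat start points: prominent minima preceding each systolic upstroke.
troughs, _ = find_peaks(-ppg, distance=int(0.4 * fs), prominence=0.5)

peak_to_peak = np.diff(sys_peaks) / fs      # seconds between successive systolic peaks
heart_rate = 60.0 / peak_to_peak.mean()
print(f"{len(sys_peaks)} beats detected, mean heart rate ~ {heart_rate:.0f} bpm")
```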


Subject(s)
Photoplethysmography , Unsupervised Machine Learning , Algorithms , Diastole , Heart Rate , Humans , Signal Processing, Computer-Assisted , Systole