Results 1 - 20 of 82
1.
Comput Methods Programs Biomed ; 229: 107308, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36535127

ABSTRACT

BACKGROUND AND OBJECTIVE: Myocardial infarction (MI) is a life-threatening condition diagnosed acutely on the electrocardiogram (ECG). Artifacts such as noise can impair automated ECG diagnosis, so quantifying and communicating model uncertainty is essential for reliable MI diagnosis. METHODS: A Dirichlet DenseNet model was developed that can analyze out-of-distribution data and detect misclassification of MI and normal ECG signals. The DenseNet model was first trained with pre-processed MI ECG signals (from the best lead, V6) acquired from the Physikalisch-Technische Bundesanstalt (PTB) database, using the reverse Kullback-Leibler (KL) divergence loss. The model was then tested with newly synthesized ECG signals with added electrode motion (em) and muscle artifact (ma) noise samples. Predictive entropy was used as an uncertainty measure to identify misclassification of normal and MI signals. Model performance was evaluated using four uncertainty metrics: uncertainty sensitivity (UNSE), uncertainty specificity (UNSP), uncertainty accuracy (UNAC), and uncertainty precision (UNPR); the classification threshold was set at 0.3. RESULTS: The UNSE of the DenseNet model was low at high noise levels but increased over the studied range of decreasing noise (SNR from -6 to 24 dB), indicating that the model grew more confident in classifying the signals as they became less noisy. The model became more certain in its predictions from SNR values of 12 dB and 18 dB onwards, yielding UNAC values of 80% and 82.4% for em and ma noise signals, respectively. UNSP and UNPR values were close to 100% for both em and ma noise signals, indicating that the model was aware of what it did and did not know. CONCLUSION: The model is reliable in that it conveys when it is not confident in the diagnostic information it presents. It is thus trustworthy and could be used in healthcare applications such as the emergency diagnosis of MI on ECGs.
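The abstract does not give the exact form of the uncertainty computation, but the idea of flagging unreliable predictions by thresholding predictive entropy can be sketched as follows. The class probabilities, the two-class setup, and the use of 0.3 as an entropy threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def predictive_entropy(probs):
    """Shannon entropy (natural log) of predicted class probabilities.
    `probs` is an (n_samples, n_classes) array, e.g. the expected class
    probabilities under a Dirichlet output layer."""
    eps = 1e-12                               # guard against log(0)
    return -np.sum(probs * np.log(probs + eps), axis=1)

# Hypothetical predictions for three ECG beats (normal vs. MI).
probs = np.array([[0.98, 0.02],               # confident
                  [0.55, 0.45],               # uncertain (e.g. noisy input)
                  [0.05, 0.95]])              # confident

THRESHOLD = 0.3                               # illustrative entropy threshold
flag_for_review = predictive_entropy(probs) > THRESHOLD
print(flag_for_review)                        # [False  True False]
```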


Subject(s)
Electrocardiography , Myocardial Infarction , Humans , Uncertainty , Myocardial Infarction/diagnosis , Databases, Factual , Entropy
2.
Comput Biol Med ; 146: 105550, 2022 07.
Article in English | MEDLINE | ID: mdl-35533457

ABSTRACT

Myocardial infarction (MI) accounts for a high number of deaths globally. In acute MI, accurate electrocardiography (ECG) is important for timely diagnosis and intervention in the emergency setting. Machine learning is increasingly being explored for automated computer-aided ECG diagnosis of cardiovascular diseases. In this study, we developed DenseNet and CNN models for the classification of healthy subjects and patients with ten classes of MI based on the location of myocardial involvement. ECG signals from the Physikalisch-Technische Bundesanstalt database were pre-processed, and the ECG beats were extracted using an R-peak detection algorithm. The beats were then fed to the two models separately. While both models attained high classification accuracies (more than 95%), DenseNet is the preferred model for this task: thanks to feature reusability, it achieves lower computational complexity and higher classification accuracy than the CNN model. An enhanced class activation mapping (CAM) technique, Grad-CAM, was subsequently applied to the outputs of both models to visualize the specific ECG leads and portions of the ECG waves that most influenced the models' predictions for the 11 classes. Lead V4 was the most activated lead in both the DenseNet and CNN models. Furthermore, this study established the different leads and parts of the signal that are activated for each class. This is the first study to report the features that influenced the classification decisions of deep models for multiclass classification of MI and healthy ECGs. By making the inner workings of the models partly explainable, the developed DenseNet and CNN models may gain the needed clinical acceptance and have the potential to be implemented for ECG triage of MI diagnosis in hospitals and remote out-of-hospital settings.
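Grad-CAM itself is a published, generic technique; a minimal sketch of how it could be applied to a 1-D ECG classifier is shown below. The model, layer name, and input shape are assumptions, not the authors' implementation.

```python
import tensorflow as tf

def grad_cam_1d(model, ecg_beat, conv_layer_name, class_index):
    """Grad-CAM saliency over time for a 1-D ECG classifier.
    `model` is a tf.keras model, `ecg_beat` has shape (1, n_samples, n_leads),
    and `conv_layer_name` names the last convolutional layer (an assumption)."""
    grad_model = tf.keras.models.Model(
        model.inputs, [model.get_layer(conv_layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(ecg_beat)
        class_score = preds[:, class_index]
    grads = tape.gradient(class_score, conv_out)       # d(score)/d(feature map)
    weights = tf.reduce_mean(grads, axis=1)            # average over the time axis
    cam = tf.reduce_sum(weights[:, tf.newaxis, :] * conv_out, axis=-1)
    cam = tf.nn.relu(cam)                              # keep positive influence only
    return (cam / (tf.reduce_max(cam) + 1e-12)).numpy()
```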


Subject(s)
Deep Learning , Myocardial Infarction , Algorithms , Diagnosis, Computer-Assisted , Electrocardiography/methods , Humans , Myocardial Infarction/diagnosis
3.
Comput Biol Med ; 134: 104457, 2021 07.
Article in English | MEDLINE | ID: mdl-33991857

ABSTRACT

Cardiovascular diseases (CVDs) are the main causes of death globally, with coronary artery disease (CAD) being the most important. Timely diagnosis and treatment of CAD are crucial to reduce the incidence of CAD complications such as myocardial infarction (MI) and ischemia-induced congestive heart failure (CHF). Electrocardiogram (ECG) signals are the most commonly employed diagnostic screening tool to detect CAD. In this study, an automated system (AS) was developed for the categorization of ECG signals into normal, CAD, MI and CHF classes using a convolutional neural network (CNN) and a unique GaborCNN model. Weight balancing was used to handle the imbalanced dataset. Both the CNN and GaborCNN models obtained high classification accuracies of more than 98.5% for the four-class classification of normal, CAD, MI and CHF. GaborCNN is the preferred model due to its good performance and reduced computational complexity compared with the CNN model. To the best of our knowledge, this is the first study to propose a GaborCNN model for the automated categorization of normal, CAD, MI and CHF classes using ECG signals. The proposed system is ready to be validated with a bigger database and has the potential to aid clinicians in screening for CVDs using ECG signals.
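The abstract does not describe how the Gabor layer of the GaborCNN is constructed; as a rough illustration, a fixed 1-D Gabor kernel bank of the kind sometimes used in place of learned first-layer convolution filters might look like this. The kernel length, sigma and frequencies are arbitrary example values.

```python
import numpy as np

def gabor_kernel_1d(length, sigma, freq):
    """A 1-D Gabor kernel: a Gaussian-windowed cosine."""
    t = np.arange(length) - (length - 1) / 2.0
    return np.exp(-t**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * t)

# A small bank of fixed kernels at different centre frequencies (illustrative values)
# that could serve as first-layer convolution filters in a Gabor-CNN.
bank = np.stack([gabor_kernel_1d(63, sigma=8.0, freq=f) for f in (0.02, 0.05, 0.10)])
print(bank.shape)   # (3, 63)
```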


Subject(s)
Coronary Artery Disease , Heart Failure , Myocardial Infarction , Coronary Artery Disease/diagnosis , Electrocardiography , Heart Failure/diagnosis , Humans , Myocardial Infarction/diagnosis , Signal Processing, Computer-Assisted
4.
Article in English | MEDLINE | ID: mdl-32078556

ABSTRACT

Coronary heart disease has recently attracted increasing attention, and segmentation and analysis of the vascular lumen contour are helpful for its treatment. Intravascular optical coherence tomography (IVOCT) images are used clinically to display lumen shapes, so an automatic segmentation method for the IVOCT lumen contour is needed to reduce clinicians' workload while ensuring diagnostic accuracy. In this paper, we propose a deep residual segmentation network with multi-scale feature fusion based on an attention mechanism (RSM-Network, Residual Squeezed Multi-Scale Network) to segment the lumen contour in IVOCT images. First, three data augmentation methods (mirroring, rotation and vertical flipping) are used to expand the training set. The proposed RSM-Network uses U-Net as its main body because it accepts input images of any size. The combination of a residual network and an attention mechanism improves global feature extraction and mitigates the vanishing gradient problem, and a pyramid feature extraction structure is introduced to enhance the learning of multi-scale features. Finally, the cross-entropy loss function is used to increase the agreement between the actual and expected outputs. A series of metrics is presented to evaluate the performance of the proposed network, and the experimental results demonstrate that the RSM-Network learns contour details well, providing strong robustness and accuracy for IVOCT lumen contour segmentation.
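The RSM-Network architecture is not specified in detail in the abstract; the sketch below shows one plausible reading of a "residual squeezed" building block, i.e., a residual convolutional block followed by squeeze-and-excitation channel attention, written with Keras layers. It is illustrative only, not the authors' exact layer design.

```python
import tensorflow as tf
from tensorflow.keras import layers

def residual_se_block(x, filters, reduction=8):
    """Residual convolutional block followed by squeeze-and-excitation
    channel attention; an illustrative building block, not the authors'
    exact RSM-Network layer."""
    shortcut = layers.Conv2D(filters, 1, padding="same")(x)   # projection shortcut
    y = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    y = layers.Add()([y, shortcut])                           # residual connection
    # Squeeze-and-excitation: channel-wise re-weighting of the feature maps.
    s = layers.GlobalAveragePooling2D()(y)
    s = layers.Reshape((1, 1, filters))(s)
    s = layers.Dense(filters // reduction, activation="relu")(s)
    s = layers.Dense(filters, activation="sigmoid")(s)
    y = layers.multiply([y, s])
    return layers.Activation("relu")(y)
```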


Subject(s)
Deep Learning , Endovascular Procedures/methods , Image Processing, Computer-Assisted/methods , Tomography, Optical Coherence/methods , Blood Vessels/diagnostic imaging , Databases, Factual , Humans , Neural Networks, Computer
5.
Comput Biol Med ; 126: 103999, 2020 11.
Article in English | MEDLINE | ID: mdl-32992139

ABSTRACT

BACKGROUND: Hypertension (HPT) occurs when blood pressure (BP) within the arteries is elevated, causing the heart to pump against a higher afterload to deliver oxygenated blood to the rest of the body. PURPOSE: Because BP fluctuates, 24-h ambulatory blood pressure monitoring has emerged as a useful tool for diagnosing HPT, but it is inconvenient. In this study, an automatic diagnostic tool based on electrocardiogram (ECG) signals is therefore used to detect HPT. METHOD: The pre-processed signals are fed to a convolutional neural network model, which learns and identifies unique ECG signatures to classify normal and hypertensive ECG signals. The proposed model is evaluated using 10-fold and leave-one-patient-out validation techniques. RESULTS: A high classification accuracy of 99.99% is achieved for both validation techniques. This is one of the first studies to employ a deep learning algorithm coupled with ECG signals for the detection of HPT. Our results imply that the developed tool is useful in a hospital setting as an automated diagnostic tool, enabling the effortless detection of HPT using ECG signals.
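The two validation schemes named in the abstract can be set up as follows with scikit-learn; the data, labels and patient grouping are synthetic placeholders.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, StratifiedKFold

# Synthetic placeholders: 100 ECG segments from 10 patients, binary labels.
X = np.random.randn(100, 2000)            # 100 segments x 2000 samples each
y = np.tile([0, 1], 50)                   # 0 = normal, 1 = hypertension
groups = np.repeat(np.arange(10), 10)     # patient ID per segment

# Leave-one-patient-out: no patient appears in both training and test folds.
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    pass  # train the CNN on X[train_idx], evaluate on the held-out patient

# Conventional 10-fold cross-validation over segments.
for train_idx, test_idx in StratifiedKFold(n_splits=10, shuffle=True).split(X, y):
    pass
```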


Subject(s)
Blood Pressure Monitoring, Ambulatory , Hypertension , Algorithms , Electrocardiography , Humans , Hypertension/diagnosis , Neural Networks, Computer
6.
Comput Biol Med ; 120: 103718, 2020 05.
Article in English | MEDLINE | ID: mdl-32250851

ABSTRACT

Unlike passive infrared (IR) thermal imaging/thermography, where no external stimulation is applied, active dynamic thermography (ADT) yields high-contrast thermal images. In ADT, transient thermal images of the skin surface are captured with an IR thermal camera while the skin is stimulated externally, followed by a recovery phase. Upon application of the external stimulation, the presence of stenosis in the carotid artery is expected to alter the recovery rate of the external neck skin surface relative to the case with no stenosis. In this prospective study, using external cooling stimulation, the ADT procedure was performed on a total of 54 (N) samples (C: N = 19, 0% stenosis; D1: N = 17, 10%-29% stenosis; D2: N = 18, ≥30% stenosis, graded using Duplex Ultrasound). Analyzing the ADT sequence with a parameter called the tissue activity ratio (TAR), the samples were classified using a cut-off value: C versus (D1+D2) and (C + D1) versus D2. As the degree of stenosis increases, the TAR value decreases, with a significant difference among the sample groups (C: 0.97 ± 0.05, D1: 0.80 ± 0.04, D2: 0.75 ± 0.02; p < 0.05). Under the two classification scenarios, classification accuracies of 90% and 85%, respectively, were achieved. This study suggests the potential of the proposed ADT procedure for screening carotid artery stenosis (CAS).
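Classification by a TAR cut-off can be sketched as below; the cut-off of 0.90 and the direction of the rule (lower TAR indicating stenosis, consistent with the reported group means) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def classify_by_tar(tar_values, cutoff=0.90):
    """Binary screening rule: TAR below the cut-off suggests stenosis.
    The cut-off and rule direction are illustrative assumptions."""
    return np.where(np.asarray(tar_values) < cutoff, "stenosed", "control")

tar = [0.97, 0.80, 0.75]     # group mean TAR values reported in the abstract
print(classify_by_tar(tar))  # ['control' 'stenosed' 'stenosed']
```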


Subject(s)
Carotid Stenosis , Thermography , Carotid Artery, Common , Carotid Stenosis/diagnostic imaging , Constriction, Pathologic , Humans , Mass Screening , Prospective Studies
7.
Comput Biol Med ; 118: 103630, 2020 03.
Article in English | MEDLINE | ID: mdl-32174317

ABSTRACT

Hypertension (HPT), also known as high blood pressure, is a precursor to heart, brain or kidney diseases. Some symptoms of HPT include headaches, dizziness and fainting. The potential diagnosis of masked hypertension is of specific interest in this study. In masked hypertension (MHPT), the instantaneous blood pressure appears normal, but the 24-h ambulatory blood pressure is abnormal; patients with MHPT are therefore difficult to identify and remain untreated or insufficiently treated. Hence, a computational intelligence tool (CIT) using electrocardiogram (ECG) signals for HPT and possible MHPT detection is proposed in this work. Empirical mode decomposition (EMD) is employed to decompose the pre-processed signals into five levels, and nonlinear features are extracted from the five intrinsic mode functions (IMFs). Student's t-test is then applied to select a set of highly discriminatory features. This feature set is input to various classifiers, among which the k-nearest neighbor (k-NN) classifier yields the best accuracy of 97.70%. The developed tool is evaluated with the 10-fold cross-validation technique. Our findings suggest that the developed system is useful as a diagnostic computational intelligence tool in hospital settings, enabling the automatic classification of HPT versus normal ECG signals.
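A rough sketch of the decompose-extract-classify pipeline is given below, assuming the PyEMD package for empirical mode decomposition. The two features shown (energy and histogram entropy) are stand-ins for the richer nonlinear feature set used in the paper.

```python
import numpy as np
from PyEMD import EMD                        # assumes the PyEMD package
from sklearn.neighbors import KNeighborsClassifier

def imf_features(signal, n_imfs=5):
    """Decompose one pre-processed ECG segment into IMFs and compute two
    simple illustrative features per IMF (energy and histogram entropy)."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))[:n_imfs]
    feats = []
    for imf in imfs:
        energy = np.sum(imf ** 2)
        hist, _ = np.histogram(imf, bins=32)
        p = hist[hist > 0] / hist.sum()
        entropy = -np.sum(p * np.log2(p))
        feats.extend([energy, entropy])
    return np.array(feats)

# After t-test-based feature selection, the retained features go to k-NN:
knn = KNeighborsClassifier(n_neighbors=3)
# knn.fit(train_features, train_labels); knn.predict(test_features)
```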


Subject(s)
Blood Pressure Monitoring, Ambulatory , Hypertension , Algorithms , Artificial Intelligence , Electrocardiography , Humans , Hypertension/diagnosis , Signal Processing, Computer-Assisted
8.
J Biomech Eng ; 142(6)2020 06 01.
Article in English | MEDLINE | ID: mdl-31633169

ABSTRACT

In this work, we provide a quantitative assessment of the biomechanical and geometric features that characterize abdominal aortic aneurysm (AAA) models generated from 19 Asian and 19 Caucasian diameter-matched AAA patients. 3D patient-specific finite element models were generated and used to compute peak wall stress (PWS), 99th percentile wall stress (99th WS), and spatially averaged wall stress (AWS) for each AAA. In addition, 51 global geometric indices were calculated, which quantify the wall thickness, shape, and curvature of each AAA. The indices were correlated with 99th WS (the only biomechanical metric that exhibited significant association with geometric indices) using Spearman's correlation and subsequently with multivariate linear regression using backward elimination. For the Asian AAA group, 99th WS was highly correlated (R2 = 0.77) with three geometric indices, namely tortuosity, intraluminal thrombus volume, and area-averaged Gaussian curvature. Similarly, 99th WS in the Caucasian AAA group was highly correlated (R2 = 0.87) with six geometric indices, namely maximum AAA diameter, distal neck diameter, diameter-height ratio, minimum wall thickness variance, mode of the wall thickness variance, and area-averaged Gaussian curvature. Significant differences were found between the two groups for ten geometric indices; however, no differences were found for any of their respective biomechanical attributes. Assuming maximum AAA diameter as the most predictive metric for wall stress was found to be imprecise: 24% and 28% accuracy for the Asian and Caucasian groups, respectively. This investigation reveals that geometric indices other than maximum AAA diameter can serve as predictors of wall stress, and potentially for assessment of aneurysm rupture risk, in the Asian and Caucasian AAA populations.
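The statistical workflow described above (univariate Spearman screening followed by multivariate regression with backward elimination) can be sketched generically as follows; this is not the authors' code, and `X` is assumed to be a pandas DataFrame of geometric indices.

```python
import statsmodels.api as sm
from scipy.stats import spearmanr

def backward_eliminate(X, y, alpha=0.05):
    """Multivariate linear regression with backward elimination: repeatedly
    drop the geometric index with the largest p-value until all remaining
    predictors are significant. `X` is a pandas DataFrame of indices and
    `y` the 99th percentile wall stress values."""
    cols = list(X.columns)
    while cols:
        model = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha:
            return model, cols
        cols.remove(worst)
    return None, []

# Univariate screening step for a single geometric index:
# rho, p = spearmanr(index_values, wall_stress)
```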


Subject(s)
Aortic Aneurysm, Abdominal , Finite Element Analysis , Biomechanical Phenomena , Humans , Male , Middle Aged , Models, Cardiovascular
9.
Comput Biol Med ; 115: 103483, 2019 12.
Article in English | MEDLINE | ID: mdl-31698235

ABSTRACT

Glaucoma is a malady that occurs due to the buildup of fluid pressure in the inner eye. Detection of glaucoma at an early stage is crucial, as 111.8 million people are expected to be afflicted with glaucoma globally by 2040. Feature extraction methods are promising for the diagnosis of glaucoma. In this study, we used optical coherence tomography angiogram (OCTA) images for automated glaucoma detection. Oculus sinister (OS) images were obtained from the left eye and oculus dexter (OD) images from the right eye of each subject; we used OS macular, OS disc, OD macular and OD disc images. The local phase quantization (LPQ) technique was applied to extract features, and information fusion and principal component analysis (PCA) were used to combine and reduce them. Our method achieved the highest accuracy of 94.3% using LPQ coupled with PCA on right-eye optic disc images with an AdaBoost classifier. The proposed technique can aid clinicians in detecting glaucoma at an early stage. The developed model is ready to be tested with more images before deployment in clinical applications.
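A sketch of the dimensionality-reduction and classification stage using scikit-learn is shown below; LPQ feature extraction is omitted, and `lpq_features_train`/`labels_train` are hypothetical arrays.

```python
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.pipeline import make_pipeline

# LPQ texture features (computed separately per OCTA image) are reduced with
# PCA and classified with AdaBoost. `lpq_features_*` and `labels_*` are
# hypothetical arrays; LPQ extraction itself is omitted here.
clf = make_pipeline(PCA(n_components=20), AdaBoostClassifier(n_estimators=100))
# clf.fit(lpq_features_train, labels_train)
# accuracy = clf.score(lpq_features_test, labels_test)
```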


Subject(s)
Angiography , Databases, Factual , Glaucoma/diagnostic imaging , Image Processing, Computer-Assisted , Optic Disk/diagnostic imaging , Tomography, Optical Coherence , Female , Humans , Male
10.
Comput Biol Med ; 113: 103419, 2019 10.
Article in English | MEDLINE | ID: mdl-31493579

ABSTRACT

In the present study, an infrared (IR) thermal camera was used to map the temperature of the target skin surface, and the resulting thermal image was evaluated for the presence of carotid artery stenosis (CAS). In the presence of stenosis in the carotid artery, abnormal temperature maps are expected to occur on the external skin surface, which can be captured and quantified using IR thermography. A Duplex Ultrasound (DUS) examination was used to establish the ground truth. In each patient, the background-subtracted thermal image, referred to as the full thermal image, was used to extract novel parametric cold thermal feature images. From these images, statistical features, viz., correlation, energy, homogeneity, contrast, entropy, mean, standard deviation (SD), skewness, and kurtosis, were calculated, and the two groups of patients (control and diseased; a total of 80 carotid artery samples) were classified. Both cut-off value- and support vector machine (SVM)-based binary classification models were tested. While the cut-off value model achieved only moderate performance (70% accuracy), the SVM classified the patients with high accuracy (92% or higher). This preliminary study suggests the potential of IR thermography as a possible screening tool for CAS patients.
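The second-order (GLCM) and first-order statistics listed in the abstract can be computed along these lines with scikit-image and SciPy; the GLCM distance, angle and bit depth are illustrative choices, not the authors' settings.

```python
import numpy as np
from scipy.stats import kurtosis, skew
from skimage.feature import graycomatrix, graycoprops

def thermal_features(image_8bit):
    """First- and second-order statistics of a background-subtracted thermal
    image, in the spirit of the feature list above. `image_8bit` is a 2-D
    uint8 array; distance/angle/bit-depth choices are illustrative."""
    glcm = graycomatrix(image_8bit, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    second_order = [graycoprops(glcm, p)[0, 0]
                    for p in ("correlation", "energy", "homogeneity", "contrast")]
    p = glcm[:, :, 0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    pix = image_8bit.ravel().astype(float)
    first_order = [pix.mean(), pix.std(), skew(pix), kurtosis(pix)]
    return np.array(second_order + [entropy] + first_order)
```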


Subject(s)
Carotid Stenosis , Image Processing, Computer-Assisted , Infrared Rays , Support Vector Machine , Thermography , Aged , Carotid Stenosis/diagnosis , Carotid Stenosis/diagnostic imaging , Female , Humans , Male , Mass Screening , Middle Aged , Prospective Studies
11.
Comput Biol Med ; 113: 103392, 2019 10.
Article in English | MEDLINE | ID: mdl-31446317

ABSTRACT

In this paper, a continuous non-occluding blood pressure (BP) prediction method is proposed using multiple photoplethysmogram (PPG) signals. In the new method, BP is predicted by a committee machine or ensemble learning framework comprising multiple support vector regression (SVR) machines. Existing methods for continuous BP prediction rely on a single calibration model obtained from a single arterial segment. Our ensemble framework is the first BP estimation method to use multiple SVR models calibrated on multiple arterial segments, which reduces the mean prediction error and the risk of overfitting associated with a single model. Each SVR in the ensemble is trained on a comprehensive feature set constructed from a distinct PPG segment. The feature set includes pulse morphological parameters such as systolic pulse amplitude and area under the curve, heart rate variability (HRV) frequency- and time-domain parameters, and the pulse wave velocity (PWV). Empirical evaluation using 40 volunteers with no serious health conditions shows that the proposed method is more reliable for estimating both systolic and diastolic BP than similar methods employing a single calibration model under identical settings. Moreover, the combined output is more stable than the output of any of the constituent models in the ensemble for both the systolic and diastolic cases.
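A minimal sketch of the committee-of-SVRs idea, one calibration model per PPG segment with averaged output, is shown below; feature extraction, kernel choice and hyperparameters are placeholders rather than the authors' settings.

```python
import numpy as np
from sklearn.svm import SVR

class SegmentEnsembleBP:
    """Committee of SVR models, one per PPG segment/arterial site, whose
    predictions are averaged; feature extraction and calibration details
    are omitted, and the kernel/hyperparameters are placeholders."""

    def __init__(self, n_segments):
        self.models = [SVR(kernel="rbf", C=10.0) for _ in range(n_segments)]

    def fit(self, features_per_segment, bp_reference):
        # features_per_segment: list of (n_subjects, n_features) arrays
        for model, X in zip(self.models, features_per_segment):
            model.fit(X, bp_reference)
        return self

    def predict(self, features_per_segment):
        preds = [m.predict(X) for m, X in zip(self.models, features_per_segment)]
        return np.mean(preds, axis=0)      # committee output: mean of the members
```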


Subject(s)
Blood Pressure Determination , Blood Pressure , Photoplethysmography , Pulse Wave Analysis , Signal Processing, Computer-Assisted , Support Vector Machine , Humans , Predictive Value of Tests
12.
Comput Biol Med ; 112: 103371, 2019 09.
Article in English | MEDLINE | ID: mdl-31404720

ABSTRACT

OBJECTIVE: The aim of this study was to develop and assess the feasibility of using basic statistical parameters derived from the renogram, the mean count value (MeanCV) and the median count value (MedianCV), as novel indices in the diagnosis of renal obstruction through diuresis renography. SUBJECTS AND METHODS: First, we re-digitalized and normalized 132 renograms from 74 patients in order to derive the MeanCV and MedianCV. To improve the performance of the parameters, we extrapolated the renograms using a two-compartment model. The cutoff points for diagnosis with each modified parameter were then set, and the sensitivity and specificity were calculated to determine the variants of MeanCV and MedianCV that best differentiate renal obstruction status into three distinct classes: i) unobstructed, ii) slightly obstructed, and iii) heavily obstructed. RESULTS: The modified MeanCV and MedianCV derived from the extended renograms predicted the severity of the renal obstruction. The most appropriate variants were MeanCV50 and MedianCV60. The cutoff points of MeanCV50 for separating unobstructed from obstructed classes and slightly from heavily obstructed classes were 0.50 and 0.72, respectively; the corresponding cutoff points of MedianCV60 were 0.35 and 0.69. Notably, MeanCV50 and MedianCV60 were not significantly influenced by either age or gender. CONCLUSIONS: The MeanCV50 and MedianCV60 derived from a renogram could be incorporated with other quantifiable parameters to form a system providing a highly accurate diagnosis of renal obstruction.
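The reported MeanCV50 cut-offs (0.50 and 0.72) imply a simple three-way decision rule; the sketch below assumes that higher MeanCV50 values correspond to greater obstruction, which is an interpretation for illustration rather than a statement from the abstract.

```python
def classify_obstruction(mean_cv50, cut_low=0.50, cut_high=0.72):
    """Three-way grading from MeanCV50 using the reported cut-off points;
    the assumption that larger values mean more obstruction is illustrative."""
    if mean_cv50 < cut_low:
        return "unobstructed"
    if mean_cv50 < cut_high:
        return "slightly obstructed"
    return "heavily obstructed"

print([classify_obstruction(v) for v in (0.40, 0.60, 0.80)])
```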


Subject(s)
Algorithms , Image Interpretation, Computer-Assisted , Kidney Diseases/diagnostic imaging , Radioisotope Renography , Radiopharmaceuticals/administration & dosage , Technetium Tc 99m Mertiatide/administration & dosage , Adult , Female , Humans , Male , Middle Aged , Sensitivity and Specificity
13.
Front Neurosci ; 13: 210, 2019.
Article in English | MEDLINE | ID: mdl-30949018

ABSTRACT

Recent research has reported the application of image fusion technologies to medical images in a wide range of settings, such as the diagnosis of brain diseases, the detection of glioma and the diagnosis of Alzheimer's disease. In our study, a new fusion method based on the combination of the shuffled frog leaping algorithm (SFLA) and the pulse coupled neural network (PCNN) is proposed for the fusion of SPECT and CT images to improve the quality of fused brain images. First, the intensity-hue-saturation (IHS) components of the SPECT and CT images are decomposed independently using a non-subsampled contourlet transform (NSCT), which yields both low-frequency and high-frequency sub-images. The combined SFLA and PCNN is then used to fuse the high-frequency sub-band images and the low-frequency images, with the SFLA optimizing the PCNN parameters. Finally, the fused image is produced by the inverse NSCT and inverse IHS transforms. We evaluated our algorithm using standard deviation (SD), mean gradient (G), spatial frequency (SF) and information entropy (E) on three different sets of brain images. The experimental results demonstrated that the proposed fusion method significantly enhances both precision and spatial resolution.
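The four objective quality indices used to compare the fused images can be computed as follows; the formulas follow their common definitions in the image-fusion literature and may differ in detail from the authors' implementation.

```python
import numpy as np

def fusion_quality(img):
    """Standard deviation, mean gradient, spatial frequency and entropy of a
    fused image, following their common definitions in the fusion literature."""
    img = np.asarray(img, dtype=float)
    sd = img.std()
    gy, gx = np.gradient(img)
    mean_gradient = np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))
    rf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))     # row frequency
    cf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))     # column frequency
    spatial_frequency = np.sqrt(rf ** 2 + cf ** 2)
    hist, _ = np.histogram(img, bins=256)
    p = hist[hist > 0] / img.size
    entropy = -np.sum(p * np.log2(p))
    return sd, mean_gradient, spatial_frequency, entropy
```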

14.
Med Biol Eng Comput ; 57(2): 379-388, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30123948

ABSTRACT

Early detection of breast tumors, diagnosis of pre-ulcer conditions in the feet of diabetic patients, and identification of the location of pain are all important to physicians. Hot or cold regions in medical thermographic images are potentially suspicious, so extracting the hottest or coldest regions in body thermographic images is an important task. Lazy snapping is an interactive image cutout algorithm that can extract the hottest or coldest regions in body thermographic images quickly, with easy detailed adjustment. Its most important advantage is that it can readily provide results for physicians in real time. In other words, it is a good interactive image segmentation algorithm because it has two basic characteristics: (1) given certain user input, it produces an intuitive segmentation that reflects the user's intent, and (2) it is efficient enough to provide instant visual feedback. Compared with other methods used by the authors for segmentation of breast thermograms, such as K-means, fuzzy c-means, level set, and mean shift algorithms, lazy snapping was more user-friendly and provided instant visual feedback. In this study, twelve test cases were presented, and the lazy snapping algorithm was applied to extract the hottest or coldest regions from the corresponding body thermographic images. The time taken to see the results varied from 7 to 30 s for these cases. It was concluded that lazy snapping was much faster for segmentation than the other methods applied by the authors (K-means, fuzzy c-means, level set, and mean shift). Graphical abstract: time taken to apply the lazy snapping algorithm to extract suspicious regions in the presented thermograms (in seconds).


Subject(s)
Breast Neoplasms/diagnosis , Algorithms , Breast/pathology , Female , Fuzzy Logic , Hot Temperature , Humans , Image Interpretation, Computer-Assisted/methods , Middle Aged , Pattern Recognition, Automated/methods , Thermography/methods
15.
Biomed Eng Online ; 16(1): 36, 2017 Mar 23.
Article in English | MEDLINE | ID: mdl-28335790

ABSTRACT

Current clinically accepted technologies for cancer treatment still have limitations, prompting the exploration of new therapeutic methods. Over the past few decades, hyperthermia treatment has attracted the attention of investigators owing to the strong biological rationale for applying hyperthermia as a cancer treatment modality. Advances in nanotechnology offer a potential new heating method for hyperthermia using nanoparticles, termed magnetic fluid hyperthermia (MFH). In MFH, superparamagnetic nanoparticles dissipate heat through Néelian and Brownian relaxation in the presence of an alternating magnetic field. The heating power of these particles depends on particle properties and treatment settings. A number of pre-clinical and clinical trials have tested the feasibility of this novel treatment modality, but issues remain to be solved for its successful transition from bench to bedside, including the planning, execution, monitoring and optimization of treatment. Modeling and simulation play crucial roles in addressing some of these issues. This review therefore provides a basic understanding of the fundamentals and rationale of hyperthermia, and of recent developments in the modeling and simulation used to describe heat generation and transfer in MFH.
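The Néelian and Brownian relaxation mechanisms mentioned above are commonly described by Rosensweig-type expressions; a small sketch computing the two relaxation times and their effective combination is given below, with generic illustrative parameter values that are not taken from the review.

```python
import numpy as np

KB = 1.380649e-23          # Boltzmann constant [J/K]

def relaxation_times(core_radius_m, K_anis, eta, T=310.0, tau0=1e-9, shell_m=0.0):
    """Neel and Brownian relaxation times for a superparamagnetic particle
    (Rosensweig-type expressions); parameter values are generic examples."""
    v_core = 4.0 / 3.0 * np.pi * core_radius_m ** 3                 # magnetic core volume
    v_hydro = 4.0 / 3.0 * np.pi * (core_radius_m + shell_m) ** 3    # hydrodynamic volume
    tau_neel = tau0 * np.exp(K_anis * v_core / (KB * T))
    tau_brown = 3.0 * eta * v_hydro / (KB * T)
    tau_eff = tau_neel * tau_brown / (tau_neel + tau_brown)         # combined relaxation
    return tau_neel, tau_brown, tau_eff

# e.g. an 8 nm magnetite-like core in water at body temperature:
print(relaxation_times(core_radius_m=8e-9, K_anis=2.3e4, eta=1e-3))
```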


Subject(s)
Hyperthermia, Induced/methods , Magnetic Fields , Models, Biological , Physical Phenomena , Animals , Hot Temperature , Humans , Nanoparticles/chemistry
16.
Comput Biol Med ; 71: 241-51, 2016 Apr 01.
Article in English | MEDLINE | ID: mdl-26897481

ABSTRACT

Early expansion of the infarcted zone after acute myocardial infarction (AMI) has serious short- and long-term consequences and contributes to increased mortality. Thus, identifying moderate and severe phases of AMI before they lead to other catastrophic post-MI medical conditions is important for aggressive treatment and management. Advanced image processing techniques together with a robust classifier applied to two-dimensional (2D) echocardiograms may enable automated classification of the extent of infarcted myocardium. Therefore, this paper proposes algorithms based on the Curvelet Transform (CT) and Local Configuration Pattern (LCP) for automated detection of normal, moderately infarcted and severely infarcted myocardium using 2D echocardiograms. The methodology extracts LCP features from the CT coefficients of the echocardiograms. The obtained features are subjected to Marginal Fisher Analysis (MFA) for dimensionality reduction, followed by a fuzzy entropy-based ranking method. Different classifiers are used to separate the ranked features into three classes (normal, moderately infarcted, and severely infarcted) based on the extent of myocardial damage. The developed algorithm achieved an accuracy of 98.99%, sensitivity of 98.48% and specificity of 100% with the Support Vector Machine (SVM) classifier using only six features. Furthermore, we developed an integrated index called the Myocardial Infarction Risk Index (MIRI) to detect normal, moderately and severely infarcted myocardium using a single number. The proposed system may aid clinicians in faster identification and quantification of the extent of infarcted myocardium using 2D echocardiograms, and in identifying persons at risk of developing heart failure based on the extent of infarction.


Subject(s)
Algorithms , Data Mining/methods , Echocardiography/methods , Image Processing, Computer-Assisted/methods , Myocardial Infarction/diagnostic imaging , Support Vector Machine , Female , Heart Failure/diagnostic imaging , Heart Failure/etiology , Humans , Male , Myocardial Infarction/complications
17.
Comput Biol Med ; 71: 231-40, 2016 Apr 01.
Article in English | MEDLINE | ID: mdl-26898671

ABSTRACT

Cross-sectional view echocardiography is an efficient non-invasive diagnostic tool for characterizing myocardial infarction (MI) and the stages of expansion leading to heart failure. An automated computer-aided technique for assessing cross-sectional echocardiography features can aid clinicians in earlier and more reliable detection of MI patients before subsequent catastrophic post-MI medical conditions. Therefore, this paper proposes a novel Myocardial Infarction Index (MII) to discriminate infarcted and normal myocardium using features extracted from apical cross-sectional views of echocardiograms. The cross-sectional views of normal and MI echocardiography images are represented as textons using Maximum Response (MR8) filter banks. Fractal Dimension (FD), Higher-Order Statistics (HOS), Hu's moments, Gabor Transform features, Fuzzy Entropy (FEnt), Energy, Local Binary Pattern (LBP), Renyi's Entropy (REnt), Shannon's Entropy (ShEnt), and Kapur's Entropy (KEnt) features are extracted from the textons. These features are ranked using the t-test and fuzzy Max-Relevancy and Min-Redundancy (mRMR) ranking methods. Combinations of highly ranked features are then used to formulate and develop an integrated MII, which accurately and quickly detects infarcted myocardium using a single numerical value. The highly ranked features are also classified with different classifiers to characterize normal and MI LV ultrasound images using a minimum number of features. The technique characterizes MI with an average accuracy of 94.37%, sensitivity of 91.25% and specificity of 97.50% using 8 apical four-chamber view features extracted from only a single frame per patient, making the classification more reliable and accurate.


Subject(s)
Echocardiography/methods , Image Processing, Computer-Assisted/methods , Myocardial Infarction/diagnostic imaging , Myocardium , Cross-Sectional Studies , Female , Humans , Male
18.
Biomed Res Int ; 2015: 861627, 2015.
Article in English | MEDLINE | ID: mdl-26509168

ABSTRACT

Computational methods have played an important role in health care in recent years, as determining parameters that affect a certain medical condition is not possible in experimental conditions in many cases. Computational fluid dynamics (CFD) methods have been used to accurately determine the nature of blood flow in the cardiovascular and nervous systems and air flow in the respiratory system, thereby giving the surgeon a diagnostic tool to plan treatment accordingly. Machine learning or data mining (MLD) methods are currently used to develop models that learn from retrospective data to make a prediction regarding factors affecting the progression of a disease. These models have also been successful in incorporating factors such as patient history and occupation. MLD models can be used as a predictive tool to determine rupture potential in patients with abdominal aortic aneurysms (AAA) along with CFD-based prediction of parameters like wall shear stress and pressure distributions. A combination of these computer methods can be pivotal in bridging the gap between translational and outcomes research in medicine. This paper reviews the use of computational methods in the diagnosis and treatment of AAA.


Subject(s)
Aortic Aneurysm, Abdominal/physiopathology , Cardiovascular System/physiopathology , Models, Cardiovascular , Nervous System/physiopathology , Biomechanical Phenomena , Computer Simulation , Disease Progression , Humans , Hydrodynamics , Image Processing, Computer-Assisted , Machine Learning , Risk Factors
19.
J Therm Biol ; 51: 23-32, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25965014

ABSTRACT

Radiofrequency ablation (RFA) is increasingly used to treat cancer in a multitude of situations and tissue types. To perform the therapy safely and reliably, the effect of critical parameters needs to be known beforehand. Temperature plays an important role in the outcome of the therapy, and any uncertainty in temperature assessment can be lethal. This study presents the RFA case of a fixed tip temperature, in which we analysed the effect of the electrical conductivity, thermal conductivity and blood perfusion rate of the tumour and surrounding normal tissue on the ablation. Ablation volume was chosen as the characteristic to be optimised, and temperature control was achieved via a PID controller. The effect of all six parameters, each having three levels, was quantified with a minimum number of experiments by harnessing the fractional factorial nature of Taguchi's orthogonal arrays. It was observed that as blood perfusion increases, the ablation volume decreases. Increasing the electrical conductivity of the tumour increases the ablation volume, whereas increasing the electrical conductivity of the normal tissue tends to decrease it, and vice versa. Likewise, increasing the thermal conductivity of the tumour enhances the ablation volume, whereas increasing the thermal conductivity of the surrounding normal tissue reduces it, and vice versa. With an increase in tumour size (from 2 to 3 cm), the effect of each parameter is not linear: the parameter effects vary with tumour size, as manifested by the different gradients observed in the ablation volume. Most importantly, the ablation volume is relatively insensitive to the blood perfusion rate for the smaller tumour size (2 cm), which is in accordance with previous results in the literature. These findings provide initial insight for safe, reliable and improved treatment planning.
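The fixed-tip-temperature control mentioned above is typically realized with a discrete PID loop; a generic sketch is shown below, with illustrative gains and set-point, since the paper's controller parameters are not given in the abstract.

```python
class PIDController:
    """Discrete PID controller of the kind used to hold the electrode tip at a
    fixed target temperature in an RFA simulation; gains are illustrative."""

    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured_temp):
        error = self.setpoint - measured_temp
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g. pid = PIDController(kp=2.0, ki=0.5, kd=0.1, setpoint=90.0, dt=0.1)
#      applied_source_power = pid.update(current_tip_temperature)
```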


Subject(s)
Catheter Ablation/methods , Models, Biological , Neoplasms/surgery , Catheter Ablation/adverse effects , Computer Simulation , Electric Conductivity , Electricity , Humans , Temperature , Thermal Conductivity
20.
Comput Biol Med ; 62: 86-93, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25912990

ABSTRACT

Myocardial Infarction (MI), or acute MI (AMI), is one of the leading causes of death worldwide. Precise and timely identification of MI and of the extent of muscle damage helps in early treatment and reduces the time taken for further tests. MI diagnosis using 2D echocardiography is prone to inter- and intra-observer variability. A computerised scheme based on image processing and artificial intelligence techniques can therefore reduce the workload of clinicians and improve diagnostic accuracy, and a Computer-Aided Diagnosis (CAD) system for infarcted versus normal ultrasound images would be useful for clinicians. In this study, the performance of CAD approaches using the Discrete Wavelet Transform (DWT), second-order statistics calculated from the Gray-Level Co-occurrence Matrix (GLCM), and Higher-Order Spectra (HOS) texture descriptors are compared. The proposed system is validated using 400 MI and 400 normal ultrasound images obtained from 80 patients with MI and 80 normal subjects. The extracted features are ranked by t-value and fed to a Support Vector Machine (SVM) classifier to obtain the best performance with a minimum number of features. The features extracted from the DWT coefficients achieved an accuracy of 99.5%, sensitivity of 99.75% and specificity of 99.25%; GLCM achieved an accuracy of 85.75%, sensitivity of 90.25% and specificity of 81.25%; and HOS achieved an accuracy of 93.0%, sensitivity of 94.75% and specificity of 91.25%. Among the three techniques, DWT yielded the highest classification accuracy. Thus, the proposed CAD approach may be used as a complementary tool to assist cardiologists in making a more accurate diagnosis of MI.
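A sketch of a DWT-based feature extraction and SVM classification stage is given below using PyWavelets and scikit-learn; the wavelet family, decomposition level and sub-band statistics are illustrative assumptions, not the paper's exact choices, and `train_images`/`train_labels` are hypothetical.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def dwt_features(image, wavelet="db4", level=2):
    """Simple statistics of 2-D DWT detail sub-bands as texture features;
    wavelet family, level and statistics are illustrative choices."""
    coeffs = pywt.wavedec2(np.asarray(image, dtype=float), wavelet, level=level)
    feats = []
    for band in coeffs[1:]:                  # (cH, cV, cD) tuples per level
        for c in band:
            feats.extend([np.mean(np.abs(c)), np.std(c), np.sum(c ** 2)])
    return np.array(feats)

svm = SVC(kernel="rbf", C=1.0)
# svm.fit(np.vstack([dwt_features(img) for img in train_images]), train_labels)
# svm.score(np.vstack([dwt_features(img) for img in test_images]), test_labels)
```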


Subject(s)
Diagnosis, Computer-Assisted/methods , Echocardiography, Doppler/methods , Myocardial Infarction/diagnostic imaging , Signal Processing, Computer-Assisted , Support Vector Machine , Female , Humans , Male