Results 1 - 20 of 24
1.
Comput Biol Med ; 168: 107755, 2024 01.
Article in English | MEDLINE | ID: mdl-38039895

ABSTRACT

The visualization and comparison of electrophysiological information in the atrium among different patients could be facilitated by a standardized 2D atrial mapping. However, due to the complexity of the atrial anatomy, unfolding the 3D geometry into a 2D atrial mapping is challenging. In this study, we aim to develop a standardized approach to achieve a 2D atrial mapping that connects the left and right atria, while maintaining fixed positions and sizes of atrial segments across individuals. Atrial segmentation is a prerequisite for the process. Segmentation includes 19 different segments: 12 from the left atrium, 5 from the right atrium, and 2 for the atrial septum. To ensure consistent and physiologically meaningful segment connections, an automated procedure is applied to open up the atrial surfaces and project the 3D information into 2D. The corresponding 2D atrial mapping can then be utilized to visualize different electrophysiological information of a patient, such as activation time patterns or phase maps. This can in turn provide useful information for guiding catheter ablation. The proposed standardized 2D maps can also be used to more easily compare structural information, such as fibrosis distribution, with rotor presence and location. We show several examples of visualization of different electrophysiological properties for both healthy subjects and patients affected by atrial fibrillation. These examples show that the proposed maps provide an easy way to visualize and interpret intra-subject information and perform inter-subject comparison, which may provide a reference framework for the analysis of the atrial fibrillation substrate before treatment, and during a catheter ablation procedure.


Subject(s)
Atrial Appendage , Atrial Fibrillation , Catheter Ablation , Humans , Atrial Fibrillation/diagnostic imaging , Atrial Fibrillation/surgery , Heart Atria/diagnostic imaging , Catheter Ablation/methods
2.
PLoS One ; 18(11): e0292030, 2023.
Article in English | MEDLINE | ID: mdl-38032940

ABSTRACT

The liver is the primary site for the metabolism and detoxification of many compounds, including pharmaceuticals. Consequently, it is also the primary location for many adverse reactions. As the liver is not readily accessible for sampling in humans, rodent or cell line models are often used to evaluate potential toxic effects of a novel compound or candidate drug. However, relating the results of animal and in vitro studies to relevant clinical outcomes for the human in vivo situation still proves challenging. In this study, we incorporate principles of transfer learning within a deep artificial neural network, allowing us to leverage the relative abundance of rat in vitro and in vivo exposure data from the Open TG-GATEs data set to train a model to predict the expected pattern of human in vivo gene expression following an exposure, given measured human in vitro gene expression. We show that domain adaptation has been successfully achieved, with the rat and human in vitro data no longer being separable in the common latent space generated by the network. The network produces physiologically plausible predictions of human in vivo gene expression pattern following an exposure to a previously unseen compound. Moreover, we show that the integration of the human in vitro data in the training of the domain adaptation network significantly improves the temporal accuracy of the predicted rat in vivo gene expression pattern following an exposure to a previously unseen compound. In this way, we demonstrate the improvements in prediction accuracy that can be achieved by combining data from distinct domains.
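The abstract's deep domain-adaptation network is not reproduced here. As a minimal sketch of the underlying idea (aligning feature distributions across domains so that the two data sources become inseparable), the code below uses a simpler CORAL-style covariance alignment on synthetic "rat" and "human" feature matrices; the data and dimensions are purely illustrative.

```python
import numpy as np

def coral_align(source, target, eps=1e-6):
    """CORAL-style alignment: whiten the source features, then re-color them
    with the target covariance, so second-order statistics match the target.
    Rows are samples, columns are features."""
    cs = np.cov(source, rowvar=False) + eps * np.eye(source.shape[1])
    ct = np.cov(target, rowvar=False) + eps * np.eye(target.shape[1])
    whiten = np.linalg.inv(np.linalg.cholesky(cs))   # removes source covariance
    color = np.linalg.cholesky(ct)                   # imposes target covariance
    centered = source - source.mean(axis=0)
    return centered @ whiten.T @ color.T + target.mean(axis=0)

rng = np.random.default_rng(0)
rat = rng.normal(0.0, 1.0, (200, 3)) * [1.0, 2.0, 0.5]    # synthetic "rat" features
human = rng.normal(1.0, 1.0, (200, 3)) * [0.5, 1.0, 2.0]  # synthetic "human" features
aligned = coral_align(rat, human)
```

After alignment the two point clouds share mean and covariance, which is the same goal the network's shared latent space pursues with a learned, nonlinear map.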


Subject(s)
Liver , Neural Networks, Computer , Humans , Rats , Animals , Learning , Machine Learning , Gene Expression
3.
Arch Toxicol ; 97(11): 2969-2981, 2023 11.
Article in English | MEDLINE | ID: mdl-37603094

ABSTRACT

Drug-induced intrahepatic cholestasis (DIC) is a main type of hepatic toxicity that is challenging to predict in early drug development stages. Preclinical animal studies often fail to detect DIC in humans. In vitro toxicogenomics assays using human liver cells have become a practical approach to predict human-relevant DIC. The present study was set up to identify transcriptomic signatures of DIC by applying machine learning algorithms to the Open TG-GATEs database. A total of nine DIC compounds and nine non-DIC compounds were selected, and supervised classification algorithms were applied to develop prediction models using differentially expressed features. Feature selection techniques identified 13 genes that achieved optimal prediction performance using logistic regression combined with a sequential backward selection method. The internal validation of the best-performing model showed accuracy of 0.958, sensitivity of 0.941, specificity of 0.978, and F1-score of 0.956. Applying the model to an external validation set resulted in an average prediction accuracy of 0.71. The identified genes were mechanistically linked to the adverse outcome pathway network of DIC, providing insights into cellular and molecular processes during response to chemical toxicity. Our findings provide valuable insights into toxicological responses and enhance the predictive accuracy of DIC prediction, thereby advancing the application of transcriptome profiling in designing new approach methodologies for hazard identification.
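The reported validation metrics (accuracy, sensitivity, specificity, F1) all derive from a binary confusion matrix; a small sketch with hypothetical labels, not the study's data:

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity (recall), specificity and F1 for binary labels."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    f1 = (2 * precision * sensitivity / (precision + sensitivity)
          if precision + sensitivity else 0.0)
    return accuracy, sensitivity, specificity, f1

# Hypothetical labels (1 = DIC, 0 = non-DIC)
acc, sens, spec, f1 = binary_metrics([1, 1, 0, 0], [1, 0, 0, 0])
```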


Subject(s)
Adverse Outcome Pathways , Chemical and Drug Induced Liver Injury , Cholestasis , Animals , Humans , Cholestasis/chemically induced , Cholestasis/genetics , Chemical and Drug Induced Liver Injury/genetics , Machine Learning
4.
Front Physiol ; 14: 1158003, 2023.
Article in English | MEDLINE | ID: mdl-37089414

ABSTRACT

Background: The optimal way to determine repolarization time (RT) from the intracardiac unipolar electrogram (UEG) has been a topic of debate for decades. RT is typically determined by either the Wyatt method or the "alternative method," which both consider UEG T-wave slope, but differently. Objective: To determine the optimal method to measure RT on the UEG. Methods: Seven pig hearts surrounded by an epicardial sock with 100 electrodes were Langendorff-perfused with selective cannulation of the left anterior descending (LAD) coronary artery and submersed in a torso-shaped tank containing 256 electrodes on the torso surface. Repolarization was prolonged in the non-LAD regions by infusing dofetilide and shortened in the LAD region using pinacidil. RT was determined by the Wyatt (tWyatt) and alternative (tAlt) methods, in both invasive UEGs (recorded with epicardial electrodes) and non-invasive UEGs (reconstructed with electrocardiographic imaging). tWyatt and tAlt were compared to local effective refractory period (ERP). Results: With contact mapping, mean absolute error (MAE) of tWyatt and tAlt vs. ERP were 21 ms and 71 ms, respectively. Positive T-waves typically had an earlier ERP than negative T-waves, in line with theory. tWyatt, but not tAlt, was shortened by local infusion of pinacidil. Similar results were found for the non-invasive UEGs (MAE of tWyatt and tAlt vs. ERP were 30 ms and 92 ms, respectively). Conclusion: The Wyatt method is the most accurate method to determine RT from invasive and non-invasive UEGs, based on novel and historical analyses. Using it to determine RT could unify and facilitate repolarization assessment and amplify its role in cardiac electrophysiology.
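The Wyatt method places RT at the instant of maximum positive dV/dt within the T wave of the unipolar electrogram. A minimal sketch on a synthetic negative T wave (the waveform and window bounds are hypothetical; real UEGs need filtering and T-wave delineation):

```python
import numpy as np

def repolarization_time(ueg, t, t_window):
    """Wyatt method: RT = time of maximum positive dV/dt within the
    T-wave window of a unipolar electrogram."""
    lo, hi = t_window
    mask = (t >= lo) & (t <= hi)
    dvdt = np.gradient(ueg, t)
    idx = np.argmax(np.where(mask, dvdt, -np.inf))  # ignore samples outside window
    return t[idx]

t = np.linspace(0.0, 0.5, 501)                      # seconds
ueg = -np.exp(-((t - 0.25) / 0.04) ** 2)            # synthetic negative T wave
rt = repolarization_time(ueg, t, (0.1, 0.4))
```

For this Gaussian deflection the steepest upstroke (the Wyatt fiducial) sits on the recovery limb, at 0.25 + 0.04/√2 ≈ 0.278 s.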

5.
Sensors (Basel) ; 23(4)2023 Feb 07.
Article in English | MEDLINE | ID: mdl-36850438

ABSTRACT

The electrocardiogram (ECG) is the standard method in clinical practice to non-invasively analyze the electrical activity of the heart, from electrodes placed on the body's surface. The ECG can provide a cardiologist with relevant information to assess the condition of the heart and the possible presence of cardiac pathology. Nonetheless, the global view of the heart's electrical activity given by the ECG cannot provide fully detailed and localized information about abnormal electrical propagation patterns and corresponding substrates on the surface of the heart. Electrocardiographic imaging, also known as the inverse problem in electrocardiography, tries to overcome these limitations by non-invasively reconstructing the heart surface potentials, starting from the corresponding body surface potentials, and the geometry of the torso and the heart. This problem is ill-posed, and regularization techniques are needed to achieve a stable and accurate solution. The standard approach is to use zero-order Tikhonov regularization and the L-curve approach to choose the optimal value for the regularization parameter. However, different methods have been proposed for computing the optimal value of the regularization parameter. Moreover, regardless of the estimation method used, this may still lead to over-regularization or under-regularization. In order to gain a better understanding of the effects of the choice of regularization parameter value, in this study, we first focused on the regularization parameter itself, and investigated its influence on the accuracy of the reconstruction of heart surface potentials, by assessing the reconstruction accuracy with high-precision simultaneous heart and torso recordings from four dogs. For this, we analyzed a sufficiently large range of parameter values. Secondly, we evaluated the performance of five different methods for the estimation of the regularization parameter, also in view of the results of the first analysis. 
Thirdly, we investigated the effect of using a fixed value of the regularization parameter across all reconstructed beats. Accuracy was measured in terms of the quality of reconstruction of the heart surface potentials and estimation of the activation and recovery times, when compared with ground truth recordings from the experimental dog data. Results show that values of the regularization parameter in the range (0.01-0.03) provide the best accuracy, and that the three best-performing estimation methods (L-Curve, Zero-Crossing, and CRESO) give values in this range. Moreover, a fixed value of the regularization parameter could achieve very similar performance to the beat-specific parameter values calculated by the different estimation methods. These findings are relevant as they suggest that regularization parameter estimation methods may provide the accurate reconstruction of heart surface potentials only for specific ranges of regularization parameter values, and that using a fixed value of the regularization parameter may represent a valid alternative, especially when computational efficiency or consistency across time is required.
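Zero-order Tikhonov regularization, the baseline method discussed above, solves x = argmin ‖Ax − b‖² + λ²‖x‖², i.e. x = (AᵀA + λ²I)⁻¹Aᵀb. A sketch on a synthetic ill-conditioned system (the matrix, noise level, and λ are illustrative; the 0.01-0.03 range reported above applies to the authors' setup, not this toy problem):

```python
import numpy as np

def tikhonov_zero_order(A, b, lam):
    """Zero-order Tikhonov solution of the normal equations
    (A^T A + lam^2 I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Build an ill-conditioned "transfer matrix" with condition number 1e8.
rng = np.random.default_rng(1)
U, _, Vt = np.linalg.svd(rng.normal(size=(40, 40)))
A = U @ np.diag(np.logspace(0, -8, 40)) @ Vt
x_true = rng.normal(size=40)
b = A @ x_true + 1e-4 * rng.normal(size=40)         # noisy "body-surface" data

x_reg = tikhonov_zero_order(A, b, 0.02)             # regularized reconstruction
x_naive = np.linalg.solve(A, b)                     # unregularized: noise blows up
```

The unregularized solve amplifies the measurement noise by the inverse of the small singular values, while the Tikhonov solution trades a small bias for stability, which is exactly the over-/under-regularization trade-off the study quantifies.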


Subject(s)
Electrocardiography , Heart , Animals , Dogs , Heart/diagnostic imaging , Torso , Electricity , Electrodes
6.
PLoS One ; 15(12): e0243386, 2020.
Article in English | MEDLINE | ID: mdl-33290430

ABSTRACT

In the absence of curative therapies, treatment of metastatic castrate-resistant prostate cancer (mCRPC) using currently available drugs can be improved by integrating evolutionary principles that govern proliferation of resistant subpopulations into current treatment protocols. Here we develop what we term an 'evolutionary stable therapy', within the context of the mathematical model that has been used to inform the first adaptive therapy clinical trial of mCRPC. The objective of this therapy is to maintain a stable polymorphic tumor heterogeneity of therapy-sensitive and therapy-resistant cells in order to prolong treatment efficacy and progression free survival. Optimal control analysis shows that an increasing dose titration protocol, a very common clinical dosing process, can achieve tumor stabilization for a wide range of potential initial tumor compositions and volumes. Furthermore, larger tumor volumes may, counterintuitively, be more likely to be stabilized if sensitive cells dominate the tumor composition at time of initial treatment, suggesting a delay of initial treatment could prove beneficial. While it remains uncertain if metastatic disease in humans has the properties that allow it to be truly stabilized, the benefits of a dose titration protocol warrant additional pre-clinical and clinical investigations.
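The evolutionary dynamics described above can be sketched with a two-population logistic competition model in which only sensitive cells are killed by the drug and the dose follows an increasing titration. The equations and parameters below are illustrative, not the trial's calibrated model:

```python
import numpy as np

def simulate(days, dose_rate, dt=0.01, rs=0.3, rr=0.2, K=1.0, kill=1.0):
    """Euler integration of logistic competition between sensitive (S) and
    resistant (R) tumor cells sharing carrying capacity K. Only S is killed
    by the drug; the dose up-titrates linearly: dose(t) = dose_rate * t."""
    n = int(days / dt)
    S, R = 0.5, 0.05                      # initial (normalized) volumes
    hist = np.empty((n, 3))
    for i in range(n):
        dose = dose_rate * (i * dt)
        total = S + R
        S += dt * (rs * S * (1 - total / K) - kill * dose * S)
        R += dt * (rr * R * (1 - total / K))
        hist[i] = (S, R, dose)
    return hist

h = simulate(days=60.0, dose_rate=0.004)
burden = h[:, 0] + h[:, 1]                # total tumor burden over time
res_frac = h[:, 1] / burden               # resistant fraction over time
```

Even in this crude sketch, the burden stays bounded by the carrying capacity while the resistant fraction rises as treatment selects against sensitive cells, which is the selection pressure the adaptive protocol tries to manage.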


Subject(s)
Androstenes/therapeutic use , Cell Proliferation/drug effects , Docetaxel/therapeutic use , Prostatic Neoplasms, Castration-Resistant/drug therapy , Disease Progression , Docetaxel/adverse effects , Drug Resistance, Neoplasm , Humans , Kaplan-Meier Estimate , Male , Models, Theoretical , Neoplasm Metastasis , Progression-Free Survival , Prostate-Specific Antigen/blood , Prostatic Neoplasms, Castration-Resistant/economics , Prostatic Neoplasms, Castration-Resistant/epidemiology , Prostatic Neoplasms, Castration-Resistant/pathology , Treatment Outcome , Tumor Burden/drug effects
7.
PLoS One ; 15(8): e0236392, 2020.
Article in English | MEDLINE | ID: mdl-32780735

ABSTRACT

Before clinical trials, animal and cell line models are often used to evaluate the potential toxic effects of a novel compound or candidate drug before progressing to human trials. However, relating the results of animal and in vitro model exposures to relevant clinical outcomes in the human in vivo system still proves challenging, relying on often putative orthologs. In recent years, multiple studies have demonstrated that the repeated dose rodent bioassay, the current gold standard in the field, lacks sufficient sensitivity and specificity in predicting toxic effects of pharmaceuticals in humans. In this study, we evaluate the potential of deep learning techniques to translate the pattern of gene expression measured following an exposure in rodents to humans, circumventing the current reliance on orthologs, and also from in vitro to in vivo experimental designs. Of the deep learning architectures applied in this study, the convolutional neural network (CNN) and a deep artificial neural network with a bottleneck architecture significantly outperform classical machine learning techniques in predicting the time series of gene expression in primary human hepatocytes given a measured time series of gene expression from primary rat hepatocytes following exposure in vitro to a previously unseen compound across multiple toxicologically relevant gene sets, with a reduction in average mean absolute error across 76 genes shown to be predictive for identifying carcinogenicity from 0.0172 for a random regression forest to 0.0166 for the CNN model (p < 0.05). These deep learning architectures also perform well when applied to predict time series of in vivo gene expression given measured time series of in vitro gene expression for rats.


Subject(s)
Deep Learning , Gene Expression Regulation/drug effects , Machine Learning , Algorithms , Animals , Clinical Trials as Topic/statistics & numerical data , Gene Expression Regulation/genetics , Hepatocytes/drug effects , Humans , Neural Networks, Computer , Rats
8.
Sci Rep ; 10(1): 10433, 2020 06 26.
Article in English | MEDLINE | ID: mdl-32591560

ABSTRACT

Understanding adipose tissue cellular heterogeneity and homeostasis is essential to comprehend the cell type dynamics in metabolic diseases. Cellular subpopulations in the adipose tissue have been related to disease development, but efforts towards characterizing the adipose tissue cell type composition are limited. Here, we identify the cell type composition of the adipose tissue by using gene expression deconvolution of large amounts of publicly available transcriptomics level data. The proposed approach allows us to present a comprehensive study of adipose tissue cell type composition, determining the relative amounts of 21 different cell types in 1282 adipose tissue samples detailing differences across four adipose tissue depots, between genders, across ranges of BMI and in different stages of type-2 diabetes. We compare our results to previous marker-based studies by conducting a literature review of adipose tissue cell type composition and propose candidate cellular markers to distinguish different cell types within the adipose tissue. This analysis reveals gender-specific differences in CD4+ and CD8+ T cell subsets; identifies adipose tissue as a rich source of multipotent stem/stromal cells; and highlights a strongly increased immune cell content in epicardial and pericardial adipose tissue compared to subcutaneous and omental depots. Overall, this systematic analysis provides comprehensive insights into adipose tissue cell-type heterogeneity in health and disease.
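Gene-expression deconvolution models a bulk expression profile as a mixture, bulk ≈ signature × fractions, and solves for the cell-type fractions. A rough sketch using unconstrained least squares with clipping and renormalization (production tools use properly constrained solvers or support-vector regression; the signature matrix and fractions here are synthetic):

```python
import numpy as np

def deconvolve(bulk, signature):
    """Estimate cell-type fractions f from bulk = signature @ f.
    Unconstrained least squares, then clip to non-negative and renormalize
    to sum to 1 -- a crude stand-in for constrained deconvolution solvers."""
    f, *_ = np.linalg.lstsq(signature, bulk, rcond=None)
    f = np.clip(f, 0.0, None)
    return f / f.sum()

rng = np.random.default_rng(2)
signature = rng.uniform(0, 10, size=(50, 4))     # 50 marker genes x 4 cell types
f_true = np.array([0.6, 0.25, 0.1, 0.05])        # hypothetical true fractions
bulk = signature @ f_true + 0.01 * rng.normal(size=50)
f_hat = deconvolve(bulk, signature)
```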


Subject(s)
Adipocytes/metabolism , Adipose Tissue/metabolism , Obesity/metabolism , Databases, Genetic , Humans , Obesity/genetics , Transcriptome
9.
Med Biol Eng Comput ; 58(9): 1933-1945, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32535735

ABSTRACT

ECG-based representation of atrial fibrillation (AF) progression is currently limited. We propose a novel framework for a more sensitive noninvasive characterization of the AF substrate during persistent AF. An atrial activity (AA) recurrence signal is computed from body surface potential map (BSPM) recordings, and a set of characteristic indices is derived from it which captures the short- and long-term recurrent behaviour in the AA patterns. A novel measure of short- and long-term spatial variability of AA propagation is introduced, to provide an interpretation of the above indices, and to test the hypothesis that the variability in the oscillatory content of AA is due mainly to a spatially uncoordinated propagation of the AF waveforms. A simple model of atrial signal dynamics is proposed to confirm this hypothesis, and to investigate a possible influence of the AF substrate on the short-term recurrent behaviour of AA propagation. Results confirm the hypothesis, with the model also revealing the above influence. Once the characteristic indices are normalized to remove this influence, they are significantly associated with AF recurrence 4 to 6 weeks after electrical cardioversion. Therefore, the proposed framework improves noninvasive AF substrate characterization in patients with a very similar substrate. Graphical Abstract Schematic representation of the proposed framework for the noninvasive characterization of short-term atrial signal dynamics during persistent AF. The proposed framework shows that the faster the AA is propagating, the more stable its propagation paths are in the short-term (larger values of Speed in the bottom right plot should be interpreted as lower speed of propagation of the corresponding AA propagation patterns).
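A recurrence signal of the kind described can be sketched as the correlation between the spatial AA pattern (one value per electrode) at time t and the pattern a fixed lag later: values near 1 indicate that the propagation pattern recurs. This is a generic construction on synthetic data, not the paper's exact definition:

```python
import numpy as np

def recurrence_signal(bspm, lag):
    """Correlation between the spatial pattern at each instant and the
    pattern `lag` samples later. `bspm` is an electrodes x time matrix."""
    n_t = bspm.shape[1] - lag
    out = np.empty(n_t)
    for t in range(n_t):
        a = bspm[:, t] - bspm[:, t].mean()
        b = bspm[:, t + lag] - bspm[:, t + lag].mean()
        out[t] = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return out

# Synthetic "AA" whose spatial pattern repeats every 50 samples.
rng = np.random.default_rng(3)
base = rng.normal(size=(64, 50))          # 64 electrodes, one 50-sample cycle
bspm = np.tile(base, (1, 6))
r_match = recurrence_signal(bspm, 50)     # lag equal to the true period
```

At the true period the recurrence is exactly 1 everywhere; at an off-period lag it fluctuates around 0.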


Subject(s)
Atrial Fibrillation/physiopathology , Body Surface Potential Mapping/statistics & numerical data , Heart Atria/physiopathology , Models, Cardiovascular , Atrial Fibrillation/classification , Atrial Fibrillation/therapy , Biomedical Engineering , Databases, Factual , Electric Countershock , Electrocardiography/statistics & numerical data , Humans , Recurrence , Signal Processing, Computer-Assisted , Spatio-Temporal Analysis
10.
Article in English | MEDLINE | ID: mdl-30369448

ABSTRACT

Increasingly, multiple parallel omics datasets are collected from biological samples. Integrating these datasets for classification is an open area of research. Additionally, whilst multiple datasets may be available for the training samples, future samples may only be measured by a single technology, requiring methods which do not rely on the presence of all datasets for sample prediction. Here, the training datasets are jointly mapped into a common latent space, which enables us to directly compare the protein and the gene profiles. New samples with just one set of measurements (e.g., just protein) can then be mapped to this latent common space where classification is performed. Using this approach, we achieved an improvement of up to 12 percent in accuracy when classifying samples based on their protein measurements compared with baseline methods which were trained on the protein data alone. We illustrate that the additional inclusion of the gene expression or protein expression in the training process enabled the separation between the classes to become clearer.


Subject(s)
Breast Neoplasms/classification , Computational Biology/methods , Immunohistochemistry/methods , Machine Learning , Algorithms , Breast Neoplasms/genetics , Breast Neoplasms/metabolism , Female , Humans , Transcriptome/genetics
11.
PLoS Comput Biol ; 15(10): e1007400, 2019 10.
Article in English | MEDLINE | ID: mdl-31581241

ABSTRACT

Given the association of disturbances in non-esterified fatty acid (NEFA) metabolism with the development of Type 2 Diabetes and Non-Alcoholic Fatty Liver Disease, computational models of glucose-insulin dynamics have been extended to account for the interplay with NEFA. In this study, we use arteriovenous measurement across the subcutaneous adipose tissue during a mixed meal challenge test to evaluate the performance and underlying assumptions of three existing models of adipose tissue metabolism and construct a new, refined model of adipose tissue metabolism. Our model introduces new terms, explicitly accounting for the conversion of glucose to glyceraldehyde-3-phosphate, the postprandial influx of glycerol into the adipose tissue, and several physiologically relevant delays in insulin signalling in order to better describe the measured adipose tissue fluxes. We then applied our refined model to human adipose tissue flux data collected before and after a diet intervention as part of the Yoyo study, to quantify the effects of caloric restriction on postprandial adipose tissue metabolism. Significant increases were observed in the model parameters describing the rate of uptake and release of both glycerol and NEFA. Additionally, decreases in the model's delay in insulin signalling parameters indicate there is an improvement in adipose tissue insulin sensitivity following caloric restriction.
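Delays in insulin signalling of the kind mentioned above are commonly modeled with a first-order "remote compartment": the effective signal x follows plasma insulin I with time constant tau, dx/dt = (I(t) − x)/tau. This is a generic construction, not the authors' exact model, and the insulin curve is synthetic:

```python
import numpy as np

def delayed_signal(insulin, t, tau):
    """First-order remote-compartment delay dx/dt = (I(t) - x) / tau,
    integrated with the explicit Euler method."""
    x = np.zeros_like(insulin)
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        x[i] = x[i - 1] + dt * (insulin[i - 1] - x[i - 1]) / tau
    return x

t = np.linspace(0, 240, 2401)                # minutes after the meal
insulin = np.exp(-((t - 40) / 25.0) ** 2)    # synthetic postprandial peak at 40 min
slow = delayed_signal(insulin, t, tau=30.0)  # long signalling delay
fast = delayed_signal(insulin, t, tau=5.0)   # short signalling delay
```

A larger tau both postpones and attenuates the peak of the effective signal, so a decrease in the fitted delay parameter corresponds to faster insulin action.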


Subject(s)
Adipose Tissue/metabolism , Computational Biology/methods , Lipid Metabolism/physiology , Arteriovenous Anastomosis/metabolism , Blood Glucose/metabolism , Computer Simulation , Fatty Acids/metabolism , Fatty Acids, Nonesterified/metabolism , Glucose/metabolism , Humans , Insulin/metabolism , Isotopes , Lipids/physiology , Models, Biological , Postprandial Period/physiology
12.
Arch Toxicol ; 93(11): 3067-3098, 2019 11.
Article in English | MEDLINE | ID: mdl-31586243

ABSTRACT

Drug-induced liver injury (DILI) complicates safety assessment for new drugs and poses major threats to both patient health and drug development in the pharmaceutical industry. A number of human liver cell-based in vitro models combined with toxicogenomics methods have been developed as an alternative to animal testing for studying human DILI mechanisms. In this review, we discuss the in vitro human liver systems and their applications in omics-based drug-induced hepatotoxicity studies. We furthermore present bioinformatic approaches that are useful for analyzing toxicogenomic data generated from these models and discuss their current and potential contributions to the understanding of mechanisms of DILI. Human pluripotent stem cells, carrying donor-specific genetic information, hold great potential for advancing the study of individual-specific toxicological responses. When co-cultured with other liver-derived non-parenchymal cells in a microfluidic device, the resulting dynamic platform enables us to study immune-mediated drug hypersensitivity and accelerates personalized drug toxicology studies. A flexible microfluidic platform would also support the assembly of a more advanced organs-on-a-chip device, further bridging the gap between in vitro and in vivo conditions. The standard transcriptomic analysis of these cell systems can be complemented with causality-inferring approaches to improve the understanding of DILI mechanisms. These approaches involve statistical techniques capable of elucidating regulatory interactions in parts of these mechanisms. The use of more elaborated human liver models, in harmony with causality-inferring bioinformatic approaches will pave the way for establishing a powerful methodology to systematically assess DILI mechanisms across a wide range of conditions.


Subject(s)
Animal Testing Alternatives/methods , Chemical and Drug Induced Liver Injury , Liver/drug effects , Transcriptome/drug effects , Animals , Cell Line, Tumor , Chemical and Drug Induced Liver Injury/etiology , Chemical and Drug Induced Liver Injury/metabolism , Chemical and Drug Induced Liver Injury/pathology , Computational Biology , Gene Expression Profiling , Hepatocytes/drug effects , Hepatocytes/metabolism , Hepatocytes/pathology , Humans , In Vitro Techniques , Lab-On-A-Chip Devices , Liver/metabolism , Liver/pathology , Spheroids, Cellular/drug effects , Spheroids, Cellular/metabolism , Spheroids, Cellular/pathology , Stem Cells/drug effects , Stem Cells/metabolism , Stem Cells/pathology
13.
Sci Rep ; 9(1): 9388, 2019 06 28.
Article in English | MEDLINE | ID: mdl-31253846

ABSTRACT

The Muscle Insulin Sensitivity Index (MISI) has been developed to estimate muscle-specific insulin sensitivity based on oral glucose tolerance test (OGTT) data. To date, the score has been implemented with considerable variation in literature and initial positive evaluations were not reproduced in subsequent studies. In this study, we investigate the computation of MISI on OGTT data with differing sampling schedules and aim to standardise and improve its calculation. Seven-time-point OGTT data for 2631 individuals from the Maastricht Study and seven-time-point OGTT data combined with a hyperinsulinaemic-euglycaemic clamp for 71 individuals from the PRESERVE Study were used to evaluate the performance of MISI. MISI was computed on subsets of OGTT data representing four and five time point sampling schedules to determine minimal requirements for accurate computation of the score. A modified MISI computed on cubic splines of the measured data, resulting in improved identification of glucose peak and nadir, was compared with the original method, yielding an increased correlation (ρ = 0.576) with the clamp measurement of peripheral insulin sensitivity as compared to the original method (ρ = 0.513). Finally, a standalone MISI calculator was developed allowing for a standardised method of calculation using both the original and improved methods.
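The peak/nadir identification step can be sketched as follows: interpolate the sparse OGTT glucose samples smoothly, then locate the postprandial peak and the subsequent nadir on a dense grid. Here a single cubic least-squares fit stands in for the authors' cubic-spline interpolation, and the seven-point sample values are hypothetical:

```python
import numpy as np

def glucose_peak_nadir(t_meas, g_meas, grid_step=1.0):
    """Fit a smooth cubic to sparse OGTT glucose samples and locate the
    postprandial peak and the subsequent nadir on a dense time grid."""
    coeffs = np.polyfit(t_meas, g_meas, deg=3)
    grid = np.arange(t_meas[0], t_meas[-1] + grid_step, grid_step)
    g = np.polyval(coeffs, grid)
    i_peak = int(np.argmax(g))
    i_nadir = i_peak + int(np.argmin(g[i_peak:]))   # nadir after the peak
    return grid[i_peak], g[i_peak], grid[i_nadir], g[i_nadir]

# Hypothetical seven-time-point OGTT (minutes; mmol/L)
t_meas = np.array([0, 15, 30, 45, 60, 90, 120.0])
g_meas = np.array([5.0, 7.5, 9.0, 8.5, 7.0, 5.5, 5.0])
t_peak, g_peak, t_nadir, g_nadir = glucose_peak_nadir(t_meas, g_meas)
```

With only four or five measured time points, the true peak often falls between samples, which is why interpolating before locating the extrema improves the score's robustness.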


Subject(s)
Glucose Intolerance , Glucose/metabolism , Insulin Resistance , Insulin/metabolism , Muscle, Skeletal/metabolism , Adult , Aged , Blood Glucose , Female , Glucose/administration & dosage , Glucose Tolerance Test/methods , Glucose Tolerance Test/standards , Humans , Male , Metabolic Syndrome/diagnosis , Metabolic Syndrome/etiology , Metabolic Syndrome/metabolism , Middle Aged , Reproducibility of Results
14.
EPMA J ; 9(2): 161-173, 2018 Jun.
Article in English | MEDLINE | ID: mdl-29896315

ABSTRACT

BACKGROUND: It is uncertain whether repeated measurements of a multi-target biomarker panel may help to personalize medical heart failure (HF) therapy to improve outcome in chronic HF. METHODS: This analysis included 499 patients from the Trial of Intensified versus standard Medical therapy in Elderly patients with Congestive Heart Failure (TIME-CHF), aged ≥ 60 years, LVEF ≤ 45%, and NYHA ≥ II, who had repeated clinical visits within 19 months follow-up. The interaction between repeated measurements of biomarkers and treatment effects of loop diuretics, spironolactone, β-blockers, and renin-angiotensin system (RAS) inhibitors on risk of HF hospitalization or death was investigated in a hypothesis-generating analysis. Generalized estimating equation (GEE) models were used to account for the correlation between recurrences of events in a patient. RESULTS: One hundred patients (20%) had just one event (HF hospitalization or death) and 87 (17.4%) had at least two events. Loop diuretic up-titration had a beneficial effect for patients with high interleukin-6 (IL6) or high high-sensitivity C-reactive protein (hsCRP) (interaction, P = 0.013 and P = 0.001), whereas the opposite was the case with low hsCRP (interaction, P = 0.013). Higher dosage of loop diuretics was associated with poor outcome in patients with high blood urea nitrogen (BUN) or prealbumin (interaction, P = 0.006 and P = 0.001), but not in those with low levels of these biomarkers. Spironolactone up-titration was associated with lower risk of HF hospitalization or death in patients with high cystatin C (CysC) (interaction, P = 0.021). β-Blocker up-titration might have a beneficial effect in patients with low soluble fms-like tyrosine kinase-1 (sFlt) (interaction, P = 0.021). No treatment biomarker interactions were found for RAS inhibition.
CONCLUSION: The data of this post hoc analysis suggest that decision-making using repeated biomarker measurements may be very promising in bringing treatment of heart failure to a new level in the context of predictive, preventive, and personalized medicine. Clearly, prospective testing is needed before this novel concept can be adopted. CLINICAL TRIAL REGISTRATION: isrctn.org, identifier: ISRCTN43596477.

15.
Med Biol Eng Comput ; 56(11): 2039-2050, 2018 Nov.
Article in English | MEDLINE | ID: mdl-29752679

ABSTRACT

We investigated a novel sparsity-based regularization method in the wavelet domain of the inverse problem of electrocardiography that aims at preserving the spatiotemporal characteristics of heart-surface potentials. In three normal, anesthetized dogs, electrodes were implanted around the epicardium and body-surface electrodes were attached to the torso. Potential recordings were obtained simultaneously on the body surface and on the epicardium. A CT scan was used to digitize a homogeneous geometry which consisted of the body-surface electrodes and the epicardial surface. A novel multitask elastic-net-based method was introduced to regularize the ill-posed inverse problem. The method simultaneously pursues a sparse wavelet representation in time-frequency and exploits correlations in space. Performance was assessed in terms of quality of reconstructed epicardial potentials, estimated activation and recovery time, and estimated locations of pacing, and compared with performance of Tikhonov zeroth-order regularization. The wavelet-domain representation was sparser than the time-domain representation. Epicardial potentials were non-invasively reconstructed with higher accuracy than with Tikhonov zeroth-order regularization (p < 0.05), and recovery times were improved (p < 0.05). No significant improvement was found in terms of activation times and localization of origin of pacing. Next to improved estimation of recovery isochrones, which is important when assessing substrate for cardiac arrhythmias, this novel technique opens potentially powerful opportunities for clinical application, by allowing the choice of wavelet bases that are optimized for specific clinical questions. Graphical Abstract The inverse problem of electrocardiography is to reconstruct heart-surface potentials from recorded body-surface electrocardiograms (ECGs) and a torso-heart geometry. However, it is ill-posed and solving it requires additional constraints for regularization.
We introduce a regularization method that simultaneously pursues a sparse wavelet representation in time-frequency and exploits correlations in space. Our approach reconstructs epicardial (heart-surface) potentials with higher accuracy than common methods. It also improves the reconstruction of recovery isochrones, which is important when assessing substrate for cardiac arrhythmias. This novel technique opens potentially powerful opportunities for clinical application, by allowing the choice of wavelet bases that are optimized for specific clinical questions.
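The elastic-net penalty combines an L1 term (sparsity) with an L2 term (grouping of correlated coefficients). The core step in many elastic-net solvers is its proximal operator, i.e. soft-thresholding followed by shrinkage; applied to wavelet coefficients it zeroes small time-frequency coefficients while shrinking the rest together. This is the generic formula, not the authors' multitask solver, and the coefficient values are hypothetical:

```python
import numpy as np

def elastic_net_prox(v, lam, alpha):
    """Proximal operator of lam * (alpha*||x||_1 + 0.5*(1-alpha)*||x||_2^2):
    soft-threshold at lam*alpha, then shrink by 1/(1 + lam*(1-alpha))."""
    soft = np.sign(v) * np.maximum(np.abs(v) - lam * alpha, 0.0)
    return soft / (1.0 + lam * (1.0 - alpha))

coeffs = np.array([3.0, -0.2, 0.05, -1.5, 0.0])   # hypothetical wavelet coefficients
sparse = elastic_net_prox(coeffs, lam=0.5, alpha=0.8)
```

With lam = 0.5 and alpha = 0.8, coefficients below 0.4 in magnitude are set exactly to zero, and the survivors are shrunk by a factor 1/1.1.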


Subject(s)
Electrocardiography/methods , Signal Processing, Computer-Assisted , Algorithms , Animals , Body Surface Potential Mapping/methods , Dogs , Electrodes, Implanted , Pericardium/diagnostic imaging , Pericardium/physiology , Tomography, X-Ray Computed
16.
J Card Fail ; 23(5): 382-389, 2017 May.
Article in English | MEDLINE | ID: mdl-28232046

ABSTRACT

BACKGROUND: Although heart failure (HF) patients are known to experience repeated hospitalizations, most studies evaluated only time to first event. N-terminal pro-B-type natriuretic peptide (NT-proBNP)-guided therapy has not convincingly been shown to improve HF-specific outcomes, and effects on recurrent all-cause hospitalization are uncertain. Therefore, we investigated the effect of NT-proBNP-guided therapy on recurrent events in HF with the use of a time-between-events approach in a hypothesis-generating analysis. METHODS AND RESULTS: The Trial of Intensified Versus Standard Medical Therapy in Elderly Patients With Congestive Heart Failure (TIME-CHF) randomized 499 HF patients (aged ≥60 years, left ventricular ejection fraction ≤45%, New York Heart Association functional class ≥II) to NT-proBNP-guided versus symptom-guided therapy for 18 months, with further follow-up to 5.5 years. The effect of NT-proBNP-guided therapy on recurrent HF-related and all-cause hospitalizations and/or all-cause death was explored. One hundred four patients (49 NT-proBNP-guided, 55 symptom-guided) experienced 1 and 275 patients (133 NT-proBNP-guided, 142 symptom-guided) experienced ≥2 all-cause hospitalization events. Regarding HF hospitalization, 132 patients (57 NT-proBNP-guided, 75 symptom-guided) experienced 1 and 122 patients (57 NT-proBNP-guided, 65 symptom-guided) experienced ≥2 events. NT-proBNP-guided therapy significantly reduced 2nd all-cause hospitalizations (hazard ratio [HR] 0.83; P = .01), in contrast to its nonsignificant effect on 1st all-cause hospitalization events (HR 0.91; P = .35). This was not the case for HF hospitalization events (HR 0.85 [P = .14] vs HR 0.73 [P = .01], respectively). The beneficial effect of NT-proBNP-guided therapy was seen only in patients aged <75 years, and not in those aged ≥75 years (interaction terms with P = .01 and P = .03 for all-cause hospitalization and HF hospitalization events, respectively). 
CONCLUSION: NT-proBNP-guided therapy reduces the risk of recurrent events in patients <75 years of age. This included all-cause hospitalization, mainly through a reduction in later events, adding to the neutral effect previously reported for this end point in time-to-first-event analyses. CLINICAL TRIAL REGISTRATION: isrctn.org, identifier: ISRCTN43596477.
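The distinction between time-to-first-event and time-between-events analyses comes down to how the data are laid out: the latter gives each patient one row per at-risk interval (gap time) rather than one row total. A minimal sketch of that restructuring, with an entirely hypothetical admissions log, could look like this; the actual model fitting (e.g. a stratified Cox regression on the gap times) is omitted.

```python
import pandas as pd

# Hypothetical admissions log: one row per hospitalization (day since enrollment)
events = pd.DataFrame({
    "patient": [1, 1, 1, 2, 3, 3],
    "day":     [30, 90, 200, 150, 60, 61],
})
# Hypothetical end of follow-up per patient (day since enrollment)
followup = pd.Series({1: 365, 2: 365, 3: 365}, name="end")

def gap_time_table(events, followup):
    """Restructure admissions into one row per at-risk interval (gap-time layout)."""
    rows = []
    for pid, end in followup.items():
        days = sorted(events.loc[events.patient == pid, "day"])
        start = 0
        for k, d in enumerate(days, start=1):
            rows.append((pid, k, d - start, 1))            # observed k-th event
            start = d
        rows.append((pid, len(days) + 1, end - start, 0))  # final censored interval
    return pd.DataFrame(rows, columns=["patient", "event_no", "gap", "observed"])
```

A time-to-first-event analysis would keep only the `event_no == 1` rows; the recurrent-event analysis uses them all, which is why it can detect effects concentrated in later hospitalizations.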


Subject(s)
Heart Failure/blood , Heart Failure/diagnosis , Natriuretic Peptide, Brain/blood , Patient Readmission/trends , Peptide Fragments/blood , Aged , Aged, 80 and over , Biomarkers/blood , Chronic Disease , Female , Follow-Up Studies , Heart Failure/therapy , Hospitalization/trends , Humans , Male , Treatment Outcome
17.
JACC Clin Electrophysiol ; 3(3): 232-242, 2017 03.
Article in English | MEDLINE | ID: mdl-29759517

ABSTRACT

OBJECTIVES: The purpose of this study was to evaluate the accuracy of noninvasive reconstructions of epicardial potentials, electrograms, activation and recovery isochrones, and beat origins by simultaneously performing electrocardiographic imaging (ECGI) and invasive epicardial electrography in intact animals. BACKGROUND: Noninvasive imaging of electrical potentials at the epicardium, known as ECGI, is increasingly applied in patients to assess normal and abnormal cardiac electrical activity. METHODS: Body-surface potentials and epicardial potentials were recorded in normal anesthetized dogs. Computed tomography scanning provided a torso-heart geometry that was used to reconstruct epicardial potentials from body-surface potentials. RESULTS: Electrogram reconstructions attained a moderate accuracy compared with epicardial recordings (median correlation coefficient: 0.71), but with considerable variation (interquartile range: 0.36 to 0.86). This variation could be explained by a spatial mismatch (overall resolution was <20 mm) that was most apparent in regions with electrographic transition. More accurate derivation of activation times (Pearson R: 0.82), recovery times (R: 0.73), and the origin of paced beats (median error: 10 mm; interquartile range: 7 to 17 mm) was achieved by a spatiotemporal approach that incorporates the characteristics of the respective electrogram and neighboring electrograms. Reconstruction of beats from repeated single-site pacing showed a stable localization of origin. Cardiac motion, currently ignored in ECGI, correlates negatively with reconstruction accuracy. CONCLUSIONS: ECGI shows a decent median accuracy, but variability in electrogram reconstruction can be sizable. At present, therefore, clinical interpretations of ECGI should not be made on the basis of single electrograms only. 
Incorporating local spatiotemporal characteristics allows for accurate reconstruction of epicardial activation and recovery patterns, and beat origin localization to a 10-mm precision. Even more reliable interpretations are expected when the influences of cardiac motion are accounted for in ECGI.
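The activation and recovery markers referred to above are conventionally the instants of steepest downstroke and upstroke of the unipolar electrogram. The paper's spatiotemporal approach additionally weighs neighboring electrograms; the sketch below shows only the standard single-electrogram markers on a synthetic trace, with all names and the window argument being illustrative assumptions.

```python
import numpy as np

def activation_time(egm, fs):
    """Activation marker: instant of steepest downstroke (maximum -dV/dt)."""
    dvdt = np.gradient(egm) * fs   # approximate time derivative
    return np.argmin(dvdt) / fs    # seconds from start of trace

def recovery_time(egm, fs, window):
    """Recovery marker: maximum dV/dt within a T-wave search window (samples)."""
    i0, i1 = window
    dvdt = np.gradient(egm) * fs
    return (i0 + np.argmax(dvdt[i0:i1])) / fs
```

On real signals these markers are sensitive to noise and fractionation, which is exactly the variability that motivates incorporating neighboring electrograms.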


Subject(s)
Cardiac Pacing, Artificial/methods , Electrocardiography/instrumentation , Pericardium/physiopathology , Animals , Body Surface Potential Mapping/methods , Computer Simulation , Data Accuracy , Dogs , Electrodes, Implanted/adverse effects , Electrodes, Implanted/standards , Humans , Spatio-Temporal Analysis
18.
Med Biol Eng Comput ; 55(8): 1353-1365, 2017 Aug.
Article in English | MEDLINE | ID: mdl-27873155

ABSTRACT

The inverse problem of electrocardiography aims at noninvasively reconstructing electrical activity of the heart from recorded body-surface electrocardiograms. A crucial step is regularization, which deals with ill-posedness of the problem by imposing constraints on the possible solutions. We developed a regularization method that includes electrophysiological input. Body-surface potentials are recorded and a computed tomography scan is performed to obtain the torso-heart geometry. Propagating waveforms originating from several positions at the heart are simulated and used to generate a set of basis vectors representing spatial distributions of potentials on the heart surface. The real heart-surface potentials are then reconstructed from the recorded body-surface potentials by finding a sparse representation in terms of this basis. This method, which we named 'physiology-based regularization' (PBR), was compared to traditional Tikhonov regularization and validated using in vivo recordings in dogs. PBR recovered details of heart-surface electrograms that were lost with traditional regularization, attained higher correlation coefficients and led to improved estimation of recovery times. The best results were obtained by including approximate knowledge about the beat origin in the PBR basis.
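The abstract does not specify how the sparse representation over the simulated-waveform basis is computed; orthogonal matching pursuit (OMP) is one standard greedy solver for that kind of problem and is sketched here on a toy dictionary. The dictionary, sizes, and sparsity level are assumptions for illustration only.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit: find x with <= k nonzeros so that D @ x ~ y.

    Columns of D play the role of simulated heart-surface potential patterns;
    y is the measurement vector to be explained sparsely.
    """
    residual, support = y.astype(float).copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        # pick the column most correlated with the current residual
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        # re-fit all selected columns jointly (the "orthogonal" step)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x
```

With a well-conditioned dictionary and a truly sparse target, OMP recovers the active basis vectors exactly, mirroring the idea of explaining recorded potentials by a few physiologically plausible propagation patterns.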


Subject(s)
Action Potentials/physiology , Body Surface Potential Mapping/methods , Diagnosis, Computer-Assisted/methods , Electrocardiography/methods , Heart Conduction System/physiology , Models, Cardiovascular , Algorithms , Animals , Computer Simulation , Dogs , Humans , Reproducibility of Results , Sensitivity and Specificity
19.
Article in English | MEDLINE | ID: mdl-26738077

ABSTRACT

Spatiotemporal complexity of atrial fibrillation (AF) patterns is often quantified by annotated intracardiac contact mapping. We introduce a new approach that applies recurrence plot (RP) construction followed by recurrence quantification analysis (RQA) to epicardial atrial electrograms, recorded with a high-density grid of electrodes. In 32 patients with no history of AF (aAF, n=11), paroxysmal AF (PAF, n=12), or persistent AF (persAF, n=9), RPs were constructed using a phase-space electrogram embedding dimension equal to the estimated AF cycle length. Spatial information was incorporated by 1) averaging the recurrence over all electrodes, and 2) applying principal component analysis (PCA) to the matrix of embedded electrograms and selecting the first principal component as a representation of spatial diversity. Standard RQA parameters were computed on the constructed RPs and correlated to the number of fibrillation waves per AF cycle (NW). Averaged RP RQA parameters showed no correlation with NW. Correlations improved when applying PCA, with maximum correlation achieved between RP threshold and NW (RR1%, r=0.68, p < 0.001) and RP determinism (DET, r=-0.64, p < 0.001). All studied RQA parameters based on the PCA RP were able to discriminate between persAF and aAF/PAF (DET persAF 0.40 ± 0.11 vs. 0.59 ± 0.14/0.62 ± 0.16, p < 0.01). RP construction and RQA combined with PCA provide a quick and reliable tool to visualize dynamical behaviour and to assess the complexity of contact mapping patterns in AF.
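The RP and RQA quantities named above have compact definitions that a short sketch can make concrete. This is a generic single-channel illustration, not the paper's pipeline: the embedding here is a simple sliding window with a hypothetical dimension and threshold, and the PCA step over multiple electrodes is omitted.

```python
import numpy as np

def recurrence_plot(x, dim, eps):
    """Binary recurrence matrix from a sliding-window embedding of signal x."""
    emb = np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dist < eps).astype(int)

def recurrence_rate(R):
    """RR: fraction of recurrent points in the plot."""
    return R.mean()

def determinism(R, lmin=2):
    """DET: fraction of recurrent points on diagonal lines of length >= lmin."""
    n = R.shape[0]
    on_lines = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diagonal(R, k)) + [0]:  # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    total = R.sum()
    return on_lines / total if total else 0.0
```

A periodic signal yields long diagonal lines and hence DET near 1; the paper's finding that persAF shows lower DET than aAF/PAF fits the intuition that more disorganized fibrillation produces fewer sustained recurrences.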


Subject(s)
Atrial Fibrillation/physiopathology , Electrophysiologic Techniques, Cardiac/methods , Signal Processing, Computer-Assisted , Humans , Principal Component Analysis , Spatio-Temporal Analysis
20.
Article in English | MEDLINE | ID: mdl-24110554

ABSTRACT

Noninvasive, detailed assessment of electrical cardiac activity at the level of the heart surface has the potential to revolutionize diagnostics and therapy of cardiac pathologies. Because of the requirement of noninvasiveness, body-surface potentials are measured and have to be projected back to the heart surface, yielding an ill-posed inverse problem. Ill-posedness means that the solution is non-unique, resulting in a problem of choice. In the current paper, it is proposed to restrict this choice by requiring that the time series of reconstructed heart-surface potentials is sparse in the wavelet domain. A local search technique is introduced that pursues a sparse solution, using an orthogonal wavelet transform. Epicardial potentials reconstructed with this method are compared to those from existing methods, and validated against actual intracardiac recordings. The new technique improves the reconstructions in terms of smoothness and recovers physiologically meaningful details. Additionally, reconstruction of activation timing appears to be improved when pursuing sparsity of the reconstructed signals in the wavelet domain.


Subject(s)
Electrocardiography/instrumentation , Heart Conduction System/physiology , Action Potentials , Body Surface Potential Mapping/instrumentation , Body Surface Potential Mapping/methods , Electrocardiography/methods , Humans , Imaging, Three-Dimensional/instrumentation , Imaging, Three-Dimensional/methods , Wavelet Analysis