Results 1 - 7 of 7
1.
Rev. esp. cardiol. (Ed. impr.) ; 75(7): 559-567, jul. 2022. tab, graf
Article in Spanish | IBECS | ID: ibc-205125

ABSTRACT



Introduction and objectives: Multiparametric scores have been designed for better risk stratification in Brugada syndrome (BrS). We aimed to validate 3 multiparametric approaches (the Delise, Sieira, and Shanghai BrS scores) in a cohort of patients with Brugada syndrome and electrophysiological study (EPS). Methods: We included patients diagnosed with BrS and previous EPS between 1998 and 2019 in 23 hospitals. C-statistic analysis and Cox proportional hazards regression models were used. Results: A total of 831 patients were included (mean age, 42.8±13.1 years; 623 [75%] men; 386 [46.5%] had a type 1 electrocardiogram (ECG) pattern; 677 [81.5%] were asymptomatic; and 319 [38.4%] had an implantable cardioverter-defibrillator). During a follow-up of 10.2±4.7 years, 47 (5.7%) experienced a cardiovascular event. In the global cohort, a type 1 ECG and syncope were predictive of arrhythmic events. All risk scores were significantly associated with events. The discriminatory abilities of the 3 scores were modest (particularly when these scores were evaluated in asymptomatic patients). Evaluation of the Delise and Sieira scores with different numbers of extra stimuli (1 or 2 vs 3) did not substantially improve the event prediction c-index. Conclusions: In BrS, classic risk factors such as ECG pattern and previous syncope predict arrhythmic events. The predictive capabilities of the EPS are affected by the number of extra stimuli required to induce ventricular arrhythmias. Scores combining clinical risk factors with EPS help to identify the populations at highest risk, although their predictive abilities remain modest in the general BrS population and in asymptomatic patients.
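The C-statistic (c-index) analysis mentioned in this abstract measures how well a risk score ranks patients who experience events above those who do not. A minimal illustrative sketch for a binary outcome (not the study's code; the data below are invented):

```python
from itertools import product

def c_statistic(scores, events):
    """Concordance for a binary outcome: the fraction of (event, non-event)
    pairs in which the event case has the higher risk score; ties count 0.5.
    Equivalent to the area under the ROC curve."""
    pos = [s for s, e in zip(scores, events) if e]       # scores of event cases
    neg = [s for s, e in zip(scores, events) if not e]   # scores of non-event cases
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p, n in product(pos, neg))
    return wins / (len(pos) * len(neg))

# Hypothetical multiparametric risk scores and observed events
scores = [2, 5, 3, 7, 1, 6]
events = [0, 1, 0, 1, 0, 0]
print(c_statistic(scores, events))  # 0.875
```

A value of 0.5 means no discrimination and 1.0 perfect ranking; the "modest" discriminatory abilities reported above correspond to c-indexes well below 1.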


Subject(s)
Humans , Male , Female , Adult , Middle Aged , Brugada Syndrome/complications , Death, Sudden/prevention & control , Predictive Value of Tests , Retrospective Studies , Risk Assessment , Cohort Studies , Electrocardiography , Risk Factors
2.
Transplant Proc ; 43(6): 2251-2, 2011.
Article in English | MEDLINE | ID: mdl-21839248

ABSTRACT

The goal of heart transplantation (HT) is not only to prolong the life of patients with end-stage heart failure, but also to restore the kind of health they enjoyed before the disease. It is widely known that patients' functional capacity improves after HT, but what about their quality of life (QoL)? Do functional capacity and QoL improve simultaneously? In the present study, we compared the progression of effort capacity and QoL in the first 2 years after HT. A prospective longitudinal study was performed in 58 heart transplant recipients (43 males, 15 females; age 51.6 ± 10 years) able to complete an effort test at 2, 6, 12, and 24 months after transplantation. The studied variables included the five dimensions of the EuroQol-5D (EQ-5D) questionnaire (mobility, self-care, daily activities, pain/discomfort, and anxiety/depression); a visual analog scale from 0 to 100; and the results (metabolic equivalent units [METs] and time of exercise) of the effort test at 2, 6, 12, and 24 months after transplantation. Analysis of variance was used to compare these variables at each point. Significance was set at P < .05. Functional capacity, measured by both METs and time of exercise, improved progressively (METs: 2 months, 5.2 ± 1.8; 6 months, 6.6 ± 2.1; 12 months, 7.5 ± 2.2; and 24 months, 8.5 ± 2.3; P < .001). Likewise, the EQ-5D questionnaire results improved in parallel with exercise capacity. However, the visual analog scale score did not change significantly during follow-up (2 months, 78.9 ± 16.1; 6 months, 83.8 ± 11.3; 12 months, 83.3 ± 11.1; 24 months, 85.2 ± 14.9; P = .192), reaching a plateau between 6 and 24 months. In conclusion, the improvement in functional capacity shown by heart transplant recipients in the first 2 years after transplantation was not paralleled by the feelings of well-being measured on the analog scale of the EQ-5D. Possibly, long after transplantation, patients will compare themselves with healthy people rather than with their state before HT, resulting in improvements in the visual analog scale.


Subject(s)
Activities of Daily Living , Exercise Tolerance , Heart Failure/surgery , Heart Transplantation , Quality of Life , Adult , Exercise Test , Female , Heart Failure/psychology , Humans , Longitudinal Studies , Male , Middle Aged , Prospective Studies , Recovery of Function , Spain , Surveys and Questionnaires , Time Factors , Treatment Outcome
3.
Transplant Proc ; 43(6): 2257-9, 2011.
Article in English | MEDLINE | ID: mdl-21839250

ABSTRACT

BACKGROUND: Metabolic syndrome (MS) increases the risk of cardiovascular events due to endothelial dysfunction. Few studies have evaluated the impact of MS on the survival of heart transplantation (HTx) patients. AIM: The aim of this study was to assess the impact of MS on the early period and on long-term survival after HTx. MATERIALS AND METHODS: We studied 196 HTx patients with a minimum survival of 1 year post-HTx. A diagnosis of MS was made at 3 months after HTx if at least 3 of the following criteria were met: triglyceride levels ≥150 mg/dL (or drug treatment for hypertriglyceridemia); high-density lipoprotein cholesterol (HDL-C) <40 mg/dL in men and <50 mg/dL in women (or drug treatment to raise HDL-C levels); diabetes mellitus on drug treatment or fasting glucose levels ≥100 mg/dL; blood pressure ≥130/85 mm Hg (or antihypertensive drug treatment); and body mass index (BMI) ≥30. We used the Kaplan-Meier method (log-rank test) to estimate long-term survival and the Student t and chi-square tests for comparisons. RESULTS: Among 196 patients, 96 developed MS. There were no differences between the groups with versus without MS in recipient gender, underlying etiology, smoking, pre-HTx diabetes, or immunosuppressive regimen. However, differences were observed between groups in age (MS: 53 ± 9 vs non-MS: 50 ± 12 years; P = .001); pre-HTx creatinine (MS: 1.2 ± 0.3 vs non-MS: 1.0 ± 0.4 mg/dL; P = .001); BMI (MS: 27.3 ± 4 vs non-MS: 24.6 ± 4; P = .001); pre-HTx hypertension (MS: 48% vs non-MS: 17%; P < .001); and dyslipidemia (MS: 53% vs non-MS: 37%; P = .023). Long-term survival was better in the non-MS group, but the difference did not reach significance (MS: 2381 ± 110 vs non-MS: 2900 ± 110 days; P = .34). CONCLUSIONS: The development of MS early after HTx is a common complication that affects nearly 50% of HTx patients. The prognostic implication of this syndrome on overall survival might emerge in the long term.
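The ≥3-of-5 diagnostic rule described in this abstract can be written down directly. A sketch with the thresholds taken from the text (the dictionary field names and the example patient are hypothetical, not the study's data format):

```python
def has_metabolic_syndrome(p):
    """Apply the abstract's rule: MS is diagnosed when at least 3 of the
    5 criteria are met. Drug-treatment flags satisfy the corresponding
    criterion regardless of the measured value."""
    hdl_low = p["hdl_mg_dl"] < (40 if p["sex"] == "M" else 50)
    criteria = [
        p["triglycerides_mg_dl"] >= 150 or p["tg_treated"],          # TG ≥150
        hdl_low or p["hdl_treated"],                                  # low HDL-C
        p["fasting_glucose_mg_dl"] >= 100 or p["diabetes_treated"],   # glucose ≥100
        p["sbp"] >= 130 or p["dbp"] >= 85 or p["bp_treated"],         # BP ≥130/85
        p["bmi"] >= 30,                                               # BMI ≥30
    ]
    return sum(criteria) >= 3

patient = dict(sex="M", triglycerides_mg_dl=180, tg_treated=False,
               hdl_mg_dl=35, hdl_treated=False,
               fasting_glucose_mg_dl=95, diabetes_treated=False,
               sbp=135, dbp=80, bp_treated=False, bmi=28)
print(has_metabolic_syndrome(patient))  # True: TG, HDL-C, and BP criteria met
```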


Subject(s)
Heart Transplantation/adverse effects , Metabolic Syndrome/etiology , Biomarkers/blood , Blood Glucose/analysis , Blood Pressure , Body Mass Index , Chi-Square Distribution , Female , Heart Transplantation/mortality , Humans , Kaplan-Meier Estimate , Lipids/blood , Male , Metabolic Syndrome/blood , Metabolic Syndrome/diagnosis , Metabolic Syndrome/mortality , Metabolic Syndrome/physiopathology , Risk Assessment , Risk Factors , Spain , Time Factors , Treatment Outcome
4.
Transplant Proc ; 42(8): 3186-8, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20970645

ABSTRACT

INTRODUCTION: Heart transplant recipients show an abnormal heart rate (HR) response to exercise due to complete cardiac denervation after surgery. They present an elevated resting HR, a minimal increase in HR during exercise, and a maximal HR reached during the recovery period. The objective of this study was to determine how often the abnormal HR response normalizes in the first 6 months after transplantation. MATERIALS AND METHODS: We prospectively studied 27 heart transplant recipients who underwent treadmill exercise tests at 2 and 6 months after heart transplantation (HT). HR responses to exercise were classified as normal or abnormal, depending on achieving all of the following criteria: (1) increased HR for each minute of exercise, (2) highest HR at the peak exercise intensity, and (3) decreased HR for each minute of the recovery period. The HR response at 2 months was compared with the results at 6 months post-HT. RESULTS: At 2 months post-HT, 96.3% of the patients showed abnormal HR responses to exercise. Four months later, 11 patients (40.7%) had normalized HR responses (P<.001), which also involved a significant decrease in the time to achieve the highest HR after exercise (124.4±63.8 seconds in the first test and 55.6±44.6 seconds in the second). A significant improvement in exercise capacity and chronotropic competence was also shown in tests performed at 6 months after surgery. CONCLUSIONS: We observed important improvements in HR responses to exercise at 6 months after HT, which may represent early functional cardiac reinnervation.
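The three normality criteria above amount to a simple check over per-minute HR readings. An illustrative sketch (the per-minute list input format is an assumption for illustration, not the study's protocol):

```python
def hr_response_normal(exercise_hr, peak_hr, recovery_hr):
    """Classify an HR response per the abstract's three criteria:
    (1) HR rises every exercise minute, (2) the highest HR occurs at
    peak exercise, (3) HR falls every recovery minute.
    Inputs: per-minute HR lists plus the HR at peak exercise."""
    rises = all(b > a for a, b in zip(exercise_hr, exercise_hr[1:]))
    peak_at_peak = peak_hr >= max(exercise_hr + recovery_hr)
    falls = all(b < a for a, b in zip(recovery_hr, recovery_hr[1:]))
    return rises and peak_at_peak and falls

# Denervated-heart pattern: HR barely rises and the highest HR
# occurs during recovery -> abnormal
print(hr_response_normal([105, 107, 108], 110, [118, 115, 112]))  # False
# Normal pattern: brisk rise, peak at peak exercise, steady fall
print(hr_response_normal([100, 115, 130], 140, [125, 110, 100]))  # True
```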


Subject(s)
Exercise , Heart Rate , Heart Transplantation , Adult , Female , Humans , Male , Middle Aged , Prospective Studies
5.
Transplant Proc ; 41(6): 2250-2, 2009.
Article in English | MEDLINE | ID: mdl-19715889

ABSTRACT

OBJECTIVE: Exercise capacity has been shown to be reduced among cardiac transplant recipients. This observation is directly connected to both the transplanted heart's dependence on circulating catecholamines and the abnormal sympathoadrenal response to exercise in these patients. Against this background, there is reluctance to use beta-blockers after heart transplantation. Nevertheless, this point remains controversial. Our aim was to examine exercise tolerance after an oral dose of atenolol early after cardiac transplantation. MATERIALS AND METHODS: Eighteen nonrejecting, otherwise healthy cardiac transplant recipients were included in this study at a mean of 61.9 +/- 25.6 days after surgery; 13 were men. Patients performed controlled exercise to a symptom-limited maximum before and 2 hours after taking an oral dose of atenolol. Heart rate, blood pressure, exercise time, and metabolic equivalent units (METS) were recorded at rest as well as during and after exercise. We compared the results obtained before and after atenolol. RESULTS: Resting (101.7 +/- 14.5 vs 84 +/- 12.4 bpm; P = .001) and peak heart rates (128.5 +/- 12.9 vs 100.7 +/- 16 bpm; P = .001) were significantly higher before than after beta blockade. Resting systolic blood pressure was slightly higher before compared with after beta blockade (129.3 +/- 23.6 vs 122.2 +/- 20.3 mm Hg; P = .103). However, there was neither a significant difference in the length of exercise (3.17 +/- 1.96 vs 3.40 +/- 2.48 minutes; P = .918) nor in the estimated oxygen consumption (METS; 5.07 +/- 1.8 vs 5.31 +/- 2.2; P = .229). Furthermore, no patient reported a greater degree of tiredness after beta blockade. CONCLUSIONS: This study showed little adverse effect on exercise tolerance by beta blockade in recently transplanted patients. Atenolol seemed to be safe in this context.


Subject(s)
Adrenergic beta-Antagonists/pharmacology , Exercise Tolerance/drug effects , Heart Transplantation/statistics & numerical data , Heart/drug effects , Adult , Blood Pressure/drug effects , Female , Heart/physiopathology , Heart Rate/drug effects , Humans , Male , Middle Aged , Rest/physiology , Sympathetic Nervous System/drug effects , Sympathetic Nervous System/physiology
6.
Transplant Proc ; 40(9): 3044-5, 2008 Nov.
Article in English | MEDLINE | ID: mdl-19010187

ABSTRACT

UNLABELLED: The application of clinical trials (CTs) to daily practice is based on the assumption that the patients included in these trials are similar to those seen on a daily basis. We performed a retrospective study to evaluate patient survival depending on whether they were included in a CT. We studied 217 patients who underwent heart transplantation (HT) between January 2000 and September 2006. We excluded patients who received combination transplants, those who underwent repeat HT, and pediatric patients who underwent HT. In total, 54 patients were included in a CT and 163 were not (NCT). The statistical tests included the t test, the chi-square test, and the Kaplan-Meier method. RESULTS: Patients in the NCT group were in worse condition at HT, with a greater percentage of inotropic treatments pre-HT (36% vs 17%; P = .005), emergency transplant procedures (30% vs 13%; P = .01), and worse functional status pre-HT (P = .03). The NCT group exhibited lower survival (80.37% vs 87.04%; P = .13, log-rank test). There were no significant differences in the other analyzed variables. CONCLUSIONS: Patients included in CTs tend to have better long-term survival rates, for several reasons: patients in the CT group were more stable at HT (selection bias), and the close follow-up of patients in CTs makes it more likely that any complication will be detected and treated early (follow-up bias).
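For context, the Kaplan-Meier method used in this record estimates survival as a product over observed death times, handling censored follow-up. A minimal sketch with invented follow-up data (not the study's 217 patients; 1 = death observed, 0 = censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. times: follow-up times;
    events: 1 = death observed, 0 = censored.
    Returns [(t, S(t))] at each time where a death occurs."""
    order = sorted(zip(times, events))
    n_at_risk, surv, curve = len(order), 1.0, []
    i = 0
    while i < len(order):
        t = order[i][0]
        deaths = at_t = 0
        # Count deaths and all subjects leaving the risk set at time t
        while i < len(order) and order[i][0] == t:
            at_t += 1
            deaths += order[i][1]
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk  # KM product-limit step
            curve.append((t, surv))
        n_at_risk -= at_t
    return curve

print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
# step function: S(1) ≈ 0.80, S(2) ≈ 0.60, S(3) ≈ 0.30
```

The log-rank test reported in the abstract then compares two such curves (CT vs NCT) for a difference in the underlying hazard.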


Subject(s)
Clinical Trials as Topic/statistics & numerical data , Heart Transplantation/mortality , Heart Transplantation/physiology , Emergencies/epidemiology , Humans , Kaplan-Meier Estimate , Patient Selection , Retrospective Studies , Survival Rate , Survivors/statistics & numerical data
7.
Transplant Proc ; 40(9): 3051-2, 2008 Nov.
Article in English | MEDLINE | ID: mdl-19010190

ABSTRACT

BACKGROUND: This study was performed to determine the factors that cause arterial hypertension after heart transplantation (HT) and the drugs used in its management. MATERIALS AND METHODS: We studied 247 consecutive patients who had undergone HT between 2000 and 2006 and who survived for at least 6 months. We excluded patients who received combination transplants, those who underwent repeat transplantation, and pediatric patients who had received transplants. Hypertension was defined as the need to use drugs for its control. Renal dysfunction was defined as a serum creatinine concentration greater than 1.4 mg/dL, and diabetes as the need for an antidiabetes drug for its control. Statistical analyses were performed using the t test, the chi-square test, and Cox regression. RESULTS: Mean (SD) patient age was 52 (10) years, and 87.4% of the patients were men. Follow-up was 72 (42) months. Hypertension was present in 33.3% of patients before HT and in 71.1% at some time after HT. The number of drugs used to control hypertension was 1.3 (0.5); one drug was used in 72.9% of patients. The most frequently used single class of drugs was calcium channel blockers (63.2%), followed by angiotensin-converting enzyme inhibitors (20%) and angiotensin receptor blockers (15.8%). Only pre-HT hypertension was significantly associated with greater use of antihypertensive drugs post-HT (mean [SD], 1.48 [0.65] vs 1.22 [0.41]; P = .005). At univariate analysis, only pre-HT hypertension was associated with the presence of post-HT hypertension (80.5% vs 65.5%; P = .02). At Cox regression analysis, recipient age (P = .02) and pre-HT hypertension (P = .004) were associated with post-HT hypertension. CONCLUSIONS: Hypertension is common after HT; however, in most patients, it can be controlled with a single antihypertensive agent. The most important factors in the development of hypertension are the presence of pre-HT hypertension and advanced age.


Subject(s)
Antihypertensive Agents/therapeutic use , Heart Transplantation/adverse effects , Hypertension/epidemiology , Adult , Antihypertensive Agents/classification , Creatinine/blood , Diabetes Mellitus/drug therapy , Diabetes Mellitus/epidemiology , Female , Follow-Up Studies , Humans , Hypertension/drug therapy , Hypoglycemic Agents/therapeutic use , Male , Middle Aged , Patient Selection , Postoperative Complications/drug therapy , Postoperative Complications/epidemiology , Postoperative Complications/physiopathology , Predictive Value of Tests , Regression Analysis , Retrospective Studies , Time Factors