Results 1 - 20 of 29
1.
Int J Cardiol ; 383: 8-14, 2023 07 15.
Article in English | MEDLINE | ID: mdl-37085119

ABSTRACT

BACKGROUND: Revascularization of left main coronary artery (LMCA) stenosis is mostly based on angiography. Angiography-derived physiological indices might increase the accuracy of this decision, although they have rarely been used in the LMCA. The objective of this study was to assess the diagnostic agreement of the quantitative flow ratio (QFR) with wire-based fractional flow reserve (FFR) in LMCA lesions and to compare it with visual severity assessment. METHODS: In a series of patients with invasive FFR assessment of intermediate LMCA stenoses, we retrospectively compared the measured QFR value with the FFR value and with the angiographic estimate of significance. RESULTS: 107 QFR studies were included. QFR intra-observer and inter-observer agreement was 87% and 82%, respectively. The mean QFR-FFR difference was 0.047 ± 0.05, with a concordance of 90.7%, sensitivity of 88.1%, specificity of 92.3%, positive predictive value of 88.1% and negative predictive value of 92.3%. All of these values were superior to those observed with visual estimation, which showed intra- and inter-observer agreement of 73% and 72%, respectively, and 78% agreement with the FFR result. The low diagnostic performance of visual estimation and the acceptable performance of QFR measurement were observed in all subgroups analysed. CONCLUSIONS: QFR provides an acceptable estimate of the FFR obtained with an intracoronary pressure guidewire in intermediate LMCA lesions and is clearly superior to assessment based on angiography alone. The decision to revascularize patients with moderate LMCA lesions should not be based solely on the degree of angiographic stenosis.
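
For orientation only, the short sketch below shows how agreement figures of this kind (concordance, sensitivity, specificity, positive and negative predictive value) are derived from paired QFR and FFR measurements. The ≤0.80 cutoff used to define a significant lesion is the conventional FFR threshold and is an assumption here, since the abstract does not state the cutoff used; the sample values are hypothetical.

# Illustrative sketch (not study data): diagnostic agreement of QFR against wire-based FFR,
# assuming the conventional <= 0.80 cutoff defines a hemodynamically significant lesion.

def diagnostic_agreement(qfr, ffr, cutoff=0.80):
    """Concordance, sensitivity, specificity, PPV and NPV of QFR versus FFR."""
    tp = fp = tn = fn = 0
    for q, f in zip(qfr, ffr):
        q_pos, f_pos = q <= cutoff, f <= cutoff   # "positive" = significant lesion
        if q_pos and f_pos:
            tp += 1
        elif q_pos and not f_pos:
            fp += 1
        elif not q_pos and f_pos:
            fn += 1
        else:
            tn += 1
    total = tp + fp + tn + fn
    return {
        "concordance": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical paired measurements:
qfr = [0.72, 0.84, 0.79, 0.91, 0.65, 0.88]
ffr = [0.75, 0.86, 0.82, 0.90, 0.70, 0.79]
print(diagnostic_agreement(qfr, ffr))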


Subject(s)
Coronary Artery Disease , Coronary Stenosis , Fractional Flow Reserve, Myocardial , Humans , Coronary Vessels/diagnostic imaging , Constriction, Pathologic , Retrospective Studies , Coronary Angiography , Severity of Illness Index , Coronary Stenosis/diagnostic imaging , Coronary Stenosis/surgery , Predictive Value of Tests , Reproducibility of Results , Coronary Artery Disease/diagnostic imaging , Coronary Artery Disease/surgery
2.
Cytokine ; 155: 155895, 2022 07.
Article in English | MEDLINE | ID: mdl-35569383

ABSTRACT

Natural Killer (NK) cells belong to the innate lymphoid lineage and are highly present in the human skin. NK cells can produce a range of pro-inflammatory mediators, including cytokines and chemokines. The role of NK(-T) cells in the immune response towards Borrelia burgdorferi infection was studied. The production of interleukin (IL)-6, IL-1β and interferon-gamma (IFN-γ) by human primary peripheral blood mononuclear cells (PBMCs) exposed to B. burgdorferi was assessed. Interestingly, CD56+ (NK + NK-T) cells were the only cells within the PBMC-fraction that produced IFN-γ during the first 24 h of stimulation. Within the NK(-T) cell fraction, NK cells seemed to be responsible for the IFN-γ production. Since it was previously demonstrated that both TLR2 and NOD2 receptors are involved in the recognition of B. burgdorferi, the expression of both TLR2 and NOD2 mRNA on NK cells was determined. In contrast to TLR2, NOD2 mRNA was upregulated on CD56+ (NK + NK-T) cells after Borrelia exposure. Finally, to unravel the mechanisms underlying erythema migrans (EM) development, crosstalk between CD56+ (NK + NK-T) cells and keratinocytes was investigated. CD56+ (NK + NK-T) cells activated by B. burgdorferi produced soluble mediators strongly inducing the expression of antimicrobial peptides, such as β-defensin-2 and psoriasin in human keratinocytes. In conclusion, CD56+ (NK + NK-T) cells produced IFN-γ shortly after exposure to B. burgdorferi and released soluble mediators that were able to activate keratinocytes. These observations underscore the important role of CD56+ (NK + NK-T) cells during early host defence when Borrelia burgdorferi enters the human skin during a tick bite.


Subject(s)
Borrelia burgdorferi , Borrelia burgdorferi/genetics , CD56 Antigen/metabolism , Humans , Immunity, Innate , Interferon-gamma/metabolism , Killer Cells, Natural , Leukocytes, Mononuclear/metabolism , RNA, Messenger/metabolism , Toll-Like Receptor 2/metabolism
3.
Eur J Appl Physiol ; 122(4): 945-954, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35059801

ABSTRACT

PURPOSE: The aim of the present investigation was to study the relationship between ventricular global longitudinal strain (GLS) and ultrasound lung comet (ULC) formation, in order to establish a link between extravascular lung water formation and cardiac contractile dysfunction. METHODS: This was a prospective observational study including 14 active military divers. The subjects performed two sea dives of 120 min each with a semi-closed SCUBA circuit at 10 m depth. Divers were examined at baseline, 15 min (D1) and 60 min (D2) after diving. The evaluation included pulmonary and cardiac ultrasound (including speckle-tracking techniques). Blood samples were drawn at baseline and after diving to assess hs-TnT and endothelin-1. RESULTS: ULC were detected in 9 (64.2%) and 8 (57.1%) of the subjects after D1 and D2, respectively. No differences were found in right or left ventricular GLS after either immersion (RV: baseline -17.9 ± 4.9 vs. D1 -17.2 ± 6.5 and D2 -16.7 ± 5.8 s-1; p = 0.757 and p = 0.529; LV: baseline -17.0 ± 2.3 vs. D1 -17.4 ± 2.1 and D2 -16.9 ± 2.2 s-1; p = 0.546 and p = 0.783). However, a decrease in atrial longitudinal strain parameters was detected after diving (RA: baseline 35.5 ± 9.2 vs. D1 30.3 ± 12.8 and D2 30.7 ± 13.0 s-1; p = 0.088 and p = 0.063; LA: baseline 39.0 ± 10.0 vs. D1 31.6 ± 6.1 and D2 32.4 ± 10.6 s-1; p = 0.019 and p = 0.054). CONCLUSION: In the present study, no ventricular contractile dysfunction was observed. However, increased markers of pulmonary vasoconstriction were present after diving.


Subject(s)
Diving , Extravascular Lung Water , Echocardiography , Extravascular Lung Water/diagnostic imaging , Humans , Myocardial Contraction , Ultrasonography
4.
Rev Esp Cardiol (Engl Ed) ; 75(4): 325-333, 2022 Apr.
Article in English, Spanish | MEDLINE | ID: mdl-34016548

ABSTRACT

INTRODUCTION AND OBJECTIVES: Transcatheter aortic valve implantation has become a widely accepted treatment for inoperable patients with aortic stenosis and for patients at high surgical risk, and its indications have recently been expanded to patients at intermediate and low surgical risk. Our aim was to evaluate the efficiency of SAPIEN 3 vs conservative medical treatment (CMT) or surgical aortic valve replacement (SAVR) in symptomatic inoperable patients and in those at high or intermediate risk. METHODS: We conducted a cost-effectiveness analysis of SAPIEN 3 vs SAVR/CMT using a Markov model (monthly cycles) with 8 states defined by New York Heart Association functional class and a time horizon of 15 years, including major complications and management after hospital discharge, from the perspective of the National Health System. Effectiveness parameters were based on the PARTNER trials. Costs related to the procedure, hospitalization, complications, and follow-up were included (2019 euros). An annual discount rate of 3% was applied to both costs and benefits. Deterministic and probabilistic (Monte Carlo) sensitivity analyses were performed. RESULTS: Compared with SAVR (high and intermediate risk) and CMT (inoperable), SAPIEN 3 showed better clinical results and shorter hospital stays in the 3 populations. Incremental cost-utility ratios (€/quality-adjusted life year gained) were 5471 (high risk), 8119 (intermediate risk) and 9948 (inoperable). In the probabilistic analysis, SAPIEN 3 was cost-effective in more than 75% of the simulations in all 3 profiles. CONCLUSIONS: In our health system, SAPIEN 3 facilitates efficient management of severe aortic stenosis in inoperable and high- and intermediate-risk patients.
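
As a point of reference for the health-economic terms used above, the sketch below computes an incremental cost-utility ratio from discounted cost and QALY streams at a 3% annual rate. The function names and all numbers are illustrative placeholders, not elements of the published model.

# Illustrative sketch: incremental cost-utility ratio (ICUR) with 3% annual discounting.
# The yearly cost/QALY streams below are made-up placeholders, not data from the study.

def discounted_sum(values_per_year, annual_rate=0.03):
    """Present value of a yearly stream, discounted at `annual_rate`."""
    return sum(v / (1 + annual_rate) ** t for t, v in enumerate(values_per_year))

def icur(costs_new, qalys_new, costs_old, qalys_old, rate=0.03):
    """Incremental cost per QALY gained of the new strategy over the comparator."""
    d_cost = discounted_sum(costs_new, rate) - discounted_sum(costs_old, rate)
    d_qaly = discounted_sum(qalys_new, rate) - discounted_sum(qalys_old, rate)
    return d_cost / d_qaly

# Example with placeholder 3-year streams (euros and QALYs per year):
print(icur(costs_new=[30000, 2000, 2000], qalys_new=[0.8, 0.8, 0.8],
           costs_old=[20000, 4000, 4000], qalys_old=[0.6, 0.6, 0.6]))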


Subject(s)
Aortic Valve Stenosis , Heart Valve Prosthesis Implantation , Heart Valve Prosthesis , Transcatheter Aortic Valve Replacement , Aortic Valve/surgery , Aortic Valve Stenosis/diagnosis , Aortic Valve Stenosis/surgery , Cost-Benefit Analysis , Heart Valve Prosthesis Implantation/methods , Humans , Severity of Illness Index , Transcatheter Aortic Valve Replacement/adverse effects , Treatment Outcome
5.
Rev. esp. cardiol. (Ed. impr.) ; 70(11): 907-914, nov. 2017. ilus, tab, graf
Article in Spanish | IBECS | ID: ibc-168316

ABSTRACT

Introduction and objectives: Patients with congenital long QT syndrome (LQTS) have an abnormal QT adaptation to sudden changes in heart rate provoked by standing. The present study sought to evaluate the standing test in a cohort of LQTS patients and to assess whether this QT maladaptation phenomenon is ameliorated by beta-blocker therapy. Methods: Electrocardiographic assessments were performed at baseline and immediately after standing in 36 LQTS patients (6 LQT1 [17%], 20 LQT2 [56%], 3 LQT7 [8%], 7 unidentified-genotype patients [19%]) and 41 controls. The corrected QT interval (QTc) was measured at baseline (QTcsupine) and immediately after standing (QTcstanding); the QTc change from baseline (ΔQTc) was calculated as QTcstanding - QTcsupine. The test was repeated in 26 patients receiving beta-blocker therapy. Results: Both QTcstanding and ΔQTc were significantly higher in the LQTS group than in controls (QTcstanding, 528 ± 46 ms vs 420 ± 15 ms, P < .0001; ΔQTc, 78 ± 40 ms vs 8 ± 13 ms, P < .0001). No significant differences were noted between LQT1 and LQT2 patients. Typical ST-T wave patterns appeared after standing in LQTS patients. Receiver operating characteristic curves of QTcstanding and ΔQTc showed a significant increase in diagnostic value compared with QTcsupine (area under the curve for both, 0.99 vs 0.85; P < .001). Beta-blockers attenuated the response to standing in LQTS patients (QTcstanding, 440 ± 32 ms, P < .0001; ΔQTc, 14 ± 16 ms, P < .0001). Conclusions: Evaluation of the QTc after the simple maneuver of standing shows high diagnostic performance and could be important for monitoring the effects of beta-blocker therapy in LQTS patients.


Subject(s)
Humans , Long QT Syndrome/diagnosis , Long QT Syndrome/drug therapy , Adrenergic beta-Antagonists/therapeutic use , Heart Rate , Clinical Protocols , Cohort Studies , Electrocardiography , Posture
6.
Rev Esp Cardiol (Engl Ed) ; 70(11): 907-914, 2017 Nov.
Article in English, Spanish | MEDLINE | ID: mdl-28233664

ABSTRACT

INTRODUCTION AND OBJECTIVES: Patients with congenital long QT syndrome (LQTS) have an abnormal QT adaptation to sudden changes in heart rate provoked by standing. The present study sought to evaluate the standing test in a cohort of LQTS patients and to assess whether this QT maladaptation phenomenon is ameliorated by beta-blocker therapy. METHODS: Electrocardiographic assessments were performed at baseline and immediately after standing in 36 LQTS patients (6 LQT1 [17%], 20 LQT2 [56%], 3 LQT7 [8%], 7 unidentified-genotype patients [19%]) and 41 controls. The corrected QT interval (QTc) was measured at baseline (QTcsupine) and immediately after standing (QTcstanding); the QTc change from baseline (ΔQTc) was calculated as QTcstanding - QTcsupine. The test was repeated in 26 patients receiving beta-blocker therapy. RESULTS: Both QTcstanding and ΔQTc were significantly higher in the LQTS group than in controls (QTcstanding, 528 ± 46 ms vs 420 ± 15 ms, P < .0001; ΔQTc, 78 ± 40 ms vs 8 ± 13 ms, P < .0001). No significant differences were noted between LQT1 and LQT2 patients. Typical ST-T wave patterns appeared after standing in LQTS patients. Receiver operating characteristic curves of QTcstanding and ΔQTc showed a significant increase in diagnostic value compared with QTcsupine (area under the curve for both, 0.99 vs 0.85; P < .001). Beta-blockers attenuated the response to standing in LQTS patients (QTcstanding, 440 ± 32 ms, P < .0001; ΔQTc, 14 ± 16 ms, P < .0001). CONCLUSIONS: Evaluation of the QTc after the simple maneuver of standing shows high diagnostic performance and could be important for monitoring the effects of beta-blocker therapy in LQTS patients.
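
For illustration of the quantities measured in the standing test, the sketch below computes QTc and ΔQTc. Bazett's correction is assumed here, since the abstract does not state which rate-correction formula was applied; the interval values are hypothetical.

import math

# Minimal sketch of the standing-test quantities, assuming Bazett's correction
# (QTc = QT / sqrt(RR in seconds)); the correction formula is an assumption.

def qtc_bazett(qt_ms, rr_ms):
    """Heart-rate-corrected QT (ms) from measured QT and RR intervals (ms)."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

def delta_qtc(qt_supine_ms, rr_supine_ms, qt_standing_ms, rr_standing_ms):
    """QTc change on standing: QTc_standing - QTc_supine."""
    return qtc_bazett(qt_standing_ms, rr_standing_ms) - qtc_bazett(qt_supine_ms, rr_supine_ms)

# Example: QT shortens only slightly while RR drops sharply on standing, so QTc rises.
print(delta_qtc(qt_supine_ms=400, rr_supine_ms=1000, qt_standing_ms=390, rr_standing_ms=750))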


Subject(s)
Exercise Test/methods , Long QT Syndrome/diagnosis , Adrenergic beta-Antagonists , Adult , Case-Control Studies , Female , Heart Rate , Humans , Long QT Syndrome/drug therapy , Long QT Syndrome/physiopathology , Male , Point-of-Care Testing , Posture , ROC Curve
7.
JACC Heart Fail ; 3(8): 641-4, 2015 Aug.
Article in English | MEDLINE | ID: mdl-26251092

ABSTRACT

OBJECTIVES: This study sought to examine the prognostic value of the soluble form of neprilysin (sNEP) in acute heart failure (AHF) and sNEP kinetics during hospital admission. BACKGROUND: sNEP was recently identified in chronic heart failure (HF) and was associated with cardiovascular outcomes. METHODS: A total of 350 patients (53% women, mean 72.6 ± 10.7 years of age) were included in the study. Primary endpoints were composites of cardiovascular death or HF hospitalizations at short-term (2 months) and long-term (mean: 1.8 ± 1.2 years) follow-up. sNEP was measured using an ad hoc-modified enzyme-linked immunosorbent assay, and its prognostic value was assessed using Cox regression analyses. In a subgroup of patients, sNEP was measured both at admission and at discharge (n = 92). RESULTS: Median admission sNEP concentrations were 0.67 ng/ml (Q1 to Q3: 0.37 to 1.29), and sNEP was significantly associated, in age-adjusted Cox regression analyses, with the composite endpoint at short-term (hazard ratio [HR]: 1.29; 95% confidence interval [CI]: 1.04 to 1.61; p = 0.02) and long-term (HR: 1.23; 95% CI: 1.01 to 1.05; p = 0.003) follow-up. In multivariate Cox analyses that included clinical variables and N-terminal prohormone of brain natriuretic peptide (NT-proBNP) concentration, sNEP concentration at admission showed a clear trend toward significance for the composite endpoint at 2 months (HR: 1.22; 95% CI: 0.97 to 1.53; p = 0.09) and remained significant at the end of follow-up (HR: 1.21; 95% CI: 1.04 to 1.40; p = 0.01). At discharge, sNEP levels decreased from 0.70 to 0.52 ng/ml (p = 0.06). CONCLUSIONS: Admission sNEP concentration was associated with short- and long-term outcomes in AHF, and dynamic sNEP concentrations were observed during hospital admission. These preliminary data may be hypothesis-generating for the use of NEP inhibitors in AHF.
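
For context on the age-adjusted Cox analyses mentioned above, a minimal sketch using the lifelines library is shown below; the dataframe, column names and all values are hypothetical stand-ins rather than study data.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical stand-in data: follow-up time (years), composite event indicator,
# admission sNEP (ng/ml) and age. Not data from the study.
df = pd.DataFrame({
    "time_years": [0.2, 1.5, 2.0, 0.8, 3.1, 1.1, 2.7, 0.4, 2.2, 1.8, 0.6, 3.0],
    "event":      [1,   0,   1,   1,   0,   1,   0,   1,   0,   1,   0,   1],
    "snep":       [1.3, 0.4, 0.9, 1.6, 1.1, 0.7, 0.5, 1.2, 0.8, 0.3, 1.0, 0.6],
    "age":        [75,  68,  81,  70,  64,  77,  72,  79,  66,  73,  80,  69],
})

# Age-adjusted Cox proportional hazards model; exp(coef) for "snep" is the hazard ratio.
cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="event")
cph.print_summary()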


Subject(s)
Biomarkers/blood , Heart Failure/blood , Heart Failure/diagnosis , Neprilysin/blood , Female , Heart Failure/drug therapy , Humans , Male , Pilot Projects , Prognosis , Reference Values
8.
Gastroenterol. hepatol. (Ed. impr.) ; 37(4): 240-245, abr. 2014. ilus
Article in Spanish | IBECS | ID: ibc-124575

ABSTRACT

Patients with recently diagnosed adult celiac disease were evaluated with the Gastrointestinal Symptom Rating Scale (GSRS) and the Psychological General Well-Being Index (PGWBI) to assess their psychological alterations, the association between any such alterations and gastrointestinal symptoms, and their course after starting a gluten-free diet. The patients underwent nutritional counselling and then started a gluten-free diet; they were reassessed 6 months later. Quantitative variables are expressed as the median and 25th-75th percentiles. Results: We included 21 patients, 17 women and 4 men, with a mean age of 43 years (31-47). Histological findings were compatible with Marsh I lesions in 6 patients, Marsh IIIa in 6 and Marsh IIIb in 9. At baseline, 8 patients showed severe psychological distress, 4 showed moderate distress and 9 showed no distress. The GSRS score was 34 (17-43) and the PGWBI was 64 (48-87), with a significant correlation between the 2 indexes (rho = -0.58, P = .006). At 6 months, 3 patients had severe psychological distress, 5 had moderate distress, 9 showed no distress and 4 showed psychological well-being. The GSRS score at 6 months was 13 (8-17) and the PGWBI was 83 (68-95) (P < .05 compared with baseline for the 3 indicators). All 6 axes of the PGWBI showed significant improvement, and no correlation was found between the GSRS and PGWBI at 6 months. Conclusions: Patients with celiac disease have psychological alterations whose intensity is related to gastrointestinal symptoms; these alterations improve after the start of a gluten-free diet.


Subject(s)
Humans , Male , Female , Adult , Middle Aged , Celiac Disease/complications , Anxiety Disorders/epidemiology , Depressive Disorder/epidemiology , Brief Psychiatric Rating Scale/statistics & numerical data , Gastrointestinal Diseases/complications , Risk Factors
9.
Gastroenterol Hepatol ; 37(4): 240-5, 2014 Apr.
Article in Spanish | MEDLINE | ID: mdl-24576676

ABSTRACT

UNLABELLED: Patients with recently-diagnosed adult celiac disease were evaluated with the Gastrointestinal Symptom Rating Scale (GSRS) and Psychological General Well-Being Index (PGWBI) to evaluate their psychological alterations, the association between any alterations and gastrointestinal symptoms, and their outcome after starting a gluten-free diet. The patients underwent nutritional assessment and then started a gluten-free diet; they were reassessed 6 months later. Quantitative variables are expressed as the median and 25th-75th percentiles. RESULTS: We included 21 patients, 17 women and 4 men, with a mean age of 43 years (31-47). The results of histological analysis were compatible with Marsh I lesions in 6 patients, Marsh IIIa in 6 and Marsh IIIb in 9. At baseline, 8 patients showed severe psychological distress, 4 showed moderate distress and 9 showed no distress. The GSRS score was 34 (17-43) and the PGWBI was 64 (48-87), with a significant correlation between the 2 indexes (rho=-.58, P=.006). At 6 months, 3 patients had severe psychological distress, 5 had moderate distress, 9 showed no distress and 4 showed psychological well-being. The GSRS score at 6 months was 13 (8-17) and the PGWBI was 83 (68-95) (P<.05 compared with baseline data for the 3 indicators). The 6 axes of the PGWBI showed significant improvement. At 6 months, no correlation was found between the GSRS and PGWBI. CONCLUSIONS: Patients with celiac disease have psychological alterations whose intensity is related to gastrointestinal symptoms. These symptoms improve after the start of a gluten-free diet.
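
The correlation reported above (rho = -0.58) is a rank correlation between the two questionnaire scores; the minimal sketch below shows such a computation with SciPy on made-up score pairs, assuming a Spearman correlation as implied by the rho notation.

from scipy.stats import spearmanr

# Hypothetical paired questionnaire scores (not study data): higher GSRS = more
# gastrointestinal symptoms, higher PGWBI = better psychological well-being.
gsrs  = [34, 17, 43, 28, 22, 39, 15, 31]
pgwbi = [64, 88, 47, 70, 82, 55, 90, 66]

rho, p_value = spearmanr(gsrs, pgwbi)   # expect a negative rho: more symptoms, lower well-being
print(rho, p_value)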


Subject(s)
Celiac Disease/psychology , Adult , Celiac Disease/diet therapy , Celiac Disease/pathology , Depression/etiology , Diet, Gluten-Free/psychology , Female , Humans , Male , Middle Aged , Psychological Tests , Quality of Life , Self Concept , Self Efficacy , Severity of Illness Index , Stress, Psychological/etiology , Symptom Assessment , Treatment Outcome
12.
J Heart Lung Transplant ; 32(12): 1187-95, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24263021

ABSTRACT

BACKGROUND: Primary graft failure (PGF) is the leading cause of early mortality after heart transplantation (HT). Our aim was to analyze PGF in the current era and to explore the ability of a dedicated score to stratify PGF risk. METHODS: After applying a dedicated PGF definition, we analyzed its incidence, mortality, and associated factors in a multicenter cohort of 857 HTs performed from 2006 to 2009. The RADIAL score for PGF risk prediction was calculated from the following criteria: recipient Right atrial pressure ≥ 10 mm Hg, recipient Age ≥ 60 years, Diabetes mellitus, and Inotrope dependence; donor Age ≥ 30 years; and Length of ischemia ≥ 240 minutes. RESULTS: PGF incidence was 22%. The right ventricle was almost always affected, either alone (45%) or as part of biventricular failure (47%). Mechanical circulatory support was used in 55% of cases. Mortality attributable to PGF was 53% and extended through the third month after HT; thereafter, PGF had little influence on long-term outcome. The RADIAL score was higher in PGF patients (2.78 ± 1.1 vs. 2.42 ± 1.1, p = 0.001) and stratified 3 groups with incremental PGF incidence: low risk (12.1%), intermediate risk (19.4%), and high risk (27.5%; p = 0.001). CONCLUSIONS: PGF had a strong impact, with an incidence of 22% and a mortality exceeding 50% extending through the third post-HT month. The RADIAL score classified patients into 3 groups with incremental risk for PGF and may be useful for its prevention and early therapy.
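
A minimal sketch of the RADIAL score as listed above is given below. It assumes one point per criterion present; that weighting, and the cutoffs separating the low-, intermediate- and high-risk groups, are not stated in this abstract.

# Illustrative sketch of the RADIAL score: one point per criterion present
# (per-criterion weighting assumed; risk-group cutoffs not given in the abstract).

def radial_score(recipient_ra_pressure_mmhg, recipient_age_years, diabetes, inotrope_dependence,
                 donor_age_years, ischemia_minutes):
    criteria = [
        recipient_ra_pressure_mmhg >= 10,   # R: recipient right atrial pressure >= 10 mm Hg
        recipient_age_years >= 60,          # A: recipient age >= 60 years
        bool(diabetes),                     # D: diabetes mellitus
        bool(inotrope_dependence),          # I: inotrope dependence
        donor_age_years >= 30,              # A: donor age >= 30 years
        ischemia_minutes >= 240,            # L: length of ischemia >= 240 minutes
    ]
    return sum(criteria)

# Example: a recipient meeting three of the six criteria scores 3.
print(radial_score(12, 55, False, True, 45, 180))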


Subject(s)
Graft Rejection/epidemiology , Graft Rejection/physiopathology , Heart Transplantation , Risk Assessment/methods , Adult , Age Factors , Cohort Studies , Diabetes Complications/complications , Female , Heart Atria/physiopathology , Humans , Incidence , Male , Middle Aged , Retrospective Studies , Risk Factors , Tissue Donors
14.
Gastroenterol. hepatol. (Ed. impr.) ; 34(9): 605-610, Nov. 2011.
Article in Spanish | IBECS | ID: ibc-98650

ABSTRACT

Objective: To evaluate the resources available in Catalan regional hospitals for the emergency care of upper gastrointestinal hemorrhage. Methods: We analyzed a survey sent to 32 hospitals on the availability, composition and resources of a duty endoscopy service for the year 2009. Results: Responses were obtained from 24 centers, covering 3,954,000 inhabitants. Duty endoscopists were available in 12 hospitals. A total of 1,483,000 inhabitants were unable to access a duty endoscopist in their referral center. Centers with duty endoscopists had more beds and a larger catchment area. Duty services were composed of 4.5 endoscopists (range 2-11), covering 82.1 (33.2-182.5) duty shifts/year. Seventeen centers reported 1,571 episodes (51 per center, range 3-280; 39.68/100,000 inhabitants). Centers with a duty service reported a greater number of cases (76 vs. 43, p=0.047). Centers without this service referred a greater number of patients (147 vs. 17, p=0.001). Patients in the emergency department were under the care of the internal medicine department in four centers, the surgery department in 14 centers and both departments in six. Admitted patients were under the care of the gastroenterology department in only six hospitals. The most widely used procedures were ligation in variceal bleeding and injection therapies in non-variceal bleeding. Twenty-one percent of centers did not perform combined treatment. Conclusions: A significant proportion of the population does not have access to a duty endoscopist in their referral center. Duty shifts represent a significant workload in regional hospitals. Coordination among health professionals and centers would allow the efficient application of therapeutic resources and a duty endoscopy service to be established in centers lacking this resource.


Subject(s)
Humans , Emergency Medical Services/statistics & numerical data , Emergency Treatment/methods , Gastrointestinal Hemorrhage/epidemiology , Endoscopy, Gastrointestinal , Peptic Ulcer Hemorrhage/epidemiology , Hemostasis, Endoscopic , Esophageal and Gastric Varices/epidemiology , Proton Pump Inhibitors/therapeutic use , Vasoconstrictor Agents/therapeutic use
15.
Gastroenterol Hepatol ; 34(9): 605-10, 2011 Nov.
Article in Spanish | MEDLINE | ID: mdl-22000030

ABSTRACT

OBJECTIVE: To evaluate the resources available in Catalan regional hospitals for the emergency care of upper gastrointestinal hemorrhage. METHODS: We analyzed a survey sent to 32 hospitals on the availability, composition and resources of a duty endoscopy service for the year 2009. RESULTS: Responses were obtained from 24 centers, covering 3,954,000 inhabitants. Duty endoscopists were available in 12 hospitals. A total of 1,483,000 inhabitants were unable to access a duty endoscopist in their referral center. Centers with duty endoscopists had more beds and a larger catchment area. Duty services were composed of 4.5 endoscopists (range 2-11), covering 82.1 (33.2-182.5) duty shifts/year. Seventeen centers reported 1,571 episodes (51 per center, range 3-280; 39.68/100,000 inhabitants). Centers with a duty service reported a greater number of cases (76 vs. 43, p=0.047). Centers without this service referred a greater number of patients (147 vs. 17, p=0.001). Patients in the emergency department were under the care of the internal medicine department in four centers, the surgery department in 14 centers and both departments in six. Admitted patients were under the care of the gastroenterology department in only six hospitals. The most widely used procedures were ligation in variceal bleeding and injection therapies in non-variceal bleeding. Twenty-one percent of centers did not perform combined treatment. CONCLUSIONS: A significant proportion of the population does not have access to a duty endoscopist in their referral center. Duty shifts represent a significant workload in regional hospitals. Coordination among health professionals and centers would allow the efficient application of therapeutic resources and a duty endoscopy service to be established in centers lacking this resource.


Subject(s)
Emergency Service, Hospital/statistics & numerical data , Gastrointestinal Hemorrhage , Gastrointestinal Hemorrhage/therapy , Hospitals/statistics & numerical data , Humans , Retrospective Studies , Spain
16.
Rev. esp. cardiol. (Ed. impr.) ; 63(11): 1317-1328, nov. 2010. tab, ilus
Article in Spanish | IBECS | ID: ibc-82361

ABSTRACT

Introduction and objectives: The purpose of this report is to present the results obtained with heart transplantation in Spain since the first use of this therapeutic modality in May 1984. Methods: A descriptive analysis of all heart transplantations performed up to December 31, 2009 is presented. Results: In total, 6048 transplants were carried out. The typical clinical profile of a Spanish heart transplant patient in 2009 was that of a 53-year-old male who had been diagnosed with nonrevascularizable ischemic heart disease and who had severely impaired ventricular function and poor functional status. The implanted heart typically came from a donor who had died from a brain hemorrhage (mean age 37 years), and the average time on the waiting list was 106 days. Mean survival time has increased progressively over the years: whereas for the whole series the probability of survival at 1, 5, 10 and 15 years was 78%, 67%, 53% and 40%, respectively, for the past 5 years the probability of survival at 1 and 5 years was 85% and 73%, respectively. The most frequent cause of death was acute graft failure (17%), followed by infection (16%), the combination of graft vascular disease and sudden death (14%), tumor (12%) and acute rejection (8%). Conclusions: The survival rates obtained in Spain with heart transplantation, especially in recent years, make the procedure the treatment of choice for patients who have irreversible heart failure and poor functional status and for whom there are few other established medical or surgical options.
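
For background on how the 1-, 5-, 10- and 15-year survival probabilities reported above are typically estimated, the sketch below implements a minimal Kaplan-Meier product-limit estimator; the follow-up data are hypothetical and no registry data are used.

# Minimal Kaplan-Meier sketch: survival probability from (time, event) pairs,
# where event=1 is death and event=0 is censoring. Data below are hypothetical.

def kaplan_meier(times, events, at_time):
    """Estimated probability of surviving beyond `at_time`."""
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    survival = 1.0
    for t, event in pairs:
        if t > at_time:
            break
        if event:                       # death at time t
            survival *= (at_risk - 1) / at_risk
        at_risk -= 1                    # deaths and censored subjects leave the risk set
    return survival

follow_up = [0.5, 1.2, 2.0, 3.5, 4.0, 6.0, 7.5, 9.0, 12.0, 15.0]   # years
died      = [1,   0,   1,   1,   0,   1,   0,   1,   0,    0]
print([round(kaplan_meier(follow_up, died, t), 2) for t in (1, 5, 10, 15)])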


Subject(s)
Humans , Male , Female , Societies, Medical/ethics , Societies, Medical/legislation & jurisprudence , Societies, Medical/standards , Heart Transplantation/education , Heart Transplantation/methods , Heart Transplantation/trends , Survival , Immunosuppression Therapy/instrumentation , Immunosuppression Therapy/methods , Vital Statistics , Cerebrovascular Disorders/complications , Cerebrovascular Disorders/epidemiology , Vascular Diseases/epidemiology , Indicators of Morbidity and Mortality
17.
Am J Cardiol ; 103(7): 1003-10, 2009 Apr 01.
Article in English | MEDLINE | ID: mdl-19327431

ABSTRACT

The population of patients with heart failure (HF) and mild to moderate left ventricular (LV) dysfunction is growing, and mortality remains high. There is a need for better risk stratification of patients who might benefit from primary prevention of mortality. This study aimed to evaluate the prognostic value of Holter-based parameters for predicting mortality in patients with HF and LV ejection fraction (EF) >35%. The study involved 294 patients (199 men, mean age 66 years) with HF of ischemic and nonischemic causes, New York Heart Association classes II to III, and LVEF >35%. Surface electrocardiography and 24-hour Holter monitoring were performed at enrollment to assess traditional electrocardiographic variables, as well as heart rate variability, heart rate turbulence, and repolarization dynamics (QT/RR). Total mortality and sudden death were the primary and secondary end points. During a median 44-month follow-up, there were 43 deaths (15%). None of the traditional electrocardiographic risk parameters significantly predicted mortality. A standard deviation of all normal-to-normal RR intervals ≤86 ms, a turbulence slope ≤2.5 ms/RR, and a daytime QT end/RR >0.21 were found to be independent risk predictors of mortality in multivariate analyses. The predictive score based on these 3 variables showed that patients with ≥2 abnormal risk markers were at risk of death (30% 3-year mortality rate) and sudden death (12%), similar to the death rates observed in patients with LVEF ≤35%. In conclusion, increased risk of mortality and sudden death could be predicted in patients with HF and LVEF >35% by evaluating the combination of the standard deviation of all normal-to-normal RR intervals, turbulence slope, and QT/RR, parameters reflecting autonomic control of the heart, baroreflex sensitivity, and repolarization dynamics.
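
A minimal sketch of the three-variable predictive score described above is shown below, assuming one point per abnormal Holter marker, with ≥2 points defining the high-risk group as the abstract indicates; the example values are hypothetical.

# Minimal sketch of the three-variable Holter risk score: one point per abnormal marker,
# with >= 2 points identifying the high-risk group.

def holter_risk_markers(sdnn_ms, turbulence_slope_ms_per_rr, daytime_qte_rr_slope):
    markers = [
        sdnn_ms <= 86,                      # depressed heart rate variability (SDNN <= 86 ms)
        turbulence_slope_ms_per_rr <= 2.5,  # blunted heart rate turbulence slope (<= 2.5 ms/RR)
        daytime_qte_rr_slope > 0.21,        # steep daytime QT-end/RR repolarization slope (> 0.21)
    ]
    return sum(markers)

def high_risk(sdnn_ms, turbulence_slope, qte_rr_slope):
    return holter_risk_markers(sdnn_ms, turbulence_slope, qte_rr_slope) >= 2

# Example: low SDNN and a steep daytime QT/RR slope give 2 abnormal markers (high-risk group).
print(high_risk(80, 4.0, 0.25))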


Subject(s)
Heart Failure/mortality , Risk Assessment/methods , Stroke Volume/physiology , Ventricular Function, Left/physiology , Adolescent , Adult , Aged , Aged, 80 and over , Electrocardiography, Ambulatory/methods , Female , Follow-Up Studies , Heart Failure/physiopathology , Heart Rate/physiology , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Prognosis , Proportional Hazards Models , Prospective Studies , Risk Factors , Spain/epidemiology , Survival Rate/trends , Young Adult
18.
Clin Transplant ; 23(3): 329-36, 2009.
Article in English | MEDLINE | ID: mdl-19210687

ABSTRACT

After liver transplantation, long-term immunosuppression (IS) administration is commonly complicated by renal dysfunction and cardiovascular complications. Twenty liver transplant patients on cyclosporine (CyA)-based IS were followed up prospectively after IS withdrawal. They consisted of 10 electively weaned patients and 10 either forcibly or incidentally weaned patients. Liver biochemical tests, blood pressure, serum creatinine, serum urea, serum uric acid, triglycerides, cholesterol and glucose were monitored after the start of weaning. Eight of the 20 patients (40%) were IS therapy free for a mean period of 61 +/- 39 months (range: 10-132 months). Of the remaining 12 patients, mild or moderate acute rejection occurred in six patients (30%), and mixed inflammatory portal tract infiltrate was seen in another six patients (30%). At the end of the study, mean (SD) serum creatinine had fallen by 0.28 (0.10) mg/dL (p < 0.001) in operationally tolerant (T) patients whereas the serum creatinine level increased in IS-dependent patients [+0.35 (0.35) mg/dL] (p = 0.005). In T patients, serum cholesterol, serum uric acid, fasting glucose and diastolic arterial pressure values significantly decreased. IS withdrawal can be achieved in selected liver transplant patients, and can improve not only kidney function, but also other CyA-associated side effects, such as hypercholesterolemia, hyperuricemia, hypertension and diabetes.


Subject(s)
Cyclosporine/administration & dosage , Immunosuppressive Agents/administration & dosage , Kidney Diseases/chemically induced , Liver Transplantation , Adult , Creatinine/blood , Cyclosporine/adverse effects , Drug Administration Schedule , Female , Humans , Hyperglycemia/prevention & control , Hyperlipidemias/prevention & control , Hypertension/prevention & control , Immunosuppressive Agents/adverse effects , Kidney Diseases/blood , Liver Function Tests , Male , Middle Aged , Survivors
19.
J Card Fail ; 14(7): 561-8, 2008 Sep.
Article in English | MEDLINE | ID: mdl-18722321

ABSTRACT

OBJECTIVE: The association between low blood pressure (BP) levels and increased mortality has been established in several studies of heart failure (HF). Although many drugs administered to these patients decrease BP, the relationship between changes in BP and survival has not been investigated. Nor have previous analyses distinguished among different forms of death. We investigated the influence of baseline BP and changes in BP during a 1-year period on the survival of patients with HF, distinguishing among sudden cardiac death, nonsudden cardiac death, and noncardiac death. We also identified the possible relationship with the baseline values of and changes in other clinical and treatment variables, including pharmacologic treatments. METHOD AND RESULTS: A total of 1062 patients with chronic HF included in the Spanish National Registry of Sudden Death (mean age of 64.5 +/- 11.8 years, 72% were men, and 21% were in New York Heart Association class III with a mean left ventricular ejection fraction of 36.7% +/- 14.2%) were prospectively investigated for a mean of 1.9 +/- 0.6 years. A multivariable Cox proportional hazards model adjusting for clinical and therapeutic variables showed an independent association between low baseline systolic blood pressure (SBP) and nonsudden cardiac death (hazard ratio [HR] 0.96, 95% confidence interval [CI] 0.93-0.98), but changes in SBP during the following year did not influence survival, regardless of the baseline SBP level (P = .55). Contrariwise, baseline diastolic BP was not associated with mortality, but an increase in diastolic BP during the following year showed a borderline independent significant association with lower nonsudden cardiac death (HR 0.90, 95% CI 0.82-1.00). Treatment with angiotensin-converting enzyme inhibitors or beta-blockers at baseline was also associated with lower nonsudden cardiac mortality, as was an increase in left ventricular ejection fraction during the following year (HR 0.69, 95% CI 0.51-0.93; P = .015). CONCLUSION: Among patients with stable HF, low SBP is associated with a greater risk of nonsudden cardiac death. The change in SBP during a 1-year period has no prognostic value. Because the beneficial effects of drugs associated with increased survival (in this study, angiotensin-converting enzyme inhibitors and beta-blockers) thus seem to be independent of their effects on BP, changes in BP should probably not influence the decision to use such drugs or continue their administration.


Subject(s)
Blood Pressure/physiology , Heart Failure/physiopathology , Adrenergic beta-Antagonists/therapeutic use , Aged , Angiotensin-Converting Enzyme Inhibitors/therapeutic use , Blood Pressure/drug effects , Calcium Channel Blockers/therapeutic use , Death, Sudden, Cardiac/etiology , Diuretics/therapeutic use , Female , Follow-Up Studies , Heart Arrest/etiology , Heart Failure/drug therapy , Humans , Hypertrophy, Left Ventricular/physiopathology , Hypotension/chemically induced , Hypotension/physiopathology , Male , Middle Aged , Mineralocorticoid Receptor Antagonists/therapeutic use , Prospective Studies , Spironolactone/therapeutic use , Stroke Volume/drug effects , Survival Rate , Ventricular Function, Left/drug effects
20.
Hum Immunol ; 69(3): 143-8, 2008 Mar.
Article in English | MEDLINE | ID: mdl-18396205

ABSTRACT

Using an indirect immunofluorescence method on human umbilical vein endothelial cells (HUVEC), we investigated the presence of antiendothelial cell antibodies (AECA) in 136 pre- and posttransplant serum samples sequentially collected from 31 patients during the first year after cardiac transplantation. A healthy control group was also included (n = 87). Colocalization studies demonstrated a positive staining pattern of different cytoskeletal components (cytoskeletal-antiendothelial cell antibodies, CSK-AECA), including antivimentin, antiactin, antitubulin, and anticytokeratin, among heart transplanted patients. The frequency of CSK-AECA in the control group and at day 0 in the transplant group was 18.3% and 22.5%, respectively (p = NS). A progressive increase in the frequency of CSK-AECA was observed after cardiac transplantation: 13.3% at day 15, 22.2% at day 30, 53.8% at day 90, and 58% at day 360. Interestingly, rejection episodes within the first year after transplantation occurred in 83.3% of CSK-AECA-positive and in 30.7% of CSK-AECA-negative patients (p = 0.0045). The presence of antibodies was detected prior to the rejection event and was associated with a poor clinical outcome: rejection episodes occurred at a mean of 36.14 +/- 17 days after transplantation in patients with preexisting AECA and 174.25 +/- 51.9 days after de novo antibody appearance in patients with no antibodies at day 0 (p = 0.029). In conclusion, a progressive increase in the frequency of CSK-AECA was observed following cardiac transplantation; the presence of these antibodies is strongly associated with rejection episodes and precedes them. Thus, CSK-AECA could be a good marker for acute graft rejection.


Subject(s)
Autoantibodies/immunology , Graft Rejection/immunology , Heart Transplantation/immunology , Actins/immunology , Adult , Animals , Antibodies, Monoclonal/immunology , Female , Fluorescent Antibody Technique, Indirect , Humans , Keratins/immunology , Male , Mice , Middle Aged , Transplantation, Homologous/immunology , Tubulin/immunology , Vimentin/immunology