Results 1 - 20 of 57
1.
Am Heart J ; 275: 35-44, 2024 May 31.
Article in English | MEDLINE | ID: mdl-38825218

ABSTRACT

BACKGROUND: The Seattle Proportional Risk Model (SPRM) estimates the proportion of sudden cardiac death (SCD) in heart failure (HF) patients, identifying those most likely to benefit from implantable cardioverter-defibrillator (ICD) therapy (those with ≥50% estimated proportion of SCD). The GISSI-HF trial tested fish oil and rosuvastatin in HF patients. We evaluated the SPRM's accuracy in this cohort for predicting potential ICD benefit in patients with EF ≤50% and an SPRM-predicted proportion of SCD either ≥50% or <50%. METHODS: The SPRM was calculated in patients with EF ≤50% and evaluated in a logistic regression model comparing SCD with non-SCD. RESULTS: We evaluated 6,750 patients with EF ≤50%. There were 1,892 all-cause deaths, including 610 SCDs. Fifty percent of patients with EF ≤35% and 43% of those with EF 36% to 50% had an SPRM of ≥50%. The SPRM (OR: 1.92, P < 0.0001) accurately predicted the risk of SCD vs non-SCD, with an estimated proportion of SCD of 44% vs the observed proportion of 41% at 1 year. By traditional criteria for ICD implantation (EF ≤35%, NYHA class II or III), 64.5% of GISSI-HF patients would be eligible, with an estimated ICD benefit (HR) of 0.81. By SPRM ≥50%, 47.8% may be eligible, including 30.2% with EF >35%. GISSI-HF participants with EF ≤35% and SPRM ≥50% had an estimated ICD HR of 0.64, comparable to patients with EF 36% to 50% and SPRM ≥50% (HR: 0.65). CONCLUSIONS: The SPRM discriminated SCD vs non-SCD in GISSI-HF, both in patients with EF ≤35% and in those with EF 36% to 50%. The comparable estimated ICD benefit in patients with EF ≤35% and EF 36% to 50% supports the use of a proportional risk model for shared decision-making with patients being considered for primary prevention ICD therapy.
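
For readers who want a concrete sense of the modeling step described above, the sketch below fits a logistic model of "sudden vs. non-sudden death among decedents" and flags predicted proportions ≥50%. It runs on simulated data; the covariates, coefficients, and threshold handling are illustrative assumptions, not the published SPRM.

```python
# Hypothetical SPRM-style sketch: among decedents, model the probability that a
# death was sudden (SCD) vs non-sudden with logistic regression, then flag
# patients whose predicted proportion of SCD is >= 50%.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "ef": rng.uniform(15, 50, n),          # ejection fraction, %
    "nyha_iii_iv": rng.integers(0, 2, n),  # advanced NYHA class indicator
})
# Simulated outcome among deaths: 1 = sudden cardiac death, 0 = non-sudden death
logit = 2.0 - 0.03 * df["age"] + 0.02 * df["ef"] - 0.5 * df["nyha_iii_iv"]
df["scd"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["age", "ef", "nyha_iii_iv"]])
fit = sm.Logit(df["scd"], X).fit(disp=False)
df["pred_prop_scd"] = fit.predict(X)

# Patients whose deaths are predicted to be proportionally more sudden than
# non-sudden -- the group argued to derive the most ICD benefit.
high_proportional_risk = df["pred_prop_scd"] >= 0.50
print(fit.params)
print(f"{high_proportional_risk.mean():.1%} flagged with predicted SCD proportion >= 50%")
```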

2.
Comput Inform Nurs ; 41(5): 330-337, 2023 May 01.
Article in English | MEDLINE | ID: mdl-35977915

ABSTRACT

Many inpatient hospital visits result in adverse events, and a disproportionate number of adverse events are thought to occur among vulnerable populations. The personal and financial costs of these events are significant at the individual, care team, and system levels. Existing methods for identifying adverse events, such as the Institute for Healthcare Improvement (IHI) Global Trigger Tool, typically involve retrospective chart review to identify risks or triggers and then detailed review to determine whether and what type of harm occurred. These methods are limited in scalability and in their ability to prospectively identify triggers that would enable intervention before an adverse event occurs. The purpose of this study was to gather usability feedback on a prototype of an informatics intervention based on the IHI method. The prototype electronic Global Trigger Tool collects and presents risk factors for adverse events. Six health professionals identified as potential users in clinical, quality improvement, and research roles were interviewed. Interviewees universally described insufficiencies of current methods for tracking adverse events and offered important information on desired future user interface features. A key next step will be to refine and integrate an electronic Global Trigger Tool system into standards-compliant electronic health record systems as a patient safety module.


Subject(s)
User-Centered Design , User-Computer Interface , Humans , Medical Errors , Patient Safety , Risk Factors
3.
J Am Heart Assoc ; 11(17): e025607, 2022 09 06.
Article in English | MEDLINE | ID: mdl-36056726

ABSTRACT

Background It is unclear how to geographically distribute percutaneous coronary intervention (PCI) programs to optimize patient outcomes. The Washington State Certificate of Need program seeks to balance hospital volume and patient access through regulation of elective PCI. Methods and Results We performed a retrospective cohort study of all non-Veterans Affairs hospitals with PCI programs in Washington State from 2009 to 2018. Hospitals were classified as having (1) full PCI services and surgical backup (legacy hospitals, n=17); (2) full services without surgical backup (new certificate of need [CON] hospitals, n=9); or (3) only nonelective PCI without surgical backup (myocardial infarction [MI] access hospitals, n=9). Median annual hospital-level volumes were highest at legacy hospitals (605; interquartile range, 466-780), followed by new CON hospitals (243; interquartile range, 146-287) and MI access hospitals (61; interquartile range, 23-145). Compared with MI access hospitals, risk-adjusted mortality for nonelective patients was lower for legacy (odds ratio [OR], 0.59 [95% CI, 0.48-0.72]) and new CON hospitals (OR, 0.55 [95% CI, 0.45-0.65]). Legacy hospitals provided access within 60 minutes for 90% of the population; addition of new CON and MI access hospitals resulted in only an additional 1.5% of the population having access within 60 minutes. Conclusions Many PCI programs in Washington State do not meet minimum volume standards despite regulation designed to consolidate elective PCI procedures. This CON strategy has resulted in a tiered system that includes low-volume centers treating high-risk patients with poor outcomes, without a significant increase in geographic access. CON policies should re-evaluate the number and distribution of PCI programs.


Subject(s)
Myocardial Infarction , Percutaneous Coronary Intervention , Government Regulation , Humans , Outcome Assessment, Health Care , Percutaneous Coronary Intervention/adverse effects , Retrospective Studies , Treatment Outcome , Washington/epidemiology
4.
J Am Heart Assoc ; 11(13): e023743, 2022 07 05.
Article in English | MEDLINE | ID: mdl-35766293

ABSTRACT

Background As patients derive variable benefit from generator changes (GCs) of implantable cardioverter-defibrillators (ICDs) with an original primary prevention (PP) indication, better predictors of outcomes are needed. Methods and Results In the National Cardiovascular Data Registry ICD Registry, among patients undergoing GCs of initial non-cardiac resynchronization therapy PP ICDs from 2012 to 2016, predictors of post-GC survival and of survival benefit versus control heart failure patients without ICDs were assessed. These included predicted annual mortality based on the Seattle Heart Failure Model, left ventricular ejection fraction (LVEF) >35%, and the probability that a patient's death would be arrhythmic (proportional risk of arrhythmic death [PRAD]). In 40 933 patients undergoing GCs of initial non-cardiac resynchronization therapy PP ICDs (age 67.7±12.0 years, 24.5% women, 34.1% with LVEF >35%), Seattle Heart Failure Model-predicted annual mortality had the greatest effect size for decreased post-GC survival (P<0.0001). Patients undergoing GCs of initial non-cardiac resynchronization therapy PP ICDs with LVEF >35% had a lower Seattle Heart Failure Model-adjusted survival versus 23 472 control heart failure patients without ICDs (model interaction hazard ratio, 1.21 [95% CI, 1.11-1.31]). In patients undergoing GCs of initial non-cardiac resynchronization therapy PP ICDs with LVEF ≤35%, the model indicated worse survival versus controls in the 21% of patients with a PRAD <43% and improved survival in the 10% with PRAD >65%. The association of the PRAD with survival benefit or harm was similar in patients with or without pre-GC ICD therapies. Conclusions Patients who received replacement of an ICD originally implanted for primary prevention and had at the time of GC either LVEF >35% alone or both LVEF ≤35% and PRAD <43% had worse survival versus controls without ICDs.
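
The "model interaction hazard ratio" reported above compares the device-associated survival effect across subgroups within an adjusted survival model. A minimal, hedged sketch of that kind of analysis on simulated data (the column names, effect sizes, and the choice of the lifelines library are assumptions, not the registry workflow):

```python
# Illustrative sketch: estimate whether the survival association of carrying an
# ICD differs by subgroup (e.g. LVEF > 35%) using a Cox model with a
# device-by-subgroup interaction term.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "icd": rng.integers(0, 2, n),          # 1 = generator-change patient, 0 = no-device control
    "lvef_gt_35": rng.integers(0, 2, n),   # 1 = LVEF > 35% at generator change
    "shfm_mortality": rng.uniform(0.02, 0.3, n),  # SHFM-style predicted annual mortality
})
df["icd_x_lvef_gt_35"] = df["icd"] * df["lvef_gt_35"]

# Simulated survival times: higher predicted mortality shortens survival
hazard = 0.05 * np.exp(2.0 * df["shfm_mortality"] - 0.3 * df["icd"] + 0.2 * df["icd_x_lvef_gt_35"])
df["time"] = rng.exponential(1 / hazard)
df["event"] = rng.binomial(1, 0.7, n)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
# The interaction HR asks: is the ICD-associated benefit attenuated when LVEF > 35%?
print(cph.summary.loc[["icd", "icd_x_lvef_gt_35"], ["exp(coef)", "p"]])
```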


Subject(s)
Defibrillators, Implantable , Heart Failure , Aged , Death, Sudden, Cardiac/etiology , Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable/adverse effects , Female , Heart Failure/complications , Heart Failure/diagnosis , Heart Failure/therapy , Humans , Male , Middle Aged , Primary Prevention/methods , Proportional Hazards Models , Risk Factors , Stroke Volume , Treatment Outcome , Ventricular Function, Left
5.
J Heart Lung Transplant ; 41(2): 161-170, 2022 02.
Article in English | MEDLINE | ID: mdl-34404571

ABSTRACT

BACKGROUND: Preoperative variables can predict short-term left ventricular assist device (LVAD) survival, but predictors of extended survival remain insufficiently characterized. METHODS: Patients undergoing LVAD implant (2012-2018) in the Intermacs registry were grouped according to time on support: short-term (<1 year, n = 7,483), mid-term (MT, 1-3 years, n = 5,976), and long-term (LT, ≥3 years, n = 3,015). Landmarked hazard analyses (adjusted hazard ratio, HR) were performed to identify correlates of survival after 1 and 3 years of support. RESULTS: After surviving 1 year of support, additional LVAD survival was less likely in older (HR 1.15 per decade), Caucasian (HR 1.22), and unmarried (HR 1.16) patients (p < 0.05). After 3 years of support, only 3 preoperative characteristics (age, race, and history of bypass surgery, p < 0.05) correlated with extended survival. Postoperative events most negatively influenced achieving LT survival. In those alive at 1 year or 3 years, the occurrence of postoperative renal (creatinine MT HR = 1.09; LT HR = 1.10 per mg/dl) and hepatic dysfunction (AST MT HR = 1.29; LT HR = 1.34 per 100 IU), stroke (MT HR = 1.24; LT HR = 1.42), infection (MT HR = 1.13; LT HR = 1.10), and/or device malfunction (MT HR = 1.22; LT HR = 1.46) reduced extended survival (all p ≤ 0.03). CONCLUSIONS: Success with LVAD therapy hinges on achieving long-term survival in more recipients. After 1 year, extended survival is heavily constrained by the occurrence of adverse events and postoperative end-organ dysfunction. The growth of destination therapy intent mandates that future LVAD studies be designed with follow-up sufficient for capturing outcomes beyond 24 months.
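
The "landmarked hazard analyses" mentioned above condition on surviving to a time point and then model subsequent survival. A minimal sketch of a 1-year landmark Cox model on simulated data (column names and covariates are placeholders, not Intermacs fields):

```python
# Landmark-analysis sketch: keep only patients still on support at the landmark
# (e.g. 1 year), restart the clock there, and model subsequent survival.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 8000
df = pd.DataFrame({
    "time_years": rng.exponential(3.0, n),
    "death": rng.integers(0, 2, n),
    "age_decades": rng.normal(6.0, 1.2, n),
    "stroke_before_landmark": rng.integers(0, 2, n),
})

landmark = 1.0  # years on support
at_risk = df[df["time_years"] > landmark].copy()
at_risk["time_from_landmark"] = at_risk["time_years"] - landmark

cph = CoxPHFitter()
cph.fit(at_risk[["time_from_landmark", "death", "age_decades", "stroke_before_landmark"]],
        duration_col="time_from_landmark", event_col="death")
print(cph.summary[["exp(coef)", "p"]])  # HRs for survival beyond the landmark
```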


Subject(s)
Heart Failure/therapy , Heart-Assist Devices/adverse effects , Multiple Organ Failure/mortality , Registries , Equipment Failure , Female , Follow-Up Studies , Heart Failure/mortality , Humans , Male , Middle Aged , Multiple Organ Failure/etiology , Retrospective Studies , Survival Rate/trends , Time Factors , United States/epidemiology
6.
J Heart Lung Transplant ; 40(12): 1571-1578, 2021 12.
Article in English | MEDLINE | ID: mdl-34465530

ABSTRACT

BACKGROUND: Heart transplant programs and regulatory entities require highly accurate performance metrics to support internal quality improvement activities and national oversight of transplant programs, respectively. We assessed the accuracy of publicly reported performance measures. METHODS: We used the United Network for Organ Sharing registry to study patients who underwent heart transplantation between January 1, 2016 and June 30, 2018. We used tests of calibration to compare the observed rate of 1-year graft failure to the expected risk of 1-year graft failure, which was calculated for each recipient using the July 2019 method published by the Scientific Registry of Transplant Recipients (SRTR). The primary study outcome was the joint test of calibration, which accounts for both the total number of events predicted (calibration-in-the-large) and the dispersion of risk predictions (calibration slope). RESULTS: A total of 6,528 heart transplants were analyzed. The primary test of calibration failed (p <0.0001), indicating poor accuracy of the SRTR model. The calibration-in-the-large statistic (0.63, 95% confidence interval [CI] 0.58-0.68, p < 0.0001) demonstrated overestimation of event rates, while the calibration slope statistic (0.56, 95% CI 0.49-0.62, p <0.0001) indicated over-dispersion of event rates. Pre-specified subgroup analyses demonstrated poor calibration for all subgroups (each p <0.01). After recalibration, program-level observed/expected ratios increased by a median of 0.14 (p <0.0001). CONCLUSIONS: Risk models employed for publicly reported graft survival at U.S. heart transplant centers lack accuracy in general and in all subgroups tested. The use of disease-specific models may improve the accuracy of program performance metrics.
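
For context, the two calibration statistics named above can be computed by regressing observed outcomes on the logit of the predicted risk. The sketch below does this on simulated data that deliberately over-predicts events; it illustrates the general technique only and is not the SRTR or study code (which additionally reports a joint test).

```python
# Hedged calibration sketch: calibration-in-the-large is the intercept of a
# logistic model that uses logit(predicted risk) as an offset; the calibration
# slope is the coefficient on logit(predicted risk). Values near 0 and 1,
# respectively, indicate good calibration under this formulation.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 6500
pred_risk = np.clip(rng.beta(2, 10, n), 1e-4, 1 - 1e-4)   # model-predicted 1-year graft-failure risk
true_risk = np.clip(pred_risk * 0.7, 1e-4, 1 - 1e-4)       # simulate systematic over-prediction
observed = rng.binomial(1, true_risk)

logit_pred = np.log(pred_risk / (1 - pred_risk))

# Calibration-in-the-large: intercept-only model with logit(pred) as offset
citl = sm.GLM(observed, np.ones((n, 1)), family=sm.families.Binomial(),
              offset=logit_pred).fit()
# Calibration slope: regress observed outcomes on logit(pred)
slope = sm.GLM(observed, sm.add_constant(logit_pred),
               family=sm.families.Binomial()).fit()

print("calibration-in-the-large:", citl.params[0])  # ~0 when total events are predicted well
print("calibration slope:", slope.params[1])        # ~1 when risk dispersion is appropriate
```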


Subject(s)
Graft Rejection/epidemiology , Graft Survival , Heart Transplantation/statistics & numerical data , Quality Indicators, Health Care/statistics & numerical data , Registries , Tissue and Organ Procurement/statistics & numerical data , Female , Heart Transplantation/adverse effects , Humans , Male , Middle Aged , Models, Statistical , Reproducibility of Results , Risk Assessment
7.
J Heart Lung Transplant ; 40(7): 698-706, 2021 07.
Article in English | MEDLINE | ID: mdl-33965332

ABSTRACT

BACKGROUND: Adult Congenital Heart Disease (ACHD) heart transplant recipients may have lower post-transplant survival than non-ACHD patients, resulting from higher peri-operative mortality. However, the late risk of mortality appears lower in ACHD recipients. This study seeks to establish whether long-term heart transplant survival is reduced among ACHD recipients relative to non-ACHD recipients. METHODS: Adult patients who received a heart transplant between January 2000 and December 2019 in the United Network for Organ Sharing database were stratified by the presence of ACHD. Propensity-matched cohorts (1:4) were created to adjust for differences between groups. Graft survival at time points from 1 to 18 years was compared between groups using restricted mean survival time (RMST) analysis. RESULTS: The matched cohort included 1,139 ACHD and 4,293 non-ACHD patients. Median age was 35 years and 61% were male. Average survival time at 1 year was 0.85 years for ACHD patients and 0.93 years for non-ACHD patients (average difference: -0.08 years, 95% Confidence Interval [CI] -0.10 to -0.06, p < 0.001), reflecting higher immediate post-transplant mortality. Average survival time at 18 years was not clinically or statistically different: 11.14 years for ACHD patients and 11.40 years for non-ACHD patients (average difference: -0.26 years, 95% CI: -0.85 to +0.32 years, p = 0.38). CONCLUSIONS: Despite increased medium-term mortality among ACHD patients after heart transplant, differences in long-term survival are minimal. Allocation of hearts to ACHD patients results in acceptable utility of donor hearts.
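
Restricted mean survival time (RMST) is the area under the Kaplan-Meier curve up to a chosen horizon, which is what the "average survival time" comparisons above report. A rough sketch on simulated data (group sizes mirror the abstract, but survival times, censoring, and the lifelines helper are assumptions, not the UNOS analysis):

```python
# RMST sketch: compare average survival time up to a horizon (e.g. 1 or 18
# years) between ACHD and non-ACHD recipients from Kaplan-Meier fits.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.utils import restricted_mean_survival_time

rng = np.random.default_rng(4)

def simulate(n, scale):
    # Synthetic follow-up times and event indicators for one group
    return pd.DataFrame({"time": rng.exponential(scale, n),
                         "event": rng.binomial(1, 0.6, n)})

achd, non_achd = simulate(1139, 11.0), simulate(4293, 12.0)

for horizon in (1.0, 18.0):
    rmsts = []
    for grp in (achd, non_achd):
        km = KaplanMeierFitter().fit(grp["time"], grp["event"])
        rmsts.append(restricted_mean_survival_time(km, t=horizon))
    print(f"horizon {horizon:>4} y: ACHD {rmsts[0]:.2f} vs non-ACHD {rmsts[1]:.2f} "
          f"(difference {rmsts[0] - rmsts[1]:+.2f} years)")
```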


Subject(s)
Heart Defects, Congenital/surgery , Heart Transplantation/mortality , Tissue Donors/statistics & numerical data , Adult , Databases, Factual , Female , Follow-Up Studies , Heart Defects, Congenital/mortality , Humans , Male , Middle Aged , Retrospective Studies , Survival Rate/trends , Time Factors , United States/epidemiology
8.
J Nurs Care Qual ; 36(4): 350-354, 2021.
Article in English | MEDLINE | ID: mdl-33534348

ABSTRACT

BACKGROUND: Prolonged length of stay (LOS) has undesirable consequences including increased cost, resource consumption, morbidity, and disruptions in hospital flow. LOCAL PROBLEM: A high-volume heart transplant center in the Pacific Northwest had a mean index hospital LOS of 23 days, with a goal of 10 days according to the institutional heart transplant care pathway. METHODS: A retrospective regression analysis was used to identify factors contributing to the LOS of 41 post-heart transplant patients. INTERVENTIONS: The post-heart transplant care pathway and order set were modified accordingly and reintroduced to the health care team. RESULTS: Factors contributing to LOS included the number of days (1) until the first therapeutic calcineurin inhibitor level, (2) until intravenous diuretics were no longer required, and (3) outside of a therapeutic calcineurin inhibitor range. The interventions reduced the mean LOS by 8 days. CONCLUSIONS: Increased awareness of LOS, education, and consistent use of care pathways can significantly reduce LOS.
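
A minimal sketch of the kind of regression described above, relating LOS to the three contributing factors the project identified; all values and column names are simulated assumptions, not the site's chart-review data.

```python
# Toy LOS regression on three simulated contributing factors.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(10)
n = 41
df = pd.DataFrame({
    "days_to_therapeutic_cni": rng.integers(2, 10, n),  # days to first therapeutic calcineurin inhibitor level
    "days_on_iv_diuretics": rng.integers(1, 12, n),
    "days_outside_cni_range": rng.integers(0, 8, n),
})
df["los_days"] = (8 + 0.8 * df["days_to_therapeutic_cni"]
                  + 0.9 * df["days_on_iv_diuretics"]
                  + 0.5 * df["days_outside_cni_range"]
                  + rng.normal(0, 2, n))

predictors = ["days_to_therapeutic_cni", "days_on_iv_diuretics", "days_outside_cni_range"]
ols = sm.OLS(df["los_days"], sm.add_constant(df[predictors])).fit()
print(ols.params)  # estimated additional days of LOS per day of each factor
```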


Subject(s)
Heart Transplantation , Hospitals , Humans , Length of Stay , Retrospective Studies
9.
ESC Heart Fail ; 7(6): 4241-4246, 2020 Dec.
Article in English | MEDLINE | ID: mdl-33001579

ABSTRACT

AIMS: Optimal blood pressure (BP) control is imperative to reduce complications, especially strokes, in continuous flow ventricular assist device (VAD) patients. Doppler BP has been shown to be an accurate and reliable non-invasive BP measurement method in HeartMate II and HVAD patients. We examined whether Doppler BP is also accurate in patients with the HeartMate 3 VAD. METHODS AND RESULTS: In a prospective, longitudinal cohort of HeartMate 3 patients, arterial line BP and simultaneously measured Doppler opening pressure were obtained. Correlation and agreement between Doppler opening pressure and arterial line mean arterial pressure (MAP) versus systolic blood pressure (SBP) were analysed, as well as the effect of pulse pressure on the accuracy of Doppler opening pressure. A total of 589 pairs of simultaneous Doppler opening pressure and arterial line pressure readings were obtained in 43 patients. Doppler opening pressure had good correlation with intra-arterial MAP (r = 0.754) and more closely approximated MAP than SBP (mean error 2.0 vs. -8.6 mmHg). Pulse pressure did not have a clinically significant impact on the accuracy of the Doppler BP method. These results in HeartMate 3 patients are very similar to previous results in HeartMate II and HVAD patients. CONCLUSIONS: The Doppler BP method should be the default non-invasive BP measurement method in continuous flow VAD patients, including patients implanted with the HeartMate 3.
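
The agreement analysis above reduces to correlating paired readings and reporting the mean error (bias) of Doppler opening pressure against MAP and SBP. A small illustration on simulated pairs (the numbers are arbitrary, chosen only to mimic the reported direction of bias, and are not the study data):

```python
# Paired-agreement sketch: correlation and mean error of Doppler opening
# pressure versus arterial-line MAP and SBP.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(5)
n_pairs = 589
map_art = rng.normal(85, 10, n_pairs)              # arterial-line mean arterial pressure, mmHg
sbp_art = map_art + rng.normal(25, 8, n_pairs)     # arterial-line systolic pressure, mmHg
doppler = map_art + rng.normal(2.0, 6.0, n_pairs)  # Doppler opening pressure, mmHg

r_map, _ = pearsonr(doppler, map_art)
bias_map = np.mean(doppler - map_art)
bias_sbp = np.mean(doppler - sbp_art)
print(f"Doppler vs MAP: r = {r_map:.3f}, mean error = {bias_map:+.1f} mmHg")
print(f"Doppler vs SBP: mean error = {bias_sbp:+.1f} mmHg")
```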

10.
JAMA Netw Open ; 3(9): e2017595, 2020 09 01.
Article in English | MEDLINE | ID: mdl-32945871
12.
J Physiol ; 598(15): 3203-3222, 2020 08.
Article in English | MEDLINE | ID: mdl-32372434

ABSTRACT

KEY POINTS: Right heart catheterization data from clinical records of heart transplant patients are used to identify patient-specific models of the cardiovascular system. These patient-specific cardiovascular models represent a snapshot of cardiovascular function at a given post-transplant recovery time point. This approach is used to describe cardiac function in 10 heart transplant patients, five of which had multiple right heart catheterizations allowing an assessment of cardiac function over time. These patient-specific models are used to predict cardiovascular function in the form of right and left ventricular pressure-volume loops and ventricular power, an important metric in the clinical assessment of cardiac function. Outcomes for the longitudinally tracked patients show that our approach was able to identify the one patient from the group of five that exhibited post-transplant cardiovascular complications. ABSTRACT: Heart transplant patients are followed with periodic right heart catheterizations (RHCs) to identify post-transplant complications and guide treatment. Post-transplant positive outcomes are associated with a steady reduction of right ventricular and pulmonary arterial pressures, toward normal levels of right-side pressure (about 20 mmHg) measured by RHC. This study shows that more information about patient progression is obtained by combining standard RHC measures with mechanistic computational cardiovascular system models. The purpose of this study is twofold: to understand how cardiovascular system models can be used to represent a patient's cardiovascular state, and to use these models to track post-transplant recovery and outcome. To obtain reliable parameter estimates comparable within and across datasets, we use sensitivity analysis, parameter subset selection, and optimization to determine patient-specific mechanistic parameters that can be reliably extracted from the RHC data. Patient-specific models are identified for 10 patients from their first post-transplant RHC, and longitudinal analysis is carried out for five patients. Results of the sensitivity analysis and subset selection show that we can reliably estimate seven non-measurable quantities; namely, ventricular diastolic relaxation, systemic resistance, pulmonary venous elastance, pulmonary resistance, pulmonary arterial elastance, pulmonary valve resistance and systemic arterial elastance. Changes in parameters and predicted cardiovascular function post-transplant are used to evaluate the cardiovascular state during recovery of five patients. Of these five patients, only one showed inconsistent trends during recovery in ventricular pressure-volume relationships and power output. At the four-year post-transplant time point this patient exhibited biventricular failure along with graft dysfunction while the remaining four exhibited no cardiovascular complications.
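
The parameter-identification step described above tunes model parameters so that simulated pressures match catheter measurements, with identifiability checked before estimation. A toy sketch of that idea, fitting a two-parameter pressure decay with nonlinear least squares (the model form, parameter names, and data are illustrative; the actual patient-specific models have many more states):

```python
# Toy parameter estimation: fit a Windkessel-style diastolic pressure decay
# P(t) = P0 * exp(-t / (R*C)) to synthetic catheter samples.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0, 0.6, 60)      # diastolic interval, seconds
true_R, true_C = 1.1, 1.4        # "true" resistance and compliance used to make the data
p_meas = 90 * np.exp(-t / (true_R * true_C)) + np.random.default_rng(6).normal(0, 0.5, t.size)

def residuals(theta):
    R, C = theta
    return 90 * np.exp(-t / (R * C)) - p_meas

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=([0.1, 0.1], [10, 10]))
print("estimated R, C:", fit.x)
```

Note that R and C enter this toy model only through their product, so they are not separately identifiable from these data; that is precisely the kind of issue the sensitivity analysis and parameter subset selection described above are meant to detect.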


Subject(s)
Heart Failure , Heart Transplantation , Heart Ventricles , Humans , Models, Cardiovascular , Pulmonary Artery , Ventricular Function, Right
13.
J Card Fail ; 26(9): 762-768, 2020 Sep.
Article in English | MEDLINE | ID: mdl-32439325

ABSTRACT

BACKGROUND: We describe how patient characteristics influence hospital bypass, interhospital transfer, and in-hospital mortality in patients with heart failure in Washington. Rural patients with heart failure may bypass their nearest hospital or be transferred for appropriate therapies. The frequency, determinants, and outcomes of these practices remain uncharacterized. METHODS AND RESULTS: Mean excess travel times based on hospital and patient residence ZIP codes were calculated using published methods. Hospitals and servicing areas were coded based on bed size and ZIP code, respectively. Transfer patterns were analyzed using bootstrap inference for clusters. Analysis of mortality and transfer-associated factors was performed using logistic regression with generalized estimating equations. A total of 48,163 patients, representing 1,106 instances of transfer, were studied. The mean excess travel time increased by 7.14 minutes per step down in population density category (metropolitan, micropolitan, small town, rural; P < .0001). The rural mean excess travel time was greatest at 28.56 minutes. Transfer likelihood increased with younger age, male gender, admitting hospital rurality, higher Charlson Comorbidity Index, and stroke. Transfer was less likely among women (odds ratio [OR], 0.82; 95% confidence interval [CI], 0.72-0.94) and patients over 70 years old (OR, 0.15-0.46; 95% CI, 0.10-0.65). Adjusting for comorbidities and transfer propensity, transfer exhibited a stronger association with mortality than any other measured patient risk factor (OR, 2.15; 95% CI, 1.69-2.73), excluding stroke (OR, 7.09; 95% CI, 4.99-10.06). CONCLUSIONS: Rural hospital bypass is prevalent among patients with heart failure, although its clinical significance is unclear. Female and older patients were found to have a lower likelihood of transfer adjusted for other factors. Interhospital transfer is associated with increased mortality when adjusted for comorbidities.
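
The mortality model above uses logistic regression with generalized estimating equations (GEE) to handle correlated outcomes within clusters. A compact sketch of that setup in statsmodels on simulated data (the hospital-level grouping, covariate names, and effect sizes are assumptions, not the Washington discharge data):

```python
# GEE logistic regression sketch with exchangeable correlation within hospitals.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 4000
df = pd.DataFrame({
    "hospital_id": rng.integers(0, 60, n),   # cluster identifier
    "transferred": rng.integers(0, 2, n),
    "age_over_70": rng.integers(0, 2, n),
    "charlson": rng.poisson(2, n),
})
logit = -3.0 + 0.7 * df["transferred"] + 0.4 * df["age_over_70"] + 0.2 * df["charlson"]
df["died_in_hospital"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["transferred", "age_over_70", "charlson"]])
gee = sm.GEE(df["died_in_hospital"], X, groups=df["hospital_id"],
             family=sm.families.Binomial(),
             cov_struct=sm.cov_struct.Exchangeable()).fit()
print(np.exp(gee.params))  # odds ratios, e.g. the transfer-mortality association
```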


Subject(s)
Heart Failure , Patient Transfer , Stroke , Aged , Female , Hospital Mortality , Hospitals , Humans , Male , Travel
14.
Europace ; 22(4): 588-597, 2020 04 01.
Article in English | MEDLINE | ID: mdl-32155253

ABSTRACT

AIMS: Heart failure (HF) is associated with an increased risk of sudden cardiac death (SCD). This study sought to determine the incidence of SCD within a multicentre Japanese registry of HF patients hospitalized for acute decompensation, and to externally validate the Seattle Proportional Risk Model (SPRM). METHODS AND RESULTS: We consecutively registered 2240 acute HF patients from academic institutions in Tokyo, Japan. The discrimination and calibration of the SPRM were assessed by the c-statistic, Hosmer-Lemeshow statistic, and visual plotting among non-survivors. Patient-level SPRM predictions and implantable cardioverter-defibrillator (ICD) benefit [ICD estimated hazard ratio (HR), derived from the Cox proportional hazards model in the Sudden Cardiac Death in Heart Failure Trial (SCD-HeFT)] were calculated. During the 2-year follow-up, 356 deaths (15.9%) occurred, which included 76 adjudicated SCDs (3.4%) and 280 non-SCDs (12.5%). The SPRM showed acceptable discrimination [c-index = 0.63; 95% confidence interval (CI) 0.56-0.70], similar to that of the original SPRM derivation cohort. The calibration plot showed reasonable conformance. Among HF patients with reduced ejection fraction (EF < 40%), the SPRM showed improved discrimination compared with the ICD eligibility criteria (e.g. New York Heart Association functional Class II-III with EF ≤ 35%): c-index = 0.53 (95% CI 0.42-0.63) for the eligibility criteria vs. 0.65 (95% CI 0.55-0.75) for the SPRM. Finally, in the subgroup of 246 patients with both EF ≤ 35% and SPRM-predicted risk of ≥ 42.0% (the SCD-HeFT defined ICD benefit threshold), the mean ICD estimated HR was 0.70 (a 30% reduction of all-cause mortality by ICD). CONCLUSION: The cumulative incidence of SCD was 3.4% in this Japanese HF registry. The SPRM performed reasonably well in Japanese patients and may aid in improving SCD prediction.
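
External validation of this kind typically reports discrimination (the c-statistic) alongside a calibration check. The sketch below computes both on simulated predictions; it is a generic illustration only, not the registry analysis (which also used the Hosmer-Lemeshow test and visual calibration plots).

```python
# Discrimination and simple decile calibration on simulated predicted risks.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)
n = 2240
pred = rng.beta(2, 6, n)  # SPRM-style predicted probability that a death is sudden
outcome = rng.binomial(1, np.clip(pred + rng.normal(0, 0.1, n), 0, 1))

print("c-statistic:", round(roc_auc_score(outcome, pred), 3))

# Decile calibration table: mean predicted vs observed event rate per decile
df = pd.DataFrame({"pred": pred, "obs": outcome})
df["decile"] = pd.qcut(df["pred"], 10, labels=False)
print(df.groupby("decile")[["pred", "obs"]].mean())
```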


Subject(s)
Defibrillators, Implantable , Heart Failure , Death, Sudden, Cardiac/epidemiology , Death, Sudden, Cardiac/prevention & control , Heart Failure/diagnosis , Heart Failure/therapy , Humans , Japan/epidemiology , Risk Factors , Tokyo
15.
Am Heart J ; 222: 93-104, 2020 04.
Article in English | MEDLINE | ID: mdl-32032927

ABSTRACT

BACKGROUND: Patients with heart failure having a low expected probability of arrhythmic death may not benefit from implantable cardioverter defibrillators (ICDs). OBJECTIVE: The objective was to validate models to identify cardiac resynchronization therapy (CRT) candidates who may not require CRT devices with ICD functionality. METHODS: Heart failure (HF) patients with CRT-Ds and non-CRT ICDs from the National Cardiovascular Data Registry and others with no device from 3 separate registries and 3 heart failure trials were analyzed using multivariable Cox proportional hazards regression for survival with the Seattle Heart Failure Model (SHFM; estimates overall mortality) and the Seattle Proportional Risk Model (SPRM; estimates proportional risk of arrhythmic death). RESULTS: Among 60,185 patients (age 68.6 ± 11.3 years, 31.9% female) meeting CRT-D criteria, 38,348 had CRT-Ds, 11,389 had non-CRT ICDs, and 10,448 had no device. CRT-D patients had a prominent adjusted survival benefit (HR 0.52, 95% CI 0.50-0.55, P < .0001 versus no device). CRT-D patients with SHFM-predicted 4-year survival ≥81% (median) and a low SPRM-predicted probability of an arrhythmic mode of death ≤42% (median) had an absolute adjusted risk reduction attributable to ICD functionality of just 0.95%/year, with the majority of survival benefit (70%) attributable to CRT pacing. In contrast, CRT-D patients with SHFM-predicted survival below the median or SPRM-predicted arrhythmic probability above the median had substantially more ICD-attributable benefit (absolute risk reduction of 2.6%/year combined; P < .0001). CONCLUSIONS: The SPRM and SHFM identified a quarter of real-world, primary prevention CRT-D patients with minimal benefit from ICD functionality. Further studies to evaluate CRT pacemakers in these low-risk CRT candidates are indicated.


Subject(s)
Cardiac Resynchronization Therapy/methods , Death, Sudden, Cardiac/prevention & control , Defibrillators, Implantable , Heart Failure/therapy , Primary Prevention/methods , Registries , Risk Assessment/methods , Aged , Death, Sudden, Cardiac/epidemiology , Death, Sudden, Cardiac/etiology , Female , Follow-Up Studies , Heart Failure/complications , Heart Failure/mortality , Humans , Incidence , Male , Risk Factors , Survival Rate/trends , Sweden/epidemiology , Time Factors , Treatment Outcome
16.
ASAIO J ; 66(7): 766-773, 2020 07.
Article in English | MEDLINE | ID: mdl-31453832

ABSTRACT

Left ventricular assist device (LVAD) use has continued to grow. Despite recent advances in technology, LVAD patients continue to suffer from devastating complications, including stroke and device thrombosis. Among several variables affecting thrombogenicity, we hypothesize that insertion depth of the inflow cannula into the left ventricle (LV) influences hemodynamics and thrombosis risk. Blood flow patterns were studied in a patient-derived computational model of the LV, mitral valve (MV), and LVAD inflow cannula using unsteady computational fluid dynamics (CFD). Hundreds of thousands of platelets were tracked individually for two inflow cannula insertion depth configurations (12 mm, reduced; 27 mm, conventional), using platelet-level (Lagrangian) metrics to quantify thrombogenicity. Particularly in patients with small LV dimensions, the deeper inflow cannula insertion resulted in much higher platelet shear stress histories (SH), consistent with markedly abnormal intraventricular hemodynamics. A larger proportion of platelets in this deeper insertion configuration was found to linger in the domain for long residence times (RT) and to accumulate much higher SH. The reduced inflow depth configuration promoted LV washout and reduced platelet SH. The increase of both SH and RT in the LV demonstrates the impact of inflow cannula depth on platelet activation and increased stroke risk in these patients. Inflow cannula depth of insertion should be considered as an opportunity to optimize surgical planning of LVAD therapy.
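
The Lagrangian metrics referred to above reduce each tracked platelet to a residence time and an accumulated shear stress history. A toy post-processing sketch on randomly generated trajectories (array shapes, time step, shear values, and thresholds are assumptions, not CFD output):

```python
# Per-platelet metrics: residence time = time spent in the ventricle; shear
# history = time-integral of the scalar shear stress sampled along the pathline.
import numpy as np

rng = np.random.default_rng(9)
dt = 1e-3                                   # assumed CFD time step, seconds
n_platelets, n_steps = 1000, 5000

# shear_stress[i, k]: shear stress (Pa) seen by platelet i at step k;
# in_domain[i, k]: whether platelet i is still inside the LV at step k
shear_stress = rng.gamma(shape=2.0, scale=0.5, size=(n_platelets, n_steps))
exit_step = rng.integers(500, n_steps, n_platelets)
in_domain = np.arange(n_steps)[None, :] < exit_step[:, None]

residence_time = in_domain.sum(axis=1) * dt                   # seconds in the LV
shear_history = (shear_stress * in_domain).sum(axis=1) * dt   # Pa*s accumulated

# Platelets that both linger and accumulate high shear -- the combination
# associated with elevated thrombogenic potential in the text above.
high_risk = (residence_time > np.percentile(residence_time, 90)) & \
            (shear_history > np.percentile(shear_history, 90))
print(f"{high_risk.mean():.1%} of platelets fall in the long-RT / high-SH tail")
```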


Subject(s)
Cannula/adverse effects , Catheterization/methods , Heart-Assist Devices/adverse effects , Models, Cardiovascular , Thrombosis/etiology , Cardiovascular Surgical Procedures/adverse effects , Cardiovascular Surgical Procedures/methods , Catheterization/adverse effects , Heart Ventricles/physiopathology , Hemodynamics/physiology , Humans , Hydrodynamics , Stress, Mechanical
17.
J Am Coll Cardiol ; 74(23): 2908-2918, 2019 12 10.
Article in English | MEDLINE | ID: mdl-31806135

ABSTRACT

BACKGROUND: The number of adult congenital heart disease (CHD) patients undergoing heart transplantation is increasing rapidly. CHD patients have higher surgical risk at transplantation. High-volume adult CHD transplant centers may have better transplant outcomes. OBJECTIVES: This study aimed to evaluate the effect of center CHD transplant volume and expertise on transplant outcomes in CHD patients. METHODS: The authors studied heart transplantations in CHD patients age ≥18 years using the United Network of Organ Sharing (UNOS) database for the primary outcomes of waitlist mortality and post-transplant outcomes at 30 days and 1 year. Transplant centers were assessed by status as the highest CHD transplant volume center in a UNOS region versus all others, presence of Adult Congenital Heart Association accreditation, and adult versus pediatric hospital designation. RESULTS: Between January of 2000 and June of 2018, 1,746 adult CHD patients were listed for transplant; 1,006 (57.6%) of these underwent heart transplantation. After adjusting for age, sex, listing status, and inotrope requirement, waitlist mortality risk was lower at Adult Congenital Heart Association accredited centers (hazard ratio: 0.730; p = 0.020). Post-transplant 30-day mortality was lower at the highest volume CHD transplant center in each UNOS region (hazard ratio: 0.706; p = 0.014). CONCLUSIONS: Designated expertise in CHD care is associated with improved waitlist outcomes for CHD patients listed for transplantation. Post-transplant survival was improved at the highest volume regional center. These findings suggest a possible advantage of regionalization of CHD transplantation.


Subject(s)
Delivery of Health Care, Integrated/methods , Heart Defects, Congenital/surgery , Heart Transplantation , Registries , Tissue and Organ Procurement/methods , Waiting Lists/mortality , Adult , Female , Follow-Up Studies , Graft Survival , Heart Defects, Congenital/epidemiology , Humans , Incidence , Male , Retrospective Studies , Survival Rate/trends , Treatment Outcome , United States/epidemiology , Young Adult
19.
Inform Health Soc Care ; 44(2): 164-175, 2019.
Article in English | MEDLINE | ID: mdl-29672242

ABSTRACT

OBJECTIVE: The purpose of this project was to build and formatively evaluate a near-real-time heart failure (HF) data mart. HF is a leading cause of hospital readmissions. Increased efforts to use data meaningfully may enable healthcare organizations to better evaluate effectiveness of care pathways and quality improvements, and to prospectively identify risk among HF patients. METHODS AND PROCEDURES: We followed a modified version of the Systems Development Life Cycle: 1) Conceptualization, 2) Requirements Analysis, 3) Iterative Development, and 4) Application Release. This foundational work reflects the first of a two-phase project. Phase two (in process) involves the implementation and evaluation of predictive analytics for clinical decision support. RESULTS: We engaged stakeholders to build working definitions and established automated processes for creating an HF data mart containing actionable information for diverse audiences. As of December 2017, the data mart contains information from over 175,000 distinct patients and >100 variables from each of their nearly 300,000 visits. CONCLUSION: The HF data mart will be used to enhance care, assist in clinical decision-making, and improve overall quality of care. This model holds the potential to be scaled and generalized beyond the initial focus and setting.


Subject(s)
Electronic Health Records/organization & administration , Heart Failure/epidemiology , Quality Improvement/organization & administration , Registries , Research/organization & administration , Health Information Exchange , Humans , Information Storage and Retrieval/methods , Risk Factors , Software Design , United States
20.
ASAIO J ; 65(2): 152-159, 2019 02.
Article in English | MEDLINE | ID: mdl-29677037

ABSTRACT

The prevalence of ventricular assist device (VAD) therapy has continued to increase due to a stagnant donor supply and a growing advanced heart failure (HF) population. We hypothesize that left ventricular (LV) size strongly influences biocompatibility and risk of thrombosis. Unsteady computational fluid dynamics (CFD) was used in conjunction with patient-derived computational modeling and virtual surgery with a standard, apically implanted inflow cannula. A dual-focus approach to evaluating thrombogenicity was employed: platelet-based metrics to characterize the platelet environment and flow-based metrics to investigate hemodynamics. Left ventricular end-diastolic dimensions (LVEDds) ranging from 4.5 to 6.5 cm were studied and ranked according to relative thrombogenic potential. Over 150,000 platelets were individually tracked in each LV model over 15 cardiac cycles. As LV size decreased, platelets experienced markedly increased shear stress histories (SHs), whereas platelet residence time (RT) in the LV increased with size. The complex interplay between increased SH and longer RT has profound implications for thrombogenicity, with a significantly higher proportion of platelets in small LVs having long RT and being subjected to high SH, contributing to thrombus formation. Our data suggest that small LV size, rather than decreased VAD speed, is the primary pathologic mechanism responsible for the increased incidence of thrombosis observed in VAD patients with small LVs.


Subject(s)
Heart Ventricles/pathology , Heart-Assist Devices/adverse effects , Thrombosis/etiology , Female , Heart Failure/therapy , Heart Ventricles/physiopathology , Humans , Male , Organ Size , Risk Factors , Thrombosis/physiopathology