Results 1 - 20 of 23
1.
Transplant Direct ; 10(2): e1572, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38264297

ABSTRACT

Background: Pulmonary embolism (PE) is a rare yet serious postoperative complication for lung transplant recipients (LTRs). The association between the timing and severity of PE and the development of chronic lung allograft dysfunction (CLAD) has not been described. Methods: This single-center, retrospective cohort analysis of first-time LTRs included bilateral or single lung transplants and excluded multiorgan transplants and retransplants. PEs were confirmed by computed tomography angiography (CTA) or ventilation/perfusion (VQ) scans. Infarctions were confirmed on CTA by a trained physician. PE severity was defined by the Pulmonary Embolism Severity Index (PESI) score, a 30-d post-PE mortality risk calculator, stratified as low (classes I and II, 0-85), intermediate (classes III and IV, 85-125), and high (class V, >125). PE and PESI were analyzed against the outcomes of overall survival, graft failure, and CLAD. Results: We identified 57 of 928 patients (6.14%) in the LTR cohort who had at least 1 PE, with a median follow-up of 1623 d. In the subset with PE, the median PESI score was 85 (75.8-96.5). Most available PESI scores (32/56) were in the low-risk category. In the CLAD analysis, 49 LTRs had a PE, and 16 (33%) had infarction. When treating PE as time-dependent and adjusting for covariates, PE was significantly associated with death (hazard ratio [HR] 1.8; 95% confidence interval [CI], 1.3-2.5), as well as with increased risk of graft failure, defined as retransplant, CLAD, or death (HR 1.8; 95% CI, 1.3-2.5), and of CLAD (HR 1.7; 95% CI, 1.2-2.4). Infarction was not associated with CLAD or death. The PESI risk category was not a significant predictor of death or CLAD. Conclusions: PE is associated with decreased survival and an increased hazard of developing CLAD. The PESI score was not a reliable predictor of CLAD or death in this lung transplant cohort.
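The PESI stratification described above maps a numeric score to a 30-d mortality risk band. A minimal sketch, using the cut points quoted in the abstract (the class labels are the standard PESI classes):

```python
def pesi_risk_stratum(score: float) -> str:
    """Map a PESI score to the risk strata used in the abstract:
    low (classes I-II, 0-85), intermediate (classes III-IV, up to 125),
    and high (class V, >125)."""
    if score < 0:
        raise ValueError("PESI score cannot be negative")
    if score <= 85:
        return "low"
    if score <= 125:
        return "intermediate"
    return "high"

# The cohort's median PESI of 85 falls in the low-risk band, consistent
# with most available scores (32/56) being low risk.
```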

2.
BMC Pulm Med ; 23(1): 414, 2023 Oct 30.
Article in English | MEDLINE | ID: mdl-37904125

ABSTRACT

BACKGROUND: Clonal hematopoiesis of indeterminate potential (CHIP), the age-related acquisition of somatic mutations that leads to an expanded blood cell clone, has been associated with development of a pro-inflammatory state. An enhanced or dysregulated inflammatory response may contribute to rejection after lung transplantation; however, the prevalence of CHIP in lung recipients and the influence of CHIP on allograft outcomes are unknown. METHODS: We analyzed whole-exome sequencing data from 279 lung recipients to detect CHIP, defined by pre-specified somatic mutations in 74 genes known to promote clonal expansion of hematopoietic stem cells. We compared the burden of acute rejection (AR) over the first post-transplant year in lung recipients with vs. without CHIP using multivariable ordinal regression. Multivariable Cox proportional hazards models were used to assess the association between CHIP and chronic lung allograft dysfunction (CLAD)-free survival. An exploratory analysis evaluated the association between the number of CHIP-associated variants and CLAD-free survival. RESULTS: We detected 64 CHIP-associated mutations in 45 individuals (15.7%), most commonly in TET2 (10.8%), DNMT3A (9.2%), and U2AF1 (9.2%). Patients with CHIP tended to be older but did not significantly differ from patients without CHIP in terms of race or native lung disease. Patients with CHIP did not have a higher incidence of AR over the first post-transplant year (p = 0.45) or a significantly increased risk of death or CLAD (adjusted HR 1.25, 95% CI 0.88-1.78). We did, however, observe a significant association between the number of CHIP variants and CLAD-free survival: patients with 2 or more CHIP-associated variants had an increased risk of death or CLAD (adjusted HR 3.79, 95% CI 1.98-7.27). CONCLUSIONS: Lung recipients have a higher prevalence of CHIP and a larger variety of genes with CHIP-associated mutations compared with previous reports for the general population. Overall, CHIP did not increase the risk of AR, CLAD, or death in lung recipients.


Subject(s)
Clonal Hematopoiesis , Lung Transplantation , Humans , Transplant Recipients , Prevalence , Lung , Lung Transplantation/adverse effects
3.
Transpl Immunol ; 80: 101904, 2023 10.
Article in English | MEDLINE | ID: mdl-37499884

ABSTRACT

BACKGROUND: Sensitized lung transplant recipients are at increased risk of developing donor-specific antibodies, which have been associated with acute and chronic rejection. Perioperative intravenous immune globulin has been used in sensitized individuals to down-regulate antibody production. METHODS: We compared patients with a pre-transplant calculated panel reactive antibody ≥25% who did not receive preemptive immune globulin therapy to a historical control group that received preemptive immune globulin therapy. Our cohort included 59 patients: 17 who did not receive immune globulin therapy and 42 who did. RESULTS: Donor-specific antibody development was numerically higher in the non-immune globulin group than in the immune globulin group (58.8% vs 33.3%, respectively; odds ratio 2.80, 95% confidence interval [0.77, 10.79], p = 0.13). Median time to antibody development was 9 days (Q1, Q3: 7, 19) and 28 days (Q1, Q3: 7, 58) in the non-immune globulin and immune globulin groups, respectively. There was no significant difference between groups in the incidence of primary graft dysfunction at 72 h post-transplant or in acute cellular rejection, antibody-mediated rejection, and chronic lung allograft dysfunction at 12 months. CONCLUSION: These findings are hypothesis generating and emphasize the need for larger, randomized studies to determine the association of immune globulin therapy with clinical outcomes.


Subject(s)
Immunoglobulins, Intravenous , Humans , Antibodies , Graft Rejection/prevention & control , Immunoglobulins, Intravenous/therapeutic use , Lung , Transplant Recipients
4.
Chest ; 164(3): 670-681, 2023 09.
Article in English | MEDLINE | ID: mdl-37003354

ABSTRACT

BACKGROUND: Chronic lung allograft dysfunction (CLAD) is the leading cause of death among lung transplant recipients. Eosinophils, effector cells of type 2 immunity, are implicated in the pathobiology of many lung diseases, and prior studies suggest their presence associates with acute rejection or CLAD after lung transplantation. RESEARCH QUESTION: Does histologic allograft injury or respiratory microbiology correlate with the presence of eosinophils in BAL fluid (BALF)? Does early posttransplant BALF eosinophilia associate with future CLAD development, including after adjustment for other known risk factors? STUDY DESIGN AND METHODS: We analyzed BALF cell count, microbiology, and biopsy data from a multicenter cohort of 531 lung recipients with 2,592 bronchoscopies over the first posttransplant year. Generalized estimating equation models were used to examine the correlation of allograft histology or BALF microbiology with the presence of BALF eosinophils. Multivariable Cox regression was used to determine the association between ≥ 1% BALF eosinophils in the first posttransplant year and definite CLAD. Expression of eosinophil-relevant genes was quantified in CLAD and transplant control tissues. RESULTS: The odds of BALF eosinophils being present were significantly higher at the time of acute rejection and nonrejection lung injury histologies and during pulmonary fungal detection. Early posttransplant ≥ 1% BALF eosinophils significantly and independently increased the risk of definite CLAD development (adjusted hazard ratio, 2.04; P = .009). Tissue expression of eotaxins, IL-13-related genes, and the epithelial-derived cytokines IL-33 and thymic stromal lymphopoietin was significantly increased in CLAD. INTERPRETATION: BALF eosinophilia was an independent predictor of future CLAD risk across a multicenter lung recipient cohort. Additionally, type 2 inflammatory signals were induced in established CLAD.
These data underscore the need for mechanistic and clinical studies to clarify the role of type 2 pathway-specific interventions in CLAD prevention or treatment.


Subject(s)
Eosinophilia , Lung Transplantation , Humans , Bronchoalveolar Lavage Fluid , Lung , Transplantation, Homologous , Lung Transplantation/adverse effects , Allografts , Eosinophilia/etiology , Retrospective Studies , Graft Rejection
5.
J Heart Lung Transplant ; 42(6): 741-749, 2023 06.
Article in English | MEDLINE | ID: mdl-36941179

ABSTRACT

BACKGROUND: Chronic lung allograft dysfunction (CLAD) increases morbidity and mortality for lung transplant recipients. Club cell secretory protein (CCSP), produced by airway club cells, is reduced in the bronchoalveolar lavage fluid (BALF) of lung recipients with CLAD. We sought to understand the relationship between BALF CCSP and early posttransplant allograft injury and to determine whether early posttransplant reductions in BALF CCSP indicate later CLAD risk. METHODS: We quantified CCSP and total protein in 1606 BALF samples collected over the first posttransplant year from 392 adult lung recipients at 5 centers. Generalized estimating equation models were used to examine the correlation of allograft histology or infection events with protein-normalized BALF CCSP. We performed multivariable Cox regression to determine the association between a time-dependent binary indicator of normalized BALF CCSP level below the median in the first posttransplant year and development of probable CLAD. RESULTS: Normalized BALF CCSP concentrations were 19% to 48% lower among samples corresponding to histological allograft injury as compared with healthy samples. Patients who experienced any occurrence of a normalized BALF CCSP level below the median over the first posttransplant year had a significant increase in probable CLAD risk independent of other factors previously linked to CLAD (adjusted hazard ratio 1.95; p = 0.035). CONCLUSIONS: We identified a threshold for reduced BALF CCSP that discriminates future CLAD risk, supporting the utility of BALF CCSP as a tool for early posttransplant risk stratification. Additionally, our finding that low CCSP associates with future CLAD underscores a role for club cell injury in CLAD pathobiology.


Subject(s)
Lung Transplantation , Adult , Humans , Lung Transplantation/adverse effects , Biomarkers/metabolism , Lung , Bronchoalveolar Lavage Fluid , Allografts , Retrospective Studies
6.
Chest ; 164(1): 159-168, 2023 07.
Article in English | MEDLINE | ID: mdl-36681147

ABSTRACT

BACKGROUND: Frailty, measured as a single construct, is variably associated with poor outcomes before and after lung transplantation. The usefulness of a comprehensive frailty assessment before transplantation is unknown. RESEARCH QUESTION: How are multiple frailty constructs, including phenotypic and cumulative deficit models, muscle mass, exercise tolerance, and social vulnerabilities, measured before transplantation, associated with short-term outcomes after lung transplantation? STUDY DESIGN AND METHODS: We conducted a retrospective cohort study of 515 lung recipients who underwent frailty assessments before transplantation, including the short physical performance battery (SPPB), transplant-specific frailty index (FI), 6-min walk distance (6MWD), thoracic sarcopenia, and social vulnerability indexes. We tested the association between frailty measures before transplantation and outcomes after transplantation using logistic regression to model 1-year survival and zero-inflated negative binomial regression to model hospital-free days (HFDs) in the first 90 days after transplantation. Adjustment covariates included age, sex, native lung disease, transplantation type, lung allocation score, BMI, and primary graft dysfunction. RESULTS: Before transplantation, 51.3% of patients were frail by FI (FI ≥ 0.25) and no patients were frail by SPPB. In multivariable-adjusted models that also included FI, SPPB, and 6MWD, greater frailty by FI, but not SPPB, was associated with fewer HFDs (-0.006 per 0.01 unit worsening; 95% CI, -0.01 to -0.002 per 0.01 unit worsening) among discharged patients. Greater SPPB deficits were associated with decreased odds of 1-year survival (OR, 0.51 per 1 unit worsening; 95% CI, 0.28-0.93 per 1 unit worsening). Correlation among frailty measurements overall was poor. No association was found between thoracic sarcopenia, 6MWD, or social vulnerability assessments and short-term outcomes after lung transplantation.
INTERPRETATION: Both phenotypic and cumulative deficit models measured before transplantation are associated with short-term outcomes after lung transplantation. Cumulative deficit measures of frailty may be more relevant in the first 90 days after transplantation, whereas phenotypic frailty may have a stronger association with 1-year survival.
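In general form, a cumulative-deficit frailty index like the FI above is the proportion of assessed health deficits that are present. The specific items in the transplant-specific FI are not listed in the abstract, so this sketch only illustrates the construct and the FI ≥ 0.25 frailty threshold it reports:

```python
def frailty_index(deficits: list[int]) -> float:
    """Cumulative-deficit frailty index: fraction of assessed deficits
    that are present (each deficit coded 1 if present, 0 if absent)."""
    if not deficits:
        raise ValueError("at least one deficit must be assessed")
    return sum(deficits) / len(deficits)

def is_frail(fi: float, threshold: float = 0.25) -> bool:
    """Frailty cutoff used in the study above (FI >= 0.25)."""
    return fi >= threshold

# One deficit present out of four assessed gives FI = 0.25, i.e. frail
# by this study's threshold.
```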


Subject(s)
Frailty , Lung Transplantation , Sarcopenia , Humans , Frailty/complications , Retrospective Studies , Sarcopenia/epidemiology , Sarcopenia/complications , Lung
7.
Transplant Proc ; 54(8): 2270-2276, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36123193

ABSTRACT

BACKGROUND: Acute rejection is a risk factor for the development of chronic lung allograft dysfunction, the leading cause of morbidity and mortality in lung transplant recipients. Calcineurin inhibitors are the cornerstone of immunosuppression regimens after lung transplantation. METHODS: We retrospectively evaluated the association of tacrolimus level variability with total acute rejection score at 12 months post-transplant. Secondary outcomes included the development of chronic lung allograft dysfunction and antibody-mediated rejection at 24 months post-transplant. There were 229 lung transplant recipients included. RESULTS: The mean (standard deviation) total rejection score of the cohort was 1.6 (1.7). Patients with high tacrolimus variability at 0 to 3, 3 to 6, and 6 to 12 months scored on average 0.18 (mean 1.6 vs 1.5; 95% CI: -0.3 to 0.66, P = .46), 0.14 (mean 1.7 vs 1.5; 95% CI: -0.32 to 0.6, P = .55), and 0.12 (mean 1.6 vs 1.5; 95% CI: -0.34 to 0.58, P = .62) points higher on the 12-month total acute rejection score, respectively; however, these differences were not statistically significant. The incidences of chronic lung allograft dysfunction and antibody-mediated rejection were numerically greater in the high variability group during certain periods; however, this was not consistent across all study timeframes, and statistical significance was not evaluated. CONCLUSIONS: High tacrolimus variability was not associated with increased 12-month total acute rejection score. Further studies are needed to assess long-term outcomes with tacrolimus level variability.
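The abstract does not state how tacrolimus level variability was quantified. A common choice in the transplant literature is the coefficient of variation (CV) of trough levels; the sketch below illustrates that metric as an assumption, not the study's actual method:

```python
import statistics

def tacrolimus_cv(trough_levels: list[float]) -> float:
    """Coefficient of variation (%) of tacrolimus trough levels:
    sample standard deviation divided by the mean, times 100.
    An illustrative variability metric, not necessarily the study's."""
    if len(trough_levels) < 2:
        raise ValueError("need at least two levels to compute variability")
    mean = statistics.mean(trough_levels)
    if mean == 0:
        raise ValueError("mean trough level must be nonzero")
    return 100 * statistics.stdev(trough_levels) / mean

# Example: troughs of 8, 10, and 12 ng/mL give a CV of 20%.
```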


Subject(s)
Lung Transplantation , Tacrolimus , Humans , Tacrolimus/adverse effects , Graft Rejection/epidemiology , Immunosuppressive Agents/adverse effects , Retrospective Studies , Lung Transplantation/adverse effects
8.
Am J Transplant ; 22(12): 3002-3011, 2022 12.
Article in English | MEDLINE | ID: mdl-36031951

ABSTRACT

We determined prognostic implications of acute lung injury (ALI) and organizing pneumonia (OP), including timing relative to transplantation, in a multicenter lung recipient cohort. We sought to understand clinical risks that contribute to development of ALI/OP. We analyzed prospective, histologic diagnoses of ALI and OP in 4786 lung biopsies from 803 adult lung recipients. Univariable Cox regression was used to evaluate the impact of early (≤90 days) or late (>90 days) posttransplant ALI or OP on risk for chronic lung allograft dysfunction (CLAD) or death/retransplantation. These analyses demonstrated late ALI/OP conferred a two- to threefold increase in the hazards of CLAD or death/retransplantation; there was no association between early ALI/OP and these outcomes. To determine risk factors for late ALI/OP, we used univariable Cox models considering donor/recipient characteristics and posttransplant events as candidate risks. Grade 3 primary graft dysfunction, higher degree of donor/recipient human leukocyte antigen mismatch, bacterial or viral respiratory infection, and an early ALI/OP event were significantly associated with increased late ALI/OP risk. These data from a contemporary, multicenter cohort underscore the prognostic implications of ALI/OP on lung recipient outcomes, clarify the importance of the timing of these events, and identify clinical risks to target for ALI/OP prevention.


Subject(s)
Acute Lung Injury , Lung Transplantation , Pneumonia , Adult , Humans , Prospective Studies , Prognosis , Retrospective Studies , Lung Transplantation/adverse effects , Acute Lung Injury/etiology , Acute Lung Injury/pathology , Lung , Pneumonia/epidemiology , Pneumonia/etiology , Pneumonia/pathology , Risk Factors , Cohort Studies
9.
Am J Respir Crit Care Med ; 206(12): 1495-1507, 2022 12 15.
Article in English | MEDLINE | ID: mdl-35876129

ABSTRACT

Rationale: It remains unclear how gastroesophageal reflux disease (GERD) affects allograft microbial community composition in lung transplant recipients and its impact on lung allograft inflammation and function. Objectives: Our objective was to compare the allograft microbiota in lung transplant recipients with or without clinically diagnosed GERD in the first year after transplant and assess associations between GERD, allograft microbiota, inflammation, and acute and chronic lung allograft dysfunction (ALAD and CLAD). Methods: A total of 268 BAL samples were collected from 75 lung transplant recipients at a single transplant center every 3 months after transplant for 1 year. Ten transplant recipients from a separate transplant center provided samples before and after antireflux Nissen fundoplication surgery. Microbial community composition and density were measured using 16S ribosomal RNA gene sequencing and quantitative polymerase chain reaction, respectively, and inflammatory markers and bile acids were quantified. Measurements and Main Results: We observed a range of allograft community composition with three discernible types (labeled community state types [CSTs] 1-3). Transplant recipients with GERD were more likely to have CST1, characterized by high bacterial density and relative abundance of the oropharyngeal colonizing genera Prevotella and Veillonella. GERD was associated with more frequent transitions to CST1. CST1 was associated with lower inflammatory cytokine concentrations than pathogen-dominated CST3 across the range of microbial densities observed. Cox proportional hazard models revealed associations between CST3 and the development of ALAD/CLAD. Nissen fundoplication decreased bacterial load and proinflammatory cytokines. Conclusions: GERD was associated with a high bacterial density, Prevotella- and Veillonella-dominated CST1. CST3, but not CST1 or GERD, was associated with inflammation and early development of ALAD and CLAD. 
Nissen fundoplication was associated with a reduction in microbial density in BAL fluid samples, especially the CST1-specific genus, Prevotella.


Subject(s)
Gastroesophageal Reflux , Lung Transplantation , Microbiota , Humans , Retrospective Studies , Gastroesophageal Reflux/complications , Lung , Inflammation , Allografts
10.
Am J Transplant ; 22(9): 2169-2179, 2022 09.
Article in English | MEDLINE | ID: mdl-35634722

ABSTRACT

Histopathologic lung allograft injuries are putative harbingers for chronic lung allograft dysfunction (CLAD). However, the mechanisms responsible are not well understood. CXCL9 and CXCL10 are potent chemoattractants of mononuclear cells and potential propagators of allograft injury. We hypothesized that these chemokines would be quantifiable in plasma, and would associate with subsequent CLAD development. In this prospective multicenter study, we evaluated 721 plasma samples for CXCL9/CXCL10 levels from 184 participants at the time of transbronchial biopsies during their first-year post-transplantation. We determined the association between plasma chemokines, histopathologic injury, and CLAD risk using Cox proportional hazards models. We also evaluated CXCL9/CXCL10 levels in bronchoalveolar lavage (BAL) fluid and compared plasma to BAL with respect to CLAD risk. Plasma CXCL9/CXCL10 levels were elevated during the injury patterns associated with CLAD, acute rejection, and acute lung injury, with a dose-response relationship between chemokine levels and CLAD risk. Importantly, there were strong interactions between injury and plasma CXCL9/CXCL10, where histopathologic injury associated with CLAD only in the presence of elevated plasma chemokines. We observed similar associations and interactions with BAL CXCL9/CXCL10 levels. Elevated plasma CXCL9/CXCL10 during allograft injury may contribute to CLAD pathogenesis and has potential as a minimally invasive immune monitoring biomarker.


Subject(s)
Graft vs Host Disease , Lung Transplantation , Allografts , Biomarkers , Chemokine CXCL10 , Chemokine CXCL9 , Graft Rejection/diagnosis , Graft Rejection/etiology , Humans , Lung , Lung Transplantation/adverse effects , Prospective Studies
11.
Gen Hosp Psychiatry ; 72: 53-58, 2021.
Article in English | MEDLINE | ID: mdl-34298477

ABSTRACT

BACKGROUND: Previous studies suggested that depressive symptoms and sleep quality may be important for long-term clinical outcomes following cardiothoracic transplant. Few studies, however, have systematically examined objective markers of these behavioral factors among ambulatory transplant recipients, or their association with clinical outcomes. METHODS: We examined sleep quality and depressive symptoms with subsequent clinical outcomes (hospitalizations and death) in a sample of 66 lung or heart transplant recipients using a single-center, prospective cohort study. Recipients were assessed at approximately 6 months post-transplant and completed one week of actigraphy assessment to examine sleep quality, along with a self-report measure of mood (Center for Epidemiologic Studies Depression Scale [CES-D]). Recipients were followed for clinical outcomes. RESULTS: At 6 months following transplantation, recipients spent the majority of daytime activity at a sedentary level (61% of daily activity [SD = 10]), and elevated depressive symptoms were common (subclinical = 17%, mild = 12%, moderate = 8%). Over a median follow-up of 4.5 years (IQR = 0.9, 5.1), 51 participants (77%) had at least one unplanned hospitalization and 11 participants (17%) died. In addition, sleep efficiency measurements suggested that a subset of participants exhibited suboptimal sleep (mean efficiency = 87% [SD = 7]). Poorer sleep quality, indexed by lower sleep efficiency and greater sleep fragmentation, was associated with greater depressive symptoms (r's = 0.37-0.50, P < .01). Better sleep quality at 6 months (HR = 0.75 [0.60, 0.95], P = .015), including sleep efficiency (HR = 0.74 [0.56, 0.99], P = .041) and sleep fragmentation (HR = 0.71 [0.53, 0.95], P = .020), was associated with lower risk of hospitalization or death.
Compared with individuals without elevated depressive symptoms or sleep difficulties, individuals with either factor (HR = 1.72 [1.05, 2.81], P = .031) or both factors (HR = 2.37 [1.35, 4.18], P = .003) exhibited greater risk of clinical events in adjusted analyses. CONCLUSIONS: Sleep quality is associated with depressive symptoms among cardiothoracic transplant recipients and enhances the prognostic association between biobehavioral risk factors and clinical outcomes.


Subject(s)
Depression , Sleep Quality , Depression/epidemiology , Follow-Up Studies , Humans , Pilot Projects , Prospective Studies , Sleep
12.
Am J Transplant ; 21(10): 3401-3410, 2021 10.
Article in English | MEDLINE | ID: mdl-33840162

ABSTRACT

The histopathologic diagnosis of acute allograft injury is prognostically important in lung transplantation, with evidence demonstrating a strong and consistent association between acute rejection (AR), acute lung injury (ALI), and the subsequent development of chronic lung allograft dysfunction (CLAD). The pathogenesis of these allograft injuries, however, remains poorly understood. CXCL9 and CXCL10 are CXC chemokines induced by interferon-γ that act as potent chemoattractants of mononuclear cells. We hypothesized that these chemokines are involved in the mononuclear cell recruitment associated with AR and ALI, and that their increased activity could be quantified as increased levels in the bronchoalveolar lavage fluid. In this prospective multicenter study, we evaluated the incidence of histopathologic allograft injury during the first year post-transplant and measured bronchoalveolar CXCL9 and CXCL10 levels at the time of biopsy. In multivariable models, CXCL9 levels were 1.7-fold and 2.1-fold higher during AR and ALI, respectively, compared with "normal" biopsies without histopathology. Similarly, CXCL10 levels were 1.6-fold and 2.2-fold higher during these histopathologies, respectively. These findings support the association of CXCL9 and CXCL10 with episodes of AR and ALI and provide potential insight into the pathogenesis of these deleterious events.


Subject(s)
Chemokine CXCL10 , Graft Rejection , Allografts , Chemokine CXCL9 , Graft Rejection/etiology , Lung , Prospective Studies
13.
J Med Virol ; 93(8): 5040-5047, 2021 08.
Article in English | MEDLINE | ID: mdl-33704812

ABSTRACT

Epstein-Barr virus (EBV)-driven posttransplant lymphoproliferative disorder (PTLD) is a serious complication following lung transplant. The extent to which the presence of EBV in PTLD tissue is associated with survival is uncertain. Moreover, whether the heterogeneity in expression of EBV latency programs is related to the timing of PTLD onset remains unexplored. We retrospectively performed a comprehensive histological evaluation of EBV markers at the tissue level in 34 adult lung transplant recipients with early- and late-onset PTLD. Early-onset PTLD, occurring within the first 12 months posttransplant, had higher odds of expressing EBV markers. The presence of EBV in PTLD was not associated with a difference in survival relative to EBV-negative tumors. However, we found evidence of heterogeneous expression of EBV latency programs, including types III, IIb, IIa, and 0/I. Our study suggests that the heterogeneous expression of EBV latency programs may represent a mechanism for immune evasion in patients with PTLD after lung transplant. The recognition of multiple EBV latency programs can inform personalized medicine in patients who are nonresponsive to traditional types of chemotherapy and can potentially be evaluated in other types of solid organ transplants.


Subject(s)
Epstein-Barr Virus Infections/virology , Herpesvirus 4, Human/genetics , Lung/virology , Lymphoproliferative Disorders/virology , Organ Transplantation/adverse effects , Adult , Epstein-Barr Virus Infections/etiology , Epstein-Barr Virus Infections/mortality , Female , Gene Expression , Humans , Lung/metabolism , Lung/surgery , Lymphoproliferative Disorders/etiology , Lymphoproliferative Disorders/mortality , Male , Middle Aged , Organ Transplantation/mortality , Retrospective Studies , Transplant Recipients , Viral Proteins/genetics , Viral Proteins/metabolism , Virus Latency/genetics
14.
J Heart Lung Transplant ; 40(1): 42-55, 2021 01.
Article in English | MEDLINE | ID: mdl-33208278

ABSTRACT

BACKGROUND: Lung transplantation is increasingly performed in recipients aged ≥65 years. However, the risk factors for mortality specific to this population have not been well studied. In lung transplant recipients aged ≥65 years, we sought to determine post-transplant survival and the clinical factors associated with post-transplant mortality. METHODS: We investigated 5,815 adult lung transplant recipients aged ≥65 years in the Scientific Registry of Transplant Recipients. Mortality was defined as a composite of recipient death or retransplantation. The Kaplan-Meier method was used to estimate the median time to mortality. Univariable and multivariable Cox proportional hazards regression models were used to examine the association between time to mortality and 23 donor, recipient, or center characteristics. RESULTS: Median survival in lung transplant recipients aged ≥65 years was 4.41 years (95% CI: 4.21-4.60 years) and significantly worsened with increasing age strata. In the multivariable model, increasing recipient age strata, creatinine level, bilirubin level, hospitalization at the time of transplantation, single lung transplant operation, steroid use at the time of transplantation, donor diabetes, and cytomegalovirus mismatch were independently associated with increased mortality. CONCLUSIONS: Among the 8 risk factors we identified, 5 are readily available at candidate selection and can be used to inform risk and optimize post-transplant survival in patients aged ≥65 years. Furthermore, bilateral lung transplantation may confer improved survival in comparison with single lung transplantation. Our results support that, after careful consideration of risk factors, lung transplantation can provide life-extending benefits in individuals aged ≥65 years.
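The Kaplan-Meier method named above estimates survival as a running product over distinct event times. A compact sketch of the product-limit estimator and the median-survival readout, for illustration only (not the registry analysis):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimates S(t) at each distinct event time.
    events: 1 = death/retransplantation, 0 = censored observation."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t, deaths, n_at_t = data[i][0], 0, 0
        # gather all subjects with this exact time
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            n_at_t += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk  # conditional survival at t
            curve.append((t, surv))
        at_risk -= n_at_t  # deaths and censorings both leave the risk set
    return curve

def median_survival(times, events):
    """First event time at which estimated survival falls to <= 0.5."""
    for t, s in kaplan_meier(times, events):
        if s <= 0.5:
            return t
    return None  # median not reached within follow-up
```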


Subject(s)
Lung Transplantation/mortality , Registries , Tissue Donors , Transplant Recipients/statistics & numerical data , Age Factors , Aged , Female , Graft Survival , Humans , Male , North Carolina/epidemiology , Retrospective Studies , Risk Factors , Survival Rate/trends , Time Factors
15.
J Heart Lung Transplant ; 39(9): 934-944, 2020 09.
Article in English | MEDLINE | ID: mdl-32487471

ABSTRACT

BACKGROUND: Gastroesophageal reflux disease (GERD) is a risk factor for chronic lung allograft dysfunction. Bile acids-putative markers of gastric microaspiration-and inflammatory proteins in the bronchoalveolar lavage (BAL) have been associated with chronic lung allograft dysfunction, but their relationship with GERD remains unclear. Although GERD is thought to drive chronic microaspiration, the selection of patients for anti-reflux surgery lacks precision. This multicenter study aimed to test the association of BAL bile acids with GERD, lung inflammation, allograft function, and anti-reflux surgery. METHODS: We analyzed BAL obtained during the first post-transplant year from a retrospective cohort of patients with and without GERD, as well as BAL obtained before and after Nissen fundoplication anti-reflux surgery from a separate cohort. Levels of taurocholic acid (TCA), glycocholic acid, and cholic acid were measured using mass spectrometry. Protein markers of inflammation and injury were measured using multiplex assay and enzyme-linked immunosorbent assay. RESULTS: At 3 months after transplantation, TCA, IL-1β, IL-12p70, and CCL5 were higher in the BAL of patients with GERD than in that of no-GERD controls. Elevated TCA and glycocholic acid were associated with concurrent acute lung allograft dysfunction and inflammatory proteins. The BAL obtained after anti-reflux surgery contained reduced TCA and inflammatory proteins compared with that obtained before anti-reflux surgery. CONCLUSIONS: Targeted monitoring of TCA and selected inflammatory proteins may be useful in lung transplant recipients with suspected reflux and microaspiration to support diagnosis and guide therapy. Patients with elevated biomarker levels may benefit most from anti-reflux surgery to reduce microaspiration and allograft inflammation.


Subject(s)
Bile Acids and Salts/metabolism , Bronchiolitis Obliterans/surgery , Bronchoalveolar Lavage Fluid/chemistry , Gastroesophageal Reflux/complications , Graft Rejection/metabolism , Lung Transplantation , Transplant Recipients , Adult , Aged , Biomarkers/metabolism , Bronchiolitis Obliterans/complications , Female , Follow-Up Studies , Gastroesophageal Reflux/metabolism , Graft Rejection/etiology , Humans , Male , Middle Aged , Retrospective Studies , Young Adult
16.
Am J Respir Crit Care Med ; 202(4): 576-585, 2020 08 15.
Article in English | MEDLINE | ID: mdl-32379979

ABSTRACT

Rationale: Acute rejection, manifesting as lymphocytic inflammation in a perivascular (acute perivascular rejection [AR]) or peribronchiolar (lymphocytic bronchiolitis [LB]) distribution, is common in lung transplant recipients and increases the risk for chronic graft dysfunction. Objectives: To evaluate clinical factors associated with biopsy-proven acute rejection during the first post-transplant year in a present-day, five-center lung transplant cohort. Methods: We analyzed prospective diagnoses of AR and LB from over 2,000 lung biopsies in 400 newly transplanted adult lung recipients. Because LB without simultaneous AR was rare, our analyses focused on risk factors for AR. Multivariable Cox proportional hazards models were used to assess donor and recipient factors associated with the time to the first AR occurrence. Measurements and Main Results: During the first post-transplant year, 53.3% of patients experienced at least one AR episode. Multivariable proportional hazards analyses accounting for enrolling center effects identified four or more HLA mismatches (hazard ratio [HR], 2.06; P ≤ 0.01) as associated with increased AR hazard, whereas bilateral transplantation (HR, 0.57; P ≤ 0.01) was associated with protection from AR. In addition, Wilcoxon rank-sum analyses demonstrated that bilateral (vs. single) lung recipients, and those with fewer than four (vs. four or more) HLA mismatches, had reduced AR frequency and/or severity during the first post-transplant year. Conclusions: We found a high incidence of AR in a contemporary multicenter lung transplant cohort undergoing consistent biopsy sampling. The previously unrecognized finding of reduced AR in bilateral lung recipients is intriguing and warrants replication and mechanistic exploration.


Subject(s)
Bronchiolitis/epidemiology, Graft Rejection/epidemiology, Lung Transplantation, Postoperative Complications/epidemiology, Acute Disease, Aged, Cohort Studies, Female, Humans, Male, Middle Aged, Risk Factors, Time Factors
17.
Am J Transplant ; 20(6): 1489-1494, 2020 06.
Article in English | MEDLINE | ID: mdl-32342596

ABSTRACT

Long-term survival after lung transplantation lags behind that of other commonly transplanted organs, reflecting the current incomplete understanding of the mechanisms involved in the development of posttransplant lung injury, rejection, infection, and chronic allograft dysfunction. To address this unmet need, 2 ongoing studies funded by the National Institute of Allergy and Infectious Diseases through the Clinical Trials in Organ Transplantation Consortium (CTOT), CTOT-20 and CTOT-22, were dedicated to understanding the clinical factors and biological mechanisms that drive chronic lung allograft dysfunction and those that maintain polyfunctional protective immunity against cytomegalovirus. The CTOT-20 and CTOT-22 studies enrolled 800 lung transplant recipients at 5 North American centers over 3 years. Given the number and complexity of subjects included, CTOT-20 and CTOT-22 utilized innovative data transfers and capitalized on patient-entered data collection to minimize manual data entry at sites. These data were coupled with an extensive biosample collection strategy that included DNA, RNA, plasma, serum, bronchoalveolar lavage fluid, and bronchoalveolar lavage cell pellets. This Special Article describes the CTOT-20 and CTOT-22 protocols, the data and biosample strategy, initial results, and lessons learned through study execution.


Subject(s)
Lung Transplantation, Organ Transplantation, Bronchoalveolar Lavage Fluid, Cytomegalovirus, Graft Rejection/etiology, Humans, Lung Transplantation/adverse effects, Organ Transplantation/adverse effects, Transplant Recipients
18.
Transplant Direct ; 6(3): e535, 2020 Mar.
Article in English | MEDLINE | ID: mdl-32195326

ABSTRACT

BACKGROUND: Impaired functional capacity and emotional distress are associated with lower quality of life (QoL) and worse clinical outcomes in lung transplant recipients. Strategies to increase physical activity and reduce distress are needed. METHODS: The Investigational Study of Psychological Interventions in Recipients of Lung Transplant-III study is a single-site, parallel-group randomized clinical trial in which 150 lung transplant recipients will be randomly assigned to 3 months of telephone-delivered coping skills training combined with aerobic exercise (CSTEX) or to a Standard of Care plus Education control group. The primary endpoints are a global measure of distress and distance walked on the 6-Minute Walk Test. Secondary outcomes include measures of transplant-specific QoL, frailty, health behaviors, and chronic lung allograft dysfunction-free survival. RESULTS: Participants will be evaluated at baseline, at the conclusion of 3 months of weekly treatment, at 1-year follow-up, and annually thereafter for clinical events for up to 4 years (median = 2 y). We also will determine whether functional capacity, distress, and health behaviors (eg, physical activity, medication adherence, and forced expiratory volume in 1 second [FEV1]) mediate the effects of the CSTEX intervention on clinical outcomes. CONCLUSIONS: Should the CSTEX intervention result in better outcomes than the standard of care plus post-transplant education, the remotely delivered CSTEX intervention can be made available to all lung transplant recipients as a way of enhancing their QoL and improving clinical outcomes.

19.
Clin Transplant ; 33(11): e13710, 2019 11.
Article in English | MEDLINE | ID: mdl-31518448

ABSTRACT

BACKGROUND: Physical inactivity and depressive symptoms following cardiothoracic transplantation are recognized as potentially modifiable psychosocial factors that could be targeted to improve clinical outcomes. However, few studies have prospectively assessed these factors in ambulatory, outpatient transplant recipients. METHODS: We conducted a prospective, single-center study examining actigraphy-assessed physical activity (PA) levels over a 1-week period in heart or lung transplant recipients recruited at 6 months (range 4-9) post-transplant. Depressive symptoms (Center for Epidemiologic Studies Depression scale [CESD]), quality of life (QoL), and clinical events (transplant-related hospitalization and death) were collected. Clustered Cox proportional hazards models were used to examine the associations between PA, psychological measures, and clinical events. RESULTS: Among 105 potentially eligible participants, 66 (63%) met inclusion criteria and were enrolled between July 2016 and May 2017, including 42 lung and 24 heart transplant recipients. The mean age of the population was 53 years, 41% were women, and 18% were black. Participants tended to be sedentary, with the majority of activity spent at the "sedentary" level (61%) and an average daily step count of 7188 (SD = 2595). Participants also tended to exhibit subclinical depressive symptoms (mean CESD = 9.4 [SD = 8]), with only a subset (22%) exhibiting levels suggestive of clinical depression. Over a median follow-up of 1.4 years (1.14, 1.62), 21 participants (32%) experienced at least one transplant-related hospitalization, including two deaths. In adjusted survival models, greater intensity of PA (HR = 0.45 [0.24, 0.84] per 0.2 METs, P = .012) was associated with a lower risk of clinical events, whereas greater depressive symptoms (HR = 2.11 [1.58, 2.82] per 9 CESD points, P < .001) at 6 months were associated with a higher likelihood of subsequent transplant-related hospitalization and/or death.
CONCLUSIONS: Physical inactivity and depressive symptoms at 6 months post-transplant were predictive of subsequent adverse clinical events among ambulatory cardiothoracic transplant recipients. Future studies should examine whether improving these potentially modifiable post-transplant risk factors improves clinical outcomes.


Subject(s)
Depressive Disorder/mortality, Exercise, Heart Transplantation/mortality, Lung Transplantation/mortality, Postoperative Complications/mortality, Quality of Life, Depressive Disorder/epidemiology, Female, Follow-Up Studies, Heart Transplantation/adverse effects, Humans, Lung Transplantation/adverse effects, Male, Middle Aged, Pilot Projects, Prognosis, Prospective Studies, Risk Factors, Surveys and Questionnaires, Survival Rate, United States/epidemiology
20.
Chest ; 156(3): 477-485, 2019 09.
Article in English | MEDLINE | ID: mdl-30978332

ABSTRACT

BACKGROUND: Pulmonary fibrosis (PF) is the most common disease indication for lung transplantation. Our recent work implicated an excess of rare genetic variants in the telomere-related genes TERT, RTEL1, and PARN in PF disease risk. The impact of such variants on posttransplant outcomes is uncertain. The objective of this study was to determine if patients with these PF-associated variants have altered rates of posttransplant acute rejection (AR), chronic lung allograft dysfunction (CLAD), and survival. METHODS: The study cohort consisted of 262 PF lung transplant recipients previously genetically characterized by whole exome sequencing. Thirty-one patients (11.8%) had variants in TERT, RTEL1, or PARN, whereas 231 (88.2%) did not. Multivariate Cox proportional hazards models adjusted for relevant clinical variables were used to assess the outcomes of death and CLAD. The AR burden was quantified and compared over the first posttransplant year. RESULTS: Patients with PF with disease-associated variants in TERT, RTEL1, or PARN had a significantly higher risk of death (adjusted hazard ratio [HR], 1.82; 95% CI, 1.07-3.08; P = .03) and CLAD (adjusted HR, 2.88; 95% CI, 1.42-5.87; P = .004) than patients without these variants. There was no difference in AR burden or rates of grade 3 primary graft dysfunction between the two groups. CONCLUSIONS: Rare variants in the telomere-related genes TERT, RTEL1, or PARN are associated with poor posttransplant outcomes among PF lung transplant recipients. Further research is needed to understand the biological mechanisms by which telomere-related variants increase the risk for death and CLAD.


Subject(s)
DNA Helicases/genetics, Exoribonucleases/genetics, Lung Transplantation, Pulmonary Fibrosis/genetics, Pulmonary Fibrosis/surgery, Telomerase/genetics, Aged, Cohort Studies, Female, Graft Rejection/epidemiology, Graft Rejection/genetics, Humans, Male, Middle Aged, Primary Graft Dysfunction/epidemiology, Primary Graft Dysfunction/genetics, Pulmonary Fibrosis/mortality, Survival Rate, Telomere/genetics, Treatment Outcome