Results 1 - 20 of 78
1.
Article in English | MEDLINE | ID: mdl-38913442

ABSTRACT

BACKGROUND: Community-dwelling older adults with sleep disorders are at higher risk of developing dementia. More than 50% of older patients with kidney failure experience sleep disorders, which may explain their high burden of dementia. METHODS: Among 216,158 patients (age ≥66 years) with kidney failure (United States Renal Data System; 2008-2019), we estimated the risk of dementia (including subtypes) associated with sleep disorders using Cox proportional-hazard models with propensity score weighting. We tested whether positive airway pressure (PAP) therapy was associated with reduced dementia risk among patients with obstructive sleep apnea (OSA). RESULTS: 26.3% of patients were diagnosed with sleep disorders; these patients had a higher five-year unadjusted cumulative incidence of any type of dementia (36.2% vs. 32.3%; P<0.001), vascular dementia (4.4% vs. 3.7%; P<0.001), and other/mixed dementia (29.3% vs. 25.8%; P<0.001). Higher risk of any type of dementia was identified in patients with insomnia (aHR=1.42; 95%CI: 1.34-1.51), sleep-related breathing disorders (SRBDs) (aHR=1.20; 95%CI: 1.17-1.23), and other sleep disorders (aHR=1.24; 95%CI: 1.11-1.39). Higher vascular dementia risk was observed in patients with insomnia (aHR=1.43; 95%CI: 1.19-1.73) and SRBDs (aHR=1.15; 95%CI: 1.07-1.24). Patients with SRBDs (aHR=1.07; 95%CI: 1.00-1.15) were at higher risk of Alzheimer's disease. Among patients with OSA, PAP therapy was associated with lower risk of any type of dementia (aHR=0.82; 95%CI: 0.76-0.90) and vascular dementia (aHR=0.65; 95%CI: 0.50-0.85). CONCLUSION: Older patients with kidney failure and sleep disorders are at higher risk of dementia. Sleep is an important modifiable factor that should be considered in targeted interventions to mitigate dementia risk in patients with kidney failure. For patients with OSA, PAP therapy is associated with lower dementia risk.
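The propensity score weighting described above can be sketched in a few lines. This is an illustration with made-up numbers, not the study's code: given fitted propensity scores, stabilized inverse-probability-of-treatment weights are computed per subject before fitting the weighted Cox model.

```python
# Sketch of stabilized inverse-probability-of-treatment weights (IPTW),
# the kind of propensity-score weighting used before fitting a Cox model.
# All data here are hypothetical.

def stabilized_iptw(exposed, propensity):
    """exposed: list of 0/1 flags (1 = sleep disorder diagnosed);
    propensity: list of P(exposure | covariates) from a logistic model."""
    prevalence = sum(exposed) / len(exposed)  # marginal P(exposure)
    weights = []
    for z, p in zip(exposed, propensity):
        if z == 1:
            weights.append(prevalence / p)              # exposed: P(Z=1)/e(x)
        else:
            weights.append((1 - prevalence) / (1 - p))  # unexposed
    return weights

exposed = [1, 1, 0, 0, 0]
propensity = [0.8, 0.4, 0.5, 0.3, 0.2]
w = stabilized_iptw(exposed, propensity)  # e.g. w[0] = 0.4 / 0.8 = 0.5
```

Stabilized (rather than unstabilized 1/e(x)) weights keep the pseudo-population the same size and tame extreme weights from propensities near 0 or 1.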

2.
Nat Commun ; 15(1): 3140, 2024 Apr 11.
Article in English | MEDLINE | ID: mdl-38605083

ABSTRACT

Pig-to-human xenotransplantation is rapidly approaching the clinical arena; however, it is unclear which immunomodulatory regimens will effectively control human immune responses to pig xenografts. Here, we transplant a gene-edited pig kidney into a brain-dead human recipient on pharmacologic immunosuppression and study the human immune response to the xenograft using spatial transcriptomics and single-cell RNA sequencing. Human immune cells are uncommon in the porcine kidney cortex early after xenotransplantation and consist of primarily myeloid cells. Both the porcine resident macrophages and human infiltrating macrophages express genes consistent with an alternatively activated, anti-inflammatory phenotype. No significant infiltration of human B or T cells into the porcine kidney xenograft is detectable. Altogether, these findings provide proof of concept that conventional pharmacologic immunosuppression may be able to restrict infiltration of human immune cells into the xenograft early after compatible pig-to-human kidney xenotransplantation.


Subject(s)
Gene Editing , Kidney , Animals , Swine , Humans , Animals, Genetically Modified , Heterografts , Transplantation, Heterologous , Graft Rejection/genetics
3.
Am J Transplant ; 24(3): 328-337, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38072121

ABSTRACT

Obesity is a chronic, relapsing disease that increases the risks of living kidney donation; at the same time, transplant centers have liberalized body mass index constraints for donors. With the increasing number of antiobesity medications available, the treatment of obesity with antiobesity medications may increase the pool of potential donors and enhance donor safety. Antiobesity medications are intended for long-term use given the chronic nature of obesity. Cessation of treatment can be expected to lead to weight regain and increase the risk of comorbidity rebound/development. In addition, antiobesity medications are meant to be used in conjunction with, rather than as a replacement for, diet and physical activity optimization. Antiobesity medication management includes selecting medications that may ameliorate any coexisting medical conditions, avoiding those that are contraindicated in such conditions, and being sensitive to any out-of-pocket expenses that may be incurred by the potential donor. A number of questions remain regarding who will and should shoulder the costs of long-term obesity treatment for donors. In addition, future studies are needed to quantify the degree of weight loss and duration of weight loss maintenance needed to normalize the risk of adverse kidney outcomes relative to comparable nondonors and lower-weight donors.


Subject(s)
Tissue Donors , Tissue and Organ Harvesting , Humans , Kidney , Obesity/drug therapy , Weight Loss
4.
Am J Transplant ; 24(4): 591-605, 2024 Apr.
Article in English | MEDLINE | ID: mdl-37949413

ABSTRACT

Body mass index is often used to determine kidney transplant (KT) candidacy. However, this measure of body composition (BC) has several limitations, including the inability to accurately capture dry weight. Objective computed tomography (CT)-based measures may improve pre-KT risk stratification and capture physiological aging more accurately. We quantified the association between CT-based BC measurements and waitlist mortality in a retrospective study of 828 KT candidates (2010-2022) with clinically obtained CT scans using adjusted competing risk regression. In total, 42.5% of candidates had myopenia, 11.4% had myopenic obesity (MO), 68.8% had myosteatosis, 24.8% had sarcopenia (probable = 11.2%, confirmed = 10.5%, and severe = 3.1%), and 8.6% had sarcopenic obesity. Myopenia, MO, and sarcopenic obesity were not associated with mortality. Patients with myosteatosis (adjusted subhazard ratio [aSHR] = 1.62, 95% confidence interval [CI]: 1.07-2.45; after confounder adjustment) or sarcopenia (probable: aSHR = 1.78, 95% CI: 1.10-2.88; confirmed: aSHR = 1.68, 95% CI: 1.01-2.82; and severe: aSHR = 2.51, 95% CI: 1.12-5.66; after full adjustment) were at increased risk of mortality. When stratified by age, MO (aSHR = 2.21, 95% CI: 1.28-3.83; P interaction = .005) and myosteatosis (aSHR = 1.95, 95% CI: 1.18-3.21; P interaction = .038) were associated with elevated risk only among candidates <65 years. MO was only associated with waitlist mortality among frail candidates (adjusted hazard ratio = 2.54, 95% CI: 1.28-5.05; P interaction = .021). Transplant centers should consider using BC metrics in addition to body mass index when a CT scan is available to improve pre-KT risk stratification at KT evaluation.


Subject(s)
Kidney Transplantation , Sarcopenia , Humans , Sarcopenia/diagnostic imaging , Sarcopenia/etiology , Risk Assessment/methods , Retrospective Studies , Obesity , Muscular Atrophy , Tomography, X-Ray Computed , Body Composition
5.
Am J Transplant ; 23(12): 1980-1989, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37748554

ABSTRACT

Older compatible living donor kidney transplant (CLDKT) recipients have higher mortality and death-censored graft failure (DCGF) compared to younger recipients. These risks may be amplified in older incompatible living donor kidney transplant (ILDKT) recipients who undergo desensitization and intense immunosuppression. In a 25-center cohort of ILDKT recipients transplanted between September 24, 1997, and December 15, 2016, we compared mortality, DCGF, delayed graft function (DGF), acute rejection (AR), and length of stay (LOS) between 234 older (age ≥60 years) and 1172 younger (age 18-59 years) recipients. To investigate whether the impact of age was different for ILDKT recipients compared to 17 542 CLDKT recipients, we used an interaction term to determine whether the relationship between posttransplant outcomes and transplant type (ILDKT vs CLDKT) was modified by age. Overall, older recipients had higher mortality (hazard ratio: 2.07, 95% CI: 1.63-2.65, P < .001), lower DCGF (hazard ratio: 0.53, 95% CI: 0.36-0.77, P = .001) and AR (odds ratio: 0.54, 95% CI: 0.39-0.74, P < .001), and similar DGF (odds ratio: 1.03, 95% CI: 0.46-2.33, P = .9) and LOS (incidence rate ratio: 0.98, 95% CI: 0.88-1.10, P = .8) compared to younger recipients. The impact of age on mortality (interaction P = .052), DCGF (interaction P = .7), AR (interaction P = .2), DGF (interaction P = .9), and LOS (interaction P = .5) was similar in ILDKT and CLDKT recipients. Age alone should not preclude eligibility for ILDKT.


Subject(s)
Kidney Transplantation , Humans , Aged , Middle Aged , Adolescent , Young Adult , Adult , Kidney Transplantation/adverse effects , Living Donors , Graft Survival , Graft Rejection/etiology , HLA Antigens , Risk Factors
6.
Am J Surg ; 226(5): 692-696, 2023 11.
Article in English | MEDLINE | ID: mdl-37558520

ABSTRACT

INTRODUCTION: Liver allocation changes have led to increased travel and expenditures, highlighting the need to efficiently identify marginal livers suitable for transplant. We evaluated the validity of existing non-invasive liver quality tests and a novel machine learning-based model in predicting deceased donor macrosteatosis >30%. METHODS: We compared previously validated non-invasive tests and a novel machine learning-based model against biopsies in predicting macrosteatosis >30%. We also tested them in populations enriched for macrosteatosis. RESULTS: The Hepatic Steatosis Index area-under-the-curve (AUC) was 0.56. At the threshold identified by Youden's J statistic, sensitivity, specificity, positive, and negative predictive values were 49.6%, 58.9%, 14.0%, and 89.7%, respectively. Other tests demonstrated comparable results. Machine learning produced the highest AUC (0.71). Even in populations enriched for macrosteatosis, no test was sufficiently predictive. CONCLUSION: Commonly used clinical scoring systems and a novel machine learning-based model were not clinically useful, highlighting the importance of pre-procurement biopsies to facilitate allocation.
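Youden's J, used above to pick the operating threshold, is simply sensitivity + specificity - 1 maximized over candidate cutoffs. A minimal sketch with invented scores and labels (not the study's data):

```python
# Choose a classification threshold by maximizing Youden's J statistic,
# J = sensitivity + specificity - 1, over thresholds at observed scores.
# Scores and labels below are hypothetical.

def youden_threshold(scores, labels):
    """Return (best_threshold, best_J); predict positive when score >= t."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        j = tp / pos + tn / neg - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

t, j = youden_threshold([0.1, 0.3, 0.35, 0.6, 0.8, 0.9], [0, 0, 1, 0, 1, 1])
```

On this toy data the best cutoff is 0.35 with J = 2/3; on a weak test like the Hepatic Steatosis Index (AUC 0.56) the maximized J stays close to zero, which is why no threshold was clinically useful.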


Subject(s)
Fatty Liver , Liver Transplantation , Humans , Tissue Donors , Liver Function Tests
7.
JAMA ; 329(9): 735-744, 2023 03 07.
Article in English | MEDLINE | ID: mdl-36881033

ABSTRACT

Importance: In January 2011, the US Food and Drug Administration (FDA) announced a mandate to limit acetaminophen (paracetamol) to 325 mg/tablet in combination acetaminophen and opioid medications, with manufacturer compliance required by March 2014. Objective: To assess the odds of hospitalization and the proportion of acute liver failure (ALF) cases with acetaminophen and opioid toxicity prior to and after the mandate. Design, Setting, and Participants: This interrupted time-series analysis used hospitalization data from 2007-2019 involving ICD-9/ICD-10 codes consistent with both acetaminophen and opioid toxicity from the National Inpatient Sample (NIS), a large US hospitalization database, and ALF cases from 1998-2019 involving acetaminophen and opioid products from the Acute Liver Failure Study Group (ALFSG), a cohort of 32 US medical centers. For comparison, hospitalizations and ALF cases consistent with acetaminophen toxicity alone were extracted from the NIS and ALFSG. Exposures: Time prior to and after the FDA mandate limiting acetaminophen to 325 mg in combination acetaminophen and opioid products. Main Outcomes and Measures: Odds of hospitalization involving acetaminophen and opioid toxicity and percentage of ALF cases from acetaminophen and opioid products prior to and after the mandate. Results: In the NIS, among 474 047 585 hospitalizations from Q1 2007 through Q4 2019, there were 39 606 hospitalizations involving acetaminophen and opioid toxicity; 66.8% of cases were among women; median age, 42.2 (IQR, 28.4-54.1). In the ALFSG, from Q1 1998 through Q3 2019, there were a total of 2631 ALF cases, of which 465 involved acetaminophen and opioid toxicity; 85.4% women; median age, 39.0 (IQR, 32.0-47.0). 
The predicted incidence of hospitalizations 1 day prior to the FDA announcement was 12.2 cases/100 000 hospitalizations (95% CI, 11.0-13.4); by Q4 2019, it was 4.4/100 000 hospitalizations (95% CI, 4.1-4.7) (absolute difference, 7.8/100 000 [95% CI, 6.6-9.0]; P < .001). The odds of hospitalizations with acetaminophen and opioid toxicity increased 11%/y prior to the announcement (odds ratio [OR], 1.11 [95% CI, 1.06-1.15]) and decreased 11%/y after the announcement (OR, 0.89 [95% CI, 0.88-0.90]). The predicted percentage of ALF cases involving acetaminophen and opioid toxicity 1 day prior to the FDA announcement was 27.4% (95% CI, 23.3%-31.9%); by Q3 2019, it was 5.3% (95% CI, 3.1%-8.8%) (absolute difference, 21.8% [95% CI, 15.5%-32.4%]; P < .001). The percentage of ALF cases involving acetaminophen and opioid toxicity increased 7% per year prior to the announcement (OR, 1.07 [95% CI, 1.03-1.1]; P < .001) and decreased 16% per year after the announcement (OR, 0.84 [95% CI, 0.77-0.92]; P < .001). Sensitivity analyses confirmed these findings. Conclusions and Relevance: The FDA mandate limiting acetaminophen dosage to 325 mg/tablet in prescription acetaminophen and opioid products was associated with a statistically significant decrease in the yearly rate of hospitalizations and proportion per year of ALF cases involving acetaminophen and opioid toxicity.
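The interrupted time-series analysis above estimates a log-linear trend on each side of the policy date; the slope on the log scale converts to a percent change per year. A self-contained sketch with fabricated quarterly rates (chosen to mimic the reported +11%/y and -11%/y trends):

```python
import math

# Segmented (interrupted) time-series sketch: fit a log-linear slope
# before and after a policy date and report the implied %/year change.
# Quarterly rates below are made up for illustration.

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

quarters = [i / 4 for i in range(16)]          # 4 years of quarterly data
pre = [10.0 * 1.11 ** t for t in quarters]     # rising ~11%/year pre-mandate
post = [12.0 * 0.89 ** t for t in quarters]    # falling ~11%/year post-mandate

pre_slope = slope(quarters, [math.log(r) for r in pre])
post_slope = slope(quarters, [math.log(r) for r in post])
pre_pct = (math.exp(pre_slope) - 1) * 100      # percent change per year
post_pct = (math.exp(post_slope) - 1) * 100
```

The published analysis additionally models a level change at the announcement and uses the full regression machinery; this sketch shows only the slope-change idea.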


Subject(s)
Acetaminophen , Analgesics, Opioid , Analgesics , Hospitalization , Liver Failure, Acute , Adult , Female , Humans , Male , Acetaminophen/administration & dosage , Acetaminophen/adverse effects , Analgesics, Opioid/administration & dosage , Analgesics, Opioid/adverse effects , Hospitalization/statistics & numerical data , Liver Failure, Acute/chemically induced , Liver Failure, Acute/epidemiology , Liver Failure, Acute/therapy , Prescriptions/statistics & numerical data , United States/epidemiology , United States Food and Drug Administration , Drug Combinations , Analgesics/administration & dosage , Analgesics/adverse effects , Middle Aged
8.
Res Sq ; 2023 Jan 09.
Article in English | MEDLINE | ID: mdl-36711785

ABSTRACT

Pig-to-human xenotransplantation is rapidly approaching the clinical arena; however, it is unclear which immunomodulatory regimens will effectively control human immune responses to pig xenografts. We transplanted a gene-edited pig kidney into a brain-dead human recipient on pharmacologic immunosuppression and studied the human immune response to the xenograft using spatial transcriptomics and single-cell RNA sequencing. Human immune cells were uncommon in the porcine kidney cortex early after xenotransplantation and consisted of primarily myeloid cells. Both the porcine resident macrophages and human infiltrating macrophages expressed genes consistent with an alternatively activated, anti-inflammatory phenotype. No significant infiltration of human B or T cells into the porcine kidney xenograft was detected. Altogether, these findings provide proof of concept that conventional pharmacologic immunosuppression is sufficient to restrict infiltration of human immune cells into the xenograft early after compatible pig-to-human kidney xenotransplantation.

9.
Kidney Int ; 103(5): 936-948, 2023 05.
Article in English | MEDLINE | ID: mdl-36572246

ABSTRACT

Machine learning (ML) models have recently shown potential for predicting kidney allograft outcomes. However, their ability to outperform traditional approaches remains poorly investigated. Therefore, using large cohorts of kidney transplant recipients from 14 centers worldwide, we developed ML-based prediction models for kidney allograft survival and compared their prediction performances to those achieved by a validated Cox-Based Prognostication System (CBPS). In a French derivation cohort of 4000 patients, candidate determinants of allograft failure including donor, recipient, and transplant-related parameters were used as predictors to develop tree-based models (RSF, RSF-ERT, CIF), support vector machine models (LK-SVM, AK-SVM), and a gradient boosting model (XGBoost). Models were externally validated with cohorts of 2214 patients from Europe, 1537 from North America, and 671 from South America. Among these 8422 kidney transplant recipients, 1081 (12.84%) lost their grafts after a median post-transplant follow-up time of 6.25 years (interquartile range 4.33-8.73). At seven years post-risk evaluation, the ML models achieved a C-index of 0.788 (95% bootstrap percentile confidence interval 0.736-0.833), 0.779 (0.724-0.825), 0.786 (0.735-0.832), 0.527 (0.456-0.602), 0.704 (0.648-0.759), and 0.767 (0.711-0.815) for RSF, RSF-ERT, CIF, LK-SVM, AK-SVM, and XGBoost, respectively, compared with 0.808 (0.792-0.829) for the CBPS. In validation cohorts, the ML models' discrimination performances were similar to those of the CBPS. Calibrations of the ML models were similar to or less accurate than those of the CBPS. Thus, when using a transparent methodological pipeline in validated international cohorts, ML models, despite overall good performances, do not outperform a traditional CBPS in predicting kidney allograft failure. Hence, our current study supports the continued use of traditional statistical approaches for kidney graft prognostication.
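The C-index reported above is Harrell's concordance for right-censored survival data. A minimal sketch with toy follow-up times (ignoring tied event times for brevity; real implementations handle ties and censoring more carefully):

```python
# Sketch of Harrell's C-index for right-censored survival data.
# A pair (i, j) is comparable when the subject with shorter follow-up
# had an event; the pair is concordant when the model gave that subject
# the higher predicted risk. Toy data only.

def c_index(times, events, risks):
    concordant = comparable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i] == 1:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1          # higher risk failed first
                elif risks[i] == risks[j]:
                    concordant += 0.5        # tied risks count half
    return concordant / comparable

times = [2, 4, 6, 8]
events = [1, 1, 0, 1]          # subject 2 is censored
risks = [0.9, 0.3, 0.5, 0.7]   # model's predicted risk scores
c = c_index(times, events, risks)
```

Here 5 pairs are comparable and 3 are concordant, giving C = 0.6; a value of 0.5 is chance-level discrimination and 1.0 is perfect ranking, which puts the 0.78-0.81 values above in context.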


Subject(s)
Kidney Transplantation , Renal Insufficiency , Humans , Kidney Transplantation/adverse effects , Kidney , Transplantation, Homologous , Machine Learning , Allografts , Graft Survival
10.
Ann Surg ; 278(1): e115-e122, 2023 Jul 01.
Article in English | MEDLINE | ID: mdl-35946818

ABSTRACT

OBJECTIVE: To examine whether body mass index (BMI) changes modify the association between kidney donation and incident hypertension. BACKGROUND: Obesity increases hypertension risk in both general and living kidney donor (LKD) populations. Donation-attributable risk in the context of obesity, and whether weight change modifies that risk, is unknown. METHODS: Nested case-control study among 1558 adult LKDs (1976-2020) with obesity (median follow-up: 3.6 years; interquartile range: 2.0-9.4) and 3783 adults with obesity in the Coronary Artery Risk Development in Young Adults (CARDIA) and Atherosclerosis Risk in Communities (ARIC) studies (9.2 y; interquartile range: 5.3-15.8). Hypertension incidence was compared by donor status using conditional logistic regression, with BMI change investigated for effect modification. RESULTS: Overall, LKDs and nondonors had similar hypertension incidence [incidence rate ratio (IRR): 1.16, 95% confidence interval (95% CI): 0.94-1.43, P =0.16], even after adjusting for BMI change (IRR: 1.25, 95% CI: 0.99-1.58, P =0.05). Although LKDs and nondonors who lost >5% BMI had comparable hypertension incidence (IRR: 0.78, 95% CI: 0.46-1.34, P =0.36), there was a significant interaction between donor and >5% BMI gain (multiplicative interaction IRR: 1.62, 95% CI: 1.15-2.29, P =0.006; relative excess risk due to interaction: 0.90, 95% CI: 0.24-1.56, P =0.007), such that LKDs who gained weight had higher hypertension incidence than similar nondonors (IRR: 1.83, 95% CI: 1.32-2.53, P <0.001). CONCLUSIONS: Overall, LKDs and nondonors with obesity had similar hypertension incidence. Weight stability and loss were associated with similar hypertension incidence by donor status. However, LKDs who gained >5% saw increased hypertension incidence versus similar nondonors, providing support for counseling potential LKDs with obesity on weight management postdonation.
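The "relative excess risk due to interaction" (RERI) reported above measures additive interaction from three relative risks. The arithmetic is simple; the RR10 and RR01 values below are invented for illustration (the abstract reports only the fitted RERI of 0.90):

```python
# Additive-interaction arithmetic: RERI = RR11 - RR10 - RR01 + 1,
# where RR11 is the jointly exposed group (donor AND >5% BMI gain),
# RR10 and RR01 the singly exposed groups, all vs. the doubly
# unexposed reference. RERI > 0 indicates super-additive interaction.

def reri(rr11, rr10, rr01):
    return rr11 - rr10 - rr01 + 1

# Hypothetical rate ratios chosen to reproduce a RERI of 0.9.
value = reri(rr11=2.9, rr10=1.2, rr01=1.8)
```

A positive RERI says the joint effect exceeds the sum of the separate effects, which is the basis for the abstract's conclusion that weight gain is specifically riskier for donors.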


Subject(s)
Hypertension , Kidney Transplantation , Young Adult , Humans , Body Mass Index , Kidney Transplantation/adverse effects , Case-Control Studies , Nephrectomy , Risk Factors , Obesity/complications , Obesity/epidemiology , Hypertension/epidemiology , Hypertension/etiology , Living Donors
11.
Am J Surg ; 225(2): 425-428, 2023 02.
Article in English | MEDLINE | ID: mdl-36167624

ABSTRACT

BACKGROUND: The Chronic Kidney Disease (CKD) Epidemiology Collaboration 2021 eGFR formula removed Black race from the 2009 equation. Unintended consequences may include reclassifying Black living kidney donors as having more advanced CKD, exacerbating racial disparities in living donation. METHODS: We used national data to quantify CKD stage reclassification based on eGFR for Black living donors both pre- and post-donation. RESULTS: Among 6365 Black living donors, 17.7% were reclassified as having a higher CKD stage pre-donation with the 2021 formula. Among 4149 Black living donors with at least 2 creatinine measurements post-donation, 25.5% were reclassified as having a higher CKD stage post-donation with the 2021 formula. CONCLUSION: Eliminating race from the formula may inappropriately label Black potential donors as having CKD. These data highlight the need for a validated eGFR formula for donors, use of measured rather than estimated GFR, and education of non-transplant providers regarding interpretation of CKD staging in living donation.
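The reclassification mechanism can be made concrete with the two creatinine-based CKD-EPI equations. Coefficients are as published by the CKD-EPI investigators; this is an illustrative sketch, and values should be verified against the original papers before any clinical use.

```python
# 2009 vs 2021 CKD-EPI creatinine equations (mL/min/1.73 m2), showing how
# removing the race coefficient can shift a Black donor to a higher CKD
# stage. Coefficients as published; verify before clinical use.

def egfr_2009(scr, age, female, black):
    k, a = (0.7, -0.329) if female else (0.9, -0.411)
    g = 141 * min(scr / k, 1) ** a * max(scr / k, 1) ** -1.209 * 0.993 ** age
    if female:
        g *= 1.018
    if black:
        g *= 1.159  # the race coefficient removed in 2021
    return g

def egfr_2021(scr, age, female):
    k, a = (0.7, -0.241) if female else (0.9, -0.302)
    g = 142 * min(scr / k, 1) ** a * max(scr / k, 1) ** -1.200 * 0.9938 ** age
    if female:
        g *= 1.012
    return g

# A 60-year-old Black man with serum creatinine 1.4 mg/dL lands around
# 63 (CKD stage 2, eGFR >= 60) under 2009 but around 58 (stage 3a,
# eGFR 45-59) under 2021 -- the reclassification the study quantifies.
e09 = egfr_2009(1.4, 60, female=False, black=True)
e21 = egfr_2021(1.4, 60, female=False)
```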


Subject(s)
Kidney Transplantation , Renal Insufficiency, Chronic , Humans , Living Donors , Glomerular Filtration Rate , Creatinine , Kidney
12.
PLoS One ; 17(11): e0276882, 2022.
Article in English | MEDLINE | ID: mdl-36399462

ABSTRACT

BACKGROUND: Approval of living kidney donors (LKD) with end-stage kidney disease (ESKD) risk factors, such as obesity, has increased. While lifetime ESKD development data are lacking, the study of intermediate outcomes such as diabetes is critical for LKD safety. Donation-attributable diabetes risk among persons with obesity remains unknown. The purpose of this study was to evaluate 10-year diabetes-free survival among LKDs and non-donors with obesity. METHODS: This longitudinal cohort study identified adult LKDs (1976-2020) from 42 US transplant centers and non-donors from the Coronary Artery Risk Development in Young Adults (1985-1986) and the Atherosclerosis Risk in Communities (1987-1989) studies with body mass index ≥30 kg/m2. LKDs were matched to non-donors on baseline characteristics (age, sex, race, body mass index, systolic and diastolic blood pressure) plus diabetes-specific risk factors (family history of diabetes, impaired fasting glucose, smoking history). Accelerated failure time models were utilized to evaluate 10-year diabetes-free survival. FINDINGS: Among 3464 participants, 1119 (32%) were LKDs and 2345 (68%) were non-donors. After matching on baseline characteristics plus diabetes-specific risk factors, 4% (7/165) of LKDs and 9% (15/165) of non-donors developed diabetes (median follow-up time 8.5 [IQR: 5.6-10.0] and 9.1 [IQR: 5.9-10.0] years, respectively). Although the difference was not statistically significant, LKDs were estimated to live diabetes-free 2 times longer than non-donors (estimate 1.91; 95% CI: 0.79-4.64, p = 0.15). CONCLUSIONS: LKDs with obesity trended toward living longer diabetes-free than non-donors with obesity, suggesting that within the decade following donation there was no increased diabetes risk among LKDs. Further work is needed to evaluate donation-attributable diabetes risk long-term.


Subject(s)
Diabetes Mellitus , Kidney Failure, Chronic , Kidney Transplantation , Humans , Young Adult , Kidney Transplantation/adverse effects , Longitudinal Studies , Living Donors , Cohort Studies , Obesity/complications , Obesity/epidemiology , Diabetes Mellitus/etiology
13.
Obesity (Silver Spring) ; 30(11): 2204-2212, 2022 11.
Article in English | MEDLINE | ID: mdl-36161516

ABSTRACT

OBJECTIVE: Recent changes to the Chronic Kidney Disease Epidemiology Collaboration estimated glomerular filtration rate (eGFR) formula (2021 CKD-EPI) removed race from the 2009 formula, increasing the number of Black people classified as having CKD, but these changes may impact eligibility and/or dosing for antiobesity medications. This study estimated the number of people with obesity nationwide who might have pharmacotherapy options impacted by the new formula. METHODS: Using National Health and Nutrition Examination Survey (NHANES) cohort study data, the number of people eligible for antiobesity medication was estimated, and the number who would require a dosage reduction or would no longer be eligible for specific medications based on the new eGFR formula was also estimated. RESULTS: Among 16,412,571 Black and 109,654,751 non-Black people eligible for antiobesity medication, 911,336 (6.1%) Black and 6,925,492 (6.6%) non-Black people had ≥CKD stage 3 by the 2009 CKD-EPI formula. Applying the 2021 CKD-EPI formula, 1,260,969 (8.5%) Black people and 4,989,919 (4.7%) non-Black people had ≥CKD stage 3. For medications requiring renal adjustment, the number of Black people who would require a lower dose or be precluded from using a medication increased by 24.7% to 50.2%. CONCLUSIONS: These findings highlight the importance of measuring-rather than estimating-GFR in Black people with CKD when considering many antiobesity pharmacotherapy options.


Subject(s)
Renal Insufficiency, Chronic , Humans , Glomerular Filtration Rate , Nutrition Surveys , Cohort Studies , Renal Insufficiency, Chronic/epidemiology , Obesity , Creatinine
14.
Transpl Int ; 35: 10626, 2022.
Article in English | MEDLINE | ID: mdl-35928347

ABSTRACT

Alloimmune responses in kidney transplant (KT) patients previously hospitalized with COVID-19 are understudied. We analyzed a cohort of 112 kidney transplant recipients who were hospitalized following a positive SARS-CoV-2 test result during the first 20 months of the COVID-19 pandemic. We found a cumulative incidence of 17% for the development of new donor-specific antibodies (DSA) or increased levels of pre-existing DSA in hospitalized SARS-CoV-2-infected KT patients. This risk extended 8 months post-infection. These changes in DSA status were associated with late allograft dysfunction. Risk factors for new or increased DSA responses in this KT patient cohort included the presence of circulating DSA pre-COVID-19 diagnosis and time post-transplantation. COVID-19 vaccination prior to infection and remdesivir administration during infection were each associated with decreased likelihood of developing a new or increased DSA response. These data show that new or enhanced DSA responses frequently occur among KT patients requiring admission with COVID-19 and suggest that surveillance, vaccination, and antiviral therapies may be important tools to prevent alloimmunity in these individuals.


Subject(s)
COVID-19 Drug Treatment , COVID-19 , Kidney Transplantation , Adenosine Monophosphate/analogs & derivatives , Alanine/analogs & derivatives , Antibodies , COVID-19/prevention & control , COVID-19 Testing , COVID-19 Vaccines/therapeutic use , Graft Rejection , HLA Antigens , Humans , Pandemics , SARS-CoV-2 , Transplant Recipients , Vaccination
15.
Ann Surg ; 276(4): 597-604, 2022 10 01.
Article in English | MEDLINE | ID: mdl-35837899

ABSTRACT

BACKGROUND: The burden of end-stage kidney disease (ESKD) and kidney transplant rates vary significantly across the United States. This study aims to examine the mismatch between ESKD burden and kidney transplant rates from the perspective of spatial epidemiology. METHODS: US Renal Data System data from 2015 to 2017 on incident ESKD and kidney transplants per 1000 incident ESKD cases were analyzed. Clustering of ESKD burden and kidney transplant rates at the county level was determined using local Moran's I and correlated to county health scores. Higher percentile county health scores indicated worse overall community health. RESULTS: Significant clusters of high ESKD burden tended to coincide with clusters of low kidney transplant rates, and vice versa. The most common cluster type had high incident ESKD with low transplant rates (377 counties). Counties in these clusters had the lowest overall mean transplant rate (61.1), highest overall mean ESKD incidence (61.3), and highest mean county health score percentile (80.9%, P <0.001 vs all other cluster types). By comparison, counties in clusters with low ESKD incidence and high transplant rates (n=359) had the highest mean transplant rate (110.6), the lowest mean ESKD incidence (28.9), and the lowest county health scores (20.2%). All comparisons to high-ESKD/low-transplant clusters were significant at P value <0.001. CONCLUSION: There was a significant mismatch between kidney transplant rates and ESKD burden, where areas with the greatest need had the lowest transplant rates. This pattern exacerbates pre-existing disparities, as disadvantaged high-ESKD regions already suffer from worse access to care and overall community health, as evidenced by the highest county health scores in the study.
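The clustering above rests on Moran's I, a spatial autocorrelation statistic; the study uses its local (county-level) decomposition, but the global form conveys the idea. A sketch with a hypothetical four-county adjacency, not the study's data:

```python
# Global Moran's I: spatial autocorrelation of a variable x given a
# spatial weights matrix w (here binary adjacency). Positive I means
# similar values cluster in space; negative means neighbors differ.
# Toy 4-county example only.

def morans_i(x, w):
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    num = sum(w[i][j] * dev[i] * dev[j] for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(sum(row) for row in w)
    return (n / w_sum) * (num / den)

# Four counties on a line: 1-2-3-4 adjacency.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
clustered = [10, 9, 2, 1]      # high values border high -> positive I
alternating = [10, 1, 10, 1]   # neighbors always differ -> negative I
i_pos = morans_i(clustered, w)
i_neg = morans_i(alternating, w)
```

Local Moran's I splits this sum into per-county contributions, which is how the study labels individual counties as members of high-high or high-low clusters.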


Subject(s)
Kidney Failure, Chronic , Kidney Transplantation , Cluster Analysis , Humans , Incidence , Kidney Failure, Chronic/epidemiology , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , United States/epidemiology
16.
Am J Gastroenterol ; 117(12): 1990-1998, 2022 12 01.
Article in English | MEDLINE | ID: mdl-35853462

ABSTRACT

INTRODUCTION: In published studies of early liver transplantation (LT) for alcohol-associated hepatitis (AH), patients with a prior liver decompensation are excluded. The appropriateness of this criterion is unknown. METHODS: Among 6 American Consortium of Early Liver Transplantation for Alcohol-Associated Hepatitis sites, we included consecutive early LT for clinically diagnosed AH between 2007 and 2020. Patients were stratified as first vs prior history of liver decompensation, with the latter defined as a diagnosis of ascites, hepatic encephalopathy, variceal bleeding, or jaundice, and evidence of alcohol use after this event. Adjusted Cox regression assessed the association of first (vs prior) decompensation with post-LT mortality and harmful (i.e., any binge and/or frequent) alcohol use. RESULTS: A total of 241 LT recipients (210 first vs 31 prior decompensation) were included: median age 43 vs 38 years (P = 0.23), Model for End-Stage Liver Disease Sodium score of 39 vs 39 (P = 0.98), and follow-up after LT 2.3 vs 1.7 years (P = 0.08). Unadjusted 1- and 3-year survival among first vs prior decompensation was 93% (95% confidence interval [CI] 89%-96%) vs 86% (95% CI 66%-94%) and 85% (95% CI 79%-90%) vs 78% (95% CI 57%-89%). Prior (vs first) decompensation was associated with higher adjusted post-LT mortality (adjusted hazard ratio 2.72, 95% CI 1.61-4.59) and harmful alcohol use (adjusted hazard ratio 1.77, 95% CI 1.07-2.94). DISCUSSION: Prior liver decompensation was associated with higher risk of post-LT mortality and harmful alcohol use. These results are a preliminary safety signal and validate first decompensation as a criterion for consideration in early LT for AH patients. However, the high 3-year survival suggests a survival benefit for early LT and the need for larger studies to refine this criterion. These results suggest that prior liver decompensation is a risk factor, but not an absolute contraindication, to early LT.


Subject(s)
End Stage Liver Disease , Esophageal and Gastric Varices , Hepatitis, Alcoholic , Liver Transplantation , Humans , Adult , End Stage Liver Disease/surgery , Gastrointestinal Hemorrhage , Severity of Illness Index , Hepatitis, Alcoholic/surgery , Retrospective Studies
17.
Surgery ; 172(3): 997-1004, 2022 09.
Article in English | MEDLINE | ID: mdl-35831221

ABSTRACT

BACKGROUND: Community-level factors contribute to living donor kidney transplantation disparities but may also influence the interventions aimed at mitigating these disparities. The Living Donor Navigator Program was designed to separate the advocacy role from the patient in need of transplantation: friends/family are encouraged to participate as the patients' advocates to identify living donors, though some patients participate alone as self-advocates. Self-advocates have a lower likelihood of living donor kidney transplantation compared to patients with an advocate. We sought to evaluate the relationship between patients' community-level vulnerability and Living Donor Navigator self-advocacy as a surrogate for program fidelity. METHODS: This single-center, retrospective study included 110 Living Donor Navigator participants (April 2017-June 2019). Program fidelity was assessed using the participants' advocacy status. Measures of community vulnerability were obtained from the Centers for Disease Control and Prevention Social Vulnerability Index. Modified Poisson regression was used to evaluate the association between community-level vulnerability and Living Donor Navigator self-advocacy. RESULTS: Of the 110 participants, 19% (n = 21) were self-advocates. For every 10% increase in community-level vulnerability, patients had a 17% higher risk of self-advocacy (adjusted relative risk 1.17, 95% confidence interval: 1.03-1.32, P = .01). Living in areas with greater unemployment (adjusted relative risk: 1.18, 95% confidence interval: 1.04-1.33, P = .01), more single-parent households (adjusted relative risk: 1.23, 95% confidence interval: 1.06-1.42, P = .006), a larger minority population (adjusted relative risk: 1.30, 95% confidence interval: 1.04-1.55, P = .02), or more no-vehicle households (adjusted relative risk: 1.17, 95% confidence interval: 1.02-1.35, P = .02) was associated with increased risk of self-advocacy.
CONCLUSION: Greater community-level vulnerability was associated with poorer Living Donor Navigator Program fidelity. The potential barriers identified using the Social Vulnerability Index may direct resource allocation and program refinement to optimize program fidelity and efficacy for all participants.
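The modified Poisson model above uses a log link, so the reported adjusted relative risk of 1.17 per 10-point increase in vulnerability compounds multiplicatively over larger differences. A minimal sketch of that arithmetic (an illustration of the log-linear assumption, not a calculation from the paper's data):

```python
# Illustration only: under a log-link (modified Poisson) model, the relative
# risk per one 10-point SVI increment compounds multiplicatively.

def compounded_rr(rr_per_unit: float, n_units: float) -> float:
    """Relative risk across n_units increments, assuming log-linearity."""
    return rr_per_unit ** n_units

# Reported aRR of 1.17 per 10% increase in community-level vulnerability;
# implied risk ratio for patients 30 percentage points apart:
print(round(compounded_rr(1.17, 3), 2))
```

So a 30-point gap in community vulnerability implies roughly a 1.6-fold difference in the risk of self-advocacy, under the model's log-linear assumption.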


Subject(s)
Kidney Transplantation , Living Donors , Humans , Minority Groups , Retrospective Studies , Risk
18.
Transplantation; 106(9): 1799-1806, 2022 Sep 1.
Article in English | MEDLINE | ID: mdl-35609185

ABSTRACT

BACKGROUND: Much of our understanding of geographic issues in transplantation is based on statistical techniques that do not formally account for geography and on obsolete boundaries such as the donation service area. METHODS: We applied spatial epidemiological techniques to analyze liver-related mortality and access to liver transplant services at the county level using data from the Centers for Disease Control and Prevention and the Scientific Registry of Transplant Recipients from 2010 to 2018. RESULTS: There was a significant negative spatial correlation between transplant rates and liver-related mortality at the county level (Moran's I, -0.319; P = 0.001). Significant clusters were identified with high transplant rates and low liver-related mortality. Counties in geographic clusters with high ratios of liver transplants to liver-related deaths had more liver transplant centers within 150 nautical miles (6.7 versus 3.6 centers; P < 0.001) compared with all other counties, as did counties in geographic clusters with high ratios of waitlist additions to liver-related deaths (8.5 versus 2.5 centers; P < 0.001). The spatial correlation between waitlist mortality and overall liver-related mortality was positive (Moran's I, 0.060; P = 0.001) but weaker. Several areas with high waitlist mortality had some of the lowest overall liver-related mortality in the country. CONCLUSIONS: These data suggest that high waitlist mortality and allocation Model for End-stage Liver Disease scores do not necessarily correlate with decreased access to transplant, whereas local transplant center density is associated with better access to waitlisting and transplant.
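The Moran's I statistic reported above measures whether similar values cluster in space (positive I) or repel (negative I). A minimal pure-Python sketch of the global statistic, using an invented four-county chain and binary adjacency weights rather than the paper's county data:

```python
# Minimal sketch of global Moran's I: I = (n / W) * sum_ij w_ij z_i z_j / sum_i z_i^2,
# where z_i are deviations from the mean and W is the total weight.
# The counties, rates, and adjacency below are hypothetical.

def morans_i(values, weights):
    """Global Moran's I for values and a symmetric spatial weight matrix."""
    n = len(values)
    mean = sum(values) / n
    z = [x - mean for x in values]
    w_total = sum(sum(row) for row in weights)
    cross = sum(weights[i][j] * z[i] * z[j]
                for i in range(n) for j in range(n))
    return (n / w_total) * cross / sum(d * d for d in z)

# Four counties in a line; neighbors share a border (weight 1):
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
rates = [1.0, 2.0, 3.0, 4.0]  # hypothetical county transplant rates
print(round(morans_i(rates, w), 3))  # positive: similar rates cluster
```

A smoothly increasing gradient like this yields a positive I, consistent with spatial clustering; spatially shuffled values would push I toward its null expectation of -1/(n-1).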


Subject(s)
End Stage Liver Disease , Liver Transplantation , End Stage Liver Disease/diagnosis , End Stage Liver Disease/surgery , Health Services Accessibility , Humans , Liver Transplantation/adverse effects , Retrospective Studies , Severity of Illness Index , United States/epidemiology , Waiting Lists
19.
Am J Surg; 224(3): 990-998, 2022 Sep.
Article in English | MEDLINE | ID: mdl-35589438

ABSTRACT

BACKGROUND: Donation after cardiac death (DCD) has been proposed as an avenue to expand the liver donor pool. METHODS: We examined factors associated with nonrecovery of DCD livers using UNOS data from 2015 to 2019. RESULTS: There were 265 non-recovered potential (NRP) DCD livers. Blood types AB (7.8% vs. 1.1%) and B (16.9% vs. 9.8%) were more frequent in the NRP versus actual donors (p < 0.001). The median driving time between donor hospital and transplant center was similar for NRP and actual donors (30.1 min vs. 30.0 min; p = 0.689), as was the percentage located within a transplant hospital (20.8% vs. 20.9%; p = 0.984). The donation service area (DSA) of a donor hospital explained 27.9% (p = 0.001) of the variability in whether a DCD liver was recovered. CONCLUSION: A number of potentially high-quality DCD donor livers go unrecovered each year, which may be partially explained by donor blood type and variation in regional and DSA-level practice patterns.
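The "27.9% of the variability explained by DSA" figure above reflects how much recovery outcomes vary between rather than within donation service areas. A simplified sketch of that idea, decomposing outcome variance into between- and within-group components (the paper likely used a hierarchical model; the DSA labels and 0/1 recovery outcomes here are invented):

```python
# Hypothetical sketch (not the paper's model): share of variance in a binary
# recovery outcome attributable to DSA membership, computed as
# 1 - (pooled within-group variance / total variance). All data are invented.

def variance_explained(groups):
    """groups: dict mapping DSA label -> list of 0/1 recovery outcomes."""
    all_obs = [y for obs in groups.values() for y in obs]
    n = len(all_obs)
    grand_mean = sum(all_obs) / n

    def var(xs, mean):
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    total = var(all_obs, grand_mean)
    within = sum(len(obs) * var(obs, sum(obs) / len(obs))
                 for obs in groups.values()) / n
    return 1 - within / total

dsas = {"DSA-A": [1, 1, 1, 0],
        "DSA-B": [0, 0, 1, 0],
        "DSA-C": [1, 0, 1, 1]}
print(round(variance_explained(dsas), 3))
```

The closer group-specific recovery rates sit to the overall rate, the nearer this share falls to zero; large between-DSA differences in practice push it upward.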


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Death , Graft Survival , Humans , Liver , Retrospective Studies , Tissue Donors , United States
20.
Clin Transplant; 36(7): e14676, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35437836

ABSTRACT

INTRODUCTION: Time-zero biopsies can detect donor-derived lesions at the time of kidney transplantation, but their utility in predicting long-term outcomes is unclear under the updated Kidney Allocation System. METHODS: We conducted a single-center retrospective cohort study of 272 consecutive post-reperfusion time-zero biopsies. We tested the hypothesis that abnormal time-zero histology is a strong indicator of donor quality that increases the precision of the kidney donor profile index (KDPI) score to predict long-term outcomes. RESULTS: We detected abnormal biopsies in 42% of the cohort; these were independently associated with a 1.2-fold increased hazard for a composite of acute rejection, allograft failure, and death after adjusting for clinical characteristics including KDPI. By Kaplan-Meier analysis, the relationship between abnormal time-zero histology and the composite endpoint was only significant in the subgroup of deceased donor kidney transplants with KDPI scores >35. Abnormal time-zero histology, particularly vascular intimal fibrosis and arteriolar hyalinosis scores, was independently associated with lower 12-month estimated GFR. CONCLUSION: Abnormal time-zero histology is relatively common and identifies a group of kidney recipients at increased risk for worse long-term outcomes. Further studies are needed to determine the optimal patient population in which to deploy time-zero biopsies as an additional surveillance tool.
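The Kaplan-Meier analysis above estimates event-free survival by multiplying, at each event time, the fraction of at-risk patients who remain event-free. A minimal pure-Python sketch of the estimator, with invented follow-up times rather than the study's data:

```python
# Minimal Kaplan-Meier sketch. At each event time t with n patients at risk,
# survival is multiplied by (n - 1) / n; censored patients only shrink the
# risk set. Follow-up times below are hypothetical.

def kaplan_meier(times, events):
    """Return (time, survival) pairs; events[i] is 1 for event, 0 for censored."""
    # Sort by time, with events ordered before censorings at tied times.
    order = sorted(range(len(times)), key=lambda i: (times[i], -events[i]))
    at_risk = len(times)
    surv = 1.0
    curve = []
    for i in order:
        if events[i]:
            surv *= (at_risk - 1) / at_risk
            curve.append((times[i], surv))
        at_risk -= 1
    return curve

# Months to composite event (1) or censoring (0) for six hypothetical recipients:
t = [3, 5, 5, 8, 12, 12]
e = [1, 1, 0, 1, 0, 0]
for time, s in kaplan_meier(t, e):
    print(f"t={time}: S(t)={s:.3f}")
```

Comparing two such curves between histology groups (with a log-rank test) is the standard way to assess the kind of subgroup difference the abstract reports.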


Subject(s)
Kidney Transplantation , Transplants , Graft Survival , Humans , Kidney/pathology , Kidney Transplantation/adverse effects , Retrospective Studies , Tissue Donors