Results 1 - 7 of 7
1.
J Clin Anesth ; 85: 111040, 2023 05.
Article in English | MEDLINE | ID: mdl-36549035

ABSTRACT

BACKGROUND: Immediate postoperative extubation (IPE) can reduce perioperative complications and length of stay (LOS); however, it is performed variably after liver transplant across institutions and has historically excluded high-risk recipients from consideration. In late 2012, we planned and implemented a structured quality improvement (QI) initiative at a single academic institution to standardize perioperative care of liver transplant recipients without exceptions. We hypothesized that such an approach would lead to a sustained increase in IPE after primary (PAC) and delayed abdominal closure (DAC). METHODS: We retrospectively studied 591 patients who underwent liver transplant from 2013 to 2018, after initiative implementation. We evaluated trends in the incidence of IPE versus delayed extubation (DE), as well as reintubation, LOS, and mortality. RESULTS: Overall, 476/591 (80.5%) recipients underwent PAC (278 IPE, 198 DE) and 115/591 (19.5%) experienced DAC (39 IPE, 76 DE). When comparing data from 2013 to data from 2018, the incidence of IPE increased from 9/67 (13.4%) to 78/90 (86.7%) after PAC and from 1/12 (8.3%) to 16/23 (69.6%) after DAC. For the same years, the incidence of IPE after PAC increased from 0/19 (0%) to 12/17 (70.6%) for recipients with MELD scores ≥30, from 1/8 (12.5%) to 4/5 (80.0%) for recipients who underwent simultaneous liver-kidney transplant, and from 0/17 (0%) to 10/13 (76.9%) for recipients who received massive transfusion (>10 units of packed red blood cells). Reintubation for respiratory considerations <48 h after IPE occurred in 3/278 (1.1%) after PAC and 1/39 (2.6%) after DAC. IPE was associated with decreased intensive care unit LOS (HR of discharge: 1.92; 95% CI: 1.58, 2.33; P < 0.001) and hospital LOS (HR of discharge: 1.45; 95% CI: 1.20, 1.76; P < 0.001) but demonstrated no association with mortality.
CONCLUSION: A structured QI initiative led to sustained high rates of IPE and reduced LOS in all liver transplant recipients, including those classified as high risk.


Subject(s)
Liver Transplantation , Humans , Liver Transplantation/adverse effects , Retrospective Studies , Airway Extubation/adverse effects , Liver , Postoperative Period , Length of Stay
2.
JAMA Surg ; 156(11): 1026-1034, 2021 11 01.
Article in English | MEDLINE | ID: mdl-34379106

ABSTRACT

Importance: Traditionally, liver transplant (LT) for alcohol-associated liver disease (ALD) requires 6 months of abstinence. Although early LT before 6 months of abstinence has been associated with decreased mortality for decompensated ALD, this practice remains controversial and concentrated at a few centers. Objective: To define patient, allograft, and relapse-free survival in early LT for ALD, and to investigate the association between these survival outcomes and early vs standard LT. Design, Setting, and Participants: This cohort study analyzed all patients with ALD who underwent their first LT at a single academic referral center between October 1, 2012, and November 13, 2020. Patients with known pretransplant hepatocellular carcinoma, hepatitis B or C, or an alternative cause of liver failure were excluded. The follow-up period was defined as the time from LT to the most recent encounter with a transplant center or death. Exposures: The exposure of interest was early LT, defined as less than 180 days of pre-LT abstinence; standard LT was defined as 180 days or more of pre-LT abstinence. Main Outcomes and Measures: The outcomes were patient, allograft, relapse-free, and hazardous relapse-free survival for patients who underwent early LT or standard LT. These groups were compared by log-rank testing of Kaplan-Meier estimates. Hazardous relapse was defined as binge, at-risk, or frequent drinking. Abstinence was reassessed at the most recent follow-up visit for all patients. Results: Of the 163 patients with ALD included in this study, 88 (54%) underwent early LT and 75 (46%) underwent standard LT. This cohort had a mean (SD) age at transplant of 52 (10) years and was predominantly male (108 patients [66%]).
Recipients of early LT vs standard LT were younger (median [interquartile range (IQR)] age, 49.7 [39.0-54.2] years vs 54.6 [48.7-60.0] years; P < .001) and had a higher median (IQR) Model for End-stage Liver Disease score at listing (35.0 [29.0-39.0] vs 20.0 [13.0-26.0]; P < .001). Recipients of early LT and standard LT had similar 1-year patient survival (94.1% [95% CI, 86.3%-97.5%] vs 95.9% [95% CI, 87.8%-98.7%]; P = .60), allograft survival (92.7% [95% CI, 84.4%-96.7%] vs 90.5% [95% CI, 81.0%-95.3%]; P = .42), relapse-free survival (80.4% [95% CI, 69.1%-88.0%] vs 83.5% [95% CI, 72.2%-90.6%]; P = .41), and hazardous relapse-free survival (85.8% [95% CI, 75.1%-92.2%] vs 89.6% [95% CI, 79.5%-94.9%]; P = .41). Conclusions and Relevance: Adherence to the 6-month rule was not associated with superior patient, allograft, or relapse-free survival among selected patients. This finding suggests that patients with ALD should not be categorically excluded from LT solely on the basis of less than 6 months of abstinence; rather, alternative selection criteria based on need and posttransplant outcomes should be identified.
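The survival comparison above rests on Kaplan-Meier estimates. As an illustrative sketch only (the study used standard statistical software, and the data below are hypothetical toy values, not study data), the product-limit estimator can be computed as:

```python
# Minimal Kaplan-Meier estimator sketch. Each observation is a (time, event)
# pair: event=1 means the endpoint occurred (e.g., death or relapse),
# event=0 means the subject was censored at that time.

def kaplan_meier(observations):
    """Return [(event_time, survival_probability)] at each event time."""
    obs = sorted(observations)          # order by follow-up time
    n_at_risk = len(obs)                # everyone starts in the risk set
    survival = 1.0
    curve = []
    i = 0
    while i < len(obs):
        t = obs[i][0]
        deaths = 0
        removed = 0
        # Group all observations tied at time t (events and censorings).
        while i < len(obs) and obs[i][0] == t:
            deaths += obs[i][1]
            removed += 1
            i += 1
        if deaths:
            # Multiply by the conditional probability of surviving past t.
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed            # censored subjects leave the risk set
    return curve

# Hypothetical toy data: times in months, 1 = event, 0 = censored.
toy = [(2, 1), (3, 0), (6, 1), (6, 1), (10, 0), (12, 0)]
print(kaplan_meier(toy))  # survival drops only at event times 2 and 6
```

Censoring reduces the risk set without reducing the survival estimate, which is why the curve steps down only at event times.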


Subject(s)
Liver Diseases, Alcoholic/mortality , Liver Diseases, Alcoholic/surgery , Liver Transplantation , Adult , Alcohol Abstinence , Cohort Studies , Disease-Free Survival , Female , Graft Survival , Humans , Male , Middle Aged , Patient Selection , Survival Rate , Time Factors , Treatment Outcome
3.
BMC Nephrol ; 21(1): 465, 2020 11 09.
Article in English | MEDLINE | ID: mdl-33167882

ABSTRACT

BACKGROUND: Live kidney donors (LKDs) account for nearly a third of kidney transplants in the United States. While donor nephrectomy poses minimal post-surgical risk, LKDs face an elevated adjusted risk of developing chronic diseases such as hypertension, diabetes, and end-stage renal disease. Routine screening presents an opportunity for the early detection and management of chronic conditions. Transplant hospital reporting requirements mandate the submission of laboratory and clinical data at 6 months, 1 year, and 2 years after kidney donation, but less than 50% of hospitals are able to comply. Strategies are needed that increase patient engagement in follow-up efforts while minimizing administrative burden. We seek to evaluate the effectiveness of small financial incentives in promoting patient compliance with LKD follow-up. METHODS/DESIGN: We are conducting a two-arm randomized controlled trial (RCT) of patients who undergo live donor nephrectomy at The Johns Hopkins Hospital Comprehensive Transplant Center (MDJH) and the University of Maryland Medical Center Transplant Center (MDUM). Eligible donors will be recruited in person at their first post-surgical clinic visit or over the phone. We will use block randomization to assign LKDs to the intervention arm ($25 gift card at each follow-up visit) or control arm (current standard of care). Follow-up compliance will be tracked over time. The primary outcome will be complete (all components addressed) and timely (60 days before or after the expected visit date) submission of LKD follow-up data at the required 6-month, 1-year, and 2-year time points. The secondary outcome will be transplant hospital-level compliance with federal reporting requirements at each visit. Rates will be compared between the two arms following the intention-to-treat principle.
DISCUSSION: Small financial incentives might increase patient compliance in the context of LKD follow-up without placing undue administrative burden on transplant providers. The findings of this RCT will inform potential center- and national-level initiatives to provide all LKDs with small financial incentives to promote engagement with post-donation monitoring efforts. TRIAL REGISTRATION: ClinicalTrials.gov number: NCT03090646. Date of registration: March 2, 2017. Sponsors: Johns Hopkins University, University of Maryland Medical Center. Funding: The Living Legacy Foundation of Maryland.
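The block randomization described in the methods can be sketched as follows. This is an illustrative sketch only: the block size (4) and arm labels ("I" for intervention, "C" for control) are assumptions, as the protocol summary does not specify them.

```python
import random

def block_randomize(n_participants, block_size=4, seed=None):
    """Assign participants to two arms using permuted blocks.

    Each block holds an equal number of intervention ("I") and control
    ("C") slots, shuffled, so the arms stay balanced throughout
    enrollment. Block size 4 is an assumed example value.
    """
    rng = random.Random(seed)  # seeded for reproducible allocation lists
    assignments = []
    while len(assignments) < n_participants:
        block = ["I"] * (block_size // 2) + ["C"] * (block_size // 2)
        rng.shuffle(block)
        assignments.extend(block)
    return assignments[:n_participants]

# Example: allocation sequence for the first 10 enrolled donors.
print(block_randomize(10, seed=42))
```

After every completed block, the two arms are exactly balanced, which is the point of blocking over simple coin-flip randomization in a small trial.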


Subject(s)
Aftercare , Kidney Transplantation , Living Donors , Motivation , Patient Compliance , Adult , Aftercare/economics , Baltimore , Follow-Up Studies , Humans , Postoperative Complications/diagnosis , Standard of Care
4.
Clin Transplant ; 34(9): e13905, 2020 09.
Article in English | MEDLINE | ID: mdl-32399996

ABSTRACT

Simple (Bosniak I) renal cysts are considered acceptable in living kidney donor selection in terms of cancer risk. However, they tend to increase in number and size over time and might compromise renal function in donors. To clarify their implications for long-term renal function, we characterized the prevalence of renal cysts in 454 individuals who donated at our center from 2000 to 2007. We estimated the association between the presence of cysts in the kidney remaining after nephrectomy (ie, retained cysts) and postdonation eGFR trajectory using mixed-effects linear regression. Donors with retained cysts (N = 86) were older (P < .001) and had slightly lower predonation eGFR (median 94 vs 98 mL/min/1.73 m2, P < .01) than those without cysts. Over a median 7.8 years, donors with retained cysts had lower baseline eGFR (-5.6 mL/min/1.73 m2, 95% CI: -8.7 to -2.3; P < .01) but similar yearly change in eGFR (0.02 mL/min/1.73 m2 per year, 95% CI: -0.4 to 0.4; P = .2) compared to those without retained cysts. Adjusting for predonation characteristics, there was no difference in baseline eGFR (P = .6) or yearly change in eGFR (P > .9). There continued to be no evidence of an association when we considered retained cyst(s) ≥10 mm or multiple retained cysts (all P > .05). These findings reaffirm current practices of accepting candidates with simple renal cysts for donor nephrectomy.


Subject(s)
Cysts , Kidney Diseases, Cystic , Kidney Failure, Chronic , Kidney Transplantation , Cysts/etiology , Glomerular Filtration Rate , Humans , Kidney , Kidney Diseases, Cystic/surgery , Kidney Failure, Chronic/surgery , Living Donors , Nephrectomy , Retrospective Studies
5.
Am J Transplant ; 19(1): 269-276, 2019 01.
Article in English | MEDLINE | ID: mdl-30253051

ABSTRACT

A recent study reported that kidney transplant recipients of offspring living donors had higher graft loss and mortality. This seemed counterintuitive, given the excellent HLA matching and younger age of offspring donors; we were concerned about residual confounding and other study design issues. We used Scientific Registry of Transplant Recipients data from 2001-2016 to evaluate death-censored graft failure (DCGF) and mortality for recipients of offspring versus nonoffspring living donor kidneys, using Cox regression models with interaction terms. Recipients of offspring kidneys had lower DCGF than recipients of nonoffspring kidneys (15-year cumulative incidence 21.2% vs 26.1%, P < .001). This association remained after adjustment for recipient and transplant factors (adjusted hazard ratio [aHR] 0.77, 95% CI: 0.73-0.82, P < .001) and was attenuated among African American donors (aHR 0.85, 95% CI: 0.77-0.95; interaction P = .01) and female recipients (aHR 0.84, 95% CI: 0.77-0.91, P < .001). Although offspring kidney recipients had higher mortality (15-year mortality 56.4% vs 37.2%, P < .001), this difference largely disappeared with adjustment for recipient age alone (aHR 1.06, 95% CI: 1.02-1.10, P = .002) and was nonsignificant after further adjustment for other recipient characteristics (aHR 0.97, 95% CI: 0.93-1.01, P = .1). Kidneys from offspring donors were associated with lower graft failure and comparable mortality. An otherwise eligible donor should not be dismissed because they are the offspring of the recipient, and we encourage continued individualized counseling for potential donors.


Subject(s)
Kidney Failure, Chronic/mortality , Kidney Failure, Chronic/surgery , Kidney Transplantation/methods , Kidney/surgery , Living Donors , Transplant Recipients , Adult , Black or African American , Aged , Female , Graft Rejection/etiology , Graft Survival , HLA Antigens , Humans , Incidence , Kidney Failure, Chronic/ethnology , Male , Middle Aged , Postoperative Complications , Proportional Hazards Models , Registries , Treatment Outcome , United States
6.
Am J Transplant ; 18(11): 2804-2810, 2018 11.
Article in English | MEDLINE | ID: mdl-30086198

ABSTRACT

Development of end-stage renal disease (ESRD) in living kidney donors is associated with increased graft loss in the recipients of their kidneys. Our goal was to investigate whether this relationship was reflected at an earlier stage postdonation, possibly early enough for recipient risk prediction based on donor response to nephrectomy. Using national registry data, we studied 29,464 recipients and their donors from 2008-2016 to determine the association between donor 6-month postnephrectomy estimated GFR (eGFR) and recipient death-censored graft failure (DCGF). We explored donor BMI as an effect modifier, given the association between obesity and hyperfiltration. On average, risk of DCGF increased with each 10 mL/min decrement in postdonation eGFR (adjusted hazard ratio [aHR] 1.06, 95% confidence interval [CI] 1.02-1.10, P = .007). The association was attenuated with higher donor BMI (interaction P = .049): recipients from donors with BMI = 20 (aHR 1.12, 95% CI 1.04-1.19, P = .002) and BMI = 25 (aHR 1.07, 95% CI 1.03-1.12, P = .001) had a higher risk of DCGF with each 10 mL/min decrement in postdonation eGFR, whereas recipients from donors with BMI = 30 and BMI = 35 did not have a higher risk. The relationship between postdonation eGFR, donor BMI, and recipient graft loss can inform counseling and management of living donor kidney transplant recipients.


Subject(s)
Glomerular Filtration Rate , Graft Rejection/etiology , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Living Donors/supply & distribution , Nephrectomy/adverse effects , Tissue and Organ Harvesting/adverse effects , Adult , Female , Follow-Up Studies , Graft Survival , Humans , Kidney Function Tests , Male , Middle Aged , Postoperative Complications , Prognosis , Registries , Risk Factors , Transplant Recipients/statistics & numerical data
7.
J Am Coll Surg ; 202(5): 762-72, 2006 May.
Article in English | MEDLINE | ID: mdl-16648016

ABSTRACT

BACKGROUND: Twenty-nine of 1,284 battle-injured soldiers arriving at Walter Reed Army Medical Center from Operations Enduring Freedom and Iraqi Freedom had abdominal wounds requiring delayed definitive closure with Gore-Tex (WL Gore & Assoc) mesh. METHODS: Serial abdominal closure (SAC) leading to early definitive abdominal closure (EDAC) was achieved using Gore-Tex mesh. Inpatient records of Operations Enduring Freedom and Iraqi Freedom soldiers with open or reopened abdomens were reviewed from March 2003 to August 2005. RESULTS: Twenty-nine soldiers, average age 27 years (range 20 to 42 years), injured by secondary blast effects (n = 19), penetrating injury (n = 8), motor vehicle crash (n = 1), and crushing injury (n = 1), were included in the study. Patients arrived at Walter Reed Army Medical Center 8 days (range 3 to 56 days) after injury, with Gore-Tex mesh placed 6 days (range 0 to 26 days) from arrival and 14 days (range 4 to 79 days) from injury. SAC was achieved with towel clamp tightening or excision of midline mesh and drawing fascia closer to the midline for an average of 46 days (range 15 to 160 days) before EDAC. One patient was still undergoing SAC and another was transferred to another facility. EDAC was achieved in 24 of the remaining 27 patients (89%). Four patients required early removal of the Gore-Tex mesh, resulting in three patients with planned ventral hernia. One patient underwent EDAC with primary closure and fascial release. EDAC was completed with polypropylene mesh in 17 patients, and 6 patients had the original Gore-Tex mesh in place. Patients were discharged from the hospital an average of 18 days after closure (range 1 to 89 days), with total hospital days of 62 (range 17 to 197 days). Average followup from placement of Gore-Tex mesh is 264 days (range 31 to 855 days). CONCLUSIONS: SAC with Gore-Tex mesh led to EDAC in 89% of patients and proved to be a safe and effective alternative to planned ventral hernia. SAC allowed protection of abdominal contents, effective fluid management, reclamation of abdominal domain, and early rehabilitation, with minimal complications and only one hernia recurrence.


Subject(s)
Abdominal Injuries/surgery , Military Personnel , Surgical Mesh , Adult , Afghanistan , Female , Humans , Iraq , Male , Retrospective Studies , Suture Techniques , Time Factors , Treatment Outcome