1.
Urology ; 2024 Jun 26.
Article in English | MEDLINE | ID: mdl-38942394

ABSTRACT

OBJECTIVE: To determine whether early versus delayed autotransplantation is associated with adverse outcomes in patients undergoing renal autotransplantation. METHODS: Patients who underwent renal autotransplantation from June 2012 to September 2022 were divided into 2 groups based on timing of autotransplant in relation to initial intervention or diagnosis (early cohort: ≤1 year; delayed cohort: >1 year). Primary outcomes were perioperative complications, aborted surgery, renal function (glomerular filtration rate [GFR]), and postoperative complications at most recent follow-up. RESULTS: Autotransplantation patients (N = 72) were predominantly female (68%) and White (54%), with a median age of 49 years. Ninety percent of patients had undergone previous interventions, including stenting (40%) and nephrostomy tubes (49%), primarily for obstruction (64%). Early versus delayed cohorts had median preoperative disease durations of 143 (IQR 83-222) versus 673 days (IQR 529-1703, P <.001), with similar median follow-up times (879 vs 818 days, P = .8). Groups were similar in demographics and comorbidities. There were no significant differences in rates of aborted surgery (15% vs 4.2%, P = .3), perioperative complications (15% vs 17%, P > .9), long-term complications (49% vs 48%, P > .9), or changes in GFR (median change +3 vs +4, P = .7). Outcomes were comparable across preoperative disease durations ranging from 6 to 24 months. These findings were confirmed following adjustment for sex, body mass index, American Society of Anesthesiologists classification, race, preoperative creatinine levels, laterality, gastroesophageal reflux disease, diabetes, hypertension, nephrolithiasis, hyperlipidemia, history of colon surgery, urologic surgery, abdominal surgery, and prior interventions in separate logistic models. CONCLUSION: Disease duration before autotransplantation does not influence outcomes, offering reassurance for clinical decision-making in complex cases.

2.
Urol Case Rep ; 54: 102717, 2024 May.
Article in English | MEDLINE | ID: mdl-38617183

ABSTRACT

Nutcracker Syndrome (NCS) is characterized by entrapment of the left renal vein, leading to hematuria, flank pain, and proteinuria. We evaluated the efficacy of renal autotransplantation as a curative treatment for NCS through a review and case report. Fifty-five patients from 18 studies were analyzed, with a combined 91% rate of symptom resolution or improvement post-autotransplantation. In our case report, a 25-year-old man with severe NCS underwent laparoscopic nephrectomy and autotransplantation, resulting in symptom resolution at 3.1 years of follow-up. Further research should confirm these findings and refine patient selection criteria and surgical techniques.

3.
Urol Case Rep ; 54: 102715, 2024 May.
Article in English | MEDLINE | ID: mdl-38550655

ABSTRACT

Ureteral avulsion can be secondary to blunt or penetrating trauma, or can emerge as a surgical complication. Although the popularization of minimally invasive interventions has significantly decreased ureteral injuries, reported injury rates range from 0% to 28% and vary from minor mucosal injury to perforation and, most catastrophically, avulsion. We present a case of complete ureteral avulsion that was not initially appreciated after ureteroscopy for stone extraction. Once recognized, the injury was managed successfully with a subsequent laparoscopic nephrectomy and renal autotransplantation, preserving renal function.

4.
Transplantation ; 108(2): 483-490, 2024 Feb 01.
Article in English | MEDLINE | ID: mdl-38259180

ABSTRACT

BACKGROUND: Improper opioid prescription after surgery is a well-documented iatrogenic contributor to the current opioid epidemic in North America. In fact, opioids are known to be overprescribed to liver transplant patients, and liver transplant patients with high doses or prolonged postsurgical opioid use have higher risks of graft failure and death. METHODS: This is a retrospective cohort study of 552 opioid-naive patients undergoing liver transplant at an academic center between 2012 and 2019. The primary outcome was the discrepancy between the prescribed discharge opioid daily dose and each patient's own inpatient opioid consumption in the 24 h before discharge. Variables were analyzed with Wilcoxon and chi-square tests and logistic regression. RESULTS: Opioids were overprescribed in 65.9% of patients, and 54.3% of patients who required no opioids the day before discharge were discharged with opioid prescriptions. In contrast, opioids were underprescribed in 13.4% of patients, among whom 27.0% consumed inpatient opioids but received no discharge opioid prescription. The median prescribed opioid daily dose was 333.3% and 56.3% of the median inpatient opioid daily dose in opioid overprescribed and underprescribed patients, respectively. Importantly, opioid underprescribed patients had higher rates of opioid refill 1 to 30 and 31 to 90 d after discharge, and the rate of opioid underprescription more than doubled from 2016 to 2019. CONCLUSIONS: Opioids are both over- and underprescribed to liver transplant patients, and opioid underprescribed patients had higher rates of opioid refill. Therefore, we propose basing discharge opioid prescriptions on liver transplant patients' inpatient opioid consumption to provide patient-centered opioid prescribing.
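The dose-comparison logic behind the primary outcome above can be sketched in a few lines. This is an illustrative reconstruction, not the study's analysis code: the function name, the use of morphine milligram equivalents (MME), and the strict-inequality cutoffs are all assumptions.

```python
# Hypothetical sketch: classify a discharge opioid prescription as over- or
# underprescribed relative to the patient's own inpatient consumption in the
# 24 h before discharge (both expressed as a daily dose, e.g., MME/day).

def classify_prescription(inpatient_daily_mme: float,
                          prescribed_daily_mme: float) -> str:
    """Compare the prescribed discharge daily dose against the patient's
    inpatient daily dose; equality counts as a matched prescription."""
    if prescribed_daily_mme > inpatient_daily_mme:
        return "overprescribed"
    if prescribed_daily_mme < inpatient_daily_mme:
        return "underprescribed"
    return "matched"

# A patient who used no opioids the day before discharge but received a
# discharge prescription is, by this definition, overprescribed.
print(classify_prescription(0.0, 30.0))   # overprescribed
print(classify_prescription(40.0, 22.5))  # underprescribed
```

Under this definition the study's 333.3% and 56.3% figures correspond to the ratio of prescribed to inpatient daily dose within each group.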


Subject(s)
Liver Transplantation , Transplants , Humans , Liver Transplantation/adverse effects , Analgesics, Opioid/adverse effects , Retrospective Studies , Prescriptions
5.
Front Immunol ; 14: 1246867, 2023.
Article in English | MEDLINE | ID: mdl-37731493

ABSTRACT

Introduction: Donation after circulatory death (DCD) liver transplantation (LT) accounts for well under 1% of all LTs with a Model for End-Stage Liver Disease (MELD) score ≥35 in the United States. We hypothesized that DCD-LT yields acceptable ischemia-reperfusion injury and reasonable outcomes for recipients with MELD ≥35. Methods: We analyzed recipients with lab-MELD ≥35 at transplant within the UCSF (n=41) and UNOS (n=375) cohorts using multivariate Cox regression and propensity score matching. Results: In the UCSF cohort, five-year patient survival was 85% for DCD-LTs and 86% for matched donation-after-brain-death (DBD) LTs (p=0.843). Multivariate analyses showed that younger donor/recipient age and more recent transplants (2011-2021 versus 1999-2010) were associated with better survival. DCD vs. DBD graft use did not significantly impact survival (HR 1.2, 95% CI 0.6-2.7). The transaminase peak was approximately doubled, suggesting an increased ischemia-reperfusion hit. DCD-LTs had a median post-LT length of stay of 11 days, and 34% (14/41) were on dialysis at discharge, versus 12 days and 22% (9/41) for DBD-LTs. 27% (11/41) of DCD-LTs versus 12% (5/41) of DBD-LTs developed a biliary complication (p=0.095). UNOS cohort analysis confirmed the patient survival predictors, but DCD graft use emerged as a risk factor (HR 1.5, 95% CI 1.3-1.9), with five-year patient survival of 65% versus 75% for DBD-LTs (p=0.016). This difference became non-significant in a sub-analysis focusing on MELD 35-36 recipients. Analysis of MELD ≥35 DCD recipients showed that donor age <30 years independently reduced the risk of graft loss by 30% (HR 0.7, 95% CI 0.5-0.9, p=0.019). Retransplant status was associated with a doubled risk of adverse events (HR 2.1, 95% CI 1.4-3.3, p=0.001). Rejection rates at 1 year were similar between DCD- and DBD-LTs (9.3% (35/375) versus 8.7% (1,541/17,677), respectively).
Discussion: In highly selected recipient/donor pairs, DCD transplantation is feasible and can achieve survival comparable to DBD transplantation. Biliary complications occurred at the expected rates. In the absence of selection, DCD-LT outcomes remain worse than those of DBD-LTs.


Subject(s)
Body Fluids , End Stage Liver Disease , Liver Transplantation , Humans , Liver Transplantation/adverse effects , End Stage Liver Disease/surgery , Severity of Illness Index , Tissue Donors
6.
Transpl Int ; 35: 10855, 2022.
Article in English | MEDLINE | ID: mdl-36568142

ABSTRACT

Donation-after-circulatory-death (DCD), donation-after-brain-death (DBD), and living-donation (LD) are the three possible options for liver transplantation (LT), each with unique benefits and complication rates. We aimed to compare DCD-, DBD-, and LD-LT-specific graft survival and biliary complications (BC). We collected data on 138 DCD-, 3,027 DBD-, and 318 LD-LT adult recipients from a single center and analyzed patient/graft survival. BC (leaks and anastomotic/non-anastomotic strictures [AS/NAS]) were analyzed in a subset of 414 patients. One-/five-year graft survival was 88.6%/70.0% for DCD-LT, 92.6%/79.9% for DBD-LT, and 91.7%/82.9% for LD-LT. DCD-LTs had a 1.7-/1.3-fold adjusted risk of losing their graft compared to DBD-LT and LD-LT, respectively (p < 0.010/0.403). Bile leaks were present in 10.1% (DCD-LTs), 7.2% (DBD-LTs), and 36.2% (LD-LTs) (ORs, DBD/LD vs. DCD: 0.7/4.2, p = 0.402/<0.001). AS developed in 28.3% of DCD-LTs, 18.1% of DBD-LTs, and 43.5% of LD-LTs (ORs, DBD/LD vs. DCD: 0.5/1.8, p = 0.018/0.006). NAS was present in 15.2% of DCD-LTs, 1.4% of DBD-LTs, and 4.3% of LD-LTs (ORs, DBD/LD vs. DCD: 0.1/0.3, p = 0.001/0.005). LTs without BC had better graft survival than any group with BC. DCD-LT and LD-LT had excellent graft survival despite significantly higher BC rates compared to DBD-LT. DCD-LT represents a valid alternative whose importance should increase further with machine perfusion systems.


Subject(s)
Liver Transplantation , Tissue and Organ Procurement , Adult , Humans , Liver Transplantation/adverse effects , Cohort Studies , Brain Death , Living Donors , Retrospective Studies , Graft Survival , Tissue Donors , Death
7.
Kidney360 ; 3(6): 1080-1088, 2022 06 30.
Article in English | MEDLINE | ID: mdl-35845334

ABSTRACT

Background: The optimal timing of dialysis access placement in individuals with stage 5 CKD is challenging to estimate. Preemptive living donor kidney transplant (LDKT) is the gold-standard treatment for ESKD because of superior graft survival and lower mortality, but dialysis initiation is often required. Among LDKT recipients, we sought to determine which clinical characteristics were associated with preemptive transplant. Among non-preemptive LDKT recipients, we sought to determine what dialysis access was used and the duration of its use before receipt of a living donor transplant. Methods: We retrospectively extracted data on 569 LDKT recipients, >18 years old, who were transplanted between January 2014 and July 2019 at UCSF, including dialysis access type (arteriovenous fistula [AVF], arteriovenous graft [AVG], peritoneal dialysis [PD] catheter, and venous catheter), duration of dialysis, and clinical characteristics. Results: Preemptive LDKT recipients constituted 30% of our cohort and were older, more likely to be White, more likely to have ESKD from polycystic kidney disease, and less likely to have ESKD from type 2 diabetes. Of the non-preemptive patients, 26% used an AVF, 0.5% used an AVG, 32% used a peritoneal catheter, 11% used a venous catheter, and 31% used more than one access type. Median (IQR) time on dialysis was 1.86 (0.85-3.32) years for AVF/AVG use; 1.12 (0.55-1.92) years for PD catheters; 0.66 (0.23-1.69) years for venous catheters; and 2.15 (1.37-3.72) years for multimodal access. Conclusions: We characterized the dialysis access landscape in LDKT recipients. Venous catheters and PD were the most popular modalities in the first quartile of dialysis duration, and patients using these modalities had shorter times on dialysis compared with those with an AVF. Venous catheters or PD can be considered viable bridge therapies in patients with living donor availability, given their shorter waitlist times. Earlier referral of patients with living donor prospects might further minimize dialysis need.


Subject(s)
Diabetes Mellitus, Type 2 , Renal Dialysis , Adolescent , Catheters, Indwelling , Humans , Kidney , Living Donors , Retrospective Studies
8.
Urology ; 166: 277-282, 2022 08.
Article in English | MEDLINE | ID: mdl-35550384

ABSTRACT

OBJECTIVE: To raise awareness that patients with proximal ureteral stricture who elect for nephrectomy can consider donating the kidney. We present a series of patients undergoing therapeutic living donor nephrectomy (TLDN), a scenario in which a patient undergoing nephrectomy for an underlying medical problem donates the kidney to a person with end-stage renal disease. This practice is underutilized, and only a single TLDN with proximal ureteral stricture has been previously described. We aim to help define the indications, risks, and benefits for patients. METHODS: This is a retrospective case series of seven therapeutic donors with proximal ureteral pathology and stone disease. Patient characteristics, donor work-up, operative details, and donor and recipient outcomes were collected. RESULTS: All seven donors had proximal ureteral pathology, and six of the seven had nephrolithiasis or ureterolithiasis. After electing for nephrectomy, the mean time to TLDN was 57.9 days. No recipients experienced delayed graft function. Mean follow-up was 40.1 months (range 8-131), and the mean creatinine at most recent follow-up was 1.08 mg/dL. Graft and recipient survival were 100%. No recipients developed recurrence of ureteral stricture or stones. CONCLUSION: This is the first series demonstrating that patients with proximal ureteral stricture, even with concomitant stone disease, may donate kidneys for transplantation. Recipient outcomes suggest this practice is safe, and appropriately selected patients who have already elected for nephrectomy should receive counseling about this opportunity. Importantly, patients who donate a kidney receive waiting-list priority if they ever need a kidney transplant in the future.


Subject(s)
Laparoscopy , Ureteral Obstruction , Constriction, Pathologic/surgery , Humans , Kidney , Living Donors , Nephrectomy/adverse effects , Retrospective Studies , Ureteral Obstruction/surgery
9.
Am J Transplant ; 22(1): 266-273, 2022 01.
Article in English | MEDLINE | ID: mdl-34467618

ABSTRACT

Increasing numbers of compatible pairs are choosing to enter paired exchange programs, but the motivations, outcomes, and system-level effects of participation are not well described. Using a linkage of the Scientific Registry of Transplant Recipients and the National Kidney Registry, we compared outcomes of traditional (originally incompatible) recipients to originally compatible recipients using the Kaplan-Meier method. We identified 154 compatible pairs. Most pairs sought to improve HLA matching. Compared to the original donor, actual donors were younger (39 vs. 50 years, p < .001), less often female (52% vs. 68%, p < .01), had a higher BMI (27 vs. 25 kg/m², p = .03), were less frequently blood type O (36% vs. 80%, p < .001), and had a higher eGFR (99 vs. 94 ml/min/1.73 m², p = .02) with a better LKDPI (median 7 vs. 22, p < .001). We observed no differences in graft failure or mortality. Compatible pairs made 280 additional transplants possible, many in highly sensitized recipients with long wait times. Compatible pair recipients derived several benefits from paired exchange, including better donor quality. Living donor pairs should receive counseling regarding all options available, including kidney paired donation. As more compatible pairs choose to enter exchange programs, consideration should be given to optimizing compatible pair and hard-to-transplant recipient outcomes.
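Several abstracts in this list, including the one above, compare groups with the Kaplan-Meier method. A minimal standard-library sketch of that estimator follows; the toy times and event indicators are invented, and a real registry analysis would use a survival package (e.g., lifelines or R's survival) rather than this hand-rolled version.

```python
# Illustrative Kaplan-Meier product-limit estimator (toy data, stdlib only).

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.
    events[i] = 1 means the outcome (e.g., graft failure) occurred at
    times[i]; 0 means the observation was censored at times[i]."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        # Pool all observations sharing this time point.
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 1:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:  # survival drops only at event times
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= deaths + censored
    return curve

# Five toy observations: events at t=2, 3, 5; censoring at t=3, 8.
# Survival steps down to roughly 0.8, 0.6, 0.3 (up to float rounding).
print(kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 1, 0]))
```

Group curves (e.g., traditional vs. compatible-pair recipients) would be computed separately and compared with a log-rank test.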


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Donor Selection , Female , Humans , Living Donors , Motivation , Transplant Recipients
10.
Am J Transplant ; 21(9): 3014-3020, 2021 09.
Article in English | MEDLINE | ID: mdl-33421310

ABSTRACT

Kidney transplantation reduces mortality in patients with end stage renal disease (ESRD). Decisions about performing kidney transplantation in the setting of a prior cancer are challenging, as cancer recurrence in the setting of immunosuppression can result in poor outcomes. For cancer of the breast, rapid advances in molecular characterization have allowed improved prognostication, which is not reflected in current guidelines. We developed a 19-question survey to determine transplant surgeons' knowledge, practice, and attitudes regarding guidelines for kidney transplantation in women with breast cancer. Of the 129 respondents from 32 states and 14 countries, 74.8% felt that current guidelines are inadequate. Surgeons outside the United States (US) were more likely to consider transplantation in a breast cancer patient without a waiting period (p = .017). Within the US, 29.2% of surgeons in the Western region would consider transplantation without a waiting period, versus 3.6% of surgeons in the East (p = .004). Encouragingly, 90.4% of providers surveyed would consider eliminating wait-times for women with a low risk of cancer recurrence based on the accurate prediction of molecular assays. These findings support the need for new guidelines incorporating individualized recurrence risk to improve care of ESRD patients with breast cancer.


Subject(s)
Breast Neoplasms , Cancer Survivors , Kidney Failure, Chronic , Kidney Transplantation , Breast Neoplasms/surgery , Female , Humans , Kidney Failure, Chronic/surgery , Neoplasm Recurrence, Local , Surveys and Questionnaires , United States
11.
Am J Surg ; 222(1): 234-240, 2021 07.
Article in English | MEDLINE | ID: mdl-33384155

ABSTRACT

BACKGROUND: Opioids are generally discouraged and used sparingly in liver transplant (LT) candidates prior to LT. This study examined the relationship between opioid use at the time of LT and graft and patient survival following transplantation. METHODS: A retrospective single-center cohort study of LT recipients from June 2012 to December 2019 was performed. Primary outcomes were graft and patient survival, analyzed with the Kaplan-Meier method and Cox proportional hazards models; the primary predictor was an active opioid prescription at LT. RESULTS: 751 LT recipients were included; 16% had an opioid prescription at LT. Post-transplant death was significantly greater in opioid users (p < 0.001). In a multivariable Cox model examining predictors of death, opioid use remained associated with a significant increase in the risk of death (HR 2.4, CI 1.5-4.0, p < 0.001) even after controlling for other factors. CONCLUSION: Opioid use at LT is associated with a markedly increased risk of death following transplant.


Subject(s)
Analgesics, Opioid/therapeutic use , End Stage Liver Disease/surgery , Graft Rejection/epidemiology , Liver Transplantation/adverse effects , Pain/drug therapy , Aged , Drug Prescriptions/statistics & numerical data , End Stage Liver Disease/complications , End Stage Liver Disease/diagnosis , End Stage Liver Disease/mortality , Female , Graft Rejection/etiology , Graft Survival , Humans , Kaplan-Meier Estimate , Liver Transplantation/standards , Male , Middle Aged , Pain/diagnosis , Pain/epidemiology , Pain/etiology , Retrospective Studies , Risk Assessment/statistics & numerical data , Risk Factors , Severity of Illness Index , Transplant Recipients/statistics & numerical data , United States/epidemiology
12.
Ann Surg ; 273(4): 648-655, 2021 04 01.
Article in English | MEDLINE | ID: mdl-33443907

ABSTRACT

OBJECTIVE: The aim of this study was to evaluate which mesh type yields lower recurrence and complication rates after ventral hernia repair. SUMMARY BACKGROUND DATA: More than 400,000 ventral hernia repairs are performed annually in the United States. Although the most effective method for repairing ventral hernias involves using mesh, whether to use biologic mesh versus synthetic mesh is controversial. METHODS: Single-blind, randomized, controlled, pragmatic clinical trial conducted from March 2014 through October 2018; 165 patients enrolled with an average follow-up of 26 months. Patients were randomized 1:1 to have their ventral hernias repaired using either a biologic (porcine) or synthetic (polypropylene) mesh. The primary study outcome measure was hernia recurrence at 2 years. RESULTS: A total of 165 patients (68 men), mean age 55 years, were included in the study with a mean follow-up of 26 months. An intention-to-treat analysis noted that hernias recurred in 25 patients (39.7%) assigned to biologic mesh and in 14 patients (21.9%) assigned to synthetic mesh (P = 0.035) at 2 years. Subgroup analysis identified an increased rate of hernia recurrence in the biologic versus the synthetic mesh group under contaminated wound conditions (50.0% vs 5.9%; P for interaction = 0.041). Postoperative complication rates were similar for the 2 mesh types. CONCLUSIONS: The risk of hernia recurrence was significantly higher for patients undergoing ventral hernia repair with biologic mesh compared to synthetic mesh, with similar rates of postoperative complications. These data indicate that the use of synthetic mesh over biologic mesh to repair ventral hernias is effective and can be endorsed, including under contaminated wound conditions. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT02041494.


Subject(s)
Hernia, Ventral/surgery , Herniorrhaphy/methods , Postoperative Complications/prevention & control , Secondary Prevention/methods , Surgical Mesh , Female , Follow-Up Studies , Humans , Male , Middle Aged , Prosthesis Design , Recurrence , Retrospective Studies , Single-Blind Method , Treatment Outcome
13.
Transplantation ; 105(6): 1297-1302, 2021 06 01.
Article in English | MEDLINE | ID: mdl-33347261

ABSTRACT

BACKGROUND: The use of living donor liver transplantation (LDLT) for primary liver transplantation (LT) may quell concerns about allocating deceased donor organs if the need for retransplantation (re-LT) arises, because the primary LT did not draw from the limited organ pool. However, outcomes of re-LT after LDLT are poorly studied. The purpose of this study was to analyze the Adult to Adult Living Donor Liver Transplantation Study (A2ALL) data to report outcomes of re-LT after LDLT, with a focus on long-term survival after re-LT. METHODS: A retrospective review of A2ALL data collected between 1998 and 2014 was performed. Patients were excluded if they received a deceased donor LT. Demographic data, postoperative outcomes and complications, graft and patient survival, and predictors of re-LT and patient survival were assessed. RESULTS: Of the 1065 patients who underwent LDLT during the study time period, 110 recipients (10.3%) required re-LT. In multivariable analyses, hepatitis C virus, longer length of stay at LDLT, hepatic artery thrombosis, biliary stricture, infection, and disease recurrence were associated with an increased risk of re-LT. Patient survival among re-LT patients was significantly inferior to that of patients who underwent primary transplant only at 1 (86% versus 92%), 5 (64% versus 82%), and 10 years (44% versus 68%). CONCLUSIONS: Approximately 10% of A2ALL patients who underwent primary LDLT required re-LT. Compared with patients who underwent primary LT, survival among re-LT recipients was worse at 1, 5, and 10 years after LT, and re-LT was associated with a significantly increased risk of death in multivariable modeling (hazard ratio, 2.29; P < 0.001).


Subject(s)
Liver Transplantation , Living Donors , Reoperation , Adult , Age Factors , Female , Humans , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Male , Middle Aged , North America , Reoperation/adverse effects , Reoperation/mortality , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
14.
Front Surg ; 8: 808733, 2021.
Article in English | MEDLINE | ID: mdl-35071316

ABSTRACT

Background: Scoring systems have been proposed to select donation after circulatory death (DCD) donors and recipients for liver transplantation (LT). We hypothesized that complex scoring systems derived in large datasets might not predict outcomes locally. Methods: Based on 1-year DCD-LT graft survival predictors in multivariate logistic regression models, we designed, validated, and compared a simple index using the University of California, San Francisco (UCSF) cohort (n = 136) and a universal-comprehensive (UC)-DCD score using the United Network for Organ Sharing (UNOS) cohort (n = 5,792) against previously published DCD scoring systems. Results: The total warm ischemia time (WIT) index included donor WIT (dWIT) and hepatectomy time (dHep). The UC-DCD score included dWIT, dHep, recipient on mechanical ventilation, transjugular intrahepatic portosystemic shunt, cause of liver disease, model for end-stage liver disease, body mass index, donor/recipient age, and cold ischemia time. In the UNOS cohort, the UC-DCD score outperformed all previously published scores in predicting DCD-LT graft survival (AUC: 0.635 vs. ≤0.562). In the UCSF cohort, the total WIT index successfully stratified survival and biliary complications, whereas other scores did not. Conclusion: DCD risk scores generated in large cohorts provide general guidance for safe recipient/donor selection, but they must be tailored based on non- or partially modifiable local circumstances to expand DCD utilization.
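The AUC figures above measure how well each risk score ranks graft-loss cases above graft-survival cases. A minimal sketch of that rank-based (Mann-Whitney) AUC computation follows; the scores and labels are invented toy data, not values from either cohort.

```python
# Toy stdlib sketch of ROC AUC: the probability that a randomly chosen
# positive case (graft loss, label 1) receives a higher risk score than a
# randomly chosen negative case (label 0), with ties counting 0.5.

def roc_auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(roc_auc([0.9, 0.8, 0.4, 0.3], [1, 0, 1, 0]))  # 0.75
```

An AUC of 0.635 versus ≤0.562, as reported for the UC-DCD score, therefore means the score orders outcome pairs correctly somewhat more often than the previously published scores, though all remain far from perfect discrimination.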

15.
J Natl Compr Canc Netw ; 18(11): 1446-1452, 2020 11.
Article in English | MEDLINE | ID: mdl-33152701

ABSTRACT

Organ donors are systematically screened for infection, whereas screening for malignancy is less rigorous. The true incidence of donor-transmitted malignancies is unknown due to a lack of universal tumor testing in the posttransplant setting. Donor-transmitted malignancy may occur even when not suspected based on donor or recipient factors, including age and time to cancer diagnosis. We describe the detection of a gastrointestinal adenocarcinoma transmitted from a young donor to 4 transplant recipients. Multidimensional histopathologic and genomic profiling showed a CDH1 mutation and MET amplification, consistent with gastric origin. At the time of writing, one patient in this series remains alive and without evidence of cancer after prompt organ explant after cancer was reported in other recipients. Because identification of a donor-derived malignancy changes management, our recommendation is to routinely perform short tandem repeat testing (or a comparable assay) immediately upon diagnosis of cancer in any organ transplant recipient. Routine testing for a donor-origin cancer and centralized reporting of outcomes are necessary to establish a robust evidence base for the future development of clinical practice guidelines.


Subject(s)
Neoplasms , Organ Transplantation , Transplant Recipients , Humans , Incidence , Neoplasms/diagnosis , Neoplasms/genetics , Organ Transplantation/adverse effects , Tissue Donors
16.
Transplant Direct ; 6(10): e610, 2020 Oct.
Article in English | MEDLINE | ID: mdl-33062843

ABSTRACT

BACKGROUND: Sarcopenia has been identified as a predictive variable for surgical outcomes. We hypothesized that sarcopenia could be a key measure to identify frail patients and potentially predict poorer outcomes among recipients of simultaneous pancreas and kidney (SPK) transplants. METHODS: We estimated sarcopenia by measuring psoas muscle mass index (PMI). PMI was assessed on perioperative computed tomography (CT) scans of SPK recipients. RESULTS: Of the 141 patients identified between 2010 and 2018, 107 had a CT scan available and were included in the study. The median follow-up was 4 years (range, 0.5-9.1 y). Twenty-three patients had a low PMI, and 84 patients had a normal PMI. Patient characteristics were similar between the 2 groups except for body mass index, which was significantly lower in the low PMI group (P < 0.001). Patient and kidney graft survival were not statistically different between groups (P = 0.851 and P = 0.357, respectively). A multivariate Cox regression analysis showed that patients with a low PMI were approximately 5 times more likely to lose their pancreas allograft (hazard ratio, 5.4; 95% confidence interval, 1.4-20.8; P = 0.015). Three out of 6 patients lost their pancreas graft due to rejection in the low PMI group, compared with 1 out of 9 patients in the normal PMI group. Among low PMI patients who had a follow-up CT scan, 62.5% (5/8) of those with a functional pancreas graft either improved or resolved sarcopenia, whereas 75.0% (3/4) of those who lost their pancreas graft continued to lose muscle mass. CONCLUSION: Sarcopenia could represent one of the predictors of pancreas graft failure and should be evaluated and potentially optimized in SPK recipients.

17.
Transplantation ; 104(11): 2215-2220, 2020 11.
Article in English | MEDLINE | ID: mdl-32639408

ABSTRACT

BACKGROUND: The disease caused by the novel severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has transformed innumerable aspects of medical practice, particularly in the field of transplantation. MAIN BODY: Here we describe a single-center approach to creating a generalizable, comprehensive, and graduated set of recommendations for responding in stepwise fashion to the challenges posed by these conditions, and the underlying principles guiding such decisions. CONCLUSIONS: Creation of a stepwise plan will allow transplant centers to respond in a dynamic fashion to the ongoing challenges posed by the COVID-19 pandemic.


Subject(s)
Coronavirus Infections/epidemiology , Organ Transplantation/standards , Pneumonia, Viral/epidemiology , Practice Guidelines as Topic , Betacoronavirus , COVID-19 , Health Resources , Humans , Immunosuppression Therapy , Pandemics , SARS-CoV-2 , Tissue Donors , Waiting Lists
18.
Transplant Proc ; 52(6): 1734-1740, 2020.
Article in English | MEDLINE | ID: mdl-32446691

ABSTRACT

BACKGROUND: In living donors, if both kidneys are considered to be of equal quality, the side with favorable anatomy for transplant is usually selected. A "suboptimal kidney" is a kidney that has a significant abnormality and is chosen to maintain the principle of leaving the better kidney with the donor. We hypothesized that the long-term outcome of suboptimal kidney is inferior to that of the normal kidney. METHODS: In a retrospective analysis of 1744 living donor kidney transplantations performed between 1999 and 2015 at our institution, 172 allografts were considered as a suboptimal kidney (9.9%). Median length of follow-up after living donor kidney transplantation was 59.5 months (interquartile range 26.3-100.8). This study strictly complied with the Helsinki Congress and the Istanbul Declaration regarding donor source. RESULTS: The reasons for suboptimal kidneys were cysts or tumors (46.5%), arterial abnormalities (22.7%), inferior size or function (19.8%), and anatomic abnormalities (11.0%). Suboptimal kidneys showed worse long-term overall graft survival regardless of the reasons (5-year: control vs suboptimal kidney; 88.9% vs 79.3%, P = .001 and 10-year: 73.6% vs 63.5%, P = .004). Suboptimal kidneys showed a 1.6-fold higher adjusted hazard ratio (aHR) of all-cause graft loss (95% confidence interval [CI]: 1.1-2.5, P = .025) and had the same impact as older donor age (≥ 54 years old, aHR: 1.6, 95% CI: 1.1-2.4, P = .008). CONCLUSIONS: The impact of suboptimal kidney should be factored into the donor selection process.


Subject(s)
Graft Survival , Kidney Failure, Chronic/mortality , Kidney Transplantation/mortality , Transplants/pathology , Adult , Donor Selection , Female , Humans , Kidney/pathology , Kidney/surgery , Kidney Failure, Chronic/surgery , Living Donors , Male , Middle Aged , Proportional Hazards Models , Retrospective Studies , Transplantation, Homologous , Transplants/surgery , Treatment Outcome
19.
Transplant Proc ; 52(9): 2596-2600, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32471628

ABSTRACT

BACKGROUND: Although hospital systems have largely halted elective surgical practices in preparing their response to the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic, transplantation remains an essential and lifesaving surgical practice. To continue transplantation while protecting immunocompromised patients and health care workers, significant restructuring of normal patient care practice habits is required. METHODS: This is a nonrandomized, descriptive study of the abdominal transplant program at 1 academic center (University of California, San Francisco) and the programmatic changes undertaken to safely continue transplantations. Patient transfers, fellow use, and patient discharge education were identified as key areas requiring significant reorganization. RESULTS: The University of California, San Francisco abdominal transplant program took an early and aggressive approach to restructuring inpatient workflows and health care worker staffing. The authors formalized a coronavirus disease 2019 (COVID-19) transfer system to address patients in need of services at their institution while minimizing the risk of SARS-CoV-2 in their transplant ward and used technological approaches to provide virtual telehealth where possible. They also modified their transplant fellow staffing and responsibilities to develop an adequate backup system in case of potential exposures. CONCLUSION: Every transplant program is unique, and an individualized plan to adapt and modify standard clinical practices will be required to continue providing essential transplantation services. The authors' experience highlights areas of attention specific to transplant programs and may provide generalizable solutions to support continued transplantation in the COVID-19 era.


Subject(s)
Coronavirus Infections , Pandemics , Pneumonia, Viral , Transplantation/standards , Workflow , Betacoronavirus , COVID-19 , Humans , Patient Care/methods , Patient Care/standards , SARS-CoV-2 , San Francisco , Transplantation/methods