Results 1 - 20 of 55,061
1.
Microsurgery ; 44(5): e31200, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38828556

ABSTRACT

BACKGROUND: Vascularized free tissue transfer has been established as an effective method in the reconstruction of mandibular defects. However, a limited understanding of its efficacy in pediatric patients persists due to its infrequent presentation. The aim of this study is to systematically consolidate the survival and infection rates of free flaps in pediatric mandibular reconstruction. METHODS: A systematic literature search was conducted on Ovid Medline, Embase, and Cochrane Library for studies published up to January 2024. We included peer-reviewed studies reporting on survival and infection outcomes associated with free flap mandibular reconstruction in pediatric patients (<18 years). We performed a random-effects meta-analysis with the inverse-variance weighted approach to estimate survival and infection rates. Heterogeneity was assessed by I2, and publication bias was examined using Egger's test. RESULTS: A total of 26 studies, reporting on 463 free flaps and 439 pediatric patients with a mean age of 10.7 years, were included in our study. Most free flaps originated from the fibula (n = 392/463, 84.7%), and benign tumors were the most common cause for mandibular reconstruction (n = 179/463, 38.7%). The pooled estimate for survival of flaps was 96% (95% CI: 93-97, I2 = 0%), and recipient-site infections were estimated to occur in 9% (95% CI: 6-13, I2 = 0%) of cases. The most commonly reported complications within the study timeframe were early malocclusion (n = 28/123, 21.4%) and bite abnormalities (n = 18/131, 13.7%). CONCLUSION: Free tissue transfer for mandibular reconstruction in pediatric patients is effective and safe. Further research is required to explore functionality following mandibular reconstruction in diverse pediatric populations.


Subject(s)
Free Tissue Flaps , Mandibular Reconstruction , Humans , Free Tissue Flaps/transplantation , Mandibular Reconstruction/methods , Child , Graft Survival , Surgical Wound Infection/epidemiology , Surgical Wound Infection/etiology
2.
Nat Commun ; 15(1): 4309, 2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38830846

ABSTRACT

The efficacy of costimulation blockade with CTLA4-Ig (belatacept) in transplantation is limited due to T cell-mediated rejection, which also persists after induction with anti-thymocyte globulin (ATG). Here, we investigate why ATG fails to prevent costimulation blockade-resistant rejection and how this barrier can be overcome. ATG did not prevent graft rejection in a murine heart transplant model of CTLA4-Ig therapy and induced a pro-inflammatory cytokine environment. While ATG improved the balance between regulatory T cells (Treg) and effector T cells in the spleen, it had no such effect within cardiac allografts. Neutralizing IL-6 alleviated graft inflammation, increased intragraft Treg frequencies, and enhanced intragraft IL-10 and Th2-cytokine expression. IL-6 blockade together with ATG allowed CTLA4-Ig therapy to achieve long-term, rejection-free heart allograft survival. This beneficial effect was abolished upon Treg depletion. Combining ATG with IL-6 blockade prevents costimulation blockade-resistant rejection, thereby eliminating a major impediment to clinical use of costimulation blockers in transplantation.


Subject(s)
Abatacept , Antilymphocyte Serum , Graft Rejection , Graft Survival , Heart Transplantation , Interleukin-6 , Mice, Inbred C57BL , T-Lymphocytes, Regulatory , Animals , Graft Rejection/immunology , Graft Rejection/prevention & control , Interleukin-6/metabolism , Heart Transplantation/adverse effects , Mice , T-Lymphocytes, Regulatory/immunology , T-Lymphocytes, Regulatory/drug effects , Abatacept/pharmacology , Abatacept/therapeutic use , Antilymphocyte Serum/pharmacology , Antilymphocyte Serum/therapeutic use , Graft Survival/drug effects , Graft Survival/immunology , Mice, Inbred BALB C , Allografts/immunology , Male , Immunosuppressive Agents/pharmacology , Lymphocyte Depletion , Interleukin-10/metabolism , Interleukin-10/immunology
3.
Transpl Int ; 37: 12864, 2024.
Article in English | MEDLINE | ID: mdl-38832357

ABSTRACT

Simultaneous pancreas-kidney (SPK) transplantation improves quality of life and limits progression of diabetic complications. There is reluctance to accept pancreata from donors with abnormal blood tests, due to concern of inferior outcomes. We investigated whether donor amylase and liver blood tests (markers of visceral ischaemic injury) predict pancreas graft outcome using the UK Transplant Registry (2016-2021). A total of 857 SPK recipients were included (619 following brainstem death, 238 following circulatory death). Peak donor amylase ranged from 8 to 3300 U/L (median = 70), and this had no impact on pancreas graft survival when adjusting for multiple confounders (aHR = 0.944, 95% CI = 0.754-1.81). Peak alanine transaminase also did not influence pancreas graft survival in multivariable models (aHR = 0.967, 95% CI = 0.848-1.102). Restricted cubic splines were used to assess associations between donor blood tests and pancreas graft survival without assuming linear relationships; these confirmed that neither amylase nor transaminases significantly impact pancreas transplant outcome. This is the largest, most statistically robust study evaluating donor blood tests and transplant outcome. Provided other factors are acceptable, pancreata from donors with mildly or moderately raised amylase and transaminases can be accepted with confidence. The use of pancreas grafts from such donors is therefore a safe, immediate, and simple approach to expand the donor pool to meet increasing demand.


Subject(s)
Amylases , Graft Survival , Kidney Transplantation , Pancreas Transplantation , Tissue Donors , Humans , Female , Male , Middle Aged , Adult , Amylases/blood , Cohort Studies , Alanine Transaminase/blood , United Kingdom , Hematologic Tests , Registries
5.
Exp Clin Transplant ; 22(3): 189-199, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38695588

ABSTRACT

OBJECTIVES: Kidney transplant survival can be improved with better graft surveillance postoperatively. In the quest to explore new technologies, we explored the feasibility of an implantable Doppler probe as a blood flow monitoring device in kidney transplant patients. This qualitative study was embedded in a feasibility trial and aimed to test the device's clinical acceptability and obtain suggestions for the development of the intervention. Objectives included exploring the experiences of feasibility study participants and identifying barriers to the implementation of implantable Doppler probes in clinical practice. MATERIALS AND METHODS: We conducted semi-structured interviews containing open-ended questions with 12 feasibility study participants recruited by purposive sampling. All interviews were audio-recorded with verbatim transcription. Thematic data analysis was performed at the latent level by using an inductive approach with a previously published 6-phase guide. RESULTS: Three key themes emerged: (1) perceived value of the intervention in clinical practice, (2) challenges and barriers to implementation of the intervention, and (3) suggestions for the development of the intervention. Due to functional limitations and a lack of research, medical professional participants revealed clinical equipoise regarding the utility of implantable Doppler probes. However, the device was well received by patient participants. Challenges included device training needs for medical professionals and educational sessions for patients. Innovative ideas for development included the insertion of a display screen, adopting disposable units to reduce overall cost, online access allowing remote monitoring, decreasing external monitoring unit size, and integrating a wireless connection with the probe to reduce signal errors and increase patient safety. CONCLUSIONS: The clinical need for blood flow sensing technology in kidney transplants has been widely acknowledged.
Implantable Doppler probes may be a beneficial adjunct in the early postoperative surveillance of kidney transplant patients. However, the device's technical limitations are the main challenges to its acceptance in clinical practice.


Subject(s)
Feasibility Studies , Interviews as Topic , Kidney Transplantation , Predictive Value of Tests , Qualitative Research , Ultrasonography, Doppler , Humans , Kidney Transplantation/adverse effects , Female , Male , Ultrasonography, Doppler/instrumentation , Middle Aged , Adult , Treatment Outcome , Equipment Design , Renal Circulation , Aged , Health Knowledge, Attitudes, Practice , Graft Survival , Blood Flow Velocity
6.
Exp Clin Transplant ; 22(3): 185-188, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38695587

ABSTRACT

OBJECTIVES: Before the advent of direct-acting antiviral therapy for hepatitis C virus, a large proportion of kidneys from donors with hepatitis C viremia were discarded. Hepatitis C virus is now amenable to effective treatment with excellent seronegativity rates. In this study, we review the outcomes of hepatitis C viremic kidneys transplanted into hepatitis C-naive recipients. MATERIALS AND METHODS: In this retrospective observational study, we examined 6 deceased donor kidneys with hepatitis C viremia that were transplanted into hepatitis C-naive recipients between March 2020 and April 2021 at a single center. Because of health insurance constraints, patients were treated for hepatitis C virus with glecaprevir/pibrentasvir for 8 weeks following seroconversion posttransplant. The primary outcome measured was viral seroconversion; secondary outcomes included graft function, posttransplant complications, and all-cause mortality. RESULTS: On average, patients seroconverted 6 days (range, 4-10 d) after transplant and began treatment 26 days (range, 15-37 d) after seroconversion. An 8-week course of antiviral treatment was successful in preventing acute hepatitis C virus infection in all patients. Posttransplant median creatinine was 1.96 mg/dL (range, 1-4.55 mg/dL), whereas median estimated glomerular filtration rate was 41.33 mL/min/1.73 m2 (range, 17-85 mL/min/1.73 m2). Patient survival rate was 66.7%, and death-censored graft survival rate was 100%. Two patients died from unrelated causes: 1 from acute respiratory failure secondary to SARS-CoV-2 infection and 1 from posttransplant lymphoproliferative disorder. Two patients developed allograft rejection posttransplant (1 developed antibody-mediated rejection, 1 developed borderline T-cell-mediated rejection). Other major complications included neutropenia, fungal rash, SARS-CoV-2 infection, and cytomegalovirus, BK virus, and Epstein-Barr virus reactivation.
CONCLUSIONS: Use of hepatitis C-viremic donor kidneys for transplant is a safe option and has great potential to increase the kidney donor pool, as long as a high index of suspicion is maintained for allograft rejection and opportunistic infections.


Subject(s)
Antiviral Agents , Benzimidazoles , Donor Selection , Hepatitis C , Kidney Transplantation , Pyrrolidines , Quinoxalines , Viremia , Humans , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Retrospective Studies , Male , Female , Middle Aged , Antiviral Agents/therapeutic use , Hepatitis C/diagnosis , Hepatitis C/drug therapy , Treatment Outcome , Viremia/diagnosis , Viremia/virology , Adult , Time Factors , Risk Factors , Tissue Donors , Drug Combinations , Graft Survival , Aged , Rural Health Services , Seroconversion
7.
Exp Clin Transplant ; 22(3): 207-213, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38695589

ABSTRACT

OBJECTIVES: Modern immunosuppressive regimens have reduced rejection episodes in renal allograft recipients but have increased the risk of opportunistic infections. Infections are considered the second leading cause of death, after cardiovascular complications, in renal allograft recipients. Data on opportunistic infections affecting the allograft itself are scarce. The present study describes the spectrum of renal opportunistic infections and their outcomes as diagnosed on renal allograft biopsies and nephrectomy specimens. MATERIALS AND METHODS: Our retrospective observational study was conducted from December 2011 to December 2021. We analyzed infectious episodes diagnosed on renal allograft biopsies or graft nephrectomy specimens. We obtained clinical, epidemiological, and laboratory details for analyses from hospital records. RESULTS: BK virus nephropathy was the most common opportunistic infection affecting the allograft, accounting for 47% of cases, followed by bacterial graft pyelonephritis (25%). Mucormycosis was the most common fungal infection. The time from transplant to diagnosis of infection ranged from 14 days to 39 months. Follow-up periods ranged from 1 to 10 years. Mortality was highest among patients with opportunistic fungal infection (62%), followed by viral infections, and the graft failure rate was highest in patients with graft pyelonephritis (50%). Among patients with BK polyomavirus nephropathy, 45% had stable graft function compared with just 33% of patients with bacterial graft pyelonephritis. CONCLUSIONS: BK polyomavirus infection was the most common infection affecting the renal allograft in our study. Although fungal infections caused the highest mortality among our patients, bacterial graft pyelonephritis was responsible for the most graft failures. Correctly identifying infections on histology is important so that graft and patient survival can be prolonged.


Subject(s)
Kidney Transplantation , Nephrectomy , Opportunistic Infections , Humans , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Retrospective Studies , Male , Female , Nephrectomy/adverse effects , Middle Aged , Adult , Biopsy , Treatment Outcome , Time Factors , Risk Factors , Opportunistic Infections/immunology , Opportunistic Infections/mortality , Opportunistic Infections/diagnosis , Opportunistic Infections/microbiology , Opportunistic Infections/virology , Opportunistic Infections/epidemiology , Allografts , Living Donors , Graft Survival , Turkey/epidemiology , Aged , Pyelonephritis/microbiology , Pyelonephritis/diagnosis , Pyelonephritis/mortality , Polyomavirus Infections/diagnosis , Polyomavirus Infections/mortality , Polyomavirus Infections/virology , Polyomavirus Infections/epidemiology , Polyomavirus Infections/immunology
8.
Transpl Int ; 37: 12591, 2024.
Article in English | MEDLINE | ID: mdl-38694489

ABSTRACT

Tacrolimus is pivotal in pancreas transplants but poses challenges in maintaining optimal levels due to recipient differences. This study aimed to explore the utility of time spent below the therapeutic range and intrapatient variability in predicting rejection and de novo donor-specific antibody (dnDSA) development in pancreas graft recipients. This retrospective single-center study included adult pancreas transplant recipients between January 2006 and July 2020. Recorded variables included demographics, immunosuppression details, HLA matching, biopsy results, dnDSA development, and clinical parameters. Statistical analysis included ROC curves, sensitivity, specificity, and predictive values. A total of 131 patients were included. Those with biopsy-proven acute rejection (BPAR, 12.2%) had more time (39.9% ± 24% vs. 25.72% ± 21.57%, p = 0.016) and tests (41.95% ± 13.57% vs. 29.96% ± 17.33%, p = 0.009) below the therapeutic range. Specific cutoffs of 31.5% for time and 34% for tests below the therapeutic range showed a high negative predictive value for BPAR (93.98% and 93.1%, respectively). Similarly, patients with more than 34% of tests below the therapeutic range were more likely to develop dnDSA (38.9% vs. 9.4%, p = 0.012; OR 6.135, 1.346-27.78). In pancreas transplantation, maintaining optimal tacrolimus levels is crucial. The percentage of tests below the therapeutic range proves valuable in identifying patients at risk of acute graft rejection.


Subject(s)
Graft Rejection , Immunosuppressive Agents , Pancreas Transplantation , Tacrolimus , Humans , Graft Rejection/immunology , Tacrolimus/therapeutic use , Male , Retrospective Studies , Female , Adult , Immunosuppressive Agents/therapeutic use , Middle Aged , Isoantibodies/blood , Isoantibodies/immunology , Tissue Donors , Time Factors , Biopsy , Graft Survival
9.
Pediatr Transplant ; 28(4): e14742, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38702926

ABSTRACT

BACKGROUND: As more pediatric patients become candidates for heart transplantation (HT), understanding pathological predictors of outcome and the accuracy of the pretransplantation evaluation are important to optimize utilization of scarce donor organs and improve outcomes. The authors aimed to investigate explanted heart specimens to identify pathologic predictors that may affect cardiac allograft survival after HT. METHODS: Explanted pediatric hearts obtained over an 11-year period were analyzed to understand the patient demographics, indications for transplant, and the clinical-pathological factors. RESULTS: In this study, 149 explanted hearts, 46% with congenital heart defects (CHD), were studied. CHD patients were younger, and their mean pulmonary artery pressure and resistance were significantly lower than those of cardiomyopathy patients. Twenty-one patients (14.1%) died or underwent retransplantation. Survival was significantly higher in the cardiomyopathy group at all follow-up intervals. There were more deaths, and 1-, 5-, and 7-year survival was lower, in patients ≤10 years of age at HT. Early rejection, but not late rejection, was significantly higher in CHD patients exposed to homograft tissue. The mortality/retransplantation rate was significantly higher, and allograft survival lower, in CHD hearts with excessive fibrosis of one or both ventricles. The anatomic diagnosis at pathologic examination differed from the clinical diagnosis in eight cases. CONCLUSIONS: Survival was better for the cardiomyopathy group and patients >10 years at HT. Prior homograft use was associated with a higher prevalence of early rejection. Ventricular fibrosis (of the explant) was a strong predictor of outcome in the CHD group. We present several pathologic findings in explanted pediatric hearts.


Subject(s)
Graft Rejection , Graft Survival , Heart Defects, Congenital , Heart Transplantation , Humans , Child , Male , Female , Child, Preschool , Infant , Adolescent , Heart Defects, Congenital/surgery , Heart Defects, Congenital/pathology , Graft Rejection/pathology , Graft Rejection/epidemiology , Retrospective Studies , Treatment Outcome , Follow-Up Studies , Cardiomyopathies/surgery , Cardiomyopathies/pathology , Reoperation , Infant, Newborn , Survival Analysis
10.
Pediatr Transplant ; 28(4): e14771, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38702924

ABSTRACT

BACKGROUND: We examined the combined effects of donor age and graft type on pediatric liver transplantation outcomes, with an aim to offer insights into the strategic utilization of these donor and graft options. METHODS: A retrospective analysis was conducted using a national database on 0-2-year-old (N = 2714) and 3-17-year-old (N = 2263) pediatric recipients. These recipients were categorized based on donor age (≥40 vs. <40 years) and graft type. Survival outcomes were analyzed using Kaplan-Meier and Cox proportional hazards models, followed by an intention-to-treat (ITT) analysis to examine overall patient survival. RESULTS: Living and younger donors generally resulted in better outcomes compared to deceased and older donors, respectively. This difference was more significant among younger recipients (0-2 years compared to 3-17 years). Despite this finding, ITT survival analysis showed that donor age and graft type did not impact survival, with the exception of 0-2-year-old recipients, who had improved survival with a younger living donor graft. CONCLUSIONS: Timely transplantation has the largest impact on survival in pediatric recipients. Improving waitlist mortality requires uniform surgical expertise at many transplant centers to provide technical variant graft (TVG) options and shed the conservative mindset of seeking only the "best" graft for pediatric recipients.


Subject(s)
Graft Survival , Kaplan-Meier Estimate , Liver Transplantation , Tissue Donors , Humans , Child, Preschool , Retrospective Studies , Child , Adolescent , Male , Female , Infant , Age Factors , Infant, Newborn , Proportional Hazards Models , Adult , Treatment Outcome , Living Donors
11.
Clin Transplant ; 38(5): e15329, 2024 May.
Article in English | MEDLINE | ID: mdl-38722085

ABSTRACT

BACKGROUND: Immunosuppression reduction for BK polyomavirus (BKV) must be balanced against the risk of adverse alloimmune outcomes. We sought to characterize the risk of alloimmune events after BKV within the context of the HLA-DR/DQ molecular mismatch (mMM) risk score. METHODS: This single-center study evaluated 460 kidney transplant patients on tacrolimus-mycophenolate-prednisone from 2010-2021. BKV status was classified at 6 months post-transplant as "BKV" or "no BKV" in a landmark analysis. The primary outcome was T-cell-mediated rejection (TCMR). Secondary outcomes included all-cause graft failure (ACGF), death-censored graft failure (DCGF), de novo donor-specific antibody (dnDSA), and antibody-mediated rejection (ABMR). Predictors of outcomes were assessed in Cox proportional hazards models including BKV status and alloimmune risk defined by recipient age and molecular mismatch (RAMM) groups. RESULTS: At 6 months post-transplant, 72 patients had BKV and 388 had no BKV. TCMR occurred in 86 recipients, including 27.8% with BKV and 17% with no BKV (p = .05). TCMR risk was increased in recipients with BKV (HR 1.90 (95% CI 1.14, 3.17); p = .01) and in the high- vs. low-risk RAMM group (HR 2.26 (95% CI 1.02, 4.98); p = .02) in multivariable analyses, but not with HLA serological MM in sensitivity analysis. Recipients with BKV experienced increased dnDSA in univariable analysis, and there was no association with ABMR, DCGF, or ACGF. CONCLUSIONS: Recipients with BKV had an increased risk of TCMR independent of induction immunosuppression and conventional alloimmune risk measures. Recipients with high-risk RAMM experienced increased TCMR risk. Future studies on optimizing immunosuppression for BKV should explore nuanced risk stratification and may consider novel measures of alloimmune risk.


Subject(s)
BK Virus , Graft Rejection , Graft Survival , Kidney Function Tests , Kidney Transplantation , Polyomavirus Infections , Tumor Virus Infections , Viremia , Humans , Kidney Transplantation/adverse effects , BK Virus/immunology , BK Virus/isolation & purification , Female , Male , Polyomavirus Infections/immunology , Polyomavirus Infections/virology , Polyomavirus Infections/complications , Middle Aged , Graft Rejection/etiology , Graft Rejection/immunology , Follow-Up Studies , Tumor Virus Infections/immunology , Tumor Virus Infections/virology , Viremia/immunology , Viremia/virology , Prognosis , Risk Factors , Glomerular Filtration Rate , Adult , Postoperative Complications , Immunosuppressive Agents/therapeutic use , Immunosuppressive Agents/adverse effects , Retrospective Studies , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/immunology , Kidney Diseases/virology , Kidney Diseases/immunology , Kidney Diseases/surgery , Transplant Recipients
12.
Physiol Res ; 73(2): 217-225, 2024 Apr 30.
Article in English | MEDLINE | ID: mdl-38710053

ABSTRACT

The comet assay, an analytical method in which DNA degradation is studied by electrophoresis after cell lysis and DNA fragments are visualized with a fluorescent dye, was used to evaluate the viability of the endothelial layer of human arterial grafts, with the aim of identifying the procedure that least damages the tissue before cryopreservation. Four groups of samples were studied: cryopreserved arterial grafts thawed slowly over 2 hours, cryopreserved arterial grafts thawed rapidly in approximately 7 minutes, arterial grafts collected as part of multiorgan procurement with minimal warm ischemia time, and cadaveric grafts taken as part of an autopsy, which therefore had a more extended period of warm ischemia. The HeadDNA (%) parameter, together with other commonly used comet parameters such as TailDNA (%), TailMoment, TailLength, and OliveMoment, was used to assess viability in this study. The ratio of non-decayed to decayed nuclei was determined from the values found. This ratio was 0.63 for cadaveric grafts, 2.9 for slowly thawed cryopreserved grafts, 1.9 for rapidly thawed cryopreserved grafts, and 0.68 for multiorgan procurement grafts. The results of the study confirmed the assumption that allografts obtained from cadaveric donors are the least suitable. On the other hand, grafts obtained from multiorgan donors are better in terms of viability monitored by the comet assay. Keywords: Arterial grafts, Cryopreservation, Cadaveric, Multiorgan procurement, Viability, Comet assay.


Subject(s)
Comet Assay , Cryopreservation , Humans , Cadaver , Arteries/transplantation , Graft Survival/physiology
13.
Microsurgery ; 44(4): e31186, 2024 May.
Article in English | MEDLINE | ID: mdl-38716649

ABSTRACT

INTRODUCTION: Free flap transfer for head and neck defects has gained worldwide acceptance. Because flap failure is a devastating outcome, studies have attempted to identify risk factors, including renal failure. We sought to determine whether end-stage renal disease (ESRD) patients undergoing dialysis are at increased risk of flap failure following microsurgical head and neck reconstruction. PATIENTS AND METHODS: The study's participants were patients who underwent free flap reconstruction in the head and neck region at Hualien Tzu Chi Hospital between January 2010 and December 2019. We used the National Health Insurance "Specific Diagnosis and Treatment Code" to identify patients undergoing dialysis; these patients comprised the dialysis group, whose members were matched to a non-dialysis group for age and gender. The dependent variables were flap survival rate, take-back rate, and flap failure risk between the dialysis and non-dialysis groups. RESULTS: We included 154 patients in the dialysis (n = 14) and non-dialysis (n = 140) groups. The groups were similar in terms of age and most comorbidities, except diabetes mellitus, hypertension, and coronary artery disease, which were more prevalent in the dialysis group. The dialysis and non-dialysis groups had similar flap survival rates (100% vs. 92.9%; p = .600). Twenty-three patients underwent take-back surgery, at similar rates in the two groups (14.3% vs. 15.0%; p = 1.000). Patients in the dialysis group were more likely to have prolonged intensive care unit stays; however, dialysis alone did not predict flap failure (OR: 0.83; p = .864). CONCLUSION: This study found no significant differences in free flap survival and take-back rates between patients with and without dialysis. Dialysis did not increase the risk of flap failure following microsurgical head and neck reconstruction in this study; however, prospective, randomized controlled trials are needed.


Subject(s)
Free Tissue Flaps , Head and Neck Neoplasms , Kidney Failure, Chronic , Microsurgery , Plastic Surgery Procedures , Renal Dialysis , Humans , Male , Female , Kidney Failure, Chronic/therapy , Kidney Failure, Chronic/complications , Middle Aged , Free Tissue Flaps/transplantation , Plastic Surgery Procedures/methods , Microsurgery/methods , Head and Neck Neoplasms/surgery , Head and Neck Neoplasms/complications , Aged , Retrospective Studies , Graft Survival , Risk Factors , Adult
14.
Clin Transplant ; 38(5): e15325, 2024 May.
Article in English | MEDLINE | ID: mdl-38716770

ABSTRACT

BACKGROUND/AIMS: Direct-acting antiviral (DAA) therapy has revolutionized solid organ transplantation by providing an opportunity to utilize organs from HCV-viremic donors. Though transplantation of HCV-viremic donor organs into aviremic recipients is safe in the short term, midterm data on survival and post-transplant complications are lacking. We provide a midterm assessment of complications of lung transplantation (LT) up to 2 years post-transplant, including patient and graft survival, between HCV-viremic transplantation (D+) and HCV-aviremic transplantation (D-). METHODS: This is a retrospective cohort study including 500 patients from 2018 to 2022 who underwent LT at our quaternary care institution. Outcomes of patients receiving D+ grafts were compared to those receiving D- grafts. Recipients of HCV antibody+ but PCR- grafts were treated as D- recipients. RESULTS: We identified 470 D- and 30 D+ patients meeting inclusion criteria. Crude mortality did not differ between groups (p = .43). Patient survival at years 1 and 2 did not differ between D+ and D- patients (p = .89, p = .87, respectively), and graft survival at years 1 and 2 did not differ between the two groups (p = .90, p = .88, respectively). No extrahepatic manifestations or fibrosing cholestatic hepatitis (FCH) occurred among D+ recipients. D+ and D- patients had similar rates of post-transplant chronic lung allograft dysfunction (CLAD) (6.7% vs. 12.8%, p = .3), acute cellular rejection (60.0% vs. 58.0%, p = .8), and antibody-mediated rejection (16.7% vs. 14.2%, p = .7). CONCLUSION: There is no difference in midterm patient or graft survival between D+ and D- LT. No extrahepatic manifestations of HCV occurred. No differences in any type of rejection, including CLAD, were observed, though follow-up for CLAD was limited. These results provide additional support for the use of HCV-viremic organs in selected recipients in LT.


Subject(s)
Graft Rejection , Graft Survival , Hepacivirus , Hepatitis C , Lung Transplantation , Postoperative Complications , Viremia , Humans , Lung Transplantation/adverse effects , Female , Male , Retrospective Studies , Middle Aged , Follow-Up Studies , Prognosis , Hepatitis C/surgery , Hepatitis C/virology , Hepacivirus/isolation & purification , Viremia/virology , Viremia/etiology , Survival Rate , Graft Rejection/etiology , Risk Factors , Tissue Donors/supply & distribution , Adult , Antiviral Agents/therapeutic use , Transplant Recipients
15.
Transpl Int ; 37: 12605, 2024.
Article in English | MEDLINE | ID: mdl-38711816

ABSTRACT

Patients of Asian and black ethnicity face disadvantage on the renal transplant waiting list in the UK because of a lack of human leucocyte antigen- and blood group-matched donors from an overwhelmingly white deceased donor pool. This study evaluates outcomes of renal allografts from Asian and black donors. The UK Transplant Registry was analysed for adult deceased donor kidney-only transplants performed between 2001 and 2015. Asian and black ethnicity patients constituted 12.4% and 6.7% of all deceased donor recipients but only 1.6% and 1.2% of all deceased donors, respectively. Unadjusted survival analysis demonstrated significantly inferior long-term allograft outcomes associated with Asian and black donors compared to white donors. On Cox regression analysis, Asian donor and black recipient ethnicities were associated with poorer outcomes than their white counterparts. On ethnicity matching, compared with the white donor-white recipient baseline group and adjusting for other donor and recipient factors, 5-year graft outcomes were significantly poorer for the black donor-black recipient, Asian donor-white recipient, and white donor-black recipient combinations, in decreasing order of unadjusted 5-year graft survival. Increased deceased donation among ethnic minorities could benefit the recipient pool by increasing available organs. However, it may require a refined approach to enhance outcomes.


Subject(s)
Asian People , Black People , Graft Survival , Kidney Transplantation , Tissue Donors , Humans , United Kingdom , Male , Female , Adult , Middle Aged , Tissue Donors/supply & distribution , Black People/statistics & numerical data , Registries , White People/statistics & numerical data , Treatment Outcome , Aged , Proportional Hazards Models , Waiting Lists , Transplant Recipients/statistics & numerical data
16.
Pediatr Transplant ; 28(4): e14785, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38766986

ABSTRACT

BACKGROUND: Long-term outcomes in pediatric kidney transplantation remain suboptimal, largely because of chronic rejection. Creatinine is a late marker of renal injury, and more sensitive, earlier markers of allograft injury are an active area of current research. METHODS: This is an educational review summarizing existing strategies for monitoring for rejection in kidney transplant recipients. RESULTS: We summarize the evidence supporting currently available clinical tests, including surveillance biopsy, donor-specific antibodies, and donor-derived cell-free DNA, as well as the potential limitations of these tests. In addition, we review current avenues of active research, including transcriptomics, proteomics, metabolomics, and torque teno virus levels. CONCLUSION: Advancing the use of noninvasive immune monitoring will depend on well-designed multicenter trials that include patients with stable graft function, include biopsy results for all patients, and can demonstrate both an association with a patient-relevant clinical endpoint, such as graft survival or change in glomerular filtration rate, and a potential timepoint for intervention.


Subject(s)
Graft Rejection , Kidney Transplantation , Humans , Graft Rejection/immunology , Child , Monitoring, Immunologic/methods , Biomarkers/metabolism , Biopsy , Graft Survival/immunology
17.
Clin Transplant ; 38(5): e15339, 2024 May.
Article in English | MEDLINE | ID: mdl-38775413

ABSTRACT

Simultaneous pancreas-kidney transplantation (SPKT) is the best treatment for selected individuals with type 1 diabetes mellitus and end-stage renal disease. Despite advances in surgical techniques, donor and recipient selection, and immunosuppressive therapies, SPKT remains a complex procedure with associated surgical complications and adverse consequences. We conducted a retrospective study of 263 SPKT procedures performed between May 2000 and December 2022. A total of 65 patients (25%) required at least one relaparotomy, resulting in an all-cause relaparotomy rate of 2.04 events per 100 in-hospital days. Lower donor body mass index was identified as an independent factor associated with reoperation (OR 0.815; 95% CI: 0.725-0.917, p = 0.001). Technical failure (TF) occurred in 9.9% of cases, primarily attributable to pancreas graft thrombosis, intra-abdominal infections, bleeding, and anastomotic leaks. Independent predictors of TF at 90 days included donor age above 36 years (HR 2.513; 95% CI: 1.162-5.434), previous peritoneal dialysis (HR 2.503; 95% CI: 1.149-5.451), and specific pancreas graft reinterventions. These findings highlight the importance of carefully considering donor and recipient factors in SPKT. The incidence of TF in our study population aligns with recently published series. Continuous efforts should focus on identifying and mitigating potential risk factors to enhance SPKT outcomes and reduce post-transplant complications.
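The all-cause relaparotomy rate quoted above (2.04 events per 100 in-hospital days) is an incidence density: total events divided by total in-hospital days, scaled to 100 days. A minimal sketch, assuming a hypothetical day total chosen only to reproduce the reported figure (the study does not report the denominator):

```python
def events_per_100_days(n_events: int, total_inpatient_days: float) -> float:
    """Incidence density: number of events per 100 in-hospital days."""
    return 100 * n_events / total_inpatient_days

# 65 relaparotomies; ~3186 in-hospital days is a hypothetical denominator
# that reproduces the reported rate of ~2.04 events per 100 days.
print(round(events_per_100_days(65, 3186), 2))  # → 2.04
```

Expressing reoperations per unit of exposure time, rather than per patient, adjusts for differing lengths of stay between patients.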


Subject(s)
Diabetes Mellitus, Type 1 , Graft Survival , Kidney Failure, Chronic , Kidney Transplantation , Pancreas Transplantation , Postoperative Complications , Humans , Female , Male , Pancreas Transplantation/adverse effects , Retrospective Studies , Kidney Transplantation/adverse effects , Adult , Postoperative Complications/etiology , Follow-Up Studies , Risk Factors , Kidney Failure, Chronic/surgery , Prognosis , Diabetes Mellitus, Type 1/surgery , Diabetes Mellitus, Type 1/complications , Graft Rejection/etiology , Middle Aged , Reoperation/statistics & numerical data , Kidney Function Tests , Survival Rate , Glomerular Filtration Rate
18.
Transpl Int ; 37: 12774, 2024.
Article in English | MEDLINE | ID: mdl-38779355

ABSTRACT

Lung transplantation (LuTx) is an established treatment for patients with end-stage lung diseases; however, outcomes are limited by acute and chronic rejection. One aspect that has received increasing attention is the role of the host's humoral alloresponse, particularly the formation of de novo donor-specific antibodies (dnDSAs). The aim of this study was to investigate the clinical significance of transient and persistent dnDSAs and their impact on outcomes after LuTx. A retrospective analysis was conducted using DSA screening data from LuTx recipients at the Medical University of Vienna obtained between February 2016 and March 2021. Of the 405 LuTx recipients analyzed, 205 developed dnDSAs during the follow-up period; of these, 167 (81%) had transient and 38 (19%) persistent dnDSAs. Persistent, but not transient, dnDSAs were associated with chronic lung allograft dysfunction (CLAD) and antibody-mediated rejection (AMR) (p < 0.001 and p = 0.006, respectively). CLAD-free survival rates for persistent dnDSAs at 1, 3, and 5 years post-transplantation were significantly lower than for transient dnDSAs (89%, 59%, and 56% vs. 91%, 79%, and 77%; p = 0.004). The temporal dynamics of dnDSAs after LuTx thus have a substantial effect on patient outcomes, and this study underlines that the persistence of dnDSAs poses a significant risk to graft and patient survival.
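CLAD-free survival at fixed horizons, as compared above, is conventionally read off a Kaplan-Meier curve, which handles censored follow-up. A minimal pure-Python sketch of the product-limit estimator on toy data (not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimate.

    times  : follow-up time for each patient
    events : 1 if the endpoint (e.g. CLAD or death) occurred, 0 if censored
    Returns a list of (time, survival_probability) at each event time.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = 0  # events observed at time t
        n = 0  # subjects (events + censored) leaving the risk set at time t
        while i < len(data) and data[i][0] == t:
            d += data[i][1]
            n += 1
            i += 1
        if d:
            surv *= 1 - d / at_risk  # multiply in the conditional survival
            curve.append((t, surv))
        at_risk -= n
    return curve

# Toy cohort: events at t=1, 3, 4; censored at t=2 and 5.
print(kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0]))
```

Censored patients contribute to the risk set up to their last follow-up without counting as events, which is why the curve steps down only at event times.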


Subject(s)
Graft Rejection , Isoantibodies , Lung Transplantation , Tissue Donors , Humans , Male , Female , Retrospective Studies , Middle Aged , Graft Rejection/immunology , Adult , Isoantibodies/immunology , Isoantibodies/blood , Graft Survival/immunology , Aged
19.
Ann Plast Surg ; 92(6): 703-710, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38768024

ABSTRACT

INTRODUCTION: Breast reconstruction with the deep inferior epigastric perforator (DIEP) flap is the current gold-standard autologous option. The profunda artery perforator (PAP) and lumbar artery perforator (LAP) flaps have more recently been described as alternatives for patients who are not candidates for a DIEP flap. The aim of this study was to review the survival and complication rates of PAP and LAP flaps, using the DIEP flap as a benchmark. METHODS: A literature search was conducted using the PubMed, MEDLINE, Embase, BIOSIS, Web of Science, and Cochrane databases. Papers were screened by title and abstract, and full texts were reviewed by three independent blinded reviewers. Quality was assessed using the MINORS criteria. RESULTS: Sixty-three studies were included, for a total of 745 PAP, 62 stacked PAP, 187 LAP, and 23,748 DIEP flap breast reconstructions. The PAP flap success rate (98.3%) was comparable to that of the DIEP flap (98.4%), whereas the stacked PAP (88.7%) and LAP (92.5%) success rates were significantly lower (P < 0.0001). The PAP and LAP groups both had a low incidence of fat necrosis; however, the revision rate was 16.1% in the LAP group versus 3.3% in the PAP group. The donor-site wound dehiscence rate was 2.9% in the LAP group and 9.1% in the PAP group. CONCLUSIONS: PAP and DIEP flaps demonstrate very high rates of overall survival, whereas the LAP flap has a lower survival rate. This review quantifies the survival and complication rates of these alternative flaps, which may help guide the choice of autologous reconstruction technique when a DIEP flap is unavailable.
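Comparisons of success rates between flap cohorts of this kind are often illustrated with a two-proportion z-test. The sketch below uses success counts back-calculated from the reported percentages (e.g. ~732 of 745 PAP flaps at 98.3%, ~173 of 187 LAP flaps at 92.5%); these counts are approximations, not the review's raw data, and the review's own test may differ:

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """Two-proportion z statistic using a pooled variance estimate."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# PAP vs. LAP success: z well above 3.29 corresponds to p < 0.001
z = two_proportion_z(732, 745, 173, 187)
print(round(z, 2))
```

With small failure counts, an exact test (e.g. Fisher's) is often preferred over the normal approximation shown here.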


Subject(s)
Mammaplasty , Perforator Flap , Humans , Mammaplasty/methods , Perforator Flap/blood supply , Perforator Flap/transplantation , Female , Graft Survival , Postoperative Complications/epidemiology , Epigastric Arteries/transplantation
20.
Ann Plast Surg ; 92(6): 700-702, 2024 Jun 01.
Article in English | MEDLINE | ID: mdl-38768023

ABSTRACT

BACKGROUND: There is currently no standardization in research on fat grafts in rats, one of the most popular topics in plastic surgery. The aim of our study was to demonstrate that selecting the paraepididymal region as the donor area enhances the reliability of fat graft studies. METHODS: In this study, 12 male Sprague-Dawley rats were used to obtain adipose grafts from both the inguinal and paraepididymal regions. After the graft weights were measured, the grafts underwent histological examination with hematoxylin-eosin staining as well as immunohistochemical staining with antiperilipin antibody. The purity of the samples, the viability of adipose cells, and the presence of lymph nodes within the grafts were analyzed. RESULTS: The purity of adipose cells in graft samples from the paraepididymal region was 98.1%, compared with 58.37% in samples from the inguinal region. Hematoxylin-eosin staining revealed significantly higher adipocyte viability and vascularity in the paraepididymal region than in the inguinal region (P = 0.0134). Conversely, lymphatic tissue content was significantly higher in samples from the inguinal region than in paraepididymal adipose tissue samples (P < 0.0001). Immunohistochemical staining with antiperilipin antibody showed a denser and more uniform staining pattern in paraepididymal adipose grafts (P < 0.0001). CONCLUSIONS: Using paraepididymal fat, which is devoid of lymphatic tissue, naturally eliminates two critical sources of bias (estrogen and lymphatic tissue), enhancing the standardization and reliability of fat graft survival studies.


Subject(s)
Adipose Tissue , Epididymis , Graft Survival , Rats, Sprague-Dawley , Animals , Male , Adipose Tissue/transplantation , Rats