Results 1 - 20 of 32
1.
Int J Psychiatry Med ; : 912174231205660, 2023 Oct 09.
Article in English | MEDLINE | ID: mdl-37807925

ABSTRACT

BACKGROUND: COVID-19 increased moral distress (MD) and moral injury (MI) among healthcare professionals (HCPs). MD and MI were studied among inpatient and outpatient HCPs during March 2022. OBJECTIVES: We sought to examine (1) the relationship between MD and MI; (2) the relationship between MD/MI and pandemic-related burnout and resilience; and (3) the degree to which HCPs experienced pandemic-related MD and MI based on their background. METHODS: A survey was conducted to measure MD, MI, burnout, resilience, and intent to leave healthcare at 2 academic medical centers during a 4-week period. A convenience sample of 184 participants (physicians, nurses, residents, respiratory therapists, advanced practice providers) completed the survey. In this mixed-methods approach, researchers analyzed both quantitative and qualitative survey data and triangulated the findings. RESULTS: There was a moderate association between MD and MI (r = .47, P < .001). Regression results indicated that burnout was significantly associated with both MD and MI (P = .02 and P < .001, respectively), while intent to leave was associated only with MD (P < .001). Qualitative results yielded 8 sources of MD and MI: workload, distrust, lack of teamwork/collaboration, loss of connection, lack of leadership, futile care, outside stressors, and vulnerability. CONCLUSIONS: While interrelated conceptually, MD and MI should be viewed as distinct constructs. HCPs were significantly impacted by the COVID-19 pandemic, with MD and MI being experienced by all HCP categories. Understanding the sources of MD and MI among HCPs could help to improve well-being and work satisfaction.

2.
Am Surg ; 89(5): 1442-1448, 2023 May.
Article in English | MEDLINE | ID: mdl-34851174

ABSTRACT

BACKGROUND: Despite advances in online education during the COVID-19 pandemic, its impact on surgical simulation remains unclear. The aim of this study was to compare the costs and resources required to maintain simulation training during the pandemic and to evaluate how the pandemic affected medical students' exposure to simulation during their surgical clerkship. METHODS: The number of learners, contact hours, staff hours, and costs were collected retrospectively from a multi-departmental simulation center of a single academic institution. Utilization and expenditure metrics were compared between the first quarters of academic years 2018-2020. Statistical analysis was performed to evaluate potential differences in overall resource utilization before and during the pandemic, and subgroup analysis was performed for the resources required for the training of third-year medical students. RESULTS: The overall number of learners and contact hours decreased during the first quarter of academic year 2020 in comparison with 2019 and 2018. However, staff hours increased, as did costs for personal protective equipment (PPE) over the same periods. In the subgroup analysis of third-year medical students, there was an increase in the number of learners, as well as in the staff hours and space required to perform simulation training. DISCUSSION: Despite an increase in costs and resources spent on surgical simulation during the pandemic, utilization by academic entities remained unaffected. Further studies are required to identify potential solutions to lower simulation resource use without a negative impact on the quality of surgical simulation.


Subject(s)
COVID-19 , Humans , COVID-19/epidemiology , Pandemics , Retrospective Studies , Costs and Cost Analysis , Computer Simulation
3.
JAMA Surg ; 156(3): 239-245, 2021 03 01.
Article in English | MEDLINE | ID: mdl-33326009

ABSTRACT

Importance: Although optimal access is accepted as the key to quality care, an accepted methodology to ascertain potential disparities in surgical access has not been defined. Objective: To develop a systematic approach to detect surgical access disparities. Design, Setting, and Participants: This cross-sectional study used publicly available data from the Health Cost and Utilization Project State Inpatient Database from 2016. Using the surgical rate observed in the 5 highest-ranked counties (HRCs), the expected surgical rate in the 5 lowest-ranked counties (LRCs) in North Carolina was calculated. Patients 18 years and older who underwent an inpatient general surgery procedure and patients who underwent emergency inpatient cholecystectomy, herniorrhaphy, or bariatric surgery in 2016 were included. Data were collected from January to December 2016, and data were analyzed from March to July 2020. Exposures: Health outcome county rank as defined by the Robert Wood Johnson Foundation. Main Outcomes and Measures: The primary outcome was the proportional surgical ratio (PSR), the disparity in surgical access defined as the observed number of surgical procedures in the 5 LRCs relative to the expected number of procedures using the 5 HRCs as the standardized reference population. Results: In 2016, approximately 1.9 million adults lived in the 5 HRCs, while approximately 246 854 lived in the 5 LRCs. A total of 28 924 inpatient general surgical procedures were performed, with 4521 performed in those living in the 5 LRCs and 24 403 in those living in the 5 HRCs. The rate of general surgery in the 5 HRCs was 13.09 procedures per 1000 population. Using the 5 HRCs as the reference, the PSR for the 5 LRCs was 1.40 (95% CI, 1.35-1.44). For emergent/urgent cholecystectomy, the PSR for the 5 LRCs was 2.26 (95% CI, 2.02-2.51), and the PSR for emergent/urgent herniorrhaphy was 1.83 (95% CI, 1.33-2.45).
The age-adjusted rate of obesity (body mass index [calculated as weight in kilograms divided by height in meters squared] greater than 30) averaged 36.6% (SD, 3.4) in the 5 LRCs vs 25.4% (SD, 4.6) in the 5 HRCs (P = .002). The rate of bariatric surgery in the 5 HRCs was 33.07 per 10 000 population with obesity. For the 5 LRCs, the PSR was 0.60 (95% CI, 0.51-0.69). Conclusions and Relevance: The PSR is a systematic approach to defining potential disparities in surgical access and should be useful for identifying, investigating, and monitoring interventions intended to mitigate disparities in surgical access that affect the health of vulnerable populations.
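The PSR arithmetic in this abstract can be reproduced directly: the expected count applies the HRC rate (13.09 per 1000) to the LRC population, and the ratio of observed to expected procedures gives the PSR. The abstract does not state its confidence-interval method; the log-normal approximation commonly used for standardized ratios is an assumption here, and the function name is illustrative. A minimal Python sketch:

```python
import math

def proportional_surgical_ratio(observed, population, reference_rate_per_1000):
    """PSR = observed procedures / expected procedures, where the expected
    count applies the reference (HRC) rate to the study (LRC) population."""
    expected = population * reference_rate_per_1000 / 1000
    psr = observed / expected
    # Assumed log-normal approximation for a 95% CI, as often used for
    # standardized ratios; the study's exact CI method is not given.
    se = 1 / math.sqrt(observed)
    ci = (psr * math.exp(-1.96 * se), psr * math.exp(1.96 * se))
    return psr, ci

# Figures from the abstract: 4521 procedures among ~246 854 LRC residents,
# against the HRC rate of 13.09 per 1000 population.
psr, ci = proportional_surgical_ratio(4521, 246854, 13.09)
```

With the abstract's inputs this reproduces the reported general-surgery PSR of 1.40; the approximate CI is close to, but not identical with, the published 1.35-1.44.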


Subject(s)
Health Services Accessibility/statistics & numerical data , Healthcare Disparities/statistics & numerical data , Surgical Procedures, Operative/statistics & numerical data , Adult , Aged , Cross-Sectional Studies , Databases, Factual , Female , Hospitalization/statistics & numerical data , Humans , Male , Middle Aged , North Carolina , Procedures and Techniques Utilization , Socioeconomic Factors
4.
J Surg Educ ; 75(2): 304-312, 2018.
Article in English | MEDLINE | ID: mdl-29396274

ABSTRACT

PURPOSE: The Accreditation Council for Graduate Medical Education (ACGME) continues to play an integral role in the accreditation of surgical programs. The institution of case logs to demonstrate the competency of graduating residents is a key component of evaluation. This study compared the number of vascular cases surgical residents had completed according to the ACGME operative log with their operative proficiency, quality of anastomosis, operative experience, and confidence in both a simulation and an operative setting. MATERIALS AND METHODS: General surgery residents ranging from PGY 1 to 5 participated in a simulation laboratory in which they completed an end-to-side vascular anastomosis. Each participant was given a weighted score based on technical proficiency and anastomosis quality using a previously validated Global Rating Scale (Duran et al., 2014). These scores were correlated with the General Surgery Milestones. Participants completed preoperative and postoperative surveys assessing resident operative experience using the 4-level Zwisch scale (DaRosa et al., 2013), confidence with vascular procedures, and confidence performing simulated anastomoses. Confidence was assessed on a scale from 1 to 9 (not confident to extremely confident). Case logs were recorded for each participant. An IRB-approved questionnaire was distributed to assess the preoperative and postoperative roles of both the resident physician and faculty, with a defined goal. Univariate and multivariate analyses were performed. RESULTS: Twenty-one general surgery residents were evaluated in the simulation laboratory, and 8 residents were assessed intraoperatively. The residents were evenly distributed across clinical years. Residents were divided into quartiles based on the number of vascular cases recorded in the ACGME database. No correlation was found between the number of cases, Milestones score, and the weighted score (p = 0.94).
No statistically significant association was found between confidence and quality of anastomosis (p = 0.1). Resident operative experience per the Zwisch scale was most commonly categorized as "Smart Help" by both the trainee and the attending surgeon, despite mean resident confidence ratings of 6.67 (± 1.61) with vascular procedures. CONCLUSIONS: ACGME case logs, which are used to assess readiness for completion of general surgery residency, may not be indicative of a resident's operative competency and technical proficiency. Confidence was not correlated with technical ability. Faculty and resident perceptions of their roles in a procedure differ, as faculty feel they provide less help than the resident perceives. Careful examination of resident operative technique is the best measure of competency.


Subject(s)
Clinical Competence , Simulation Training , Vascular Surgical Procedures/education , Workload/statistics & numerical data , Accreditation/standards , Adult , Anastomosis, Surgical/education , Cohort Studies , Education, Medical, Graduate/methods , Female , General Surgery/education , Humans , Internship and Residency/methods , Male , Professional Autonomy , Prospective Studies , Self Concept , United States
5.
Transpl Int ; 30(6): 566-578, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28211192

ABSTRACT

Controversy exists as to whether African American (AA) transplant recipients are at risk for developing de novo donor-specific anti-human leucocyte antigen (HLA) antibody (dnDSA). We studied 341 HLA-mismatched, primary renal allograft recipients who were consecutively transplanted between 3/1999 and 12/2010. Sera were collected sequentially pre- and post-transplant and tested for anti-HLA immunoglobulin G (IgG) via single antigen bead assay. Of the 341 transplant patients (225 AA and 116 non-AA), 107 developed dnDSA at a median of 9.2 months post-transplant. AA patients had a 5-year dnDSA incidence of 35%, significantly higher than the 21% 5-year incidence in non-AA patients. DQ mismatch (risk) and receiving a living-related donor (LRD) transplant (protective) were transplant factors associated with dnDSA. Within the AA patient cohort, HLA-DQ mismatch, not receiving an LRD transplant, nonadherence, and BK viraemia were the most common factors associated with early dnDSA (occurring <24 months post-transplant). Nonadherence and pretransplant diabetes history were strong precursors of late dnDSA. Despite the higher rates of dnDSA in the AA cohort, post-dnDSA survival was the same in AA and non-AA patients. This study suggests that DQ matching, increasing LRD transplantation in AA patients, and minimizing under-immunosuppression will be key to preventing dnDSA.


Subject(s)
Isoantibodies/blood , Kidney Transplantation , Racial Groups , Tissue Donors , Adult , Black or African American , Antibody Specificity , BK Virus , Cohort Studies , Female , Graft Rejection/etiology , Graft Rejection/immunology , HLA Antigens/immunology , HLA-DQ Antigens/immunology , Histocompatibility Testing , Humans , Immunoglobulin G/blood , Kidney Transplantation/adverse effects , Living Donors , Male , Middle Aged , Polyomavirus Infections/etiology , Risk Factors , Time Factors , Tumor Virus Infections/etiology , Viremia/etiology , White People
6.
Clin Transplant ; 30(9): 1108-14, 2016 09.
Article in English | MEDLINE | ID: mdl-27327607

ABSTRACT

BACKGROUND: The role of anti-HLA-DP antibodies in renal transplantation is poorly defined. This study describes the impact of donor-specific antibodies (DSA) and non-donor-specific antibodies against HLA-DP antigens in renal transplant patients. METHODS: Of 195 consecutive patients transplanted between September 2009 and December 2011, 166 primary kidney recipients and their donors were typed (high-resolution) for DP antigens. Sera taken pre-transplant, at 1, 3, 6, 9, and 12 months, and annually post-transplant were retrospectively tested for anti-DP antibodies using single-antigen beads. RESULTS: Anti-DP antibodies were found in 81 (49%) patients; 64% (n=52) were positive in the pre-transplant samples and 36% (n=29) were positive exclusively post-transplant. The median time from transplantation to antibody detection was 20.9 months. Fifty-five percent (n=16) of the de novo anti-DP antibodies were accompanied by another de novo DSA. Anti-DP antibody-positive patients had a higher rate of rejection than anti-DP antibody-negative patients (P=.01). The estimated glomerular filtration rate declined more in patients with anti-DP antibodies (-5.5% vs +26%). CONCLUSIONS: Antibodies against HLA-DP antigens are common. De novo anti-DP antibodies commonly appear after acute rejection and accompany DSA, which makes it difficult to determine whether anti-DP antibodies are the cause or the consequence of graft injury.


Subject(s)
Graft Rejection/immunology , HLA-DP Antigens/immunology , Isoantibodies/immunology , Kidney Transplantation , Tissue Donors , Female , Follow-Up Studies , Graft Rejection/epidemiology , Graft Survival/immunology , Histocompatibility Testing , Humans , Incidence , Male , Middle Aged , North Carolina/epidemiology , Retrospective Studies
7.
J Surg Educ ; 72(6): e226-35, 2015.
Article in English | MEDLINE | ID: mdl-26381924

ABSTRACT

PURPOSE: Milestones for the assessment of residents in graduate medical education mark a change in our evaluation paradigms. The Accreditation Council for Graduate Medical Education has created milestones and defined them as significant points in the development of a resident based on the 6 competencies. We propose that a similar approach be taken for resident assessment of teaching faculty. We believe this will establish parity and objectivity for faculty evaluation, provide improved data about attending surgeons' teaching, and standardize faculty evaluations by residents. METHODS: A small group of advanced surgery educators determined appropriate educational characteristics, resulting in the creation of 11 milestones (Fig. 2) that were reviewed by faculty and residents. The residents have historically answered 16 questions, developed by our surgical education committee (Fig. 3), on a 5-point Likert scale (never to very often). Three weeks after completing this Likert-type evaluation, the residents were asked to again evaluate attending faculty using the Faculty Milestones evaluation. The residents then completed a survey of 7 questions (scale of 1-9, disagree to strongly agree; neutral = 5) assessing the new milestones in comparison with the previous Likert evaluation system. RESULTS: Of 32 surgery residents, 13 completed the Likert evaluations (3760 data points) and 13 completed the milestones evaluations (1800 data points). The number completing both or neither is not known, as the responses are anonymous when used for faculty feedback. The Faculty Milestones attending physicians' scores had far fewer top-of-range scores (21% vs 42%) and a wider spread of data, giving a better indication of areas for improvement in teaching skills. The residents completed 17 surveys (116 responses) to evaluate the new milestones system.
Surveys indicated that milestones were easier to use (average rating 6.13 ± 0.42 standard error [SE]), more effective (6.82 ± 0.39), more efficient (6.11 ± 0.53), and more objective (6.69 ± 0.39/6.75 ± 0.38) than the Likert evaluations. The average response was 6.47 ± 0.46 for overall satisfaction with the Faculty Milestones evaluation. More surveys were completed than evaluations, as all residents had an opportunity to review both evaluation systems. CONCLUSIONS: Faculty Milestones are more objective in evaluating surgical faculty and mirror the new paradigm in resident evaluations. Residents found this an easier, more effective, more efficient, and more objective evaluation of our faculty. Although our Faculty Milestones are designed for surgical educators, they are likely to be applicable, with appropriate modifications, to other medical educators as well.


Subject(s)
Clinical Competence , Faculty, Medical , General Surgery/education , Internship and Residency , Records
8.
Clin Transpl ; 31: 293-301, 2015.
Article in English | MEDLINE | ID: mdl-28514591

ABSTRACT

BACKGROUND: Human leukocyte antigen (HLA) antibodies are a major cause of graft loss in mismatched transplant recipients. However, the time to graft loss resulting from antibody-induced injury is unpredictable. The unpredictable nature of antibodies may be related to their subclass. In this study, HLA immunoglobulin G (IgG) subclasses were investigated to determine whether a unique IgG subclass composition could better identify patients at imminent risk for graft loss. METHODS: Serial serum samples from 57 patients with post-transplant HLA class II donor-specific antibodies (DSA) were tested for three IgG subclasses (IgG1, IgG3, and IgG4). RESULTS: IgG3 and IgG4 were more prevalent in failed patients than in functioning patients (82% vs. 34% and 45% vs. 20%, respectively). IgG3 development was associated with poor graft survival, with a distinct trend between failed and functioning patients (log-rank p=0.0006). IgG1 was almost equally abundant in both groups (100% and 97%, respectively). Of the 5 patterns of IgG subclass combinations observed, IgG1+3+ showed the strongest association with graft failure (hazard ratio 3.14, p=0.007). CONCLUSION: Patients with IgG3 subclass HLA DSA had lower graft survival. Post-transplant monitoring for IgG subclasses, rather than total IgG, may identify patients at risk for graft failure.

9.
J Surg Res ; 192(1): 1-5, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25151468

ABSTRACT

BACKGROUND: The Hemodialysis Reliable Outflow (HeRO) vascular access device is a hybrid polytetrafluoroethylene graft-stent construct designed to address central venous occlusive disease. Although initial experience demonstrated excellent mid-term patency rates, subsequent studies have raised questions of external validity. The purpose of this study was to examine a single-center experience with this vascular access device in challenging access cases, along with the associated costs. METHODS: A retrospective study representing the authors' cumulative HeRO vascular access device experience was undertaken. The primary endpoint was graft failure or death, with secondary endpoints including secondary intervention rates and cost. RESULTS: Forty-one patients with 15,579 HeRO days and a mean of 12.7 ± 1.5 mo with the vascular access device were available for analysis. Secondary patency was 81.6% at 6 mo and 53.7% at 12 mo. The reintervention rate was 2.84 procedures per HeRO vascular access device year. Associated HeRO costs related to subsequent procedures were estimated at $34,713.63 per patient per year. CONCLUSIONS: These patency and primary outcome data diverge significantly from initial multicenter studies and represent a real-world application of this technology. Maintaining patency is costly. Use of HeRO vascular access devices should be judicious, with tempered outcome expectations.


Subject(s)
Arteriovenous Shunt, Surgical/standards , Graft Occlusion, Vascular/prevention & control , Kidney Failure, Chronic/therapy , Renal Dialysis/instrumentation , Vascular Access Devices/standards , Arteriovenous Shunt, Surgical/economics , Female , Graft Occlusion, Vascular/economics , Graft Occlusion, Vascular/mortality , Health Expenditures/statistics & numerical data , Humans , Kaplan-Meier Estimate , Kidney Failure, Chronic/economics , Kidney Failure, Chronic/mortality , Length of Stay/statistics & numerical data , Male , Middle Aged , Outcome Assessment, Health Care/economics , Renal Dialysis/economics , Renal Dialysis/mortality , Retrospective Studies , Vascular Access Devices/economics
10.
Transplantation ; 98(10): 1097-104, 2014 Nov 27.
Article in English | MEDLINE | ID: mdl-24911039

ABSTRACT

BACKGROUND: Many patients develop de novo donor-specific anti-human leukocyte antigen antibodies (dnDSA) after transplantation. Despite the development of dnDSA, not all patients will immediately fail. This study analyzes dnDSA intensity and longitudinal trends as prospective clinical parameters to assess subsequent allograft function. METHODS: Twenty-four patients with dnDSA onset in the first 2 years after transplantation received antibody monitoring by LABScreen single antigen beads. Estimated glomerular filtration rate (eGFR) was recorded at the time of dnDSA onset and up to 24 months thereafter. The dnDSA mean fluorescence intensity (MFI) of the stable-function patient group (n=8; eGFR decline ≤25%) was compared with that of the impaired-function patient group (n=16; eGFR decline >25%) using first-year peak MFI (pMFI), eight-month MFI change (ΔMFI), and eighteen-month MFI trend (MFI slope). RESULTS: Both groups showed similar dnDSA characteristics (time to onset after transplantation, class I/II distribution, and initial MFI). Between groups, MFI trends were analyzed. Impaired patients showed a higher pMFI during the first year (median pMFI, 13,055 vs. 2,397; P=0.007). Longitudinal analysis revealed that ΔMFI was strongly associated with dysfunction. Both a ΔMFI increase greater than 20% and a stronger increase (ΔMFI >50%) were followed by graft dysfunction in almost all patients and could significantly differentiate between stable- and impaired-function patients (P=0.001 and P=0.04, respectively). CONCLUSION: Our study suggests that tracking dnDSA intensity, particularly in the early period after onset, is important for estimating the impact of dnDSA on the allograft and could therefore help determine how best to monitor patients with dnDSA.
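The ΔMFI thresholds in this abstract (relative increases from onset greater than 20% or 50%) lend themselves to a simple screening rule. The sketch below is an illustration, not the study's algorithm: the function name and sample MFI series are invented, and using the peak relative change rather than the study's fixed eight-month change is an assumption.

```python
def flag_dnDSA_trend(mfi_series, rise_threshold=0.20):
    """Return True when the relative MFI rise from dnDSA onset exceeds
    the threshold (the study reports 20% and 50% cutoffs).

    mfi_series: MFI values ordered in time, starting at dnDSA onset.
    """
    baseline = mfi_series[0]
    # Peak relative change from onset (an assumed simplification of the
    # study's eight-month ΔMFI).
    delta = (max(mfi_series) - baseline) / baseline
    return delta > rise_threshold

# Hypothetical patients, MFI sampled over follow-up from dnDSA onset:
rising = [2400, 2900, 4100]   # ~71% rise: flagged at both cutoffs
stable = [2400, 2500, 2300]   # ~4% rise: not flagged
```

Such a rule would flag the first series at both the 20% and 50% cutoffs and leave the second unflagged.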


Subject(s)
HLA Antigens , Isoantibodies/blood , Kidney Transplantation/adverse effects , Tissue Donors , Adult , Aged , Antibody Specificity , Female , Glomerular Filtration Rate , Graft Rejection/etiology , Graft Rejection/immunology , Graft Rejection/physiopathology , Humans , Immunoglobulin G/blood , Male , Middle Aged , Prospective Studies , Retrospective Studies , Risk Factors
11.
Transplantation ; 97(5): 494-501, 2014 Mar 15.
Article in English | MEDLINE | ID: mdl-24487396

ABSTRACT

BACKGROUND: With standard IgG donor-specific anti-HLA antibody (DSA) testing, it is unclear which immunoglobulin-G (IgG) DSA positive patients will fail. We looked further into the immune response by studying immunoglobulin-M (IgM) and IgG subclass 3 (IgG3) DSA to determine if these identify the IgG DSA patients at highest risk for allograft loss. METHODS: In 189 consecutively transplanted primary renal allograft recipients, sera were collected sequentially pre- and posttransplant. Of the 189, 179 patients had sera available to retrospectively test for anti-HLA IgG, IgM, and IgG3 antibodies via LABScreen single-antigen bead assay and were included in the study. All patients had a negative crossmatch. Per patient, all DSA (IgM, IgG3, and IgG) refers to the same serologic specificity. RESULTS: Overall, 100 (56%) patients developed an alloimmune response (IgM or IgG DSA positive, or both). Ninety-five patients developed IgM DSA and 47 patients developed IgG DSA. IgM DSA was detected in 42 of 47 patients with IgG DSA. IgM DSA alone did not increase the allograft loss risk, whereas IgG DSA did (P=0.002). Once IgG DSA appeared, IgM DSA persisted in 33 patients and an isotype switch to IgG3 positive DSA occurred in 25 patients. Patients with IgM persistent IgG3 positive DSA (n=19) were more likely to have allograft failure than those without (P=0.02). CONCLUSION: This study shows the evolution of the humoral immune response from IgM to IgG DSA posttransplant. We found that development of IgM persistent IgG3 positive DSA identifies the most dangerous IgG DSA subpopulation.


Subject(s)
Graft Rejection/epidemiology , Graft Rejection/immunology , Immunoglobulin G/physiology , Immunoglobulin M/physiology , Isoantibodies/physiology , Kidney Transplantation , Transplantation , Allografts , Antibody Specificity/immunology , Female , Humans , Immunity, Humoral/physiology , Immunoglobulin G/blood , Immunoglobulin G/immunology , Immunoglobulin M/blood , Immunoglobulin M/immunology , Isoantibodies/blood , Isoantibodies/immunology , Male , Middle Aged , Prevalence , Retrospective Studies , Risk Factors , Treatment Outcome
12.
Clin Transpl ; : 137-42, 2014.
Article in English | MEDLINE | ID: mdl-26281138

ABSTRACT

The development of donor-specific antibodies (DSA) post-transplant has been associated with chronic rejection and graft failure. In a longitudinal study, we have shown that increases in DSA precede rejection by months, thus allowing time for intervention. We hypothesized that mycophenolic acid (MPA) dose increases may reduce and/or stabilize DSA strength and also preserve renal function. Thirty stable DSA-positive kidney transplant recipients participated in this Institutional Review Board-approved, exploratory, open-label, single-center study to assess the efficacy of MPA dose escalation in patients with DSA. MPA escalation was well tolerated, and most patients were able to take higher doses for at least two years (the duration of the study). MPA escalation was also safe, with no significant side effects such as cytomegalovirus or BK infection. Long-term allograft survival in the MPA escalation group was superior to that in the control group (p = 0.018). This pilot study indicates that escalation of MPA is safe and may stabilize DSA. In addition, five-year follow-up demonstrates improved long-term survival with MPA escalation compared with DSA-positive recipients receiving the standard of care. Additional studies using larger cohorts are warranted.


Subject(s)
Graft Rejection/prevention & control , Graft Survival/drug effects , HLA Antigens/immunology , Histocompatibility , Immunosuppressive Agents/administration & dosage , Isoantibodies/blood , Kidney Transplantation , Mycophenolic Acid/administration & dosage , Adult , Biomarkers/blood , Female , Graft Rejection/immunology , Humans , Immunosuppressive Agents/adverse effects , Kidney Transplantation/adverse effects , Male , Middle Aged , Monitoring, Immunologic , Mycophenolic Acid/adverse effects , North Carolina , Risk Factors , Time Factors , Treatment Outcome
13.
Transplantation ; 96(10): 919-25, 2013 Nov 27.
Article in English | MEDLINE | ID: mdl-23912173

ABSTRACT

BACKGROUND: Approximately 7% to 9% of patients with donor-specific anti-human leukocyte antigen (HLA) antibodies (DSA) fail within 1 year post-DSA onset. However, little is known as to how this DSA-associated failure temporally progresses. This longitudinal study investigates DSA's temporal relationship to allograft dysfunction and identifies predictors of the progressive deterioration of allograft function post-DSA. METHODS: A cohort of 175 non-HLA-identical patients receiving their first transplant between March 1999 and March 2006 was analyzed. Protocol testing for DSA via single antigen beads was done before transplantation and at 1, 3, 6, 9, and 12 months after transplantation, then annually. Estimated glomerular filtration rate (eGFR) was analyzed before and after DSA onset. RESULTS: Forty-two patients developed DSA and had adequate eGFR information for analysis. Before DSA onset, the 42 patients had stable eGFR. By 1 year post-DSA, the cohort's eGFR was significantly lower (P<0.001); however, 30 of 42 had stable function. Twelve patients had failure or early allograft dysfunction (eGFR decline >25% from DSA onset). Those who failed early (by 1 year post-DSA) had more antibody-mediated rejection than stable patients (P=0.03). Late failures (after 1 year post-DSA) were predictable from evidence of early allograft dysfunction (eGFR decline >25% by 1 year post-DSA; P<0.001). Early allograft dysfunction preceded late failure by nearly 1 year. CONCLUSIONS: DSA is temporally related to deterioration of allograft function. However, in many cases, late allograft failures are preceded by early allograft dysfunction. Therefore, monitoring for early allograft dysfunction provides treating physicians with a window of opportunity for treatment or continued monitoring.


Subject(s)
Glomerular Filtration Rate/physiology , Graft Rejection/immunology , Graft Survival/physiology , HLA Antigens/immunology , Isoantibodies/immunology , Kidney Transplantation , Adult , Female , Follow-Up Studies , Graft Rejection/physiopathology , Histocompatibility Testing , Humans , Male , Middle Aged , Retrospective Studies , Tissue Donors , Transplantation, Homologous
14.
Transplantation ; 95(9): 1113-9, 2013 May 15.
Article in English | MEDLINE | ID: mdl-23514959

ABSTRACT

BACKGROUND: Anti-HLA-DQ antibodies are the predominant HLA class II donor-specific antibodies (DSAs) after transplantation. Recently, de novo DQ DSA has been associated with worse allograft outcomes. The aim of this study was to further determine the complement-binding characteristics of the most harmful DQ DSA. METHODS: Single-antigen bead technology was used to screen 284 primary kidney transplant recipients for the presence of posttransplantation DQ DSA. Peak DSA sera of 34 recipients with only de novo DQ DSA and of 20 recipients with de novo DQ plus other DSAs were further analyzed by a modified single-antigen bead assay using immunoglobulin (Ig)-G subclass-specific reporter antibodies and a C1q-binding assay. RESULTS: Compared with recipients who did not have DSA, those with de novo persistent DQ-only DSA and those with de novo DQ plus other DSAs had more acute rejection (AR) episodes (22%, P=0.005; and 36%, P=0.0009), an increased risk of allograft loss (hazard ratio, 3.7, P=0.03; and hazard ratio, 11.4, P=0.001), and lower 5-year allograft survival. De novo DQ-only recipients with AR had more IgG1/IgG3 combination and C1q-binding antibodies (51%, P=0.01; and 63%, P=0.001) than patients without AR. Furthermore, the presence of C1q-binding de novo DQ DSA was associated with 30% lower 5-year allograft survival (P=0.003). CONCLUSIONS: The presence of de novo persistent, complement-binding DQ DSA negatively impacts kidney allograft outcomes. Therefore, early posttransplantation detection, monitoring, and removal of complement-binding DQ DSA might be crucial for improving long-term kidney transplantation outcomes.


Subject(s)
Complement C1q/immunology , HLA-DQ Antigens/immunology , Immunoglobulin G/classification , Isoantibodies/immunology , Kidney Transplantation , Tissue Donors , Adult , Aged , Female , Graft Rejection , Graft Survival , Humans , Immunoglobulin G/immunology , Male , Middle Aged , Retrospective Studies , Transplantation, Homologous
15.
Transplantation ; 95(3): 410-7, 2013 Feb 15.
Article in English | MEDLINE | ID: mdl-23380861

ABSTRACT

BACKGROUND: To date, limited information is available describing the incidence and impact of de novo donor-specific anti-human leukocyte antigen (HLA) antibodies (dnDSA) in the primary renal transplant patient. This report details the dnDSA incidence and actual 3-year post-dnDSA graft outcomes. METHODS: The study includes 189 consecutive nonsensitized, non-HLA-identical patients who received a primary kidney transplant between March 1999 and March 2006. Protocol testing for DSA via LABScreen single antigen beads (One Lambda) was done before transplantation and at 1, 3, 6, 9, and 12 months after transplantation then annually and when clinically indicated. RESULTS: Of 189 patients, 47 (25%) developed dnDSA within 10 years. The 5-year posttransplantation cumulative incidence was 20%, with the largest proportion of patients developing dnDSA in the first posttransplantation year (11%). Young patients (18-35 years old at transplantation), deceased-donor transplant recipients, pretransplantation HLA (non-DSA)-positive patients, and patients with a DQ mismatch were the most likely to develop dnDSA. From DSA appearance, 9% of patients lost their graft at 1 year. Actual 3-year death-censored post-dnDSA graft loss was 24%. CONCLUSION: We conclude that 11% of the patients without detectable DSA at transplantation will have detectable DSA at 1 year, and over the next 4 years, the incidence of dnDSA will increase to 20%. After dnDSA development, 24% of the patients will fail within 3 years. Given these findings, future trials are warranted to determine if treatment of dnDSA-positive patients can prevent allograft failure.
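The cumulative-incidence figures in this abstract (11% of patients with dnDSA by 1 year, rising to 20% by 5 years) come from time-to-event analysis with censoring. The sketch below shows the idea with a minimal Kaplan-Meier-style estimator; the follow-up data are synthetic and the abstract does not specify the exact estimator used, so this is an illustration rather than the study's method.

```python
def cumulative_incidence(event_times, censor_times, horizon):
    """1 - Kaplan-Meier survival at `horizon`: the estimated probability
    of developing dnDSA by that time, accounting for censoring."""
    surv = 1.0
    for t in sorted(set(event_times)):
        if t > horizon:
            break
        # At risk at t: anyone whose event or censoring time is >= t.
        at_risk = (sum(1 for e in event_times if e >= t)
                   + sum(1 for c in censor_times if c >= t))
        surv *= 1 - event_times.count(t) / at_risk
    return 1 - surv

# Synthetic follow-up (years): 3 dnDSA onsets among 10 patients,
# with one patient censored at year 2 and six followed to year 5.
onsets = [1, 1, 3]
censored = [2, 5, 5, 5, 5, 5, 5]
```

On this toy cohort the 1-year cumulative incidence is 20%, and censoring before year 3 shrinks the risk set used for the later estimate, which is the mechanics behind the study's reported 11% and 20% figures.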


Subject(s)
Graft Rejection/epidemiology , HLA Antigens/immunology , Isoantibodies/blood , Kidney Transplantation/immunology , Tissue Donors , Adolescent , Adult , Aged , Aged, 80 and over , Female , Follow-Up Studies , Graft Rejection/immunology , Histocompatibility Testing , Humans , Incidence , Longitudinal Studies , Male , Middle Aged , Retrospective Studies , Time Factors , Transplantation, Homologous , Young Adult
16.
Clin Transpl ; : 319-24, 2013.
Article in English | MEDLINE | ID: mdl-25095524

ABSTRACT

Donor-specific human leukocyte antigen (HLA) antibodies (DSA) are a significant cause of allograft failure. However, it has been reported that some DSA-negative patients still experience allograft failure, while some DSA-positive patients maintain good graft function for >20 years. These findings suggest that while DSA is a cause of failure, it is not the sole risk factor for graft dysfunction, and that the presence of DSA alone may not predict the time course of graft failure. Here, we report the predictive value of a proprietary panel of four biomarkers in long-term renal allograft outcome. A total of 310 consecutive patients, who received kidney transplants between 1999 and 2012, were included in this study. Recipient sera were tested for HLA antibodies and biomarkers at 3, 6, 12, 24, and 36 months post-transplant. HLA antibodies were identified using LABScreen single antigen beads. The biomarker combination (BMC) test consisted of a proprietary panel of 4 biomarkers and was performed using Luminex. Sera were defined as positive when any one of the 4 biomarkers became detectable. Sera from normal healthy individuals served as negative controls. Graft survival analyses were performed and compared between patient groups defined by DSA and BMC positivity. Our results indicate that 57% of DSA-negative patients and 54% of DSA-positive patients had detectable biomarkers. There was no significant difference in the proportion of BMC-positive patients between the DSA-positive and DSA-negative groups, which suggests that BMC positivity is not associated with HLA DSA. DSA-positive patients had a 10% lower 10-year graft survival rate than patients without DSA, while BMC-positive patients had a 25% lower 10-year graft survival rate than patients without detectable BMC. When DSA-negative patients were divided into two groups based on BMC positivity, BMC-positive patients had a 20% lower 10-year graft survival rate than BMC-negative patients (p<0.05). Similarly, when DSA-positive patients were divided into two groups based on BMC positivity, BMC-positive patients had a 30% lower 10-year graft survival rate than BMC-negative patients (p<0.01). When both DSA and BMC testing results were considered, DSA and BMC double-positive patients had the lowest and double-negative patients had the highest graft survival rates; the survival rates of the BMC-only and DSA-only positive groups fell in between (p<0.001). Multivariate Cox models confirmed that BMC was an independent risk factor for graft failure, with a higher hazard ratio than DSA (BMC=2.60 versus DSA=1.64). In conclusion, serum BMC is an independent predictor of graft failure and was more strongly associated with graft failure than DSA. In combination, DSA and BMC predicted graft outcome better than either marker alone.
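The double-marker analysis above stratifies patients into four groups by DSA and BMC status and compares graft survival across them. A minimal sketch of that stratification logic, on made-up data (not the study's cohort), might look like this:

```python
# Illustrative sketch (hypothetical data): stratify patients by DSA and BMC
# positivity into four groups and compute a crude 10-year graft survival rate
# per group, mirroring the double-marker comparison described above.

def survival_by_group(patients):
    """patients: list of dicts with 'dsa', 'bmc' (bool) and 'graft_ok_10y' (bool)."""
    groups = {}
    for p in patients:
        key = (p["dsa"], p["bmc"])
        groups.setdefault(key, []).append(p["graft_ok_10y"])
    # fraction of grafts surviving at 10 years within each (DSA, BMC) group
    return {key: sum(ok) / len(ok) for key, ok in groups.items()}

cohort = [
    {"dsa": False, "bmc": False, "graft_ok_10y": True},
    {"dsa": False, "bmc": False, "graft_ok_10y": True},
    {"dsa": False, "bmc": True,  "graft_ok_10y": True},
    {"dsa": False, "bmc": True,  "graft_ok_10y": False},
    {"dsa": True,  "bmc": False, "graft_ok_10y": True},
    {"dsa": True,  "bmc": False, "graft_ok_10y": False},
    {"dsa": True,  "bmc": True,  "graft_ok_10y": False},
    {"dsa": True,  "bmc": True,  "graft_ok_10y": False},
]
rates = survival_by_group(cohort)
print(rates[(False, False)], rates[(True, True)])  # 1.0 0.0
```

In the study itself, the group comparison used time-to-event methods (Kaplan-Meier curves and multivariate Cox models) rather than crude proportions; the sketch only shows the grouping step.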


Subject(s)
Graft Rejection , HLA Antigens/immunology , Kidney Transplantation/mortality , Biomarkers , Graft Rejection/diagnosis , Graft Rejection/immunology , Graft Rejection/mortality , Graft Survival/immunology , Humans , Isoantibodies/immunology , Multivariate Analysis , Predictive Value of Tests , Proportional Hazards Models , Risk Factors , Transplantation, Homologous
18.
Clin Transpl ; : 337-40, 2011.
Article in English | MEDLINE | ID: mdl-22755428

ABSTRACT

Donor-specific anti-HLA antibody (DSA) has been increasingly recognized as a major cause of allograft loss. Despite this, no published reports exist describing the true epidemiology of de novo DSA. Here we describe the epidemiology of DSA based on the results of one of the longest-running antibody studies in consecutive renal transplant recipients. The study includes 224 nonsensitized, non-HLA-identical patients who received a primary kidney transplant between 3/1999 and 3/2006. Protocol testing for DSA was done pre-transplant; at 1, 3, 6, 9, and 12 months; and then annually. DSA was tested using single antigen beads. Data from the East Carolina University transplant cohort indicate that the prevalence of DSA in the first year post-transplant is 12.1 cases per 100 patients, and the average annual incidence of DSA thereafter is 4.7 cases per 100 patients per year. The highest incidence of DSA was in the first year post-transplant. Although deceased-donor recipients and African-Americans had higher incidence rates of DSA than the comparator living-donor and non-African-American groups, respectively, these factors were not significantly associated with DSA onset. The one factor found to be predictive of DSA was DQ mismatch (p = 0.036). Based on these epidemiologic findings, in combination with previous reports showing DSA is a cause of allograft failure, it seems reasonable that at least annual testing should be done even in "low-risk" transplant patients, because approximately 5% of patients will newly develop DSA each year.
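The incidence figure quoted above is a person-time rate: new cases divided by patient-years of follow-up, scaled to 100. A one-line sketch of that arithmetic, using hypothetical numbers (not the East Carolina cohort's actual counts):

```python
# Illustrative person-time incidence calculation (hypothetical numbers):
# rate per 100 patient-years = 100 * new cases / patient-years observed.

def incidence_per_100_py(new_cases, patient_years):
    return 100.0 * new_cases / patient_years

# e.g., 47 patients developing dnDSA over 1000 patient-years of follow-up
rate = incidence_per_100_py(47, 1000)
print(f"{rate:.1f} cases per 100 patient-years")  # 4.7 cases per 100 patient-years
```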


Subject(s)
HLA Antigens/immunology , Histocompatibility , Isoantibodies/blood , Kidney Transplantation/immunology , Transplantation Tolerance , Desensitization, Immunologic , Female , Graft Rejection/epidemiology , Graft Rejection/immunology , Graft Rejection/prevention & control , Graft Survival , Histocompatibility/drug effects , Histocompatibility Testing , Humans , Immunosuppressive Agents/therapeutic use , Kidney Transplantation/adverse effects , Male , Monitoring, Immunologic , North Carolina/epidemiology , Prospective Studies , Risk Assessment , Risk Factors , Time Factors , Transplantation Tolerance/drug effects , Treatment Outcome
19.
Transplantation ; 89(8): 962-7, 2010 Apr 27.
Article in English | MEDLINE | ID: mdl-20075791

ABSTRACT

BACKGROUND: The common endpoint in the treatment of antibody-mediated rejection (AMR) is functional reversal (creatinine levels). Reduction of human leukocyte antigen (HLA) antibody strength is not commonly considered an essential endpoint for AMR resolution. The purpose of this study was to determine whether reduction in HLA antibody intensity in patients with histologic AMR reversal influences long-term renal allograft survival. METHODS: Renal allograft recipients were included if they had a biopsy diagnosis of AMR (between August 2000 and October 2008) and serial evaluation for HLA antibodies pre- and postbiopsy. Antibody reduction was defined as a decrease of more than 50% in the mean fluorescence intensity of the highest-intensity antibody after AMR therapy, together with the absence of new antibody formation. Patients were treated with plasmapheresis, thymoglobulin/OKT3, and corticosteroids. Survival analysis was performed using Stata/MP v10 (College Station, TX). RESULTS: Twenty-eight patients were analyzed. Antibody reduction failed to occur in 22 of 28 cases. Baseline characteristics were similar between groups. Antibody nonresponders had significantly shorter allograft survival time (61.4 months) than antibody responders, among whom no graft failures occurred (P=0.04, log-rank test). CONCLUSIONS: Failure to significantly reduce antibody levels and prevent new antibody formation was strongly predictive of allograft loss. This observation suggests that therapeutic interventions that reduce antibody production may prolong graft survival in transplantation.
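The responder definition above (>50% drop in the highest-intensity antibody's MFI, with no new antibody specificities appearing) is concrete enough to express as a small classifier. The helper below is a hypothetical illustration, not the study's actual code; the antigen names and MFI values are invented.

```python
# Illustrative implementation (hypothetical helper) of the responder definition
# described above: >50% reduction in the mean fluorescence intensity (MFI) of
# the highest-intensity antibody after AMR therapy, and no new antibodies.

def antibody_responder(pre_mfi, post_mfi):
    """pre_mfi / post_mfi: dicts mapping HLA specificity -> MFI value."""
    if not pre_mfi:
        return False
    new_antibodies = set(post_mfi) - set(pre_mfi)
    if new_antibodies:
        return False  # new antibody formation disqualifies a response
    top = max(pre_mfi, key=pre_mfi.get)           # highest-intensity antibody
    drop = 1 - post_mfi.get(top, 0) / pre_mfi[top]
    return drop > 0.5                             # >50% MFI reduction required

pre  = {"DQ7": 12000, "A2": 4000}
post = {"DQ7": 3000,  "A2": 3500}
print(antibody_responder(pre, post))  # True: DQ7 fell 75%, no new antibodies
```

A patient whose top antibody falls only 40%, or who develops a new specificity post-therapy, would be classified as a nonresponder under this definition.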


Subject(s)
Graft Rejection/prevention & control , Graft Survival , HLA Antigens/immunology , Isoantibodies/blood , Kidney Transplantation/immunology , Acute Disease , Adrenal Cortex Hormones/therapeutic use , Adult , Antibodies, Monoclonal/therapeutic use , Antibody Formation , Antilymphocyte Serum , Biopsy , Down-Regulation , Drug Therapy, Combination , Female , Graft Rejection/immunology , Graft Rejection/pathology , Humans , Immunosuppressive Agents/therapeutic use , Isoantibodies/biosynthesis , Kaplan-Meier Estimate , Male , Middle Aged , Muromonab-CD3/therapeutic use , Plasmapheresis , Retrospective Studies , Time Factors , Transplantation, Homologous , Treatment Outcome
20.
Ren Fail ; 31(7): 593-6, 2009.
Article in English | MEDLINE | ID: mdl-19839857

ABSTRACT

We present a case of substantial debris captured by an embolic protection device placed in a patient's transplanted renal artery during common iliac artery angioplasty and stent placement. To the authors' knowledge, no previous literature exists pertaining to renal artery protection in patients with a history of heterotopic renal transplant undergoing aortoiliac endovascular interventions.


Subject(s)
Angioplasty, Balloon/methods , Arterial Occlusive Diseases/therapy , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Salvage Therapy/methods , Stents , Aorta, Abdominal/diagnostic imaging , Aorta, Abdominal/physiopathology , Aortography/methods , Arterial Occlusive Diseases/diagnostic imaging , Female , Follow-Up Studies , Graft Rejection , Humans , Iliac Artery/diagnostic imaging , Iliac Artery/physiopathology , Kidney Failure, Chronic/diagnosis , Kidney Function Tests , Kidney Transplantation/methods , Middle Aged , Recovery of Function , Risk Assessment , Transplantation, Heterotopic , Treatment Outcome , Vascular Patency