Results 1 - 14 of 14
1.
J Patient Saf ; 16(4): 255-258, 2020 Dec.
Article in English | MEDLINE | ID: mdl-32217934

ABSTRACT

OBJECTIVES: The aim of the study was to compare retained surgical item (RSI) rates for 137 Veterans Health Administration surgery programs with and without surgical count technology and to review the root cause analyses (RCAs) for soft good RSI events between October 1, 2009, and December 31, 2016. A 2017 survey identified 46 programs that had independently acquired surgical count technology. METHODS: Retained surgical item rates were calculated as the sum of events (sharp, soft good, instrument) divided by the total number of procedures performed. The RCAs for RSI events were analyzed using codebooks for procedure type/location and root cause characterization. RESULTS: One hundred twenty-four RSI events occurred in 2,964,472 procedures, for an overall RSI rate of 1/23,908 procedures. The RSI rate for the 46 programs with surgical count technology was significantly higher than that for the 91 programs without a surgical count technology system (1/18,221 versus 1/30,593, P = 0.0026). The RSI rates before and after acquiring the surgical count technology were not significantly different (1/17,508 versus 1/18,673, P = 0.8015). Root cause analyses for 42 soft good RSI events identified multiple associated disciplines (general surgery 26, urology 5, cardiac 4, neurosurgery 3, vascular 2, thoracic 1, gynecology 1) and locations (abdomen 26, thorax 7, retroperitoneum 4, paraspinal 2, extremity 1, pelvis 1, and head/neck 1). Human factors (n = 24), failure to follow policy/procedure (n = 21), and communication (n = 19) accounted for 64 (65%) of the 98 root causes identified. CONCLUSIONS: Acquisition of surgical count technology did not significantly improve RSI rates. Soft good RSI events are associated with multiple disciplines and locations and with the following dominant root causes: human factors, failure to follow policy/procedure, and communication.
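
A minimal sketch of the rate arithmetic and two-group comparison described above. The event/procedure splits below are hypothetical back-calculations consistent with the reported rates (1/18,221 and 1/30,593) and the 124 total events, not figures taken from the study.

```python
# Sketch: express RSI rates as "1 per N procedures" and compare two programs
# with a chi-square test on a 2x2 events/non-events table (hypothetical counts).
from scipy.stats import chi2_contingency

def rsi_rate(events: int, procedures: int) -> str:
    """Express an RSI rate as '1 per N procedures'."""
    return f"1 per {procedures // events:,} procedures"

# Hypothetical counts chosen to reproduce the reported rates.
with_tech = (60, 1_093_260)      # (events, procedures) with count technology
without_tech = (64, 1_957_952)   # (events, procedures) without count technology

table = [
    [with_tech[0], with_tech[1] - with_tech[0]],
    [without_tech[0], without_tech[1] - without_tech[0]],
]
chi2, p, _, _ = chi2_contingency(table)

print(rsi_rate(*with_tech), "vs", rsi_rate(*without_tech), f"(p = {p:.4f})")
```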


Subject(s)
Foreign Bodies/epidemiology, Veterans Health, Humans, Technology
2.
Surg Infect (Larchmt) ; 19(3): 278-285, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29389252

ABSTRACT

BACKGROUND: Surgical site infection (SSI) complicates approximately 2% of surgeries in Veterans Affairs (VA) hospitals. Surgical site infections are responsible for increased morbidity, length of hospital stay, cost, and mortality. Surgical site infection can be minimized by modifying risk factors. In this study, we identified risk factors and developed accurate surgical specialty-specific SSI risk prediction models for the Veterans Health Administration (VHA) surgery population. METHODS: In a retrospective observational study, patients who underwent surgery from October 2013 to September 2016 at 136 VA hospitals were included. The Veterans Affairs Surgical Quality Improvement Program (VASQIP) database was used for the pre-operative demographic and clinical characteristics, intra-operative characteristics, and 30-day post-operative outcomes. The study population represents 11 surgical specialties: neurosurgery, urology, podiatry, otolaryngology, general, orthopedic, plastic, thoracic, vascular, cardiac coronary artery bypass graft (CABG), and cardiac valve/other surgery. Multivariable logistic regression models were developed for 30-day post-operative SSIs. RESULTS: Among 354,528 surgical procedures, 6,538 (1.8%) had SSIs within 30 days. Surgical site infection rates varied among surgical specialties (0.7%-3.0%). Surgical site infection rates were higher in emergency procedures and in procedures with long operative duration, greater complexity, and higher relative value units. Other factors associated with increased SSI risk were high American Society of Anesthesiologists (ASA) classification (levels 4 and 5), dyspnea, open wound/infection, wound classification, ascites, bleeding disorder, chemotherapy, smoking, history of severe chronic obstructive pulmonary disease (COPD), radiotherapy, steroid use for chronic conditions, and weight loss. Each surgical specialty had a distinct combination of risk factors. Accurate specialty-specific SSI risk prediction models were developed, with the number of variables ranging from 9 to 21 and the C-index ranging from 0.63 to 0.81, indicating acceptable discrimination. The decile plot of predicted versus observed SSI rates showed strong calibration. CONCLUSIONS: Specialty-specific risk factors for 30-day post-operative SSI have been identified for a variety of surgical specialties. Accurate specialty-specific SSI risk prediction models have been developed and validated for the VHA surgery population. These models can be used to develop optimal preventive measures for high-risk patients, patient-centered care planning, and surgical quality improvement.
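
A minimal sketch of the modeling approach named above (multivariable logistic regression scored by the C-index), fit to synthetic data at roughly the reported 1.8% SSI prevalence. The covariates asa_class, op_minutes, and emergency are illustrative stand-ins, not the actual VASQIP variables.

```python
# Sketch: fit a logistic SSI risk model and report its C-index, which for a
# binary outcome equals the area under the ROC curve.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 20_000
X = np.column_stack([
    rng.integers(1, 6, n),      # asa_class: ASA classification 1-5
    rng.normal(120, 45, n),     # op_minutes: operative duration
    rng.integers(0, 2, n),      # emergency: emergency procedure flag
])
# Synthetic outcome at roughly the reported 1.8% prevalence.
logits = -6.5 + 0.35 * X[:, 0] + 0.008 * X[:, 1] + 0.6 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logits))

model = LogisticRegression(max_iter=1000).fit(X, y)
c_index = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"C-index: {c_index:.2f}")
```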


Subject(s)
Hospitals, Veterans/statistics & numerical data, Surgical Procedures, Operative, Surgical Wound Infection/epidemiology, Veterans/statistics & numerical data, Humans, Patient-Centered Care, Quality Improvement, Retrospective Studies, Risk Factors, Surgical Procedures, Operative/adverse effects, Surgical Procedures, Operative/standards, Surgical Procedures, Operative/statistics & numerical data, Treatment Outcome, United States/epidemiology
3.
JAMA Netw Open ; 1(7): e185147, 2018 Nov 2.
Article in English | MEDLINE | ID: mdl-30646381

ABSTRACT

Importance: Reducing wrong-site surgery is fundamental to safe, high-quality care. This is a follow-up study examining 8 years of reported surgical adverse events and root causes in the nation's largest integrated health care system. Objectives: To provide a follow-up description of incorrect surgical procedures reported from 2010 to 2017 from US Veterans Health Administration (VHA) medical centers, compared with the previous studies of 2001 to 2006 and 2006 to 2009, and to recommend actions for future prevention of such events. Design, Setting, and Participants: This quality improvement study describes patient safety adverse events and close calls reported by 86 of the approximately 130 VHA medical centers with a surgical program. The surgical procedures and programs vary in size and complexity, from small rural centers to large, complex urban facilities. Procedures occurring between January 1, 2010, and December 31, 2017, were included. Data analysis took place in 2018. Main Outcomes and Measures: The categories of incorrect procedure types were wrong patient, side, site (including wrong-level spine), procedure, or implant. Events included those in or out of the operating room, adverse events or close calls, surgical specialty, and harm. These results were compared with the previous studies of VHA-reported wrong-site surgery (2001-2006 and 2006-2009). Results: Our review produced 483 reports (277 adverse events and 206 close calls). The rate of in-operating room (in-OR) reported adverse events with harm has continued to trend downward, from 1.74 to 0.47 reported adverse events with harm per 100,000 procedures between 2000 and 2017, based on 6,591,986 in-OR procedures. When in-OR events were examined by discipline as a rate, dentistry had 1.54, neurosurgery had 1.53, and ophthalmology had 1.06 reported in-OR adverse events per 10,000 cases. The overall VHA in-OR rate for adverse events during 2010 to 2017 was 0.53 per 10,000 procedures, based on 3,234,514 in-OR procedures. The most common root cause for adverse events was related to issues in performing a comprehensive time-out (28.4%); in these cases, the time-out either was conducted incorrectly or was incomplete in some way. Conclusions and Relevance: Over the period studied, the VHA identified a decrease in the rate of reported adverse events in the OR associated with harm and continued reporting of adverse event close calls. Organizational efforts continue to examine root cause analysis reports, promulgate lessons learned, and enhance policy to promote a culture and behavior that minimizes events and is transparent in reporting occurrences.
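
A worked sketch of the per-10,000 rate arithmetic behind the figures above. The event count is a hypothetical back-calculation from the reported rate (0.53 per 10,000) and denominator (3,234,514 in-OR procedures), not a number from the study.

```python
# Sketch: convert an event count and procedure denominator into a rate per
# 10,000 (or per 100,000) procedures.
def rate_per(events: int, procedures: int, base: int = 10_000) -> float:
    """Adverse events per `base` procedures."""
    return events / procedures * base

in_or_procedures = 3_234_514
in_or_events = 171   # hypothetical: 0.53 per 10,000 implies roughly 171 events

print(f"{rate_per(in_or_events, in_or_procedures):.2f} per 10,000 procedures")
print(f"{rate_per(in_or_events, in_or_procedures, 100_000):.1f} per 100,000 procedures")
```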


Subject(s)
Medical Errors, Veterans Health/statistics & numerical data, Follow-Up Studies, Humans, Medical Errors/classification, Medical Errors/prevention & control, Medical Errors/statistics & numerical data, Patient Safety, Quality of Health Care, United States, United States Department of Veterans Affairs
4.
JAMA Surg ; 151(5): 417-22, 2016 May 1.
Article in English | MEDLINE | ID: mdl-26747331

ABSTRACT

IMPORTANCE: For more than 2 decades, the Veterans Health Administration (VHA) has relied on risk-adjusted, postoperative, 30-day mortality data as a measure of surgical quality of care. Recently, the use of 30-day mortality data has been criticized based on a theory that health care professionals manage patient care to meet the metric and that other outcome metrics are available. OBJECTIVES: To determine whether postoperative mortality data identify a delay in care to meet a 30-day mortality metric and to evaluate whether 30-day mortality risk score groups stratify survival patterns up to 365 days after surgery in surgical procedures assessed by the Veterans Affairs Surgical Quality Improvement Program (VASQIP). DESIGN, SETTING, AND PARTICIPANTS: Patients undergoing VASQIP-assessed surgical procedures within the VHA from October 1, 2011, to September 30, 2013, were evaluated. Survival was followed up to 365 days after surgery for 212,733 surgical cases using VHA vital status and admission records, with 10,947 mortality events identified. Data analysis was conducted from September 3, 2014, to November 9, 2015. MAIN OUTCOMES AND MEASURES: Survival up to 365 days after surgery for the overall cohort divided into 10 equal groups (deciles). RESULTS: There were 10,947 mortality events in the cohort of 212,733 surgical patients. The mean probability of death was 1.03% (95% CI, 1.01%-1.04%). Risk estimate groups in the 212,733 surgical cases analyzed showed significantly different postoperative survival, with consistency beyond the time frame for which they were developed. The lowest risk decile had the highest 365-day survival probability (99.74%; 95% CI, 99.66%-99.80%); the highest risk decile had the lowest 365-day survival probability (72.04%; 95% CI, 71.43%-72.64%). The 9 lowest risk deciles had linear survival curves from 0 to 365 postoperative days, whereas the highest risk decile showed elevated early mortality and became more linear after the first 180 days. Survival curves between 25 and 35 days were consistent for all risk deciles and showed no evidence that mortality rates were affected in the immediate period beyond day 30. The setting of mortality varied by postoperative day range, with index hospitalization events declining and deaths outside of the hospital increasing up to 365 days. CONCLUSIONS AND RELEVANCE: Deciles of 30-day mortality estimates are associated with significantly different survival outcomes at 365 days, even after removing patients who died within the first 30 postoperative days. No evidence of delays in patient care and treatment to meet a 30-day metric was identified. These findings reinforce the usefulness of 30-day mortality risk stratification as a surrogate for long-term outcomes.
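
A minimal sketch of decile-based risk stratification on a synthetic cohort. Here `risk` stands in for the modeled 30-day mortality probability, and the 365-day outcome is simulated, not study data.

```python
# Sketch: cut predicted 30-day risk into deciles and compute observed 365-day
# survival per decile, the stratification examined in the study above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 200_000
risk = rng.beta(0.2, 19, n)                            # skewed risk, mean near 1%
died_by_365 = rng.random(n) < np.clip(risk * 3, 0, 1)  # synthetic 365-day deaths

df = pd.DataFrame({"risk": risk, "died": died_by_365})
df["decile"] = pd.qcut(df["risk"], 10, labels=False) + 1  # deciles 1 (lowest) .. 10

survival = 1 - df.groupby("decile")["died"].mean()
print(survival.round(4))
```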


Subject(s)
Hospitals, Veterans/statistics & numerical data, Hospitals, Veterans/standards, Risk Adjustment, Surgical Procedures, Operative/mortality, Female, Humans, Kaplan-Meier Estimate, Male, Preoperative Period, Retrospective Studies, Risk Factors, Surgical Procedures, Operative/standards, Survival Rate, Time Factors, Time-to-Treatment, United States/epidemiology, United States Department of Veterans Affairs
5.
JAMA Surg ; 151(4): 314-22, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26606675

ABSTRACT

IMPORTANCE: This study analyzes and reports Clostridium difficile infection (CDI) rates, risk factors, and associations with postoperative outcomes in the Veterans Health Administration (VHA). OBJECTIVE: To report 30-day postoperative CDI rates and outcomes and identify associated risks by surgical procedure and preoperative patient demographics in a large integrated health care system. DESIGN, SETTING, AND PARTICIPANTS: In a retrospective observational study conducted from September 2014 to April 2015, the Veterans Affairs Surgical Quality Improvement Program database and the Decision Support System pharmacy database were linked to analyze the association of postoperative CDI with patients' demographics, preoperative comorbidities, operative characteristics, and preoperative medications. The Veterans Affairs Surgical Quality Improvement Program assessments from October 1, 2009, to September 30, 2013, were investigated. The study was conducted at 134 VHA surgery programs, and the study population represents 12 surgical specialties: general, gynecological, neurosurgical, oral, orthopedic, otolaryngologic, plastic, podiatric, thoracic, transplant, urologic, and peripheral vascular. MAIN OUTCOMES AND MEASURES: Thirty-day postoperative CDI rates, risk factors for CDI, and the association of CDI with postoperative morbidity and mortality. RESULTS: Among 468,386 surgical procedures, the postoperative CDI rate was 0.4% per year and varied by VHA surgery program (0.0% to 1.4%) and surgical specialty (0.0% to 2.4%). Thirty-day CDI rates were higher in emergency procedures, procedures with greater complexity and higher relative value units, and those with a contaminated/infected wound classification. Patients with postoperative CDI were significantly older, were more frequently hospitalized after surgery (59.9% vs 15.4%), had longer preoperative hospital stays (9.1 days vs 1.9 days), and had received 3 or more antibiotic classes (1.5% vs 0.3% for a single antibiotic class) (all P < .001). Patients with CDI had higher rates of other postoperative morbidity (86.0% vs 7.1%) and 30-day mortality (5.3% vs 1.0%) and longer postoperative hospital stays (17.9 days vs 3.6 days). Independent risk factors for CDI included commonly identified patient factors (albumin, functional class, and weight loss), procedural characteristics (complexity, relative value units, emergency status, and wound classification), surgical program complexity, the number of preoperative antibiotic classes, and the length of preoperative hospital stay. CONCLUSIONS AND RELEVANCE: The number and class of antibiotics administered preoperatively, preoperative length of stay, procedural characteristics, surgical program complexity, and patient comorbidities are associated with postoperative CDI in the VHA.
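
A minimal sketch of the kind of database linkage described above, on made-up tables. The column names (case_id, drug_class) and the tiny example tables are illustrative assumptions, not the actual VASQIP or Decision Support System schema.

```python
# Sketch: link a surgical-case table to a pharmacy table and count distinct
# preoperative antibiotic classes per case, flagging the >= 3-class exposure.
import pandas as pd

cases = pd.DataFrame({"case_id": [1, 2, 3],
                      "specialty": ["general", "urologic", "thoracic"]})
rx = pd.DataFrame({
    "case_id": [1, 1, 1, 2],
    "drug_class": ["fluoroquinolone", "cephalosporin", "carbapenem", "cephalosporin"],
})

classes = (rx.groupby("case_id")["drug_class"].nunique()
             .rename("n_abx_classes").reset_index())
linked = cases.merge(classes, on="case_id", how="left")
linked["n_abx_classes"] = linked["n_abx_classes"].fillna(0).astype(int)

# High-risk exposure examined in the study: 3 or more antibiotic classes.
linked["three_plus_classes"] = linked["n_abx_classes"] >= 3
print(linked)
```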


Subject(s)
Clostridioides difficile/isolation & purification, Enterocolitis, Pseudomembranous/epidemiology, Hospitals, Veterans, Quality Improvement, Surgical Wound Infection/epidemiology, United States Department of Veterans Affairs, Veterans Health, Aged, Aged, 80 and over, Female, Follow-Up Studies, Humans, Male, Middle Aged, Morbidity/trends, Retrospective Studies, Risk Factors, Survival Rate/trends, United States/epidemiology
6.
Am J Surg ; 204(5): 626-30, 2012 Nov.
Article in English | MEDLINE | ID: mdl-22906244

ABSTRACT

BACKGROUND: The aim of this study was to examine the relationship between patient education level and 5-year mortality after major lower extremity amputation. METHODS: The records of all patients who underwent above-knee or below-knee amputation performed by the vascular surgery service at the Nashville Veterans Affairs Medical Center between January 2000 and August 2006 were retrospectively reviewed. Formal levels of education of the study patients were recorded. Outcomes were compared between patients who had completed high school and those who had not. Bivariate analysis using χ² and Student's t tests and multivariate logistic regression were performed. RESULTS: Five-year mortality for patients who had completed high school was lower than for those who had not (62.6% vs 84.3%, P = .001), even after adjusting for important clinical factors (odds ratio for death, 0.377; 95% confidence interval, 0.164-0.868; P = .022). CONCLUSION: Patients with less education have increased long-term mortality after lower extremity amputation.
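
A minimal sketch of an adjusted odds-ratio estimate of this kind (an exponentiated logistic regression coefficient with its 95% CI), fit to synthetic data. The covariates hs_grad and age are illustrative, and the simulation coefficients are chosen so the true odds ratio lands near the reported 0.377.

```python
# Sketch: logistic regression of 5-year death on high-school completion and
# age, reporting the odds ratio and 95% CI for completion.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300
hs_grad = rng.integers(0, 2, n)
age = rng.normal(65, 8, n)
logits = 1.2 - 1.0 * hs_grad + 0.02 * (age - 65)   # true OR for hs_grad ~ 0.37
died_5yr = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X = sm.add_constant(np.column_stack([hs_grad, age]))
fit = sm.Logit(died_5yr, X).fit(disp=0)

or_hs = np.exp(fit.params[1])              # exponentiated coefficient = OR
ci_lo, ci_hi = np.exp(fit.conf_int()[1])   # exponentiated 95% CI bounds
print(f"OR = {or_hs:.3f} (95% CI {ci_lo:.3f}-{ci_hi:.3f})")
```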


Subject(s)
Amputation, Surgical/mortality, Educational Status, Leg/surgery, Peripheral Arterial Disease/surgery, Aged, Amputation, Surgical/rehabilitation, Artificial Limbs, Chi-Square Distribution, Humans, Kaplan-Meier Estimate, Logistic Models, Middle Aged, Multivariate Analysis, Peripheral Arterial Disease/mortality, Recovery of Function, Retrospective Studies, Social Class, Walking
7.
Transplantation ; 80(2): 279-81, 2005 Jul 27.
Article in English | MEDLINE | ID: mdl-16041276

ABSTRACT

Serum sickness is an immune complex-mediated illness that frequently occurs in patients after polyclonal antibody therapy (ATGAM or thymoglobulin). Serum sickness presents with significant morbidity but is self-limited and resolves with prolonged steroid therapy. We present five renal transplant patients who developed serum sickness after polyclonal antibody treatment, with severe symptoms that persisted after systemic steroids were started. These patients underwent one or two courses of therapeutic plasma exchange (TPE) with subsequent complete resolution of their symptoms. Renal transplant recipients with serum sickness after polyclonal antibody therapy may benefit from TPE, which may accelerate recovery and thereby reduce overall morbidity.


Subject(s)
Antilymphocyte Serum/adverse effects, Kidney Transplantation/immunology, Plasma Exchange, Serum Sickness/immunology, Serum Sickness/therapy, Adult, Animals, Female, Horses, Humans, Immunosuppressive Agents/adverse effects, Male, Mice, Middle Aged, Rabbits, Tissue Donors
8.
Am J Surg ; 188(5): 571-4, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15546572

ABSTRACT

BACKGROUND: We sought to determine whether disparities in survival and health-related quality of life (HRQOL) occurred after solid organ transplantation at our institution. METHODS: Data were extracted from a database including information regarding transplants that took place from 1990 to 2002. HRQOL was assessed using the Karnofsky functional performance (FP) index and the Medical Outcomes Study Short Form 36 (SF-36) questionnaire. RESULTS: Data were collected on recipients of liver (n = 413), heart (n = 299), kidney (n = 892), and lung (n = 156) transplants. Blacks represented a minority of recipients: liver 7%, heart 8%, kidney 23%, and lung 6%. There were no statistically significant differences in patient survival between blacks and whites. Graft survival differed only for kidney transplants, with a 5-year graft survival of 72% for blacks versus 79% for whites (P < 0.001). FP and HRQOL improved (P < 0.05) after transplantation in both groups, with no differences between groups on measures of FP or HRQOL. CONCLUSIONS: Blacks had survival and improvement in FP and HRQOL comparable with those of whites.


Subject(s)
Black People/statistics & numerical data, Graft Rejection/ethnology, Organ Transplantation/ethnology, Quality of Life, White People/statistics & numerical data, Adult, Female, Graft Survival, Heart Transplantation/ethnology, Heart Transplantation/mortality, Heart Transplantation/standards, Humans, Kidney Transplantation/ethnology, Kidney Transplantation/mortality, Kidney Transplantation/standards, Liver Transplantation/ethnology, Liver Transplantation/mortality, Liver Transplantation/standards, Lung Transplantation/ethnology, Lung Transplantation/mortality, Lung Transplantation/standards, Male, Middle Aged, Organ Transplantation/mortality, Organ Transplantation/standards, Registries, Retrospective Studies, Survival Rate, Treatment Outcome, United States
9.
Am J Surg ; 188(5): 611-3, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15546581

ABSTRACT

BACKGROUND: A shortage of organ donors remains the major limiting factor in kidney transplantation. Living donor renal transplantation, especially from living-unrelated donors, may expand the donor pool by providing another source of excellent grafts. METHODS: Between 1983 and 2003, 109 living donor kidney transplants were performed. Potential donors were assessed with a standardized routine. Antithymocyte serum (N-ATS) and basiliximab were used as induction agents. Sandimmune, Gengraf, Neoral, and Prograf were the main immunosuppressants, together with Imuran, mycophenolate mofetil, and steroids. Eighty-two percent of the recipients were from out of state. RESULTS: Seventy-eight percent of the transplants were from living-related donors and 22% from living-unrelated donors. One- and three-year patient survival rates were 97.6% and 93.2%, with 1- and 3-year graft survival rates of 93.2% and 88.3%, respectively. There were 6 cases of delayed graft function (5.5%), 16 acute cellular rejections (10%), and 10 chronic rejections (9%). Twelve patients died, 7 of them with a functioning graft. In the past 6 years (1997-2003), the number of living donor kidney transplants surpassed that of deceased donor kidney transplants. CONCLUSIONS: Because of the limited number of cadaveric kidneys available for transplant, living donors represent a valuable source, and the use of living-unrelated donors has produced an additional supply of organs. In our program, the proportion of living donors used for kidney transplant is comparable with that of non-Veterans Administration programs, and the survival of these allografts appears to be superior to that of deceased donor kidney transplants.


Subject(s)
Kidney Transplantation/methods, Living Donors, Female, Follow-Up Studies, Graft Rejection, Graft Survival, Hospitals, Veterans, Humans, Kidney Transplantation/mortality, Kidney Transplantation/statistics & numerical data, Male, Postoperative Complications/epidemiology, Registries, Retrospective Studies, Risk Assessment, Survival Analysis, Treatment Outcome
10.
Am J Surg ; 188(5): 614-6, 2004 Nov.
Article in English | MEDLINE | ID: mdl-15546582

ABSTRACT

BACKGROUND: Native arteriovenous fistulas (AVFs) have been found to exhibit higher survival rates and lower complication rates than prosthetic arteriovenous grafts (AVGs). METHODS: Between August 2001 and December 2003, 93 patients with end-stage renal disease (ESRD) had primary dialysis access placed at a single Veterans Administration medical center. Of these 93 patients, 67 had AVFs created and 26 had AVGs implanted. RESULTS: The percentage of patients who did not require additional intervention was 84% (56 of 67) for AVFs and 78% (21 of 26) for AVGs after 4 to 31 months of follow-up evaluation. In the AVF group, repeat interventions were as follows: collateral ligation (4), angioplasty owing to central stenosis (2), AVF ligation owing to arterial steal phenomenon (1), and new AVF creation owing to clotting (1). Four AVFs were later converted to AVGs. In the AVG group, there were 4 venous anastomotic stenoses in 3 patients, which required angioplasty; 2 patients needed thrombectomy and revision, and 1 graft was removed because of infection. AVF prevalence in our dialysis patients was 63%, with 33% AVGs and 4% temporary catheters. CONCLUSIONS: The National Kidney Foundation-Dialysis Outcomes Quality Initiative (NKF-DOQI) guidelines for dialysis access reawakened interest in maximizing the use of native veins for AVFs. AVFs created using the patient's native vein provide the best vascular access for dialysis when compared with prosthetic grafts, with better long-term patency and fewer complications.


Subject(s)
Arteriovenous Shunt, Surgical/statistics & numerical data, Kidney Failure, Chronic/therapy, Renal Dialysis/methods, Aged, Arteriovenous Shunt, Surgical/methods, Cohort Studies, Female, Health Care Surveys, Hospitals, Veterans, Humans, Kidney Failure, Chronic/diagnosis, Male, Middle Aged, Prognosis, Renal Dialysis/mortality, Risk Assessment, Survival Analysis, Treatment Outcome
12.
Am J Surg ; 186(5): 476-80, 2003 Nov.
Article in English | MEDLINE | ID: mdl-14599610

ABSTRACT

BACKGROUND: Some previous studies suggested that transplantation performed in Department of Veterans Affairs (VA) patients was associated with a higher rate of complications and poorer outcomes. We examined more than a decade of experience with solid organ transplantation at a single center and compared VA patients with nonveteran patients to assess long-term patient and graft survival and health-related quality of life (HRQOL). METHODS: Demographic, clinical, and survival data were extracted from a database that included all transplants from January 1990 through December 2002 at Vanderbilt University Medical Center (non-VA) and the Nashville VA Medical Center (VA). HRQOL was assessed in a subset of patients using the Karnofsky functional performance (FP) index and the Short Form-36 (SF-36) self-report questionnaire. Data were analyzed by Kaplan-Meier survival and analysis of variance methods. RESULTS: One thousand eight hundred nine adult patients receiving solid organ transplants (1,896 grafts) between 1990 and 2002 were reviewed: 380 VA patients (141 liver, 54 heart, 183 kidney, 2 lung) and 1,429 non-VA patients (280 liver, 246 heart, 749 kidney, 154 lung). Mean follow-up time was 46 +/- 1 months. Five-year graft survival for VA and non-VA patients, respectively, was liver 65% +/- 5% versus 69% +/- 3% (P = 0.97); heart 73% +/- 8% versus 73% +/- 3% (P = 0.67); and kidney 76% +/- 5% versus 77% +/- 2% (P = 0.84). Five-year patient survival was liver 75% +/- 5% versus 78% +/- 3% (P = 0.94); heart 73% +/- 8% versus 74% +/- 3% (P = 0.75); and kidney 84% +/- 4% versus 87% +/- 2% (P = 0.21) for VA and non-VA, respectively. In the first 3 years after transplant, the FP scores for VA versus non-VA patients were 85 +/- 2 versus 87 +/- 1 (P = 0.50). The SF-36 mental component scales were 47 +/- 3 versus 49 +/- 1 (P = 0.39), and the SF-36 physical component scales were 37 +/- 2 versus 38 +/- 1 (P = 0.59), respectively. Longer-term (through year 7) HRQOL scores for VA versus non-VA patients were FP 85 +/- 1 versus 88 +/- 1 (P = 0.17); mental component scales 47 +/- 2 versus 49 +/- 1 (P = 0.29); and physical component scales 35 +/- 2 versus 39 +/- 1 (P = 0.05), respectively. CONCLUSIONS: Veteran patients had graft and patient survival similar to that of nonveteran patients. Overall quality of life was similar between veterans and nonveterans during the first three years after transplantation. A trend toward a later decline in the veterans' perception of their physical functioning may stem from the increased prevalence of hepatitis C virus among VA liver transplant recipients, a known factor reducing late HRQOL.
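
A minimal sketch of a Kaplan-Meier survival comparison of this kind, on synthetic follow-up times. The cohort sizes echo the study (380 VA, 1,429 non-VA), but the event times, the 5-year censoring rule, and the lifelines-based log-rank test are illustrative assumptions, not the authors' exact analysis.

```python
# Sketch: fit Kaplan-Meier curves for two cohorts and compare them with a
# log-rank test (lifelines is assumed to be installed).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)

def cohort(n, scale):
    t = rng.exponential(scale, n)   # synthetic months to death
    observed = t < 60               # administrative censoring at 5 years
    return np.minimum(t, 60), observed

t_va, e_va = cohort(380, 220)
t_nonva, e_nonva = cohort(1429, 230)

kmf = KaplanMeierFitter()
kmf.fit(t_va, e_va, label="VA")
print(f"VA 5-year survival: {kmf.survival_function_at_times(60).iloc[0]:.2f}")

result = logrank_test(t_va, t_nonva, event_observed_A=e_va, event_observed_B=e_nonva)
print(f"log-rank p = {result.p_value:.2f}")
```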


Subject(s)
Organ Transplantation/mortality, Quality of Life, Adult, Case-Control Studies, Cross-Sectional Studies, Databases, Factual, Female, Follow-Up Studies, Graft Survival, Hospitals, Veterans/statistics & numerical data, Humans, Karnofsky Performance Status, Male, Middle Aged, Organ Transplantation/psychology, Surveys and Questionnaires, Survival Analysis, Tennessee, Time Factors, United States, United States Department of Veterans Affairs, Veterans/psychology, Veterans/statistics & numerical data
13.
Am J Surg ; 186(5): 535-9, 2003 Nov.
Article in English | MEDLINE | ID: mdl-14599621

ABSTRACT

BACKGROUND: The current study was undertaken to identify factors specific to kidney transplantation that are associated with posttransplant functional performance (FP) and health-related quality of life (HRQOL). METHODS: Karnofsky FP status was assessed longitudinally in 86 adult kidney transplant recipients. Patients reported HRQOL using the Short Form-36 (SF-36) health survey and the Psychosocial Adjustment to Illness Scale (PAIS). RESULTS: FP improved (P <0.001) after kidney transplantation (from 75 +/- 1 to 77 +/- 1, 81 +/- 1, and 82 +/- 1 at 0, 3, 6, and 12 months, respectively). Patients receiving organs from living donors showed continued improvement through posttransplant year 1 while those receiving cadaveric organs stabilized at month 6 (simple interaction contrast, year 1 versus pretransplant; P <0.05). Patients receiving dialysis therapy for 6 months or more prior to transplantation demonstrated lower SF-36 posttransplant physical component scores in comparison with patients who were transplanted preemptively (38 +/- 1 versus 45 +/- 2, P <0.05). Path analysis demonstrated the positive direct effect of time on FP with kidney transplantation (beta = 0.23, P <0.05), and the negative direct effects on FP of diabetes (beta = -0.22) and cadaveric organs (beta = -0.22, both P <0.05). In turn, FP had a positive direct effect on HRQOL (beta = 0.40, P <0.001). CONCLUSIONS: Overall improvement in FP is attenuated 1 year after kidney transplantation in recipients of organs from cadaveric donors. The positive effect of time after transplantation, and the negative effects of cadaveric organs and diabetes on posttransplant HRQOL, are indirect and are mediated by the direct effects of these variables on posttransplant FP.
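
A minimal sketch of the path-analysis logic described above, estimating an indirect effect as the product of standardized direct effects. This two-equation mediation sketch on synthetic data is illustrative only, not the authors' exact path model; the simulated coefficients echo the reported betas (-0.22 for cadaveric organs on FP, 0.40 for FP on HRQOL).

```python
# Sketch: direct effects via standardized OLS coefficients; the indirect
# effect of donor type on HRQOL is the product a * b, mediated through FP.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 86
cadaveric = rng.integers(0, 2, n).astype(float)
fp = -0.22 * cadaveric + rng.normal(0, 1, n)   # functional performance
hrqol = 0.40 * fp + rng.normal(0, 1, n)        # health-related quality of life

def std_beta(y, x):
    """Standardized coefficient of x in a simple OLS of y on x."""
    z = lambda v: (v - v.mean()) / v.std()
    return sm.OLS(z(y), sm.add_constant(z(x))).fit().params[1]

a = std_beta(fp, cadaveric)   # donor type -> FP (direct)
b = std_beta(hrqol, fp)       # FP -> HRQOL (direct)
print(f"indirect effect of cadaveric donor on HRQOL: {a * b:.3f}")
```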


Subject(s)
Diabetes Complications, Kidney Transplantation, Quality of Life, Tissue Donors, Adult, Cadaver, Female, Humans, Karnofsky Performance Status, Kidney Transplantation/physiology, Kidney Transplantation/psychology, Male, Middle Aged, Time Factors
14.
Clin Transplant ; 17 Suppl 9: 31-4, 2003.
Article in English | MEDLINE | ID: mdl-12795665

ABSTRACT

BACKGROUND: The worsening shortage of cadaver donor kidneys has prompted the use of expanded or marginal donor kidneys (MDK), i.e., kidneys from older donors or donors with a history of hypertension or diabetes. MDK may be especially susceptible to calcineurin inhibitor (CI)-mediated vasoconstriction and nephrotoxicity. Similarly, early use of CI in patients with delayed graft function (DGF) may prolong ischemic injury. We developed a CI-free protocol of antibody induction, sirolimus, mycophenolate mofetil, and prednisone for recipients with MDK or DGF. METHODS: Adult renal transplant recipients who received MDK or had DGF were treated with a CI-free protocol consisting of antibody induction (basiliximab or thymoglobulin), sirolimus, mycophenolate mofetil, and prednisone. Serial biopsies were performed for persistent DGF. Patients were followed prospectively, with the primary endpoints being patient and graft survival, biopsy-proven acute rejection (AR), and sirolimus-related toxicity. RESULTS: Nineteen recipients were treated. Mean follow-up was 294 days. Actuarial patient survival was 100% at both 6 and 12 months, and graft survival was 93% at both time points. The only graft loss was due to primary non-function (PNF). The incidence of AR was 16%. Mean serum creatinine at last follow-up was 1.6 mg/dL. Sirolimus-related toxicity included lymphocele (1), wound infection (2), thrombocytopenia (1), and interstitial pneumonitis (1). CONCLUSION: A CI-free protocol with antibody induction and sirolimus results in low rates of AR and PNF and excellent early patient and graft survival in patients with MDK and DGF. CI-free protocols may allow expansion of the kidney donor pool by encouraging utilization of MDK at high risk for DGF or CI-mediated nephrotoxicity.


Subject(s)
Calcineurin Inhibitors, Graft Rejection/therapy, Graft Survival/drug effects, Immunosuppressive Agents/therapeutic use, Kidney Diseases/chemically induced, Kidney Transplantation/immunology, Mycophenolic Acid/analogs & derivatives, Adult, Aged, Clinical Protocols, Cyclosporine/adverse effects, Female, Graft Survival/immunology, Humans, Immunosuppressive Agents/adverse effects, Kidney Transplantation/methods, Male, Mycophenolic Acid/therapeutic use, Pilot Projects, Prednisone/therapeutic use, Recovery of Function/drug effects, Sirolimus/therapeutic use, Tacrolimus/adverse effects