Results 1 - 20 of 27
1.
Am J Transplant ; 18(8): 1977-1985, 2018 Aug.
Article in English | MEDLINE | ID: mdl-29446225

ABSTRACT

We aimed to evaluate the influence of urological complications occurring within the first year after kidney transplantation on long-term patient and graft outcomes, and sought to examine the impact of the management approach for ureteral strictures on long-term graft function. We collected data on urological complications occurring within the first year posttransplant. Graft survival, patient survival, and rejection rates were compared between recipients with and without urological complications. Male gender of the recipient, delayed graft function, and donor age were significant risk factors for urological complications after kidney transplantation (P < .05). Death-censored graft survival analysis showed that, among these complications, only ureteral strictures had a negative impact on long-term graft survival (P = .0009). Death-censored graft survival was significantly shorter in kidney recipients managed initially with a minimally invasive approach than in recipients with no stricture (P = .001), whereas graft survival in patients managed initially with open surgery was not statistically different from that in recipients with no stricture (P = .47). Ureteral strictures following kidney transplantation appear to have a strong negative association with long-term graft survival. Our analysis suggests that kidney recipients with ureteral stricture should be managed initially with open surgery, which was associated with better long-term graft survival.
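The death-censored graft survival comparison described above is the kind of analysis typically carried out with Kaplan-Meier estimates and a log-rank test. The sketch below, in Python with the lifelines library, uses entirely synthetic follow-up data and invented group sizes; it illustrates the general approach only and is not the authors' analysis or data.

# Illustrative Kaplan-Meier / log-rank sketch with synthetic data.
# Not the study's data; all values and group sizes are invented.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

# Synthetic follow-up times (years) and graft-loss indicators.
# Death-censored: death with a functioning graft is coded as censored (0).
t_stricture = rng.exponential(scale=8.0, size=40)      # recipients with ureteral stricture
e_stricture = rng.integers(0, 2, size=40)              # 1 = graft loss, 0 = censored
t_no_stricture = rng.exponential(scale=14.0, size=200)
e_no_stricture = rng.integers(0, 2, size=200)

kmf = KaplanMeierFitter()
kmf.fit(t_no_stricture, event_observed=e_no_stricture, label="no stricture")
print("median graft survival, no stricture:", kmf.median_survival_time_)

kmf.fit(t_stricture, event_observed=e_stricture, label="ureteral stricture")
print("median graft survival, stricture:", kmf.median_survival_time_)

# Log-rank test for a difference in death-censored graft survival.
res = logrank_test(t_stricture, t_no_stricture,
                   event_observed_A=e_stricture,
                   event_observed_B=e_no_stricture)
print(f"log-rank p-value: {res.p_value:.4f}")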


Subject(s)
Constriction, Pathologic/surgery , Delayed Graft Function/surgery , Graft Rejection/surgery , Graft Survival , Kidney Failure, Chronic/surgery , Kidney Transplantation/adverse effects , Ureteral Obstruction/surgery , Adult , Constriction, Pathologic/etiology , Constriction, Pathologic/pathology , Delayed Graft Function/etiology , Delayed Graft Function/pathology , Female , Follow-Up Studies , Graft Rejection/etiology , Graft Rejection/pathology , Humans , Male , Middle Aged , Patient Selection , Postoperative Complications , Prognosis , Retrospective Studies , Risk Factors , Survival Rate , Ureteral Obstruction/etiology , Ureteral Obstruction/pathology
2.
Am J Transplant ; 17(1): 191-200, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27375072

ABSTRACT

For donation after circulatory death (DCD), many centers allow up to 1 h from treatment withdrawal to donor death for kidney recovery. Our center has consistently allowed 2 h. We hypothesized that waiting longer would be associated with worse outcomes. A single-center, retrospective analysis of DCD kidneys transplanted between 2008 and 2013, as well as a nationwide survey of organ procurement organization DCD practices, was conducted. We identified 296 DCD kidneys, of which 247 (83.4%) were transplanted and 49 (16.6%) were discarded. Of the 247 recipients, 225 (group 1; 91.1%) received kidneys with a time to death (TTD) of 0-1 h, and 22 (group 2; 8.9%) received grafts with a TTD of 1-2 h. Five-year patient survival was 88.8% for group 1 and 83.9% for group 2 (p = 0.667). Graft survival was also similar, with 5-year survival of 74.1% for group 1 and 83.9% for group 2 (p = 0.507). The delayed graft function rate was the same in both groups (50.2% vs. 50.0%, p = 0.984). TTD was not predictive of graft failure. Nationally, the average maximum wait time for DCD kidneys was 77.2 min. By waiting 2 h for DCD kidneys, we performed 9.8% more transplants without worse outcomes. Nationally, this practice would allow for hundreds of additional kidney transplants annually.
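The delayed graft function comparison (50.2% vs. 50.0%) is a simple two-proportion contrast, and the "9.8% more transplants" figure follows from the group sizes. A minimal sketch is below; the counts are reconstructed approximately from the reported percentages and group sizes, and the choice of a chi-square test is an assumption for illustration.

# Two-proportion comparison of delayed graft function (DGF) rates,
# using counts reconstructed from the reported percentages
# (113/225 ~ 50.2%, 11/22 = 50.0%); illustrative only.
from scipy.stats import chi2_contingency

#                 DGF   no DGF
table = [[113, 225 - 113],   # group 1: time to death 0-1 h
         [ 11,  22 -  11]]   # group 2: time to death 1-2 h

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.3f}, p = {p:.3f}")

# The "9.8% more transplants" figure is the extra group 2 grafts
# relative to the group 1 total: 22 / 225.
extra = 22 / 225
print(f"additional transplants from waiting 2 h: {extra:.1%}")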


Subject(s)
Brain Death , Graft Rejection/prevention & control , Heart Arrest , Kidney Failure, Chronic/surgery , Tissue Donors/statistics & numerical data , Tissue and Organ Procurement/methods , Adult , Donor Selection , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Survival , Hospitals, High-Volume , Humans , Kidney Function Tests , Kidney Transplantation , Male , Middle Aged , Prognosis , Registries , Retrospective Studies , Risk Factors , Time Factors , Tissue and Organ Procurement/statistics & numerical data , United States
3.
Am J Transplant ; 13(11): 2945-55, 2013 Nov.
Article in English | MEDLINE | ID: mdl-24102905

ABSTRACT

Antibody-mediated rejection (AMR) after pancreas transplantation is a recently identified entity. We describe the incidence of, risk factors for, and outcomes after AMR, and the correlation of C4d immunostaining and donor-specific antibody (DSA) in the diagnosis of AMR. We retrospectively analyzed 162 pancreas transplants in 159 patients who underwent 94 pancreas allograft biopsies between 2006 and 2009. Univariate and multivariate analyses were performed to evaluate risk factors for pancreas graft AMR. One-year rejection rates and survival after rejection were calculated by Kaplan-Meier methods. AMR occurred in 10% of patients by 1 year posttransplant. Multivariate risk factors identified for AMR included nonprimary simultaneous pancreas-kidney (SPK) transplant, primary solitary pancreas (PAN) transplant, and race mismatch. After pancreas rejection, patient survival was 100%, but 20% (8 of 41) of pancreas grafts failed within 1 year. Graft survival after acute cellular rejection (ACR), AMR, and mixed rejection was similar. Of biopsies that stained >5% C4d, 80% were associated with increased class I DSA. In summary, AMR occurs at a measurable rate after pancreas transplantation, and the diagnosis should be actively sought using C4d staining and DSA levels in patients with graft dysfunction, especially after nonprimary SPK and primary PAN transplantation.


Subject(s)
Graft Rejection/etiology , Immunity, Cellular/immunology , Isoantibodies/immunology , Pancreas Transplantation/adverse effects , Postoperative Complications , Adult , Allografts , Complement C4b/immunology , Female , Follow-Up Studies , Graft Rejection/epidemiology , Graft Rejection/mortality , Graft Survival , Humans , Incidence , Male , Peptide Fragments/immunology , Prognosis , ROC Curve , Retrospective Studies , Risk Factors , Survival Rate , Wisconsin/epidemiology
4.
Am J Transplant ; 11(3): 500-10, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21342448

ABSTRACT

The role of humoral alloreactivity in ABO-compatible liver transplantation remains unclear. To understand the significance of donor-specific HLA alloantibodies (DSA) in liver rejection, we applied the strategy currently used to detect antibody-mediated rejection of other solid allografts. For this purpose we reviewed the data on 43 recipients of ABO-identical/compatible donor livers who had an indication liver biopsy stained for complement element C4d and a contemporaneous circulating DSA determination. Seventeen (40%) patients had significant circulating DSA in association with diffuse portal C4d deposition (DSA+/diffuse C4d+). These DSA+/diffuse C4d+ subjects had a higher frequency of acute cellular rejection (ACR) (15/17 vs. 13/26; 88% vs. 50%; p = 0.02) and of steroid-resistant rejection (7/17 vs. 5/26; 41% vs. 19%; p = 0.03). Based on detection of the DSA+/diffuse C4d+ combination, 53.6% of cases of ACR had evidence of concurrent humoral alloreactivity. Six of the 10 patients with ductopenic rejection had circulating DSA and diffuse portal C4d, three of whom (2 early and 1 late posttransplantation) developed unrelenting cholestasis necessitating specific antibody-depleting therapy to salvage the allografts. Thus, in ABO-compatible liver transplantation, humoral alloreactivity mediated by antibodies against donor HLA molecules appears to be frequently intertwined with cellular mechanisms of rejection and to play a role in the development of ductopenia.
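The ACR frequency comparison (15/17 vs. 13/26) is a 2x2 contingency contrast. The sketch below applies Fisher's exact test to the counts quoted in the abstract; the abstract does not name the test the authors used, so the choice of test here is an assumption for illustration.

# 2x2 comparison of acute cellular rejection (ACR) frequency,
# counts taken from the abstract: 15/17 vs. 13/26.
# Fisher's exact test is used for illustration only; the abstract
# does not state which statistical test the authors applied.
from scipy.stats import fisher_exact

#               ACR   no ACR
table = [[15, 17 - 15],   # DSA+ / diffuse C4d+ recipients
         [13, 26 - 13]]   # all other biopsied recipients

odds_ratio, p = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p:.3f}")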


Subject(s)
ABO Blood-Group System/immunology , Bile Duct Diseases/etiology , Graft Rejection/immunology , Histocompatibility Antigens Class I/immunology , Isoantibodies/blood , Liver Transplantation/immunology , Tissue Donors , Adolescent , Adult , Aged , Bile Duct Diseases/pathology , Complement C4b/immunology , Complement C4b/metabolism , Female , Flow Cytometry , Humans , Liver Transplantation/mortality , Male , Middle Aged , Peptide Fragments/immunology , Peptide Fragments/metabolism , Risk Factors , Transplantation, Homologous/immunology , Treatment Outcome , Young Adult
5.
Am J Transplant ; 8(8): 1702-10, 2008 Aug.
Article in English | MEDLINE | ID: mdl-18694474

ABSTRACT

Alemtuzumab is a humanized rat monoclonal antibody directed against the CD52 antigen. After binding, alemtuzumab causes profound and durable lymphocyte depletion and has been used successfully as immune induction therapy for organ transplantation. This was a single-center, retrospective review of patients who underwent simultaneous pancreas-kidney (SPK) transplantation at the University of Wisconsin using alemtuzumab induction therapy, compared with historical controls who received induction with basiliximab. There were no differences in donor or recipient demographics, rates of patient survival, renal or pancreas allograft survival, renal allograft delayed graft function, EBV infection, BKV infection, PTLD, or sepsis. There was a statistically significant increase in the incidence of cytomegalovirus (CMV) infection in the alemtuzumab-treated group. Given the significantly higher incidence of CMV infections, we have since altered our induction protocol to a single 30 mg dose of alemtuzumab instead of two doses. The long-term effects of this change remain to be seen. Because of the results seen in this study, the low initial cost of the drug, and the absence of any severe short-term side effects, alemtuzumab has been selected as the induction drug of choice at our center for patients undergoing SPK transplantation.


Subject(s)
Antibodies, Monoclonal/therapeutic use , Antibodies, Neoplasm/therapeutic use , Graft Survival , Immunosuppressive Agents/therapeutic use , Immunotherapy/methods , Recombinant Fusion Proteins/therapeutic use , Adult , Alemtuzumab , Antibodies, Monoclonal, Humanized , Antineoplastic Agents , Basiliximab , Female , Humans , Kidney Transplantation , Male , Middle Aged , Pancreas Transplantation , Retrospective Studies , Treatment Outcome
6.
Transplant Proc ; 40(2): 513-5, 2008 Mar.
Article in English | MEDLINE | ID: mdl-18374117

ABSTRACT

Preserving kidney function after solitary pancreas transplantation (SPTx) is an important consideration, yet various factors may negatively impact the long-term function of the native kidneys or a kidney allograft. To determine changes in kidney function over time after SPTx, we conducted a retrospective analysis tracking changes in serum creatinine (SCr) and calculated glomerular filtration rate (GFR) from baseline to 6 months, 1 year, and 3 years after SPTx in a series of pancreas after kidney transplants (PAK; n = 61) and pancreas transplants alone (PTA; n = 27) performed at our institution. Mean follow-up for the PAK and PTA groups was 3.4 and 2.7 years, respectively. In this series, 8% of patients developed significant kidney failure after SPTx, defined as either initiation of dialysis or receipt of a kidney transplant (PAK, 6; PTA, 1). Twenty-seven percent of SPTx patients with a baseline GFR < 60 developed an elevated SCr (> 2.2), required dialysis, or received a kidney transplant, whereas no patient with a baseline GFR > 60 developed significant kidney dysfunction. In the PAK group, GFR did not deteriorate significantly over time. In contrast to the relatively stable kidney function in PAK patients, PTA patients experienced significantly greater rates of decline over time: GFR in PTA patients decreased from 78 +/- 19 (range, 40 to 114) mL/min/1.73 m2 at baseline to 65 +/- 20 at 1 year (P = .006), while SCr increased from 1.03 +/- 0.25 mg/dL to 1.28 +/- 0.43 over the same period (P = .012). These data show that kidney function may deteriorate after SPTx and that proper patient selection may reduce the frequency of this complication.
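The baseline-to-1-year GFR change in the PTA group (78 +/- 19 to 65 +/- 20 mL/min/1.73 m2, P = .006) is a paired, within-patient comparison. The sketch below runs a paired t-test on synthetic paired measurements roughly matching the reported means and group size; the data and the assumption that a paired t-test was the test used are ours, not the authors'.

# Paired comparison of GFR at baseline vs. 1 year after pancreas transplant alone.
# Synthetic paired data approximating the reported means/SDs; illustrative only.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n = 27                                           # PTA group size from the abstract
gfr_baseline = rng.normal(loc=78, scale=19, size=n)
gfr_1yr = gfr_baseline - rng.normal(loc=13, scale=15, size=n)  # simulated decline

t_stat, p = ttest_rel(gfr_baseline, gfr_1yr)
print(f"mean change = {np.mean(gfr_1yr - gfr_baseline):.1f} mL/min/1.73 m2")
print(f"paired t = {t_stat:.2f}, p = {p:.4f}")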


Subject(s)
Kidney Function Tests , Pancreas Transplantation/physiology , Analysis of Variance , Follow-Up Studies , Humans , Immunosuppression Therapy/methods , Kidney Transplantation/immunology , Kidney Transplantation/physiology , Pancreas Transplantation/immunology , Retrospective Studies
7.
Transplant Proc ; 40(1): 219-23, 2008.
Article in English | MEDLINE | ID: mdl-18261591

ABSTRACT

Morphologic characteristics of the graft have been proposed as a major contributor to long-term outcomes in orthotopic liver transplantation (OLT). Our objective was to determine the impact of donor variables, including donor age, donor-recipient HLA match, and type of donation (donation after cardiac death [DCD] vs donation after brain death [DBD]), on the outcome of OLT in 192 patients with hepatitis C virus (HCV). Fourteen patients underwent OLT from DCD donors and 188 from DBD donors. Mean donor age, warm ischemia time at recovery, and cold ischemia time were similar between the groups. Overall graft survival at 1 year (55% DCD vs 85% DBD) and 5 years (46% DCD vs 78% DBD) was significantly lower in the DCD group (P = .0003). Similarly, patient survival at 1 year (62% DCD vs 93% DBD) and 5 years (62% DCD vs 82% DBD) was significantly lower in the DCD group (P = .0295). Incidences of hepatic artery thrombosis, portal vein thrombosis, and primary nonfunction were similar between the DCD and DBD groups. The incidence of liver abscess with ischemic-type biliary stricture was higher in recipients of DCD grafts than of DBD grafts (42% vs 2%). A trend toward lower graft survival was noted for recipients of grafts from donors older than 60 years of age in this HCV population (P = .07), with significantly lower patient survival (P = .02). Donor-recipient HLA matching did not appear to correlate with OLT outcome in patients with HCV. DCD donors and donors older than 60 years of age significantly affect patient and graft survival. The lower graft and patient survival in recipients of DCD grafts does not appear to be related to early disease recurrence.


Subject(s)
Hepatitis C/surgery , Liver Transplantation/physiology , Tissue Donors/statistics & numerical data , Adult , Alanine Transaminase/blood , Aspartate Aminotransferases/blood , Cadaver , Female , Graft Survival , Humans , Liver Function Tests , Liver Transplantation/mortality , Living Donors , Male , Middle Aged , Recurrence , Retrospective Studies , Survival Analysis , Time Factors , Treatment Outcome
8.
Pediatr Transplant ; 11(6): 661-70, 2007 Sep.
Article in English | MEDLINE | ID: mdl-17663691

ABSTRACT

Developments in surgical technique, immunosuppression, organ procurement and preservation, and patient selection criteria have resulted in improved long-term patient and graft survival after pediatric liver transplantation. In this study, we examined the results of 196 liver transplants performed in 155 pediatric patients at the University of Wisconsin Children's Hospital. Patients were divided into two groups according to age at the time of liver transplant: infants under 12 months of age comprised Group 1 (n=74), and children from one to 18 yr comprised Group 2 (n=122). Outcomes for whole, reduced-size, and split liver transplantation were compared between infants and children. Biliary atresia was the most common indication in both groups. Patients underwent 128 whole, 50 reduced-size, and 18 split liver transplants. Forty-one retransplantations were performed, in 14 infants (18.9%) and in 27 children (22.1%). One hundred eleven patients (56.6%) had one or more rejection episodes [37 infants (50.0%) and 74 children (60.6%)]. Thirty-nine patients (19.8%) developed CMV infections, 42 (21.4%) developed EBV infections, and 14 developed PTLD (six infants and eight children). Thirty-six patients (18.3%) developed hepatic artery thrombosis (HAT). Seven patients (4.5%) developed malignancy (one infant and six children). Of the 155 patients, 33 (21.3%) died during the study period. The most common etiologies of mortality were central nervous system pathology (n=7; 4.5%), sepsis (n=6; 3.8%), and cardiac causes (n=6; 3.8%). One-, five-, and 10-yr actuarial patient survival was 86, 79, and 74% in infants and 90, 83, and 80% in children, respectively. Graft survival at one, five, and 10 yr was 77, 73, and 71% in infants and 88, 81, and 78% in children, respectively. Despite its technical challenges, liver transplantation in pediatric patients with end-stage liver disease yields excellent outcomes and significant long-term patient and graft survival.


Subject(s)
Liver Transplantation , Adolescent , Age Factors , Child , Child, Preschool , Follow-Up Studies , Graft Survival , Hospitals, University , Humans , Infant , Liver Transplantation/adverse effects , Liver Transplantation/mortality , Survival Rate , Time Factors , Treatment Outcome , Wisconsin
9.
Transplant Proc ; 38(10): 3685-8, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17175367

ABSTRACT

BACKGROUND: It remains unclear which donor and recipient factors influence long-term allograft function in lung transplantation (LTx). METHODS: From October 1988 to February 2005, a total of 280 recipients underwent LTx at our center. Donor data and cause of death (CoD) were analyzed, with CoD categorized according to the rate of increase in intracranial pressure (ICP) at the time of death. Each donor and recipient factor was correlated with long-term graft function. Recipient details, type of transplant, indication for transplant, and time on the waiting list were analyzed. Recipients were stratified by allograft ischemia time (AIT): 0 to 6, 6 to 8, 8 to 10, and >10 hours. RESULTS: Mean donor age was 30.9 years (36.7% male); 49.8% of donors were cytomegalovirus (CMV) positive. Donor CoD was characterized by a slow rise in ICP in 34.4%, a rapid rise in 18.7%, an intermediate rise in 44.3%, and no rise in 2.6%. A graft survival benefit was seen with female donors (P = .048); 34.4% of recipients ultimately developed graft failure at long-term follow-up. Mean recipient age was 48 years, 63% were male, and mean body mass index (BMI) was 23.6; 60.2% underwent single lung transplantation, and mean waiting list time was 323 days. Mean AIT was 421 minutes. Graft survival was longer with an AIT of 8 to 10 hours than with 6 to 8 hours (P = .03). CONCLUSIONS: Donor factor analysis implied that only female donor status conferred a long-term graft survival advantage. Differences in the rate of ICP rise appear clinically unimportant. Neither prolonged cold ischemic time (>10 hours) nor low recipient BMI adversely affected allograft function in our review.


Subject(s)
Graft Survival/physiology , Lung Transplantation/physiology , Tissue Donors/statistics & numerical data , Adult , Cause of Death , Cytomegalovirus Infections/epidemiology , Cytomegalovirus Infections/transmission , Factor Analysis, Statistical , Female , Humans , Lung Diseases/classification , Lung Diseases/surgery , Lung Transplantation/mortality , Male , Middle Aged , Retrospective Studies , Survival Analysis , Transplantation, Homologous/physiology
10.
J Orthop Res ; 19(4): 565-72, 2001 Jul.
Article in English | MEDLINE | ID: mdl-11518263

ABSTRACT

Changes in the expression of type III alpha1-collagen and myosin II heavy chains were characterized in rabbit skeletal muscle following a single stretch injury, using quantitative reverse transcription-polymerase chain reaction. Collagen III expression was highly elevated in the injured leg compared with the control limb, both at the myotendinous junction and in the distal muscle belly. Whereas the upregulation of collagen III expression at the myotendinous junction was maximal on day 1, collagen III expression in the distal muscle belly was unchanged on day 1 but highly elevated by day 3. Over the initial 7-day period, there was on average a 94% increase in collagen III expression at the myotendinous junction and a 42% increase in the distal muscle belly. In contrast, expression of myosin II isoforms differed little between limbs and was, in fact, slightly lower in the injured leg than on the control side. Immunohistochemical analysis of injured muscle showed significant collagen III deposition at the myotendinous junction beginning on day 3 post-injury and still evident by day 14. Focal deposits of type I and III collagen were first apparent in the distal muscle belly by day 3 and striking by day 7. Taken together, the data suggest the formation of a connective tissue scar at the injury site and the absence of significant muscle regeneration following muscle stretch injury. Furthermore, microinjuries distant from the primary site of injury may result in more general muscle fibrosis and scarring.
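Relative expression changes from quantitative RT-PCR, such as the 94% increase in collagen III at the myotendinous junction, are commonly computed with the 2^(-ddCt) (Livak) method. The sketch below is a generic illustration with invented Ct values and a hypothetical reference gene; the abstract does not state the quantification model or normalization scheme the authors used.

# Generic 2^(-ddCt) relative-expression calculation for qRT-PCR data.
# Ct values and the reference gene are invented for illustration only.

def relative_expression(ct_target_injured, ct_ref_injured,
                        ct_target_control, ct_ref_control):
    """Fold change of the target gene in injured vs. control muscle (Livak method)."""
    d_ct_injured = ct_target_injured - ct_ref_injured    # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_injured - d_ct_control                  # injured relative to control
    return 2 ** (-dd_ct)

# Example: hypothetical collagen III Ct values normalized to a reference gene.
fold = relative_expression(ct_target_injured=22.1, ct_ref_injured=18.0,
                           ct_target_control=23.0, ct_ref_control=18.0)
print(f"collagen III fold change (injured vs. control): {fold:.2f}")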


Subject(s)
Collagen/genetics , Muscle, Skeletal/injuries , Muscle, Skeletal/physiology , Myosin Heavy Chains/genetics , Wound Healing/physiology , Animals , Collagen/analysis , Fibroblasts/physiology , Gene Expression/physiology , Male , Muscle Fibers, Skeletal/cytology , Muscle Fibers, Skeletal/physiology , Muscle, Skeletal/cytology , Myosin Heavy Chains/analysis , RNA, Messenger/metabolism , Rabbits , Reverse Transcriptase Polymerase Chain Reaction , Tendons/cytology , Tendons/physiology
11.
Otolaryngol Head Neck Surg ; 125(1): 44-8, 2001 Jul.
Article in English | MEDLINE | ID: mdl-11458213

ABSTRACT

OBJECTIVE: To determine the effect of nasal irrigation on sinonasal symptoms. STUDY DESIGN AND SETTING: A total of 150 adult subjects with chronic sinusitis symptoms were recruited from the community and assigned to 1 of 3 treatment groups: nasal irrigation with a bulb syringe, nasal irrigation with a nasal irrigation pot, or control treatment with reflexology massage. Groups 1 and 2 performed daily hypertonic saline irrigation with 1 device for 2 weeks and then with the other device for 2 weeks. Group 3 performed reflexology massage daily for 2 weeks. Prospective data collected included the pretreatment Medical Outcomes Study Short Form, pretreatment and posttreatment Rhinosinusitis Outcomes Measure, daily medication use, subjective treatment efficacy, and preference of irrigation method. RESULTS: There was a significant and equivalent improvement in Rhinosinusitis Outcomes Measure 31 score after 2 weeks of intervention in each treatment group; 35% of subjects reported decreased use of sinus medication. CONCLUSION: Daily nasal irrigation, whether with a bulb syringe or a nasal irrigation pot, and daily reflexology massage were equally efficacious, resulting in improvement in the symptoms of chronic sinusitis in over 70% of subjects. Medication usage decreased in approximately one third of participants regardless of intervention.
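The finding of an equivalent improvement across the three groups can be illustrated with a one-way ANOVA on pre-to-post change scores (the subject list for this entry mentions analysis of variance, but the abstract does not state the exact model, so the sketch below is an assumption). The data are synthetic and sized arbitrarily, not the trial's.

# One-way ANOVA on pre-to-post change in Rhinosinusitis Outcomes Measure scores
# across the three intervention groups. Synthetic data; the abstract reports an
# equivalent improvement in all groups, so similar group means are simulated.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(2)
change_bulb = rng.normal(loc=-12, scale=8, size=50)     # bulb syringe irrigation
change_pot = rng.normal(loc=-11, scale=8, size=50)      # nasal irrigation pot
change_massage = rng.normal(loc=-11, scale=8, size=50)  # reflexology massage

f_stat, p = f_oneway(change_bulb, change_pot, change_massage)
print(f"F = {f_stat:.2f}, p = {p:.3f}  # a large p suggests no between-group difference")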


Subject(s)
Massage , Rhinitis/therapy , Therapeutic Irrigation/methods , Adult , Analysis of Variance , Chronic Disease , Female , Follow-Up Studies , Humans , Male , Patient Satisfaction , Probability , Prospective Studies , Rhinitis/diagnosis , Statistics, Nonparametric , Therapeutic Irrigation/instrumentation , Treatment Outcome
12.
J Urol ; 165(6 Pt 2): 2377-9, 2001 Jun.
Article in English | MEDLINE | ID: mdl-11371981

ABSTRACT

PURPOSE: The long-term success and efficacy of the artificial urinary sphincter for the management of neurogenic incontinence have been well documented. We evaluated whether long-term results were affected by patient age at the time of sphincter placement. MATERIALS AND METHODS: A retrospective review was conducted of the medical records of patients who underwent artificial urinary sphincter placement and had a minimum of 10 years of followup. All patients with an intact sphincter were interviewed to assess current results. Patients were stratified into groups 1 and 2 if the sphincter was implanted before or after age 11 years, respectively, and the results were compared statistically. RESULTS: An artificial urinary sphincter was placed in 45 children at Children's Hospital of Michigan between October 1978 and August 1986, and medical records and followup were available for 32. Mean followup was 15.4 years. Of the 21 group 1 patients, 12 (57%) have an intact sphincter after 26 revisions; all are dry and 9 (75%) require intermittent catheterization. Of the 11 group 2 patients, 7 (64%) have an intact sphincter; 6 (86%) are dry, 3 (43%) perform intermittent catheterization, and 6 required 8 revisions. There was no statistically significant difference in the number of artificial urinary sphincter removals, continence, revision rate, bladder augmentations, complications, or upper tract changes. CONCLUSIONS: The artificial urinary sphincter is a successful and durable option for the surgical management of neurogenic incontinence. Long-term results appear independent of patient age at the time of sphincter placement.


Subject(s)
Urinary Bladder, Neurogenic/surgery , Urinary Incontinence/surgery , Urinary Sphincter, Artificial , Adolescent , Age Factors , Child , Female , Humans , Male , Retrospective Studies , Treatment Outcome , Urinary Bladder/surgery , Urinary Bladder, Neurogenic/complications , Urinary Incontinence/etiology
15.
Clin J Sport Med ; 10(4): 239-44, 2000 Oct.
Article in English | MEDLINE | ID: mdl-11086748

ABSTRACT

OBJECTIVE: The purpose of this study was to determine whether a preseason measurement of balance in a unilateral stance could predict susceptibility to ankle injury in a cohort of high school basketball players. Predicting risk for ankle injury could be important in helping to reduce these injuries and to save health care costs. DESIGN: Cohort study. SETTING: Data were collected at five high schools during the first 2 weeks of the 1997-1998 and 1998-1999 basketball seasons. SUBJECTS: 210 high school basketball players (119 male, age = 16.1 +/- 1.1 yr; height = 182.98 +/- 7.4 cm; weight = 76.4 +/- 10.9 kg; and 91 female, age = 16.3 +/- 1.3 yr; height = 170.9 +/- 7.8 cm; weight = 63.4 +/- 8.4 kg) who had not sustained a time-loss ankle or knee injury within the previous 12 months served as subjects. Subjects did not use prophylactic ankle taping or bracing during the season. ASSESSMENT OF RISK FACTORS: Balance was quantified from postural sway scores measured while subjects performed unilateral balance tests with eyes open and eyes closed. Logistic regression analysis was carried out to determine whether gender, dominant leg, and balance scores were related to ankle sprain injuries. In addition, Fisher's exact test was used to determine whether the rate of ankle injuries was the same whether the subject had poor, average, or good balance. Balance was assessed by measuring postural sway with the NeuroCom New Balance Master version 6.0 (NeuroCom International, Clackamas, OR, U.S.A.). Postural sway testing consisted of having subjects stand on one leg for three trials of 10 seconds with their eyes open, and then with their eyes closed; subjects then underwent the same assessment while standing on the other leg. Postural sway was defined as the average degrees of sway per second (degrees S/S) across the 12 trials, producing a compilation (COMP) score. OUTCOME MEASURES: Ankle injury resulting in missed participation. RESULTS: Subjects who sustained ankle sprains had a preseason COMP score of 2.01 +/- 0.32 (mean +/- SD), while athletes who did not sustain ankle injuries had a score of 1.74 +/- 0.31. Higher postural sway scores corresponded to increased ankle sprain injury rates (p = 0.001). Subjects who demonstrated poor balance (high sway scores) had nearly seven times as many ankle sprains as subjects with good balance (low sway scores) (p = 0.0002). CONCLUSION: In this cohort of high school basketball players, a preseason balance measurement (postural sway) served as a predictor of ankle sprain susceptibility.
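A logistic regression relating the preseason COMP sway score to subsequent ankle sprain, as described above, can be sketched as follows with statsmodels; the data, coefficients, and variable names are synthetic and illustrative, not the study's.

# Logistic regression of ankle sprain (1 = sprain during season) on the
# preseason postural-sway COMP score. Synthetic data; illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 210
comp = rng.normal(loc=1.8, scale=0.35, size=n)          # sway score, degrees/s
# Simulate a higher sprain probability with higher sway (worse balance).
p_sprain = 1 / (1 + np.exp(-(-6.0 + 2.5 * comp)))
sprain = rng.binomial(1, p_sprain)

X = sm.add_constant(comp)                               # intercept + COMP score
model = sm.Logit(sprain, X).fit(disp=False)
print(model.summary())
print("odds ratio per 1 degree/s of sway:", float(np.exp(model.params[1])))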


Subject(s)
Ankle Injuries/physiopathology , Ankle Joint/physiology , Basketball/injuries , Postural Balance/physiology , Sprains and Strains/physiopathology , Adolescent , Cohort Studies , Female , Humans , Joint Instability/physiopathology , Logistic Models , Male , Posture/physiology , Predictive Value of Tests , Risk Factors
16.
Transplantation ; 70(5): 780-3, 2000 Sep 15.
Article in English | MEDLINE | ID: mdl-11003357

ABSTRACT

BACKGROUND: Advances in perioperative care and immunosuppression have enabled clinicians to broaden the indications for organ transplantation. Advanced age is no longer considered a contraindication to transplantation at most centers. Although short-term studies of elderly liver transplant recipients have demonstrated that the incidence of complications and overall patient survival are similar to those of younger adults, transplant center-specific, long-term data are not available. METHODS: From August of 1984 to September of 1997, 91 patients 60 years of age or older received primary liver transplants at the University of Wisconsin, Madison. This group of patients was compared with a group of younger adults (n=387) ranging in age from 18 to 59 years who received primary liver transplants during the same period. The most common indications for transplantation in both groups were Laennec's cirrhosis, hepatitis C, primary biliary cirrhosis, primary sclerosing cholangitis, and cryptogenic cirrhosis. There was no difference in the preoperative severity of illness between the groups. RESULTS: The length of hospitalization was the same for both groups, and there were no significant differences in the incidence of rejection, infection (surgical or opportunistic), repeat operation, readmission, or repeat transplantation between the groups. The only significant difference identified between the groups was long-term survival. Five-year patient survival was 52% in the older group and 75% in the younger group (P<0.05). Ten-year patient survival was 35% in the older group and 60% in the younger group (P<0.05). The most common cause of late mortality in elderly liver recipients was malignancy (35.0%), whereas most of the young adult deaths were the result of infectious complications (24.2%). CONCLUSION: Although older recipients at this center did as well as younger recipients in the early years after liver transplantation, long-term survival results were not as encouraging.


Subject(s)
Liver Transplantation/mortality , Aged , Aging/physiology , Cause of Death , Female , Graft Rejection/epidemiology , Humans , Liver Transplantation/immunology , Male , Survival Rate/trends , Survivors
17.
Radiology ; 216(3): 865-71, 2000 Sep.
Article in English | MEDLINE | ID: mdl-10966724

ABSTRACT

PURPOSE: To determine the imaging characteristics of a new computed tomographic (CT) contrast material with both hepatocyte-selective and blood-pool components (iodinated triglyceride [ITG]-dual) versus standard iohexol. MATERIALS AND METHODS: VX2 carcinoma was inoculated in seven rabbits. Animals underwent nonenhanced, iohexol-enhanced (600 mg of iodine per kilogram of body weight), and ITG-dual-enhanced (blood-pool moiety, 100 mg of iodine per kilogram; hepatocyte-selective moiety, 100 or 200 mg of iodine per kilogram, injected 90 minutes apart) helical CT. Livers were removed, preserved in formalin, suspended in agar, and sectioned transversely at 3-mm intervals. Attenuation values for normal liver and tumors were obtained, and blinded readers evaluated images for lesions by using a modified free-response receiver operating characteristic (ROC) method. RESULTS: A total of 47 separate tumor sites were detected at pathologic examination. ITG-dual-enhanced scans obtained with 300 mg of iodine per kilogram demonstrated liver opacification similar to that of iohexol-enhanced scans obtained with 600 mg of iodine per kilogram, but with less lesion enhancement, which resulted in better liver-to-lesion contrast. Blinded readers had higher sensitivity, accuracy, and area under the ROC curve for ITG-dual-enhanced scans than for iohexol-enhanced scans (P < .01). CONCLUSION: ITG-dual-enhanced CT quantitatively and qualitatively improved liver lesion detection versus iohexol-enhanced CT. Future clinical trials with various human tumor types, after potential approval for human use, are needed to determine the ultimate role of this or other dual-mechanism contrast materials.
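The reader-performance comparison rests on an ROC analysis. The conventional ROC/AUC portion can be sketched with scikit-learn as below, using synthetic reader confidence scores and lesion labels rather than the study's readings; note that the modified free-response ROC method the authors used requires dedicated FROC analysis, which this sketch does not attempt.

# Conventional ROC sketch for blinded-reader lesion detection, with synthetic
# confidence ratings. Illustrates only the ROC/AUC idea, not FROC methodology.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(4)

# 1 = true lesion site, 0 = lesion-free site, with reader confidence scores 0-100.
truth = np.concatenate([np.ones(47), np.zeros(100)])
scores = np.concatenate([
    rng.normal(loc=70, scale=15, size=47),    # scores assigned at true lesion sites
    rng.normal(loc=40, scale=15, size=100),   # scores assigned at lesion-free sites
])

fpr, tpr, thresholds = roc_curve(truth, scores)
print(f"area under the ROC curve: {auc(fpr, tpr):.3f}")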


Subject(s)
Contrast Media , Iohexol , Liver Neoplasms, Experimental/diagnostic imaging , Tomography, X-Ray Computed , Triglycerides , Animals , Humans , Liver/diagnostic imaging , Liver/pathology , Liver Neoplasms, Experimental/pathology , Rabbits , Sensitivity and Specificity
18.
Kidney Int ; 57(5): 2129-35, 2000 May.
Article in English | MEDLINE | ID: mdl-10792634

ABSTRACT

BACKGROUND: Diabetic renal disease continues to be the most significant cause of end-stage renal disease (ESRD) in the United States. Renal transplantation improves the survival of diabetic ESRD patients; however, the diabetic state remains associated with poor patient survival. Simultaneous pancreas-kidney (SPK) transplantation can restore normoglycemia and thus may improve outcomes. METHODS: We assessed the impact of SPK on age-range-matched type 1 diabetic patients who underwent renal transplantation at a single center. The observed/expected life span and annual mortality rates (AMRs) were used as measures of survival. A Cox proportional hazards analysis was used to analyze the impact of potential variables on mortality in SPK recipients. RESULTS: SPK transplantation (N = 335) increased the observed/expected life span compared with diabetic cadaveric (DM-Cad, N = 147) and live-donor (DM-Live, N = 160) transplant recipients (P = 0.004) and significantly reduced the AMR (SPK, 1.5%; DM-Cad, 6.27%; DM-Live, 3.65%; P = 0.008, SPK vs. other DM). Moreover, the SPK observed/expected life span and AMR were not significantly different from those of age-range-matched nondiabetic transplant recipients (N = 492). The only variable significantly associated with patient survival was discharge serum creatinine (relative risk 1.16, P ≤ 0.0154). CONCLUSION: These data demonstrate that SPK improves the ability of type 1 diabetic patients to live more of their expected life span. This suggests that glycemic control, even as a late intervention in a diabetic patient's lifetime, may beneficially affect survival.
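The Cox proportional hazards analysis of mortality in SPK recipients, in which discharge serum creatinine emerged as the only significant covariate, can be sketched with the lifelines library as below; the data frame, covariates, and any resulting coefficients are synthetic and for illustration only, mirroring just the structure of the analysis.

# Cox proportional hazards sketch for post-transplant mortality,
# with synthetic covariates; not the study's data or model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 335                                            # SPK group size from the abstract
df = pd.DataFrame({
    "years_followup": rng.exponential(scale=10.0, size=n),
    "died": rng.integers(0, 2, size=n),            # 1 = death, 0 = censored
    "discharge_creatinine": rng.normal(1.5, 0.6, size=n),   # mg/dL, invented
    "recipient_age": rng.normal(38, 7, size=n),             # years, invented
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_followup", event_col="died")
print(cph.summary[["coef", "exp(coef)", "p"]])     # exp(coef) = hazard ratio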


Subject(s)
Diabetes Mellitus, Type 1/surgery , Diabetic Nephropathies/surgery , Kidney Failure, Chronic/surgery , Kidney Transplantation , Pancreas Transplantation , Adult , Diabetes Mellitus, Type 1/mortality , Diabetic Nephropathies/mortality , Female , Humans , Kidney Failure, Chronic/mortality , Male
19.
Transplantation ; 69(5): 799-805, 2000 Mar 15.
Article in English | MEDLINE | ID: mdl-10755529

ABSTRACT

BACKGROUND: Despite the long history of antithymocyte globulin (ATG) use in renal transplantation, the ideal doses and duration of ATG administration based on T-lymphocyte monitoring have yet to be defined. METHODS: Two immunosuppressive regimens based on low-dose rabbit ATG (Thymoglobuline; Imtix-SangStat, Lyon, France) were assessed during the first year after transplantation: daily ATG (DATG; n=23), in which 50 mg of ATG was given every day, and intermittent ATG (IATG; n=16), in which similar doses of ATG were given for the first 3 days and then intermittently, only if CD3+ T lymphocytes (measured by flow cytometry) were > 10/mm3. Both groups received steroids, azathioprine, and cyclosporine. RESULTS: ATG-induced depletion was similar for peripheral blood lymphocytes and T cells in both groups: it began at day 1 after transplantation, was submaximal at day 3, and reached maximum intensity between days 6 and 8, after which cell counts progressively increased. However, T-cell depletion was still present at day 20. The total ATG dose per patient (381.5+/-121 vs. 564+/-135 mg/patient) and the mean cumulative daily dose of ATG (0.60+/-0.17 vs. 0.80+/-0.14 mg/kg/day) were significantly lower in the IATG group (P=0.0001 and 0.0006, respectively). The overlap of ATG and cyclosporine treatment was 6.7+/-3 vs. 7.4+/-4.3 days (P=NS), and the mean duration of ATG therapy was 11.3+/-3.2 vs. 11.6+/-2.7 days in the IATG and DATG groups, respectively (P=NS). ATG was given on average as one dose every 1.6 days in the IATG group, compared with one dose daily in the DATG group (P = 7 x 10^-7). There was no significant difference in renal graft function, the number of acute graft rejections, or ATG-related side effects and complications. Despite the daily immunological follow-up, there was a net saving of $760/patient in the cost of treatment in the IATG group. CONCLUSION: IATG had the advantage of a reduced ATG dose and treatment cost while offering similar T-cell depletion and effective immunosuppression. This approach could be proposed as an induction protocol, particularly for patients with poor graft function in whom cyclosporine introduction has to be delayed, or for those at increased risk of cytomegalovirus infections or secondary malignancies.
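The IATG arm re-doses only when the flow-cytometry CD3+ count exceeds 10 cells/mm3 after the first 3 daily doses. A minimal sketch of that decision rule is below; the dose, threshold, and fixed initial period are taken from the abstract, but the helper function itself is hypothetical and purely illustrative, not a clinical protocol.

# Decision rule for the intermittent ATG (IATG) arm described in the abstract:
# 50 mg daily for the first 3 days, then 50 mg only on days when the CD3+ count
# exceeds 10 cells/mm3. Hypothetical helper for illustration; not clinical guidance.

CD3_THRESHOLD = 10      # cells/mm3, from the abstract
DAILY_DOSE_MG = 50      # mg, from the abstract
INITIAL_DAYS = 3        # fixed daily dosing period, from the abstract

def iatg_dose_for_day(day, cd3_count):
    """Return the ATG dose (mg) for a given post-transplant day (1-indexed)."""
    if day <= INITIAL_DAYS:
        return DAILY_DOSE_MG
    return DAILY_DOSE_MG if cd3_count > CD3_THRESHOLD else 0

# Example course: daily CD3+ counts (invented values).
cd3_by_day = [120, 40, 15, 8, 12, 6, 11]
total = sum(iatg_dose_for_day(d, c) for d, c in enumerate(cd3_by_day, start=1))
print(f"cumulative ATG over {len(cd3_by_day)} days: {total} mg")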


Subject(s)
Antilymphocyte Serum/administration & dosage , Immunosuppressive Agents/administration & dosage , Kidney Transplantation , Adult , Aged , Animals , Antilymphocyte Serum/adverse effects , Antilymphocyte Serum/therapeutic use , Blood Cells/pathology , Clonal Deletion , Cyclosporine/adverse effects , Cyclosporine/therapeutic use , Dose-Response Relationship, Drug , Female , Graft Rejection/drug therapy , Health Care Costs , Hematologic Diseases/chemically induced , Humans , Immunosuppressive Agents/adverse effects , Immunosuppressive Agents/therapeutic use , Infections/chemically induced , Kidney/physiopathology , Lymphocytes/pathology , Male , Middle Aged , Rabbits
20.
Clin Transplant ; 13(4): 349-55, 1999 Aug.
Article in English | MEDLINE | ID: mdl-10485378

ABSTRACT

BACKGROUND: Renal transplant artery stenosis (RTAS) continues to be a problematic, but potentially correctable, cause of post-transplant hypertension and graft dysfunction. Older transplant recipients, prone to peripheral vascular disease (PVD), may have pseudoRTAS with PVD involving their iliac system. METHODS: We retrospectively analyzed 819 patients who underwent kidney transplantation between 1993 and 1997 to determine the contribution of pseudoRTAS to renal transplant renovascular disease. Univariate analyses were performed for donor and recipient variables, including age, weight, gender, race, renal disease, cholesterol and creatinine values, human leukocyte antigen (HLA) matching, cytomegalovirus (CMV) infection, and immunosuppressive medications. Significant variables were then analyzed by a Cox proportional hazards model. RESULTS: Ninety-two patients (11.2%) underwent renal transplant arteriogram (Agram) or magnetic resonance angiography (MRA) for suspected RTAS. RTAS or pseudoRTAS, defined as one or more hemodynamically significant lesions in the transplant artery or iliac system, was evident in 44 patients (5.4%). Variables significantly associated with RTAS by univariate analysis were weight at the time of transplant (p = 0.0258), male gender (p = 0.034), discharge serum creatinine > 2 mg/dL (p = 0.0041), and donor age (p = 0.0062). Variables significantly associated with pseudoRTAS by univariate analysis were weight at the time of transplant (p = 0.0285), recipient age (p = 0.0049), insulin-dependent diabetes mellitus (IDDM; p = 0.0042), panel reactive antibody (PRA) at transplant (p = 0.018), and body mass index (p = 0.04). Weight at transplant and donor age remained significantly associated with an increased risk for RTAS in a multivariate stepwise Cox proportional hazards model. IDDM, transplant PRA, weight at transplant, and donor age were significantly associated with an increased risk for pseudoRTAS in a multivariate stepwise Cox proportional hazards model. Importantly, both RTAS and pseudoRTAS were associated with poorer graft survival (p < 0.007 for each). CONCLUSIONS: Renal transplant renovascular disease encompasses pre-existing PVD acting as pseudoRTAS, as well as classical RTAS. Efforts to identify and correct renal transplant renovascular disease of either nature are important, given its negative impact on graft survival.


Subject(s)
Arterial Occlusive Diseases/etiology , Iliac Artery , Kidney Transplantation/adverse effects , Peripheral Vascular Diseases/etiology , Renal Artery Obstruction/etiology , Adult , Arterial Occlusive Diseases/diagnosis , Female , Humans , Hypertension, Renovascular/etiology , Male , Middle Aged , Peripheral Vascular Diseases/diagnosis , Risk Factors