1.
Trends Pharmacol Sci ; 40(1): 4-7, 2019 01.
Article in English | MEDLINE | ID: mdl-30527590

ABSTRACT

The myeloid-derived suppressor cell (MDSC) is the 'queen bee' of the tumor microenvironment. MDSCs protect the cancer from the patient's immune system, make the tumor resistant to immunotherapy, and allow the tumor to thrive while the patient withers away. Eliminating MDSCs should improve response rates to cancer therapy and patient survival.


Subject(s)
Myeloid-Derived Suppressor Cells/immunology , Neoplasms/immunology , Tumor Microenvironment/immunology , Animals , Humans , Immunotherapy/methods , Neoplasms/therapy , Survival , Treatment Outcome
2.
Inflamm Bowel Dis ; 11(8): 713-9, 2005 Aug.
Article in English | MEDLINE | ID: mdl-16043985

ABSTRACT

BACKGROUND: RDP58 is a novel anti-inflammatory d-amino acid decapeptide that inhibits synthesis of proinflammatory cytokines by disrupting cell signaling at the pre-MAPK MyD88-IRAK-TRAF6 protein complex. We therefore evaluated its efficacy and safety in parallel multicenter, double-blind, randomized concept studies in ulcerative colitis (UC). METHODS: In the first trial, 34 patients with mild to moderate active UC were randomized (1:2) to placebo (n = 13) or RDP58 100 mg (n = 21). In the second trial, 93 similar patients were randomized (1:1:1) to placebo (n = 30), RDP58 200 mg (n = 31), or RDP58 300 mg (n = 32). In both studies, treatment success was defined as a simple clinical colitis activity index score of no more than 3 at 28 days. Sigmoidoscopy and rectal biopsy (at baseline and 28 days) and safety measures (baseline and 28 and 56 days) were other endpoints. RESULTS: Treatment success on RDP58 100 mg was 29% versus 46% on placebo (P = 0.46). There were no significant differences in sigmoidoscopy or histology score. In the second study, treatment success on the higher doses of RDP58 (200 and 300 mg) was 71% and 72%, respectively, versus 43% on placebo (P = 0.016). Improvements in sigmoidoscopy scores (41% on 200 mg and 46% on 300 mg versus 32% on placebo) did not reach significance, but histology scores improved significantly (P = 0.002) versus placebo. Overall, adverse events were no different between placebo (3.3 +/- 2.4) and RDP58 (2.7 +/- 1.4, 300-mg group). CONCLUSIONS: RDP58 at a dose of 200 or 300 mg, but not 100 mg, was effective in mild-to-moderate UC. RDP58 was safe and well tolerated, and its novel mechanism of action makes it an attractive potential therapy.


Subject(s)
Colitis, Ulcerative/drug therapy , Peptides/administration & dosage , Administration, Oral , Adult , Colitis, Ulcerative/diagnosis , Colonoscopy , Dose-Response Relationship, Drug , Double-Blind Method , Drug Administration Schedule , Female , Follow-Up Studies , Humans , Intestinal Mucosa/drug effects , Intestinal Mucosa/pathology , Male , Middle Aged , Probability , Reference Values , Risk Assessment , Severity of Illness Index , Treatment Outcome
3.
Am J Transplant ; 1(4): 360-5, 2001 Nov.
Article in English | MEDLINE | ID: mdl-12099381

ABSTRACT

There has been considerable recent debate concerning the reconfiguration of the cadaveric liver allocation system with the intent to allocate livers to more severely ill patients over greater distances. We sought to assess the economic implications of the longer preservation times in cadaveric liver transplantation that may be seen in a restructured allocation system. A total of 683 patients with nonfulminant liver disease, aged 16 years or older, receiving a cadaveric donor liver as their only transplant, were drawn from a prospective cohort of patients who received transplants between January 1991 and July 1994 at the University of California, San Francisco, the Mayo Clinic, Rochester, Minnesota, or the University of Nebraska, Omaha. The primary outcome measure was standardized hospitalization resource utilization from the day of transplantation through discharge. Secondary outcome measures included 2-year patient survival and 2-year retransplantation rates. Results indicated that each 1-h increase in preservation time was associated with a 1.4% increase in standardized hospital resource utilization (p = 0.014). Two-year patient survival and retransplantation rates were not measurably affected by an increase in preservation time. We conclude that policies that increase preservation time may be expected to increase the cost of liver transplantation.


Subject(s)
Liver Transplantation/physiology , Organ Preservation/methods , Body Constitution , Child , Costs and Cost Analysis , Databases, Factual , Female , Hospitalization/economics , Humans , Liver Transplantation/economics , Liver Transplantation/mortality , Male , National Institutes of Health (U.S.) , Organ Preservation/economics , Racial Groups , Reoperation/economics , Reoperation/statistics & numerical data , Retrospective Studies , Survival Analysis , Tissue Donors/statistics & numerical data , Treatment Outcome , United States
4.
Transplantation ; 70(3): 537-40, 2000 Aug 15.
Article in English | MEDLINE | ID: mdl-10949200

ABSTRACT

BACKGROUND: Recently, the United Network for Organ Sharing (UNOS) began a pilot study to evaluate prospectively the merits of an allocation of cadaveric kidneys based on broader classes of HLA antigens, called cross-reactive groups (CREG). The objectives of the pilot study consider patient outcomes, but not the potential economic impact of a CREG-based allocation. This study predicts the impact of a CREG-based local allocation of cadaveric kidneys on 3-year Medicare payments and graft survival. METHODS: The UNOS renal transplant registry was merged with Medicare claims data for 1991-1997 by the United States Renal Data System. Average accumulated Medicare payments and graft survival up to 3 years posttransplant for first cadaveric renal transplant recipients were stratified by cross-reactive group mismatch categories. The economic impact was defined as the difference in average 3-year costs per transplant between the current and proposed allocation algorithms. Average 3-year costs were computed as a weighted average of costs, where the weights were the actual and predicted distributions of transplants across cross-reactive group categories. RESULTS: Results suggest that an organ allocation based on cross-reactive group matching criteria would result in a 3-year cost savings of $1,231 (2%) per transplant and an average 3-year graft survival improvement of 0.6%. CONCLUSIONS: Cost savings and graft survival improvements can be expected if CREG criteria were to replace current criteria in the current allocation policy for cadaveric kidneys, although the savings appear to be smaller than may be achievable through expanded HLA matching.


Subject(s)
Histocompatibility Testing/methods , Kidney Transplantation/economics , Kidney Transplantation/immunology , Tissue and Organ Procurement/economics , Tissue and Organ Procurement/methods , Algorithms , Cost Savings , Cross Reactions , Graft Survival , Humans , Pilot Projects , Prospective Studies , United States
5.
N Engl J Med ; 341(19): 1440-6, 1999 Nov 04.
Article in English | MEDLINE | ID: mdl-10547408

ABSTRACT

BACKGROUND: The potential economic effects of the allocation of cadaveric kidneys on the basis of tissue-matching criteria are controversial. We analyzed the economic costs associated with the transplantation of cadaveric kidneys with various numbers of HLA mismatches and examined the potential economic benefits of a local, as compared with a national, system designed to minimize HLA mismatches between donor and recipient in first cadaveric renal transplantations. METHODS: All data were supplied by the U.S. Renal Data System. Data on all payments made by Medicare from 1991 through 1997 for the care of recipients of a first cadaveric renal transplant were analyzed according to the number of HLA-A, B, and DR mismatches between donor and recipient and the duration of cold ischemia before transplantation. RESULTS: Average Medicare payments for renal transplant recipients in the three years after transplantation increased from 60,436 dollars per patient for fully HLA-matched kidneys (those with no HLA-A, B, or DR mismatches) to 80,807 dollars for kidneys with six HLA mismatches between donor and recipient, a difference of 34 percent (P<0.001). By three years after transplantation, the average Medicare payments were 64,119 dollars for transplantations of kidneys with less than 12 hours of cold ischemia time and 74,997 dollars for those with more than 36 hours (P<0.001). In simulations, the assignment of cadaveric kidneys to recipients by a method that minimized HLA mismatching within a local geographic area (i.e., within one of the approximately 50 organ-procurement organizations, which cover widely varying geographic areas) produced the largest cost savings (4,290 dollars per patient over a period of three years) and the largest improvement in the graft-survival rate (2.3 percent) when the potential costs of longer cold ischemia time were considered. CONCLUSIONS: Transplantation of better-matched cadaveric kidneys could have substantial economic advantages.
In our simulations, HLA-based allocation of kidneys at the local level produced the largest estimated cost savings, when the duration of cold ischemia was taken into account. No additional savings were estimated to result from a national allocation program, because the additional costs of longer cold ischemia time were greater than the advantages of optimizing HLA matching.


Subject(s)
Health Care Costs/statistics & numerical data , Health Care Rationing/organization & administration , Histocompatibility Testing/economics , Kidney Transplantation/economics , Medicare/economics , Patient Selection , Resource Allocation , Cadaver , Cost Savings , Graft Survival , Health Care Rationing/economics , Humans , Kidney Transplantation/immunology , Medicare/statistics & numerical data , Organ Preservation , Time Factors , Tissue and Organ Procurement/economics , Tissue and Organ Procurement/organization & administration , Transplantation Immunology , United States
6.
Kidney Int ; 55(6): 2415-22, 1999 Jun.
Article in English | MEDLINE | ID: mdl-10354290

ABSTRACT

Correlation of histology to rejection reversal: a Thymoglobulin Multicenter Trial report. BACKGROUND: Histology may provide a link between clinical response to antirejection therapy and graft function. In a subset of centers, renal biopsy was a secondary end point for the Thymoglobulin Multicenter Trial. METHODS: Thirty-eight patients had a protocol biopsy one to two weeks following the end of therapy. Inclusion and post-treatment biopsies were graded and scored according to Banff criteria by a central pathologist who was blinded to the type and outcome of therapy and the timing of the biopsy. RESULTS: The majority of patients (31 of 38) had moderate rejection on their inclusion biopsy. An improvement of at least one Banff grade occurred in 58% of the patients. The treatment was clinically successful in 33 patients, but two thirds of the patients (25 of 38) demonstrated residual inflammation in the graft. The degree of improvement of inflammation was proportionate to rejection severity (P = 0.006). Banff scoring indicated that residual inflammation was less in Thymoglobulin-treated patients than in those receiving Atgam (P < 0.05) and correlated with the incidence of recurrent rejection (P = 0.015). CONCLUSIONS: These data demonstrate a discrepancy between clinical and histological resolution of acute renal allograft rejection. Residual infiltrates in the graft following rejection therapy are common and, despite clinical improvement, may indicate an increased risk for recurrent rejection.


Subject(s)
Antilymphocyte Serum/therapeutic use , Graft Rejection/pathology , Graft Rejection/therapy , Kidney Transplantation/adverse effects , Acute Disease , Adult , Biopsy , Double-Blind Method , Female , Graft Rejection/etiology , Humans , Kidney Transplantation/immunology , Kidney Transplantation/pathology , Male , Middle Aged
7.
Kidney Int ; 55(6): 2457-66, 1999 Jun.
Article in English | MEDLINE | ID: mdl-10354295

ABSTRACT

BACKGROUND: The association between cyclosporine (CsA) and thrombotic microangiopathy (TMA) in renal allografts is well documented. However, predisposing factors and therapy guidelines are not adequately characterized. METHODS: We reviewed 188 patients with kidney or kidney-pancreas transplants who were treated between January 1994 and December 1996 with prednisone, CsA, or tacrolimus, and azathioprine or mycophenolate. We analyzed 50 patients who had graft biopsies: 26 with TMA and 24 with no TMA, as well as 19 patients with well-functioning grafts who never required biopsy. RESULTS: TMA was observed in 26 of 188 renal graft recipients (14%). TMA was confined to the allograft kidney without any systemic evidence in 24 of the 26 patients. At the time of the diagnosis of TMA, 24 of the patients were on CsA, with 19 on the microemulsion form. Conversely, 5 of 18 control patients with no graft dysfunction were on the microemulsion form of CsA (P = 0.0026). Graft loss was seen in 8 of 26 patients with TMA. Conversion from CsA to tacrolimus resulted in a one-year salvage of graft function in 13 of 16 (81%) patients. CONCLUSIONS: TMA was the cause of renal graft dysfunction in 14% of renal graft recipients and was associated with the use of the microemulsion form of CsA. Systemic signs of TMA were rare, underscoring the importance of the graft biopsy in making the diagnosis. The most successful strategy was switching from CsA to tacrolimus, with good graft function in 81% of the recipients one year after the TMA episode.


Subject(s)
Cyclosporine/adverse effects , Immunosuppressive Agents/adverse effects , Kidney Diseases/etiology , Kidney Transplantation/adverse effects , Thrombosis/etiology , Adolescent , Adult , Case-Control Studies , Female , Graft Rejection/drug therapy , Graft Rejection/pathology , Graft Rejection/physiopathology , Humans , Kidney Diseases/pathology , Kidney Diseases/physiopathology , Kidney Transplantation/pathology , Kidney Transplantation/physiology , Male , Middle Aged , Tacrolimus/therapeutic use , Thrombosis/pathology , Thrombosis/physiopathology
8.
Transplantation ; 66(1): 29-37, 1998 Jul 15.
Article in English | MEDLINE | ID: mdl-9679818

ABSTRACT

BACKGROUND: Thymoglobulin, a rabbit anti-human thymocyte globulin, was compared with Atgam, a horse anti-human thymocyte globulin for the treatment of acute rejection after renal transplantation. METHODS: A multicenter, double-blind, randomized trial with enrollment stratification based on standardized histology (Banff grading) was conducted. Subjects received 7-14 days of Thymoglobulin (1.5 mg/kg/ day) or Atgam (15 mg/kg/day). The primary end point was rejection reversal (return of serum creatinine level to or below the day 0 baseline value). RESULTS: A total of 163 patients were enrolled at 25 transplant centers in the United States. No differences in demographics or transplant characteristics were noted. Intent-to-treat analysis demonstrated that Thymoglobulin had a higher rejection reversal rate than Atgam (88% versus 76%, P=0.027, primary end point). Day 30 graft survival rates (Thymoglobulin 94% and Atgam 90%, P=0.17), day 30 serum creatinine levels as a percentage of baseline (Thymoglobulin 72% and Atgam 80%; P=0.43), and improvement in posttreatment biopsy results (Thymoglobulin 65% and Atgam 50%; P=0.15) were not statistically different. T-cell depletion was maintained more effectively with Thymoglobulin than Atgam both at the end of therapy (P=0.001) and at day 30 (P=0.016). Recurrent rejection, at 90 days after therapy, occurred less frequently with Thymoglobulin (17%) versus Atgam (36%) (P=0.011). A similar incidence of adverse events, post-therapy infections, and 1-year patient and graft survival rates were observed with both treatments. CONCLUSIONS: Thymoglobulin was found to be superior to Atgam in reversing acute rejection and preventing recurrent rejection after therapy in renal transplant recipients.


Subject(s)
Antilymphocyte Serum/therapeutic use , Graft Rejection/therapy , Immunosuppressive Agents/therapeutic use , Kidney Transplantation , Acute Disease , Adolescent , Adult , Aged , Animals , Antilymphocyte Serum/adverse effects , Double-Blind Method , Female , Humans , Male , Middle Aged , Rabbits
10.
Arch Pathol Lab Med ; 121(7): 714-8, 1997 Jul.
Article in English | MEDLINE | ID: mdl-9240907

ABSTRACT

OBJECTIVE: To evaluate the histopathologic changes that occur in human small intestine over time when preserved in Viaspan organ preservation solution. DESIGN: Short segments of human small intestine were placed in standard organ preservation solution (Viaspan) and stored in conditions that mimic the clinical situation associated with clinical organ procurement, preservation, and transplantation. The intestinal segments were removed at sequential time points and placed in 10% formalin. Specimens underwent histopathologic examination to determine time-related changes. SPECIMENS: Short intestinal segments were obtained from seven multiorgan cadaver donors. Specimens were obtained in a way that exactly mimicked small intestinal organ retrieval. RESULTS: Small intestinal histology remained normal for the first 6 hours. After 6 hours, vacuolar separation began to occur between the epithelium and the basement membrane in the upper half of the villi. After 9 hours of cold preservation, epithelial detachment extended deep into the crypts with occasional shedding of cells and villi. CONCLUSIONS: Currently used small intestinal preservation in Viaspan results in considerable histopathologic changes in human jejunum after 9 hours of cold storage. The histopathologic pattern appears normal for the first 6 hours, suggesting that preservation times should be limited to this period when possible.


Subject(s)
Intestine, Small/cytology , Organ Preservation Solutions , Organ Preservation/methods , Adenosine , Adolescent , Adult , Allopurinol , Basement Membrane/cytology , Epithelial Cells , Female , Glutathione , Humans , Insulin , Intestinal Mucosa/cytology , Male , Middle Aged , Raffinose , Time Factors
13.
JAMA ; 277(6): 455-6; author reply 456-7, 1997 Feb 12.
Article in English | MEDLINE | ID: mdl-9020262
14.
Arch Surg ; 132(1): 35-9; discussion 40, 1997 Jan.
Article in English | MEDLINE | ID: mdl-9006550

ABSTRACT

OBJECTIVE: To evaluate the cause of worse kidney allograft survival in black recipients, which has been the source of considerable interest and debate. DESIGN: Three hundred ninety-two consecutive renal allografts (0 HLA mismatch grafts excluded) were reviewed. Of the recipients, 57% were black, 27% received living donor grafts, and 86% received their first transplant. All recipients underwent an oral cyclosporine induction protocol with triple drug maintenance. Crude graft survival, the risk of rejection, and the need for dialysis were determined using donor and recipient demographic and immunologic variables. RESULTS: Graft survival was 84%, 67%, and 50% at 1, 3, and 5 years after the transplantation, respectively. The survival of black recipients was 4%, 11%, and 20% worse than that of white recipients at 1, 3, and 5 years, respectively (P < .002). When only pretransplantation variables were considered, black recipient race was the only variable that predicted graft loss in the multivariate analysis (relative risk [RR] = 1.6, P = .09). When posttransplantation and pretransplantation variables were used, cadaver donor (RR = 1.7), an episode of rejection (RR = 2.6), and the need for dialysis (RR = 2.7) were independent variables that predicted graft loss (P < .001). Black recipient race was a dependent variable. Four pretransplantation variables predicted the risk of dialysis: black race (RR = 3.6), male recipient (RR = 2.1), cadaveric donor (RR = 2.2), and a peak panel-reactive antibody level greater than 30% (RR = 2.8). Three pretransplantation variables predicted the risk of rejection: black race (RR = 1.7), male recipient (RR = 1.6), and a current panel-reactive antibody level greater than 30% (RR = 5.3). CONCLUSIONS: These data suggest that black recipient race is a dependent predictor of renal allograft survival when the posttransplantation events of rejection and dialysis are considered.
Black recipients have more immunologic complications after renal transplantation that result in worse graft survival. These results confirm the importance of postallograft events as the major determinants of long-term graft survival and suggest that black recipients are receiving inadequate immunosuppression. These data support attempts to tailor immunosuppressive protocols to recipient pretransplantation risk profiles as a way to improve graft survival in the high-risk recipient.


Subject(s)
Black People , Graft Rejection/epidemiology , Kidney Transplantation , Adult , Female , Graft Survival , Humans , Incidence , Male , Retrospective Studies
16.
Clin Transplant ; 10(6 Pt 2): 635-8, 1996 Dec.
Article in English | MEDLINE | ID: mdl-8996757

ABSTRACT

Rupture of a renal allograft (RAR) is an uncommon but serious complication of renal transplantation. A recent RAR prompted a review of our experience, with the purpose of (1) identifying conditions that may predispose to this complication and (2) defining strategies for prevention. A 5-yr, consecutive living-related (LRD) and cadaver donor (CD) cohort of 331 patients was studied retrospectively. Twelve patients (3.6%) had RAR. Donor characteristics, procurement and preservation conditions, and recipient characteristics were major study categories. Data analysis was computer-based and included multivariate analysis. The nine White and two Black cadaver donors were "ideal", mean age 29 yr, with mean high creatinine (CR) of 1.3 and terminal CR of 1.1 mg/dl and mean terminal urine output of 423 ml/min. Nine of 11 CD had low-dose dopamine use (terminal, mean 8, range 5-13 micrograms/kg/min). Eleven of 11 donors had en-bloc procurement, 9 of which were multiple organ procurements. All had 4+/4+ flush and cold storage with UW solution. Mean cold ischemia time (CIT) was 22 h, 28 min (range 15 h, 16 min to 40 h). For the patients with RAR, mean age was 39 yr; all 12 were Black, 7 male and 5 female. HLA match was 1 antigen (AG) for 3, 2 AG for 8, and 4 AG for 1 (mean 1.9). Nine patients had delayed or declining renal function requiring dialysis. The panel reactive antibody was, at peak, mean 47% (range 0-100%) and, current, mean 18% (range 0-84%). Six of 12 had OKT3 therapy at time of RAR and six had biopsies. Day of RAR was mean 10, median 9 (range 4-21). Pain and a drop in hematocrit were observed in most. There was one fatality (8%), and all kidneys were removed. All kidneys showed at least minimal rejection, but six had severe acute tubular necrosis (ATN) with edema and minimal rejection.
Statistically significant associations with RAR were older recipient age (p = 0.01), donor-recipient race mismatch (White donor to Black recipient) (p = 0.007), and dialysis requirement (p < 0.001). Other variables were not statistically correlated: gender, race, CIT, transplant number, LRD vs. CD, peak or current PRA, and total HLA and BDR mismatch. The data suggest that ATN and rejection act synergistically to cause RAR and that early delayed function requires intensive and perhaps novel immunosuppression, especially in Black recipients.


Subject(s)
Kidney Diseases/etiology , Kidney Transplantation/adverse effects , Adolescent , Adult , Age Factors , Causality , Female , Graft Rejection/complications , Humans , Kidney Diseases/pathology , Kidney Tubular Necrosis, Acute/complications , Male , Middle Aged , Multivariate Analysis , Racial Groups , Retrospective Studies , Rupture, Spontaneous , Transplantation, Homologous
19.
Transplantation ; 60(12): 1401-6, 1995 Dec 27.
Article in English | MEDLINE | ID: mdl-8545864

ABSTRACT

Black kidney transplant recipients have worse graft survival than white recipients. Speculation regarding etiology has focused on differences in human leukocyte antigens (HLA). Some suggest that improvements in graft survival would be obtained if donor and recipient race were matched. We reviewed 236 cadaver transplants performed over 9 years at a single center using an HLA-match-driven allocation system and a uniform immunosuppressive protocol to determine the impact of donor race on graft survival. A multivariate analysis of graft survival was performed using patient race, sex, age, transplant number, current and peak panel-reactive antibody, donor race, cold ischemia time and HLA mismatch, the need for dialysis, and the presence of rejection as independent variables. Sixty percent of recipients were black, and 82% were primary transplants; 28 kidneys (12%) were from black donors. The 112 patients with a same-race donor had identical 5-year graft survival to the 124 who had a different-race donor (40%; P = 0.1726). The 5-year survival of the 88 white recipients of white donor organs was better than that of the 120 black recipients of white donor organs (54% vs. 42%, respectively; P = 0.0398). Black recipients (t1/2 = 37 months) did worse than white recipients (t1/2 = 60 months) regardless of organ source (P = 0.023). In the multivariate analysis, neither donor nor recipient race was an independent variable in predicting graft survival. Rejection (RR = 2.9) and the need for dialysis on the transplant admission (RR = 4.1) were the only factors that predicted poor survival. Black recipients had more rejection (P = 0.04) but not more need for dialysis posttransplant, regardless of donor race. Donor race did not affect graft survival in this series. The effect of recipient race on graft survival was due to an increased incidence of rejection episodes in black recipients, which was independent of HLA mismatch.
These data suggest that improvements in immunosuppression, not changes in allocation, are needed to improve graft survival.


Subject(s)
Graft Survival , Kidney Transplantation , Adult , Age Factors , Black People , Follow-Up Studies , Graft Survival/genetics , Graft Survival/immunology , Histocompatibility Testing , Humans , Regression Analysis , Risk Factors , Sex Factors , White People