Results 1 - 19 of 19
1.
Article in English | MEDLINE | ID: mdl-38522846

ABSTRACT

This study aimed to compare outcomes of hand-sewn and stapler closure techniques of the pancreatic stump in patients undergoing distal pancreatectomy (DP). The impact of reinforcing stapler closure with mesh was also evaluated. A literature search was carried out across multiple data sources to identify studies that compared hand-sewn and stapler closure techniques in the management of the pancreatic stump following DP. The odds ratio (OR) for clinically relevant postoperative pancreatic fistula (POPF) was determined via random-effects modelling. Subsequently, trial sequential analysis was performed. Thirty-two studies with a total of 4,022 patients undergoing DP with hand-sewn (n = 1,184) or stapler (n = 2,838) closure of the pancreatic stump were analyzed. Hand-sewn closure significantly increased the risk of clinically relevant POPF compared to stapler closure (OR: 1.56, p = 0.02). Among stapler closures, staple line reinforcement significantly reduced the formation of such POPF (OR: 0.54, p = 0.002). When only randomized controlled trials were considered, there was no significant difference in clinically relevant POPF between hand-sewn and stapler closure (OR: 1.20, p = 0.64) or between reinforced and standard stapler closure (OR: 0.50, p = 0.08). When observational studies were considered, hand-sewn closure was associated with a significantly higher rate of clinically relevant POPF than stapler closure (OR: 1.59, p = 0.03); moreover, among stapler closures, staple line reinforcement significantly reduced the formation of such POPF (OR: 0.55, p = 0.02). Trial sequential analysis detected a risk of type 2 error. In conclusion, reinforced stapler closure in DP may reduce the risk of clinically relevant POPF compared to hand-sewn closure or stapler closure without reinforcement. Future randomized research is needed to provide stronger evidence.
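As a note on the effect-size metric used throughout this abstract, the odds ratio for a single 2x2 table and its Wald 95% confidence interval can be sketched as below. The counts are hypothetical, chosen only for illustration, and are not taken from the study:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and Wald 95% CI for a 2x2 table:
                 POPF   no POPF
    hand-sewn      a        b
    stapler        c        d
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) from the four cell counts
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical single-study counts (illustrative only)
print(odds_ratio_ci(30, 170, 20, 180))  # OR is about 1.59
```

A random-effects meta-analysis, as used in the study, pools log-ORs across studies weighted by inverse variance plus a between-study variance component; the function above is only the per-study building block.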

4.
HPB (Oxford) ; 20(9): 848-853, 2018 09.
Article in English | MEDLINE | ID: mdl-29705345

ABSTRACT

BACKGROUND: Blood group is reported to affect survival following pancreatoduodenectomy for pancreatic ductal adenocarcinoma. Its effect among patients with other periampullary cancers, however, is not known. This study sought to examine this. METHODS: Data on a range of factors and survival outcomes were collected from patients treated at two centres. Those with blood groups B and AB were excluded due to small numbers. Survival was compared between patients with blood groups O and A using multivariable analysis accounting for confounding factors. RESULTS: Among 431 patients, 235 (54.5%) and 196 (45.5%) were of blood groups A and O, respectively. Baseline comparisons found a significant difference in the distribution of tumour types (p = 0.011), with blood group O patients having more ampullary carcinomas (33.2% vs 23.4%) and fewer pancreatic ductal adenocarcinomas (45.4% vs 61.3%) than group A. On multivariable analysis, after accounting for confounding factors including pathologic variables, survival was significantly shorter in those with blood group A than group O (p = 0.047, HR 1.30 [95% CI: 1.00-1.69]). CONCLUSIONS: The distribution of blood groups differs across the types of periampullary cancer, and survival is shorter among blood group A patients.


Subject(s)
ABO Blood-Group System , Ampulla of Vater/surgery , Bile Duct Neoplasms/surgery , Carcinoma, Pancreatic Ductal/surgery , Cholangiocarcinoma/surgery , Duodenal Neoplasms/surgery , Pancreatic Neoplasms/surgery , Pancreaticoduodenectomy , Aged , Ampulla of Vater/pathology , Bile Duct Neoplasms/blood , Bile Duct Neoplasms/mortality , Bile Duct Neoplasms/pathology , Carcinoma, Pancreatic Ductal/blood , Carcinoma, Pancreatic Ductal/mortality , Carcinoma, Pancreatic Ductal/pathology , Cholangiocarcinoma/blood , Cholangiocarcinoma/mortality , Cholangiocarcinoma/pathology , Databases, Factual , Duodenal Neoplasms/blood , Duodenal Neoplasms/mortality , Duodenal Neoplasms/pathology , England , Female , Humans , Male , Middle Aged , Pancreatic Neoplasms/blood , Pancreatic Neoplasms/mortality , Pancreatic Neoplasms/pathology , Pancreaticoduodenectomy/adverse effects , Pancreaticoduodenectomy/mortality , Retrospective Studies , Risk Assessment , Risk Factors , Time Factors , Treatment Outcome
5.
Clin Transplant ; 31(11)2017 Nov.
Article in English | MEDLINE | ID: mdl-28871663

ABSTRACT

BACKGROUND: The demand for kidney retransplantation following graft failure is rising. Repeat transplantation is often associated with poorer outcomes due to both immunological and surgical challenges. The aim of this study was to compare the surgical and functional outcomes of kidney retransplantation in recipients who had previously had at least two kidney transplants, with a focus on those with antibody incompatibility. METHODS: We analyzed 66 patients who underwent renal transplantation at a single center between 2003 and 2011. Consecutive patients receiving their 3rd or 4th kidney were case-matched with an equal number of 1st and 2nd transplants. RESULTS: Twenty-two 3rd and 4th kidney transplants were matched with 22 1st and 22 2nd transplants. Operative times and length of stay were equivalent between the subgroups. Surgical complication rates were similar in all groups (22.7% in 1st and 2nd transplants, and 27.2% in 3rd/4th transplants). There was no significant difference in patient or graft survival over 5 years. Graft function was similar between transplant groups at 1, 3, and 5 years. CONCLUSIONS: Third and fourth kidney transplants can be performed safely with outcomes similar to those of 1st and 2nd transplants. Kidney retransplantation from antibody-incompatible donors may be appropriate for highly sensitized patients.


Subject(s)
Graft Rejection/prevention & control , Histocompatibility Testing , Kidney Transplantation , Living Donors , Postoperative Complications/prevention & control , Reoperation , Tissue and Organ Procurement/methods , Adult , Case-Control Studies , Female , Follow-Up Studies , Glomerular Filtration Rate , Graft Rejection/epidemiology , Graft Survival , Humans , Kidney Failure, Chronic/surgery , Kidney Function Tests , Male , Prognosis , Registries , Risk Factors , Survival Rate , United Kingdom/epidemiology
6.
Hepatobiliary Pancreat Dis Int ; 15(6): 655-659, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27919856

ABSTRACT

Serum aminotransferases have been used as surrogate markers for the liver ischemia-reperfusion injury that follows liver surgery. Some studies have suggested that rises in serum alanine aminotransferase (ALT) correlate with patient outcome after liver resection. We assessed whether postoperative day 1 (POD 1) ALT could be used to predict patient morbidity and mortality following liver resection. We reviewed our prospectively held database and included consecutive adult patients undergoing elective liver resection in our institution between January 2013 and December 2014. The primary outcome was the correlation of POD 1 ALT with patient morbidity and mortality. We also assessed whether concurrent radiofrequency ablation, neoadjuvant chemotherapy, or use of the Pringle maneuver significantly affected POD 1 ALT levels. A total of 110 liver resections were included in the study. Overall in-hospital morbidity and mortality were 31.8% and 0.9%, respectively. The median POD 1 ALT level was 275 IU/L. No correlation was found between POD 1 serum ALT levels and patient morbidity after elective liver resection, whilst correlation with mortality could not be assessed because of the low number of deaths. Patients undergoing concurrent radiofrequency ablation had increased POD 1 serum ALT levels, whereas those given neoadjuvant chemotherapy and those in whom the Pringle maneuver was used did not. Our study demonstrates that POD 1 serum ALT does not correlate with patient morbidity after elective liver resection.


Subject(s)
Alanine Transaminase/blood , Colorectal Neoplasms/pathology , Hepatectomy/adverse effects , Liver Neoplasms/surgery , Postoperative Complications/blood , Adult , Aged , Biomarkers/blood , Catheter Ablation/adverse effects , Colorectal Neoplasms/mortality , Databases, Factual , Elective Surgical Procedures , England , Female , Hepatectomy/mortality , Humans , Liver Neoplasms/mortality , Liver Neoplasms/secondary , Male , Middle Aged , Neoadjuvant Therapy/adverse effects , Postoperative Complications/etiology , Postoperative Complications/mortality , Predictive Value of Tests , Retrospective Studies , Risk Factors , Time Factors , Treatment Outcome , Up-Regulation
7.
Transplant Res ; 3: 16, 2014.
Article in English | MEDLINE | ID: mdl-25206974

ABSTRACT

INTRODUCTION: Delayed graft function (DGF) remains a significant and detrimental postoperative phenomenon following living-related renal allograft transplantation, with a published incidence of up to 15%. Early therapeutic vasodilatory interventions have been shown to improve DGF, and modifications to immunosuppressive regimens may subsequently lessen its impact. This pilot study assesses the potential applicability of perioperative non-invasive cardiac output monitoring (NICOM), transit-time flow monitoring (TTFM) of the transplant renal artery, and pre-/perioperative thromboelastography (TEG) in the early prediction of DGF and perioperative complications. METHODS: Ten consecutive living-related renal allograft recipients were studied. Non-invasive cardiac output monitoring commenced immediately following induction of anaesthesia and was maintained throughout the perioperative period. Doppler-based TTFM was performed during natural haemostatic pauses in the transplant surgery: immediately following graft reperfusion and following ureteric implantation. Central venous blood sampling for TEG was performed following induction of anaesthesia and during abdominal closure. RESULTS: A single incidence of DGF was seen within the studied cohort and one intra-operative (thrombotic) complication was noted. NICOM confirmed a predictable trend of increased cardiac index (CI) following allograft reperfusion (mean CI - clamped: 3.17 ± 0.29 L/min/m², post-reperfusion: 3.50 ± 0.35 L/min/m²; P < 0.05) mediated by a significant reduction in total peripheral resistance. Reduced TTFM at the point of allograft reperfusion (227 mL/min vs a cohort mean of 411 mL/min; 95% CI: 358 to 465) was identified in a subject who experienced intra-operative transplant renal artery thrombosis. TEG data exhibited significant reductions in clot lysis (LY30 (%): pre-op: 1.0 (0.29 to 1.71), post-reperfusion: 0.33 (0.15 to 0.80); P = 0.02) and a trend towards increased clot initiation following allograft reperfusion. CONCLUSIONS: Reduced renal arterial blood flow (falling outside the 95% CI of the mean) accurately predicted anastomotic complications within this pilot study. TEG data suggest the emergence of a prothrombotic state, of uncertain clinical significance, following allograft reperfusion. Abrogation of the characteristic haemodynamic trends determined by NICOM following allograft reperfusion may permit identification of individuals at risk of DGF. The findings of this pilot study mandate a larger definitive trial to determine the clinical applications and predictive value of these technologies.

8.
Hepatobiliary Pancreat Dis Int ; 12(3): 310-6, 2013 Jun.
Article in English | MEDLINE | ID: mdl-23742777

ABSTRACT

BACKGROUND: Intraperitoneal local anesthesia (IPLA) during elective laparoscopic cholecystectomy (el-LC) decreases postoperative pain, but no studies have explored its efficacy at emergency laparoscopic cholecystectomy (em-LC). The longer operative duration, the greater frequency of washing, and the inflammation associated with cholecystitis or pancreatitis are among the reasons why it cannot be assumed that a benefit in pain scores will be seen in em-LC with IPLA. This study was undertaken to assess the efficacy of IPLA in patients undergoing em-LC. METHODS: A double-blind randomized sham-controlled trial was conducted in 41 consecutive subjects undergoing em-LC. IPLA was delivered as a combination of injection over the diaphragm and a topical wash over the liver and gallbladder, using bupivacaine or saline. The primary outcome was visual analogue scale pain scores until discharge. Secondary outcomes included pain scores in theatre recovery and analgesic consumption. RESULTS: One patient had the procedure converted to open surgery and was excluded. There was no significant difference in pain scores in the ward or in theatre recovery. Analgesic use, respiratory rate, oxygen saturation, time to ambulation and eating, satisfaction scores, and time to discharge were comparable between the two groups. CONCLUSIONS: IPLA during em-LC does not influence postoperative pain. Other modalities of analgesia should be explored for decreasing the interval between diagnosis at acute admission and em-LC.


Subject(s)
Anesthetics, Local/administration & dosage , Bupivacaine/administration & dosage , Cholecystectomy, Laparoscopic , Pain, Postoperative/prevention & control , Administration, Topical , Adult , Analgesics/therapeutic use , Cholecystectomy, Laparoscopic/adverse effects , Double-Blind Method , Emergencies , England , Female , Humans , Injections, Intraperitoneal , Length of Stay , Male , Middle Aged , Pain Measurement , Pain, Postoperative/diagnosis , Pain, Postoperative/etiology , Patient Discharge , Therapeutic Irrigation , Time Factors , Treatment Outcome
9.
Transplantation ; 93(9): 867-73, 2012 May 15.
Article in English | MEDLINE | ID: mdl-22361472

ABSTRACT

BACKGROUND: The role of the complement system in antibody-mediated rejection has been investigated in relation to circulating complement interacting with the renal microvascular endothelium, resulting in the formation of peritubular capillary C4d. However, the possible importance of local complement synthesis is less clear. The aim of this study was to determine whether human vascular endothelium can produce C4 in response to stimulation in vitro. METHODS: Human microvascular endothelial cells and glomerular endothelial cells were stimulated with endotoxins, cytokines, and human leukocyte antigen-specific antibodies. Complement synthesis was investigated using Western blotting and indirect immunofluorescence. De novo C4 synthesis was confirmed using C4 small interfering RNA. RESULTS: Glomerular and microvascular endothelium both produce C3 and C4 complement protein. Complement synthesis was stimulant-specific: C3 was produced mainly after stimulation with lipopolysaccharide, whereas C4 synthesis occurred on treatment with gamma interferon. Culture with human leukocyte antigen-specific antibodies resulted in a significant increase in C4 protein synthesis by both cell lines. CONCLUSIONS: We have shown for the first time that human microvascular endothelium can be stimulated to synthesize C4 in vitro. The implications of this for clinical transplantation, especially in the context of antibody-mediated rejection, its histological interpretation, and its potential as a therapeutic target, remain to be determined by further studies.


Subject(s)
Antibodies/immunology , Complement C4/biosynthesis , Glomerular Mesangium/metabolism , Graft Rejection/immunology , HLA Antigens/immunology , Interferon-gamma/pharmacology , Antibodies/drug effects , Antiviral Agents/pharmacology , Blotting, Western , Cells, Cultured , Complement C4/drug effects , Complement C4/immunology , Fluorescent Antibody Technique, Indirect , Glomerular Mesangium/immunology , Glomerular Mesangium/pathology , Graft Rejection/pathology , Graft Rejection/prevention & control , Humans , Kidney Transplantation/immunology , Kidney Transplantation/pathology
10.
Transplantation ; 92(8): 900-6, 2011 Oct 27.
Article in English | MEDLINE | ID: mdl-21968524

ABSTRACT

BACKGROUND: Human leukocyte antigen (HLA) antibody-incompatible renal transplantation has been performed increasingly since 2000, but few data exist on the medium-term outcomes. METHODS: Between 2003 and 2011, 84 patients received renal transplants with a pretreatment donor-specific antibody (DSA) level of more than 500 in a microbead assay. Seventeen patients had a positive complement-dependent cytotoxic (CDC) crossmatch (XM), 44 had a negative CDC XM and a positive flow cytometric XM, and 23 had DSA detectable by microbead only. We also reviewed 28 patients with HLA antibodies but no DSA at transplant. DSAs were removed with plasmapheresis pretransplant, and patients did not routinely receive antithymocyte globulin posttransplant. RESULTS: Mean follow-up posttransplantation was 39.6 (range 2-91) months. Patient survival after the first year was 93.8%. Death-censored graft survival at 1, 3, and 5 years was 97.5%, 94.2%, and 80.4%, respectively, in all DSA+ve patients; 5-year survival was worse in the CDC+ve group than in the CDC-ve/DSA+ve group (45.6% vs 88.6%, P<0.03). Five-year graft survival in the DSA-ve group was 82.1%. Rejection occurred in 53.1% of DSA+ve patients in the first year, compared with 22% of DSA-ve patients (P<0.003). CONCLUSIONS: HLA antibody-incompatible renal transplantation had a high success rate when the CDC XM was negative. Further work is required to predict which CDC+ve XM grafts will be successful and to treat the slowly progressive graft damage caused by DSA in the first few years after transplantation.


Subject(s)
HLA Antigens/immunology , Histocompatibility Testing , Isoantibodies/immunology , Kidney Transplantation , Acute Disease , Adult , Aged , Female , Follow-Up Studies , Graft Rejection/therapy , Graft Survival , Humans , Isoantibodies/blood , Kidney Transplantation/adverse effects , Kidney Transplantation/mortality , Male , Middle Aged , Proteinuria/etiology , Tissue Donors
11.
Ther Apher Dial ; 14(4): 392-9, 2010 Aug 01.
Article in English | MEDLINE | ID: mdl-20649760

ABSTRACT

Double filtration plasmapheresis (DFPP) was used in preference to plasma exchange in our program of antibody-incompatible transplantation in order to treat higher volumes of plasma. Forty-two patients had 259 sessions of DFPP, 201 pre-transplant and 58 post-transplant. At the first treatment session, the mean plasma volume treated was 3.81 L (range 3-6 L), or 55.5 mL/kg (range 36.2-83.6 mL/kg). Serum IgG fell by a mean of 59.4% (SD 10.2%), and IgM by 69.3% (SD 16.1%). Nine patients did not require increases in the plasma volume treated, and six did not tolerate higher plasma volumes. In the remaining patients, the mean maximum plasma volume treated pre-transplant was 6.67 L (range 4-15 L), or 96.1 mL/kg (range 60.2-208.9 mL/kg). The complement-dependent cytotoxic crossmatch was positive in 14 cases pre-treatment and remained positive in six (42.8%) cases. The flow cytometric crossmatch was positive in 29 cases pre-treatment and in 21 (72.4%) after DFPP. Post-transplant, DFPP was ineffective at reducing donor-specific antibody levels during periods of rapid donor-specific antibody synthesis. The one-year graft survival rate was 94%, although there was a high rate of early rejection. In summary, DFPP enabled the treatment of plasma volumes almost double those that would have been feasible with plasma exchange. Despite this, most patients were transplanted with a positive crossmatch, and DFPP post-transplant was unable to control rising antibody levels.


Subject(s)
Blood Group Incompatibility/immunology , Kidney Transplantation/immunology , Plasmapheresis/methods , ABO Blood-Group System/immunology , Adolescent , Adult , Aged , Female , Filtration , Flow Cytometry , Graft Rejection/immunology , Graft Survival/immunology , Humans , Male , Middle Aged , Young Adult
12.
Transpl Immunol ; 23(4): 161-5, 2010 Aug.
Article in English | MEDLINE | ID: mdl-20600903

ABSTRACT

HLA antibody-incompatible transplantation carries a higher risk of rejection than standard renal transplantation. Soluble CD30 (sCD30) has been shown in many, but not all, studies to be a biomarker for the risk of rejection in standard renal transplant recipients. We sought to define the value of sCD30 and soluble CD27 (sCD27) in patients receiving HLA antibody-incompatible transplants. Serum taken at different time points from 32 HLA antibody-incompatible transplant recipients was retrospectively assessed for sCD30 and sCD27 levels by enzyme-linked immunosorbent assay (ELISA). This was compared against episodes of acute rejection, post-transplant donor-specific antibody (DSA) levels, and 12-month serum creatinine levels. No association was found between sCD27 or sCD30 levels and the risk of acute rejection or DSA levels. Higher sCD30 levels at 4-6 weeks post-transplantation were associated with a higher serum creatinine at 12 months. CONCLUSION: Patients undergoing HLA antibody-incompatible transplantation are at high risk of rejection, but neither sCD30 (unlike in standard transplantation) nor sCD27 was found to be a risk factor. High sCD30 levels measured at 4-6 weeks post-transplantation were associated with poorer graft function at one year.


Subject(s)
Graft Rejection/diagnosis , Graft Rejection/immunology , Isoantibodies/metabolism , Kidney Transplantation , Adolescent , Adult , Biomarkers/blood , Creatinine/blood , Enzyme-Linked Immunosorbent Assay , Female , Follow-Up Studies , Graft Rejection/blood , HLA Antigens/immunology , Humans , Isoantibodies/immunology , Ki-1 Antigen/blood , Male , Middle Aged , Prognosis , Tumor Necrosis Factor Receptor Superfamily, Member 7/blood
13.
Nephrol Dial Transplant ; 25(4): 1306-12, 2010 Apr.
Article in English | MEDLINE | ID: mdl-19934085

ABSTRACT

BACKGROUND: The aim of this study was to examine the development of acute antibody-mediated rejection in HLA antibody-incompatible renal transplantation in relation to the Banff 07 histological classification. METHODS: Renal biopsies were scored using the Banff 07 diagnostic criteria, and paraffin-embedded sections were stained with the pan-leucocyte marker CD45. RESULTS: Thirty-six patients had 72 renal biopsies. In biopsies performed 30 min after graft reperfusion, the mean number of CD45+ cells per glomerulus was higher than in control grafts (P < 0.04) and was associated with the donor-specific antibody (DSA) level at transplantation measured by microbeads (P < 0.01); eight out of nine patients with more than five CD45+ cells per glomerulus had early post-transplant rejection or oliguria, compared to 11 out of 20 with fewer than five cells per glomerulus (P < 0.01). In the first 10 days post-transplant, although peritubular capillary (PTC) leucocyte margination grade 3 and C4d deposition were specific for rejection, their sensitivities were low. PTC C4d staining was seen in only two out of 11 biopsies taken in the first 5 days after transplant, even in the presence of rejection, but was present in the majority of later biopsies with rejection. In biopsies stained for CD3, CD68, and CD20, it was notable that CD20+ cells were not seen during acute rejection, the infiltrates comprising CD3+ and CD68+ leucocytes. CONCLUSIONS: Glomerular margination of leucocytes occurred early after transplantation and was associated with DSA level and early graft dysfunction. The Banff 07 PTC margination scoring system was easy to apply, especially when CD45 staining was used, and PTC margination grade 3 was always associated with clinical rejection.


Subject(s)
Graft Rejection/immunology , HLA Antigens/immunology , Immunoglobulin G/immunology , Kidney Transplantation/immunology , Adolescent , Adult , Complement C4b/immunology , Female , Graft Rejection/pathology , Humans , Kidney Diseases/therapy , Kidney Transplantation/pathology , Male , Middle Aged , Peptide Fragments/immunology , Young Adult
14.
Transplantation ; 87(6): 882-8, 2009 Mar 27.
Article in English | MEDLINE | ID: mdl-19300192

ABSTRACT

BACKGROUND: After human leukocyte antigen (HLA) antibody-incompatible transplantation, donor-specific and third-party HLA antibodies may be found, and their levels fall in a donor-specific manner during the first month. However, these changes have not previously been described in detail. METHODS: Donor-specific HLA antibody (DSA) and third-party HLA antibody (TPA) levels were measured using the microbead method in 44 presensitized patients who had renal transplantation. RESULTS: Both DSA and TPA levels fell in the first 4 days after transplantation, and the greater falls in DSA indicated absorption by the graft. This occurred for class I (57.8% fall compared with 20.2% for TPA, P<0.0005), HLA DR (63.0% vs. 24.3%, P<0.0004), and HLA DP/DQ/DRB3-4 (34% vs. 17.5%, P=0.014). Peak DSA levels occurred at a mean of 13 days posttransplant and were higher than pretreatment in 25 (57%) patients and lower in 19 (43%) patients (P=ns). The risk of rejection was associated with peak DSA level: 15 of 25 (60%) patients with peak DSA at median fluorescence intensity (MFI) more than 7000 U experienced rejection, compared with 4 of 7 (57%) patients with peak DSA MFI 2000 to 7000 U and 2 of 12 (17%) patients with peak DSA MFI less than 2000 U (P<0.02). DSA levels subsequently fell in a donor-specific manner compared to TPA. CONCLUSION: DSA levels may change markedly in the first month after antibody-incompatible transplantation, and the risk of rejection was associated with higher pretreatment and peak levels.


Subject(s)
Blood Group Incompatibility/immunology , HLA Antigens/immunology , Isoantibodies/immunology , Kidney Transplantation/immunology , Female , Graft Rejection/immunology , HLA-DR Antigens/immunology , Histocompatibility Testing/methods , Humans , Immunosuppressive Agents/therapeutic use , Living Donors/statistics & numerical data , Male , Mycophenolic Acid/analogs & derivatives , Mycophenolic Acid/therapeutic use , Prednisolone/therapeutic use , Tacrolimus/therapeutic use
15.
Transplantation ; 86(3): 474-7, 2008 Aug 15.
Article in English | MEDLINE | ID: mdl-18698253

ABSTRACT

Current methods of measuring ABO antibody levels, based on hemagglutination (HA) titers, are relatively poorly reproducible and do not offer fine discrimination of antibody concentration. We therefore developed a simple and rapid method of measuring ABO antibody levels using flow cytometry (FC). For validation, we analyzed plasma samples from 79 blood donors. Both IgM and IgG were detected and measured, with IgG essentially restricted to blood group O donors. Forty-two successive samples were collected from a blood group O patient undergoing antibody removal and subsequent transplantation from a group A2 donor, and these were tested by both HA and FC. Changes in IgG measured by FC (relative median fluorescence) correlated well with HA titers and, importantly, rejection episodes were preceded by a rising relative median fluorescence. The method allowed quantitative discrimination in the range of antibody levels relevant to ABO-incompatible transplantation and has the advantages over HA of objective measurement and reproducibility.


Subject(s)
ABO Blood-Group System , Blood Group Incompatibility , Flow Cytometry , Immunoglobulin G/blood , Immunoglobulin M/blood , Isoantibodies/blood , Kidney Transplantation , Blood Grouping and Crossmatching , Hemagglutination Tests , Humans , Predictive Value of Tests , Reproducibility of Results , Time Factors
16.
Transplantation ; 84(7): 876-84, 2007 Oct 15.
Article in English | MEDLINE | ID: mdl-17984841

ABSTRACT

BACKGROUND: Accommodation to antibody is an important mechanism in successful ABO-incompatible transplantation, but its importance in human leukocyte antigen (HLA) antibody-incompatible transplantation is less clear, as sensitive techniques facilitating daily measurement of donor-specific HLA antibodies (DSAs) have only recently been developed. METHODS: We report 24 patients who had HLA antibody-incompatible kidney transplantation (21 living donors, 3 deceased), 21 of whom had pretransplant plasmapheresis. Eight had a positive complement-dependent cytotoxic (CDC) crossmatch (XM) before plasmapheresis, nine had a positive flow cytometric (FC) XM, and seven had DSA detectable by microbead analysis only. After transplant, DSA levels were monitored closely with microbead assays. RESULTS: Rejection occurred in five of eight (62.5%) CDC-positive cases, three of nine (33%) FC-positive cases, and two of seven (29%) microbead-only cases, at a median of 6.5 days after transplantation. Resolution occurred at a median of 15 days after transplantation; in 8 of 10 cases this occurred when the microbead DSA level had a median fluorescence intensity (MFI) >2000 U, and in 6 of 10 when the microbead MFI was >4000 U. In 8 of 10 cases, the microbead MFI at the time of resolution was greater than at onset. DSA did not always cause clinical rejection: in five cases with a posttransplant DSA peaking at MFI >2000 U on microbead assay, rejection did not occur. CONCLUSION: These data suggest that the dominant mode of successful transplantation was function of the graft in the presence of circulating DSA, and they also define the period during which this occurred.


Subject(s)
HLA Antigens/immunology , Histocompatibility Testing , Kidney Transplantation/immunology , Adult , Aged , Antibodies/chemistry , Biopsy , Flow Cytometry , Graft Survival , HLA Antigens/chemistry , Humans , Immunosuppressive Agents/therapeutic use , Kidney Transplantation/methods , Living Donors , Middle Aged , Plasmapheresis , Polystyrenes/chemistry , Time Factors
17.
Transpl Int ; 18(7): 806-10, 2005 Jul.
Article in English | MEDLINE | ID: mdl-15948859

ABSTRACT

It is recommended that cyclosporine dosing be based on the whole blood level 2 h after a dose (C2), not the trough level (C0). Initial studies did not, however, establish the outcome of dosing according to C2 levels in long-term patients previously managed by C0 levels. C0 and C2 were measured in 152 stable patients receiving Neoral therapy, a mean of 86.9 months after transplantation. This showed that 38 (25%) had C2 levels above a target range of 700-900 µg/L. Higher C2 levels were associated with higher cholesterol levels (P = 0.0058) and higher diastolic blood pressure (P = 0.0163). Cyclosporine dose reduction was undertaken in 32 patients with high C2 levels. For logistical reasons, C2 was not measured regularly; instead, an individualized C0 target was set for each patient. A 16% reduction in mean cyclosporine dose was achieved, associated with a 28% fall in mean C0, from 212 to 153 µg/L, and a 25% fall in mean C2, from 1075 to 820 µg/L. There was no excess of adverse events in the dose reduction cohort compared with patients with initial C2 levels <900 µg/L. Over a mean 15-month follow-up in the dose reduction cohort, there was a 4.4% reduction in mean diastolic blood pressure, from 84.9 (SEM 2.1) to 80.2 (1.9) mmHg, P = 0.023, and a 10.4% reduction in mean cholesterol, from 5.71 (0.27) to 5.11 (0.25) mmol/L, P = 0.005 (patients starting a statin during follow-up excluded). In patients with initial C2 <900 µg/L, blood pressure did not fall and cholesterol fell by 3.9%, from 5.27 (0.14) to 5.07 (0.15) mmol/L (P = 0.0405). In conclusion, cyclosporine dose reduction was safe in stable long-term renal allograft recipients with high C2 levels, with an improvement in cholesterol levels and a small improvement in blood pressure after dose reduction.


Subject(s)
Cyclosporine/administration & dosage , Cyclosporine/blood , Immunosuppressive Agents/administration & dosage , Immunosuppressive Agents/blood , Kidney Transplantation , Blood Pressure/drug effects , Cholesterol/blood , Cohort Studies , Cyclosporine/therapeutic use , Dose-Response Relationship, Drug , Female , Follow-Up Studies , Humans , Immunosuppressive Agents/therapeutic use , Male , Middle Aged , Time Factors , Transplantation, Homologous
18.
Nephrol Dial Transplant ; 19(2): 444-50, 2004 Feb.
Article in English | MEDLINE | ID: mdl-14736972

ABSTRACT

BACKGROUND: This study was designed to examine the hypothesis that the nephrotoxicities caused by cyclosporin and tacrolimus might differ in respect of sodium and potassium handling. METHODS: 125 patients were studied retrospectively for the first 90 days after renal transplantation. Eighty were treated initially with cyclosporin and 45 with tacrolimus. RESULTS: A serum sodium level of <135 mmol/l was present for 542/5171 (10.5%) days under tacrolimus treatment compared with 377/5486 (6.9%) days under cyclosporin treatment (P < 0.0001). Severe hyponatraemia, below 120 mmol/l, was also more prevalent under tacrolimus than cyclosporin treatment, P < 0.025. Nine patients, all receiving tacrolimus, were treated with fludrocortisone for fluid depletion and/or hyponatraemia. Serum potassium levels were higher in tacrolimus-treated patients (P < 0.0001), and subjects with hyponatraemia were more likely to experience hyperkalaemia (P < 0.0001). CONCLUSIONS: Hyponatraemia and hyperkalaemia were more frequent in tacrolimus-treated subjects. Taken together with previous work showing that hyperuricaemia is more frequent with cyclosporin treatment, and hypomagnesaemia with tacrolimus treatment, these findings are consistent with qualitative differences between the nephrotoxicities of cyclosporin and tacrolimus.


Subject(s)
Cyclosporine/adverse effects , Hyperkalemia/epidemiology , Hyponatremia/epidemiology , Kidney Transplantation/immunology , Tacrolimus/adverse effects , Adult , Cyclosporine/administration & dosage , Female , Follow-Up Studies , Graft Rejection/prevention & control , Graft Survival , Humans , Hyperkalemia/chemically induced , Hyponatremia/chemically induced , Immunosuppressive Agents/administration & dosage , Immunosuppressive Agents/adverse effects , Kidney Failure, Chronic/surgery , Kidney Transplantation/methods , Male , Middle Aged , Postoperative Care , Prevalence , Probability , Retrospective Studies , Risk Assessment , Severity of Illness Index , Tacrolimus/administration & dosage , Transplantation Immunology/drug effects , Transplantation Immunology/physiology , Treatment Outcome