Results 1 - 20 of 27
1.
Transpl Infect Dis ; 20(2): e12859, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29427394

ABSTRACT

BACKGROUND: Invasive fungal infection (IFI) is a severe complication of liver transplantation associated with high mortality. Guidelines recommend targeted rather than universal antifungal prophylaxis based on tiers of risk. METHODS: We aimed to evaluate IFI incidence, risk factors, and outcome after implementation of a simplified two-tiered targeted prophylaxis regimen based on a single broad-spectrum antifungal drug (amphotericin B). Patients presenting 1 or more risk factors according to the literature received prophylaxis. Prospectively collected data on all adult patients transplanted in Turin from January 2011 to December 2015 were reviewed. RESULTS: Patients re-transplanted before postoperative day 7 were counted once, yielding a study cohort of 581 cases. Prophylaxis was administered to 299 (51.4%) patients; adherence to protocol was 94.1%. Sixteen patients developed 18 IFIs, for an overall rate of 2.8%. All IFI cases occurred in the targeted prophylaxis group; no patient in the non-prophylaxis group developed IFI. Most cases (81.3%) presented within 30 days after transplantation, during prophylaxis; the predominant pathogens were molds (94.4%). Only 1 case of candidemia was observed. One-year mortality in IFI patients was 33.3% vs 6.4% in patients without IFI (P = .001); IFI-attributable mortality was 6.3%. At multivariate analysis, significant risk factors for IFI were renal replacement therapy (OR = 8.1) and re-operation (OR = 5.2). CONCLUSIONS: The implementation of a simplified targeted prophylaxis regimen appeared safe and applicable and was associated with low IFI incidence and mortality. The association of IFI with re-operation and renal replacement therapy calls for further studies to identify the optimal prophylaxis in this subset of patients.


Subject(s)
Amphotericin B/pharmacology , Antifungal Agents/pharmacology , Invasive Fungal Infections/prevention & control , Liver Transplantation/adverse effects , Female , Humans , Male , Middle Aged , Mycoses/prevention & control , Risk Factors , Scedosporium
2.
Int J Surg ; 31: 93-9, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27267949

ABSTRACT

BACKGROUND: Pancreaticoduodenectomy is still associated with high morbidity, especially due to pancreatic surgery-related and infectious complications; many risk factors have already been described. The aim of this study was to evaluate the role of preoperative oral immunonutrition in well-nourished patients scheduled for pancreaticoduodenectomy. METHODS: From February 2014 to June 2015, 54 well-nourished patients undergoing pancreaticoduodenectomy were enrolled to receive 5 days of preoperative oral immunonutrition. A series of consecutive patients who underwent the same intervention in the same department with a preoperative standard oral diet was matched 1:1. Demographic, pathological, and surgical variables were considered for analysis. Mortality rate, overall postoperative morbidity, pancreatic fistula, post-pancreatectomy haemorrhage, delayed gastric emptying, infectious complications, and length of hospital stay were described for each group. The chi-square test, Fisher's exact test, and Student's t test were used for comparisons. Differences were considered statistically significant at p < 0.05. Statistical analysis was performed using a freeware Microsoft Excel-based program and SPSS v 10.00. RESULTS: No statistically significant differences in mortality (2.1% in each group) or overall morbidity rate (41.6% vs 47.9%) occurred between the groups, nor for pancreatic surgery-related complications. Conversely, significant differences were found for infectious complications (22.9% vs 43.7%, p = 0.034) and length of hospital stay (18.3 ± 6.8 days vs 21.7 ± 8.3, p = 0.035) in favor of the immunonutrition group. CONCLUSION: Preoperative oral immunonutrition is effective in well-nourished patients scheduled for pancreaticoduodenectomy; it helps to reduce the risk of postoperative infectious complications and the length of hospital stay.


Subject(s)
Common Bile Duct Neoplasms/diet therapy , Common Bile Duct Neoplasms/surgery , Pancreatic Diseases/diet therapy , Pancreatic Diseases/surgery , Pancreaticoduodenectomy/adverse effects , Preoperative Care , Aged , Female , Humans , Length of Stay , Male , Middle Aged , Nutritional Status , Postoperative Complications/etiology , Postoperative Complications/prevention & control , Risk Factors
3.
Springerplus ; 5: 7, 2016.
Article in English | MEDLINE | ID: mdl-26759746

ABSTRACT

Contamination of perfusion fluid (PF) can lead to serious infections in kidney transplant recipients. Preemptive therapy (PE-T) in case of yeast contamination of PF is mandatory, but the usefulness of PE-T in the presence of bacteria remains unclear. In this study we evaluated the incidence of bacterial PF contamination and the impact of PE-T on clinical outcome. Microbiological data of 290 PF samples and clinical data of the corresponding recipients, collected in our hospital from January 2010 to December 2012, were analyzed. Recipients with bacterially contaminated PF (n = 101) were divided into 3 groups: group 1 (n = 52), PE-T-treated, with bacteria resistant to perioperative antibiotic prophylaxis (PAP); group 2 (n = 28), bacteria sensitive to PAP; group 3 (n = 21), PE-T-untreated, with bacteria resistant to PAP. The incidence of positive PF was 34.8%; staphylococci accounted for 50.4% of isolates and C. albicans for 9.9%. No significant differences in the rate of PF-related infections were found between the three groups. In conclusion, although PF contamination is frequent, the incidence of PF-related infections is very low. In addition, in this study PE-T did not help to reduce the rate of PF-related infection, suggesting that a reasonable reduction in the use of antibiotic therapy could be made. However, pending larger prospective clinical trials to confirm our findings, close clinical and microbiological monitoring of the recipient is highly recommended in case of PF contamination.

4.
Transplant Proc ; 46(7): 2259-62, 2014 Sep.
Article in English | MEDLINE | ID: mdl-25242765

ABSTRACT

BACKGROUND: Kidney biopsy (KB) represents the criterion standard for obtaining information on the diagnosis and prognosis of renal allograft dysfunction. However, it can be associated with bleeding complications (BCs). The bleeding time test (BTT), the best predictive indicator of post-biopsy BCs, is invasive and poorly reproducible. Therefore, the aim of this study was to evaluate whether the platelet function analyzer (PFA-100), a reliable test of primary hemostasis, could be useful in predicting the risk of bleeding complications in transplant patients undergoing KB. METHODS: We carried out a retrospective analysis of PFA-100 collagen-epinephrine (C-EPI) and collagen-adenosine diphosphate (C-ADP) closure times in 119 patients undergoing KB in our center. Data regarding BTT, age, sex, blood pressure, number of renal allograft punctures per biopsy procedure, thromboplastin time, prothrombin time, complete blood count, and prophylactic therapy with desmopressin were also collected. Major (need for blood transfusion) or minor (no need for any intervention) BCs (hematoma and hematuria) were recorded. RESULTS: Indications for KB were: delayed graft function (n=23), allograft dysfunction (n=40), proteinuria (n=27), allograft dysfunction plus proteinuria (n=19), and protocol biopsy (n=10). Nine of the 119 patients (7.5%) developed minor BCs (6 macrohematuria, 3 hematoma); no major BCs occurred. No significant differences were found in any of the clinical and laboratory data, including BTT and PFA-100 (C-EPI and C-ADP), between patients who developed BCs and those who did not. In addition, there was no correlation between PFA-100 (C-EPI and C-ADP) values and BTT data (R2=0.002; P=.6). CONCLUSIONS: The PFA-100 test was not useful in predicting the risk of BCs in kidney transplant patients undergoing renal allograft biopsy.


Subject(s)
Biopsy/adverse effects , Hematoma/etiology , Hematuria/etiology , Kidney Transplantation , Platelet Function Tests , Female , Humans , Kidney/pathology , Male , Middle Aged , Retrospective Studies , Transplantation, Homologous
5.
Transplant Proc ; 46(7): 2308-11, 2014 Sep.
Article in English | MEDLINE | ID: mdl-25242775

ABSTRACT

BACKGROUND: Hepatitis B virus (HBV) recurrence after liver transplantation (LT) has practically disappeared with prophylaxis combining anti-hepatitis B surface antigen immunoglobulins (HBIg) and antiviral drugs. Recently, cost-saving requirements pushed us to move from a fixed schedule of 50,000 IU intravenous HBIg in the first month after LT to an "on demand" administration guided by close monitoring of HBV surface antigen (HBsAg) and anti-HBV surface antigen antibody (HBsAb), with a serological target of HBsAg negativity and HBsAb >300 mIU/mL. In this context, we investigated the value of quantitative HBsAg determination at LT in predicting the need for HBIg in the first month after LT. METHODS: From February 2012 to July 2013, we performed 35 LTs in HBsAg-positive patients, 18 of whom had hepatitis Delta virus coinfection (Delta-positive). Anti-HBV prophylaxis was based on nucleos(t)ide analogues from day 1 post-LT and intravenous HBIg (10,000 IU intraoperatively and, in the following days, 5,000 IU and 2,500 IU pulses to reach and maintain the serological target). RESULTS: The quantitative HBsAg level at LT was significantly higher in Delta-positive recipients. Complete negativization of HBsAg and an HBsAb serum level >300 mIU/mL were achieved on day 3 in Delta-positive and on day 2 in Delta-negative recipients. A positive linear correlation between the quantitative HBsAg level at LT and the HBIg administered in the first month after LT was observed (rho = .788), with a total of 32,500 IU HBIg used in Delta-positive and 22,000 IU in Delta-negative recipients (P=.0016). Compared with the old schedule, we saved a median of 14,750 IU in Delta-positive and 28,000 IU in Delta-negative recipients. No HBV recurrence was observed over a median follow-up of 10.5 months. CONCLUSIONS: Delta-positive patients need higher doses of HBIg to reach the serological target after LT because they have higher quantitative HBsAg levels at LT. In future studies, pre-LT quantitative HBsAg determination will be helpful to predict the actual need for HBIg early after LT.


Subject(s)
Hepatitis B Surface Antigens/blood , Hepatitis B/prevention & control , Immunoglobulins, Intravenous/therapeutic use , Immunoglobulins/therapeutic use , Liver Transplantation , Postoperative Care/methods , Postoperative Complications/prevention & control , Adult , Aged , Aged, 80 and over , Antiviral Agents/therapeutic use , Biomarkers/blood , Drug Therapy, Combination , Female , Hepatitis B/diagnosis , Hepatitis B/etiology , Hepatitis B/immunology , Humans , Male , Middle Aged , Postoperative Complications/diagnosis , Postoperative Complications/immunology , Postoperative Complications/virology , Recurrence , Treatment Outcome
6.
Int J Immunopathol Pharmacol ; 25(1): 75-85, 2012.
Article in English | MEDLINE | ID: mdl-22507320

ABSTRACT

Paracrine mediators released from endothelial progenitor cells (EPCs) have been implicated in neoangiogenesis following ischemia. Recently, we demonstrated that microvesicles (MVs) derived from EPCs are able to activate an angiogenic program in quiescent endothelial cells by a horizontal transfer of RNA. In this study we aimed to investigate whether EPC-derived MVs are able to induce neoangiogenesis and to enhance recovery in a murine model of hindlimb ischemia. Hindlimb ischemia was induced in severe combined immunodeficient (SCID) mice by ligation and resection of the left femoral artery, and mice were treated with EPC-derived MVs (MVs), RNase-inactivated MVs (RNaseMVs), fibroblast-derived MVs, or vehicle alone as control (CTL). Since the MVs contained the angiogenic miR-126 and miR-296, we evaluated whether microRNAs may account for the angiogenic activities by treating mice with MVs obtained from DICER-knockdown EPCs (DICER-MVs). Limb perfusion evaluated by laser Doppler analysis demonstrated that MVs significantly enhanced perfusion with respect to CTL (0.50±0.08 vs 0.39±0.03, p<0.05). After 7 days, immunohistochemical analyses of the gastrocnemius muscle of the ischemic hindlimb showed that MVs, but not fibroblast-MVs, significantly increased capillary density with respect to CTL (MVs vs CTL: 24.7±10.3 vs 13.5±6, p<0.0001; fibroblast-MVs vs CTL: 10.2±3.4 vs 13.5±6, ns); RNaseMVs and DICER-MVs significantly reduced the effect of MVs (RNaseMVs vs CTL: 15.7±4.1 vs 13.5±6, ns; MVs vs DICER-MVs: 24.7±10.3 vs 18.1±5.8, p<0.05), suggesting a role of the RNAs shuttled by MVs. Morphometric analysis confirmed that MVs enhanced limb perfusion and reduced injury. The results of the present study indicate that treatment with EPC-derived MVs improves neovascularization and favors regeneration in severe hindlimb ischemia induced in SCID mice. This suggests a possible use of EPC-derived MVs for the treatment of peripheral arterial disease.


Subject(s)
Cell-Derived Microparticles/physiology , Endothelial Cells/physiology , Hindlimb/blood supply , Ischemia/physiopathology , Neovascularization, Physiologic , Stem Cells/physiology , Animals , Capillaries , Cells, Cultured , Humans , Ischemia/pathology , Mice , Mice, SCID , MicroRNAs/physiology , Muscle, Skeletal/pathology
7.
Am J Transplant ; 10(10): 2270-8, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20840477

ABSTRACT

The development of proteinuria has been observed in kidney-transplanted patients on m-TOR inhibitor (m-TORi) treatment. Recent studies suggest that m-TORi(s) may alter the behavior and integrity of glomerular podocytes. We analyzed renal biopsies from kidney-transplanted patients and evaluated the expression of nephrin, a critical component of the glomerular slit diaphragm. In a group of patients on 'de novo' m-TORi treatment, the expression of nephrin within glomeruli was significantly reduced in all cases compared with pretransplant donor biopsies. Biopsies from control transplant patients not treated with m-TORi(s) did not show a loss of nephrin. In a group of patients subsequently converted to m-TORi treatment, a protocol biopsy performed before the introduction of m-TORi was also available. The expression of nephrin in the pre-m-TORi biopsies was similar to that observed in the pretransplant donor biopsies but was significantly reduced after the introduction of m-TORi(s). Proteinuria increased after m-TORi initiation in this group, although in some cases it remained normal despite the reduction of nephrin. In vitro, sirolimus downregulated nephrin expression by human podocytes. Our results suggest that m-TORi(s) may affect nephrin expression in kidney-transplanted patients, consistent with the in vitro observation in cultured podocytes.


Subject(s)
Kidney Glomerulus/metabolism , Kidney Transplantation/adverse effects , Membrane Proteins/biosynthesis , Sirolimus/adverse effects , TOR Serine-Threonine Kinases/antagonists & inhibitors , Adult , Aged , Cells, Cultured , Humans , Middle Aged , Podocytes/metabolism , Proteinuria/chemically induced , Retrospective Studies
8.
Transplant Proc ; 42(6): 2209-13, 2010.
Article in English | MEDLINE | ID: mdl-20692446

ABSTRACT

There is a strong need among the transplantation community to identify common criteria for utilizing the pool of expanded criteria donors (ECD), considering the disparity between organ demand and supply as well as the long-term survival benefit of transplantation, even with these organs, compared with remaining on dialysis. The purpose of this article was to analyze the scoring systems proposed in the literature by Nyberg, Anglicheau, Rao (Kidney Donor Risk Index), and Schold, seeking to verify whether our clinical and histological allocation strategy matched the Nyberg score. Herein we report the results of a preliminary retrospective study on the 5-year outcomes of organs from 60 marginal donors, who were older than 50 years and histologically evaluated before implantation. The donors matched Nyberg classes C and D, that is, marginal donors. We noted a tendency toward an association between global and vascular scores and class D (odds ratios 2.2 and 4.3, respectively). Kaplan-Meier graft survival curves were similar to the Nyberg data: 83% for class C versus 73% for class D at 5 years. Independently of the Nyberg score, the only feature predictive of renal function at 5 years in our population was donor hypertension. Further studies are required to identify which of the scoring systems, clinical and/or histological, is more suitable to allocate ECD kidneys and to predict recipient outcomes.


Subject(s)
Kidney Transplantation/physiology , Adult , Aged , Aged, 80 and over , Confidence Intervals , Glomerular Filtration Rate/physiology , Histocompatibility Testing/methods , Humans , Kidney Transplantation/adverse effects , Kidney Transplantation/methods , Kidney Transplantation/mortality , Middle Aged , Odds Ratio , Patient Selection , Regression Analysis , Retrospective Studies , Risk Assessment , Tissue Donors/statistics & numerical data , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/standards
9.
Acta Anaesthesiol Scand ; 54(8): 970-8, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20626358

ABSTRACT

BACKGROUND: Early extubation after liver transplantation (LT) is an increasingly applied and safe practice. The aim of the present study was to provide a simple extubation rule for accelerated weaning in the operating room (OR). METHODS: Data of 597 patients transplanted at the LT center of Turin (Italy) were retrospectively analyzed. Fifty-two nonextubated patients (excluding those with a scheduled early reoperation) were compared with 545 successfully extubated patients (not in need of reintubation within the first 48 h). Significant variables at univariate analysis were entered into a logistic regression model, and the regression coefficients of the independent predictors were used to yield a prognostic score called the safe operating room extubation after liver transplantation (SORELT) score. RESULTS: Two major and three minor criteria were found. The major criteria were blood transfusion (higher than or equal to 7 U of packed red blood cells) and end-of-surgery lactate (higher than or equal to 3.4 mmol/l). The minor criteria were status before LT (home vs. hospitalized patient), duration of surgery (longer than or equal to 5 h), and vasoactive drugs at the end of surgery (dopamine higher than 5 microg/kg/min or norepinephrine higher than 0.05 microg/kg/min). Patients who fulfill the SORELT score-derived criteria (fewer than two major, one major plus two minor, or three minor criteria) can be considered for OR extubation. CONCLUSION: Early extubation after LT requires a very careful assessment of the preoperative, intraoperative, graft, and postoperative care data available. The SORELT score helps as a simple and objective aid in making such a decision.
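The SORELT decision rule is compact enough to sketch in code. The following is an illustration only, not clinical software: the function name, argument encoding, and boolean treatment of the thresholds are our own, and the published score is explicitly an aid to, not a substitute for, the full clinical assessment the authors describe.

```python
def sorelt_or_extubation_candidate(
    prbc_units: int,                 # packed red blood cells transfused (U)
    end_surgery_lactate: float,      # mmol/l at end of surgery
    hospitalized_pre_lt: bool,       # in hospital (vs. at home) before LT
    surgery_hours: float,            # duration of surgery
    dopamine_ug_kg_min: float,       # dopamine dose at end of surgery
    norepinephrine_ug_kg_min: float, # norepinephrine dose at end of surgery
) -> bool:
    """Return True if the SORELT criteria allow considering OR extubation.

    Major criteria: >=7 U PRBC transfused; end-of-surgery lactate >=3.4 mmol/l.
    Minor criteria: hospitalized before LT; surgery >=5 h; vasoactive support
    (dopamine >5 or norepinephrine >0.05 microg/kg/min).
    Extubation may be considered when the patient has fewer than
    two major, one major plus two minor, or three minor criteria.
    """
    majors = int(prbc_units >= 7) + int(end_surgery_lactate >= 3.4)
    minors = (
        int(hospitalized_pre_lt)
        + int(surgery_hours >= 5)
        + int(dopamine_ug_kg_min > 5 or norepinephrine_ug_kg_min > 0.05)
    )
    disqualified = (
        majors >= 2
        or (majors >= 1 and minors >= 2)
        or minors >= 3
    )
    return not disqualified
```

For example, a patient transfused 2 U PRBC with an end-of-surgery lactate of 1.8 mmol/l, admitted from home after a 4-hour procedure on no vasoactive support, would meet the rule, while any combination reaching two major, one major plus two minor, or three minor criteria would not.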


Subject(s)
Intubation, Intratracheal , Liver Transplantation/physiology , Adult , Aged , Anesthesia, General , Area Under Curve , Blood Transfusion , Catheterization, Swan-Ganz , Device Removal , Female , Fluoroscopy , Humans , Lactic Acid/blood , Male , Middle Aged , Prognosis , ROC Curve , Reproducibility of Results , Risk Assessment , Safety , Young Adult
10.
Clin Transplant ; 23(5): 653-9, 2009.
Article in English | MEDLINE | ID: mdl-19563485

ABSTRACT

Cardiac screening is recommended to prevent cardiovascular death after renal transplantation. This retrospective observational study illustrates the results of applying a cardiac assessment algorithm in a series of 558 renal transplant candidates at a single center in Turin, Italy. A dipyridamole-stress sestamibi myocardial scintiscan (DMS), performed in 302/558 (54.1%) cases, was positive in 52 (17.2%), negative in 200 (66.2%), borderline in 16 (5.3%), and showed signs of previous necrosis in 34 (11.4%). Coronary lesions, detected by angiography in 48.1% of the 52 positive patients, were treated medically (13.5%) or by a percutaneous/surgical procedure (34.6%). Coronary lesions were detected in 14.1% of the asymptomatic subgroup. The minor cardiovascular event, major cardiovascular event, and cardiovascular death rates were 1.9%, 0%, and 0%, respectively, in the positive DMS group (high cardiological risk) vs. 10%, 4.5%, and 3.5% in the negative group (p > 0.5; n.s.). It is suggested that the absence of increased cardiovascular event and death rates in the high-risk group reflects early coronary lesion detection and correction. Since 55.9% of the cardiovascular events or deaths in the negative group occurred more than 24 months after the DMS, mandatory repetition of the test every two years after a negative finding is recommended.


Subject(s)
Cardiovascular Diseases/etiology , Kidney Failure, Chronic/surgery , Kidney Transplantation , Waiting Lists , Adult , Aged , Algorithms , Cardiovascular Diseases/physiopathology , Dipyridamole , Exercise Test , Female , Humans , Italy , Kidney Failure, Chronic/complications , Male , Middle Aged , Prognosis , Retrospective Studies , Risk Factors , Survival Rate , Treatment Outcome
11.
Transplant Proc ; 41(4): 1316-8, 2009 May.
Article in English | MEDLINE | ID: mdl-19460550

ABSTRACT

During orthotopic liver transplantation (OLT), various situations may occur in which biliary reconstruction is neither technically feasible nor advisable. One bridge to a delayed anastomosis is an external biliary fistula (EBF). This procedure allows the surgeon to perform hemostatic maneuvers, such as abdominal packing, so that biliary reconstruction can subsequently be carried out in a bloodless operative field without edematous tissues. An EBF is created by placing in the donor biliary tract a cannula that is fixed to the bile duct with 2-0 silk ties and secured outside the abdominal wall. The biliary anastomosis is then performed within 2 days after the OLT. The aim of this study was to examine the safety of EBF in terms of the incidence of biliary complications compared with a direct anastomosis. Among 1,634 adult OLTs performed over 17 years in our center, 1,322 were carried out with termino-terminal (end-to-end) hepaticocholedochostomy (HC-TT); 2 with side-to-side hepaticocholedochostomy; 208 with hepaticojejunostomy (HJ); 31 with EBF and delayed HC-TT; and 71 with EBF and delayed HJ. The biliary complication rate in the EBF group was 24.5%: 23.9% with delayed HJ and 25.8% with delayed HC-TT. The biliary complication incidence among all OLTs was 24.6% (P = NS). No complications related to the procedure were observed. EBF is therefore a safe technique without a higher biliary complication rate, and may be useful when a direct biliary anastomosis is dangerous.


Subject(s)
Biliary Fistula/etiology , Liver Transplantation/adverse effects , Humans
12.
G Ital Nefrol ; 26(2): 191-200, 2009.
Article in Italian | MEDLINE | ID: mdl-19382075

ABSTRACT

Whether to consider a uremic patient for retransplantation remains a matter of debate. Donor shortage and putatively poor outcomes are the main arguments against; improved results in the last decade and better survival with retransplantation than on dialysis (HR 0.50) are the arguments in favor. The percentage of patients waitlisted for retransplantation or already retransplanted is increasing (up to 20-30%), and the absolute contraindications are limited to rare conditions (loss of the previous transplant due to anti-glomerular basement membrane antibodies in Alport's syndrome, early recurrence of GNF, or hemolytic uremic syndrome). When retransplantation is considered, however, careful screening for risk factors is mandatory, whether they relate to the previous graft, the recipient's clinical features, or the donor's demographics and immunological status. In the last decade the clinical outcomes of retransplantation have significantly improved. No difference in patient survival at the fifth year has been reported between first, second, and third grafts. Kidney survival at the same interval is above 70% for the second graft and 65% for the third graft. Nephrectomy of a previous graft is not necessary unless clinically indicated. As to the maximum number of retransplants, most transplant centers (69%) set no clear-cut limit. In conclusion, also considering that many patients ask for readmission to the waiting list after graft failure (75% in our experience), we think the retransplantation option should always be evaluated.


Subject(s)
Kidney Transplantation , Graft Survival , Humans , Kidney Transplantation/mortality , Prognosis , Reoperation , Survival Rate , Treatment Outcome
13.
Int J Immunopathol Pharmacol ; 22(4): 1135-41, 2009.
Article in English | MEDLINE | ID: mdl-20074480

ABSTRACT

Immunosuppressive treatment has changed the prognosis of lupus nephritis over time, but improvement in prognosis is difficult to compare across historical periods and is better demonstrated against the life expectancy of sex- and age-matched people. Long-term patient and renal survival of 90 patients diagnosed with lupus nephritis at our center from 1968 to 2001, with a follow-up of 14+/-8 years, was retrospectively evaluated. Patient and kidney survival significantly increased over time. Multivariate analyses showed that the risks of patient and renal death decreased by 8% with each year of follow-up and increased more than 5-fold in patients aged >30 years at diagnosis. As only 14 patients were men, relative survival compared with the sex- and age-matched general population of the Piedmont Region was calculated for the 76 women. Improvement in the survival of this cohort of women was seen at every follow-up interval: relative survival was sharply lower in the first period (0.784, 0.665, and 0.620 at 5, 10, and 15 years, respectively) and increased in the second (0.939, 0.921, and 0.850, respectively), nearly approaching that expected for the general population (0.993, 0.983, and 0.967, respectively). Taken together, our data allow us to conclude that life expectancy in women with lupus nephritis has improved over time, paralleling improved awareness of the disease and a significant increase in steroid pulse therapy for the induction/remission phase. For the first time, the improvement in survival is demonstrated to close the gap with the life expectancy of the general population for women with lupus nephritis.


Subject(s)
Immunosuppressive Agents/therapeutic use , Life Expectancy , Lupus Nephritis/drug therapy , Lupus Nephritis/mortality , Women's Health , Adult , Age Factors , Cause of Death , Female , Humans , Italy/epidemiology , Kaplan-Meier Estimate , Male , Proportional Hazards Models , Retrospective Studies , Risk Assessment , Risk Factors , Sex Factors , Time Factors , Treatment Outcome , Young Adult
14.
Lupus ; 16(11): 881-6, 2007.
Article in English | MEDLINE | ID: mdl-17971361

ABSTRACT

Polyomavirus BK (BKV) reactivation can occur in immunodeficient patients, but few studies on BKV infection in patients with systemic lupus erythematosus (SLE) nephritis are available. The aim of this study was to analyse the prevalence of BKV infection by quantifying viral load and to investigate its association with clinical and histological parameters indicating the duration, type, and activity of SLE. BKV-DNA was evaluated by polymerase chain reaction in serum (sBKV) and urine (uBKV) specimens from 40 patients with SLE nephritis and 29 healthy controls. Renal function, urinary activity, a clinical index of SLE activity [SLE Disease Activity Index (SLEDAI) score], CD4+/CD8+ ratio, histological class, and duration of SLE nephritis were compared according to BKV-DNA positivity. sBKV was present in 15% of SLE patients and 13.8% of controls; uBKV in 32% of SLE patients and 17.2% of controls. There was no significant difference in kidney function, urinary activity, SLEDAI score, presence of anti-dsDNA antibodies, or CD4+/CD8+ ratio according to BKV viremia and/or viruria, nor was there a significant correlation between SLEDAI score, anti-dsDNA antibody titers, and median viral load. The duration of nephropathy tended to be shorter in patients with BKV viremia and/or viruria; the proteinuria/creatininuria ratio tended to be higher in patients with positive sBKV and uBKV. BKV-DNA positivity tended to be more frequent in patients treated with an immunosuppressive agent than in those on steroid treatment. Reactivation of BKV infection can occur in patients with SLE, although prevalence data do not differ significantly from those obtained in the control group. The trend toward an association of BKV infection with the degree of proteinuria and a shorter duration of SLE nephritis could indicate greater susceptibility to BKV infection in the more active phases of the disease.
The role of BKV reactivation in terms of clinical parameters and histological pattern, as well as the role of therapeutic protocols in the onset of BKV reactivation and, conversely, the therapeutic implication of BKV reactivation in SLE patients remain to be defined and should be addressed in further studies on a larger number of patients.


Subject(s)
BK Virus/pathogenicity , Lupus Nephritis/complications , Lupus Nephritis/virology , Polyomavirus Infections/epidemiology , Virus Latency/immunology , Adult , BK Virus/genetics , BK Virus/physiology , Case-Control Studies , DNA, Viral/blood , DNA, Viral/urine , Female , Follow-Up Studies , Humans , Kidney Function Tests , Lupus Nephritis/pathology , Male , Middle Aged , Prevalence , Severity of Illness Index
15.
Scand J Clin Lab Invest ; 66(4): 299-307, 2006.
Article in English | MEDLINE | ID: mdl-16777758

ABSTRACT

OBJECTIVE: Three main tests are commonly employed for the measurement of proteinuria: the dipstick test, the urinary protein/creatinine ratio (P/C) and the 24-h urine collection. The aim of this study was to evaluate the correlation between these methods, comparing linear regression and ROC curve data. MATERIAL AND METHODS: A total of 297 consecutive outpatients with different renal diseases were included in the study. Twenty-four-hour proteinuria was considered the reference test. RESULTS: A high degree of correlation was observed between all the tests (p<0.0001), the highest regression coefficient being between 24-h proteinuria and P/C (R=0.82), and the lowest between P/C and the dipstick test (R=0.72). The dipstick test failed to detect pathological proteinuria in 94 patients (31.6%). Therefore, in these subjects, the patterns of proteinuria were assessed by immunofixation and sodium dodecyl sulphate (SDS) electrophoresis. CONCLUSIONS: Our data strongly support the use of urinary P/C for the detection of proteinuria, at least in nephrology units, where the prevalence of proteinuria is likely to be high.


Subject(s)
Creatinine/urine , Kidney Diseases/diagnosis , Proteinuria/diagnosis , Reagent Strips , Adolescent , Adult , Aged , Aged, 80 and over , Humans , Kidney Diseases/urine , Linear Models , Middle Aged , Proteinuria/etiology , Proteinuria/urine , ROC Curve , Reference Values , Time Factors
16.
Transplant Proc ; 37(2): 721-5, 2005 Mar.
Article in English | MEDLINE | ID: mdl-15848513

ABSTRACT

INTRODUCTION: The worldwide organ shortage and the increasing age of end-stage renal disease patients awaiting a graft have prompted extensive use of marginal donors. The "old-for-old" allocation has been proposed for the elderly. The aim of this study was to evaluate the results of a policy of free acceptance onto the waiting list of recipients older than 65 years. METHODS: From 1987 to 2004, 70 patients (mean age 67.4 +/- 2.8 years) underwent an extensive pretransplant evaluation including cardiac studies. Immunosuppression was based on low-dose steroids and cyclosporine (50%) or tacrolimus (44%). RESULTS: Patient and graft survival at 1, 3, 5, and 10 years were 85%, 78.5%, 75%, and 50%, and 80%, 74%, 70%, and 36%, respectively. Death occurred in 17/70 (24%), 14 of whom had a functioning graft. The causes of death were cancer (30%), cardiovascular disease (23%), sepsis (23%), cerebrovascular hemorrhage (12%), and meningitis (12%). The acute rejection (AR) rate was 18.6%. The causes of graft loss were patient death (71%), irreversible AR (4%), vascular thrombosis (4%), and chronic allograft dysfunction (21%). The main complications were prostatic hypertrophy (52%), urinary tract infections (40%), diabetes (8.6%), pneumonia (11%), cardiovascular diseases (10%), urological complications (7%), abdominal pathology (8%), acute pyelonephritis (6%), and non-skin cancer (8%). CONCLUSIONS: Despite the increased vulnerability of the elderly, they should not be excluded a priori from renal transplantation. Extensive pretransplant screening, mainly cardiovascular, and tailored immunosuppression are two crucial issues. The moderate AR rate suggests that these patients do not have impaired immunocompetence as far as acute rejection is concerned.


Subject(s)
Age Factors , Graft Survival/physiology , Kidney Transplantation , Adult , Aged , Contraindications , Heart Function Tests , Humans , Immunosuppressive Agents/therapeutic use , Kidney Diseases/physiopathology , Kidney Diseases/surgery , Kidney Transplantation/mortality , Middle Aged , Survival Analysis , Tissue Donors/supply & distribution
17.
Minerva Anestesiol ; 69(5): 365-70, 2003 May.
Article in Italian | MEDLINE | ID: mdl-12768168

ABSTRACT

We evaluated 481 liver donors to assess the incidence of positive cultures on samples obtained before harvesting, at harvesting, and from preservation fluid; to determine factors related to positive cultures in the donor; to analyse bacterial and fungal transmission from donor to recipient; and to verify the influence of donor culture positivity on graft and patient survival. Cultures were positive in 232 of 481 (48%) donors. Bacteremia was present in 101 of 481 (20%) donors. Intensive care length of stay was significantly longer in culture-positive donors. Transmission of a Gram-negative bacterium from an infected donor to the graft recipient was proven in 1 case. No differences in 1-year survival or retransplantation rates were found between patients receiving livers from culture-positive and culture-negative donors. In conclusion, donor-to-host transmission of infection, even if rare, is proven. Extended criteria for organ procurement may explain the high number of culture-positive donors we report. Careful microbiological surveillance and treatment can reduce the negative clinical impact on recipient outcome.


Subject(s)
Infections/transmission , Liver Transplantation , Liver/microbiology , Living Donors , Humans , Retrospective Studies
18.
Dig Liver Dis ; 34(2): 122-6, 2002 Feb.
Article in English | MEDLINE | ID: mdl-11926555

ABSTRACT

BACKGROUND: The risk of hepatic artery thrombosis after orthotopic liver transplantation is higher in cases of poor hepatic arterial inflow, small or anomalous recipient hepatic arteries, or unsafe native hepatic arteries. AIMS: To assess the use of arterial conduits as an alternative technique for graft revascularization. PATIENTS: We reviewed 600 consecutive orthotopic liver transplantations in 545 adult patients performed at the Liver Transplant Center of the "S. Giovanni Battista" Hospital in Torino from 1990 to 1999. METHODS: In 95 orthotopic liver transplantations (15.8%) in 88 patients, the graft was supplied by an infrarenal conduit, while in 505 orthotopic liver transplantations (84.2%) in 457 patients a direct anastomosis was used. RESULTS AND CONCLUSIONS: The overall incidence of hepatic artery thrombosis in our series was 3.5% (21/600): 5.3% (5/91) for conduits and 3.2% (16/505) for the standard technique (p = ns, chi-square test). Actuarial 5-year graft survival was 67.7% for conduits and 68.6% for the standard technique (p, log-rank: ns). Torsion of the iliac prosthesis was the only complication related to the use of the infrarenal iliac conduit. The arterial conduit, fashioned from donor iliac artery, is an effective and safe revascularization technique in patients at high risk of arterial thrombosis.
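The non-significant thrombosis difference reported above (5/91 for conduits vs 16/505 for the standard technique) can be checked with a quick calculation. The sketch below uses the published counts and a hand-rolled 2x2 chi-square with Yates' continuity correction; the correction is an assumption, since the abstract only states that a chi-square test was used:

```python
# Chi-square check on the reported hepatic artery thrombosis counts:
# conduits 5/91, standard technique 16/505 (counts from the abstract).

def chi2_yates(a, b, c, d):
    """2x2 chi-square statistic with Yates' continuity correction."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        stat += (abs(obs - expected) - 0.5) ** 2 / expected
    return stat

# a = thrombosis with conduit, b = no thrombosis with conduit,
# c = thrombosis with standard technique, d = no thrombosis.
stat = chi2_yates(5, 91 - 5, 16, 505 - 16)

# The critical value for chi-square with 1 df at alpha = 0.05 is 3.841;
# a statistic below it is consistent with the reported "p = ns".
print(round(stat, 3), stat < 3.841)
```

The statistic comes out well below the 3.841 threshold, which agrees with the authors' "p = ns" despite the conduit group's nominally higher thrombosis rate.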


Subject(s)
Hepatic Artery , Liver Transplantation/methods , Liver/blood supply , Vascular Surgical Procedures/methods , Adolescent , Adult , Anastomosis, Surgical/adverse effects , Child , Child, Preschool , Graft Survival , Hepatic Artery/surgery , Humans , Incidence , Middle Aged , Thrombosis/etiology , Treatment Outcome , Vascular Patency , Vascular Surgical Procedures/adverse effects
19.
J Nephrol ; 14(3): 169-75, 2001.
Article in English | MEDLINE | ID: mdl-11439740

ABSTRACT

BACKGROUND: Since dialysis has brought long-term survival to uremic patients, we can now speculate on more subtle problems deriving from imbalance or sub-optimal regulation of some elements such as trace metals. We focused on rubidium (Rb) status in hemodialysis (HD) patients, as concerns about its possible deficiency have been raised. METHODS: Rb in uremic patients was evaluated by: A) serum concentration (graphite furnace atomic absorption spectroscopy) in blood samples from 70 patients on chronic HD compared with 75 controls; B) tissue concentration (neutron activation analysis) in autopsy or biopsy samples from 20 HD patients compared with 21 controls; C) in vivo intradialytic mass balance during standard bicarbonate dialysis in 8 HD patients. RESULTS: A) Serum Rb concentrations in HD patients were significantly lower than in normal controls (304 +/- 81 micrograms/L versus 350 +/- 74 micrograms/L, p < 0.001; log-transformed 5.68 +/- 0.28 versus 5.84 +/- 0.20, p < 0.001). Univariate logistic regression analysis found a significantly higher risk of serum Rb below 250, 300, and 350 micrograms/L in uremic patients than in controls (odds ratios 12.6, 95% CI 2.77-57.04; 4.0, 95% CI 1.92-8.4; and 2.08, 95% CI 1.02-4.25, respectively). B) Rb was significantly lower in tissues of HD patients, including brain (2250 +/- 1520 ng/g versus 5490 +/- 1250 ng/g, p = 0.0002), than in normal controls. C) Rb was transferred from the patients' blood to the dialysis bath during a standard bicarbonate dialysis session, giving a mean intradialytic Rb removal of 4.0 +/- 1.1 mg/session. CONCLUSIONS: These results confirm that Rb deficiency may arise in uremic patients, and indicate that diffusive dialysis treatments allow Rb removal which, with a standard bicarbonate schedule, does not seem to exceed that expected with normal urine output (20 mg/week). Further studies are needed to clarify the roles of the many factors in this Rb deficiency, including the effects of uremia itself, pre-dialysis factors (diet, impaired renal function, and drugs), dialysis procedures (frequency, hours, diffusive/convective components), and other biochemical/clinical parameters (hemoglobin, body mass index, age). The finding of Rb deficiency in uremia is important because Rb has a role in neurobehavioural functions, mainly as an antidepressant. As Rb deficiency may be implicated in central nervous system alterations that strongly influence quality of life, we believe that monitoring serum Rb in uremic patients and clarifying the causal mechanisms of deficiency will facilitate future therapeutic approaches.
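The conclusion's comparison between dialytic removal and normal urinary output can be made concrete with a back-of-the-envelope calculation. The thrice-weekly schedule below is an assumption (the usual chronic HD regimen), not stated in the abstract:

```python
# Weekly dialytic Rb removal, using the reported mean intradialytic
# removal of 4.0 mg/session and assuming three sessions per week
# (a standard chronic HD schedule; the abstract does not specify it).
removal_per_session_mg = 4.0
sessions_per_week = 3
weekly_dialytic_removal_mg = removal_per_session_mg * sessions_per_week

# Reference value from the abstract: ~20 mg/week with normal urine output.
normal_urinary_output_mg_per_week = 20.0

print(weekly_dialytic_removal_mg)  # 12.0
print(weekly_dialytic_removal_mg < normal_urinary_output_mg_per_week)  # True
```

Under this assumption, a standard bicarbonate schedule removes roughly 12 mg/week, below the ~20 mg/week expected with normal urine output, which is consistent with the authors' conclusion that dialytic removal alone does not explain the observed deficiency.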


Subject(s)
Kidney Failure, Chronic/metabolism , Kidney Failure, Chronic/therapy , Renal Dialysis/adverse effects , Rubidium/deficiency , Rubidium/metabolism , Adult , Aged , Aged, 80 and over , Female , Humans , Male , Middle Aged