1.
Clin Nutr ESPEN; 51: 323-335, 2022 Oct.
Article in English | MEDLINE | ID: mdl-36184224

ABSTRACT

INTRODUCTION: Malnutrition is a widely prevalent yet under-recognized problem in patients with end-stage liver disease (ESLD) awaiting liver transplant (LT). The aim of this study was to ascertain the pre-LT nutritional status of adult Asian Indian patients with ESLD, including functional deficit (frailty), and to determine predictors of poor post-LT outcome (futility). METHODS: This was a prospective, longitudinal, single-center study that enrolled adults listed for LT between October 2014 and April 2018. Consecutive patients who consented, met the inclusion criteria, and underwent LT were included. Demographic data and nutritional, functional, and dietary assessments from evaluation until 1 year post-LT were documented. Data were analyzed using SPSS-25.0. RESULTS: One hundred and fifty-two patients, aged 50.6 ± 8.3 years and predominantly men (82.9%), underwent LT (predominantly deceased-donor LT (DDLT), 78.3%). One hundred and thirty-five patients (88.8%) were discharged alive after LT. The presence of pre-LT hepatorenal syndrome (HRS), baseline handgrip strength (HGS) <20 kg, SGA-rated malnutrition, HGS <20 kg at the first pre-LT follow-up, and failure to achieve ≥80% of targeted energy and protein intake were associated with an increased 1-year post-LT mortality rate (p < 0.05). A cutoff of 18.15 kg HGS was found to predict 1-year post-LT survival, with a sensitivity of 77.4%, a specificity of 43.5%, and an area under the receiver operating characteristic (ROC) curve of 0.676 (p = 0.008; 95% CI 0.545-0.806). Stepwise binary logistic regression indicated that failure to achieve ≥80% of targeted protein at the first pre-LT follow-up and baseline HGS <18.15 kg were two independent predictors of 1-year post-LT survival, with odds ratios of 13.168 and 7.041, respectively. CONCLUSION: A baseline HGS cutoff of 18.15 kg before LT was a significant predictor of poor survival in Asian Indian patients with ESLD.
It was also instrumental in identifying at-risk patients and targeting intensive nutritional intervention. Patients who achieved ≥80% of their pre-LT nutritional goal during the waiting period had improved 1-year post-LT survival.
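Prognostic cutoffs like the 18.15 kg HGS value above are commonly chosen from an ROC analysis by maximizing Youden's index (sensitivity + specificity − 1) across candidate thresholds. A minimal sketch of that selection step, using hypothetical handgrip values rather than the study's data:

```python
# Sketch: choosing a prognostic cutoff by maximizing Youden's index
# (sensitivity + specificity - 1). The data below are hypothetical,
# NOT the study's handgrip-strength measurements.

def best_cutoff(values, outcomes):
    """Return (cutoff, youden). outcome 1 = survived; lower values
    predict the adverse outcome (value < cutoff => predicted death)."""
    pos = [v for v, o in zip(values, outcomes) if o == 1]  # survivors
    neg = [v for v, o in zip(values, outcomes) if o == 0]  # non-survivors
    best = (None, -1.0)
    for c in sorted(set(values)):
        sens = sum(v >= c for v in pos) / len(pos)  # survivors at/above cutoff
        spec = sum(v < c for v in neg) / len(neg)   # non-survivors below cutoff
        youden = sens + spec - 1
        if youden > best[1]:
            best = (c, youden)
    return best

# Hypothetical handgrip strength (kg) and 1-year survival (1 = alive)
hgs      = [12, 14, 15, 16, 17, 19, 20, 22, 24, 25, 27, 30]
survived = [0,  0,  0,  1,  0,  1,  1,  1,  1,  1,  1,  1]

cutoff, j = best_cutoff(hgs, survived)
print(cutoff, round(j, 3))  # → 19 0.875
```

In practice the cutoff is read off the full ROC curve and validated against its confidence interval, as the study did with the reported AUC of 0.676.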


Subject(s)
End Stage Liver Disease; Liver Transplantation; Malnutrition; Adult; End Stage Liver Disease/surgery; Female; Hand Strength; Humans; Male; Malnutrition/diagnosis; Prospective Studies
2.
Clin Nutr ESPEN; 23: 200-204, 2018 Feb.
Article in English | MEDLINE | ID: mdl-29460799

ABSTRACT

BACKGROUND AND AIM: Nutritional therapy is an integral part of care in all phases of liver transplantation (LTx). However, several factors make malnutrition a challenge to manage in these patients, including, but not limited to, loss of appetite, dietary restrictions, and dietary habits. Dietary habits are guided by personal choice and by social, cultural, and regional background, with diversity ranging from veganism to vegetarianism, the latter predominant in the Indian population. It is therefore difficult to improve the nutritional intake of these patients with standard dietary recommendations. We evaluated whether personalized dietary counseling and a customized nutrition plan could enhance oral intake and thereby improve the nutritional status of patients with end-stage liver disease (ESLD) being evaluated for LTx. We compared the outcomes with those of a matched group of patients from a historic database who had been prescribed standard dietary recommendations. The primary outcome was the number of patients achieving ≥75% of recommended energy and protein requirements during hospitalization for LTx. Secondary outcomes included mean energy and protein intake, hours of ventilation, length of stay in the Intensive Care Unit (ICU) and hospital, mortality, and readmission rate in the acute phase (3 months) after LTx. METHODS: This was a prospective observational study performed at a single LTx centre. All patients >18 years who enrolled for LTx and consented to the study were included. The study was conducted after obtaining institutional ethics committee approval. Protocol-based nutrition planning was implemented from April 2014. Under this protocol, all patients being evaluated for LTx underwent a detailed nutritional assessment by a qualified Clinical Dietitian (CD) and were followed up regularly until LTx. Nutritional intervention, including a customized nutrition care plan and personalized dietary counseling, was provided based on the severity of malnutrition. To evaluate the efficacy of this protocol, we compared the nutritional adequacy (calorie and protein intake) of 65 consecutive patients who underwent LTx between August 2014 and October 2015 (group 1) with a historic database of 65 patients who underwent LTx between January 2013 and April 2014 (group 2). Patients' demographics, disease severity score, and baseline markers of nutritional status (subjective global assessment (SGA) and body mass index (BMI)) were recorded. First, each patient's oral energy and protein intake was determined by daily calorie count during hospitalization. The nutritional intervention plan (oral nutrition supplement (ONS)/enteral nutrition (EN)/parenteral nutrition (PN)) was then customized according to spontaneous oral intake. As part of the protocol, health-related quality of life was also assessed in group 1 using the short form 8 (SF-8). Statistical analyses (Pearson's correlation and chi-square test) were performed with SPSS version 20.0. RESULTS: The mean ages of groups 1 and 2 were 52.6 ± 9.8 and 51.9 ± 10.5 years (range 25-70 years), with BMIs of 26.8 ± 6.0 and 26.5 ± 5.4, respectively. According to SGA, the nutritional status of group 1 patients on admission for LTx was significantly better than that of group 2: 88% of group 1 versus 98% of group 2 were malnourished. The calorie intake of group 1 (1740.2 ± 254.8) was significantly higher than that of group 2 (1568.5 ± 321.6) (p = 0.005), and the improvement in protein intake in group 1 (63.1 ± 12.1) compared with group 2 (53.1 ± 13.4) was also statistically significant (p = 0.008). A subset analysis of non-vegetarians (those consuming meat and dairy products) showed that group 1 had significantly higher calorie (p = 0.004) and protein (p = 0.0001) intake than group 2. Following implementation of the study's protocol, the proportion achieving ≥75% of prescribed calories (p = 0.013) and protein (p = 0.0001) was significantly higher in group 1. CONCLUSION: Compared with standard prescription, an individualized protocol to diagnose malnutrition, stratify its severity early, and follow up with customized nutrition planning helped patients achieve nutritional targets more effectively. In spite of patients' diverse nutritional habits and reluctance to accept change, a qualified and dedicated transplant nutrition team can successfully implement a perioperative nutrition protocol to achieve better nutritional targets.
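The primary outcome above (achieving ≥75% of recommended energy and protein) reduces to a per-patient adequacy check against the individual prescription. A minimal sketch with hypothetical intakes and prescriptions, not the study's data:

```python
# Sketch: counting patients who meet >=75% of both energy and protein
# targets. Intakes/prescriptions below are hypothetical illustrations.

def met_target(intake, prescribed, threshold=0.75):
    """True if actual intake reaches the threshold fraction of the prescription."""
    return intake >= threshold * prescribed

patients = [                       # (actual, prescribed) per nutrient
    {"kcal": (1740, 2100), "protein_g": (63, 80)},
    {"kcal": (1200, 2000), "protein_g": (45, 75)},
]

achievers = sum(
    met_target(*p["kcal"]) and met_target(*p["protein_g"]) for p in patients
)
print(achievers)  # → 1
```

Group-level comparison of these achievement rates is then a simple proportion test (the study used a chi-square test).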


Subject(s)
Counseling; Diet; Liver Transplantation; Nutritional Requirements; Perioperative Care; Adult; Aged; Body Mass Index; Dietary Proteins; Female; Hospitalization; Humans; India; Intensive Care Units; Length of Stay; Male; Middle Aged; Nutrition Assessment; Nutritional Support; Prospective Studies; Quality of Life; Treatment Outcome; Young Adult
3.
Prog Transplant; 26(4): 340-347, 2016 Dec.
Article in English | MEDLINE | ID: mdl-27543202

ABSTRACT

BACKGROUND: Nations with emerging deceased-donor liver transplantation programs, such as India, face problems associated with poor donor maintenance. Cold ischemic time (CIT) is typically kept short by matching donor organ recovery with recipient hepatectomy to achieve the most favorable outcome. We analyzed different extended-criteria donor factors, including donor acidosis, which may act as a surrogate marker of poor donor maintenance, to quantify the risk of primary nonfunction (PNF) or initial poor function (IPF). METHODS: We performed a single-center retrospective outcome analysis of prospectively collected data from patients undergoing deceased-donor liver transplantation over 2 years to determine the impact of different extended-criteria donor factors on IPF and PNF. RESULTS: From March 2013 to February 2015, a total of 84 patients underwent deceased-donor liver transplantation. None developed PNF. Thirteen (15.5%) patients developed IPF. Only graft macrosteatosis and donor acidosis were related to IPF (P = .002 and P = .032, respectively). CIT was kept short (≤8 hours in 81 cases; maximum 11 hours) in all cases. CONCLUSION: Poor donor maintenance, as evidenced by donor acidosis and graft macrosteatosis, had a significant impact on the development of IPF when CIT was kept short. A similar study with a larger sample size is required to establish extended-criteria cutoff values.


Subject(s)
Cold Ischemia; Liver Transplantation; Tissue Donors; Graft Survival; Humans; Retrospective Studies; Risk Factors; Treatment Outcome
4.
J Clin Diagn Res; 10(12): PD24-PD25, 2016 Dec.
Article in English | MEDLINE | ID: mdl-28208936

ABSTRACT

Autosomal Dominant Polycystic Kidney Disease (ADPKD) is a chronic affliction characterized by numerous liver and kidney cysts. There is gradual but progressive renal and liver impairment, which may require combined liver-kidney transplantation. Compression of the retrohepatic Inferior Vena Cava (IVC) by an enlarged polycystic liver may impede clear visualization on pre-operative imaging, so that an underlying thrombosis or obliteration is missed. This may result in an intra-operative surprise, and management can be challenging, requiring modification of the conventional surgical approach. We present our experience with a 67-year-old patient who underwent combined deceased-donor liver-kidney transplantation for decompensated chronic liver disease with chronic kidney disease due to ADPKD. She had been diagnosed with ADPKD 16 years earlier, with progressive deterioration in kidney function over the last 6 years requiring regular renal replacement therapy, and liver decompensation following knee replacement surgery. We report this case to highlight the peri-operative challenges and their management, along with a review of the published literature on this uncommon occurrence.

5.
Can Urol Assoc J; 5(6): E142-7, 2011 Dec.
Article in English | MEDLINE | ID: mdl-21388588

ABSTRACT

INTRODUCTION: Allosensitization is a significant obstacle to retransplantation for patients with primary renal graft failure. METHODS: We assessed the impact of allograft nephrectomy (Group I) and weaning of immunosuppression (Group II) on percent panel reactive antibody (%PRA) at various time points after graft failure in 132 patients with a median follow-up of 47 months. Of these, 68% had allograft nephrectomy, while 32% were placed on the waiting list and were either taken off immunosuppression, left on prednisone, or maintained on low-dose immunosuppressive therapy. RESULTS: When the groups were stratified into early (<6 months) and late (>6 months) graft failure, patients who had transplant nephrectomy for early failure demonstrated a decline in %PRA from 46% at the time of graft failure to 27% at last follow-up (p = 0.02); conversely, %PRA continued to rise in Group II patients experiencing early allograft failure. Patients in both Groups I and II with late graft failure maintained elevated %PRA at last follow-up. CONCLUSION: Allograft nephrectomy may play a role in limiting allosensitization in patients with early, but not late, graft failure.

6.
Liver Transpl; 13(4): 543-51, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17394152

ABSTRACT

Milan and University of California at San Francisco (UCSF) criteria are used to select patients with hepatocellular carcinoma (HCC) for liver transplantation (LT). Recurrent HCC is a significant cause of death, and there is no widely accepted pathological assessment strategy to predict recurrent HCC after transplantation. This study compares the pathology of patients meeting Milan and UCSF criteria and develops a pathological score and nomogram to assess the risk of recurrent HCC after transplantation. All explanted livers with HCC from our center over the 18-yr period 1985 to 2003 were assessed for multiple pathological features, relevant clinical data were recorded, and multivariate analysis was performed to determine features associated with recurrent HCC. Using pathological variables that independently predicted recurrent HCC, a pathological score and nomogram were developed to determine the probability of recurrent HCC. Of 75 cases analyzed, 50 (67%) met Milan criteria, 9 (12%) met only UCSF criteria, and 16 (21%) met neither set of criteria based on explant pathology. There were 20 cases of recurrent HCC, and the mean follow-up was 8 yr. Recurrent HCC was more common (67 vs. 12%; P < 0.001) and survival was lower (15 vs. 83% at 5 yr; 15 vs. 55% at 8 yr; P < 0.001) among those who met only UCSF criteria than among those who met Milan criteria. Cryptogenic cirrhosis (25 vs. 5%; P = 0.015), preoperative AFP >1,000 ng/mL (20 vs. 0%; P < 0.001), and postoperative OKT3 use (40 vs. 15%; P = 0.017) were more common among patients with recurrent HCC. While microvascular invasion was the strongest pathological predictor of recurrent HCC, tumor size ≥3 cm (P = 0.004; odds ratio [OR] = 7.42), nuclear grade (P = 0.044; OR = 3.25), microsatellitosis (P = 0.020; OR = 4.82), and giant/bizarre cells (P = 0.028; OR = 4.78) also predicted recurrent HCC independently of vascular invasion.
The score and nomogram stratified the risk of recurrent HCC into 3 tiers: low (<5%), intermediate (40-65%), and high (>95%). In conclusion, compared to patients meeting Milan criteria, patients who meet only UCSF criteria have a worse survival and an increased rate of recurrent HCC with long-term follow-up, as well as more frequent occurrence of adverse histopathological features, such as microvascular invasion. Application of a pathological score and nomogram could help identify patients at increased risk for tumor recurrence, who may benefit from increased surveillance or adjuvant therapy.
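A nomogram of this kind is typically a graphical rendering of a logistic regression model: each pathological feature contributes a weight on the log-odds scale, and the summed score maps to a predicted recurrence probability through the logistic function. A minimal sketch of that mapping, reusing the abstract's odds ratios as example weights but with a hypothetical intercept (the study's fitted model is not reproduced here):

```python
import math

# Sketch: how a logistic-regression-based pathological score maps to a
# predicted probability of recurrence. The intercept is hypothetical,
# chosen for illustration only; the ORs are those quoted in the abstract,
# used here merely as example per-feature weights.

COEFS = {                                  # log(OR) per binary feature
    "size_ge_3cm":        math.log(7.42),
    "high_nuclear_grade": math.log(3.25),
    "microsatellitosis":  math.log(4.82),
    "giant_bizarre_cells": math.log(4.78),
}
INTERCEPT = -3.0                           # hypothetical baseline log-odds

def recurrence_probability(features):
    """features: dict mapping feature name -> 0/1 presence indicator."""
    logit = INTERCEPT + sum(COEFS[f] * v for f, v in features.items())
    return 1 / (1 + math.exp(-logit))      # logistic link

low  = recurrence_probability({f: 0 for f in COEFS})  # no adverse features
high = recurrence_probability({f: 1 for f in COEFS})  # all features present
print(round(low, 3), round(high, 3))  # → 0.047 0.965
```

The spread between the two ends of this toy model mirrors how the published nomogram separates the low (<5%) and high (>95%) risk tiers.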


Subject(s)
Carcinoma, Hepatocellular/pathology; Carcinoma, Hepatocellular/surgery; Liver Neoplasms/pathology; Liver Neoplasms/surgery; Liver Transplantation; Adult; Carcinoma, Hepatocellular/mortality; Female; Humans; Liver Neoplasms/mortality; Liver Transplantation/mortality; Liver Transplantation/pathology; Male; Middle Aged; Neoplasm Invasiveness/pathology; Neovascularization, Pathologic; Recurrence; Survival Analysis; Time Factors
7.
Liver Transpl; 10(9): 1183-8, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15350012

ABSTRACT

Doppler ultrasound (DUS) can measure parameters of blood flow within the vessels of transplanted organs, and vascular complications are associated with abnormal values. To determine the range of values after liver transplantation, we analyzed the findings of 51 consecutive patients who underwent DUS on 2 occasions in the first postoperative week following liver transplantation for cirrhosis. Three patients developed early vascular thromboses, detected by the absence of a Doppler signal. In patients making an uneventful recovery, the arterial velocity tended to increase and the resistive index (RI) to decrease during the first postoperative week. All recipients were shown to have high-velocity segments within the hepatic artery, without an increase in flow resistance. Assessment of the portal vein revealed narrowing at the anastomosis, associated with a segmental doubling of flow velocity, and the mean portal venous flow decreased by approximately 20% in the first postoperative week. In conclusion, a wide range of abnormalities occurs in the vessels of liver transplant recipients, none of which was associated with the development of vascular complications or affected patient management.


Subject(s)
Hepatic Artery/physiology; Liver Transplantation/physiology; Portal Vein/physiology; Ultrasonography, Doppler; Blood Flow Velocity; Female; Hepatic Artery/diagnostic imaging; Humans; Liver/diagnostic imaging; Liver Cirrhosis/surgery; Male; Portal Vein/diagnostic imaging; Postoperative Period; Prospective Studies; Regional Blood Flow; Vascular Resistance
8.
Urology; 64(3): 458-61, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15351570

ABSTRACT

OBJECTIVES: To compare, retrospectively, the results of laparoscopic partial nephrectomy (LPN) to open partial nephrectomy (OPN) using a tumor size-matched cohort of patients. Limited data are available comparing LPN to OPN in the treatment of small renal tumors. METHODS: Between September 2000 and September 2003, 27 LPNs and 22 OPNs were performed to treat renal masses less than 4 cm. Patient demographics and tumor location and size (2.4 ± 1.0 cm versus 2.9 ± 0.9 cm, respectively; P = not statistically significant) were similar between the LPN and OPN groups. RESULTS: Although the mean operative time was longer in the LPN than in the OPN group (210 ± 76 minutes versus 144 ± 24 minutes; P < 0.001), the blood loss was comparable between the two groups (250 ± 250 mL versus 334 ± 343 mL; P = not statistically significant). No blood transfusions were performed in either group. The hospital stay was significantly reduced after LPN compared with after OPN (2.9 ± 1.5 days versus 6.4 ± 1.8 days; P < 0.0002), and the postoperative parenteral narcotic requirements were lower in the LPN group (mean morphine equivalent 43 ± 62 mg versus 187 ± 71 mg; P < 0.02). Three complications occurred in each group. With LPN, no patient had positive margins or tumor recurrence. Also, direct financial analysis demonstrated lower total hospital costs after LPN ($4839 ± $1551 versus $6297 ± $2972; P < 0.05). CONCLUSIONS: LPN confers several benefits over OPN concerning patient convalescence and costs, despite prolonged resection times at our current phase of the learning curve. Long-term results on cancer control in patients treated with LPN continue to be assessed.


Subject(s)
Laparoscopy; Nephrectomy/methods; Adult; Aged; Cohort Studies; Feasibility Studies; Female; Hospital Costs; Humans; Kidney Neoplasms/diagnostic imaging; Kidney Neoplasms/surgery; Laparoscopy/economics; Laparoscopy/methods; Laparoscopy/statistics & numerical data; Male; Middle Aged; Nephrectomy/economics; Nephrectomy/statistics & numerical data; Ontario; Retrospective Studies; Treatment Outcome; Ultrasonography, Interventional
9.
Liver Transpl; 9(11): S73-8, 2003 Nov.
Article in English | MEDLINE | ID: mdl-14586900

ABSTRACT

1. Cirrhosis from chronic hepatitis C is the most common indication for liver grafting today. The course of hepatitis C is accelerated after liver transplantation, and no current therapy reliably prevents or arrests it. 2. It is anticipated that 20% or more of hepatitis C virus-positive transplant recipients will develop allograft cirrhosis, and the only solution will be retransplantation. 3. Results of retransplantation are inferior to primary transplantation. 4. Recipient risk factors that adversely affect mortality after repeated liver grafting include age older than 50 years, renal insufficiency, and severity of hyperbilirubinemia. When present, they reduce survival after retransplantation to approximately 40% or less. 5. Retransplantation on a large scale for recurrent hepatitis C is problematic from the perspectives of outcome, resource utilization, and fairness to candidates awaiting primary grafts.


Subject(s)
Hepatitis C/surgery; Liver Transplantation; Graft Survival/physiology; Humans; Liver/virology; Liver Transplantation/statistics & numerical data; Reoperation; Risk Factors; Treatment Outcome; United States
10.
Transplantation; 75(6): 851-6, 2003 Mar 27.
Article in English | MEDLINE | ID: mdl-12660514

ABSTRACT

BACKGROUND: Posttransplant lymphoproliferative disorder (PTLD) remains a difficult management issue; therefore, many studies focus on the identification of risk factors to allow for preventive strategies. We investigated risk factors for PTLD in the adult renal transplant setting. METHODS: A single-center, matched case-control study design was used. Cases were identified from patients who underwent a first renal transplant between January 1, 1985, and December 1, 2001. Two controls were chosen per case, matched (±1 year) by date of transplant and graft survival. Clinical and demographic data were ascertained from medical records. Pretransplant serology for Epstein-Barr virus (EBV) and cytomegalovirus was confirmed on frozen, stored sera. Statistical analysis included univariate and multivariable examination of putative risk factors using conditional logistic regression. RESULTS: Twenty cases of PTLD were identified, an incidence of 2.4%. Median time from transplant to diagnosis was 55 months (range, 3-168 months), with 16 cases of late-onset PTLD (>1 year posttransplant). The only significant risk factor in univariate analysis was EBV-negative status at transplant (risk ratio 6.0, P = 0.03). In multivariable analysis, EBV-negative status remained significant (adjusted risk ratio 8.9, P = 0.01). The risk related to EBV status held true when late cases were analyzed separately (adjusted risk ratio 7.1, P = 0.03). CONCLUSIONS: Pretransplant EBV-seronegative status is a strong risk factor for the development of PTLD in adult renal allograft recipients, even in late disease. These results indicate that primary infection with EBV may have a pathogenic role in some cases of late PTLD.


Subject(s)
Epstein-Barr Virus Infections/diagnosis; Epstein-Barr Virus Infections/epidemiology; Kidney Transplantation; Lymphoproliferative Disorders/epidemiology; Lymphoproliferative Disorders/virology; Adult; Antibodies, Viral/blood; Case-Control Studies; Epstein-Barr Virus Infections/immunology; Female; Humans; Male; Middle Aged; Postoperative Complications/epidemiology; Postoperative Complications/virology; Risk Factors; Transplantation, Homologous