Results 1 - 20 of 59
1.
Talanta ; 250: 123729, 2022 Dec 01.
Article in English | MEDLINE | ID: mdl-35839605

ABSTRACT

Ovarian cancer has a high mortality rate due to its unclear symptomology and the lack of precise early detection tools. If detected in the first stage, over 90% of patients reach remission. As such, developing a reliable method of early detection is crucial to reducing the mortality rate of the disease. One potential method would be to identify specific biomarkers that are unique to ovarian cancer, which could be detected using a blood test. While this can be done using gas chromatography-mass spectrometry (GC-MS), identifying these biomarkers is an enormous task. One way to expedite the process is to utilize trained scent detection canines. In this study, dogs that had previously been trained to respond to positive blood samples from ovarian cancer patients were tested on their ability to recognize samples prepared by micro-preparative gas chromatography (MP-GC) techniques. MP-GC employed a gradient-cooled glass tube connected to the GC outlet to collect GC eluents containing the plasma-derived volatiles in positive blood samples. These post-column fractions were collected at the exit of the GC according to their elution times (i.e., 0-15 min, 15-25 min, and 25-35 min, or 0-35 min), and these full or fractional collections were presented to the trained dogs to judge their responses. Time spent investigating the odor was used as an indication of odor recognition; it was significantly longer for the early (0-15 min) and middle (15-25 min) fractions of the ovarian cancer plasma odorants than for the late (25-35 min) fraction, the negative fractions, or the distractor odorants. These findings suggest that the odor biomarkers of ovarian cancer characteristic for dogs may lie among the relatively small, more volatile compounds. Additionally, variation between dogs suggests that there may be a number of different biomarkers that can be used to identify ovarian cancer.


Subject(s)
Ovarian Neoplasms , Volatile Organic Compounds , Animals , Dogs , Female , Gas Chromatography-Mass Spectrometry/methods , Humans , Odorants/analysis , Ovarian Neoplasms/diagnosis , Volatile Organic Compounds/analysis
2.
Psychoneuroendocrinology ; 107: 160-168, 2019 09.
Article in English | MEDLINE | ID: mdl-31132568

ABSTRACT

BACKGROUND: The relationship between disturbed sleep and stress is well documented. Sleep disorders and stress are highly prevalent during the perinatal period, and both are known to contribute to a number of adverse maternal and foetal outcomes. Arginine vasopressin (AVP) is a hormone and neuropeptide involved in the stress response, social bonding, and circadian regulation of the sleep-wake cycle. Whether the AVP system is involved in the regulation of stress response and sleep quality in the context of perinatal mental health is currently unknown. The objective of the present study was to assess the relationship between levels of cumulative and ongoing psychosocial risk, levels of disordered sleep, and AVP methylation in a community sample of pregnant and postpartum women. METHODS: A sample of 316 participants completed a battery of questionnaires during the second trimester of pregnancy (PN2, 12-14 weeks gestation), the third trimester (PN3, 32-34 weeks gestation), and at 7-9 weeks postpartum (PP). Disordered sleep was measured using the Sleep Symptom Checklist at PN2, PN3, and PP; cumulative psychosocial risk was assessed with the Antenatal Risk Questionnaire (ANRQ) at PN2; salivary DNA was collected at follow-up (FU, 2.9 years postpartum); and percent methylation was calculated for AVP and for two of the three AVP receptor genes (AVPR1a and AVPR1b). Women were separated into high (HighPR) and low (LowPR) psychosocial risk groups based on their ANRQ scores. RESULTS: Women in the HighPR group had significantly worse sleep disturbances than women in the LowPR group during PN2 (p < .001) and PN3 (p < .001), but not at PP (p = .146). In HighPR participants only, methylation of AVP at intron 1 negatively correlated with sleep disturbances at PN2 (rs = -.390, p = .001), PN3 (rs = -.384, p = .002), and PP (rs = -.269, p = .032). There was no association between sleep disturbances and AVPR1a or AVPR1b methylation, or between sleep disturbances and any AVP methylation, for the LowPR group. Lastly, cumulative psychosocial stress moderated the relationship between AVP intron 1 methylation and disordered sleep at PN2 (p < .001, adjusted R2 = .105), PN3 (p < .001, adjusted R2 = .088), and PP (p = .003, adjusted R2 = .064). CONCLUSIONS: Our results suggest that cumulative psychosocial stress exacerbates sleep disorders in pregnant women and that salivary DNA methylation patterns of the AVP gene may serve as a marker of biological predisposition to stress and sleep reactivity during the perinatal period. Further research is needed to establish causal links between AVP methylation, sleep, and stress.
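
The moderation result reported above is conventionally tested as a regression interaction term (methylation × risk group predicting sleep disturbance). Below is a minimal sketch in Python with statsmodels, using simulated data and hypothetical variable names; the study's exact modeling is not specified beyond the abstract.

```python
# Sketch of a moderation (interaction) analysis: does psychosocial risk
# moderate the association between AVP intron 1 methylation and
# disordered-sleep scores? Simulated data, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 316
df = pd.DataFrame({
    "methylation": rng.uniform(0, 100, n),   # % methylation at AVP intron 1
    "high_risk": rng.integers(0, 2, n),      # 1 = HighPR, 0 = LowPR (ANRQ split)
})
# Simulate sleep scores so methylation matters only in the high-risk group.
df["sleep_score"] = 20 - 0.08 * df["methylation"] * df["high_risk"] \
                    + rng.normal(0, 3, n)

# The methylation:high_risk interaction term carries the moderation effect.
model = smf.ols("sleep_score ~ methylation * high_risk", data=df).fit()
print(model.summary().tables[1])
print(f"adjusted R2 = {model.rsquared_adj:.3f}")
```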


Subject(s)
Arginine Vasopressin/metabolism , Sleep Wake Disorders/physiopathology , Stress, Psychological/metabolism , Adult , Arginine Vasopressin/genetics , DNA Methylation/genetics , Depression, Postpartum/psychology , Female , Humans , Longitudinal Studies , Neurophysins/metabolism , Parturition , Postpartum Period/psychology , Pregnancy , Pregnant Women , Prenatal Care , Protein Precursors/metabolism , Psychology , Receptors, Vasopressin/metabolism , Sleep/physiology , Surveys and Questionnaires , Vasopressins/genetics , Vasopressins/metabolism
3.
Transplant Proc ; 50(9): 2645-2647, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30401367

ABSTRACT

BACKGROUND: The aim of this retrospective study was to evaluate the effect of the dextrose contained in banked blood products on changes in blood glucose levels in adult living donor liver transplantation patients. METHODS: Four hundred seventy-seven patients were divided into a non-blood transfusion (BT) group (G1) and a BT group (G2). Changes in blood glucose levels during the operation were compared using a Mann-Whitney U test, and a P value less than .05 was regarded as significant. RESULTS: No significant differences in blood glucose levels were detected between the groups after anesthesia, during the dissection phase, in the anhepatic phase, or after reperfusion. Estimated blood loss was 718 ± 514 mL for G1 (n = 89) and 5804 ± 877 mL for G2 (n = 388). G1 received no blood transfusion, whereas G2 received 4350 ± 6230 mL of leukocyte-poor red blood cells. Pre- and end-of-operation hemoglobin levels were 13.2 ± 2.0 and 10.2 ± 1.9 g/dL for G1 and 10.1 ± 1.6 and 10.2 ± 1.9 g/dL for G2, indicating that patients were neither under- nor over-transfused. CONCLUSION: When banked blood products are used to replace ongoing blood loss, the dextrose contained in citrate-phosphate-dextrose-adenine appears to have no effect on the recipients' blood glucose levels.
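
A minimal sketch of the group comparison described above, using scipy's Mann-Whitney U test on simulated glucose-change values (the study's data are not reproduced here):

```python
# Compare intraoperative blood glucose changes between the non-transfusion
# (G1) and transfusion (G2) groups with a Mann-Whitney U test.
# Simulated placeholder values, not the study's data.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
g1_glucose_change = rng.normal(25, 10, 89)   # mg/dL change, n = 89
g2_glucose_change = rng.normal(27, 12, 388)  # mg/dL change, n = 388

stat, p = mannwhitneyu(g1_glucose_change, g2_glucose_change,
                       alternative="two-sided")
print(f"U = {stat:.0f}, P = {p:.3f}")
print("significant at P < .05" if p < .05 else "not significant")
```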


Subject(s)
Blood Glucose/analysis , Blood Transfusion/statistics & numerical data , Hemostasis, Surgical/methods , Liver Transplantation/methods , Adult , Blood Banks , Citrates/blood , Female , Glucose , Humans , Living Donors , Male , Middle Aged , Retrospective Studies , Statistics, Nonparametric
4.
Transplant Proc ; 50(9): 2648-2650, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30401368

ABSTRACT

OBJECTIVES: The aims of this study were to compare core temperature changes between pediatric patients lying on regular operating room linen drapes and those lying on a water-repellent sheepskin rug during living donor liver transplantation (LDLT), and to evaluate the effectiveness of a water-repellent sheepskin rug in preventing profound hypothermia due to fluid overflow from the abdominal cavity during LDLT. PATIENTS AND METHODS: The operative records of pediatric patients who underwent LDLT from June 1994 to September 2003 were reviewed retrospectively. Nasopharyngeal temperature (NT) changes during the LDLT procedure were compared between patients lying on regular operating room drapes (GI) and those on a water-repellent sheepskin rug (GII) and analyzed using the Mann-Whitney U test. A P value <.05 was regarded as significant. RESULTS: Thirty-two patients were included in GI and 56 in GII. Profound hypothermia was not observed in any recipient lying on a water-repellent sheepskin rug (GII). The NT after induction and during the following 4 hours of the transplantation procedure was significantly higher in GII than in GI. CONCLUSION: Pediatric patients lying on water-repellent sheepskin preserved their core temperature better than patients lying on linen drapes. The use of a water-repellent sheepskin rug appears to be effective in preventing profound hypothermia related to physical contact with abdominal fluid overflow during LDLT.


Subject(s)
Bedding and Linens , Body Temperature , Liver Transplantation/methods , Absorption, Physicochemical , Animals , Child, Preschool , Equipment Design , Female , Humans , Hydrophobic and Hydrophilic Interactions , Living Donors , Male , Operating Rooms , Retrospective Studies , Sheep , Water
5.
Transplant Proc ; 50(9): 2651-2653, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30401369

ABSTRACT

BACKGROUND: Opsite (Smith & Nephew, Hull, UK) is widely used in wound care, but its use in eye protection against corneal abrasion during major surgery is rarely reported. The purpose of the current study was to compare the effectiveness of Opsite for eye protection combined with either wet gauze alone or wet gauze following application of eye ointment in patients undergoing living donor liver transplantation (LDLT). METHODS: This was a prospective, double-blinded, randomized controlled trial. Forty-one patients undergoing liver transplantation were enrolled. One eye of each patient was protected with sterile gauze soaked in normal saline solution and covered with Opsite (normal saline group). Duratears (ALCON, Fort Worth, Tex, United States) ointment was applied to the other eye before covering it with sterile wet gauze and Opsite (ointment group). Corneal examinations with fluorescein staining were carried out by the same doctor before and at the end of surgery. A Student t-test and a χ2 test were used for the statistical analyses. RESULTS: All 82 eyes of the 41 patients were examined. No corneal epithelial defects were found in either the normal saline group or the ointment group. CONCLUSION: Opsite combined with wet gauze, with or without additional eye ointment, provided 100% protection against corneal abrasion in patients undergoing LDLT.


Subject(s)
Anesthesia, General/adverse effects , Corneal Injuries/prevention & control , Liver Transplantation/methods , Occlusive Dressings , Polyurethanes/administration & dosage , Bandages , Corneal Injuries/etiology , Double-Blind Method , Female , Humans , Male , Middle Aged , Prospective Studies , Treatment Outcome
6.
Transplant Proc ; 50(9): 2654-2656, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30401370

ABSTRACT

OBJECTIVE: Right lobe living donor hepatectomy poses a greater risk to the donor in terms of blood loss. The aim of this study was to compare anesthetic and intraoperative fluid management between right lobe and left lateral segment living donor hepatectomy. PATIENTS AND METHODS: The anesthesia records of living donor hepatectomy patients were retrospectively reviewed. Donor age and weight, anesthesia time, central venous pressure, blood loss, blood product transfusion, intravenous fluids used, doses of furosemide, and urine output were compared between groups using the Mann-Whitney U test. RESULTS: Forty-six patients underwent living donor left lateral segment hepatectomy (Group I), while 31 patients underwent right lobe hepatectomy (Group II). Mean blood loss was significantly higher in Group II than in Group I (118 ± 81 mL vs 68 ± 64 mL), but blood loss of this magnitude was not clinically large enough to affect hemodynamics. Fluid management was therefore not meaningfully different between the two groups. No blood transfusions or colloid infusions were required in either group. Urine output, hemoglobin changes, blood urea nitrogen, and serum creatinine pre- and postoperatively were not significantly different between groups. CONCLUSIONS: As long as blood loss is minimal, we found no difference in anesthetic management and fluid replacement between right lobe and left lateral segment living donor hepatectomy.


Subject(s)
Anesthesia/methods , Blood Loss, Surgical/prevention & control , Fluid Therapy/methods , Hepatectomy/methods , Liver Transplantation , Tissue and Organ Harvesting/methods , Adult , Blood Loss, Surgical/statistics & numerical data , Blood Transfusion/statistics & numerical data , Central Venous Pressure , Female , Hemodynamics , Hemoglobins , Hepatectomy/adverse effects , Humans , Liver/surgery , Living Donors , Male , Middle Aged , Postoperative Period , Retrospective Studies , Tissue and Organ Harvesting/adverse effects
7.
Transplant Proc ; 50(9): 2661-2663, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30401372

ABSTRACT

BACKGROUND: Blood loss during liver surgery correlates with central venous pressure (CVP). The aim of this retrospective study was to identify cutoff values of CVP and stroke volume variation (SVV) associated with the risk of intraoperative blood loss of more than 100 mL during living liver donor hepatectomies. METHOD AND PATIENTS: Twenty-seven adult living liver donors were divided into 2 groups according to whether their intraoperative blood loss was less than (G1) or more than (G2) 100 mL. The mean values of the patients' CVP and SVV at the beginning of the transection of the liver parenchyma were used as the cutoff points. Their correlation with intraoperative blood loss was evaluated using the χ2 test; P < .001 was regarded as significant. RESULTS: The cutoff points of CVP and SVV were 8 mm Hg and 13%, respectively. The odds ratios of having blood loss exceeding 100 mL were 91.25 (P < .001) and 0.36 (P < .001) for CVP and SVV, respectively. CONCLUSION: A CVP of less than 5 mm Hg, as suggested by most authors, is not always clinically achievable. Our results show that a CVP of less than 8 mm Hg or an SVV of more than 13% is sufficient to keep blood loss minimal (around 100 mL) during parenchymal transection in living donor hepatectomy. Measures used to lower CVP or increase SVV in our series were intravenous fluid restriction and the use of a diuretic.
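
The cutoff analysis above reduces to a 2 × 2 table (CVP above or below 8 mm Hg versus blood loss over or under 100 mL), from which the odds ratio and χ2 statistic follow directly. A sketch with hypothetical counts, since the abstract reports only the resulting statistics:

```python
# Odds ratio and chi-square test from a 2x2 table of CVP cutoff (>= 8 mm Hg)
# versus intraoperative blood loss (> 100 mL). Counts are hypothetical.
from scipy.stats import chi2_contingency

#        loss > 100 mL   loss <= 100 mL
table = [[10, 2],   # CVP >= 8 mm Hg
         [1, 14]]   # CVP <  8 mm Hg

chi2, p, dof, expected = chi2_contingency(table)
odds_ratio = (table[0][0] * table[1][1]) / (table[0][1] * table[1][0])
print(f"odds ratio = {odds_ratio:.2f}, chi2 = {chi2:.2f}, P = {p:.4f}")
```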


Subject(s)
Blood Loss, Surgical/physiopathology , Central Venous Pressure/physiology , Hepatectomy/methods , Stroke Volume/physiology , Tissue and Organ Harvesting/methods , Adult , Female , Humans , Liver/surgery , Liver Transplantation/methods , Living Donors , Male , Reference Values , Retrospective Studies
8.
Transplant Proc ; 48(4): 1022-4, 2016 May.
Article in English | MEDLINE | ID: mdl-27320547

ABSTRACT

BACKGROUND: Hyperkalemia, defined as a serum potassium level higher than 5 mEq/L, is common in the liver transplantation setting. Severe hyperkalemia may induce fatal cardiac arrhythmias; therefore, it should be monitored and treated accordingly. The aim of this retrospective study was to evaluate and identify predictive risk factors for hyperkalemia during living-donor liver transplantation (LDLT). METHODS AND PATIENTS: Four hundred eighty-seven adult LDLT patients were included in the study. Intraoperative serum potassium levels were monitored at least five times during LDLT; patients with a potassium level higher than 5 mEq/L were included in group 1, and those with normokalemia in group 2. Patients' categorical characteristics and intraoperative numeric variables with a P value <.1 were entered into a multiple binary logistic regression model. In the multivariate analysis, a P value of <.05 was regarded as indicating a risk factor for the development of hyperkalemia. RESULTS: Fifty-one of 487 (10.4%) patients had hyperkalemia, with a serum potassium level higher than 5.0 mEq/L during LDLT. Predictive factors with P < .1 in the univariate analysis, such as anesthesia time, preoperative albumin level, Model for End-stage Liver Disease score, preoperative bilirubin level, amount of blood loss, red blood cell (RBC) and fresh frozen plasma transfused, 5% albumin administered, hemoglobin at the end of surgery, and the amount of furosemide used, were further analyzed by multivariate binary regression. The results show that anesthesia time, preoperative serum albumin level, and the amount of RBC transfused were determinant risk factors for the development of hyperkalemia in our LDLT series. CONCLUSION: Prolonged anesthesia time, preoperative serum albumin level, and intraoperative RBC transfusion are three determinant factors in the development of intraoperative hyperkalemia; close monitoring of serum potassium levels is recommended in patients with the abovementioned risk factors.
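
The two-step analysis above (univariate screen at P < .1, then multivariate binary logistic regression) can be sketched as follows; the data are simulated and the predictor names merely echo the abstract's candidates:

```python
# Multivariate binary logistic regression for intraoperative hyperkalemia.
# Simulated data; variable names follow the abstract's candidate predictors.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 487
df = pd.DataFrame({
    "anesthesia_hours": rng.normal(12, 2, n),
    "preop_albumin": rng.normal(3.0, 0.6, n),   # g/dL
    "rbc_ml": rng.gamma(2.0, 500.0, n),         # mL of RBC transfused
})
# Simulate the outcome with all three predictors contributing.
logit = -8 + 0.6 * df["anesthesia_hours"] - 0.8 * df["preop_albumin"] \
        + 0.0006 * df["rbc_ml"]
df["hyperkalemia"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit(
    "hyperkalemia ~ anesthesia_hours + preop_albumin + rbc_ml", data=df
).fit(disp=0)
print(model.summary().tables[1])
print("odds ratios:")
print(np.exp(model.params))
```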


Subject(s)
Hyperkalemia/etiology , Intraoperative Complications/etiology , Liver Transplantation/adverse effects , Living Donors , Adult , End Stage Liver Disease/physiopathology , End Stage Liver Disease/surgery , Erythrocyte Transfusion/methods , Female , Humans , Hyperkalemia/blood , Liver Function Tests , Logistic Models , Male , Middle Aged , Multivariate Analysis , Operative Time , Plasma , Potassium/blood , Retrospective Studies , Risk Factors , Transplant Recipients
9.
Transplant Proc ; 48(4): 1052-4, 2016 May.
Article in English | MEDLINE | ID: mdl-27320554

ABSTRACT

BACKGROUND: Whether a history of esophageal variceal bleeding (EVB) can be used clinically to predict tolerability of, or hemodynamic instability during, clamping of the inferior vena cava (IVC) and portal vein in liver transplantation is unknown and therefore needs to be elucidated. PATIENTS AND METHODS: A total of 50 anesthesia charts of patients who underwent living donor liver transplantation were retrospectively reviewed, analyzed, and compared. Patients without a history of EVB were classified as group 1 and patients with a history of EVB as group 2. The numbers of patients with a decrease in cardiac index (CI) of ≥20%, ≥30%, or ≥40% from their preclamping values after IVC clamping were compared with a χ2 test, and a P value of less than .05 was regarded as statistically significant. RESULTS: The measured hemodynamic parameters 5 minutes after clamping of the IVC and portal vein were all significantly different from the patients' preclamping values. The incidence of a decrease in CI of ≥20%, ≥30%, or ≥40% 5 minutes after clamping of the IVC and portal vein was not significantly different between groups. CONCLUSIONS: Clamping of the portal vein and IVC without veno-venous bypass in living donor liver transplantation had a significant negative impact on CI in both groups due to the drastic reduction in venous return. Statistical analysis revealed that patients with a history of EVB responded hemodynamically to IVC clamping in a manner similar to patients without such a history.


Subject(s)
Esophageal and Gastric Varices/physiopathology , Gastrointestinal Hemorrhage/physiopathology , Hemodynamics/physiology , Liver Transplantation/methods , Vena Cava, Inferior/surgery , Adult , Aged , Constriction , Esophageal and Gastric Varices/complications , Esophageal and Gastric Varices/surgery , Female , Gastrointestinal Hemorrhage/etiology , Gastrointestinal Hemorrhage/surgery , Humans , Living Donors , Male , Middle Aged , Portal Vein/physiopathology , Portal Vein/surgery , Preoperative Period , Retrospective Studies
10.
Transplant Proc ; 48(4): 1049-51, 2016 May.
Article in English | MEDLINE | ID: mdl-27320553

ABSTRACT

BACKGROUND: The aim of this study was to determine whether preoperative portal vein flow velocity or size has any correlation with hemodynamic changes during clamping of the inferior vena cava (IVC) in liver transplantation. PATIENTS AND METHODS: A total of 42 anesthesia charts of adult patients who underwent living donor liver transplantation (LDLT) were analyzed and compared retrospectively. Preoperative portal vein (PV) flow velocity and size were obtained using Doppler ultrasound. All changes in hemodynamic data before and after clamping of the PV and IVC were recorded and analyzed by linear regression. A P value of <.05 was considered significant. RESULTS: Heart rate (HR), mean arterial blood pressure (MAP), central venous pressure (CVP), cardiac output (CO), cardiac index (CI), and stroke volume (SV) before and after clamping of the PV and IVC were significantly different for as long as the PV and IVC were clamped. Linear regression analysis indicated that the R2 values for HR, MAP, CVP, CO, and CI in correlation with PV velocity were 0.002, 0.035, 0.024, and 0.001, and the R2 values for PV diameter were 0.028, 0.01, 0.034, and 0.004. The percentage changes in cardiac output at 1- and 5-minute intervals after IVC clamping did not correlate significantly with either the preoperative flow velocity or the size of the PV. CONCLUSION: Preoperative PV flow velocity and size are not correlated or associated with hemodynamic changes during IVC clamping in liver transplantation.
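
A sketch of the regression step described above: each hemodynamic change is regressed on preoperative PV flow velocity, and an R2 near zero indicates no association. Simulated values, not study data:

```python
# Linear regression of a hemodynamic change on preoperative portal vein
# flow velocity; R^2 near zero implies no meaningful correlation.
# Simulated placeholder data.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(3)
pv_velocity = rng.normal(15, 4, 42)     # cm/s, n = 42 patients
map_change = rng.normal(-20, 8, 42)     # % change in MAP after clamping

fit = linregress(pv_velocity, map_change)
print(f"R^2 = {fit.rvalue**2:.3f}, slope = {fit.slope:.3f}, "
      f"P = {fit.pvalue:.3f}")
```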


Subject(s)
Hemodynamics/physiology , Liver Transplantation/methods , Portal Vein/physiopathology , Preoperative Period , Vena Cava, Inferior/surgery , Adult , Aged , Cardiac Output , Central Venous Pressure , Constriction , Female , Heart Rate , Humans , Linear Models , Living Donors , Male , Middle Aged , Portal Vein/surgery , Retrospective Studies , Stroke Volume , Vena Cava, Inferior/physiopathology
11.
Transplant Proc ; 48(4): 1074-6, 2016 May.
Article in English | MEDLINE | ID: mdl-27320560

ABSTRACT

BACKGROUND: As our center transitioned from patient-controlled analgesia (PCA) morphine with intravenous (IV) ketorolac to PCA morphine with IV parecoxib, this study compared the two regimens in terms of quality of pain control. METHODS: Post-operative pain management sheets from living liver donors were collected retrospectively during this transitional period. The parecoxib group received plain PCA morphine plus a single dose of IV parecoxib 40 mg given 30 minutes before the end of surgery. The ketorolac group received PCA morphine premixed with ketorolac at a concentration of 1.87 mg/mL. Daily and total morphine consumption, Visual Analog Score (VAS), and number of rescue attempts during the first 3 post-operative days, together with the satisfaction score and the incidence of side effects of PCA usage, were analyzed and compared by means of the Mann-Whitney U test; a value of P < .05 was regarded as significant, and data are given as mean ± SD. RESULTS: Fifty patients were analyzed; group 1 (parecoxib) comprised 21 patients and group 2 (ketorolac) comprised 29 patients. There was no difference between the groups in daily VAS. PCA morphine requirements were significantly lower on day 2 and day 3 in group 1. However, total overall morphine usage and the satisfaction score were not statistically different (P = .863 and P = .052, respectively). CONCLUSIONS: A single dose of IV parecoxib 40 mg can provide satisfactory pain control when paired with PCA morphine for donors undergoing living donor liver transplantation. Parecoxib in a multimodal analgesia regimen has similar efficacy, with possibly less morphine consumption, compared with ketorolac.


Subject(s)
Analgesia, Patient-Controlled/methods , Liver Transplantation , Living Donors , Pain, Postoperative/drug therapy , Tissue and Organ Harvesting/adverse effects , Adult , Analgesics, Opioid/administration & dosage , Anti-Inflammatory Agents, Non-Steroidal/administration & dosage , Cyclooxygenase 2 Inhibitors/administration & dosage , Drug Therapy, Combination , Female , Humans , Isoxazoles/administration & dosage , Ketorolac/administration & dosage , Male , Morphine/administration & dosage , Pain Management/methods , Pain Measurement , Pain, Postoperative/etiology , Retrospective Studies
12.
Transplant Proc ; 48(4): 1080-2, 2016 May.
Article in English | MEDLINE | ID: mdl-27320562

ABSTRACT

BACKGROUND: The aim of this study was to compare the outcomes of pain management using patient-controlled analgesia (PCA) fentanyl with IV parecoxib between patients with healthy livers and patients with diseased livers undergoing major liver resection. METHODS: Patients with healthy livers undergoing partial hepatectomy as liver donors for transplantation (group 1) and patients with liver cirrhosis (Child's criteria A) undergoing major liver resection for hepatoma (group 2) were identified retrospectively. Both groups routinely received post-operative IV PCA fentanyl and a single dose of parecoxib 40 mg. They were followed up for 3 days or until PCA fentanyl was discontinued post-operatively. Daily Visual Analog Scale (VAS) scores, PCA fentanyl usage, rescue attempts, and common drug side effects were collected and analyzed with SPSS version 20. RESULTS: One hundred one patients were included in the study: 54 in group 1 and 47 in group 2. There were no statistical differences between the two groups in daily and total fentanyl usage, resting VAS, or incidence of itchiness. The rate of rescue analgesia on post-operative day (POD) 1 was lower in group 2 (P = .045). Dynamic VAS scores were better on POD 1 and 2 in group 2 (P = .05 and P = .012, respectively). CONCLUSIONS: Combining a single dose of IV parecoxib 40 mg with PCA fentanyl is an easy and effective method of acute pain control after major liver resection. We propose careful usage of post-operative fentanyl and parecoxib in patients with diseased livers, given the difference in effect compared with healthy livers.


Subject(s)
Analgesics/therapeutic use , Fentanyl/therapeutic use , Hepatectomy/adverse effects , Isoxazoles/therapeutic use , Liver Cirrhosis/surgery , Pain, Postoperative/drug therapy , Acute Pain/diagnosis , Acute Pain/drug therapy , Acute Pain/etiology , Adult , Aged , Analgesia, Patient-Controlled , Drug Therapy, Combination , Female , Humans , Liver Transplantation , Living Donors , Male , Middle Aged , Pain Management , Pain Measurement , Pain, Postoperative/diagnosis , Pain, Postoperative/etiology , Retrospective Studies
13.
Transplant Proc ; 48(4): 1071-3, 2016 May.
Article in English | MEDLINE | ID: mdl-27320559

ABSTRACT

OBJECTIVE: Dual graft living donor liver transplantation (LDLT) is an alternative way to overcome small-for-size syndrome in LDLT. The surgical technique and outcomes of using dual grafts have been reported, but there are no reports regarding anesthetic management. The aim of this study was to compare the anesthetic management of single graft and dual graft liver transplantation. METHODS AND PATIENTS: Anesthesia records of 24 single graft liver transplantation recipients (GI) and 6 dual graft recipients (GII) were reviewed, analyzed, and compared retrospectively. Patient characteristics and intraoperative data were compared between groups with the Mann-Whitney U test and Fisher's exact test where appropriate. A P value less than .05 was regarded as significant. RESULTS: Patient characteristics and most of the intraoperative data were similar between groups. Significant differences were noted in total anesthesia time and anhepatic time, both of which were significantly longer in GII than in GI. CONCLUSION: Dual graft living donor liver transplantation is a technically more challenging and demanding procedure. The total anesthesia time, and especially the anhepatic phase, is therefore longer, because more graft vessels must be reconstructed before reperfusion. Overall, anesthetic management in terms of blood transfusion, fluid administration, sodium bicarbonate, calcium supplementation, and the number of patients requiring fractional diluted noradrenaline support to maintain acceptable hemodynamics did not differ much between the 2 groups.


Subject(s)
Anesthesia/methods , Liver Transplantation/methods , Monitoring, Intraoperative/statistics & numerical data , Adult , Anesthesia/adverse effects , Blood Transfusion/statistics & numerical data , Fluid Therapy/statistics & numerical data , Hemodynamics , Humans , Living Donors , Male , Middle Aged , Retrospective Studies , Statistics, Nonparametric
14.
Transplant Proc ; 48(4): 1077-9, 2016 May.
Article in English | MEDLINE | ID: mdl-27320561

ABSTRACT

BACKGROUND: This study tested the hypothesis that the low end-tidal carbon dioxide tension (ETCO2) encountered during the anhepatic phase of liver transplantation is related to hemodynamic rather than ventilatory status and can be used to predict the change in cardiac output during the anhepatic phase. METHODS: We retrospectively analyzed and compared data, including ETCO2, arterial blood pressure, heart rate, central venous pressure, cardiac output, cardiac index, and stroke volume, before and after inferior vena cava (IVC) clamping; at 0, 5, 10, and 30 minutes during the anhepatic phase; and 5 minutes after release of the IVC cross-clamp during the reperfusion phase, using the paired Student t test, repeated-measures analysis, and linear regression. P < .05 was regarded as significant. RESULTS: Cardiac output and ETCO2 decreased significantly after clamping of the inferior vena cava and increased concomitantly after unclamping. There was a positive correlation between the percentage changes in cardiac output and ETCO2 (Pearson coefficient r = 0.741). CONCLUSION: Changes in ETCO2 can be used to predict the percentage change in cardiac output when cardiac output monitoring is not available. Before unclamping of the IVC, mild hyperventilation is suggested to prevent an excessive increase in PaCO2.
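
A sketch of the correlation behind the conclusion above: percentage changes in ETCO2 and cardiac output after IVC clamping, compared with a Pearson coefficient. Paired measurements are simulated, not taken from the study:

```python
# Pearson correlation between percentage changes in ETCO2 and cardiac
# output after IVC clamping. Simulated paired data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
n = 40
co_before = rng.normal(7.0, 1.2, n)              # cardiac output, L/min
co_after = co_before * rng.normal(0.6, 0.1, n)   # drop after clamping
etco2_before = rng.normal(34.0, 3.0, n)          # mm Hg
# Tie the ETCO2 drop loosely to the CO drop, plus measurement noise.
etco2_after = etco2_before * (co_after / co_before) ** 0.5 \
              + rng.normal(0.0, 1.0, n)

pct_co = 100 * (co_after - co_before) / co_before
pct_etco2 = 100 * (etco2_after - etco2_before) / etco2_before

r, p = pearsonr(pct_etco2, pct_co)
print(f"Pearson r = {r:.3f}, P = {p:.4f}")
```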


Subject(s)
Carbon Dioxide/physiology , Cardiac Output/physiology , Liver Diseases/blood , Liver Diseases/physiopathology , Liver Transplantation , Vena Cava, Inferior/surgery , Adult , Blood Gas Analysis , Central Venous Pressure/physiology , Constriction , Heart Rate/physiology , Humans , Liver Circulation/physiology , Liver Diseases/surgery , Living Donors , Monitoring, Intraoperative , Retrospective Studies , Tidal Volume/physiology
15.
Osteoporos Int ; 27(5): 1691-9, 2016 May.
Article in English | MEDLINE | ID: mdl-26782682

ABSTRACT

UNLABELLED: This systematic review was performed to compare the diagnostic accuracy of vertebral fracture assessment (VFA) with that of spinal radiography for identification of vertebral fractures (VFs). VFA appeared to have moderate sensitivity and high specificity for detecting VFs compared with spinal radiography. INTRODUCTION: VFs are recognized as the hallmark of osteoporosis, and a previous VF increases the risk of a future fracture. Therefore, the timely detection of VFs is important for the prevention of further fractures. This systematic review examined the diagnostic accuracy of VFA using dual X-ray absorptiometry (DXA) to identify VFs. METHODS: We searched for potentially relevant studies in electronic databases, including Ovid-Medline, Ovid-EMBASE, the Cochrane Library, and four Korean databases, from their inception to May 2013. We compared the diagnostic accuracy of VFA with that of spinal radiography for detection of VFs by analyzing sensitivity and specificity using a 2 × 2 contingency table. Subgroup analyses were also performed on studies with a low risk of bias and applicability concerns. RESULTS: Twelve studies were analyzed for the diagnostic accuracy of VFA. Sensitivity and specificity were 0.70-0.93 and 0.95-1.00, respectively, on a per-vertebra basis, and 0.65-1.00 and 0.74-1.00 on a per-patient basis. In the five studies in the subgroup with a low risk of bias in the index test, sensitivity and specificity were 0.70-0.84 and 0.96-0.99, respectively. In the three studies with a low risk of bias in patient selection, per-vertebra sensitivity and specificity were 0.70-0.93 and 0.96-1.00, respectively. CONCLUSIONS: VFA had moderate sensitivity and high specificity for detecting VFs compared with spinal radiography. However, the present findings are insufficient to assess whether spinal radiography should be replaced by VFA.
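
Sensitivity and specificity in the review above come from 2 × 2 contingency tables of VFA results against spinal radiography as the reference standard. A minimal worked example with hypothetical per-vertebra counts:

```python
# Sensitivity and specificity of VFA versus spinal radiography from a
# 2x2 contingency table (per-vertebra). Counts are hypothetical.
tp, fn = 80, 20   # radiography-positive vertebrae: VFA positive / negative
fp, tn = 5, 895   # radiography-negative vertebrae: VFA positive / negative

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.2f}")  # 0.80, within the 0.70-0.93 range
print(f"specificity = {specificity:.2f}")  # 0.99, within the 0.95-1.00 range
```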


Subject(s)
Osteoporotic Fractures/diagnostic imaging , Spinal Fractures/diagnostic imaging , Absorptiometry, Photon/methods , Female , Humans , Male , Osteoporosis, Postmenopausal/complications , Osteoporotic Fractures/etiology , Radiography , Sensitivity and Specificity , Spinal Fractures/etiology
16.
Cancer Gene Ther ; 22(6): 302-11, 2015 Jun.
Article in English | MEDLINE | ID: mdl-26021486

ABSTRACT

Pediatric brainstem glioma is an incurable malignancy because of its inoperability. As a result of their extensive tropism toward cancer and the possibility of autologous transplantation, human adipose-derived mesenchymal stem cells (hAT-MSCs) are attractive vehicles for delivering therapeutic genes to brainstem gliomas. In this study, in a good manufacturing practice (GMP) facility, we established clinically applicable hAT-MSCs expressing therapeutic genes and investigated their therapeutic efficacy against brainstem glioma in mice. For feasible clinical application, (1) primary hAT-MSCs were cultured from human subcutaneous fat to make autologous transplantation possible; (2) hAT-MSCs were genetically engineered to express carboxyl esterase (CE); (3) an expression vector for a secreted form of tumor necrosis factor-related apoptosis-inducing ligand (sTRAIL), intended for synergistic effects, was delivered by a gene transfer technology that did not result in genomic integration of the vector; and (4) human CE and sTRAIL sequences were utilized to avoid immunological side effects. The hAT-MSCs expressing CE ± sTRAIL showed significant therapeutic effects against brainstem gliomas in vitro and in vivo. However, the simultaneous expression of CE and sTRAIL had no synergistic effect in vivo. The results indicate that non-viral transient single sTRAIL gene transfer to autologous hAT-MSCs is a clinically applicable stem cell-based gene therapy for brainstem gliomas in terms of therapeutic effects and safety.


Subject(s)
Adipose Tissue/cytology , Brain Stem Neoplasms/therapy , Genetic Therapy/methods , Glioma/therapy , Mesenchymal Stem Cell Transplantation , Animals , Cell Line, Tumor , Female , Humans , Mesenchymal Stem Cells/metabolism , Mice , TNF-Related Apoptosis-Inducing Ligand/genetics , Transgenes , Xenograft Model Antitumor Assays
18.
J Dent ; 38(2): 166-71, 2010 Feb.
Article in English | MEDLINE | ID: mdl-19819290

ABSTRACT

OBJECTIVES: Several studies have evaluated the effect of different topical fluoride regimens on the remineralization of initial carious lesions. This study compared the effects of 3 topical fluoride treatments on surface microhardness, fluoride uptake, and fluorescence lesion area in enamel. METHODS: Forty-eight bovine teeth were demineralized and subjected to one of the following treatments: (1) no treatment (control), (2) iontophoresis using 2% sodium fluoride solution, (3) 1.23% acidulated phosphate fluoride (APF) gel application, and (4) 5% sodium fluoride varnish application. Six persons wore a mandibular removable appliance mounted with eight treated bovine teeth for 4 weeks, removing it only while eating, sleeping, and brushing. Microhardness of the enamel surfaces was measured using a digital microhardness tester, the fluoride concentration was analyzed using a fluoride electrode, and the fluorescence lesion area was calculated by confocal laser scanning microscopy. RESULTS: No significant differences in microhardness were observed among the 3 fluoride regimens. The highest level of fluoride uptake was observed in the APF gel group, which also showed significantly reduced fluorescence lesion areas compared with those of the control group. CONCLUSIONS: The fluoride regimens showed no difference in surface microhardness; although APF gel showed the best effects in terms of fluoride uptake and decrease in fluorescence lesion area, its effects were not significantly different from those of fluoride varnish.


Subject(s)
Cariostatic Agents/therapeutic use , Dental Caries/prevention & control , Dental Enamel/drug effects , Fluorides, Topical/therapeutic use , Tooth Remineralization/methods , Acidulated Phosphate Fluoride/therapeutic use , Adult , Animals , Cariostatic Agents/pharmacokinetics , Cattle , Dental Caries/metabolism , Dental Caries/pathology , Dental Enamel/metabolism , Dental Enamel/pathology , Fluorescence , Fluorides/pharmacokinetics , Gels , Hardness , Humans , Ion-Selective Electrodes , Iontophoresis , Microscopy, Confocal , Sodium Fluoride/therapeutic use
19.
J Environ Qual ; 37(1): 207-18, 2008.
Article in English | MEDLINE | ID: mdl-18178894

ABSTRACT

Herbicide-tolerant Zoysia grass (Zoysia japonica Steud.) was previously generated through Agrobacterium tumefaciens-mediated transformation. The genetically modified (GM) Zoysia grass survived Basta spraying and grew to maturity normally, while the wild-type (WT) grass stopped growing and died. GM Zoysia grass will permit more efficient weed control for various turf grass plantings such as home lawns, golf courses, and parks. We examined the environmental/biodiversity risks of herbicide-tolerant GM Zoysia grass before applying to regulatory agencies for approval for commercial release. Substantial trait equivalence between GM and WT Zoysia grass, their ability to cross-pollinate, and gene flow in confined and unconfined test fields were analyzed for environmental/biodiversity effects. No difference in substantial traits was found between GM and WT Zoysia grass. To assess the potential for cross-pollination and gene flow, a non-selective herbicide, Basta, was used. Unintended cross-pollination with, and gene flow from, GM Zoysia grass were not detected in the neighboring weed species examined but were observed in WT Zoysia grass (on average, 6% in close proximity, 1.2% at a distance of 0.5 m, 0.12% at a radius of 3 m, and 0% at distances over 3 m). On the basis of these initial studies, we conclude that the GM Zoysia grass generated in our laboratory and tested in the Nam Jeju County field does not appear to pose a significant risk when cultivated outside of test fields.


Subject(s)
Herbicide Resistance , Plants, Genetically Modified/physiology , Poaceae/physiology , Adult , Antigens, Plant/immunology , Female , Gene Flow , Humans , Hybridization, Genetic , Hypersensitivity/etiology , Hypersensitivity/immunology , Korea , Male , Phenotype , Plants, Genetically Modified/anatomy & histology , Poaceae/anatomy & histology , Pollen/immunology , Pollination , Risk Assessment , Skin Tests , Wind
20.
Acta Neurochir Suppl ; 99: 125-32, 2006.
Article in English | MEDLINE | ID: mdl-17370778

ABSTRACT

We investigated the effect of stereotaxically transplanted human mesenchymal stem cells (hMSCs) on behavioral change after traumatic cold brain injury in adult rats. Cortical lesions (n = 20) were induced by touching a metal stamp, cooled with liquid nitrogen, to the dura over the forelimb motor cortex of adult rats. The procedure produced a localized lesion, and the animals showed significant motor deficits. hMSCs were freshly isolated from human iliac bone and cultured in tissue culture flasks with 10 mL of Dulbecco's modified Eagle's medium. The animals received hMSC grafts (3 × 10^5 hMSCs) 6 days after the cold lesion (n = 10). All rats were sacrificed 3 or 7 weeks after cold injury, and immunohistochemical staining was performed on brain sections to identify donor hMSCs. Neurological evaluations were performed with the forepaw adjusting step test and modified neurological scoring. Treatment with 3 × 10^5 hMSCs improved the rats' neurological function. We also found that the transplanted cells successfully migrated into the injured brain, preferentially localized around the injury site, and expressed neuronal and astrocyte markers. These data suggest that hMSCs may be a potential therapeutic tool for brain injuries.


Subject(s)
Brain Injuries/therapy , Mesenchymal Stem Cell Transplantation , Animals , Behavior, Animal , Bone Marrow Transplantation/pathology , Brain Injuries/pathology , Cerebral Cortex/pathology , Humans , Models, Animal , Motor Cortex/injuries , Motor Cortex/pathology , Rats , Transplantation, Heterologous