Results 1 - 17 of 17
1.
Article in English | MEDLINE | ID: mdl-38581223

ABSTRACT

BACKGROUND: The body adapts to fasting by breaking down glycogen stored in the liver and muscle protein, and by triggering lipid mobilisation. As a result, glycerol and fatty acids are released into the bloodstream, increasing the production of ketone bodies in the liver. However, few studies have examined the incidence of perioperative urinary ketosis, intraoperative blood glucose changes, and metabolic acidosis after fasting for surgery in non-diabetic adult patients. METHODS: We conducted a retrospective cohort study of 1831 patients undergoing gynecologic surgery under general anesthesia from January to December 2022. Ketosis was assessed using a postoperative urine test, while blood glucose levels and acid-base status were collected from intraoperative arterial blood gas analyses. RESULTS: Of 1535 patients who underwent postoperative urinalysis, 912 (59.4%) had ketonuria. Patients with ketonuria were younger, had a lower body mass index, and had fewer comorbidities than those without ketonuria. After adjustment, younger age, lower body mass index, and surgery starting in the late afternoon were significant risk factors for postoperative ketonuria. Of the 929 patients assessed with intraoperative arterial blood gas analyses, 29.0% showed metabolic acidosis. Multivariable logistic regression revealed that perioperative ketonuria and prolonged surgery significantly increased the risk of moderate-to-severe metabolic acidosis. CONCLUSION: Perioperative urinary ketosis and intraoperative metabolic acidosis are common in patients undergoing gynecologic surgery, even with short-term preoperative fasting. The risks are notably higher in younger patients with a lower body mass index. Optimization of preoperative fasting strategies, including oral carbohydrate loading, should be considered to reduce perioperative metabolic derangement due to ketosis.

2.
Crit Care ; 28(1): 76, 2024 03 14.
Article in English | MEDLINE | ID: mdl-38486247

ABSTRACT

BACKGROUND: A real-time model for predicting short-term mortality in critically ill patients is needed to identify patients at imminent risk. However, the performance of such a model must be validated in various clinical settings and ethnicities before clinical application. In this study, we aimed to develop an ensemble machine learning model using routinely measured clinical variables at a single academic institution in South Korea. METHODS: We developed an ensemble model combining deep learning and light gradient boosting machine models. Internal validation was performed using the last two years of the internal cohort dataset, collected from a single academic hospital in South Korea between 2007 and 2021. External validation was performed using the full Medical Information Mart for Intensive Care (MIMIC), eICU-Collaborative Research Database (eICU-CRD), and Amsterdam University Medical Center database (AmsterdamUMCdb) data. The area under the receiver operating characteristic curve (AUROC) was calculated and compared to that of the National Early Warning Score (NEWS). RESULTS: The developed model (iMORS) demonstrated high predictive performance, with an internal AUROC of 0.964 (95% confidence interval [CI] 0.963-0.965) and external AUROCs of 0.890 (95% CI 0.889-0.891) for MIMIC, 0.886 (95% CI 0.885-0.887) for eICU-CRD, and 0.870 (95% CI 0.868-0.873) for AmsterdamUMCdb. The model outperformed the NEWS in both internal and external validation (AUROCs of 0.866 for the internal cohort, 0.746 for MIMIC, 0.798 for eICU-CRD, and 0.819 for AmsterdamUMCdb; p < 0.001). CONCLUSIONS: Our real-time machine learning model for predicting short-term mortality in critically ill patients showed excellent performance in both internal and external validation. The model could be a useful decision-support tool in intensive care units to assist clinicians.
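The comparison above hinges on the AUROC, which can be computed from ranks alone, and on ensembling by averaging the members' predicted probabilities. A minimal sketch (the toy labels and scores below are invented for illustration, not study data):

```python
def rank_average(scores):
    """Ranks 1..n, with tied values assigned their average rank."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1  # average rank over the tie block
        i = j + 1
    return ranks

def auroc(y_true, y_score):
    """AUROC via the Mann-Whitney U identity on the score ranks."""
    ranks = rank_average(y_score)
    pos = [r for r, y in zip(ranks, y_true) if y == 1]
    n_pos, n_neg = len(pos), len(y_true) - len(pos)
    return (sum(pos) - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Ensemble = mean of the two members' predicted probabilities.
y = [0, 0, 1, 1]
p_deep = [0.10, 0.40, 0.35, 0.80]  # hypothetical deep-model outputs
p_gbm = [0.20, 0.30, 0.60, 0.70]   # hypothetical boosting-model outputs
p_ens = [(a + b) / 2 for a, b in zip(p_deep, p_gbm)]
print(auroc(y, p_deep), auroc(y, p_gbm), auroc(y, p_ens))
```

In this toy example the averaged ensemble scores at least as well as either member, which is the usual motivation for combining complementary model families.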


Subject(s)
Academic Medical Centers , Critical Illness , Humans , Area Under Curve , Critical Care , Intensive Care Units , Machine Learning
3.
Clin Transplant ; 38(1): e15231, 2024 01.
Article in English | MEDLINE | ID: mdl-38289882

ABSTRACT

INTRODUCTION: There is insufficient evidence regarding the optimal regimen for ascites replacement after living donor liver transplantation (LT) and its effectiveness. The aim of this study was to evaluate the impact of replacing postoperative ascites with albumin after LT on the time to first flatus during recovery with early ambulation and on the incidence of acute kidney injury (AKI). METHODS: Adult patients who underwent elective living donor LT at Seoul National University Hospital from 2019 to 2021 were randomly assigned to either the albumin group or the lactated Ringer's group, which determined the ascites replacement regimen. Postoperative ascites was replaced in all patients every 4 h after LT until transfer to the general ward. Seventy percent of the ascites drained during the previous 4 h was replaced over the next 4 h by continuous infusion, with the fluid composition prescribed according to the assigned group. In the albumin group, 30% of the drained volume was replaced with 5% albumin solution and the remaining 40% with lactated Ringer's solution. In the lactated Ringer's group, the full 70% was replaced with lactated Ringer's solution alone. The primary outcome was the time to first flatus from the end of LT, and the secondary outcome was the incidence of AKI for up to postoperative day 7. RESULTS: Of the 157 patients screened for eligibility, 72 were enrolled. The mean age was 63 ± 8.2 years, and 73.0% (46/63) were male. Time to first flatus was similar between the two groups (66.7 ± 24.1 h vs. 68.5 ± 25.6 h, p = .778). The albumin group showed a higher glomerular filtration rate and a lower incidence of AKI until postoperative day 7 compared with the lactated Ringer's group. CONCLUSIONS: Using lactated Ringer's solution alone for ascites replacement after living donor LT did not reduce the time to first flatus and was associated with an increased risk of AKI. Further research is required on the optimal ascites replacement regimen and on the target serum albumin level to be maintained after LT.
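The replacement rule reduces to fixed fractions of the previous 4-hour drain. The helper below is an illustrative sketch of the regimen as stated (the function name and units are ours, not the study's):

```python
def replacement_volumes(drained_ml, group):
    """Volumes (mL) to infuse over the next 4 h per the trial's regimen:
    70% of the previous 4 h of ascites output is replaced; the albumin
    group receives 30% as 5% albumin plus 40% as lactated Ringer's, while
    the lactated Ringer's group receives the full 70% as lactated Ringer's."""
    if group == "albumin":
        return {"albumin_5pct": 0.30 * drained_ml,
                "lactated_ringers": 0.40 * drained_ml}
    elif group == "lactated_ringers":
        return {"albumin_5pct": 0.0,
                "lactated_ringers": 0.70 * drained_ml}
    raise ValueError("group must be 'albumin' or 'lactated_ringers'")

# 1000 mL drained over 4 h in the albumin group:
print(replacement_volumes(1000, "albumin"))
```

So a patient draining 1000 mL over 4 h would receive 300 mL of 5% albumin plus 400 mL of lactated Ringer's in the albumin group, versus 700 mL of lactated Ringer's alone in the comparison group.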


Subject(s)
Acute Kidney Injury , Liver Transplantation , Aged , Female , Humans , Male , Middle Aged , Acute Kidney Injury/etiology , Albumins , Ascites/etiology , Flatulence , Isotonic Solutions , Liver Transplantation/adverse effects , Living Donors , Ringer's Lactate
4.
J Hepatobiliary Pancreat Sci ; 31(1): 34-41, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37792597

ABSTRACT

BACKGROUND/PURPOSE: Prophylactic antibiotic administration before percutaneous biliary intervention (PBI) is currently recommended, but the underlying evidence is mostly extrapolated from studies of prophylactic antibiotics before surgery. The aim of this study was to evaluate the impact of the timing of prophylactic antibiotic administration on the incidence of suspected systemic infection after PBI. METHODS: The incidence of suspected systemic infection after PBI was compared among patients who received prophylactic antibiotics at four different intervals between antibiotic administration and skin puncture for PBI. Suspected post-intervention systemic infection was assessed according to predetermined clinical criteria. RESULTS: There were 98 (21.6%) suspected systemic infections after 454 PBIs in 404 patients. The incidence of suspected systemic infection after the intervention differed significantly among the four groups (p = .020). Fever was the most common sign of suspected systemic infection. Administration of prophylactic antibiotics more than an hour before PBI was identified as an independent risk factor for suspected systemic infection after adjusting for other relevant factors (adjusted odds ratio = 10.54; 95% confidence interval, 1.40-78.86). CONCLUSIONS: The incidence of suspected systemic infection after PBI was significantly lower when prophylactic antibiotics were administered within an hour before the intervention.


Subject(s)
Anti-Bacterial Agents , Biliary Tract Surgical Procedures , Humans , Anti-Bacterial Agents/therapeutic use , Antibiotic Prophylaxis , Surgical Wound Infection/prevention & control
5.
Sci Rep ; 13(1): 19947, 2023 11 15.
Article in English | MEDLINE | ID: mdl-37968287

ABSTRACT

Although the pulmonary artery catheter (PAC) has been used during liver transplantation surgery, its usefulness has rarely been investigated. We evaluated whether the use of a PAC is associated with better clinical outcomes after liver transplantation compared with arterial waveform-based monitoring. A total of 1565 liver transplantation cases were reviewed. The cohort was divided into a PAC group, with hemodynamic monitoring using a PAC, and a non-PAC group, with arterial waveform-based monitoring using FloTrac-Vigileo. Propensity score matching was performed. Acute kidney injury (AKI), early allograft dysfunction (EAD), and 1-year all-cause mortality or graft failure were compared in the matched cohorts. Logistic regression analysis was performed in the inverse probability of treatment-weighted (IPTW) cohort for postoperative EAD and AKI, respectively. Five-year overall survival was compared between the two groups. In the matched cohort, there was no significant difference in the incidence of AKI or EAD, length of hospital or ICU stay, or 1-year all-cause mortality between the groups. In the IPTW cohort, the use of a PAC was not a significant predictor of AKI or EAD (AKI: odds ratio (95% confidence interval) 1.20 (0.47-1.56), p = 0.229; EAD: 0.99 (0.38-1.14), p = 0.323). There was no significant difference in survival between the groups after propensity score matching (log-rank test p = 0.578). In conclusion, posttransplant clinical outcomes were not significantly different between the groups with and without a PAC. Anesthetic management without a PAC may be feasible in low-risk patients during liver transplantation. The risk should be carefully assessed by considering MELD score, ischemic time, surgical history, previous treatment of the underlying liver disease, and the degree of portal and pulmonary hypertension. Registration: https://clinicaltrials.gov/ct2/show/NCT05457114 (registration date: July 15, 2022).


Subject(s)
Acute Kidney Injury , Liver Transplantation , Humans , Liver Transplantation/adverse effects , Pulmonary Artery , Retrospective Studies , Acute Kidney Injury/diagnosis , Acute Kidney Injury/etiology , Catheters
6.
BMC Nephrol ; 24(1): 334, 2023 11 10.
Article in English | MEDLINE | ID: mdl-37950190

ABSTRACT

BACKGROUND: Continuous renal replacement therapy is a relatively common modality applied to critically ill patients with renal impairment. Sufficient blood flow through the circuit is crucial for stable continuous renal replacement therapy, but catheter dysfunction reduces blood flow, producing inadequate pressures within the circuit. Therefore, identifying and modifying risk factors for catheter dysfunction can help provide continuous renal replacement therapy with minimal interruption. METHODS: Adult patients who received continuous renal replacement therapy at Seoul National University Hospital between January 2019 and December 2021 were retrospectively analyzed. Patients who received continuous renal replacement therapy via a temporary hemodialysis catheter, inserted at the bedside under ultrasound guidance within 12 h of continuous renal replacement therapy initiation, were included. RESULTS: A total of 507 continuous renal replacement therapy sessions in 457 patients were analyzed. Dialysis catheter dysfunction occurred in 119 sessions (23.5%). Multivariate analysis showed that a less prolonged prothrombin time (adjusted OR 0.49, 95% CI 0.30-0.82, p = 0.007) and activated partial thromboplastin time (adjusted OR 1.01, 95% CI 1.00-1.01, p = 0.049) were associated with an increased risk of catheter dysfunction. Risk factors for re-catheterization included vascular access via the left jugular and femoral veins. CONCLUSIONS: In critically ill patients undergoing continuous renal replacement therapy, a less prolonged prothrombin time was associated with earlier catheter dysfunction. Use of the left internal jugular vein or the femoral vein was associated with an increased risk of re-catheterization compared with the right internal jugular vein.


Subject(s)
Catheterization, Central Venous , Continuous Renal Replacement Therapy , Adult , Humans , Renal Dialysis/adverse effects , Retrospective Studies , Critical Illness/therapy , Catheters, Indwelling/adverse effects , Catheterization , Risk Factors , Catheterization, Central Venous/adverse effects , Renal Replacement Therapy/adverse effects
7.
Sci Rep ; 13(1): 15599, 2023 09 20.
Article in English | MEDLINE | ID: mdl-37730856

ABSTRACT

Guidelines from the World Health Organization strongly recommend the use of a high fraction of inspired oxygen (FiO2) in adult patients undergoing general anesthesia to reduce surgical site infection (SSI). However, previous meta-analyses have reported inconsistent results. We aimed to address this controversy by focusing specifically on abdominal surgery, which carries a relatively high risk of SSI. The Medline, EMBASE, and Cochrane CENTRAL databases were searched. Randomized trials of abdominal surgery comparing high with low perioperative FiO2 were included, provided that the incidence of SSI was reported as an outcome. Meta-analyses of risk ratios (RR) were performed using a fixed-effect model. Subgroup analysis and meta-regression were employed to explore sources of heterogeneity. We included 27 trials involving 15977 patients. The use of high FiO2 significantly reduced the incidence of SSI (n = 27, RR: 0.87; 95% confidence interval (CI): 0.79, 0.95; I2 = 49%, Z = 3.05). Trial sequential analysis (TSA) revealed that the z-curve crossed the trial sequential boundary and that the data are sufficient. This finding held true for the subgroup of emergency operations (n = 2, RR: 0.54; 95% CI: 0.35, 0.84; I2 = 0%, Z = 2.75), procedures using air as the carrier gas (n = 9, RR: 0.79; 95% CI: 0.69, 0.91; I2 = 60%, Z = 3.26), and when a high FiO2 was maintained for 6 h or more postoperatively (n = 9, RR: 0.68; 95% CI: 0.56, 0.83; I2 = 46%, Z = 3.83). Meta-regression revealed no significant interaction between SSI and any covariate, including age, sex, body mass index, diabetes mellitus, duration of surgery, and smoking. The quality of evidence was assessed as moderate to very low. Our pooled analysis revealed that the application of high FiO2 reduced the incidence of SSI after abdominal operations. Although the TSA demonstrated sufficient data and the cumulative analysis crossed the TSA boundary, our results should be interpreted cautiously given the low quality of evidence. Registration: https://www.crd.york.ac.uk/prospero (CRD42022369212) in October 2022.
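Fixed-effect pooling of risk ratios, as used above, is inverse-variance weighting on the log scale. A minimal sketch (the example trials are invented for illustration, not data from the review):

```python
import math

def pooled_rr_fixed(trials):
    """Inverse-variance fixed-effect pooled risk ratio with a 95% CI.
    Each trial is (events_high, n_high, events_low, n_low)."""
    wsum = wlog = 0.0
    for a, n1, c, n0 in trials:
        log_rr = math.log((a / n1) / (c / n0))
        var = 1 / a - 1 / n1 + 1 / c - 1 / n0  # variance of log RR
        w = 1.0 / var                          # inverse-variance weight
        wsum += w
        wlog += w * log_rr
    mean = wlog / wsum
    se = math.sqrt(1.0 / wsum)
    return math.exp(mean), (math.exp(mean - 1.96 * se),
                            math.exp(mean + 1.96 * se))

# Two hypothetical trials, each with fewer SSIs under high FiO2.
print(pooled_rr_fixed([(10, 100, 20, 100), (15, 200, 24, 200)]))
```

A single trial with half the event risk in the high-FiO2 arm pools, as expected, to RR = 0.5; adding trials simply shifts the weighted mean of the log risk ratios.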


Subject(s)
Anesthesia, General , Surgical Wound Infection , Adult , Humans , Surgical Wound Infection/prevention & control , Anesthesia, General/adverse effects , Body Mass Index , Databases, Factual , Oxygen
8.
Anesth Pain Med (Seoul) ; 18(3): 213-219, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37691592

ABSTRACT

With growing interest in machine learning and artificial intelligence (AI) based on large datasets, their role in medical research has become increasingly prominent. Despite the proliferation of predictive models in perioperative medicine, external validation is lacking. Open datasets, defined as publicly available datasets for research, play a crucial role by providing high-quality data, facilitating collaboration, and allowing an objective evaluation of the developed models. Among the available datasets for surgical patients, VitalDB has been the most widely used, with the Medical Informatics Operating Room Vitals and Events Repository recently launched and the Informative Surgical Patient dataset for Innovative Research Environment expected to be released soon. For critically ill patients, the available resources include the Medical Information Mart for Intensive Care, the eICU Collaborative Research Database, the Amsterdam University Medical Centers Database, and the High time-resolution ICU Dataset, with the anticipated release of the Intensive Care Network with Million Patients' information for the AI Clinical decision support system Technology dataset. This review presents a detailed comparison of each dataset to enrich our understanding of these open resources for data science and AI research in perioperative medicine.

9.
Transplant Proc ; 55(7): 1715-1725, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37419732

ABSTRACT

BACKGROUND: Hematopoietic stem cell transplantation (HSCT) is a complex, high-risk procedure with significant morbidity and mortality. A positive impact of higher institutional case volume on survival has been reported for various high-risk procedures. The association between annual institutional HSCT case volume and mortality was analyzed using the National Health Insurance Service database. METHODS: Data on 16,213 HSCTs performed in 46 Korean centers between 2007 and 2018 were extracted. Centers were divided into low- and high-volume centers using an average of 25 annual cases as the cut-off. Adjusted odds ratios (OR) for 1-year mortality after allogeneic and autologous HSCT were estimated using multivariable logistic regression. RESULTS: For allogeneic HSCT, low-volume centers (≤25 cases/y) were associated with higher 1-year mortality (adjusted OR 1.17, 95% CI 1.04-1.31, P = .008). However, low-volume centers did not show higher 1-year mortality for autologous HSCT (adjusted OR 1.03, 95% CI 0.89-1.19, P = .709). Long-term mortality after HSCT was significantly worse in low-volume centers than in high-volume centers (adjusted hazard ratio [HR] 1.17, 95% CI 1.09-1.25, P < .001 for allogeneic and adjusted HR 1.09, 95% CI 1.01-1.17, P = .024 for autologous HSCT). CONCLUSION: Our data suggest that higher institutional HSCT case volume is associated with better short- and long-term survival.


Subject(s)
Health Facilities , Hematopoietic Stem Cell Transplantation , Humans , Transplantation, Autologous , Data Collection , Hematopoietic Stem Cell Transplantation/adverse effects , Retrospective Studies
10.
J Am Coll Surg ; 237(4): 606-613, 2023 10 01.
Article in English | MEDLINE | ID: mdl-37350477

ABSTRACT

BACKGROUND: Atelectasis is a common complication after upper abdominal surgery and is considered a cause of early postoperative fever (EPF) within 48 hours after surgery. However, the pathophysiologic mechanism by which atelectasis might cause fever remains unclear. STUDY DESIGN: Data for adult patients who underwent elective major upper abdominal surgery under general anesthesia at Seoul National University Hospital between January and December 2021 were retrospectively analyzed. The primary outcome was the association between fever and atelectasis within 2 days after surgery. RESULTS: Of 1,624 patients, 810 (49.9%) developed EPF. The incidence of atelectasis was similar between the fever and no-fever groups (51.6% vs 53.9%, p = 0.348). Multivariate analysis showed no significant association between atelectasis and EPF. Culture tests (21.7% vs 8.8%, p < 0.001) and prolonged use of antibiotics (25.9% vs 13.9%, p < 0.001) were more frequent in the fever group than in the no-fever group. However, the frequency of bacterial growth on culture tests and of postoperative pulmonary complications within 7 days was similar between the two groups. CONCLUSIONS: EPF after major upper abdominal surgery was not associated with radiologically detected atelectasis. Nor was EPF associated with an increased risk of postoperative pulmonary complications, bacterial growth on culture studies, or prolonged length of hospital stay.


Subject(s)
Pulmonary Atelectasis , Adult , Humans , Retrospective Studies , Pulmonary Atelectasis/etiology , Pulmonary Atelectasis/complications , Lung , Postoperative Complications/diagnosis , Postoperative Complications/epidemiology , Postoperative Complications/etiology , Elective Surgical Procedures/adverse effects
11.
Clin Transl Sci ; 16(7): 1177-1185, 2023 07.
Article in English | MEDLINE | ID: mdl-37038357

ABSTRACT

Antithrombin-III (AT-III) concentrates have been used in the immediate postoperative period after liver transplantation to prevent critical thrombosis. We aimed to identify a more appropriate method of AT-III concentrate administration for maintaining the plasma AT-III activity level within the target range. In this randomized controlled trial, 130 adult patients undergoing living-donor liver transplantation were randomized to either the intermittent group or the continuous group. In the intermittent group, 500 international units (IU) of AT-III concentrate were administered after liver transplantation and repeated every 6 h for 72 h. In the continuous group, 3000 IU of AT-III were continuously infused over 71 h after a loading dose of 2000 IU over 1 h. Plasma AT-III activity level was measured at 12, 24, 48, 72, and 84 h from the first AT-III administration. The primary outcome was the target (80%-120%) attainment rate at 72 h. Target attainment rates at the other timepoints and associated complications were collected as secondary outcomes. A total of 107 patients were included in the analysis. The target attainment rates at 72 h post-dose were 30% in the intermittent group and 62% in the continuous group (p = 0.003). Compared with the intermittent group, patients in the continuous group reached the target level more rapidly (median 12 vs. 24 h, p < 0.001) and were more likely to remain in the target range until 84 h. For maintaining the target plasma AT-III activity level after living-donor liver transplantation, continuous infusion of AT-III appears more appropriate than the conventional intermittent infusion regimen.
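The two regimens deliver different cumulative doses over the 72-hour window. The sketch below encodes our reading of the schedules; in particular, counting 13 intermittent doses from t = 0 through t = 72 h is an assumption, as the abstract does not state whether a dose is given at the 72-h mark:

```python
def intermittent_units(t_h):
    """Cumulative AT-III (IU) in the intermittent arm: 500 IU at t = 0,
    repeated every 6 h for 72 h (13 doses in total, by this reading)."""
    if t_h < 0:
        return 0
    return 500 * (min(int(t_h // 6), 12) + 1)

def continuous_units(t_h):
    """Cumulative AT-III (IU) in the continuous arm: 2000 IU loaded
    over the first hour, then 3000 IU infused evenly over the next 71 h."""
    load = 2000 * min(max(t_h, 0.0), 1.0)
    maint = 3000 * min(max(t_h - 1.0, 0.0), 71.0) / 71.0
    return load + maint

print(intermittent_units(72), continuous_units(72))  # 6500 5000.0
```

Under these assumptions the continuous arm delivers a lower total dose yet, per the trial's results, attains the activity target faster and more reliably, consistent with steadier plasma levels from continuous infusion.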


Subject(s)
Liver Transplantation , Thrombosis , Adult , Humans , Antithrombin III , Liver Transplantation/adverse effects , Living Donors , Anticoagulants , Thrombosis/etiology , Thrombosis/prevention & control
12.
Sci Rep ; 13(1): 6951, 2023 04 28.
Article in English | MEDLINE | ID: mdl-37117258

ABSTRACT

Corticosteroids remain the mainstay of immunosuppression for liver transplant recipients despite several serious complications, including infection, hepatitis C virus (HCV) recurrence, diabetes mellitus (DM), and hypertension. We compared the safety and efficacy of T-cell specific antibody induction with complete corticosteroid avoidance. We searched MEDLINE, EMBASE, and Cochrane CENTRAL. Randomized controlled trials comparing T-cell specific antibody induction with corticosteroid induction immunosuppression were included. Our primary outcome was the incidence of biopsy-proven acute rejection. Eleven trials involving 1683 patients were included. The incidence of acute rejection was not significantly different between the antibody and steroid induction groups (risk ratio [RR] 0.85, 95% confidence interval [CI] 0.72, 1.01, P = 0.06, I2 = 0%). However, T-cell specific antibody induction significantly reduced the risk of cytomegalovirus infection (RR 0.48, 95% CI 0.33, 0.70, P = 0.0002, I2 = 3%), HCV recurrence (RR 0.89, 95% CI 0.80, 0.99, P = 0.03, I2 = 0%), DM (RR 0.41, 95% CI 0.32, 0.54, P < 0.0001, I2 = 0%), and hypertension (RR 0.71, 95% CI 0.55, 0.90, P = 0.005, I2 = 35%). Trial sequential analysis for acute rejection showed that the cumulative z-curve did not cross the trial sequential boundary and that the required information size was not reached. T-cell specific antibody induction thus appears to significantly reduce opportunistic infections, including cytomegalovirus infection and HCV recurrence, and metabolic complications, including DM and hypertension. However, given the insufficient study power, low quality of evidence, and heterogeneous immunosuppressive regimens, our results should be interpreted with caution.


Subject(s)
Cytomegalovirus Infections , Diabetes Mellitus , Hepatitis C , Hypertension , Liver Transplantation , Humans , Liver Transplantation/adverse effects , Immunosuppression Therapy/methods , Immunosuppressive Agents/adverse effects , Antilymphocyte Serum , Adrenal Cortex Hormones/therapeutic use , T-Lymphocytes
13.
Nutrition ; 99-100: 111638, 2022.
Article in English | MEDLINE | ID: mdl-35576874

ABSTRACT

OBJECTIVES: Because most patients who develop a pressure ulcer (PU) are malnourished, additional nutritional support is important for PU improvement. The aim of this study was to investigate the potential benefit of a simple nutritional support protocol for PU improvement. METHODS: This was a comparative before-and-after study, prospectively performed from May to December 2020 among inpatients of Seoul National University Hospital (SNUH), South Korea. The nutritional support protocol was implemented at SNUH in May 2020. Among the patients who developed a PU from May to December 2020, those on enteral nutrition were included in the protocol group. Serum levels of prealbumin, transferrin, cholesterol, and zinc were measured initially and 2 and 4 wk after protocol application to evaluate the clinical course. A tailored regimen adjusting the amounts of protein and trace elements was provided according to consultation with the nutritional support team. Ulcer size and the Pressure Ulcer Scale for Healing score were evaluated every 2 wk by the same nurse in charge of PU care. To validate the efficacy of the protocol, patients who developed a PU from May to December 2018, were hospitalized for >2 wk, and received enteral nutrition were selected as a control group. RESULTS: Sixty-one patients were included in the protocol group and 100 in the control group. The protocol group had a higher proportion of PU improvement (85.2% versus 50%; P < 0.001), daily protein intake (1.6 ± 3.2 versus 0.9 ± 0.4; P = 0.048), Braden scale score (12.9 ± 1.8 versus 12.3 ± 1.8; P = 0.025), and baseline albumin level (3.1 ± 0.5 versus 2.8 ± 0.4; P = 0.001) compared with the control group. Multivariate analysis showed that implementation of the nutritional support protocol was the most effective factor in improving PU (odds ratio, 0.18; 95% confidence interval, 0.089-0.366; P < 0.001). CONCLUSIONS: A simple nutritional support protocol was easy to develop, and its application contributed significantly to PU recovery.


Subject(s)
Malnutrition , Pressure Ulcer , Enteral Nutrition/methods , Hospitalization , Humans , Malnutrition/complications , Malnutrition/therapy , Nutritional Support , Pressure Ulcer/etiology , Pressure Ulcer/therapy
14.
BMC Anesthesiol ; 20(1): 285, 2020 11 14.
Article in English | MEDLINE | ID: mdl-33189145

ABSTRACT

BACKGROUND: Cerebral oximetry has been widely used to measure regional oxygen saturation in brain tissue, especially during cardiac surgery. Despite its popularity, results on the use of cerebral oximetry during cardiac surgery have been inconsistent, and few studies have evaluated cerebral oximetry during off-pump coronary artery bypass graft surgery (OPCAB). METHODS: To evaluate the relationship between intraoperative cerebral oximetry and postoperative delirium in patients who underwent OPCAB, we reviewed 1439 patients who underwent OPCAB between October 2004 and December 2016; among them, 815 patients with sufficient data on regional cerebral oxygen saturation (rSO2) were enrolled in this study. We retrospectively analyzed perioperative variables and the reduction in rSO2 below cut-off values of 75, 70, 65, 60, 55, 50, 45, 40, and 35%. Furthermore, we evaluated the relationship between the reduction in rSO2 and postoperative delirium. RESULTS: Delirium occurred in 105 of 815 patients. In both univariable and multivariable analyses, the duration of rSO2 reduction was significantly longer in patients with delirium at cut-offs of <50% and <45% (for every 5 min, adjusted odds ratio (OR) 1.007 [95% confidence interval (CI) 1.001 to 1.014] and adjusted OR 1.012 [1.003 to 1.021]; p = 0.024 and 0.011, respectively). The proportion of patients with an rSO2 reduction below 45% was significantly higher among those with delirium (adjusted OR 1.737 [1.064 to 2.836], p = 0.027). CONCLUSIONS: In patients undergoing OPCAB, intraoperative rSO2 reduction was associated with postoperative delirium. The duration of rSO2 below 50% was 40% longer in patients with postoperative delirium. The cut-off value of intraoperative rSO2 associated with postoperative delirium was 50% for the total patient population and 55% for patients younger than 68 years.


Subject(s)
Brain/metabolism , Coronary Artery Bypass, Off-Pump , Emergence Delirium/epidemiology , Monitoring, Intraoperative/methods , Oximetry/methods , Oxygen/metabolism , Aged , Brain/physiopathology , Cerebrovascular Circulation/physiology , Emergence Delirium/physiopathology , Female , Humans , Male , Republic of Korea/epidemiology , Retrospective Studies
15.
Pediatr Neurosurg ; 55(1): 36-41, 2020.
Article in English | MEDLINE | ID: mdl-31940654

ABSTRACT

INTRODUCTION: Intravenous patient-controlled analgesia (PCA) has been one of the most popular modalities for postoperative pain management in orthopedic surgery, plastic surgery, and neurosurgery in children. OBJECTIVE: We compared the effects of fentanyl and sufentanil used in intravenous PCA on postoperative pain management and opioid-related side effects in pediatric moyamoya disease. METHODS: This retrospective study included 97 pediatric patients who underwent surgery for moyamoya disease. Preoperative and perioperative parameters were assessed. The PCA regimens were as follows: fentanyl group (0.2 µg/kg/mL, 1 mL loading volume, 0.1 µg/kg/h basal infusion, 0.2 µg/kg bolus on demand, 15-min lock-out interval); sufentanil group (0.04 µg/kg/mL, 1 mL loading volume, 0.02 µg/kg/h basal infusion, 0.04 µg/kg bolus on demand, 15-min lock-out). Both groups received the same loading dose of ramosetron, 10 µg/kg (up to 300 µg), for prophylaxis of postoperative nausea and vomiting. Peripheral nerve blocks were performed. Pain was assessed by numeric rating scale or the revised Faces Pain Scale. Side effects were reviewed. RESULTS: The two groups showed similar pain scores and incidence of nausea or vomiting during the first 48 h postoperatively. Additional analgesics were required more frequently in the fentanyl group, and PCA was discontinued more frequently in the sufentanil group. CONCLUSIONS: Postoperatively, sufentanil-based PCA provided better analgesia than fentanyl-based PCA, with fewer additional analgesics, in moyamoya disease. However, PCA with sufentanil was more frequently discontinued due to nausea or vomiting compared with fentanyl-based PCA.


Subject(s)
Analgesia, Patient-Controlled , Analgesics, Opioid/therapeutic use , Fentanyl/therapeutic use , Moyamoya Disease/surgery , Pain, Postoperative/drug therapy , Sufentanil/therapeutic use , Administration, Intravenous , Adolescent , Analgesics, Opioid/adverse effects , Child , Child, Preschool , Female , Fentanyl/administration & dosage , Fentanyl/adverse effects , Humans , Male , Nausea/etiology , Pain Measurement , Retrospective Studies , Sufentanil/administration & dosage , Sufentanil/adverse effects , Vomiting/etiology
16.
J Clin Med ; 8(1)2018 Dec 28.
Article in English | MEDLINE | ID: mdl-30597881

ABSTRACT

Acute kidney injury (AKI) is a frequent complication after living donor liver transplantation (LDLT) and is associated with increased mortality. However, the association between intraoperative oliguria and the risk of AKI after LDLT remains uncertain. We sought to determine the association between the risk of AKI after LDLT and intraoperative oliguria, both alone and coupled with hemodynamic derangement. We evaluated hemodynamic variables, including mean arterial pressure, cardiac index, and mixed venous oxygen saturation (SvO2). We reviewed 583 adult patients who had no baseline renal dysfunction and did not receive hydroxyethyl starch during surgery. AKI was defined by the serum creatinine criteria of the Kidney Disease Improving Global Outcomes classification. Multivariable logistic regression analysis was performed with and without oliguria, and with oliguria coupled with a decrease in SvO2; predictive performance was compared using the area under the receiver operating characteristic curve (AUC). Intraoperative oliguria <0.5 and <0.3 mL/kg/h were significantly associated with the risk of AKI; however, their performance in predicting AKI was poor. The AUC of the single predictors increased significantly when oliguria was combined with decreased SvO2 (AUC 0.72; 95% confidence interval (CI) 0.68–0.75 vs. AUC of oliguria alone 0.61; 95% CI 0.56–0.61; p < 0.0001; vs. AUC of SvO2 alone 0.66; 95% CI 0.61–0.70; p < 0.0001). Adding oliguria coupled with SvO2 reduction also increased the AUC of the multivariable prediction model (AUC 0.87; 95% CI 0.84–0.90 vs. AUC with oliguria 0.73; 95% CI 0.69–0.77; p < 0.0001; vs. AUC with neither oliguria nor SvO2 reduction 0.68; 95% CI 0.64–0.72; p < 0.0001). Intraoperative oliguria coupled with a decrease in SvO2 may indicate the risk of AKI after LDLT more reliably than either oliguria or an SvO2 decrease alone.
Intraoperative oliguria should therefore be interpreted in conjunction with SvO2 when predicting AKI in patients who have normal preoperative renal function and did not receive hydroxyethyl starch during surgery.
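The combined-marker idea in this abstract can be sketched numerically. The code below uses a small synthetic dataset (all probabilities are invented, not the study's data) and shows how coupling two binary intraoperative markers changes the AUC, using the fact that the AUC of a binary test equals the mean of its sensitivity and specificity:

```python
import numpy as np

def auc_binary(outcome, marker):
    """AUC of a binary marker = (sensitivity + specificity) / 2."""
    sens = marker[outcome == 1].mean()
    spec = 1.0 - marker[outcome == 0].mean()
    return (sens + spec) / 2.0

rng = np.random.default_rng(0)
n = 600
aki = rng.binomial(1, 0.25, n)                               # simulated AKI outcome
# Markers loosely correlated with the outcome (rates are invented).
oliguria = rng.binomial(1, np.where(aki == 1, 0.55, 0.30))   # urine output <0.5 mL/kg/h
svo2_drop = rng.binomial(1, np.where(aki == 1, 0.60, 0.25))  # intraoperative SvO2 decrease

combined = oliguria & svo2_drop   # oliguria coupled with an SvO2 decrease
auc_oliguria = auc_binary(aki, oliguria)
auc_combined = auc_binary(aki, combined)
print(f"AUC, oliguria alone:  {auc_oliguria:.2f}")
print(f"AUC, combined marker: {auc_combined:.2f}")
```

Requiring both markers trades sensitivity for specificity, so whether the combination improves the AUC depends on how the two markers co-occur with the outcome; that is the question the study answers with real patient data.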

17.
BMC Anesthesiol; 17(1): 136, 2017 Oct 10.
Article in English | MEDLINE | ID: mdl-29017455

ABSTRACT

BACKGROUND: Head fixation can induce hemodynamic instability. Remifentanil is commonly used with propofol for total intravenous anesthesia (TIVA) during neurosurgery. This study investigated the 90% effective concentration (EC90) of remifentanil for blunting cardiovascular responses to head fixation during neurosurgery under bispectral index (BIS)-monitored propofol TIVA. METHODS: Fifty patients undergoing neurosurgery requiring head fixation were enrolled. The study used the biased coin up-and-down design (BCD) sequential method. After tracheal intubation, the effect-site target concentration (Ce) of remifentanil was adjusted to achieve hemodynamic stability and then, approximately 10 min before head fixation, reset to the level preoperatively assigned to each patient according to the BCD method. Baseline hemodynamic values were recorded before head fixation. A response was defined as ineffective if hemodynamic values increased by more than 20% from baseline, and as effective otherwise. The EC90 of remifentanil was calculated using a modified isotonic estimator. RESULTS: Forty-three patients completed the study. The EC90 of remifentanil for blunting cardiovascular responses to head fixation was estimated to be 6.48 ng/mL (95% CI, 5.94-6.83 ng/mL). CONCLUSIONS: Adjusting the Ce of remifentanil to approximately 6.5 ng/mL before head fixation could prevent noxious cardiovascular responses in 90% of ASA I-II neurosurgical patients aged 20 to 65 years during propofol target-controlled infusion titrated to maintain BIS between 40 and 50. TRIAL REGISTRATION: ClinicalTrials.gov Identifier NCT01489137, retrospectively registered 5 December 2011.
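The isotonic-estimator step can be illustrated with a toy computation: fit a non-decreasing success-probability curve over the tested concentrations using the pool-adjacent-violators algorithm (PAVA), then read off the concentration where the fitted curve crosses 0.90. All concentrations and response rates below are invented for illustration; they are not the study's data, and this is only a minimal sketch of the general isotonic approach, not the exact modified estimator used in the paper:

```python
import numpy as np

def pava(y, w):
    """Pool-adjacent-violators: weighted isotonic (non-decreasing) fit.

    y: observed success rates per dose; w: weight (e.g. patients per dose).
    Returns the fitted rate for each original dose level.
    """
    # Each block holds [pooled mean, pooled weight, number of points pooled].
    blocks = [[float(yi), float(wi), 1] for yi, wi in zip(y, w)]
    i = 0
    while i < len(blocks) - 1:
        if blocks[i][0] > blocks[i + 1][0]:        # monotonicity violated: pool
            y0, w0, c0 = blocks[i]
            y1, w1, c1 = blocks[i + 1]
            blocks[i] = [(y0 * w0 + y1 * w1) / (w0 + w1), w0 + w1, c0 + c1]
            del blocks[i + 1]
            i = max(i - 1, 0)                      # pooling may create a new violation upstream
        else:
            i += 1
    fitted = []
    for mean, _, count in blocks:
        fitted.extend([mean] * count)
    return fitted

# Hypothetical ascending remifentanil concentrations (ng/mL) and observed
# success rates; the 0.6 -> 0.5 dip is a deliberate violation for PAVA to pool.
conc = [5.0, 5.5, 6.0, 6.5, 7.0]
rate = [0.3, 0.6, 0.5, 0.9, 1.0]
w = [8, 8, 8, 8, 8]                                # patients per dose (equal here for simplicity)

fitted = pava(rate, w)                             # non-decreasing fitted curve
ec90 = float(np.interp(0.90, fitted, conc))        # concentration where curve reaches 0.90
print(f"fitted curve: {fitted}")
print(f"EC90 estimate: {ec90} ng/mL")
```

PAVA pools the 0.6 and 0.5 dose levels into a common rate of 0.55, restoring monotonicity, and linear interpolation on the fitted curve then gives the EC90.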


Subject(s)
Anesthetics, Intravenous/administration & dosage , Cardiovascular Diseases/chemically induced , Neurosurgical Procedures/methods , Patient Positioning/methods , Piperidines/administration & dosage , Propofol/administration & dosage , Adult , Aged , Anesthesia, Intravenous/adverse effects , Anesthetics, Intravenous/adverse effects , Cardiovascular Diseases/prevention & control , Consciousness Monitors , Electroencephalography/drug effects , Electroencephalography/methods , Female , Humans , Male , Middle Aged , Piperidines/adverse effects , Propofol/adverse effects , Remifentanil