Results 1 - 20 of 67
1.
Anaesthesia ; 78(1): 64-72, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36198200

ABSTRACT

Unanticipated difficult laryngoscopy is associated with serious airway-related complications. We aimed to develop and test a convolutional neural network-based deep-learning model that uses lateral cervical spine radiographs to predict Cormack-Lehane grade 3 or 4 direct laryngoscopy views of the glottis. We analysed the radiographs of 5939 thyroid surgery patients at our hospital, 253 (4%) of whom had grade 3 or 4 glottic views. We used 10 randomly sampled datasets to train a model. We compared the new model with six similar models (VGG, ResNet, Xception, ResNext, DenseNet and SENet). The Brier score (95%CI) of the new model, 0.023 (0.021-0.025), was lower ('better') than that of the other models: VGG, 0.034 (0.034-0.035); ResNet, 0.033 (0.033-0.035); Xception, 0.032 (0.031-0.033); ResNext, 0.033 (0.032-0.033); DenseNet, 0.030 (0.029-0.032); SENet, 0.031 (0.029-0.032), all p < 0.001. For the new model we calculated the following mean (95%CI) values: R², 0.428 (0.388-0.468); mean squared error, 0.023 (0.021-0.025); mean absolute error, 0.048 (0.046-0.049); balanced accuracy, 0.713 (0.684-0.742); and area under the receiver operating characteristic curve, 0.965 (0.962-0.969). Radiographic features around the hyoid bone, pharynx and cervical spine were associated with grade 3 and 4 glottic views.
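As a point of reference for the headline metric above, the sketch below shows how a Brier score is computed for binary outcome probabilities. It is a minimal illustration with invented numbers, not the authors' code or data.

```python
# Minimal sketch of the Brier score (mean squared error between predicted
# probability and the 0/1 outcome); all values below are invented for illustration.
import numpy as np

def brier_score(y_true, y_prob):
    """Lower is better, as in the model comparison above."""
    y_true = np.asarray(y_true, dtype=float)
    y_prob = np.asarray(y_prob, dtype=float)
    return float(np.mean((y_prob - y_true) ** 2))

# Toy example: one grade 3-4 view among ten patients, low predicted risks elsewhere
y_true = [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
y_prob = [0.02, 0.05, 0.01, 0.70, 0.03, 0.02, 0.04, 0.01, 0.06, 0.02]
print(f"Brier score = {brier_score(y_true, y_prob):.3f}")
```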


Subject(s)
Deep Learning , Humans
2.
Anaesthesia ; 77(1): 54-58, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34403493

ABSTRACT

Sore throat after tracheal intubation impairs postoperative recovery. We randomly allocated 172 ASA physical status 1-2 participants, scheduled for laparoscopic lower abdominal surgery, to tracheal intubation with larger tubes (n = 88) or smaller tubes (n = 84), with internal diameters of 7.5 mm vs. 6.5 mm for men and 7.0 mm vs. 6.0 mm for women. The primary outcome was the number of patients with no, mild, moderate or severe sore throat 1 h after surgery: 60, 10, 17 and 1 with larger tracheal tubes vs. 79, 5, 0 and 0 with smaller tubes, p < 0.001. The equivalent numbers 24 h after surgery were 64, 16, 8 and 0 vs. 74, 6, 3 and 1, p = 0.037. Intra-operative ventilatory variables, including peak inspiratory pressure, plateau pressure and end-tidal carbon dioxide partial pressure, were unaffected by tube diameter. In summary, smaller tracheal tubes benefitted patients having laparoscopic operations.


Subject(s)
Intubation, Intratracheal/methods , Adult , Aged , Carbon Dioxide/blood , Female , Humans , Intubation, Intratracheal/adverse effects , Intubation, Intratracheal/instrumentation , Laparoscopy , Male , Middle Aged , Pharyngitis/etiology , Treatment Outcome
3.
Transplant Proc ; 50(10): 3644-3649, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30577250

ABSTRACT

BACKGROUND: There is still controversy as to whether case volume affects clinical outcomes after liver transplantation. This nationwide retrospective cohort study aimed to investigate the relationship between institutional case volume and post-transplant outcomes after deceased donor liver transplantation. MATERIALS AND METHODS: The data were extracted from the database of the Korean National Healthcare Insurance Service. A total of 2648 adult deceased donor liver transplantations were performed at 54 centers in Korea from January 2007 to December 2016. Centers were classified as high- (>30), medium- (10-30), or low-volume (<10) according to the average annual number of deceased donor liver transplantations. RESULTS: In-hospital mortality rates in high-, medium-, and low-volume centers were 10.3%, 14.3%, and 17.1%, respectively. Multivariable logistic regression analysis revealed that low-volume centers (adjusted odds ratio 1.953; 95% confidence interval, 1.461-2.611; P < .001) and medium-volume centers (adjusted odds ratio 1.480; 95% confidence interval, 1.098-1.994; P = .010) had significantly higher in-hospital mortality than high-volume centers. Long-term mortality rates were also higher in low-volume centers (P = .007). CONCLUSION: Centers with higher volume showed lower in-hospital mortality and better long-term survival after deceased donor liver transplantation than centers with lower volume.
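For readers unfamiliar with how adjusted odds ratios of this kind are obtained, the sketch below fits a multivariable logistic regression on synthetic data. The variable names (died, volume_group, age) and the data are assumptions chosen for illustration, not the registry data or the authors' model.

```python
# Hedged sketch: adjusted odds ratios from a multivariable logistic regression,
# with the high-volume group as reference. Data are simulated, not study data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "volume_group": rng.choice(["high", "medium", "low"], size=n),
    "age": rng.normal(52, 10, size=n),
})
# Synthetic outcome: higher in-hospital mortality in lower-volume groups
base = {"high": -2.2, "medium": -1.8, "low": -1.5}
logit = df["volume_group"].map(base) + 0.02 * (df["age"] - 52)
df["died"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.logit("died ~ C(volume_group, Treatment('high')) + age", data=df).fit(disp=False)
print(np.exp(fit.params))      # adjusted odds ratios vs. high-volume centers
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```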


Subject(s)
Hospital Mortality , Hospitals/statistics & numerical data , Liver Transplantation/mortality , Adult , Cohort Studies , Female , Humans , Liver Transplantation/methods , Male , Middle Aged , Odds Ratio , Outcome Assessment, Health Care , Registries , Republic of Korea , Retrospective Studies
4.
Transplant Proc ; 50(8): 2354-2358, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30316357

ABSTRACT

BACKGROUND: The single antigen bead assay (SAB) is a sensitive method for detecting HLA antibodies, but it does not specifically identify clinically relevant subsets. Recently, a new assay has been developed for detection of C3d bound to HLA antibody-antigen complexes. We evaluated the correlation of the C3d assay with SAB in renal patients. METHODS: A total of 138 serum samples from 109 sensitized patients were tested in parallel by SAB and C3d assay for detection of HLA class I antibodies. The relationship between the C3d assay and SAB was analyzed with respect to the numbers and median fluorescence intensity (MFI) values of the identified antibodies. RESULTS: Of the 138 samples, 137 were positive on SAB; of these 137 SAB-positive samples, 76 were positive on the C3d assay. A total of 3748 and 685 antibodies were identified by the SAB and C3d assays, respectively. The maximal SAB MFI values of the 76 C3d assay-positive samples were significantly higher than those of the 61 C3d assay-negative samples (P < .05), with median values of 17,057 and 6066, respectively. Only 11 (0.4%) of the 2905 antibodies with MFI < 10,000 on SAB, vs 501 (59.4%) of the 843 antibodies with MFI > 10,000 on SAB, were identified by the C3d assay with MFI > 1000. CONCLUSIONS: C3d assay positivity seems to depend on the MFI value on SAB. Further studies are needed to ascertain the clinical significance of C3d positivity by itself.


Subject(s)
Antibodies/blood , Complement C1q/analysis , HLA Antigens/blood , Immunologic Tests/methods , Adult , Complement C1q/immunology , Female , HLA Antigens/immunology , Humans , Male
5.
Transplant Proc ; 50(8): 2426-2430, 2018 Oct.
Article in English | MEDLINE | ID: mdl-30316371

ABSTRACT

INTRODUCTION: Kidneys from donors with acute kidney injury (AKI) are used for kidney transplantation. However, outcomes after transplantation may differ according to Acute Kidney Injury Network (AKIN) stage. We investigated the clinical outcomes of kidney transplantation from deceased donors with AKI, as defined by the AKIN criteria, at a single center. METHODS: We retrospectively reviewed the medical records of 101 consecutive deceased donors and kidney transplantation recipients from March 2009 to June 2015 at a single center. Donor and recipient clinical characteristics, creatinine level, delayed graft function, estimated glomerular filtration rate (eGFR), rejection, and graft survival were investigated. RESULTS: Of the 101 deceased donor kidneys, AKI occurred in 64 donors (63.4%). No differences in eGFR or serum creatinine level were found according to AKIN stage. However, the AKIN stage 3 group had slightly decreased kidney function, without statistical significance. In the older AKI donor group, creatinine level was significantly higher than in the other groups at 1 month (P = .015). No differences were found between the 2 groups in patient survival, graft survival, or rejection-free survival (P = .359, P = .568, and P = .717, respectively). CONCLUSIONS: Kidney transplantation from deceased donors with AKI showed comparable outcomes despite high rates of delayed graft function. AKIN stage 3 donors and aged deceased donors with AKI showed slightly reduced renal function without statistical significance; hence, kidneys from donors with AKI should be considered to expand the donor pool, but caution is warranted with AKIN stage 3 donors and aged donors with AKI.


Subject(s)
Acute Kidney Injury , Graft Survival , Kidney Transplantation/methods , Tissue Donors , Acute Kidney Injury/physiopathology , Adult , Cadaver , Delayed Graft Function/epidemiology , Delayed Graft Function/etiology , Female , Humans , Male , Middle Aged , Retrospective Studies
6.
Transplant Proc ; 50(4): 1025-1028, 2018 May.
Article in English | MEDLINE | ID: mdl-29678267

ABSTRACT

BACKGROUND: Increased cold ischemia time in cadaveric kidney transplantation has been associated with a high rate of delayed graft function (DGF) and even with poorer graft survival. Kidney transplantation using in-house donors reduces cold preservation time. The purpose of this study was to compare clinical outcomes after transplantation of kidneys from in-house donors with those of kidneys imported from other centers. METHODS: We retrospectively reviewed the medical records of donors and recipients of 135 deceased-donor kidney transplantations performed in our center from March 2009 to March 2016. RESULTS: Of the 135 recipients, 88 (65.2%) received kidneys from in-house donors. Median cold ischemia time was shorter for in-house donor kidneys than for imported kidneys (180 vs 300 min; P < .001). The risks of DGF and slow graft function were higher with imported than with in-house donor kidneys. Imported kidneys were independently associated with greater odds of DGF in multivariate regression analysis (odds ratio, 4.165; P = .038). However, the renal function of recipients at 1, 3, 5, and 7 years after transplantation was not significantly different between the 2 groups. CONCLUSIONS: Transplantation with in-house donor kidneys was associated with a significantly decreased incidence of DGF, but long-term graft function and survival were similar to those with imported donor kidneys.


Subject(s)
Delayed Graft Function/etiology , Graft Survival , Kidney Transplantation/adverse effects , Kidney Transplantation/methods , Tissue Donors , Adult , Cadaver , Cold Ischemia/adverse effects , Delayed Graft Function/epidemiology , Female , Humans , Male , Middle Aged , Multivariate Analysis , Odds Ratio , Retrospective Studies , Risk Factors , Tissue Donors/supply & distribution , Transplantation, Homologous/adverse effects , Transplantation, Homologous/methods
7.
Transplant Proc ; 49(9): 2050-2054, 2017 Nov.
Article in English | MEDLINE | ID: mdl-29149959

ABSTRACT

BACKGROUND: Recently, urinary tissue inhibitor of metalloproteinase-2 (TIMP-2) and insulin-like growth factor binding protein-7 (IGFBP-7), markers of G1 cell cycle arrest, have been identified and validated for predicting the development of acute kidney injury in critically ill patients. It is unknown, however, whether these two biomarkers can predict the development of delayed graft function (DGF) after kidney transplantation (KT). METHODS: This was a single-center, prospective, observational study. We enrolled 74 patients who underwent KT between August 2013 and December 2016. Urine samples were collected immediately after the operation. The primary outcome was the development of DGF, defined as the need for more than 1 dialysis session within 7 days of KT. RESULTS: Twenty-three patients (31%) were diagnosed with DGF. In univariate analysis, kidneys from expanded criteria donors, higher donor serum creatinine, lower donor estimated glomerular filtration rate, antithymocyte globulin exposure, neutrophil gelatinase-associated lipocalin, and urinary [TIMP-2]·[IGFBP7] differed significantly between patients with early graft function and those with DGF. However, in multivariate analysis adjusting for other factors, transplantation from a deceased donor and urinary [TIMP-2]·[IGFBP7] at 0 hours post-transplantation predicted the development of DGF. The receiver operating characteristic curve for prediction of DGF showed an area under the curve of 0.867 (sensitivity 0.86, specificity 0.71) at a cutoff value of 1.39. CONCLUSIONS: Our results indicate that urine [TIMP-2]·[IGFBP7] measured immediately after transplantation could be an early predictive biomarker of DGF in kidney transplantation.
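As an illustration of how an ROC-derived cutoff such as the 1.39 threshold above can be obtained, the sketch below applies the Youden index to synthetic biomarker values. The simulated data are assumptions for illustration, not the study measurements.

```python
# Hedged sketch: AUC and Youden-index cutoff for a continuous biomarker vs. DGF.
# The biomarker distributions are simulated; only the 23/74 DGF split mirrors the abstract.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)
dgf = np.r_[np.zeros(51, dtype=int), np.ones(23, dtype=int)]
marker = np.r_[rng.lognormal(-0.5, 0.7, 51), rng.lognormal(0.8, 0.7, 23)]

auc = roc_auc_score(dgf, marker)
fpr, tpr, thresholds = roc_curve(dgf, marker)
best = np.argmax(tpr - fpr)                 # Youden index J = sensitivity + specificity - 1
print(f"AUC = {auc:.3f}, cutoff = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```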


Subject(s)
Delayed Graft Function/blood , Delayed Graft Function/diagnosis , Insulin-Like Growth Factor Binding Proteins/blood , Kidney Transplantation , Tissue Inhibitor of Metalloproteinase-2/blood , Adult , Biomarkers/blood , Female , Glomerular Filtration Rate , Humans , Kidney/physiopathology , Lipocalin-2/blood , Male , Middle Aged , Prospective Studies , ROC Curve , Renal Dialysis , Tissue Donors
8.
Acta Anaesthesiol Scand ; 61(9): 1095-1104, 2017 Oct.
Article in English | MEDLINE | ID: mdl-28799206

ABSTRACT

BACKGROUND: There is little information about clinical outcomes after intraoperative cardiac arrest (IOCA). We determined the incidence of IOCA and the characteristics associated with 3-month mortality. METHODS: The electronic medical records of 238,648 adult surgical patients from January 2005 to December 2014 were reviewed retrospectively. Characteristics of IOCA were documented using the Utstein reporting template. RESULTS: IOCA occurred in 50 patients (21/100,000 surgeries). Nineteen patients died in the operating room, and a further 12 patients died within 3 months post-arrest (total mortality: 62%). Three survivors at 3 months post-arrest had an unfavourable neurological outcome, so 34 patients in total showed unfavourable clinical outcomes at 3 months post-arrest. The incidences of non-cardiac surgery, emergency surgery, pre-operative intubation, non-shockable initial cardiac rhythm, hypovolaemic shock, cardiac arrest induced by pre-operative complications, non-anaesthetic cause of cardiac arrest, intra- and post-arrest transfusion, and continuous infusion of inotropes or vasopressors in the intensive care unit (ICU) were significantly higher in non-survivors at 3 months post-arrest. The total epinephrine dose administered during arrest was higher, and the duration of cardiac compressions longer, in non-survivors at 3 months post-arrest. CONCLUSIONS: In this study, the incidence of IOCA was 21/100,000 surgeries and the 3-month mortality rate after IOCA was 62%. Several factors, including surgical emergency, non-shockable initial cardiac rhythm, pre-operative complications, surgical complications, long duration of cardiac compressions, high total epinephrine dose, transfusion, and continuous infusion of inotropes or vasopressors in the ICU, appeared to be risk factors for 3-month mortality after IOCA. These risk factors should be considered in light of the relatively small sample size of this study.


Subject(s)
Heart Arrest/mortality , Intraoperative Complications/mortality , Adult , Aged , Anesthesia , Critical Care , Female , Humans , Incidence , Male , Middle Aged , Nervous System Diseases/epidemiology , Nervous System Diseases/etiology , Nervous System Diseases/mortality , Republic of Korea/epidemiology , Retrospective Studies , Risk Factors , Treatment Outcome
9.
Transplant Proc ; 49(5): 1038-1042, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28583522

ABSTRACT

BACKGROUND: A higher body mass index (BMI) before kidney transplantation (KT) is associated with increased mortality and allograft loss in kidney transplant recipients (KTRs). However, the effect of changes in BMI after KT on these outcomes remains uncertain. The aim of this study was to investigate the effect of baseline BMI and changes in BMI on clinical outcomes in KTRs. METHODS: A total of 869 KTRs were enrolled from a multicenter observational cohort study from 2012 to 2015. Patients were divided into low and high BMI groups before KT based on a BMI cutoff point of 23 kg/m². Differences in acute rejection and cardiovascular disease (CVD) between the 2 groups were analyzed. In addition, clinical outcomes across the 4 groups defined by BMI change 1 year after KT were compared. Associations between BMI change and laboratory findings were also evaluated. RESULTS: Patients with a higher BMI before KT showed significantly more CVD after KT (P = .027) than patients with a lower BMI. However, among the KTRs with a higher baseline BMI, only a persistently higher BMI was associated with increased CVD during the follow-up period (P = .003). Patients with persistently higher BMI had significantly decreased high-density lipoprotein cholesterol and increased hemoglobin, triglyceride, and hemoglobin A1c levels. Baseline BMI and post-transplantation change in BMI were not related to acute rejection in KTRs. CONCLUSIONS: BMI during the 1st year after KT, as well as baseline BMI, was associated with CVD in KTRs. More careful monitoring of obese KTRs whose BMI does not decrease after KT is required.


Subject(s)
Body Mass Index , Cardiovascular Diseases/physiopathology , Graft Rejection/physiopathology , Kidney Transplantation/mortality , Adult , Cardiovascular Diseases/blood , Cardiovascular Diseases/mortality , Cohort Studies , Female , Glycated Hemoglobin/analysis , Graft Rejection/blood , Graft Rejection/mortality , Humans , Lipoproteins, HDL/blood , Male , Middle Aged , Postoperative Period , Risk Factors , Time Factors , Triglycerides/blood
10.
Transplant Proc ; 49(1): 88-91, 2017.
Article in English | MEDLINE | ID: mdl-28104166

ABSTRACT

BACKGROUND: The Kidney Donor Risk Index (KDRI) scoring system for deceased donors has been widely adopted for evaluation of post-transplant graft function. We analyzed the usefulness of the KDRI in deceased donors with acute kidney injury (AKI). METHODS: Forty-nine recipients of kidneys from deceased donors with AKI between January 2009 and December 2014 were reviewed retrospectively. Data collected from donor medical records included age, height, weight, history of hypertension or diabetes, cause of death, serum creatinine (sCr), and donation after cardiac death. Graft function data, including sCr, estimated glomerular filtration rate (eGFR), and acute rejection episodes, were monitored for 1 year. Correlations between KDRI score and indicators of graft function were analyzed, and a cutoff value for the KDRI score was calculated using a receiver operating characteristic (ROC) curve. RESULTS: The mean ages of donors and recipients were 46.81 ± 13.13 and 47.69 ± 11.43 years, respectively. The mean KDRI score was 1.24 ± 0.40. In univariable analysis, KDRI score was significantly associated with sCr at 6 to 12 months, eGFR at 1 year, and slow graft function (SGF). The ROC curve of KDRI score for SGF showed an optimal cutoff value of 1.20, with sensitivity of 69.2% and specificity of 69.4% (area under the curve = 0.75) in deceased donors with AKI. CONCLUSIONS: KDRI score in deceased donors with AKI was correlated with postoperative graft function measures, including eGFR and SGF. The KDRI could be used as a predictor of short-term clinical outcome after kidney transplantation from deceased donors with AKI.


Subject(s)
Acute Kidney Injury , Kidney Transplantation/methods , Tissue Donors , Transplants/physiopathology , Acute Kidney Injury/physiopathology , Adult , Cadaver , Female , Graft Survival , Humans , Male , Middle Aged , ROC Curve , Retrospective Studies , Risk Factors , Tissue Donors/supply & distribution
11.
Transplant Proc ; 48(7): 2403-2406, 2016 Sep.
Article in English | MEDLINE | ID: mdl-27742309

ABSTRACT

BACKGROUND: Organ donation after brain death is a major source of transplantable organs for patients with end-stage organ disease. However, the time from declaration of brain death to organ procurement is often longer than expected. Analyzing factors that delay organ procurement may help to prevent damage to organs from marginal and unstable donors and aid in preparation for the recipient operation. The aim of this study was to examine factors associated with the interval between declaration of brain death and organ procurement. METHODS: Medical records of patients who underwent organ procurement after brain death from February 2009 to April 2015 were retrospectively reviewed. RESULTS: Of the 77 patients who were scheduled to undergo organ procurement, 68 eventually underwent procurement of ≥1 organ. The average interval from the 1st examination for brain death to organ procurement decreased from 1,248 minutes in 2009 to 910 minutes in 2015. Although not statistically significant, the interval decreased from 1,105 minutes in the first half of the 6-year period to 1,075 minutes in the latter half (P = .623). The most common cause of extensive delay in organ procurement was a false-negative electroencephalogram (EEG; 62.5%). CONCLUSIONS: With increasing experience in managing brain-dead donors, the interval from declaration of brain death to organ procurement decreased. We suggest that an EEG be performed during the initial stages of the brain death examination to prevent unnecessary preparation of the recipient operation owing to a false-negative EEG.


Subject(s)
Brain Death/diagnosis , Electroencephalography , Tissue and Organ Procurement/statistics & numerical data , False Negative Reactions , Female , Humans , Male , Middle Aged , Republic of Korea , Retrospective Studies , Time Factors , Tissue Donors , Transplants
12.
Blood Cancer J ; 5: e358, 2015 Oct 16.
Article in English | MEDLINE | ID: mdl-26473530

ABSTRACT

Monosomal karyotype (MK), defined as either ≥2 autosomal monosomies or a single monosomy with at least one additional structural chromosomal abnormality, is associated with a dismal prognosis in patients with acute myeloid leukemia (AML). MK was detected in 174 of 3041 AML patients in the South Korean registry. A total of 119 patients who had received induction therapy were analyzed to evaluate prognostic factors for favorable outcomes. On multivariate analysis, a single monosomy, the absence of abn(17p), ≥10% of cells with normal metaphase, and the achievement of a complete remission (CR) after induction therapy were significant factors for more favorable outcomes. In particular, a single monosomy remained a significant independent prognostic factor for superior survival both in patients who received allogeneic hematopoietic stem cell transplantation (allo-HSCT) in CR and in those who did not. Allo-HSCT in CR improved overall survival significantly only in patients with a single monosomy. Our results suggest that MK-AML may be biologically different according to the karyotypic subtype and that allo-HSCT in CR should be strongly recommended for patients with a single monosomy. For other patients, more prudent treatment strategies should be examined. Furthermore, the biological mechanism by which a single monosomy influences survival should be investigated.
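A hedged sketch of the kind of Kaplan-Meier and log-rank comparison underlying the survival findings above is shown below; the group sizes, follow-up times, and censoring rule are synthetic assumptions, not the registry data.

```python
# Hedged sketch: Kaplan-Meier curves and a log-rank test comparing overall survival
# with vs. without allo-HSCT in CR. All survival data below are simulated.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
n_hsct, n_no_hsct = 40, 79
df = pd.DataFrame({
    "allo_hsct": np.r_[np.ones(n_hsct, dtype=int), np.zeros(n_no_hsct, dtype=int)],
    "months": np.r_[rng.exponential(30, n_hsct), rng.exponential(12, n_no_hsct)],
})
df["event"] = (df["months"] < 60).astype(int)     # administrative censoring at 5 years
df.loc[df["event"] == 0, "months"] = 60

for group, label in [(1, "allo-HSCT in CR"), (0, "no allo-HSCT")]:
    mask = df["allo_hsct"] == group
    kmf = KaplanMeierFitter().fit(df.loc[mask, "months"], df.loc[mask, "event"], label=label)
    print(label, "median OS (months):", kmf.median_survival_time_)

res = logrank_test(df.loc[df["allo_hsct"] == 1, "months"], df.loc[df["allo_hsct"] == 0, "months"],
                   event_observed_A=df.loc[df["allo_hsct"] == 1, "event"],
                   event_observed_B=df.loc[df["allo_hsct"] == 0, "event"])
print("log-rank p =", res.p_value)
```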


Subject(s)
Leukemia, Myeloid, Acute/genetics , Monosomy/genetics , Monosomy/pathology , Abnormal Karyotype , Adolescent , Adult , Aged , Aged, 80 and over , Antineoplastic Combined Chemotherapy Protocols/therapeutic use , Asian People , Combined Modality Therapy , Cytogenetic Analysis , Female , Hematopoietic Stem Cell Transplantation , Humans , Kaplan-Meier Estimate , Leukemia, Myeloid, Acute/mortality , Leukemia, Myeloid, Acute/therapy , Male , Middle Aged , Proportional Hazards Models , Registries , Young Adult
13.
Transpl Infect Dis ; 17(5): 679-87, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26201517

ABSTRACT

BACKGROUND: Although intravenous immunoglobulin (IVIG) is not routinely recommended, many centers still use IVIG during the post-hematopoietic stem cell transplant (HSCT) period. METHODS: A total of 162 multiple myeloma (MM) patients who underwent autologous (auto-) HSCT between January 2008 and June 2013 were retrospectively reviewed. The primary objective was to determine the impact of IVIG on post-transplant infection; secondary objectives included the overall incidence of infection, types of infection, and risk factors for infection after auto-HSCT in MM patients. RESULTS: After auto-HSCT, 53 of 162 patients (32.7%) experienced 104 infectious events. Upper respiratory infection was most common (n = 31, 29.8%), followed by pneumonia (n = 27, 26.0%) and herpes zoster (n = 15, 14.4%). Among the identifiable organisms causing respiratory infection, influenza virus (n = 10) and Pneumococcus (n = 9) were predominant. The incidence of infection did not differ according to IVIG use (34.8% in IVIG (-) vs. 31.3% in IVIG (+), P = 0.631). The incidences of infection requiring hospitalization and of multiple episodes of infection also showed no difference between the groups (P = 0.147 and P = 0.156, respectively). In a Cox proportional hazards model, none of the factors, including age, gender, type of disease, stage, tandem (vs. single) transplantation, and IVIG, was prognostic for infectious events after auto-HSCT (P = 0.955, hazard ratio 0.980, 95% confidence interval 0.481-1.997 for IVIG). CONCLUSION: In auto-HSCT recipients with MM, the incidence of post-transplant infection did not differ according to prophylactic IVIG use.
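The sketch below illustrates a Cox proportional hazards fit of the kind reported above (hazard ratio for IVIG with a 95% confidence interval). The data frame, column names (time_to_infection, infected, ivig, age), and effect sizes are synthetic assumptions, not the study data.

```python
# Hedged sketch: Cox proportional hazards model for time to first infection.
# Data are simulated with essentially no IVIG effect (HR ~ 1), mirroring the abstract's finding.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 162
df = pd.DataFrame({
    "ivig": rng.integers(0, 2, n),
    "age": rng.normal(58, 9, n),
})
hazard = 0.01 * np.exp(0.0 * df["ivig"] + 0.01 * (df["age"] - 58))  # zero coefficient on IVIG
df["time_to_infection"] = rng.exponential(1 / hazard)
df["infected"] = (df["time_to_infection"] < 180).astype(int)
df.loc[df["infected"] == 0, "time_to_infection"] = 180               # censor at 180 days

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_infection", event_col="infected")
cph.print_summary()   # exp(coef) column gives hazard ratios with 95% CIs
```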


Subject(s)
Bacterial Infections/prevention & control , Hematopoietic Stem Cell Transplantation , Immunocompromised Host , Immunoglobulins, Intravenous/therapeutic use , Immunologic Factors/therapeutic use , Multiple Myeloma/therapy , Virus Diseases/prevention & control , Adult , Aged , Bacterial Infections/epidemiology , Bacterial Infections/immunology , Female , Follow-Up Studies , Humans , Incidence , Male , Middle Aged , Multiple Myeloma/immunology , Proportional Hazards Models , Retrospective Studies , Risk Factors , Transplantation, Autologous , Virus Diseases/epidemiology , Virus Diseases/immunology
14.
Transplant Proc ; 47(4): 1096-8, 2015 May.
Article in English | MEDLINE | ID: mdl-26036528

ABSTRACT

BACKGROUND: Simultaneous pancreas-kidney (SPK) transplantation is an established treatment with good results in selected patients with type 1 diabetes and renal insufficiency. Most pancreas transplantations depend on deceased donors, yet the waiting time for SPK transplantation from deceased donors is considerably long in Asian countries. METHODS: Three cases of SPK transplantation from living donors (LDSPK) performed with hand-assisted laparoscopic donor surgery (HALS) at Korea University Anam Hospital from 2012 to 2013 were retrospectively reviewed regarding patient characteristics and clinical outcomes of donors and recipients. RESULTS: Donor pancreatic and renal function was well preserved postoperatively. One donor had a pancreatic fistula, which was controlled with conservative management. Of the 3 recipient operations, 1 was performed with an ABO-incompatible donor. Recipient creatinine, serum insulin, and C-peptide levels normalized and remained stable at the last follow-up. CONCLUSIONS: LDSPK can be an effective alternative when a deceased donor is not available at the appropriate time, provided the operator has sufficient surgical expertise.


Subject(s)
Diabetes Mellitus, Type 1/surgery , Hand-Assisted Laparoscopy/methods , Kidney Transplantation/methods , Living Donors , Pancreas Transplantation/methods , Patient Selection , Adult , Female , Follow-Up Studies , Graft Survival , Humans , Male , Republic of Korea , Retrospective Studies , Treatment Outcome
15.
Eur J Clin Microbiol Infect Dis ; 34(7): 1437-41, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25894983

ABSTRACT

Recent products of piperacillin/tazobactam (PTZ) from the original manufacturer, previously considered a major cause of galactomannan (GM) false-positivity, are reported to no longer be associated with it. However, data regarding generic PTZ are limited and controversial. To evaluate the effect of a generic PTZ on GM false-positivity in Korea, we performed a case-control study in adult patients with cancer. Electronic medical records of cancer patients who were admitted and tested for serum GM between March and June 2014 at a tertiary care university hospital were reviewed. During the study period, a single generic PTZ (C manufacturer, Korea) was used. Patients who received PTZ within 24 h prior to serum GM testing were enrolled as cases; age- and GM test date-matched non-PTZ patients were selected as controls. A total of 110 patients received PTZ within 24 h prior to serum GM testing during the study period. The GM optical density index (ODI) of the PTZ group did not differ significantly from that of the control group (p = 0.251). The percentage of false-positive patients in the PTZ group was also similar to that of the control group (p = 0.538). There was no statistical relationship between GM ODI titer and the time interval from PTZ administration (p = 0.095) or the cumulative PTZ dose (p = 0.416). In this case-control study of 220 patients, a generic PTZ used in Korea was not related to GM false-positivity.


Subject(s)
Anti-Bacterial Agents/adverse effects , Mannans/blood , Neoplasms/blood , Penicillanic Acid/analogs & derivatives , Piperacillin/adverse effects , Adult , Aged , Anti-Bacterial Agents/administration & dosage , Antigens, Fungal/blood , Aspergillosis/blood , Aspergillosis/etiology , Case-Control Studies , False Positive Reactions , Female , Galactose/analogs & derivatives , Humans , Male , Middle Aged , Neoplasms/complications , Penicillanic Acid/administration & dosage , Penicillanic Acid/adverse effects , Piperacillin/administration & dosage , Retrospective Studies , Tazobactam , Time Factors
16.
Vox Sang ; 107(4): 407-15, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25130876

ABSTRACT

BACKGROUND AND OBJECTIVES: Collection of sufficient CD34+ cells for autologous peripheral blood stem cell (PBSC) transplantation frequently fails in patients with lymphoma or multiple myeloma (MM). We investigated the incidence of and predictive factors for poor mobilization. MATERIALS AND METHODS: A total of 205 adult patients (101 with lymphoma and 104 with MM) were retrospectively included to identify the incidence of mobilization failure and the predictive factors for poor mobilization with a conventional G-CSF-based mobilization regimen. Another 17 patients who received plerixafor for mobilization were also included. RESULTS: Overall, 14.1% of patients (21.8% of patients with lymphoma, 6.7% of patients with MM) were poor mobilizers. Univariate and multivariate analyses revealed that an interval from G-CSF administration to PBSC collection exceeding 10 days and the peripheral blood mononuclear cell count on the first day of collection were predictive factors for poor mobilization in lymphoma, but not in MM. Among the plerixafor-treated patients, 9 of 11 poor mobilizers who received a second mobilization cycle with plerixafor collected more CD34+ cells than during their first, G-CSF-based mobilization. All patients who received plerixafor for their initial mobilization reached 2.0 × 10⁶ CD34+ cells/kg within four leukaphereses. CONCLUSION: In conventional G-CSF-based mobilization, early PBSC collection after G-CSF administration might enhance CD34+ cell yield. Adding the new mobilizing agent plerixafor would help to harvest a sufficient number of CD34+ cells for a successful transplantation outcome while reducing the number of collection procedures needed in poor mobilizers.


Subject(s)
Lymphoma/therapy , Multiple Myeloma/therapy , Peripheral Blood Stem Cell Transplantation , Adolescent , Adult , Aged , Antigens, CD34/metabolism , Benzylamines , Cyclams , Female , Granulocyte Colony-Stimulating Factor/therapeutic use , Hematopoietic Stem Cell Mobilization/statistics & numerical data , Heterocyclic Compounds/therapeutic use , Humans , Incidence , Leukocytes, Mononuclear/cytology , Leukocytes, Mononuclear/metabolism , Male , Middle Aged , Multivariate Analysis , Retrospective Studies , Risk Factors , Transplantation, Autologous
17.
Eur J Clin Microbiol Infect Dis ; 33(10): 1847-53, 2014 Oct.
Article in English | MEDLINE | ID: mdl-24853055

ABSTRACT

Cytomegalovirus (CMV) gastrointestinal (GI) disease is observed frequently in cancer patients, causing abdominal pain, diarrhea, and GI bleeding. However, little is known about its actual incidence, clinical presentation, and the risk factors for its development among cancer patients. To answer these questions, we analyzed all cases that occurred during an 18-year period at our center and performed a case-control study to identify risk factors for CMV GI disease. Electronic medical records were reviewed for individuals who were admitted and diagnosed with CMV GI disease from January 1995 through March 2013 at a tertiary care center. Two CMV disease-free cancer patients were matched to each case as controls. A total of 98 episodes of CMV GI disease were included in this study, and the overall incidence rate was 52.5 per 100,000 cancer patients, with an increasing trend throughout the study period. In multivariate analysis, male sex, low body mass index, lymphopenia, hematological malignancy, and steroid use and red blood cell (RBC) transfusion within 1 month prior to the CMV disease diagnosis were identified as independent risk factors. Among these factors, RBC transfusion showed the highest odds ratio (OR = 5.09). In summary, male sex, low body mass index, lymphopenia, hematological malignancy, steroid use, and RBC transfusion within 1 month prior to diagnosis were independent risk factors for the development of CMV GI disease in adult patients with cancer.
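For orientation, the sketch below works through a crude odds-ratio calculation from a 2x2 case-control table with a Woolf-type confidence interval. The counts are invented, and the study's OR of 5.09 came from a multivariate model, not this crude calculation.

```python
# Hedged sketch: crude odds ratio from a 2x2 case-control table (counts invented).
#                         exposed (RBC transfusion)   unexposed
# cases (CMV GI disease)            60                    38
# matched controls                  45                   151
import math

a, b = 60, 38     # cases: exposed, unexposed
c, d = 45, 151    # controls: exposed, unexposed
crude_or = (a * d) / (b * c)

# Woolf (log-normal) 95% confidence interval for the odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(crude_or) - 1.96 * se_log_or)
hi = math.exp(math.log(crude_or) + 1.96 * se_log_or)
print(f"crude OR = {crude_or:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```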


Subject(s)
Cytomegalovirus Infections/epidemiology , Gastroenteritis/epidemiology , Neoplasms/complications , Adult , Aged , Aged, 80 and over , Case-Control Studies , Female , Humans , Incidence , Male , Middle Aged , Risk Factors , Transfusion Reaction
18.
J Thromb Haemost ; 12(7): 1035-43, 2014 Jul.
Article in English | MEDLINE | ID: mdl-24837640

ABSTRACT

BACKGROUND: Data on the incidence of venous thromboembolism (VTE) following major surgery in Asian populations are limited. METHODS: Using the Korean Health Insurance Review and Assessment Service database, we performed a nationwide population-based epidemiologic study to estimate the incidence of VTE after major orthopedic, cancer, and benign surgeries. VTE cases were identified among all patients undergoing major surgery between 2007 and 2011 using both diagnostic codes and drug codes as evidence of VTE treatment within 5 weeks of surgery. We also calculated the relative risk of VTE in major orthopedic and cancer surgery compared with benign surgery. RESULTS: The overall rates of postoperative VTE were 1.24%, 0.67%, and 0.05% for major orthopedic, cancer, and benign surgeries, respectively. Hip fracture (1.60%) and colorectal cancer (1.67%) surgeries were associated with the highest rates of VTE, and the rates steadily increased during the study period. Advanced age, female sex, and general anesthesia were independent risk factors for VTE. Patients undergoing surgery for colorectal, pancreatic, ovarian, and esophageal cancer, and those undergoing major orthopedic surgery, had a >20-fold higher risk of VTE than those undergoing benign surgery. CONCLUSIONS: This is the largest epidemiologic study to investigate the incidence of VTE after major surgery in Asia, demonstrating that the rates of postoperative VTE are lower than in Caucasian populations. This study contributes to a better understanding of the differences in postoperative VTE development between Korean and Caucasian populations; the data also suggest that perioperative prophylactic strategies in Asians should be based on studies of such populations.
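The arithmetic behind rate ratios like the ">20-fold higher risk" statement above is simple; the sketch below uses the reported percentages with invented denominators purely for illustration.

```python
# Hedged sketch: incidence rates and a crude relative risk from the reported percentages.
# Denominators of 100,000 are invented for illustration; only the percentages match the abstract.
vte_orthopedic, n_orthopedic = 1240, 100_000    # 1.24%
vte_benign, n_benign = 50, 100_000              # 0.05%

rate_orthopedic = vte_orthopedic / n_orthopedic
rate_benign = vte_benign / n_benign
relative_risk = rate_orthopedic / rate_benign
print(f"orthopedic rate = {rate_orthopedic:.2%}, benign rate = {rate_benign:.2%}, "
      f"RR = {relative_risk:.1f}")  # > 20-fold higher, consistent with the abstract
```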


Subject(s)
Postoperative Complications , Venous Thromboembolism/complications , Venous Thromboembolism/epidemiology , Aged , Anesthesia/adverse effects , Databases, Factual , Female , Humans , Incidence , Insurance, Health , Male , Middle Aged , Neoplasms/complications , Neoplasms/surgery , Odds Ratio , Orthopedic Procedures/adverse effects , Postoperative Period , Republic of Korea , Risk Assessment , Risk Factors , Treatment Outcome
19.
Transplant Proc ; 46(2): 376-80, 2014.
Article in English | MEDLINE | ID: mdl-24655967

ABSTRACT

BACKGROUND: Several new biomarkers for the detection of early tubular injury have been investigated in kidney transplant recipients. We recently identified day 2 urinary neutrophil gelatinase-associated lipocalin (NGAL) as a predictor of slow graft function and adverse 1-year outcome. In the present study, we further investigated the value of urinary NGAL and liver-type fatty acid binding protein (L-FABP) for predicting long-term graft outcomes up to 2 years. METHODS: This was a single-center, prospective observational study. Serial urinary NGAL and L-FABP levels at 0 hours, 2 days, and 6 days after kidney transplantation (KT) were measured, and clinical data were assessed during the 2-year period after KT. RESULTS: During the 2-year follow-up period, 13 (18.8%), 5 (7.2%), and 4 (5.8%) patients were diagnosed with acute T-cell-mediated rejection, acute antibody-mediated rejection (AMR), and chronic AMR, respectively. In addition, 10 patients (14.3%) developed calcineurin inhibitor toxicity and 6 (8.7%) developed BK viremia. The mean estimated glomerular filtration rates (eGFR) at 1 and 2 years after KT were 65.1 ± 19.1 and 58.5 ± 22.6 mL/min/1.73 m², respectively. When poor long-term graft function was defined as an eGFR of less than 50 mL/min/1.73 m² at 2 years, elderly donors, acute rejection, and high 0-hour urinary L-FABP levels were significant risk factors. Furthermore, in rejection-free patients, L-FABP was strongly associated with poor long-term graft function (P = .006). Multivariate logistic regression analysis showed that high 0-hour L-FABP (P = .015) and acute rejection (P = .006) were independent predictors of poor long-term graft function. Receiver operating characteristic analysis showed that the area under the curve for urinary L-FABP was 0.692 (P = .036). CONCLUSIONS: Our results suggest that urinary L-FABP may be a useful predictor of adverse long-term outcomes in KT patients.


Subject(s)
Fatty Acid-Binding Proteins/urine , Kidney Transplantation , Adult , Female , Glomerular Filtration Rate , Graft Survival , Humans , Immunosuppressive Agents/administration & dosage , Male , Middle Aged , Prospective Studies , Treatment Outcome
20.
Transplant Proc ; 46(2): 400-2, 2014.
Article in English | MEDLINE | ID: mdl-24655973

ABSTRACT

In transplant recipients, nephrotoxicity due to long-term use of calcineurin inhibitors (CNIs) is a serious problem that cannot be overlooked, and poor medication compliance can also cause graft failure in recipients who are bound to long-term medication. In this study, 36 patients who underwent conversion to a once-daily Advagraf and sirolimus combination at Korea University Anam Hospital from September 2011 to March 2013 were retrospectively reviewed at 3 and 6 months for laboratory findings, mean arterial pressure (MAP), and other parameters. After conversion, serum creatinine level and glomerular filtration rate (GFR) changed significantly at 3 months (P = .024 and P < .001, respectively). Fasting serum glucose level and proteinuria increased significantly at 6 months (P = .016 and P = .030, respectively). Time after conversion was significantly related to the increase in postoperative estimated glomerular filtration rate (eGFR) at 3 months. Graft rejection, morbidity, and mortality did not occur within the study period. A once-daily Advagraf and sirolimus regimen can be a new standard regimen in stable kidney recipients, owing to its effect on renal function and its convenience for patients.


Subject(s)
Immunosuppressive Agents/administration & dosage , Kidney Transplantation , Sirolimus/administration & dosage , Tacrolimus/administration & dosage , Adult , Creatinine/blood , Female , Glomerular Filtration Rate , Humans , Male , Middle Aged , Retrospective Studies