Results 1 - 20 of 43
1.
Am Surg ; 89(5): 1610-1615, 2023 May.
Article in English | MEDLINE | ID: mdl-34986663

ABSTRACT

BACKGROUND: Delirium occurs frequently in critically ill and injured patients and is associated with significant morbidity and mortality. Limited data exist on the risk factors for developing delirium in critically ill trauma patients and the effect of antipsychotic (AP) medications on delirium progression. OBJECTIVE: The objective of this study was to determine the incidence of delirium in critically ill trauma versus non-trauma surgical patients and to determine whether the presence of trauma was associated with intensive care unit (ICU) delirium. Secondary outcomes included identifying risk factors for delirium and determining the impact of AP medication use on delirium progression in critically ill trauma patients. METHODS: This retrospective review studied adult trauma/surgical ICU patients admitted between May 2017 and July 2018 to a level I trauma and tertiary referral center. Regression modeling was used to determine the impact of AP use on delirium-free days. RESULTS: Delirium was more common in critically ill trauma patients than in non-trauma surgical ICU patients [54/157 (34.4%) vs 42/270 (15.6%), P < .001]. Of the 54 trauma patients with delirium, 28 (52%) received an AP medication for delirium treatment, and in the multiple linear regression analysis, AP use was significantly associated with fewer delirium-free days (P = .02). DISCUSSION: Higher admission Sequential Organ Failure Assessment scores and increased length of stay were significantly associated with delirium onset in critically ill trauma patients. Use of AP medications for delirium treatment in this population had a negative impact on delirium-free days.
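The trauma vs non-trauma incidence comparison above (54/157 vs 42/270, P < .001) can be reproduced with a standard two-proportion z-test. This is a minimal sketch using a pooled standard error, not the study's actual analysis code.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail area
    return z, p_value

# Delirium incidence: 54/157 trauma vs 42/270 non-trauma surgical ICU patients
z, p = two_proportion_z(54, 157, 42, 270)
```

Consistent with the abstract, the resulting p-value falls well below .001.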


Subject(s)
Antipsychotic Agents , Adult , Humans , Antipsychotic Agents/adverse effects , Critical Illness/therapy , Intensive Care Units , Critical Care , Retrospective Studies , Risk Factors
2.
Ann Pharmacother ; 57(6): 653-661, 2023 06.
Article in English | MEDLINE | ID: mdl-36154486

ABSTRACT

BACKGROUND: Sepsis and septic shock are associated with significant morbidity and mortality. Rapid initiation of appropriate antibiotic therapy is essential, as inadequate therapy early during septic shock has been shown to increase the risk of mortality. However, despite the importance of appropriate antibiotic initiation, in clinical practice, concerns for renal dysfunction frequently lead to antibiotic dose reduction, with scant evidence on the impact of this practice in septic shock patients. OBJECTIVE: The purpose of this article is to investigate the rate and impact of piperacillin-tazobactam dose adjustment in early-phase septic shock patients using real-world electronic health record (EHR) data. METHODS: A multicenter, observational, retrospective cohort study was conducted of septic shock patients who received at least 48 hours of piperacillin-tazobactam therapy with concomitant receipt of norepinephrine. Subjects were stratified into 2 groups according to their cumulative 48-hour piperacillin-tazobactam dose: a low piperacillin-tazobactam dosing (LOW; <27 g) group and a normal piperacillin-tazobactam dosing (NORM; ≥27 g) group. To account for potential confounding variables, propensity score matching was used. The primary study outcome was 28-day norepinephrine-free days (NFD). RESULTS: In all, 1279 patients met study criteria. After propensity score matching (n = 608), the NORM group had more median NFD (23.9 days [interquartile range, IQR: 0-27] vs 13.6 days [IQR: 0-27], P = 0.021). The NORM group also had lower rates of in-hospital mortality/hospice disposition (25.9% [n = 79] vs 35.5% [n = 108], P = 0.014). Other secondary outcomes were similar between the treatment groups. CONCLUSIONS AND RELEVANCE: In the propensity score-matched cohort, the NORM group had significantly more 28-day NFD. Piperacillin-tazobactam dose reduction in early-phase septic shock is associated with worsened clinical outcomes. Clinicians should be vigilant to avoid piperacillin-tazobactam dose reduction in early-phase septic shock.
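The LOW/NORM split hinges on cumulative 48-hour dose. As an illustration, a full 4.5 g every-8-hours schedule reaches exactly the study's 27 g threshold over 48 hours; the regimens below are hypothetical examples, not taken from the study.

```python
def classify_48h_cumulative_dose(doses_g):
    """Stratify a patient by cumulative 48-hour piperacillin-tazobactam dose.

    NORM: >= 27 g cumulative (the study's cutoff); LOW: < 27 g.
    """
    total = sum(doses_g)
    group = "NORM" if total >= 27 else "LOW"
    return group, total

# 4.5 g every 8 h for 48 h -> 6 doses -> 27 g cumulative (NORM)
full_group, full_total = classify_48h_cumulative_dose([4.5] * 6)
# Hypothetical renally reduced schedule: 2.25 g every 8 h -> 13.5 g (LOW)
reduced_group, reduced_total = classify_48h_cumulative_dose([2.25] * 6)
```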


Subject(s)
Piperacillin , Shock, Septic , Humans , Piperacillin/adverse effects , Tazobactam , Shock, Septic/drug therapy , Retrospective Studies , Penicillanic Acid/adverse effects , Anti-Bacterial Agents/therapeutic use , Piperacillin, Tazobactam Drug Combination
3.
Article in English | MEDLINE | ID: mdl-36310806

ABSTRACT

Objective: To re-examine the use of noncarbapenems (NCBPs), specifically piperacillin-tazobactam (PTZ) and cefepime (FEP), for extended-spectrum beta-lactamase-producing Enterobacterales bloodstream infections (ESBL-E BSIs). Design: Retrospective cohort study. Setting: Tertiary-care, academic medical center. Patients: The study included patients hospitalized between May 2016 and May 2019 with a positive blood culture for ESBL-E. Patients were excluded if they received treatment with antibiotics other than meropenem, ertapenem, PTZ, or FEP. Patients were also excluded if they were aged <18 years, received antibiotics for <24 hours, were treated for polymicrobial BSI, or received concomitant antibiotic therapy for a separate gram-negative infection. Methods: We compared CBPs with FEP or PTZ for the treatment of ESBL-E BSI. The primary outcome was in-hospital mortality. Secondary outcomes included clinical cure, microbiologic cure, infection recurrence, and resistance development. Results: Data from 114 patients were collected and analyzed; 74 (65%) patients received carbapenem (CBP) therapy and 40 (35%) received an NCBP (30 received FEP and 10 received PTZ). The overall in-hospital mortality was 6% (N = 7), with a higher death rate in the CBP arm than in the NCBP arm (8% vs 3%; P = .42). No difference in mortality was detected in the subgroups with a Pitt bacteremia score ≥4 or requiring ICU admission, in those whose infections arose from a nongenitourinary source, or by causative organism (76 had Escherichia coli and 38 had Klebsiella spp). We detected no differences in secondary outcomes between the groups. Conclusion: Compared with CBPs, FEP and PTZ did not result in greater mortality or decreased clinical efficacy for the treatment of ESBL-E BSI caused by susceptible organisms.
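The mortality comparison (8% vs 3%, reported P = .42) is the kind of sparse 2x2 table typically handled with Fisher's exact test. The sketch below reimplements the two-sided test from the hypergeometric distribution; the 6-vs-1 death split is inferred from the reported percentages (8% of 74 and 3% of 40), not stated explicitly in the abstract.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact test for the 2x2 table [[a, b], [c, d]]."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d

    def pmf(x):  # hypergeometric probability of x events in row 1
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = pmf(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # sum probabilities of all tables at least as extreme as the observed one
    return sum(pmf(x) for x in range(lo, hi + 1) if pmf(x) <= p_obs * (1 + 1e-9))

# In-hospital deaths: 6/74 carbapenem arm vs 1/40 non-carbapenem arm
p = fisher_exact_two_sided(6, 68, 1, 39)
```

This reproduces the reported P ≈ .42.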

4.
J Oncol Pharm Pract ; 28(2): 274-281, 2022 Mar.
Article in English | MEDLINE | ID: mdl-33435822

ABSTRACT

BACKGROUND: Engraftment syndrome (ES) is a common complication of autologous hematopoietic cell transplantation (HCT). The difference in incidence of ES between melphalan formulations has not been widely reported in the literature; such data would allow a more comprehensive understanding of the advantages and disadvantages of both melphalan formulations. PATIENTS AND METHODS: This retrospective, single-center, observational study evaluated 83 adult multiple myeloma and immunoglobulin light chain amyloidosis patients who received either propylene glycol-containing (PG) or propylene glycol-free (PG-free) melphalan 140 mg/m2 as single-agent conditioning chemotherapy for autologous HCT from May 31, 2015, to May 31, 2019. The primary outcome was the incidence of ES, as defined using the Maiolino criteria, with both melphalan formulations. Secondary outcomes included an analysis of potential risk factors for the development of ES, as well as an evaluation of overall length of stay (LOS). RESULTS: The incidence of ES for PG and PG-free melphalan did not differ significantly: 14/39 (35.9%) and 12/44 (27.3%) (P = 0.4), respectively. No potential risk factors for ES were identified on multivariate logistic regression analysis. A statistically significant difference in number of days to engraftment was identified for PG vs PG-free melphalan (15.56 vs 13.82 days, P = 0.01), although this did not translate into a decrease in LOS (19.9 vs 18.59 days, P = 0.14). CONCLUSIONS: The incidence of ES did not differ significantly between melphalan formulations. Future research is needed to determine whether the faster time to engraftment seen with PG-free melphalan may translate into a decrease in LOS.


Subject(s)
Hematopoietic Stem Cell Transplantation , Immunoglobulin Light-chain Amyloidosis , Multiple Myeloma , Hematopoietic Stem Cell Transplantation/adverse effects , Humans , Incidence , Melphalan/adverse effects , Multiple Myeloma/drug therapy , Retrospective Studies , Transplantation Conditioning/adverse effects , Transplantation, Autologous
5.
Am J Health Syst Pharm ; 79(3): 165-172, 2022 01 24.
Article in English | MEDLINE | ID: mdl-34553749

ABSTRACT

PURPOSE: To identify the incidence of continuation of newly initiated loop diuretics upon intensive care unit (ICU) and hospital discharge and identify factors associated with continuation. METHODS: This was a single-center retrospective study using electronic health records in the setting of adult ICUs at a quaternary care academic medical center. It involved patients with sepsis admitted to the ICU from January 1, 2014, to June 30, 2019, who received intravenous fluid resuscitation. The endpoints of interest were (1) the incidence of loop diuretic use during an ICU stay following fluid resuscitation, (2) continuation of loop diuretics following transition of care, and (3) potential factors associated with loop diuretic continuation after transition from the ICU. RESULTS: Of 3,591 patients who received intravenous fluid resuscitation for sepsis, 39.4% (n = 1,415) were newly started on loop diuretics during their ICU stay. Among patients who transitioned to the hospital ward from the ICU, loop diuretics were continued in 33% (388/1,193) of patients. At hospital discharge, 13.4% (52/388) of these patients were prescribed a loop diuretic to be used in the outpatient setting. History of liver disease, development of acute kidney injury, being on vasopressors while in the ICU, receiving blood products, and receiving greater than 90 mL/kg of bolus fluids were significant potential factors associated with loop diuretic continuation after transition from the ICU. CONCLUSION: New initiation of loop diuretics following intravenous fluid resuscitation in patients with sepsis during an ICU stay is a common occurrence. Studies are needed to assess the effect of this practice on patient outcomes and resource utilization.
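The >90 mL/kg resuscitation threshold flagged as a factor can be computed directly from charted boluses and body weight. The patient values below are hypothetical, for illustration only.

```python
def resuscitation_ml_per_kg(bolus_volumes_ml, weight_kg):
    """Total bolus volume normalized to body weight (mL/kg)."""
    return sum(bolus_volumes_ml) / weight_kg

# Hypothetical 80 kg patient receiving three crystalloid boluses
dose = resuscitation_ml_per_kg([2000, 3000, 2500], 80)
exceeds_threshold = dose > 90  # factor associated with diuretic continuation
```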


Subject(s)
Critical Illness , Sodium Potassium Chloride Symporter Inhibitors , Adult , Fluid Therapy , Humans , Intensive Care Units , Retrospective Studies
6.
Am J Pharm Educ ; 85(10): 8614, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34400396

ABSTRACT

Objective. To determine whether third-year Doctor of Pharmacy students' self-reported use of optional supplemental material impacted their ability to accurately predict their performance on a low-stakes assessment. Methods. An instructor created optional supplemental material in the form of an online quiz. Students were asked to report whether they used the supplemental material and to predict and postdict their performance on an in-class assessment. The relative accuracy of the predictions and postdictions, as well as the assessment grades and overall course grades, were compared between students who reported using the supplemental material and those who reported not using it. Results. More than half of the students (60%) reported using the supplemental material. Most students underpredicted their performance on the in-class assessment, but there was no difference in the accuracy of predictions based on supplemental material use or non-use (-1.2 vs -1.0) or in the postdictions (-1.3 vs -1.0). Students who reported using the supplemental material performed better on both the low-stakes assessment (7.7 vs 7.2 out of 10) and overall in the course (87.0% vs 84.9%). Conclusion. Pharmacy students' self-reported use of optional supplemental material does not appear to impact their ability to accurately predict their performance on a low-stakes assessment.
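Prediction accuracy here is the signed gap between a student's predicted and earned score; a negative mean indicates underprediction, as most students showed. The scores below are invented for illustration.

```python
def mean_prediction_bias(predicted, actual):
    """Mean signed error of self-predictions; negative = underprediction."""
    return sum(p - a for p, a in zip(predicted, actual)) / len(predicted)

# Hypothetical predicted vs earned scores on a 10-point assessment
bias = mean_prediction_bias([7, 8, 6, 9], [8, 9, 8, 10])
```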


Subject(s)
Education, Pharmacy , Metacognition , Students, Pharmacy , Calibration , Educational Measurement , Humans
7.
Curr Pharm Teach Learn ; 12(9): 1093-1100, 2020 09.
Article in English | MEDLINE | ID: mdl-32624138

ABSTRACT

INTRODUCTION: One of the primary missions of pharmacy education is to produce graduates with the foundations to develop into expert practitioners through continuous learning and reflection upon traditional and clinical experiences. This reflection process and the use of effective strategies to meet specific learning goals can be considered a form of self-regulated learning (SRL). The following study validates an inventory to assess SRL strategies in blended and team-based learning (TBL) environments. METHODS: An SRL strategy inventory was developed based upon the Self-Regulated Strategies Inventory-Self-Report (SRSI-SR) and new items designed for blended and TBL environments. Sixteen new items focused on leveraging the team to learn content, the use and misuse of video lectures and slides, and interaction with social media and the learning management system. Two hundred thirty Doctor of Pharmacy students in the third professional year participated in the study. Twenty-eight items from the SRSI-SR and 16 new items were examined through a principal components analysis (PCA). RESULTS: The PCA indicated three distinct components: managing the learning environment, maladaptive learning strategies, and seeking and learning information. The total inventory accounted for 46.36% of the variance. Maladaptive learning strategies scores were moderately predictive of poor academic achievement in didactic coursework. CONCLUSIONS: This study demonstrates the importance of reexamination and adaptation of educational inventories such as the SRSI-SR. It provides specific insight into which maladaptive strategies may be limiting underperforming students from achieving greater success and mastery in the didactic curriculum.
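In a PCA of standardized items, the share of variance a set of retained components explains is the sum of their eigenvalues divided by the number of items. The eigenvalues below are hypothetical, chosen only to reproduce the reported 46.36% for three components across the study's 44 items (28 SRSI-SR plus 16 new).

```python
# Hypothetical eigenvalues for the three retained components; with 44
# standardized items, the total variance equals 44
eigenvalues = [9.5, 6.3, 4.6]
n_items = 44
cumulative_share = sum(eigenvalues) / n_items  # proportion of total variance
```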


Subject(s)
Curriculum , Education, Pharmacy , Achievement , Humans , Learning , Self Report
8.
Pharmacotherapy ; 40(6): 500-506, 2020 06.
Article in English | MEDLINE | ID: mdl-32246498

ABSTRACT

INTRODUCTION: In intensive care unit (ICU) patients, delirium is frequent, occurs early in ICU admission, and is associated with poor outcomes. Risk models based on clinical factors have shown variable performance in terms of predictive ability. Identification of a candidate biomarker that associates with delirium may lead to a better understanding of disease mechanism, validation biomarker studies, and the ability to develop targeted interventions for prevention and treatment of delirium. This study analyzed metabolite concentrations early in the course of ICU admission to assess the association with delirium onset. METHODS: Within 24 hours of ICU admission, blood samples for global and targeted metabolomics analyses in adult surgical ICU patients were collected prospectively. Metabolites were determined using mass spectrometry/ultra-high-pressure liquid chromatography and analyzed in patients with delirium and a group of controls matched on age, sex, and admission Sequential Organ Failure Assessment (SOFA) score. RESULTS: Patients in the study (65 per group) had a mean age of 59 years, had a median SOFA score of 6, and were most commonly admitted to the ICU following major trauma. In the delirium group, median onset of delirium was 3 (interquartile range 1-6) days, and the most common delirium subtype was mixed (56%). Kynurenic acid was significantly increased, and tryptophan concentration was significantly decreased, in the delirium group (p=0.04). The ratio of kynurenine to tryptophan concentration was significantly higher in the delirium group (p=0.005). CONCLUSIONS: Evidence of upregulation was found in the tryptophan metabolic pathway in delirious patients: tryptophan concentrations were lower, tryptophan metabolites were higher, and the kynurenine-to-tryptophan ratio was increased. These findings suggest a role of increased inflammation and accumulation of neurotoxic metabolites in the pathogenesis of ICU delirium.
Future studies should target this pathway to validate metabolites in the tryptophan pathway as risk biomarkers in patients with ICU delirium.
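The kynurenine-to-tryptophan ratio used above is a common proxy for tryptophan catabolism along the kynurenine pathway: lower tryptophan with stable or higher kynurenine drives the ratio up. The concentrations below are hypothetical.

```python
def kyn_trp_ratio(kynurenine, tryptophan):
    """Kynurenine-to-tryptophan concentration ratio (same units, so they cancel)."""
    return kynurenine / tryptophan

# Hypothetical concentrations (umol/L): delirium shows lower tryptophan
control_ratio = kyn_trp_ratio(2.0, 60.0)
delirium_ratio = kyn_trp_ratio(2.5, 40.0)
```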


Subject(s)
Delirium/epidemiology , Intensive Care Units , Metabolomics , Tryptophan/metabolism , Adult , Aged , Biomarkers/metabolism , Chromatography, High Pressure Liquid , Delirium/etiology , Female , Humans , Kynurenic Acid/metabolism , Male , Mass Spectrometry , Middle Aged , Organ Dysfunction Scores , Prospective Studies , Risk Factors , Up-Regulation
10.
Clin Lymphoma Myeloma Leuk ; 19(11): 723-728, 2019 11.
Article in English | MEDLINE | ID: mdl-31530473

ABSTRACT

BACKGROUND: Identification of predictive indicators for rituximab infusion-related reactions (R-IRRs) may allow clinicians to modify treatment plans for patients at high risk of reaction to reduce incidence. Use of predictive indicators would significantly improve the patient experience, reduce hospital resource utilization, and decrease infusion chair time. PATIENTS AND METHODS: This retrospective, single-center, observational study evaluated 173 adult malignant hematology patients who received their first dose of rituximab as inpatients between July 31, 2015, and July 31, 2018. Patients were excluded if they had received prior rituximab or induction chemotherapy, or were pregnant at the time of exposure. The primary outcome was the overall incidence of R-IRRs during the study period. The secondary outcomes were associations between specific patient and disease characteristics and R-IRRs. RESULTS: Of the 173 patients evaluated, 109 met inclusion criteria and 64 were excluded. The overall incidence of R-IRRs was 31 (28.4%) of 109. The following patient and disease characteristics were significantly associated with R-IRRs on univariate analysis: higher actual body weight (P = .04), diagnosis (P = .01), lower hemoglobin (P = .02), and bone marrow involvement (P = .001). In a confirmatory stepwise regression model, higher actual body weight (P = .01) and bone marrow involvement (P = .003) were positively associated with R-IRRs. CONCLUSION: Actual body weight and bone marrow involvement may be utilized as potential predictive indicators of R-IRRs. Further study is needed to validate these indicators and determine appropriate utilization in practice.


Subject(s)
Academic Medical Centers , Antineoplastic Agents, Immunological/adverse effects , Drug-Related Side Effects and Adverse Reactions/epidemiology , Drug-Related Side Effects and Adverse Reactions/etiology , Hematologic Neoplasms/drug therapy , Hematologic Neoplasms/epidemiology , Rituximab/adverse effects , Adult , Aged , Antineoplastic Agents, Immunological/administration & dosage , Combined Modality Therapy , Drug-Related Side Effects and Adverse Reactions/diagnosis , Female , Hematologic Neoplasms/complications , Hematologic Neoplasms/diagnosis , Humans , Infusions, Intravenous/adverse effects , Male , Middle Aged , Neoplasm Staging , Retreatment , Rituximab/administration & dosage
11.
Am J Pharm Educ ; 83(10): 7566, 2019 12.
Article in English | MEDLINE | ID: mdl-32001890

ABSTRACT

Objective. To determine the relationship between student-reported self-regulated learning (SRL), use of supplementary material, and overall performance in an advanced therapeutics course in a Doctor of Pharmacy program. Methods. A modified version of the Self-Regulated Strategy Inventory-Self-Report (SRSI-SR) was used to measure three distinct SRL factors: managing study behaviors, managing environment, and maladaptive regulatory behaviors. An instructor created a supplemental 36-question practice quiz and flashcard activity. In-class assessment scores and the three SRL factor scores were analyzed by reported use of the practice quiz, and the association between overall course grade and score in each factor domain was determined by regression. Results. Two hundred seven students (98%) completed the SRSI. One hundred fifty-eight (79%) students reported using the optional practice quiz, and doing so was associated with significantly higher in-class quiz scores (8.2 vs 7.6 out of 10) and a higher overall course grade (88.0% vs 85.3%). Students reporting use of the optional practice quiz were significantly less likely to report poor study behaviors, inability to manage their study environment, and maladaptive study habits. Lower overall course grades were significantly associated with maladaptive study habits. Conclusion. A positive association was found between use of instructor-created supplemental activities and in-class quiz scores, self-regulated study behaviors, and overall course performance. Maladaptive study habits showed a modest negative correlation with overall course grade. The results suggest that when instructors create optional supplementary activities and assessments, many of the students who would benefit most from these activities fail to utilize the opportunity for extra practice.


Subject(s)
Education, Pharmacy/methods , Education, Pharmacy/statistics & numerical data , Students, Pharmacy/statistics & numerical data , Adult , Curriculum/statistics & numerical data , Educational Measurement/methods , Female , Humans , Learning/physiology , Male
12.
Ann Pharmacother ; 53(4): 385-395, 2019 04.
Article in English | MEDLINE | ID: mdl-30404539

ABSTRACT

OBJECTIVE: To describe recent developments in the pharmacological management of sepsis and septic shock, focusing on fluid resuscitation, vasopressors, and corticosteroids. DATA SOURCES: A literature search limited to randomized controlled trials written in English, reporting mortality and other clinically relevant outcomes, published from July 1, 2016, to August 31, 2018, in patients aged ≥18 years. Titles and abstracts were reviewed for relevance. References for pertinent review articles were also reviewed. STUDY SELECTION AND DATA EXTRACTION: Relevant randomized controlled trials conducted in patients meeting the predefined inclusion criteria were considered for inclusion. DATA SYNTHESIS: From an initial search that identified 147 studies, 14 original research studies met inclusion criteria and were included in this review. Risk of bias (ROB) was assessed using the Revised Cochrane ROB assessment tool, with most included studies having a low ROB. RELEVANCE TO PATIENT CARE AND CLINICAL PRACTICE: Sepsis and septic shock pose a significant burden on public health. Despite advances in our understanding of sepsis, mortality remains unacceptably high. Recent developments in the pharmacological management of septic shock have focused on determining the optimal composition and dosage of fluid resuscitation, refining the use of vasopressor therapy, and clarifying the role of corticosteroids. This systematic review provides recommendations for practice based on recent research on these topics. CONCLUSIONS: Although recent developments in the pharmacological management of sepsis are encouraging, clinicians must use patient-specific factors to guide therapy and continue to address the remaining unanswered questions.


Subject(s)
Adrenal Cortex Hormones/therapeutic use , Fluid Therapy/methods , Sepsis/therapy , Vasoconstrictor Agents/therapeutic use , Adrenal Cortex Hormones/administration & dosage , Humans , Practice Guidelines as Topic , Randomized Controlled Trials as Topic , Sepsis/drug therapy , Shock, Septic/drug therapy , Shock, Septic/therapy , Vasoconstrictor Agents/administration & dosage
13.
Int J Antimicrob Agents ; 52(5): 719-723, 2018 Nov.
Article in English | MEDLINE | ID: mdl-30125680

ABSTRACT

Management of micro-organisms harbouring AmpC β-lactamases remains challenging. Carbapenems are often considered first-line agents. Due to growing concern regarding carbapenem-resistant Enterobacteriaceae, integrating non-carbapenem treatment strategies is being explored for these pathogens. The primary objective of this study was to evaluate clinical outcomes in patients with bacteraemia secondary to AmpC-producing organisms treated with cefepime or piperacillin/tazobactam (TZP). A retrospective study was conducted of adult patients receiving cefepime or TZP for the treatment of AmpC-producing organisms with a positive cefoxitin screen (i.e. Citrobacter, Enterobacter or Serratia spp. along with cefoxitin resistance) isolated from blood cultures. The primary endpoint was clinical cure at end of therapy (EOT). Secondary endpoints included microbiological eradication, frequency of susceptibility changes following treatment, and 7- and 30-day all-cause mortality. Clinical cure at EOT was 87.1%, with 93.2% of patients achieving microbiological eradication. The 7- and 30-day mortality rates were 3.8% and 10.6%, respectively. Organism susceptibility was exceptionally high, with minimum inhibitory concentrations (MICs) of ≤2 µg/mL in 90% of patients treated with cefepime (n = 108). Selection for resistance to third-generation cephalosporins on primary antimicrobial therapy was infrequent at 6.1% (8/132). In conclusion, use of cefepime or TZP for management of AmpC bloodstream infections was associated with clinical and microbiological cure, with infrequent selection for resistance.


Subject(s)
Anti-Bacterial Agents/therapeutic use , Antimicrobial Stewardship , Bacteremia/drug therapy , Bacterial Proteins/metabolism , Cefepime/therapeutic use , Enterobacteriaceae Infections/drug therapy , Piperacillin, Tazobactam Drug Combination/therapeutic use , beta-Lactamase Inhibitors/therapeutic use , beta-Lactamases/metabolism , Adolescent , Adult , Aged , Aged, 80 and over , Anti-Bacterial Agents/pharmacology , Bacteremia/microbiology , Cefepime/pharmacology , Drug Resistance, Bacterial , Enterobacteriaceae/drug effects , Enterobacteriaceae/isolation & purification , Female , Humans , Male , Microbial Sensitivity Tests , Middle Aged , Retrospective Studies , Survival Analysis , Treatment Outcome , Young Adult
14.
Pharmacotherapy ; 38(8): 794-803, 2018 08.
Article in English | MEDLINE | ID: mdl-29883532

ABSTRACT

Over the last decade, the discovery of novel renal biomarkers and research on their use to improve medication effectiveness and safety has expanded considerably. Pharmacists are uniquely positioned to leverage this new technology for renal assessment to improve medication dosing and monitoring. Serum cystatin C is a relatively new, inexpensive, functional renal biomarker that responds more quickly to changing renal function than creatinine and is not meaningfully affected by age, sex, skeletal muscle mass, dietary intake, or deconditioning. Cystatin C has been proposed as an adjunct or alternative to creatinine for glomerular filtration rate assessment and estimation of drug clearance. Tissue inhibitor of metalloproteinase-2·insulin-like growth factor-binding protein 7 ([TIMP-2]·[IGFBP7]) is a composite of two damage biomarkers released into the urine at a checkpoint in mitosis when renal cells undergo stress or sense a future risk of damage. Concentrations of [TIMP-2]·[IGFBP7] increase before a rise in serum creatinine is evident, thus providing insightful information for evaluation in the context of other patient data to predict the risk for impending kidney injury. This article provides a brief overview of novel renal biomarkers being used as a mechanism to improve medication safety including a discussion of cystatin C, as part of drug-dosing algorithms and specifically for vancomycin dosing, and the use of [TIMP-2]·[IGFBP7] for risk prediction in acute kidney injury and drug-induced kidney disease. Select cases of clinical experience with novel renal biomarkers are outlined, and lessons learned and future applications are described.
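As an example of cystatin C in GFR estimation, the sketch below implements the published CKD-EPI 2012 cystatin C equation. The coefficients are reproduced from memory of the published equation and should be verified against the primary reference before any clinical use; this is an illustration, not the article's own algorithm.

```python
def egfr_ckd_epi_cystatin_c(scys_mg_l, age_years, female):
    """CKD-EPI 2012 cystatin C eGFR (mL/min/1.73 m^2).

    eGFR = 133 x min(Scys/0.8, 1)^-0.499 x max(Scys/0.8, 1)^-1.328
           x 0.996^age x 0.932 (if female)
    """
    k = scys_mg_l / 0.8
    egfr = 133 * min(k, 1.0) ** -0.499 * max(k, 1.0) ** -1.328 * 0.996 ** age_years
    return egfr * (0.932 if female else 1.0)

# eGFR falls as serum cystatin C rises
e_normal = egfr_ckd_epi_cystatin_c(0.8, 50, female=False)
e_elevated = egfr_ckd_epi_cystatin_c(1.6, 50, female=False)
```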


Subject(s)
Acute Kidney Injury/prevention & control , Biomarkers/metabolism , Cystatin C/blood , Kidney Diseases/prevention & control , Drug-Related Side Effects and Adverse Reactions/prevention & control , Humans
15.
Thromb Res ; 165: 6-13, 2018 05.
Article in English | MEDLINE | ID: mdl-29544199

ABSTRACT

OBJECTIVE: Incidence of venous thromboembolism (VTE) in critically ill patients remains unacceptably high despite widespread use of thromboprophylaxis. A systems biology approach may be useful in understanding disease pathology and predicting response to treatment. The metabolite profile under specific environmental conditions provides the closest link to phenotype, but the relationship between metabolomics and risk of VTE in critically ill patients is unknown. In this study, metabolomics signatures are compared in patients with and without VTE. DESIGN: Multicenter case-control study using prospectively collected data from the Inflammation and Host Response to Injury program, with pathway and in silico gene expression analyses. SETTING: Eight level 1 US trauma centers. PATIENTS: Critically ill adults with blunt trauma who developed VTE within the first 28 days of hospitalization, compared to patients without VTE (N-VTE). INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: Patients included in the study (n = 20 VTE, n = 20 N-VTE) had a mean age of 34 years and a mean injury severity score of 35, with VTE diagnosed a median of 10.5 days after admission. Global metabolomics revealed that two kynurenine metabolites, N-formylkynurenine (AUC = 0.77; 95% CI: 0.59-0.89) and 5-hydroxy-N-formylkynurenine (AUC = 0.80; 95% CI: 0.63-0.90), significantly discriminated VTE from N-VTE; the ratio of N-formylkynurenine to 5-hydroxy-N-formylkynurenine improved predictive power (AUC = 0.87; 95% CI: 0.74-0.95). In the pathway analysis, tryptophan was the only significant metabolic pathway including N-formylkynurenine and 5-hydroxy-N-formylkynurenine (p < 0.001), and 8 proteins directly or indirectly interacted with these metabolites in the interaction network analysis. Of the 8 genes tested in the in silico gene expression analyses, KYNU (p < 0.001), CCBL1 (p < 0.001), and CCBL2 (p = 0.001) were significantly different between VTE and N-VTE, controlling for age and sex. CONCLUSIONS: Two novel kynurenine metabolites in the tryptophan pathway were associated with hospital-acquired VTE, and 3 candidate genes were identified via pathway and interaction network analyses. Future studies are warranted to validate these findings in diverse populations using a multi-omics approach.
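The reported AUCs measure how well a metabolite separates VTE from N-VTE: AUC equals the probability that a randomly chosen case outranks a randomly chosen control (the Mann-Whitney interpretation). A pairwise sketch with invented intensities:

```python
def roc_auc(case_scores, control_scores):
    """AUC via the Mann-Whitney interpretation: P(case > control), ties half."""
    wins = 0.0
    for case in case_scores:
        for control in control_scores:
            if case > control:
                wins += 1.0
            elif case == control:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical metabolite intensities for VTE cases vs N-VTE controls
auc = roc_auc([5.1, 4.8, 6.0, 5.5], [4.0, 4.9, 3.8, 4.5])
```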


Subject(s)
Critical Illness/therapy , Kynurenine/adverse effects , Metabolomics/methods , Tryptophan/adverse effects , Venous Thromboembolism/etiology , Adult , Case-Control Studies , Female , Humans , Male , Venous Thromboembolism/pathology
16.
Am J Health Syst Pharm ; 74(16): 1237-1244, 2017 Aug 15.
Article in English | MEDLINE | ID: mdl-28652320

ABSTRACT

PURPOSE: Results of a study to determine trends in oral anticoagulant (OAC) use and OAC switching in patients with atrial fibrillation (AF) or atrial flutter are presented. METHODS: Warfarin has been the most prescribed anticoagulant in patients with AF for decades. Since 2010, several direct OACs (DOACs) have gained U.S. marketing approval for stroke prevention in AF or atrial flutter. A cross-sectional longitudinal analysis was conducted using healthcare and prescription claims databases to characterize OAC use and rates of OAC and DOAC switching during the period 2008-14 in cohorts of Medicare beneficiaries 65 years of age or older with AF or atrial flutter. RESULTS: Overall, 66% of patients with AF or atrial flutter were receiving OACs during the study period. The prevalence of warfarin use decreased from 69.8% in 2008 to 42.2% in 2014. This decrease in warfarin use was paralleled by an increase in dabigatran use, which rose from 1.3% in 2010 to 12.1% in 2011 and then declined to 7.6% in 2014. The prevalence of rivaroxaban use increased from 0.13% in 2011 to 13.87% in 2014. Among anticoagulated patients, an average of 6% annually were switched from one OAC to another. CONCLUSION: Overall OAC utilization in patients with AF or atrial flutter remained steady over the study period. Beginning in 2010, a gradual decrease in use of warfarin was paralleled by an increase in use of DOACs.


Subject(s)
Anticoagulants/administration & dosage , Atrial Fibrillation/drug therapy , Drug Utilization/trends , Medicare/trends , Warfarin/administration & dosage , Administration, Oral , Aged , Aged, 80 and over , Atrial Fibrillation/diagnosis , Atrial Fibrillation/epidemiology , Cross-Sectional Studies , Female , Humans , Longitudinal Studies , Male , United States/epidemiology
17.
Pharmacotherapy ; 37(7): 806-813, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28500694

ABSTRACT

OBJECTIVE: This study aimed to examine secular trends in (i) maternal prescription opioid use in late pregnancy, (ii) neonatal abstinence syndrome (NAS) stratified by late maternal prescription opioid use, and (iii) maternal risk factors among NAS deliveries. METHODS: Women with a live birth who were enrolled from 90 days before to 30 days after delivery in Florida Medicaid Analytic eXtract billing records linked to birth certificates from 2000 to 2010 were identified for the study. Changes in the annual prevalence of prescription opioid use during pregnancy were tested with the Cochran-Armitage trend test. Temporal trends in NAS deliveries were estimated with Poisson regression and stratified by prescription opioid exposure in the last 90 days of pregnancy. To identify contributors to the increase in NAS cases, variations in the prevalence of opioid dispensing, tobacco use, antidepressant use, and substance use disorder among NAS and non-NAS deliveries were examined. RESULTS: There were 41,968 (9.4%) deliveries exposed to at least one opioid prescription in late pregnancy, and this rate remained stable from 2000 to 2010. Among prescription opioid-exposed deliveries, the frequency of NAS increased from 1.6 to 25.2 per 1000 live births during the study period (p<0.05). Although the prevalence of maternal prescription opioid, tobacco, and antidepressant use remained stable among NAS deliveries from 2000 to 2010, the prevalence of substance use disorder diagnoses increased substantially, from 38.9% in 2000 to 67.9% in 2006 (p<0.05). CONCLUSIONS: The prevalence of NAS increased dramatically whereas the prevalence of major risk factors, including maternal prescription opioid use, remained stable in Florida between 2000 and 2010. The increase in substance use disorder may be responsible for the sharp increase in NAS deliveries.
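The Cochran-Armitage trend test used in the methods above can be sketched in a few lines with only the standard library; the yearly counts below are purely illustrative and are not the study's data:

```python
import math

def cochran_armitage_trend(successes, totals, scores=None):
    """Two-sided Cochran-Armitage test for a linear trend in binomial
    proportions across ordered groups (e.g., calendar years)."""
    k = len(successes)
    if scores is None:
        scores = list(range(k))          # equally spaced year scores
    N = sum(totals)
    p_bar = sum(successes) / N           # pooled proportion under H0
    # Test statistic: score-weighted deviation from the pooled rate
    T = sum(t * (r - n * p_bar)
            for t, r, n in zip(scores, successes, totals))
    # Variance of T under the null hypothesis of no trend
    s1 = sum(n * t for n, t in zip(totals, scores))
    s2 = sum(n * t * t for n, t in zip(totals, scores))
    var_T = p_bar * (1 - p_bar) * (s2 - s1 * s1 / N)
    z = T / math.sqrt(var_T)
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return z, p

# Hypothetical counts: outcome-positive deliveries and total deliveries
# over three consecutive years, with a rising proportion.
z, p = cochran_armitage_trend([10, 20, 30], [100, 100, 100])
```

With these made-up counts the proportion rises from 10% to 30%, giving a clearly positive z statistic and a small two-sided p-value.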


Subject(s)
Analgesics, Opioid/adverse effects , Medicaid/trends , Neonatal Abstinence Syndrome/epidemiology , Opioid-Related Disorders/epidemiology , Pregnancy Complications/epidemiology , Prenatal Exposure Delayed Effects/epidemiology , Adolescent , Adult , Antidepressive Agents/adverse effects , Female , Florida/epidemiology , Humans , Infant, Newborn , Male , Neonatal Abstinence Syndrome/diagnosis , Opioid-Related Disorders/diagnosis , Pregnancy , Pregnancy Complications/diagnosis , Prenatal Exposure Delayed Effects/chemically induced , Prenatal Exposure Delayed Effects/diagnosis , Risk Factors , Substance-Related Disorders/diagnosis , Substance-Related Disorders/epidemiology , Tobacco Use Disorder/diagnosis , Tobacco Use Disorder/epidemiology , United States/epidemiology , Young Adult
18.
J Crit Care ; 39: 31-35, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28152386

ABSTRACT

PURPOSE: To determine the impact on duration of mechanical ventilation (MV) and the need for reintubation after changing from intravenous (IV) to oral phosphate formulations in response to a national shortage of IV phosphate. METHODS: A retrospective study was performed in adult patients who required MV for at least 48 hours. RESULTS: A total of 136 patients were included, with 68 patients in each of the restricted and unrestricted phosphate groups. There was no difference in the cumulative phosphate supplementation received (IV and oral) between groups (P=.08). The overall mean serum phosphorus concentration in the unrestricted vs restricted group was 3.0 vs 2.9 mg/dL, respectively (P=.24), and the phosphorus concentration was not significantly different between groups during the first 21 days of the study (P=.24). The median MV-free hours in the unrestricted group was 462 hours compared with 507 hours in the restricted group (P=.16), and 9 (13.2%) patients in each group required reintubation (P=.99). There was no significant difference in mortality or in hospital or intensive care unit (ICU) length of stay. CONCLUSIONS: No difference in MV-free hours or need for reintubation was observed after a national shortage required the restriction of IV phosphate supplementation. Oral phosphate replacement is a safe and efficient alternative.


Subject(s)
Critical Illness/therapy , Hypophosphatemia/drug therapy , Phosphates/administration & dosage , Respiration, Artificial , Ventilator Weaning , Administration, Oral , Drug Administration Schedule , Enteral Nutrition , Female , Humans , Hypophosphatemia/blood , Infusions, Intravenous , Intensive Care Units , Male , Middle Aged , Phosphates/blood , Retrospective Studies
19.
J Crit Care ; 39: 78-82, 2017 Jun.
Article in English | MEDLINE | ID: mdl-28231518

ABSTRACT

PURPOSE: To evaluate the efficacy of an early bronchoalveolar lavage (E-BAL) protocol. BAL was performed within 48 h for intubated patients with traumatic brain injury or chest trauma. We hypothesized that E-BAL would decrease antibiotic use and improve outcomes compared to late BAL (L-BAL) triggered by clinical signs of pneumonia. METHODS: Retrospective cohort analysis of 132 patients with quantitative BAL and ≥1 risk factor: head Abbreviated Injury Score ≥2, ≥3 rib fractures, or radiographic signs of aspiration or pulmonary contusion. E-BAL (n=71) was compared to L-BAL (n=61). Pneumonia was defined as ≥10⁴ organisms on BAL culture or a Clinical Pulmonary Infection Score >6. RESULTS: There were no significant differences in age, injury severity, initial PaO2:FiO2, or smoking status between the E-BAL and L-BAL groups. Cultures were positive in 52% of E-BAL and 61% of L-BAL patients, respectively. E-BAL patients had fewer antibiotic days (7.3 vs 9.2, P=.034), ventilator days (11 vs 15, P=.002), and tracheostomies (49% vs 75%, P=.002), and shorter intensive care unit (13 vs 17 days, P=.007) and hospital (18 vs 22 days, P=.041) lengths of stay. CONCLUSIONS: More than half of all E-BAL patients had pneumonia present early after admission. E-BAL was associated with fewer days on antibiotics and better outcomes than L-BAL.


Subject(s)
Bronchoalveolar Lavage/methods , Thoracic Injuries/therapy , Adult , Anti-Bacterial Agents/therapeutic use , Female , Humans , Intensive Care Units , Length of Stay , Lung Injury/diagnosis , Male , Middle Aged , Pneumonia/diagnosis , Retrospective Studies
20.
Thromb Res ; 147: 13-15, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27664391

ABSTRACT

The incidence of venous thromboembolism (VTE) in adult trauma patients is high despite mechanical and pharmacologic prophylaxis. We hypothesized that thrombin formation capacity, as measured by calibrated automated thrombogram (CAT), is increased early in hospitalization and is associated with the development of VTE. METHODS: We conducted a prospective study in adult, critically ill trauma patients. Plasma was generated from whole blood samples collected within the first 3 days of hospital admission. CAT was used to determine lag time, thrombin peak, time to thrombin peak, endogenous thrombin potential (ETP), and velocity index in plasma samples from patients and in control samples of platelet-poor, pooled normal plasma. RESULTS: There were 35 trauma patients and 35 controls included in this pilot analysis. Patients were a mean (SD) age of 45 (19) years, and 23 (66%) were male. The most common mechanism of injury was motor vehicle crash followed by falls, and the median (IQR) injury severity score was 17 (12-27). Three patients (8.6%) had deep vein thrombosis (DVT) confirmed by Doppler ultrasound on median hospital day 7. Compared to control samples, patients had significantly longer lag times (3.1 min vs 2.7 min, p=0.02) and significantly higher ETP (1136 nM·min vs 1019 nM·min, p=0.007), peak thrombin generation (239 nM vs 176 nM, p<0.001), and velocity index (108 nM/min vs 57 nM/min, p<0.001) (Fig. 1). There was no difference in the time to peak thrombin generation between the two groups (5.5 min vs 5.7 min, p=0.22). In the 3 patients with VTE compared to controls, lag times were shorter and velocity index was higher, while ETP and peak thrombin generation were similar. There were no statistically significant differences in thrombin generation parameters between patients with and without VTE, but in patients with DVT, lag time was numerically shorter, and thrombin peak, time to peak, and area under the curve (ETP) were numerically lower.
CONCLUSIONS: We observed a thrombin generation profile in critically ill trauma patients consistent with an early hypercoagulable state; however, thrombin generation parameters did not discriminate patients with VTE.
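Two-group comparisons of thrombin generation parameters like those above are often done nonparametrically. A minimal sketch of a Mann-Whitney U test with a normal approximation (reasonable at ~35 per group) follows; the peak-thrombin values are made up for illustration and are not the study's data:

```python
import math

def ranks(values):
    """Midranks of a sequence, averaging ranks across ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        # extend j to the end of the current tie block
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mid = (i + j) / 2 + 1            # average of ranks i+1..j+1
        for k in range(i, j + 1):
            r[order[k]] = mid
        i = j + 1
    return r

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test using the normal approximation."""
    n1, n2 = len(x), len(y)
    r = ranks(list(x) + list(y))
    r1 = sum(r[:n1])                     # rank sum of the first sample
    u = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2                     # mean of U under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2)) # two-sided normal p-value
    return u, p

# Illustrative (hypothetical) peak thrombin values in nM, echoing the
# reported separation between patients and controls (239 vs 176 nM).
patients = [230, 245, 250, 238, 260, 225, 242, 255]
controls = [170, 180, 165, 175, 185, 172, 178, 168]
u, p = mann_whitney_u(patients, controls)
```

Because every patient value here exceeds every control value, U equals its maximum (n1·n2) and the two-sided p-value is small.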


Subject(s)
Thrombin/metabolism , Venous Thromboembolism/etiology , Wounds and Injuries/complications , Adult , Aged , Blood Coagulation , Blood Coagulation Tests , Critical Illness , Female , Humans , Injury Severity Score , Male , Middle Aged , Prospective Studies , Venous Thromboembolism/blood , Venous Thromboembolism/metabolism , Venous Thrombosis/blood , Venous Thrombosis/etiology , Venous Thrombosis/metabolism , Wounds and Injuries/blood , Wounds and Injuries/metabolism