1.
Article in English | MEDLINE | ID: mdl-39017625

ABSTRACT

PURPOSE: Intravenous administration of the antiseizure medication lacosamide can be delayed by operational challenges related to short beyond-use dating and controlled substance requirements. The purpose of this study was to describe the steps required to successfully transition from intravenous piggyback administration to intravenous push administration and to demonstrate that the workflow changes improved time to administration without compromising patient safety. METHODS: This multicenter study had 2 components: the first was a prospective description of the implementation and operationalization process, and the second was a retrospective cohort analysis comparing patients who received intravenous piggyback and intravenous push lacosamide. After the transition, the default administration route for adult patients receiving lacosamide doses of 400 mg or less was intravenous push. While the primary objective was to describe the implementation process, secondary objectives included comparison of time to administration and of safety, assessed using a composite endpoint and the incidence of PR-interval prolongation. RESULTS: Successful implementation and operationalization across a large health system was achieved by following a 6-month timeline. A total of 102 patients were included in the cohort study, with 869 individual administrations analyzed (519 intravenous piggyback and 350 intravenous push). Time from verification to administration was significantly shorter for intravenous push (median, 88 minutes) than for intravenous piggyback (median, 159 minutes) administrations (P = 0.008). No significant difference was found in the safety composite or in PR prolongation. CONCLUSION: Transitioning intravenous lacosamide from piggyback to push administration is feasible and decreases time from verification to administration without an increased incidence of adverse effects.
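
Editor's note: the workflow change described above hinges on a simple default-route rule (adult patients, doses of 400 mg or less, default to intravenous push). The Python sketch below restates that rule for illustration only; the function and parameter names are assumptions, not part of the health system's order-entry build, and only the dose threshold and adult-patient condition come from the abstract.

    # Minimal sketch of the default-route rule described in the abstract:
    # adult lacosamide doses of 400 mg or less default to IV push, larger
    # doses remain IV piggyback. Names are illustrative assumptions.

    IV_PUSH_MAX_DOSE_MG = 400  # threshold stated in the abstract

    def default_lacosamide_route(dose_mg: float, is_adult: bool) -> str:
        """Return the default administration route for an IV lacosamide order."""
        if is_adult and dose_mg <= IV_PUSH_MAX_DOSE_MG:
            return "IV push"
        return "IV piggyback"

    if __name__ == "__main__":
        print(default_lacosamide_route(200, is_adult=True))  # IV push
        print(default_lacosamide_route(600, is_adult=True))  # IV piggyback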

2.
Article in English | MEDLINE | ID: mdl-37771740

ABSTRACT

Objective: To assess the safety and efficacy of a novel beta-lactam allergy assessment algorithm managed by an antimicrobial stewardship program (ASP) team. Design: Retrospective analysis. Setting: One quaternary referral teaching hospital and one tertiary care teaching hospital in a large western Pennsylvania health network. Patients or participants: Patients who received a beta-lactam challenge dose under the beta-lactam allergy assessment algorithm. Interventions: A beta-lactam allergy assessment protocol was designed and implemented by an ASP team. The protocol risk-stratified patients' reported allergies to identify those appropriate for a challenge with a beta-lactam antibiotic. This retrospective analysis assessed the safety and efficacy of the protocol among patients who received a challenge dose from November 2017 to July 2021. Results: Over a 45-month period, 119 patients with either penicillin or cephalosporin allergies entered the protocol. Following a challenge dose, 106 (89.1%) patients were treated with a beta-lactam. Eleven patients had adverse reactions to a challenge dose, one of which required escalation of care to the intensive care unit. Of the patients with an unknown or low-risk reported allergy, 7/66 (10.6%) had an observed adverse reaction, compared with 3/42 (7.1%) of patients with a reported high-risk or anaphylactic allergy. Conclusions: The protocol was safe and effective, with over 90% of patients tolerating the challenge without incident and many going on to receive indicated beta-lactam therapy. It may serve as a framework for other inpatient programs to implement a low-barrier, ASP-led allergy assessment.
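
Editor's note: the core step of the algorithm is risk-stratifying the reported allergy history before offering a monitored challenge dose. The abstract does not publish the stratification criteria, so the Python sketch below is a hypothetical illustration of that kind of triage logic; the reaction categories and dispositions are assumptions, not the study protocol.

    # Hypothetical sketch of reported-allergy risk stratification of the kind
    # an ASP-managed protocol might use. The actual criteria, categories, and
    # actions used in the study are not given in the abstract.

    LOW_RISK_REACTIONS = {"unknown", "remote childhood rash", "gi intolerance"}
    HIGH_RISK_REACTIONS = {"anaphylaxis", "angioedema", "sjs/ten"}

    def stratify_reported_allergy(reaction: str) -> str:
        """Map a reported reaction history to a hypothetical risk tier."""
        reaction = reaction.strip().lower()
        if reaction in HIGH_RISK_REACTIONS:
            return "high-risk: specialist review before any beta-lactam exposure"
        if reaction in LOW_RISK_REACTIONS:
            return "low-risk/unknown: candidate for monitored challenge dose"
        return "unclassified: escalate to pharmacist/ASP review"

    if __name__ == "__main__":
        for r in ["remote childhood rash", "anaphylaxis", "unknown", "serum sickness"]:
            print(r, "->", stratify_reported_allergy(r))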

3.
Open Forum Infect Dis ; 9(9): ofac438, 2022 Sep.
Article in English | MEDLINE | ID: mdl-36092825

ABSTRACT

Background: Limited descriptive data exist regarding the clinical characteristics of hospitalizations due to the severe acute respiratory syndrome coronavirus 2 Omicron variant based on vaccination status. Methods: This was a retrospective cohort study of all patients hospitalized with a diagnosis of coronavirus disease 2019 (COVID-19) between 15 January 2022 and 15 February 2022 across 9 hospitals in a large health network. Data were extracted by manual records review. Results: A total of 351 of 452 (77.7%) unvaccinated, 209 of 331 (63.1%) fully vaccinated, and 107 of 163 (65.6%) boosted patients hospitalized with a COVID-19 diagnosis were determined to be admitted specifically due to COVID-19 (P < .001). Most (85%) boosted patients admitted due to COVID-19 were at least 65 years old and/or had severe immunosuppression, compared to 72.2% of fully vaccinated and 60.7% of unvaccinated patients (P < .001). Significantly more unvaccinated patients (34.2%) required >6 L/minute of supplemental oxygen compared to fully vaccinated (24.4%) and boosted (25.2%) patients (P = .027). The age-adjusted vaccine effectiveness (VE) against hospitalization due to COVID-19 was estimated to be 81.1% and 94.1% for full vaccination and boosted status, respectively, whereas VE against mortality related to COVID-19 was estimated to be 84.7% and 94.8%, respectively. Conclusions: During the Omicron BA.1 sublineage wave, unvaccinated patients hospitalized with a COVID-19 diagnosis were more likely than vaccinated patients to be admitted specifically due to COVID-19. Despite being younger with fewer comorbidities, unvaccinated patients required higher levels of care. Vaccination with a booster provides the greatest protection against hospitalization and death from COVID-19.
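
Editor's note: the vaccine effectiveness (VE) figures reported above follow the standard relationship VE = 1 - risk ratio (vaccinated vs unvaccinated), with age adjustment applied in the study. The Python sketch below shows only the crude, unadjusted form of that calculation; the event counts and population denominators are placeholders, not data from this study.

    # Crude (unadjusted) vaccine-effectiveness sketch: VE = 1 - risk ratio.
    # The study reports age-adjusted VE; this ignores adjustment, and the
    # denominators below are placeholders, not data from the study.

    def crude_ve(events_vax: int, pop_vax: int, events_unvax: int, pop_unvax: int) -> float:
        """Return crude VE as 1 minus the risk ratio (vaccinated vs unvaccinated)."""
        risk_vax = events_vax / pop_vax
        risk_unvax = events_unvax / pop_unvax
        return 1.0 - risk_vax / risk_unvax

    if __name__ == "__main__":
        # Hypothetical example: 200 admissions among 500,000 vaccinated persons
        # vs 350 admissions among 200,000 unvaccinated persons.
        ve = crude_ve(events_vax=200, pop_vax=500_000, events_unvax=350, pop_unvax=200_000)
        print(f"Crude VE against hospitalization: {ve:.1%}")  # about 77.1%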

4.
Open Forum Infect Dis ; 9(2): ofab589, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35071682

ABSTRACT

BACKGROUND: Preliminary data suggest that the effectiveness of dalbavancin may be similar to that of current standard-of-care (SoC) treatment options for osteomyelitis, with an advantageous dosing schedule. METHODS: This was a retrospective, observational cohort study of adult patients diagnosed with osteomyelitis. Patients who received dalbavancin (administered as 2 doses separated by 1 week) were matched 1:2 with patients who received SoC treatment for osteomyelitis according to the Charlson Comorbidity Index, site of infection, and causative pathogen. The primary objective was to determine the incidence of treatment failure after a 1-year follow-up period. Secondary objectives included hospital length of stay (LOS), infection-related 1-year readmission rates, and treatment-related adverse events. RESULTS: A total of 132 patients received dalbavancin (n = 42) or SoC (n = 90). Baseline characteristics, including rates of surgical intervention, were similar between the 2 treatment groups. Treatment failure was similar between those who received dalbavancin and those who received SoC (21.4% vs 23.3%; P = .81). Patients who received dalbavancin had a shorter hospital LOS (5.2 days vs 7.2 days; P = .01). There was no difference in the rate of infection-related readmission between the dalbavancin and SoC groups (31% vs 31.1%; P = .99). There were numerically fewer adverse events in the dalbavancin group than in the SoC group (21.4% vs 36.7%; P = .08). Peripherally inserted central catheter-related complications were reported in 17.8% of patients in the SoC group. CONCLUSIONS: Dalbavancin administered as a 2-dose regimen is a safe and effective option for the treatment of osteomyelitis.
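
Editor's note: the study matched each dalbavancin patient 1:2 with SoC patients on Charlson Comorbidity Index, infection site, and causative pathogen, but the abstract does not describe the matching procedure itself. The Python sketch below shows one simple exact-matching approach under that assumption; the records, field names, and matching rule are made up for illustration.

    # Illustrative 1:2 exact-matching sketch on the three variables named in
    # the abstract (Charlson index, infection site, pathogen). Not the study's
    # actual procedure; the records below are invented.

    from collections import defaultdict

    def match_1_to_2(dalba_patients, soc_patients):
        """Pair each dalbavancin patient with up to 2 unused SoC patients
        sharing the same (Charlson index, site, pathogen) key."""
        pool = defaultdict(list)
        for p in soc_patients:
            pool[(p["cci"], p["site"], p["pathogen"])].append(p)
        matches = {}
        for d in dalba_patients:
            key = (d["cci"], d["site"], d["pathogen"])
            matches[d["id"]] = [pool[key].pop() for _ in range(min(2, len(pool[key])))]
        return matches

    if __name__ == "__main__":
        dalba = [{"id": "D1", "cci": 3, "site": "foot", "pathogen": "MSSA"}]
        soc = [{"id": f"S{i}", "cci": 3, "site": "foot", "pathogen": "MSSA"} for i in range(3)]
        print(match_1_to_2(dalba, soc))  # D1 matched to two of S0-S2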

5.
Clin Infect Dis ; 75(2): 269-277, 2022 08 25.
Article in English | MEDLINE | ID: mdl-34718456

ABSTRACT

BACKGROUND: Bloodstream infections (BSIs) are a leading cause of morbidity and mortality. The Improving Outcomes and Antimicrobial Stewardship study sought to evaluate the impact of the Accelerate PhenoTest BC Kit (AXDX) on antimicrobial use and clinical outcomes in BSIs. METHODS: This multicenter, quasi-experimental study compared clinical and antimicrobial stewardship metrics before and after implementation of AXDX to evaluate the impact of this technology on patients with BSIs. Laboratory and clinical data from hospitalized patients with BSIs (excluding contaminants) were compared between 2 arms, 1 that underwent testing on AXDX (post-AXDX) and 1 that underwent alternative organism identification and susceptibility testing (pre-AXDX). The primary outcomes were time to optimal therapy (TTOT) and 30-day mortality. RESULTS: A total of 854 patients with BSIs (435 pre-AXDX, 419 post-AXDX) were included. Median TTOT was 17.2 hours shorter in the post-AXDX arm (23.7 hours) than in the pre-AXDX arm (40.9 hours; P<.0001). Median time to first antimicrobial modification (pre-AXDX 24.2 vs post-AXDX 13.9 hours; P<.0001) and to first antimicrobial de-escalation (36.0 vs 27.2 hours; P=.0004) were also shorter in the post-AXDX arm. Mortality (8.7% pre-AXDX vs 6.0% post-AXDX), length of stay (7.0 days pre-AXDX vs 6.5 days post-AXDX), and adverse drug events were not significantly different between arms. Among patients with gram-negative bacteremia, length of stay was shorter in the post-AXDX arm (5.4 vs 6.4 days; P=.03). CONCLUSIONS: For BSIs, use of AXDX was associated with significant decreases in TTOT, time to first antimicrobial modification, and time to antimicrobial de-escalation.


Subject(s)
Anti-Infective Agents , Antimicrobial Stewardship , Bacteremia , Gram-Negative Bacterial Infections , Anti-Bacterial Agents/therapeutic use , Anti-Infective Agents/therapeutic use , Bacteremia/diagnosis , Bacteremia/drug therapy , Gram-Negative Bacterial Infections/drug therapy , Humans
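
Editor's note: the primary endpoint above, time to optimal therapy (TTOT), is an interval from an index event to the start of optimal therapy, summarized as a median per arm; the abstract does not state which index event was used (e.g., blood culture collection or positivity). The Python sketch below shows only the mechanics of that computation, with made-up timestamps and field names.

    # Sketch of a time-to-optimal-therapy (TTOT) summary: median hours from an
    # index event to optimal therapy, computed per study arm. Timestamps, field
    # names, and the choice of index event are illustrative assumptions.

    from datetime import datetime
    from statistics import median

    def ttot_hours(index_time: datetime, optimal_time: datetime) -> float:
        """Hours from the index event to the start of optimal therapy."""
        return (optimal_time - index_time).total_seconds() / 3600

    def median_ttot(records) -> float:
        return median(ttot_hours(r["index"], r["optimal"]) for r in records)

    if __name__ == "__main__":
        pre = [{"index": datetime(2020, 1, 1, 8), "optimal": datetime(2020, 1, 3, 1)}]
        post = [{"index": datetime(2020, 1, 1, 8), "optimal": datetime(2020, 1, 2, 8)}]
        print(median_ttot(pre), median_ttot(post))  # 41.0 vs 24.0 hours (made-up data)
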
6.
J Am Pharm Assoc (2003) ; 62(3): 706-710, 2022.
Article in English | MEDLINE | ID: mdl-34920955

ABSTRACT

BACKGROUND: Recent changes to vancomycin guidelines recommend area under the curve (AUC)-based monitoring in most patients, owing to similar effectiveness and reduced rates of acute kidney injury (AKI). OBJECTIVE: The purpose of this study was to assess the incidence of AKI in outpatients receiving vancomycin dosed by AUC-based goal troughs versus traditional trough goals (15-20 mcg/mL). METHODS: Patients were included if they received outpatient vancomycin for at least 1 week. The primary objective was to compare the incidence of AKI in patients receiving outpatient vancomycin with trough goals derived from patient-specific AUC calculations determined as an inpatient with that of patients receiving vancomycin dosed by traditional goal troughs. Secondary objectives included assessing the rate of treatment failure, the AUC-estimated trough range, and the number of regimen changes required. RESULTS: There were 65 patients in the traditional trough dosing group and 53 patients in the AUC trough dosing group. The incidence of AKI was lower in the AUC trough group (5.7% vs. 23.1%; P = 0.01). There was no difference in the incidence of treatment failure. The median AUC-estimated trough range was 11.4-17.1 mcg/mL. Significantly fewer regimen changes were required on average in the AUC dosing group (1.13 vs. 1.64; P = 0.006). CONCLUSION: There was a significantly lower incidence of AKI in patients receiving vancomycin dosed by individualized AUC-based trough ranges than in patients receiving traditional trough dosing. Developing a process for individualized AUC-based trough ranges can provide a convenient monitoring method that extends the benefits of vancomycin AUC dosing to the outpatient setting.


Subject(s)
Acute Kidney Injury , Vancomycin , Acute Kidney Injury/chemically induced , Acute Kidney Injury/epidemiology , Anti-Bacterial Agents/therapeutic use , Area Under Curve , Female , Goals , Humans , Male , Microbial Sensitivity Tests , Outpatients , Retrospective Studies , Vancomycin/adverse effects
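
Editor's note: translating an inpatient AUC calculation into an outpatient trough goal rests on standard one-compartment, intermittent-infusion pharmacokinetics: the 24-hour AUC equals total daily dose divided by clearance, and the expected steady-state trough follows from the elimination rate constant and volume of distribution. The Python sketch below uses those textbook relationships with made-up patient parameters; it is not the calculator or workflow used in the study.

    # Textbook one-compartment, intermittent-infusion relationships of the kind
    # used to translate a vancomycin AUC target into an expected trough.
    # Parameter values below are invented, not taken from the study.

    import math

    def auc24(dose_mg: float, interval_h: float, cl_l_per_h: float) -> float:
        """24-hour AUC (mg*h/L) = total daily dose / clearance."""
        daily_dose = dose_mg * (24 / interval_h)
        return daily_dose / cl_l_per_h

    def steady_state_trough(dose_mg, interval_h, infusion_h, ke_per_h, vd_l):
        """Predicted steady-state trough (mg/L) for an intermittent IV infusion."""
        cl = ke_per_h * vd_l
        cmax = (dose_mg / (infusion_h * cl)) * (1 - math.exp(-ke_per_h * infusion_h)) / (
            1 - math.exp(-ke_per_h * interval_h)
        )
        return cmax * math.exp(-ke_per_h * (interval_h - infusion_h))

    if __name__ == "__main__":
        # Hypothetical patient: 1000 mg q12h infused over 1 h, ke = 0.10 /h, Vd = 50 L.
        ke, vd = 0.10, 50.0
        print(round(auc24(1000, 12, ke * vd), 1), "mg*h/L AUC24")          # 400.0
        print(round(steady_state_trough(1000, 12, 1, ke, vd), 1), "mg/L")  # ~9.1
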
8.
Infection ; 49(3): 511-519, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33528813

ABSTRACT

PURPOSE: Gram-negative bacteria (GNB) are a leading cause of bloodstream infections (BSI) and management is complicated by antibiotic resistance. The Accelerate Pheno™ system (ACC) can provide rapid organism identification and antimicrobial susceptibility testing (AST). METHODS: A retrospective, pre-intervention/post-intervention study was conducted to compare management of non-critically ill patients with GNB BSI before and after implementation of a bundled initiative. This bundled initiative included dissemination of a clinical decision algorithm, ACC testing on all GNB isolated from blood cultures, real-time communication of results to the Antimicrobial Stewardship Program (ASP), and prospective audit with feedback by the ASP. The pre-intervention period was January 2018 through December 2018, and the post-intervention period was May 2019 through February 2020. RESULTS: Seventy-seven and 129 patients were included in the pre-intervention and post-intervention cohorts, respectively. When compared with the pre-intervention group, the time from Gram stain to AST decreased from 46.1 to 6.9 h (p < 0.001), and the time to definitive therapy (TTDT) improved from 32.6 to 10.5 h (p < 0.001). Implementation led to shorter median total duration of antibiotic therapy (14.2 vs 9.5 days; p < 0.001) and mean hospital length of stay (7.9 vs 5.3 days; p = 0.047) without an increase in 30-day readmissions (22.1% vs 14%; p = 0.13). CONCLUSION: Implementation of an ASP-bundled approach incorporating the ACC aimed at optimizing antibiotic therapy in the management of GNB BSI in non-critically ill patients led to reduced TTDT, shorter duration of antibiotic therapy, and shorter hospital length of stay without adversely affecting readmission rates.


Subject(s)
Antimicrobial Stewardship , Bacteremia , Gram-Negative Bacterial Infections , Anti-Bacterial Agents/therapeutic use , Bacteremia/diagnosis , Bacteremia/drug therapy , Gram-Negative Bacteria , Gram-Negative Bacterial Infections/drug therapy , Humans , Microbial Sensitivity Tests , Retrospective Studies
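
Editor's note: several of the before/after endpoints above (e.g., 30-day readmissions, 22.1% vs 14%) are comparisons of two proportions. The abstract does not state which statistical test was used, so the Python sketch below shows a generic two-proportion z-test; the counts are back-calculated approximations from the reported percentages and cohort sizes (77 and 129 patients) and are used only for illustration.

    # Generic two-proportion z-test, illustrating the kind of before/after
    # proportion comparison reported above. The study's actual test and
    # patient-level counts are not given in the abstract.

    import math

    def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int):
        """Return (z, two-sided p-value) for H0: p1 == p2, using a pooled estimate."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
        return z, p_value

    if __name__ == "__main__":
        # Approximate counts: ~17/77 readmitted pre-intervention vs ~18/129 post.
        z, p = two_proportion_z_test(17, 77, 18, 129)
        print(f"z = {z:.2f}, p = {p:.3f}")  # roughly z = 1.5, p = 0.13
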
9.
Open Forum Infect Dis ; 5(5): ofy089, 2018 May.
Article in English | MEDLINE | ID: mdl-30568987

ABSTRACT

BACKGROUND: Cefazolin and ceftriaxone are frequently used to treat methicillin-susceptible Staphylococcus aureus (MSSA) bacteremia, especially in the realm of outpatient parenteral antimicrobial therapy. Both antimicrobials have been associated with favorable clinical outcomes for mixed MSSA infections. However, limited published data exist specifically comparing the use of these agents for the treatment of MSSA bacteremia. METHODS: We conducted a retrospective cohort study of Veteran patients with MSSA bacteremia who received ≥14 days of cefazolin or ceftriaxone between 2009 and 2014. Rates of treatment failure were compared between the two groups. Treatment failure was defined as therapy extension, incomplete therapy, unplanned oral suppressive therapy, relapse of infection, or hospital admission or surgery within 90 days. RESULTS: Of 71 patients, 38 received cefazolin and 33 received ceftriaxone. The overall rate of treatment failure was 40.8%, with significantly more failures among patients receiving ceftriaxone (54.5% versus 28.9%; P = .029). Factors associated with treatment failure included longer duration of parenteral therapy, heart failure, and treatment in an external skilled nursing facility as compared with treatment in the Community Living Center attached to the Department of Veterans Affairs facility. CONCLUSIONS: Ceftriaxone had a higher rate of treatment failure than cefazolin for the treatment of MSSA bacteremia in a Veteran population. Potential reasons could include the higher protein binding of ceftriaxone, ultimately resulting in lower serum concentrations of free drug, or other unknown factors. Further studies are warranted to confirm these results.
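
Editor's note: the authors' protein-binding hypothesis rests on the simple relationship that free (unbound) concentration equals total concentration times the unbound fraction. The Python sketch below works that arithmetic with rough, illustrative binding fractions (literature values are typically cited around 85-95% for ceftriaxone, concentration-dependent, and around 75-85% for cefazolin); none of these numbers come from this study.

    # Free (unbound) drug concentration = total concentration x unbound fraction.
    # Binding fractions below are rough literature-style figures for illustration,
    # not values measured in this study.

    def free_concentration(total_mg_per_l: float, protein_bound_fraction: float) -> float:
        """Unbound concentration given total concentration and bound fraction."""
        return total_mg_per_l * (1.0 - protein_bound_fraction)

    if __name__ == "__main__":
        total = 100.0  # mg/L, arbitrary total concentration
        print("cefazolin   free:", free_concentration(total, 0.80), "mg/L")  # 20.0
        print("ceftriaxone free:", free_concentration(total, 0.92), "mg/L")  # 8.0

At the same total concentration, the more highly bound agent leaves substantially less free drug, which is the mechanism the authors propose as one possible explanation for the difference in failure rates.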
