1.
Clin Infect Dis ; 2024 Mar 14.
Article in English | MEDLINE | ID: mdl-38483930

ABSTRACT

BACKGROUND: There are no systematic measures of central line-associated bloodstream infections (CLABSIs) in patients maintaining central venous catheters (CVCs) outside acute care hospitals. To improve understanding of the burden of CLABSIs outside acute care hospitals, we characterized patients with CLABSI present on hospital admission (POA). METHODS: Retrospective cross-sectional analysis of patients with CLABSI-POA in three health systems covering eleven hospitals across Maryland, Washington DC, and Missouri from November 2020 to October 2021. CLABSI-POA was defined using an adaptation of the acute care CLABSI definition. Patient demographics, clinical characteristics, and outcomes were collected via chart review. Cox proportional hazards analysis was used to assess factors associated with all-cause mortality within 30 days. RESULTS: In total, 461 patients were identified as having CLABSI-POA. CVCs were most commonly maintained in home infusion therapy (32.8%) or oncology clinics (31.2%). Enterobacterales were the most common etiologic agents (29.2%). Recurrent CLABSIs occurred in a quarter of patients (25%). Eleven percent of patients died during the hospital admission. Among patients with CLABSI-POA, mortality risk increased with age (versus ages <20: ages 20-44 years: HR: 11.21, 95% CI: 1.46-86.22; ages 45-64: HR: 20.88, 95% CI: 2.84-153.58; at least 65 years of age: HR: 22.50, 95% CI: 2.98-169.93) and with lack of insurance (HR: 2.46; 95% CI: 1.08-5.59), and decreased with CVC removal (HR: 0.57, 95% CI: 0.39-0.84). CONCLUSION: CLABSI-POA is associated with significant in-hospital mortality. Surveillance is required to understand the burden of CLABSI in the community and to identify targets for CLABSI prevention initiatives outside acute care settings.
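As a hedged aside on the hazard-ratio comparisons reported above (this is not the study's code or data): under a constant-hazard assumption, a crude hazard ratio reduces to a ratio of event rates. The counts below are hypothetical, chosen only so the crude HR echoes the reported 2.46 for lack of insurance:

```python
# Illustration only: under a constant-hazard (exponential) assumption,
# the hazard ratio (HR) between two groups is the ratio of their event rates.
def event_rate(events, person_days):
    """Events per unit of person-time."""
    return events / person_days

def hazard_ratio(events_a, days_a, events_b, days_b):
    """Crude HR of group A vs. group B under constant hazards."""
    return event_rate(events_a, days_a) / event_rate(events_b, days_b)

# Hypothetical counts: 30-day deaths and person-days of follow-up,
# uninsured (A) vs. insured (B).
hr_no_insurance = hazard_ratio(12, 3000, 30, 18450)
```

The published estimate additionally adjusts for covariates via the Cox model, which this ratio-of-rates sketch does not attempt.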

2.
Article in English | MEDLINE | ID: mdl-38415083

ABSTRACT

Objective: To (1) understand the role of antibiotic-associated adverse events (ABX-AEs) on antibiotic decision-making, (2) understand clinician preferences for ABX-AE feedback, and (3) identify ABX-AEs of greatest clinical concern. Design: Focus groups. Setting: Academic medical center. Participants: Medical and surgical house staff, attending physicians, and advanced practice practitioners. Methods: Focus groups were conducted from May 2022 to December 2022. Participants discussed the role of ABX-AEs in antibiotic decision-making and feedback preferences and evaluated the prespecified categorization of ABX-AEs based on degree of clinical concern. Thematic analysis was conducted using inductive coding. Results: Four focus groups were conducted (n = 15). Six themes were identified. (1) ABX-AE risks during initial prescribing influence the antibiotic prescribed rather than the decision of whether to prescribe. (2) The occurrence of an ABX-AE leads to reassessment of the clinical indication for antibiotic therapy. (3) The impact of an ABX-AE on other management decisions is as important as the direct harm of the ABX-AE. (4) ABX-AEs may be overlooked because of limited feedback regarding the occurrence of ABX-AEs. (5) Clinicians are receptive to feedback regarding ABX-AEs but are concerned about it being punitive. (6) Feedback must be curated to prevent clinicians from being overwhelmed with data. Clinicians generally agreed with the prespecified categorizations of ABX-AEs by degree of clinical concern. Conclusions: The themes identified and assessment of ABX-AEs of greatest clinical concern may inform antibiotic stewardship initiatives that incorporate reporting of ABX-AEs as a strategy to reduce unnecessary antibiotic use.

3.
AIDS Behav ; 28(3): 886-897, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37789236

ABSTRACT

The COVID-19 pandemic has been reported to disrupt access to care for people living with HIV (PWH). The impact of the pandemic on the longitudinal HIV care continuum, however, has not been properly evaluated. We performed a mixed-methods study using data from the Mexican System of Distribution, Logistics, and ART Surveillance on PWH cared for in the state of Oaxaca. We evaluated the number of HIV diagnoses made in the state before and during the pandemic with an interrupted time series. We used the longitudinal HIV care continuum framework to describe the stages of HIV care before and during the pandemic. Finally, we performed a qualitative analysis to identify the challenges that staff and users faced in HIV care during the pandemic. New HIV diagnoses were lower during the first year of the pandemic than during the year immediately before. Among 2682 PWH with enough information to determine their status of care, 728 started receiving care during the COVID-19 pandemic and 1954 before it. PWH engaged before the pandemic spent 42825 months (58.2% of follow-up) in optimal HIV control, compared with 3061 months (56.1% of follow-up) for those engaged in care during the pandemic. Staff and users reported decreases in the frequency of appointments, prioritization of users in poor health, larger disbursements of ART medication, and novel communication strategies with PWH. Despite challenges caused by government cutbacks, the changes implemented by staff, including greater flexibility in ART delivery and individualized attention, helped maintain HIV care.
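The interrupted time series mentioned above can be sketched as a segmented regression: counts = b0 + b1*t + b2*post, where b2 is the level change at the interruption. This is a minimal, self-contained illustration with synthetic, noise-free monthly counts (not the Oaxaca data):

```python
# Segmented-regression sketch of an interrupted time series. The data are
# synthetic and noise-free (not the study's), so the fit recovers the
# coefficients exactly: baseline 100, trend +2/month, level drop of 30.
def solve(A, rhs):
    """Gauss-Jordan elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

t = list(range(24))                        # 24 months of counts
post = [1 if m >= 12 else 0 for m in t]    # indicator for pandemic months
y = [100 + 2 * m - 30 * p for m, p in zip(t, post)]  # synthetic diagnoses

# Ordinary least squares via the normal equations (X'X) b = X'y.
X = [[1.0, float(m), float(p)] for m, p in zip(t, post)]
XtX = [[sum(X[i][a] * X[i][j] for i in range(24)) for j in range(3)]
       for a in range(3)]
Xty = [sum(X[i][a] * y[i] for i in range(24)) for a in range(3)]
b0, b1, b2 = solve(XtX, Xty)               # b2: level change at interruption
```

A real analysis would add a post-interruption trend term and model counts with appropriate error structure; this sketch only shows the level-change idea.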


Subject(s)
COVID-19 , HIV Infections , Humans , COVID-19/epidemiology , Mexico/epidemiology , Pandemics , HIV Infections/drug therapy , HIV Infections/epidemiology , Continuity of Patient Care
4.
PLoS Negl Trop Dis ; 17(12): e0011498, 2023 Dec.
Article in English | MEDLINE | ID: mdl-38157376

ABSTRACT

BACKGROUND: Chagas disease, caused by the parasite Trypanosoma cruzi, is a neglected infectious disease that exerts the highest public health burden in the Americas. Two anti-parasitic drugs are approved for its treatment, benznidazole and nifurtimox, but the absence of biomarkers for early assessment of treatment efficacy hinders patients' follow-up. METHODOLOGY/PRINCIPAL FINDINGS: We conducted a longitudinal, observational study in a cohort of 106 chronically T. cruzi-infected patients in Cochabamba (Bolivia) who completed the recommended treatment of benznidazole. Participants were followed up for five years, during which we collected clinical and serological data, including yearly electrocardiograms and optical density readouts from two ELISAs (total and recombinant antigens). Descriptive and statistical analyses were performed to understand trends in the data, as well as the relationship between clinical symptoms and serological evolution after treatment. Both ELISAs documented average declines up to year three and slight increases over the following two years. The recorded clinical parameters indicated that most patients did not have any significant changes in their cardiac or digestive symptoms after treatment, at least in the timeframe under investigation, while a small percentage demonstrated either a regression or progression of symptoms. Only one participant met the "cure criterion" of a negative serological readout on both ELISAs by the final year. CONCLUSIONS/SIGNIFICANCE: The study confirms that follow-up of benznidazole-treated T. cruzi-infected patients should extend beyond five years to determine, with current tools, whether they are cured. In terms of serological evolution, the total antigen ELISA alone may be a more reliable measure and may suffice to assess infection status, at least in the region of Bolivia where the study was done. Additional work is needed to develop a test of cure for early assessment of drug efficacy, with the aim of improving case management protocols.


Subject(s)
Chagas Disease , Nitroimidazoles , Trypanocidal Agents , Trypanosoma cruzi , Humans , Bolivia , Chagas Disease/parasitology , Nitroimidazoles/therapeutic use , Trypanocidal Agents/therapeutic use , Chronic Disease
5.
Open Forum Infect Dis ; 10(6): ofad264, 2023 Jun.
Article in English | MEDLINE | ID: mdl-37383251

ABSTRACT

Background: The burden of vancomycin-associated acute kidney injury (V-AKI) is unclear because it is not systematically monitored. The objective of this study was to develop and validate an electronic algorithm to identify cases of V-AKI and to determine its incidence. Methods: Adults and children admitted to 1 of 5 health system hospitals from January 2018 to December 2019 who received at least 1 dose of intravenous (IV) vancomycin were included. A subset of charts was reviewed using a V-AKI assessment framework to classify cases as unlikely, possible, or probable events. Based on review, an electronic algorithm was developed and then validated using another subset of charts. Percentage agreement and kappa coefficients were calculated. Sensitivity and specificity were determined at various cutoffs, using chart review as the reference standard. For courses ≥48 hours, the incidence of possible or probable V-AKI events was assessed. Results: The algorithm was developed using 494 cases and validated using 200 cases. The percentage agreement between the electronic algorithm and chart review was 92.5% and the weighted kappa was 0.95. The electronic algorithm was 89.7% sensitive and 98.2% specific in detecting possible or probable V-AKI events. For the 11,073 courses of ≥48 hours of vancomycin among 8,963 patients, the incidence of possible or probable V-AKI events was 14.0%; the V-AKI incidence rate was 22.8 per 1000 days of IV vancomycin therapy. Conclusions: An electronic algorithm demonstrated substantial agreement with chart review and had excellent sensitivity and specificity in detecting possible or probable V-AKI events. The electronic algorithm may be useful for informing future interventions to reduce V-AKI.
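For illustration, the validation statistics reported above (agreement, kappa, sensitivity, specificity, and a rate per 1,000 therapy-days) can all be derived from a 2x2 table of algorithm results against chart review. The counts below are hypothetical, chosen only so sensitivity and specificity land near the reported values:

```python
# Hypothetical 2x2 validation counts (algorithm vs. chart review), not the
# study's data. Rows: chart-review truth; columns: algorithm call.
tp, fp, fn, tn = 87, 2, 10, 109
n = tp + fp + fn + tn

agreement = (tp + tn) / n                       # raw percentage agreement
pe = ((tp + fn) / n) * ((tp + fp) / n) \
   + ((fp + tn) / n) * ((fn + tn) / n)          # agreement expected by chance
kappa = (agreement - pe) / (1 - pe)             # Cohen's (unweighted) kappa

sensitivity = tp / (tp + fn)                    # ~0.897
specificity = tn / (tn + fp)                    # ~0.982

# Incidence rate per 1,000 days of IV therapy (hypothetical event/day counts).
rate_per_1000 = 1000 * 1551 / 68000
```

Note the paper reports a weighted kappa over three ordered categories (unlikely/possible/probable); the unweighted two-category version here only shows the mechanics.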

6.
Article in English | MEDLINE | ID: mdl-37113198

ABSTRACT

Objectives: Access to patient information may affect how home-infusion surveillance staff identify central-line-associated bloodstream infections (CLABSIs). We characterized information hazards in home-infusion CLABSI surveillance and identified possible strategies to mitigate information hazards. Design: Qualitative study using semistructured interviews. Setting and participants: The study included 21 clinical staff members involved in CLABSI surveillance at 5 large home-infusion agencies covering 13 states and the District of Columbia. Methods: Interviews were conducted by 1 researcher. Transcripts were coded by 2 researchers; consensus was reached by discussion. Results: Data revealed the following barriers: information overload, information underload, information scatter, information conflict, and erroneous information. Respondents identified 5 strategies to mitigate information chaos: (1) engage information technology in developing reports; (2) develop streamlined processes for acquiring and sharing data among staff; (3) enable staff access to hospital electronic health records; (4) use a single, validated, home-infusion CLABSI surveillance definition; and (5) develop relationships between home-infusion surveillance staff and inpatient healthcare workers. Conclusions: Information chaos occurs in home-infusion CLABSI surveillance and may affect the development of accurate CLABSI rates in home-infusion therapy. Implementing strategies to minimize information chaos will enhance intra- and interteam collaborations in addition to improving patient-related outcomes.

7.
Infect Control Hosp Epidemiol ; 44(8): 1358-1360, 2023 08.
Article in English | MEDLINE | ID: mdl-37114417

ABSTRACT

Exposure investigations are labor intensive and vulnerable to recall bias. We developed an algorithm to identify healthcare personnel (HCP) interactions from the electronic health record (EHR), and we evaluated its accuracy against conventional exposure investigations. The EHR algorithm identified every known transmission and used ranking to produce a manageable contact list.


Subject(s)
Electronic Health Records , Health Personnel , Humans , Attitude of Health Personnel
8.
Infect Control Hosp Epidemiol ; 44(11): 1748-1759, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37078467

ABSTRACT

OBJECTIVE: Central-line-associated bloodstream infection (CLABSI) surveillance in home infusion therapy is necessary to track efforts to reduce infections, but a standardized, validated, and feasible definition is lacking. We tested the validity of a home-infusion CLABSI surveillance definition and the feasibility and acceptability of its implementation. DESIGN: Mixed-methods study including validation of CLABSI cases and semistructured interviews with staff applying these approaches. SETTING: This study was conducted in 5 large home-infusion agencies in a CLABSI prevention collaborative across 14 states and the District of Columbia. PARTICIPANTS: Staff performing home-infusion CLABSI surveillance. METHODS: From May 2021 to May 2022, agencies implemented a home-infusion CLABSI surveillance definition, using 3 approaches to secondary bloodstream infections (BSIs): National Healthcare Safety Network (NHSN) criteria, modified NHSN criteria (applying only the 4 most common NHSN-defined secondary BSIs), and all home-infusion-onset bacteremia (HiOB). Data on all positive blood cultures were sent to an infection preventionist for validation. Surveillance staff underwent semistructured interviews focused on their perceptions of the definition at 1 month and at 3-4 months after implementation. RESULTS: Interrater reliability scores overall ranged from κ = 0.65 for the modified NHSN criteria to κ = 0.68 for the NHSN criteria to κ = 0.72 for the HiOB criteria. For the NHSN criteria, the agency-determined rate was 0.21 per 1,000 central-line (CL) days, and the validator-determined rate was 0.20 per 1,000 CL days. Overall, implementing a standardized definition was thought to be a positive change that would be generalizable and feasible, though time-consuming and labor intensive. CONCLUSIONS: The home-infusion CLABSI surveillance definition was valid and feasible to implement.


Subject(s)
Bacteremia , Catheter-Related Infections , Catheterization, Central Venous , Cross Infection , Sepsis , Humans , Cross Infection/epidemiology , Catheter-Related Infections/diagnosis , Catheter-Related Infections/epidemiology , Catheter-Related Infections/prevention & control , Reproducibility of Results , Sepsis/epidemiology , Bacteremia/diagnosis , Bacteremia/epidemiology , Bacteremia/prevention & control , Catheterization, Central Venous/adverse effects
9.
Am J Infect Control ; 51(5): 594-596, 2023 05.
Article in English | MEDLINE | ID: mdl-36642577

ABSTRACT

Infection prevention and surveillance training approaches for home infusion therapy have not been well defined. We interviewed home infusion staff who perform surveillance activities about barriers to and facilitators for central line-associated bloodstream infection (CLABSI) surveillance and identified barriers to training in CLABSI surveillance. Our findings show a lack of formal surveillance training for staff. This gap can be addressed by adapting existing training resources to the home infusion setting.


Subject(s)
Catheter-Related Infections , Catheterization, Central Venous , Cross Infection , Home Infusion Therapy , Humans , Catheter-Related Infections/prevention & control , Cross Infection/prevention & control
10.
Pathogens ; 11(12)2022 Dec 16.
Article in English | MEDLINE | ID: mdl-36558885

ABSTRACT

The elderly are understudied despite their high risk of tuberculosis (TB). We sought to identify factors underlying the lack of an association between TB and type 2 diabetes (T2D) in the elderly, in contrast to the association observed in younger adults. We conducted a case-control study in elderly individuals (≥65 years old; ELD) vs. younger adults (young and middle-aged adults, 18-44 and 45-64 years old; YA|MAA), stratified by TB and T2D, using a research study population (n = 1160) and TB surveillance data (n = 8783). In the research study population, the adjusted odds ratio (AOR) of TB in T2D was highest in young adults (AOR 6.48) but waned with age, becoming non-significant in the elderly. Findings were validated using TB surveillance data. T2D in the elderly (vs. T2D in younger individuals) was characterized by better glucose control (e.g., lower hyperglycemia or HbA1c), lower insulin resistance, more sulphonylurea use, and features of less inflammation (e.g., lower obesity, neutrophils, platelets, anti-inflammatory use). We posit that differences in glucose dysregulation and inflammation between elderly and younger adults with T2D contribute to their differential association with TB. Studies in the elderly provide valuable insights into TB-T2D pathogenesis; here, for example, we identified insulin resistance as a novel candidate mechanism by which T2D may increase active TB risk.
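As a hedged aside on the odds-ratio machinery used above: an unadjusted OR with a Wald 95% confidence interval can be computed from a 2x2 table as below. The counts are purely hypothetical, and the study's AORs come from multivariable models that this sketch does not reproduce:

```python
# Unadjusted odds ratio with a Wald 95% CI from a 2x2 table.
# Table layout: a = exposed cases, b = exposed controls,
#               c = unexposed cases, d = unexposed controls.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Return (OR, lower, upper) using the log-OR standard error."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, or_ * math.exp(-z * se), or_ * math.exp(z * se)

# Hypothetical counts: TB cases/controls by T2D status in one age stratum.
or_, lo, hi = odds_ratio_ci(40, 10, 60, 90)
```

A CI that excludes 1 (as here) corresponds to a statistically significant association at the 5% level; adjustment for confounders typically shifts the estimate.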

11.
Sci Rep ; 12(1): 16769, 2022 10 06.
Article in English | MEDLINE | ID: mdl-36202891

ABSTRACT

A large share of the terrestrial land surface is used for livestock grazing. Trees on grazing lands provide and can enhance multiple ecosystem services, including provisioning, cultural, and regulating services such as carbon sequestration. In this study, we assessed the above- and belowground carbon stocks across six different land uses in livestock-dominated landscapes of Mexico. We measured tree biomass and soil organic carbon (SOC) stocks in fodder banks, live fences, pasturelands with dispersed trees, secondary forests, and primary forests from three different geographical regions and compared each with conventional open pasturelands. We also calculated tree diversity indices for each land use and their similarity with native primary forests. Aboveground woody biomass stocks differed significantly between land uses and followed a gradient from less diverse conventional open pasturelands to silvopastoral systems and ecologically complex primary forests. SOC stocks showed a differential response to the land-use gradient depending on the study region. Multivariate analyses showed that woody biomass, fine root biomass, and SOC concentrations were positively related, while land-use history and soil bulk density showed an inverse relationship to these variables. Silvopastoral systems and forest remnants stored 27-163% more carbon than open pasturelands. Our results demonstrate the importance of promoting appropriate silvopastoral systems and conserving forest remnants within livestock-dominated landscapes as a land-based carbon mitigation strategy. Furthermore, our findings have important implications for managing livestock-dominated landscapes to minimize pressures on natural protected areas and biodiversity in hotspots of deforestation for grassland expansion.


Subject(s)
Carbon , Ecosystem , Animals , Biomass , Carbon/analysis , Carbon Sequestration , Forests , Livestock , Mexico , Soil , Trees
12.
Jt Comm J Qual Patient Saf ; 48(9): 468-474, 2022 09.
Article in English | MEDLINE | ID: mdl-35850954

ABSTRACT

BACKGROUND: Patients discharged to the home on home-based outpatient parenteral antimicrobial therapy (OPAT) perform their own infusions and catheter care; thus, they require high-quality training to improve safety and the likelihood of treatment success. This article describes the study team's experience piloting an educational toolkit for patients on home-based OPAT. METHODS: An OPAT toolkit was developed to address barriers such as unclear communication channels, rushed instruction, safe bathing with an intravenous (IV) catheter, and lack of standardized instructions. The research team evaluated the toolkit through interviews with home infusion nurses implementing the intervention, surveys of 20 patients who received the intervention, and five observations of the home infusion nurses delivering the intervention to patients and caregivers. RESULTS: Of surveyed patients, 90.0% were comfortable infusing medications at the time of discharge, and 80.0% were comfortable bathing with the IV catheter. While all practiced on the equipment, 75.0% used the videos and the paper checklists. Almost all (95.0%) were satisfied with their training, and all were satisfied with managing their IV catheters at home. The videos were considered very helpful, particularly as a reference. Overall, nurses adjusted the training to patient characteristics and modified the toolkit over time. Shorter instruction forms were more helpful than longer ones. CONCLUSION: Developing a toolkit to improve the education of patients on home-based OPAT has the potential to improve the safety of and experience with home-based OPAT.


Subject(s)
Anti-Infective Agents , Outpatients , Ambulatory Care , Anti-Bacterial Agents , Humans , Infusions, Parenteral , Patient Discharge
13.
Proc Natl Acad Sci U S A ; 119(15): e2119959119, 2022 04 12.
Article in English | MEDLINE | ID: mdl-35377782

ABSTRACT

Biodiversity-mediated ecosystem services (ES) support human well-being, but their values are typically estimated individually. Although ES are part of complex socioecological systems, we know surprisingly little about how multiple ES interact ecologically and economically. Interactions could be positive (synergy), negative (trade-offs), or absent (additive effects), with strong implications for management and valuation. Here, we evaluate the interactions of two ES, pollination and pest control, via a factorial field experiment on 30 Costa Rican coffee farms. We found synergistic interactions between these two ES, both critical to crop production. The combined positive effects of birds and bees on fruit set, fruit weight, and fruit weight uniformity were greater than their individual effects. This represents experimental evidence, at realistic farm scales, of positive interactions among ES in agricultural systems. These synergies suggest that assessments of individual ES may underestimate the benefits biodiversity provides to agriculture and human well-being. Using our experimental results, we demonstrate that bird pest control and bee pollination services translate directly into monetary benefits for coffee farmers. Excluding both birds and bees resulted in an average yield reduction of 24.7% (equivalent to losing US$1,066.00/ha). These findings highlight that habitat enhancements to support native biodiversity can have multiple benefits for coffee, a valuable crop that supports rural livelihoods worldwide. Accounting for potential interactions among ES is essential to quantifying their combined ecological and economic value.
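The reported loss figures imply a baseline crop value, which can be checked with simple arithmetic (assuming, as a simplification, that the dollar loss scales linearly with the yield reduction):

```python
# Back-of-envelope check of the reported figures (assumption: dollar loss
# scales linearly with yield loss). A 24.7% yield reduction valued at
# US$1,066/ha implies a baseline per-hectare crop value of about US$4,316.
loss_fraction = 0.247
loss_usd_per_ha = 1066.00
baseline_value = loss_usd_per_ha / loss_fraction  # ≈ 4315.8 USD/ha
```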


Subject(s)
Coffee , Crop Production , Pest Control , Pollination , Biodiversity
14.
Am J Infect Control ; 50(5): 555-562, 2022 05.
Article in English | MEDLINE | ID: mdl-35341660

ABSTRACT

BACKGROUND: Barriers to central line-associated bloodstream infection (CLABSI) surveillance in home infusion therapy have not been elucidated; identifying them is needed to determine how to support home infusion CLABSI surveillance. We aimed to (1) perform a goal-directed task analysis of home infusion CLABSI surveillance and (2) describe barriers to, facilitators of, and suggested strategies for successful home infusion CLABSI surveillance. METHODS: We conducted semi-structured interviews with team members involved in CLABSI surveillance at 5 large home infusion agencies to explore the work systems used by members of the agency for home infusion CLABSI surveillance. We analyzed the transcribed interviews qualitatively for themes. RESULTS: Twenty-one interviews revealed 8 steps for performing CLABSI surveillance in home infusion therapy. Major barriers identified included the need for training of surveillance staff, the lack of a standardized definition, inadequate information technology support, struggles communicating with hospitals, inadequate time, and insufficient clinician engagement and leadership support. DISCUSSION: Staff performing home infusion CLABSI surveillance need health system resources, particularly leadership and front-line engagement, access to data, information technology support, training, dedicated time, and reports, to perform their tasks. CONCLUSIONS: Building home infusion CLABSI surveillance programs will require support from home infusion leadership.


Subject(s)
Catheter-Related Infections , Catheterization, Central Venous , Cross Infection , Home Infusion Therapy , Sepsis , Catheter-Related Infections/epidemiology , Catheter-Related Infections/prevention & control , Humans , Leadership
15.
Ecol Lett ; 25(3): 581-597, 2022 Mar.
Article in English | MEDLINE | ID: mdl-35199922

ABSTRACT

Functional traits offer a rich quantitative framework for developing and testing theories in evolutionary biology, ecology and ecosystem science. However, the potential of functional traits to drive theoretical advances and refine models of global change can only be fully realised when species-level information is complete. Here we present the AVONET dataset containing comprehensive functional trait data for all birds, including six ecological variables, 11 continuous morphological traits, and information on range size and location. Raw morphological measurements are presented from 90,020 individuals of 11,009 extant bird species sampled from 181 countries. These data are also summarised as species averages in three taxonomic formats, allowing integration with a global phylogeny, geographical range maps, IUCN Red List data and the eBird citizen science database. The AVONET dataset provides the most detailed picture of continuous trait variation for any major radiation of organisms, offering a global template for testing hypotheses and exploring the evolutionary origins, structure and functioning of biodiversity.


Subject(s)
Birds , Ecosystem , Animals , Biodiversity , Biological Evolution , Humans , Phylogeny
16.
J Environ Manage ; 310: 114717, 2022 May 15.
Article in English | MEDLINE | ID: mdl-35217445

ABSTRACT

Degradation, fragmentation, and loss of tropical forests have increased exponentially in recent decades, leading to unprecedented rates of species extinction and loss of ecosystem functions and services. Forest restoration is key to recovering ecosystem health and achieving the UN Sustainable Development Goals. However, restoring forests at the landscape scale presents many challenges, since it requires balancing conservation goals and economic development. In this study, we used a spatial planning tool (Marxan) to identify priority areas for restoration satisfying multiple objectives across a biological corridor in Costa Rica. Biological corridors are critical conservation instruments that promote forest connectivity while acknowledging human presence. Increasing forest connectivity requires restoration initiatives that will likely conflict with other land uses, some of high national economic importance. Our restoration plan sought to maximize the provision of forest-related services (i.e., seed dispersal, tourism, and carbon storage) while minimizing the impact on current land uses and thus avoiding potential conflicts. We quantified seed dispersal and tourism services (birdwatching potential) using species distribution models. We used the carbon sequestration model of InVEST to quantify carbon storage potential. We tested different restoration scenarios that differed in whether, and how, land opportunity costs of current uses were considered when identifying potential restoration areas. We show how a landscape-scale forest restoration plan that accounts only for forest connectivity and ecosystem service provision can differ greatly from one that considers the potential impacts on local livelihoods. Spatial planning tools can assist in designing cost-effective landscape-scale forest restoration plans, identifying priority areas where restoration can maximize ecosystem service provision and increase forest connectivity. Special care must be taken to use adequate estimates of opportunity cost, to avoid potential conflicts between restoration goals and other legitimate land uses.


Subject(s)
Ecosystem , Sustainable Development , Biodiversity , Carbon Sequestration , Conservation of Natural Resources , Costa Rica , Forests , Humans
17.
J Am Geriatr Soc ; 70(3): 659-668, 2022 03.
Article in English | MEDLINE | ID: mdl-35038344

ABSTRACT

BACKGROUND: SARS-CoV-2 circulating variants coupled with waning immunity pose a significant threat to the long-term care (LTC) population. Our objective was to measure salivary IgG antibodies in residents and staff of an LTC facility to (1) evaluate the IgG response in saliva after natural infection and vaccination and (2) assess the feasibility of saliva sampling for describing seroprevalence over time. METHODS: We performed salivary IgG sampling of all residents and staff who agreed to test in a 150-bed skilled nursing facility during three seroprevalence surveys between October 2020 and February 2021. The facility had SARS-CoV-2 outbreaks in May 2020 and November 2020, when 45 of 138 and 37 of 125 residents were infected, respectively, and hosted two federal vaccine clinics in January 2021. We evaluated quantitative IgG in saliva to the nucleocapsid (N), spike (S), and receptor-binding domain (RBD) antigens of SARS-CoV-2 over time post-infection and post-vaccination. RESULTS: One hundred twenty-four residents and 28 staff underwent saliva serologic testing on one or more survey visits. Over the three surveys, SARS-CoV-2 seroprevalence at the facility was 49%, 64%, and 81%, respectively. IgG to the S, RBD, and N antigens all increased post-infection. Post-vaccination, the infection-naïve group did not have a detectable N IgG level, and N IgG levels for the previously infected did not increase post-vaccination (p < 0.001). Fully vaccinated subjects with prior COVID-19 infection had significantly higher RBD and S IgG responses than those who were infection-naïve prior to vaccination (p < 0.001 for both). CONCLUSIONS: Positive SARS-CoV-2 IgG in saliva was concordant with prior infection (anti-N, -S, -RBD) and vaccination (anti-S, -RBD) and remained above the positivity threshold for up to 9 months after infection. Salivary sampling is a noninvasive method of tracking immunity and differentiating between prior infection and vaccination, which can inform the need for boosters in LTC residents and staff.


Subject(s)
Antibodies, Viral/immunology , COVID-19 Vaccines/immunology , COVID-19/immunology , COVID-19/prevention & control , Immunoglobulin G/immunology , Saliva/immunology , Aged , COVID-19/epidemiology , COVID-19 Vaccines/administration & dosage , Female , Humans , Male , Nursing Homes , SARS-CoV-2 , Seroepidemiologic Studies , United States/epidemiology
18.
Infect Control Hosp Epidemiol ; 43(4): 474-480, 2022 04.
Article in English | MEDLINE | ID: mdl-33823950

ABSTRACT

BACKGROUND: Physical distancing among healthcare workers (HCWs) is an essential strategy in preventing HCW-to-HCW transmission of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). OBJECTIVE: To understand barriers to physical distancing among HCWs on an inpatient unit and identify strategies for improvement. DESIGN: Qualitative study including observations and semistructured interviews conducted over 3 months. SETTING: A non-COVID-19 adult general medical unit in an academic tertiary-care hospital. PARTICIPANTS: HCWs based on the unit. METHODS: We (1) observed HCW activities and proximity to each other on the unit during weekday shifts from July to October 2020 and (2) conducted semistructured interviews of HCWs to understand their experiences with and perspectives on physical distancing in the hospital. Qualitative data were coded based on a human-factors engineering model. RESULTS: We completed 25 hours of observations and 20 HCW interviews. High-risk interactions often occurred during handoffs of care at shift changes and during patient rounds, when HCWs gathered regularly in close proximity for at least 15 minutes. Identified barriers included the spacing and availability of computers, the need to communicate confidential patient information, and the desire to maintain relationships at work. CONCLUSIONS: Physical distancing can be improved in hospitals by restructuring computer workstations, work rooms, and break rooms; applying visible cognitive aids; adapting shift times; and supporting rounds and meetings with virtual conferencing. Additional strategies to promote staff adherence to physical distancing include rewarding positive behaviors, having peer leaders model physical distancing, and encouraging additional avenues for social connection at a safe distance.


Subject(s)
COVID-19, Pandemics, Adult, COVID-19/prevention & control, Health Personnel, Hospital Units, Humans, Pandemics/prevention & control, Physical Distancing, SARS-CoV-2
19.
Infect Control Hosp Epidemiol ; 43(12): 1790-1795, 2022 12.
Article in English | MEDLINE | ID: mdl-34903308

ABSTRACT

BACKGROUND: Healthcare workers (HCWs) not adhering to physical distancing recommendations is a risk factor for acquisition of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The study objective was to assess the impact of interventions to improve HCW physical distancing on actual distance between HCWs in a real-life setting. METHODS: HCWs voluntarily wore proximity beacons to measure the number and intensity of physical distancing interactions between each other in a pediatric intensive care unit. We compared interactions before and after implementing a bundle of interventions including changes to the layout of workstations, cognitive aids, and individual feedback from wearable proximity beacons. RESULTS: Overall, we recorded 10,788 interactions within 6 feet (∼2 m) and lasting >5 seconds. The number of HCWs wearing beacons fluctuated daily and increased over the study period. On average, 13 beacons were worn daily (32% of possible staff; range, 2-32 per day). We recorded 3,218 interactions before the interventions and 7,570 interactions after the interventions began. Using regression analysis accounting for the maximum number of potential interactions if all staff had worn beacons on a given day, there was a 1% decline in the number of interactions per possible interaction in the postintervention period (incidence rate ratio, 0.99; 95% confidence interval, 0.98-1.00; P = .02), with fewer interactions occurring at nursing stations, in workrooms, and during morning rounds. CONCLUSIONS: Using quantitative data from wearable proximity beacons, we found an overall small decline in interactions within 6 feet between HCWs in a busy intensive care unit after a multifaceted bundle of interventions was implemented to improve physical distancing.


Subject(s)
COVID-19, SARS-CoV-2, Child, Humans, Physical Distancing, COVID-19/prevention & control, Health Personnel, Intensive Care Units, Pediatric
20.
JAMIA Open ; 4(4): ooab095, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34926997

ABSTRACT

OBJECTIVE: Despite the importance of physical distancing in reducing SARS-CoV-2 transmission, this practice is challenging in healthcare. We piloted the use of wearable proximity beacons among healthcare workers (HCWs) in an inpatient unit to highlight considerations for future use of trackable technologies in healthcare settings. MATERIALS AND METHODS: We performed a feasibility pilot study in a non-COVID adult medical unit from September 28 to October 28, 2020. HCWs wore proximity beacons, and interactions, defined as being within 6 feet for ≥5 seconds, were recorded. Validation was performed using direct observations. RESULTS: A total of 6,172 close-proximity interactions were recorded; after removal of 2,033 false-positive interactions, 4,139 remained. The highest proportion of interactions occurred between 7:00 AM and 9:00 AM. Direct observations of HCWs substantiated these findings. DISCUSSION: This pilot study showed that wearable beacons can be used to monitor and quantify HCW interactions in inpatient settings. CONCLUSION: Technology can be used to track HCW physical distancing.
