1.
Vaccine ; 42(3): 535-540, 2024 Jan 25.
Article in English | MEDLINE | ID: mdl-38199921

ABSTRACT

MVA-BN is an orthopoxvirus vaccine that provides protection against both smallpox and mpox. In June 2022, Canada launched a publicly funded vaccination campaign to offer MVA-BN to at-risk populations, including men who have sex with men (MSM) and sex workers. The safety of MVA-BN has not been assessed in this context. To address this, the Canadian National Vaccine Safety Network (CANVAS) conducted prospective safety surveillance during public health vaccination campaigns in Toronto, Ontario and Vancouver, British Columbia. Vaccinated participants received a survey 7 and 30 days after each MVA-BN dose to elicit adverse health events. Unvaccinated individuals from a concurrent project evaluating COVID-19 vaccine safety were used as controls. Vaccinated and unvaccinated participants who reported a medically attended visit on their 7-day survey were interviewed. Vaccinated participants and unvaccinated controls were matched 1:1 on age group, gender, sex and provincial study site. Overall, 1,173 vaccinated participants completed a 7-day survey, of whom 75% (n = 878) also completed a 30-day survey. Mild to moderate injection site pain was reported by 60% of vaccinated participants. Among vaccinated participants, 8.4% were HIV-positive; compared with HIV-negative vaccinated individuals, local injection site reactions were less frequent in those with HIV (48% vs 61%, p = 0.021), but health events preventing work/school attendance or requiring medical assessment were more frequent (7.1% vs 3.1%, p = 0.040). Health events interfering with work/school or requiring medical assessment were less common in the vaccinated group than in controls (3.3% vs 7.1%, p < 0.010). No participants were hospitalized within 7 or 30 days of vaccination, and no cases of severe neurological disease, skin disease or myocarditis were identified. These results indicate that MVA-BN appears safe when used for mpox prevention, with a low frequency of severe adverse events and no hospitalizations observed.


Subject(s)
HIV Infections , Mpox (monkeypox) , Sexual and Gender Minorities , Smallpox Vaccine , Humans , Male , British Columbia , Homosexuality, Male , Immunization , Prospective Studies , Risk Factors , Smallpox Vaccine/adverse effects , Vaccination/adverse effects , Vaccines, Attenuated
2.
Nat Commun ; 15(1): 63, 2024 01 02.
Article in English | MEDLINE | ID: mdl-38167404

ABSTRACT

Avapritinib is the only potent and selective inhibitor approved for the treatment of gastrointestinal stromal tumors (GIST) carrying the D842V mutation, the most common primary mutation of the platelet-derived growth factor receptor α (PDGFRA). The approval was based on the NAVIGATOR trial, which revealed overall response rates of more than 90%. Despite this transformational activity, patients eventually progress, mostly due to acquired resistance mutations or following discontinuation due to neurocognitive side effects. These patients have no therapeutic alternative and face a dismal prognosis. Notably, little is known about this drug's binding mode and its medicinal chemistry development, which would be instrumental for the development of the next generation of drugs. Against this background, we solve the crystal structures of avapritinib in complex with wild-type and mutant PDGFRA and stem cell factor receptor (KIT), which provide evidence and understanding of inhibitor binding and lead to the identification of a sub-pocket (Gα-pocket). We utilize this information to design, synthesize and characterize avapritinib derivatives to determine the key pharmacophoric features needed to overcome drug resistance and limit potential blood-brain barrier penetration.


Subject(s)
Antineoplastic Agents , Gastrointestinal Stromal Tumors , Humans , Receptor, Platelet-Derived Growth Factor alpha/genetics , Receptor, Platelet-Derived Growth Factor alpha/metabolism , Gastrointestinal Stromal Tumors/drug therapy , Gastrointestinal Stromal Tumors/genetics , Gastrointestinal Stromal Tumors/pathology , Pyrazoles/therapeutic use , Pyrroles/pharmacology , Pyrroles/therapeutic use , Mutation , Proto-Oncogene Proteins c-kit/genetics , Antineoplastic Agents/pharmacology
3.
Resuscitation ; 195: 110087, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38097108

ABSTRACT

Standardized reporting of data is crucial for out-of-hospital cardiac arrest (OHCA) research. While the implementation of first responder systems dispatching volunteers to OHCA is encouraged, there is currently no uniform reporting standard for describing these systems. A steering committee conducted a literature search to identify experts in smartphone alerting systems. These international experts were invited to a conference held in Hinterzarten, Germany, attended by 40 researchers from 13 countries. Prior to the conference, participants submitted proposals for parameters to be included in the reporting standard. The conference comprised five workshops covering different aspects of smartphone alerting systems. Proposed parameters were discussed and clarified, and consensus was achieved using the Nominal Group Technique. Participants voted in a modified Delphi approach on whether to include each category as a core or supplementary element of the reporting standard. Results were presented, and a writing group developed definitions for all categories and items, which were sent to participants for revision and final voting using the LimeSurvey web-based software. The resulting reporting standard consists of 68 core items and 21 supplementary items grouped into five topics (first responder system, first responder network, technology/algorithm/strategies, reporting data, and automated external defibrillators (AED)). This proposed reporting standard, generated by an expert opinion group, fills a gap in describing first responder systems. Its adoption in future research will facilitate comparison of systems and research outcomes, enhancing the transfer of scientific findings to clinical practice.


Subject(s)
Cardiopulmonary Resuscitation , Emergency Responders , Out-of-Hospital Cardiac Arrest , Humans , Smartphone , Cardiopulmonary Resuscitation/methods , Defibrillators , Out-of-Hospital Cardiac Arrest/therapy
4.
J Hosp Infect ; 106(4): 820-827, 2020 12.
Article in English | MEDLINE | ID: mdl-32916210

ABSTRACT

BACKGROUND: Hospital drains may be an important reservoir for carbapenemase-producing Enterobacterales (CPE). AIM: To determine prevalence of CPE in hospital drains exposed to inpatients with CPE, relatedness of drain and patient CPE, and risk factors for drain contamination. METHODS: Sink and shower drains in patient rooms and communal shower rooms exposed to 310 inpatients with CPE colonization/infection were cultured at 10 hospitals. Using short- and long-read whole-genome sequencing, inpatient and corresponding drain CPE were compared. Risk factors for drain contamination were assessed using multi-level modelling. FINDINGS: Of 1209 exposed patient room and communal shower room drains, 53 (4%) yielded 62 CPE isolates in seven (70%) hospitals. Of 49 CPE isolates in patient room drains, four (8%) were linked to prior room occupants. Linked drain/room occupant pairs included Citrobacter freundii ST18 isolates separated by eight single nucleotide variants (SNVs), related blaKPC-containing IncN3-type plasmids (different species), related blaKPC-3-containing IncN-type plasmids (different species), and related blaOXA-48-containing IncL/M-type plasmids (different species). In one hospital, drain isolates from eight rooms on two units were Enterobacter hormaechei separated by 0-6 SNVs. Shower drains were more likely to be CPE-contaminated than hand hygiene (odds ratio: 3.45; 95% confidence interval: 1.66-7.16) or patient-use (13.0; 4.29-39.1) sink drains. Hand hygiene sink drains were more likely to be CPE-contaminated than patient-use sink drains (3.75; 1.17-12.0). CONCLUSION: Drain contamination was uncommon but widely dispersed. Drain CPE unrelated to patient exposure suggests contamination by undetected colonized patients or retrograde (drain-to-drain) contamination. Drain types had different contamination risks.
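The drain-type comparisons above are expressed as odds ratios with 95% confidence intervals estimated by multi-level modelling. Purely as orientation (not the study's own adjusted model), the unadjusted odds ratio for a 2x2 table of contaminated versus non-contaminated drains of two types, with its Wald-type confidence interval, is:

    \mathrm{OR} = \frac{a/b}{c/d} = \frac{ad}{bc}, \qquad
    95\%\ \mathrm{CI} = \exp\!\left( \ln\mathrm{OR} \pm 1.96\sqrt{\tfrac{1}{a}+\tfrac{1}{b}+\tfrac{1}{c}+\tfrac{1}{d}} \right)

where a and b are the contaminated and uncontaminated counts for one drain type and c and d those for the comparator; the multi-level model additionally accounts for clustering of drains within rooms and hospitals.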


Subject(s)
Enterobacter/isolation & purification , Equipment Contamination , Hospitals , Patients' Rooms , Water Supply , Bacterial Proteins , Drug Resistance, Bacterial , Enterobacteriaceae Infections/prevention & control , Humans , Ontario , beta-Lactamases
5.
J Hosp Infect ; 104(4): 513-521, 2020 Apr.
Article in English | MEDLINE | ID: mdl-31954763

ABSTRACT

BACKGROUND: Viral respiratory illnesses are common causes of outbreaks and can be fatal to some patients. AIM: To investigate the association between laboratory-confirmed viral respiratory infections and potential sources of exposure during the previous 7 days. METHODS: In this nested case-control analysis, healthcare personnel from nine Canadian hospitals who developed acute respiratory illnesses during the winters of 2010/11-2013/14 submitted swabs that were tested for viral pathogens. Associated illness diaries and the weekly diaries of non-ill participants provided information on contact with people displaying symptoms of acute respiratory illness in the previous week. Conditional logistic regression was used to assess the association between exposures and illness, with cases matched by study week and site to controls with no respiratory symptoms. FINDINGS: There were 814 laboratory-confirmed viral respiratory illnesses. The adjusted odds ratio (aOR) of a viral illness was higher for healthcare personnel reporting exposure to ill household members [7.0, 95% confidence interval (CI) 5.4-9.1], co-workers (3.4, 95% CI 2.4-4.7) or other social contacts (5.1, 95% CI 3.6-7.1). Exposure to patients with respiratory illness was not associated with infection (aOR 0.9, 95% CI 0.7-1.2); however, healthcare personnel with direct patient contact did have higher odds (aOR 1.3, 95% CI 1.1-1.6). The aORs for exposure and for direct patient contact were similar for illnesses caused by influenza. CONCLUSION: Community and co-worker contacts are important sources of viral respiratory illness in healthcare personnel, whereas exposure to patients with recognized respiratory infections is not. The comparatively low risk associated with direct patient contact may reflect transmission from asymptomatic patients or unrecognized infections.
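As an illustrative sketch only (the data file, column names and covariate set are hypothetical, and the published analysis may have been specified differently), a matched case-control analysis of this kind can be fitted as a conditional logistic regression with the matched set as the grouping variable:

    import numpy as np
    import pandas as pd
    from statsmodels.discrete.conditional_models import ConditionalLogit

    # Hypothetical long-format data: one row per participant-week.
    # 'case'    : 1 = laboratory-confirmed respiratory illness, 0 = matched control
    # 'stratum' : matched set identifier (study week x site)
    # exposure columns: 0/1 indicators for contacts in the previous 7 days
    df = pd.read_csv("hcw_exposures.csv")  # hypothetical file

    exposures = ["ill_household", "ill_coworker", "ill_social",
                 "ill_patient", "direct_patient_contact"]

    model = ConditionalLogit(df["case"], df[exposures], groups=df["stratum"])
    res = model.fit()

    # Exponentiated coefficients give matched odds ratios with 95% CIs
    print(pd.concat([np.exp(res.params).rename("OR"),
                     np.exp(res.conf_int())], axis=1))

Conditioning on the matched stratum removes the stratum-specific baseline risk, which is why controls matched on study week and site are needed in the first place.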


Subject(s)
Cross Infection/epidemiology , Cross Infection/virology , Respiratory Tract Infections/epidemiology , Respiratory Tract Infections/virology , Virus Diseases/epidemiology , Adult , Aged , Canada/epidemiology , Case-Control Studies , Female , Health Personnel , Hospitals , Humans , Influenza, Human/epidemiology , Male , Middle Aged , Risk Factors , Surveys and Questionnaires , Young Adult
7.
J Thromb Haemost ; 15(10): 2005-2016, 2017 10.
Article in English | MEDLINE | ID: mdl-28782177

ABSTRACT

ESSENTIALS: Membrane-binding GLA domains of coagulation factors are essential for proper clot formation. Factor X (FX) is specific to phosphatidylserine (PS) lipids through unknown atomic-level interactions. Molecular dynamics simulations were used to develop the first membrane-bound model of FX-GLA. PS binding modes of FX-GLA were described, and potential PS-specific binding sites identified. SUMMARY: BACKGROUND: Factor X (FX) binds to cell membranes in a highly phospholipid-dependent manner and, in complex with tissue factor and factor VIIa (FVIIa), initiates the clotting cascade. Experimental information concerning the membrane-bound structure of FX with atomic resolution has remained elusive because of the fluid nature of cellular membranes. FX is known to bind preferentially to phosphatidylserine (PS). OBJECTIVES: To develop the first atomic-level model of the FX-GLA domain bound to PS, and to identify PS-specific binding sites of the FX-GLA domain. METHODS: Molecular dynamics (MD) simulations were performed to develop an atomic-level model for the FX-GLA domain bound to PS bilayers. We utilized a membrane representation with enhanced lipid mobility, termed the highly mobile membrane mimetic (HMMM), permitting spontaneous membrane binding and insertion by FX-GLA in multiple 100-ns simulations. In 14 independent simulations, FX-GLA bound spontaneously to the membrane. The resulting membrane-bound models were converted from HMMM to conventional membrane and simulated for an additional 100 ns. RESULTS: The final membrane-bound FX-GLA model allowed for detailed characterization of the orientation, insertion depth and lipid interactions of the domain, providing insight into the molecular basis of its PS specificity. All binding simulations converged to the same configuration despite differing initial orientations. CONCLUSIONS: Analysis of interactions between residues in FX-GLA and lipid-charged groups allowed potential PS-specific binding sites to be identified. This new structural and dynamic information provides an additional step towards a full understanding of the role of atomic-level lipid-protein interactions in regulating the critical and complex clotting cascade.


Subject(s)
Cell Membrane/metabolism , Factor X/metabolism , Phosphatidylserines/metabolism , 1-Carboxyglutamic Acid/metabolism , Animals , Binding Sites , Cattle , Factor X/chemistry , Kinetics , Molecular Docking Simulation , Phosphatidylserines/chemistry , Protein Binding , Protein Interaction Domains and Motifs , Structure-Activity Relationship
8.
J Dairy Sci ; 100(3): 1987-2006, 2017 Mar.
Article in English | MEDLINE | ID: mdl-28109604

ABSTRACT

Over the last decades, a dramatic decrease in reproductive performance has been observed in Holstein cattle and fertility problems have become the most common reason for a cow to leave the herd. The premature removal of animals with high breeding values results in both economic and breeding losses. For efficient future Holstein breeding, the identification of loci associated with low fertility is of major interest and thus constitutes the aim of this study. To reach this aim, a genome-wide combined linkage disequilibrium and linkage analysis (cLDLA) was conducted using data on the following 10 calving and fertility traits in the form of estimated breeding values: days from first service to conception of heifers and cows, nonreturn rate on d 56 of heifers and cows, days from calving to first insemination, days open, paternal and maternal calving ease, paternal and maternal stillbirth. The animal data set contained 2,527 daughter-proven Holstein bulls from Germany that were genotyped with Illumina's BovineSNP50 BeadChip (Illumina Inc., San Diego, CA). For the cLDLA, 41,635 sliding windows of 40 adjacent single nucleotide polymorphisms (SNP) were used. At each window midpoint, a variance component analysis was executed using ASReml. The underlying mixed linear model included random quantitative trait locus (QTL) and polygenic effects. We identified 50 genome-wide significant QTL. The most significant peak was detected for direct calving ease at 59,179,424 bp on chromosome 18 (BTA18). Next, a mixed-linear model association (MLMA) analysis was conducted. A comparison of the cLDLA and MLMA results with special regard to BTA18 showed that the genome-wide most significant SNP from the MLMA was associated with the same trait and located on the same chromosome at 57,589,121 bp (i.e., about 1.5 Mb apart from the cLDLA peak). The results of 5 different cLDLA and 2 MLMA models, which included the fixed effects of either SNP or haplotypes, suggested that the cLDLA method outperformed the MLMA in accuracy and precision. The haplotype-based cLDLA method allowed for a more precise mapping and the definition of ancestral and derived QTL alleles, both of which are essential for the detection of underlying quantitative trait nucleotides.
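The variance-component analysis at each window midpoint rests on a mixed linear model with random QTL and polygenic effects; a generic form of that model, written out only to make the structure explicit (the exact ASReml parameterisation may differ), is:

    \mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}_{1}\mathbf{q} + \mathbf{Z}_{2}\mathbf{u} + \mathbf{e},
    \qquad
    \mathbf{q} \sim N(\mathbf{0},\, \mathbf{G}_{Q}\sigma_{q}^{2}), \quad
    \mathbf{u} \sim N(\mathbf{0},\, \mathbf{A}\sigma_{u}^{2}), \quad
    \mathbf{e} \sim N(\mathbf{0},\, \mathbf{I}\sigma_{e}^{2})

where y contains the bulls' estimated breeding values, beta the fixed effects, q the random QTL effect at the window midpoint with a haplotype-based (IBD) covariance matrix G_Q, u the polygenic effect with the additive relationship matrix A, and e the residual; comparing this model with the corresponding no-QTL model at each position provides the test statistic typically used in cLDLA.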


Subject(s)
Chromosome Mapping , Chromosomes, Mammalian , Animals , Breeding , Cattle , Female , Fertility/genetics , Linkage Disequilibrium , Male , Polymorphism, Single Nucleotide , Quantitative Trait Loci
9.
J Evol Biol ; 30(1): 112-127, 2017 Jan.
Article in English | MEDLINE | ID: mdl-27747987

ABSTRACT

A long-standing debate concerns whether nectar sugar composition evolves as an adaptation to pollinator dietary requirements or whether it is 'phylogenetically constrained'. Here, we use a modelling approach to evaluate the hypothesis that nectar sucrose proportion (NSP) is an adaptation to pollinators. We analyse ~ 2100 species of asterids, spanning several plant families and pollinator groups (PGs), and show that the hypothesis of adaptation cannot be rejected: NSP evolves towards two optimal values, high NSP for specialist-pollinated and low NSP for generalist-pollinated plants. However, the inferred adaptive process is weak, suggesting that adaptation to PG only provides a partial explanation for how nectar evolves. Additional factors are therefore needed to fully explain nectar evolution, and we suggest that future studies might incorporate floral shape and size and the abiotic environment into the analytical framework. Further, we show that NSP and PG evolution are correlated - in a manner dictated by pollinator behaviour. This contrasts with the view that a plant necessarily has to adapt its nectar composition to ensure pollination but rather suggests that pollinators adapt their foraging behaviour or dietary requirements to the nectar sugar composition presented by the plants. Finally, we document unexpectedly sucrose-poor nectar in some specialized nectarivorous bird-pollinated plants from the Old World, which might represent an overlooked form of pollinator deception. Thus, our broad study provides several new insights into how nectar evolves and we conclude by discussing why maintaining the conceptual dichotomy between adaptation and constraint might be unhelpful for advancing this field.
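Trait evolution towards group-specific optima, as described for NSP above, is conventionally modelled as an Ornstein-Uhlenbeck process; the abstract does not name the precise model fitted, so the following is only the generic form of such a model:

    dX_{t} = \alpha\,\bigl(\theta_{\mathrm{PG}} - X_{t}\bigr)\,dt + \sigma\,dW_{t}

where X_t is NSP, theta_PG the optimum associated with the pollinator group (high for specialist-pollinated, low for generalist-pollinated plants), alpha the strength of attraction towards the optimum (a small alpha corresponds to the weak adaptive process inferred here), and sigma dW_t the stochastic, drift-like component.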


Subject(s)
Flowers , Plant Nectar/chemistry , Sucrose/analysis , Adaptation, Physiological , Animals , Birds , Pollination
10.
Methods Enzymol ; 578: 373-428, 2016.
Article in English | MEDLINE | ID: mdl-27497175

ABSTRACT

Membrane transporters mediate one of the most fundamental processes in biology. They are the main gatekeepers controlling active traffic of materials in a highly selective and regulated manner between different cellular compartments demarcated by biological membranes. At the heart of the mechanism of membrane transporters lie protein conformational changes of diverse forms and magnitudes, which closely mediate critical aspects of the transport process, most importantly the coordinated motions of remotely located gating elements and their tight coupling to chemical processes such as binding, unbinding and translocation of transported substrate and cotransported ions, ATP binding and hydrolysis, and other molecular events fueling uphill transport of the cargo. An increasing number of functional studies have established the active participation of lipids and other components of biological membranes in the function of transporters and other membrane proteins, often acting as major signaling and regulating elements. Understanding the mechanistic details of these molecular processes requires methods that offer high spatial and temporal resolutions. Computational modeling and simulation technologies empowered by advanced sampling and free energy calculations have reached a sufficiently mature state to become an indispensable component of mechanistic studies of membrane transporters in their natural environment of the membrane. In this article, we provide an overview of a number of major computational protocols and techniques commonly used in membrane transporter modeling and simulation studies. The article also includes practical hints on effective use of these methods, critical perspectives on their strengths and weak points, and examples of their successful applications to membrane transporters, selected from the research performed in our own laboratory.
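Many of the free-energy and enhanced-sampling techniques surveyed in this kind of overview ultimately estimate a potential of mean force along a chosen collective variable; one standard relation, given here only as background rather than as a protocol prescribed by the article, is:

    W(\xi) = -k_{B}T \,\ln \rho(\xi) + C

where rho(xi) is the equilibrium probability density of the collective variable xi (for example, a substrate translocation coordinate), k_B T the thermal energy, and C an arbitrary additive constant; biased-sampling methods differ mainly in how they reconstruct rho(xi) efficiently.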


Subject(s)
ATP Binding Cassette Transporter, Subfamily B, Member 1/chemistry , Cell Membrane/chemistry , Lipid Bilayers/chemistry , Membrane Transport Proteins/chemistry , Molecular Dynamics Simulation , Binding Sites , Biological Transport , Escherichia coli/chemistry , Escherichia coli/metabolism , Humans , Molecular Docking Simulation , Protein Binding , Protein Conformation , Static Electricity , Substrate Specificity , Thermodynamics
11.
Am J Infect Control ; 44(11): 1346-1349, 2016 11 01.
Article in English | MEDLINE | ID: mdl-27158086

ABSTRACT

BACKGROUND: Both hospital admissions and patient isolation increase during influenza season. Influenza testing methodologies that reduce turnaround time (TAT) could reduce time in isolation. METHODS: We assessed the impact of a new influenza test on TAT and isolation days. TAT and daily mean isolation days were compared at a single hospital over 2 influenza seasons. An automated real-time reverse-transcription polymerase chain reaction assay (rRT-PCR) with random access replaced a conventional rRT-PCR assay for the second influenza season. Automation and random access allowed continuous testing, rather than once daily testing 3-5 d/wk. RESULTS: Confirmed influenza cases (57 vs 68) and total patient days (66,308 vs. 66,366) were similar for the 2012-2013 and 2013-2014 influenza seasons. TAT fell from 35 to 3.6 hours. Daily mean isolation days (32.9 vs 27.7, P < .01) fell, as did days in contact precautions (25.0 vs 19.8, P < .01) and droplet precautions (6.0 vs 3.5, P < .01). Although daily mean droplet precaution days for confirmed influenza rose slightly (0.86 vs 1.1, P = .16), droplet precaution days for suspected influenza fell 85% (2.7 vs 0.41, P < .001). CONCLUSIONS: Influenza testing technology that reduced TAT from days to hours resulted in a 42% reduction in droplet precaution days and reduced overall isolation days during influenza season.
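The 42% figure quoted in the conclusion follows directly from the daily mean droplet precaution days reported above:

    \frac{6.0 - 3.5}{6.0} \approx 0.42 \qquad \text{(42\% reduction in droplet precaution days)}

The corresponding reduction in overall daily mean isolation days is (32.9 - 27.7)/32.9, or roughly 16%.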


Subject(s)
Diagnostic Tests, Routine/methods , Influenza, Human/diagnosis , Molecular Diagnostic Techniques/methods , Patient Isolation , Real-Time Polymerase Chain Reaction/methods , Humans
12.
J Hosp Infect ; 92(1): 7-13, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26601608

ABSTRACT

Contamination of the healthcare environment with pathogenic organisms contributes to the burden of healthcare-associated infection (HCAI). Antimicrobial surfaces are designed to reduce microbial contamination of healthcare surfaces. To determine whether antimicrobial surfaces prevent HCAI, transmission of antibiotic-resistant organisms (AROs), or microbial contamination, we conducted a systematic review of the use of antimicrobial surfaces in patient rooms. Outcomes included HCAI, ARO transmission, and quantitative microbial contamination. Relevant databases were searched. Abstract review, full-text review, and data abstraction were performed in duplicate. Risk of bias was assessed using the Cochrane Effective Practice and Organisation of Care (EPOC) Group risk of bias assessment tool, and the strength of evidence was determined using Grading of Recommendations Assessment, Development and Evaluation (GRADE). Eleven studies assessed the effect of copper (N = 7), silver (N = 1), metal-alloy (N = 1), or organosilane-treated surfaces (N = 2) on microbial contamination. Copper surfaces demonstrated a median (range) reduction in microbial contamination of <1 log10 (<1 to 2 log10). Two studies addressed HCAI/ARO incidence. A randomized controlled trial (RCT) of copper surfaces in an ICU demonstrated a 58% reduction in HCAI (P = 0.013) and a 64% reduction in ARO transmission (P = 0.063) but was considered low-quality evidence due to improper randomization and incomplete blinding. An uncontrolled before-after study evaluating copper-impregnated textiles in a long-term care ward demonstrated a 24% reduction in HCAI but was considered very-low-quality evidence based on the study design. Copper surfaces used in clinical settings result in modest reductions in microbial contamination. One study of copper surfaces and one of copper textiles demonstrated a reduction in HCAI, but both were at high risk of bias.
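For orientation, the log10 reductions quoted for copper surfaces translate into percentage reductions in recoverable counts as:

    \text{log reduction} = \log_{10}\!\frac{N_{\text{before}}}{N_{\text{after}}},
    \qquad 1\ \log_{10} \Rightarrow 90\%, \quad 2\ \log_{10} \Rightarrow 99\%, \quad
    <1\ \log_{10} \Rightarrow <90\%\ \text{reduction}

So the median effect reported corresponds to less than a 90% drop in surface bioburden.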


Subject(s)
Cross Infection/prevention & control , Disease Transmission, Infectious/prevention & control , Disinfectants/administration & dosage , Disinfection/methods , Environmental Microbiology , Surface Properties , Humans
13.
J Hosp Infect ; 92(2): 161-6, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26679727

ABSTRACT

BACKGROUND: Audit and feedback programmes (AFPs) using fluorescent marking lead to improvements in room cleaning but have not been linked to reduced Clostridium difficile infection (CDI) incidence. AIM: To evaluate the impact of an AFP on hospital-acquired CDI incidence. METHODS: In 2012, a hospital-wide AFP was implemented. Fluorescent marking of high-touch surfaces was used to assess discharge cleaning thoroughness. Weekly audit results were presented to cleaning staff. Interrupted time-series analysis was used to test for changes in the trend and level of hospital-acquired CDI incidence between the pre-intervention (January 2008 to December 2011) and post-intervention (April 2012 to June 2015) periods. FINDINGS: In all, 1002 audits were performed and room cleaning thoroughness improved from 49% to 90%. Hospital-acquired CDI incidence fell from 54 to 42 cases per 100,000 patient-days following the intervention whereas non-hospital-acquired CDI incidence rose from 43 to 52 cases per 100,000 patient-days, although both exhibited a downward trend post intervention. Time-series analysis showed that hospital-acquired CDI incidence was declining at a rate of 0.59 cases per 100,000 patient-days per quarter before the intervention. Following programme implementation, the rate of decline accelerated by an additional 1.35 cases per 100,000 patient-days per quarter (P < 0.05). Hand hygiene compliance increased minimally post intervention. CONCLUSION: Implementation of an AFP using fluorescent marking resulted in improved thoroughness of room cleaning and appeared to result in an enhanced downward trend in CDI incidence, although part of this decline could be due to changes in local CDI epidemiology or improved hand hygiene.
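Interrupted time-series analyses of this kind are usually specified as a segmented regression; the abstract does not state the exact model, but a common formulation consistent with the trend-change estimates reported is:

    Y_{t} = \beta_{0} + \beta_{1}t + \beta_{2}X_{t} + \beta_{3}(t - t_{0})X_{t} + \varepsilon_{t}

where Y_t is the quarterly hospital-acquired CDI incidence, t the quarter, X_t an indicator equal to 1 after the AFP was implemented at time t_0, beta_1 the pre-intervention trend (a decline of 0.59 cases per 100,000 patient-days per quarter here), beta_3 the additional post-intervention change in trend (a further 1.35), and beta_2 any immediate level change.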


Subject(s)
Clostridioides difficile/isolation & purification , Clostridium Infections/epidemiology , Cross Infection/epidemiology , Diarrhea/epidemiology , Housekeeping, Hospital/methods , Infection Control/methods , Clostridium Infections/microbiology , Clostridium Infections/prevention & control , Cross Infection/microbiology , Cross Infection/prevention & control , Diarrhea/microbiology , Diarrhea/prevention & control , Feedback , Health Services Research , Humans , Incidence , Staining and Labeling/methods
14.
Anaesthesist ; 64(4): 261-70, 2015 Apr.
Article in German | MEDLINE | ID: mdl-25893579

ABSTRACT

BACKGROUND: Approximately 18 million patients are treated in German hospitals annually. On the basis of internationally published data, the number of in-hospital cardiac arrests can be estimated at 54,000 per year. Structured treatment of in-hospital resuscitation according to the current scientific evidence is essential. AIM: In-hospital resuscitation has some special characteristics compared with resuscitation by emergency medical services, which are highlighted in this article. MATERIAL AND METHODS: This article is based on the international guidelines for cardiopulmonary resuscitation (CPR), first published in 1992 by the European Resuscitation Council (ERC) and the American Heart Association (AHA), as well as their amendments (current version 2010). Some current studies that could not be taken into consideration for the 2010 guidelines are also presented. RESULTS: High-quality chest compressions with as few interruptions as possible are of utmost importance. Patients with shockable rhythms should be defibrillated within 2 min of collapse. There is no evidence that equipping hospitals with automated external defibrillators improves survival after in-hospital cardiac arrest. Endotracheal intubation represents the gold standard of airway management during CPR. Experienced anesthesiologists are usually involved in in-hospital resuscitation; however, the use of supraglottic airway devices may help to minimize interruptions in chest compressions, especially before the medical emergency team arrives at the scene. Feedback devices may improve the quality of manual chest compressions; however, most devices overestimate compression depth when the patient is resuscitated in bed. There is no evidence that mechanical chest compression devices improve outcome after cardiac arrest. Mild therapeutic hypothermia is still recommended for neuroprotection after successful in-hospital resuscitation. CONCLUSION: The prevention of cardiac arrest is of special importance. Uniform, low-threshold criteria for alerting the medical emergency team have to be defined so that critically ill patients can be identified and treated in time, before cardiac arrest occurs.


Subject(s)
Cardiopulmonary Resuscitation/standards , Heart Arrest/therapy , Airway Management , Electric Countershock , Guidelines as Topic , Heart Arrest/epidemiology , Heart Arrest/mortality , Hospital Mortality , Humans , Treatment Outcome
15.
Anaesthesist ; 64(3): 190-6, 2015 Mar.
Article in English | MEDLINE | ID: mdl-25757552

ABSTRACT

BACKGROUND: Hypotensive states that require rapid stabilisation of blood pressure can occur during anaesthesia. In 1963, the 20:1 mixture of cafedrine/theodrenaline (Akrinor) was introduced in Germany for use in anaesthesia and emergency medicine as first-line management of hypotensive states. Although it has been on the market for many years, few pharmacodynamic data are available on this combination, which acts as a net beta-mimetic agent. AIM: This study examined the drug combination in real-life clinical practice, recording the time to a 10% increase in mean arterial blood pressure (MAP) as well as heart rate. Furthermore, potential factors that influence the drug's effectiveness under anaesthesia were assessed. METHODS: Data were collected within a standardised anaesthesia protocol. A total of 353 consecutive patients (female/male = 149/204) who received cafedrine/theodrenaline after a drop in MAP ≥ 5% were included in the study. The time to a 10% increase in MAP, the dosage of cafedrine/theodrenaline, volume loading, blood pressure and heart rate were monitored over time. RESULTS: Patients were a mean (standard deviation) of 64.4 ± 15.1 years old with a baseline MAP of 82 ± 14 mmHg, which dropped to a mean of 63 ± 10 mmHg during anaesthesia, without gender differences. Cafedrine/theodrenaline (1.27 ± 1.0 mg/kg; 64 ± 50 µg/kg) significantly increased MAP (p < 0.001) by 11 ± 16 mmHg within 5 min, reaching peak values within 17.4 ± 9.0 min. Heart rate was not affected in a clinically significant manner. Cafedrine/theodrenaline induced a 10% MAP increase after 7.2 ± 4.6 min in women and after 8.6 ± 6.3 min in men (p = 0.018). Independent of gender, the dose of cafedrine/theodrenaline required to achieve the observed MAP increase of 14 ± 16 mmHg at 15 min was significantly higher in patients with heart failure [1.78 ± 1.67 mg/kg (cafedrine)/89.0 ± 83.5 µg/kg (theodrenaline)] than in healthy patients [1.16 ± 0.77 mg/kg (cafedrine)/58.0 ± 38.5 µg/kg (theodrenaline)] (p = 0.005). Concomitant medication with beta-blocking agents significantly prolonged the time to a 10% MAP increase [9.0 ± 7.0 vs. 7.3 ± 4.3 min (p = 0.008)]. CONCLUSION: Cafedrine/theodrenaline quickly restores MAP during anaesthesia. Female gender is associated with higher effectiveness, while heart failure and beta-blocker administration reduce the antihypotensive effect. Prospective studies in defined patient populations are warranted to further characterise the effect of cafedrine/theodrenaline.


Subject(s)
Anesthesia/methods , Cardiovascular Agents/therapeutic use , Hypotension/prevention & control , Intraoperative Care/methods , Theophylline/analogs & derivatives , Adrenergic beta-Antagonists/adverse effects , Adult , Aged , Blood Pressure/drug effects , Drug Combinations , Drug Interactions , Female , Heart Failure/complications , Heart Rate/drug effects , Humans , Kaplan-Meier Estimate , Male , Middle Aged , Retrospective Studies , Sex Characteristics , Theophylline/therapeutic use
16.
J Hosp Infect ; 89(1): 51-60, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25480021

ABSTRACT

Electronic and video monitoring systems (EMS/VMS) may improve hand hygiene by providing feedback, real-time reminders or via the Hawthorne effect. The aim of this systematic review was to assess the efficacy of EMS/VMS in improving hand hygiene or reducing the incidence of healthcare-associated infection (HCAI). Experimental and quasi-experimental studies were included if they measured any hand hygiene outcome and/or HCAI incidence. Of the studies included, seven used system-defined compliance (SDC) (N = 6) or hand hygiene event rate (N = 1) as their outcome. SDC differed for all systems. Most (N = 6) were single ward studies. Two uncontrolled pretest‒post-test studies evaluating EMS that provided voice prompts showed increases in SDC, but risk of bias was high. Two uncontrolled time-series analyses of VMS that provided aggregate feedback demonstrated large, sustained improvement in SDC and were at moderate risk of bias. One non-randomized controlled trial of EMS with aggregate feedback found no difference in hand hygiene frequency but was at high risk of bias. Two studies evaluated EMS providing individual feedback and real-time reminders. A pretest‒post-test study at high risk of bias showed an increase in SDC. An RCT at low risk of bias showed 6.8% higher SDC in the intervention arm partially due to a fall in SDC in the control arm. In conclusion, the overall study quality was poor. The study at lowest risk of bias showed only a small increase in SDC. VMS studies at moderate risk of bias showed rapid and sustained increases in SDC. Data were insufficient to recommend EMS/VMS. Future studies should prioritize testing of VMS using stronger study designs including control arms and validated, system-independent measures of hand hygiene.


Subject(s)
Guideline Adherence , Hand Disinfection/methods , Hand Hygiene , Infection Control/methods , Infectious Disease Transmission, Professional-to-Patient/prevention & control , Health Personnel , Humans , Technology
17.
J Hosp Infect ; 83(4): 276-83, 2013 Apr.
Article in English | MEDLINE | ID: mdl-23415717

ABSTRACT

BACKGROUND: Predictors of hand-hygiene compliance have not been re-evaluated in the alcohol-based hand rinse (ABHR) era. AIM: To re-evaluate predictors of hand-hygiene compliance in the era of ABHR. METHODS: Hand-hygiene compliance was monitored at a Canadian teaching hospital for a period of two years using direct observation. Standardized definitions of compliance were used and potential predictors of compliance were recorded. A generalized linear mixed model was developed to evaluate the impact of predictors of hand-hygiene compliance while correcting for clustering. FINDINGS: We observed 7364 opportunities for hand hygiene among 3487 healthcare workers. Hand-hygiene compliance was 45% and did not vary over time. Predictors of improved compliance on multivariate analysis included the indication for hand hygiene with higher compliance seen after body fluid exposure (odds ratio: 4.7; 95% confidence interval: 3.7-6.1) and after patient contact (3.9; 3.5-4.4) compared with hand hygiene prior to patient contact. Glove use was associated with higher compliance (1.3; 1.1-1.4). A professional designation other than nurse or physician was associated with lower compliance (0.72; 0.61-0.86). The number of hand hygiene opportunities per hour was not associated with lower compliance. Higher ward level use of ABHR (vs use of soap/water) was associated with better compliance (P = 0.035). CONCLUSIONS: In the ABHR era a higher frequency of hand-hygiene opportunities is no longer the primary barrier to achieving optimal hand-hygiene compliance. However, heterogeneous use of ABHR by ward may still provide a target for improvement.
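The multivariate analysis described is a generalized linear mixed model for a binary compliance outcome; a generic logistic form with a random intercept for the clustering unit (the abstract does not specify the exact random-effects structure) is:

    \operatorname{logit} \Pr(y_{ij} = 1 \mid u_{j}) = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_{j},
    \qquad u_{j} \sim N(0, \sigma_{u}^{2})

where y_ij indicates compliance at the i-th observed opportunity within cluster j (for example, a healthcare worker or ward), x_ij holds the predictors (indication, glove use, professional designation, opportunity rate, ward-level ABHR use), and the random intercept u_j absorbs within-cluster correlation; exponentiated elements of beta are the reported odds ratios.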


Subject(s)
Alcohols/administration & dosage , Disinfectants/administration & dosage , Guideline Adherence/statistics & numerical data , Hand Disinfection/methods , Attitude of Health Personnel , Canada , Hospitals, Teaching , Humans , Longitudinal Studies
18.
Unfallchirurg ; 116(7): 602-9, 2013 Jul.
Article in German | MEDLINE | ID: mdl-22367522

ABSTRACT

BACKGROUND: The implementation of ATLS® in the daily routine of trauma management in the emergency department is a challenge. This goal cannot be reached by teaching ATLS® to only a few team members. To reinforce the implementation of ATLS® in a level I trauma centre, a generic in-house training with interprofessional integration of all specialists of the trauma team was introduced in 2009. MATERIALS AND METHODS: The TEAM® course (trauma evaluation and management concept of the American College of Surgeons) was the theoretical basis of the training. This educational programme was developed for medical students and multidisciplinary team members. Prior to the training, a self-assessment questionnaire was completed by n=84 team members to assess their knowledge of ATLS® principles. The hands-on training time was 90 min. Ten members of the trauma team worked through three scenarios of multiply injured patients, provided as near-reality manikin simulations by a specialist trainer. After the training, participants re-evaluated their knowledge and rated the improvement achieved by the training. The duration of trauma management and the number of missed injuries were analysed one year before and one year after the training and served as markers of the process and outcome quality of trauma care. RESULTS: Before the training, 57% of trainees indicated that their knowledge of ATLS® principles could be improved. Their expectations were generally met by the training. The mean time of trauma management in the ED was not reduced one year after the training (36±16 min) compared with one year before the training (39±18 min); however, the rate of missed injuries was significantly reduced after the training (5.6% vs. 3.2%, p<0.05). CONCLUSION: In addition to the education of ATLS® providers, the introduction of an interdisciplinary and interprofessional team training may enhance the implementation of ATLS® algorithms into daily routine.


Subject(s)
Education, Medical, Continuing/organization & administration , Leadership , Orthopedics/education , Orthopedics/organization & administration , Patient Care Team/organization & administration , Traumatology/education , Traumatology/organization & administration , Germany
19.
Phys Rev Lett ; 108(26): 260501, 2012 Jun 29.
Article in English | MEDLINE | ID: mdl-23004944

ABSTRACT

In this work, we show that very natural, apparently simple problems in quantum measurement theory can be undecidable even if their classical analogues are decidable. Undecidability hence appears as a genuine quantum property here. Formally, an undecidable problem is a decision problem for which one cannot construct a single algorithm that will always provide a correct answer in finite time. The problem we consider is to determine whether sequentially used identical Stern-Gerlach-type measurement devices, giving rise to a tree of possible outcomes, have outcomes that never occur. Finally, we point out implications for measurement-based quantum computing and studies of quantum many-body models and suggest that a plethora of problems may indeed be undecidable.
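The informal definition used in the abstract matches the standard formal one; phrased in terms of Turing machines (as background only, not as the paper's own formulation):

    L \subseteq \Sigma^{*} \ \text{is undecidable} \iff \text{there is no Turing machine } M
    \text{ that halts on every input } w \text{ with } M(w) = 1 \Leftrightarrow w \in L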

20.
Minerva Anestesiol ; 78(8): 901-9, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22504855

ABSTRACT

BACKGROUND: We developed a 1.5-day crew resource management (CRM) course on situation awareness (SA) to improve the participants' ability to recognise critical situations in crisis scenarios. The objective of the study was to evaluate the influence of the CRM course on SA and medical performance in crisis scenarios and to compare the results with the effects of a purely clinical simulator training. METHODS: Sixty-one final-year medical students, randomized into three groups, took part in a pre-intervention test scenario of septic shock in a patient simulator setting. Medical performance and SA were assessed using a checklist and the Situation Awareness Global Assessment Tool (SAGAT), respectively. All students received a lecture on the sepsis guidelines. The simulator (SIM) group took part in a 1.5-day simulator training on sepsis resuscitation. The CRM group took part in a course on situation awareness. The control group (CG) did not receive any training. All students then completed a post-intervention test scenario comparable to the pre-intervention scenario. RESULTS: The SAGAT score rose from 10.6±2.3 to 11.9±1.7 (pre-intervention vs. post-intervention test, P=0.04) in the SIM group, whereas no significant changes were seen in the CRM group or the control group. The clinical performance scores in the post-intervention test did not differ from those in the pre-intervention test. CONCLUSION: Neither the 1.5-day simulator training nor the 1.5-day CRM course influenced clinical performance scores. SAGAT scores were higher after the simulator training, but not after the CRM training.


Subject(s)
Education, Medical/methods , Patient Care Team/organization & administration , Resuscitation/methods , Sepsis/therapy , Adult , Analysis of Variance , Clinical Competence , Computer Simulation , Critical Care , Curriculum , Female , Humans , Male , Manikins , Middle Aged , Patient Simulation , Prospective Studies , Students, Medical , Surveys and Questionnaires