Results 1 - 20 of 95
1.
BMC Health Serv Res ; 23(1): 1012, 2023 Sep 20.
Article in English | MEDLINE | ID: mdl-37726731

ABSTRACT

BACKGROUND: The critical role that middle managers play in enacting organisational culture change designed to address unprofessional co-worker behaviours has gone largely unexplored. We aimed to explore middle managers' perspectives on i) whether they speak up when they or their team members experience unprofessional behaviours (UBs); ii) how concerns are handled; iii) the outcomes; and iv) the role of a professional accountability culture change program (known as Ethos) in driving change. METHODS: Qualitative, constructivist approach set in five metropolitan hospitals in Australia that had implemented Ethos. Purposive sampling was used to invite middle-level managers from medicine, nursing, and non-clinical support services. Semi-structured interviews were conducted remotely. Inductive, reflexive thematic and descriptive thematic analyses were undertaken using NVivo. RESULTS: Thirty interviews (approximately 60 min each; August 2020 to May 2021) were conducted with Nursing (n = 12), Support Services (n = 10), and Medical (n = 8) staff, working in public (n = 18) and private (n = 12) hospitals. One-third (n = 10) had a formal role in Ethos. All middle managers (hearers) had experienced the raising of UBs by their team (speakers). Themes representing reasons for ongoing UBs were: staying silent but active; history and hierarchy; and double-edged swords. The Ethos program was valued as a confidential, informal, non-punitive system but required improvements in profile and effectiveness. Participants described four response stages: i) determining if reports were genuine; ii) taking action depending on the speaker's preference, behaviour factors (type, frequency, impact), and whether the person was known or unknown; iii) seeking additional information; and iv) addressing the behaviour either indirectly (e.g., changing rosters) or directly (e.g., becoming a speaker). CONCLUSIONS: Addressing UBs requires an organisational-level approach that goes beyond supporting staff to speak up, to include those hearing and addressing UBs.
We propose a new hearer's model that details middle managers' processes after a concern is raised, identifying where action can be taken to minimise avoidant behaviours and improve hospital culture, staff safety, and patient safety.


Subject(s)
Hospitals, Urban , Medicine , Humans , Australia , Social Responsibility , Professional Misconduct
2.
BMC Health Serv Res ; 20(1): 883, 2020 Sep 18.
Article in English | MEDLINE | ID: mdl-32948168

ABSTRACT

BACKGROUND: Internationally, point prevalence surveys are the main source of antibiotic use data in residential aged care (RAC). Our objective was to describe temporal trends in antibiotic use and antibiotics flagged for restricted use, resident characteristics associated with use, and variation in use by RAC home, using electronic health record data. METHODS: We conducted a retrospective cohort study of 9793 unique residents aged ≥65 years in 68 RAC homes between September 2014 and September 2017, using electronic health records. We modelled the primary outcome of days of antibiotic therapy/1000 resident days (DOT/1000 days), and secondary outcomes of number of courses/1000 days and the annual prevalence of antibiotic use. Antibiotic use was examined for all antibiotics and antibiotics on the World Health Organization's (WHO) Watch List (i.e. antibiotics flagged for restricted use). RESULTS: In 2017, there were 85 DOT/1000 days (99% CI: 79, 92), 8.0 courses/1000 days (99% CI: 7.6, 8.5), and 63.4% (99% CI: 61.9, 65.0) of residents received at least one course of antibiotics. There were 7.7 DOT/1000 days (99% CI: 6.69, 8.77) of antibiotics on the WHO Watch List administered in 2017. Antibiotic use increased annually by 4.09 DOT/1000 days (99% CI: 1.18, 6.99) before adjusting for resident factors, and 3.12 DOT/1000 days (99% CI: -0.05, 6.29) after adjustment. Annual prevalence of antibiotic use decreased from 68.4% (99% CI: 66.9, 69.9) in 2015 to 63.4% (99% CI: 61.9, 65.0) in 2017, suggesting fewer residents were on antibiotics, but those who were used them for longer. Resident factors associated with higher use were increasing age, chronic respiratory disease, and a history of urinary tract infections and skin and soft tissue infections; dementia was associated with lower use. RAC home-level antibiotic use ranged from 44.0 to 169.2 DOT/1000 days in 2016. Adjusting for resident factors marginally reduced this range (42.6 to 155.5 DOT/1000 days).
CONCLUSIONS: Antibiotic course length and RAC homes with high use should be a focus of antimicrobial stewardship interventions. Practices in RAC homes with low use could inform interventions and warrant further investigation. This study provides a model for using electronic health records as a data source for antibiotic use surveillance in RAC.
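The DOT/1000 resident-days metric reported above is straightforward to derive from administration records. A minimal sketch, using hypothetical field names and toy data rather than the study's actual schema:

```python
# Sketch of the "days of therapy per 1,000 resident days" (DOT/1000 days)
# surveillance metric. Records and field names here are illustrative only.

def dot_per_1000(courses, resident_days):
    """Total days of antibiotic therapy per 1,000 resident days."""
    total_dot = sum(c["days_of_therapy"] for c in courses)
    return 1000 * total_dot / resident_days

courses = [
    {"resident": "A", "antibiotic": "amoxicillin", "days_of_therapy": 7},
    {"resident": "B", "antibiotic": "cefalexin", "days_of_therapy": 5},
    {"resident": "A", "antibiotic": "doxycycline", "days_of_therapy": 10},
]
# e.g. 100 residents each observed for a full year
resident_days = 100 * 365

print(round(dot_per_1000(courses, resident_days), 2))  # → 0.6
```

Note that each day a resident receives two different antibiotics counts as two days of therapy, which is why DOT captures course length in a way that prevalence alone does not.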


Subject(s)
Anti-Bacterial Agents/therapeutic use , Drug Utilization/statistics & numerical data , Electronic Health Records , Homes for the Aged/statistics & numerical data , Nursing Homes/statistics & numerical data , Aged , Aged, 80 and over , Antimicrobial Stewardship/statistics & numerical data , Australia , Female , Humans , Male , Retrospective Studies , Urinary Tract Infections/drug therapy
3.
Eur J Clin Pharmacol ; 74(1): 15-27, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29058038

ABSTRACT

PURPOSE: Drug-drug interactions (DDIs) are often avoidable and, if undetected, can lead to patient harm. This review aimed to determine the prevalence of potential DDIs (pDDIs), clinically relevant DDIs (DDIs that could lead to measurable patient harm, taking into account the patient's individual clinical profile) and DDIs that resulted in actual patient harm during hospitalisation. METHOD: Four databases were scanned for English papers published from 2000 to 2016. Papers that reported prevalence of DDIs in the outpatient setting, at admission or discharge, involving only specific drugs, or in specific disease populations or age groups were excluded. RESULTS: Twenty-seven papers met the inclusion criteria and were graded for quality using the Critical Appraisal Skills Programme (CASP) cohort study checklist. Ten papers were rated as 'poor', 14 as 'fair' and only three papers as 'good'. Overall, the meta-analysis revealed that 33% of general patients and 67% of intensive care patients experienced a pDDI during their hospital stay. It was not possible to determine the prevalence of clinically relevant DDIs or DDIs that resulted in actual patient harm as data on these categories were limited. Of the very few studies that reported on harm, only a small proportion of DDIs were found to have resulted in actual patient harm. CONCLUSIONS: Standardisation of DDI definitions and research methods are required to allow meaningful prevalence rates to be obtained and compared. Studies that go further than measuring pDDIs are critically needed to determine the impact of DDIs on patient safety.


Subject(s)
Drug Interactions , Drug-Related Side Effects and Adverse Reactions , Inpatients , Medication Errors/statistics & numerical data , Drug-Related Side Effects and Adverse Reactions/epidemiology , Drug-Related Side Effects and Adverse Reactions/etiology , Hospitalization , Humans , Inpatients/statistics & numerical data , Medication Errors/prevention & control , Patient Safety , Prevalence
4.
Int J Med Inform ; 105: 22-30, 2017 Sep.
Article in English | MEDLINE | ID: mdl-28750908

ABSTRACT

OBJECTIVES: To assess the evidence of the effectiveness of different categories of interruptive medication prescribing alerts in changing prescriber behavior and/or improving patient outcomes in hospital computerized provider order entry (CPOE) systems. METHODS: PubMed, Embase, CINAHL and the Cochrane Library were searched for relevant articles published between January 2000 and February 2016. Studies were included if they compared the outcomes of automatic, interruptive medication prescribing alert/s to a control/comparison group to determine alert effectiveness. RESULTS: Twenty-three studies describing 32 alerts, classified into 11 alert categories, were identified. The most commonly studied alert categories were drug-condition interaction alerts (n=6), drug-drug interaction alerts (n=6) and corollary order alerts (n=6). All 23 papers investigated the effect of the intervention alert on at least one outcome measure of prescriber behavior. Just over half of the alerts (53%, n=17) showed a statistically significant beneficial effect; 34% (n=11) showed no statistically significant effect, and 6% (n=2) showed a significant detrimental effect. Two studies also evaluated the effect of alerts on patient outcome measures; neither found that patient outcomes significantly improved following alert implementation. The greatest volume of evidence relates to three alert categories: drug-condition, drug-drug and corollary order alerts. Of these, drug-condition alerts had the greatest number of studies reporting positive effects (five out of six studies). Only two of six studies of drug-drug interaction alerts, and one of six of corollary order alerts, reported positive benefits. DISCUSSION AND CONCLUSION: The current evidence base does not clearly indicate that particular categories of alerts are more effective than others. While most alert categories were shown to improve outcomes in some studies, there were also many cases where outcomes did not improve. This lack of evidence hinders decisions about the amount and type of decision support that should be integrated into CPOE systems to increase safety while reducing the risk of alert fatigue. Virtually no studies have investigated the overall impact on prescriber behavior and outcomes when alerts from multiple categories are incorporated within the same system.


Subject(s)
Clinical Alarms , Decision Support Systems, Clinical/standards , Medical Order Entry Systems/standards , Medication Errors/prevention & control , Patient Safety , Physicians/psychology , Drug Interactions , Humans , Reminder Systems
5.
Article in English | MEDLINE | ID: mdl-28352457

ABSTRACT

Many types of organisation are difficult to change, mainly due to structural, cultural and contextual barriers. Change in public hospitals is arguably even more problematic than in other types of hospitals, due to features such as structural dysfunctionalities and the bureaucracy that stems from being publicly run institutions. The main goal of this commentary is to bring into focus the "3 + 3 Decision Framework" proposed by Edwards and Saltman, which aims to help policymakers and managers implement productive change in public hospitals. However, while change from the top is popular, powerful front-line clinicians, especially doctors, can act to counterbalance top-down efforts. Front-line clinicians have cultural characteristics and power that allow them to influence or reject managerial decisions. Clinicians in various lower-level roles can also influence other clinicians to resist or ignore management requirements. The context is further complicated by multi-stakeholder agendas, differing goals, and accumulated inertia. The special status of clinicians, along with other system features of public hospitals, should be factored into efforts to realise major system improvements and progressive change.


Subject(s)
Hospitals, Public/standards , Organizational Innovation , Systems Analysis , Decision Support Techniques , Hospitals, Public/methods , Hospitals, Public/organization & administration , Humans
6.
BMJ Open ; 6(10): e011811, 2016 Oct 21.
Article in English | MEDLINE | ID: mdl-27797997

ABSTRACT

INTRODUCTION: Medication errors are the most frequent cause of preventable harm in hospitals. Medication management in paediatric patients is particularly complex, and consequently the potential for harm is greater than in adults. Electronic medication management (eMM) systems are heralded as a highly effective intervention to reduce adverse drug events (ADEs), yet internationally, evidence of their effectiveness in paediatric populations is limited. This study will assess the effectiveness of an eMM system to reduce medication errors, ADEs and length of stay (LOS). The study will also investigate system impact on clinical work processes. METHODS AND ANALYSIS: A stepped-wedge cluster randomised controlled trial (SWCRCT) will measure changes pre-eMM and post-eMM system implementation in prescribing and medication administration error (MAE) rates, potential and actual ADEs, and average LOS. In stage 1, 8 wards within the first paediatric hospital will be randomised to receive the eMM system 1 week apart. In stage 2, the second paediatric hospital will randomise implementation of a modified eMM and outcomes will be assessed. Prescribing errors will be identified through record reviews, and MAEs through direct observation of nurses and record reviews. Actual and potential severity will be assigned. Outcomes will be assessed at the patient level using mixed models, taking into account correlation of admissions within wards and multiple admissions for the same patient, with adjustment for potential confounders. Interviews and direct observation of clinicians will investigate the effects of the system on workflow. Data from site 1 will be used to develop improvements in the eMM, which will be implemented at site 2, where the SWCRCT design will be repeated (stage 2). ETHICS AND DISSEMINATION: The research has been approved by the Human Research Ethics Committee of the Sydney Children's Hospitals Network and Macquarie University.
Results will be reported through academic journals and seminar and conference presentations. TRIAL REGISTRATION NUMBER: Australian New Zealand Clinical Trials Registry (ANZCTR) 370325.
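In a stepped-wedge design like the one described, every cluster eventually receives the intervention; only the crossover time is randomised. A minimal sketch of allocating eight wards to weekly go-live steps (ward names and the seed are hypothetical, not the trial's actual allocation procedure):

```python
import random

# Hypothetical ward identifiers for a stepped-wedge rollout
wards = [f"ward_{i}" for i in range(1, 9)]

random.seed(42)  # fixed seed only to make this illustration reproducible
order = random.sample(wards, k=len(wards))

# Each ward crosses from control to intervention at its assigned week;
# by the final step all wards are using the eMM system.
schedule = {ward: week for week, ward in enumerate(order, start=1)}
for ward, week in sorted(schedule.items(), key=lambda kv: kv[1]):
    print(f"{ward}: eMM go-live in week {week}")
```

Because all clusters contribute both control and intervention periods, the design allows within-ward comparisons while staggering the implementation workload.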


Subject(s)
Drug Monitoring/methods , Drug-Related Side Effects and Adverse Reactions/prevention & control , Electronics, Medical , Hospitals, Pediatric , Length of Stay , Medication Errors/prevention & control , Medication Systems, Hospital , Child , Humans , Pediatrics , Pharmaceutical Preparations , Research Design
7.
Stud Health Technol Inform ; 227: 74-9, 2016.
Article in English | MEDLINE | ID: mdl-27440292

ABSTRACT

Registered nurses providing telenursing triage and advice services record information on the medication-related calls they handle. However, the quality and consistency of these data have rarely been examined. Our aim was to examine medication-related calls made to the healthdirect advice service in November 2014, to assess their basic characteristics and how the data entry format influenced the information collected and data consistency. Registered nurses selected the patient question type from a range of categories, and entered the medications involved in a free-text field. Medication names were manually extracted from the free-text fields. We also compared the selected patient question type with the free-text description of the call, in order to gauge data consistency. Results showed that nurses provided patients with advice on medication-related queries in a timely manner (median call duration: 9 minutes). From 1835 calls, we were able to identify and classify 2156 medications into 384 generic names. However, in 204 cases (11.2% of calls) no medication name was entered. A further 308 (15.0%) of the medication names entered were not identifiable. When we compared the selected patient question with the free-text description of calls, we found that these were consistent in 63.27% of cases. Telenursing triage and advice services provide a valuable resource to the public, offering quick and easily accessible advice. To support nurses in providing quality services and recording accurate information about the queries they handle, an appropriate data entry format and design would be beneficial.


Subject(s)
Data Accuracy , Pharmaceutical Preparations/classification , Telenursing/standards , Australia , Humans , Nurses , Triage/statistics & numerical data
8.
Intern Med J ; 46(7): 819-25, 2016 Jul.
Article in English | MEDLINE | ID: mdl-27094756

ABSTRACT

BACKGROUND: Patients admitted to hospital on weekends have a greater risk of mortality compared to patients admitted on weekdays. Junior medical officers (JMO) make up the majority of medical staff on weekends. No previous study has quantified JMO work patterns on weekends. AIM: To describe and quantify JMO work patterns on weekends and compare them with patterns previously observed during the week. METHODS: Observational time and motion study of JMO working weekends using the Work Observation Method by Activity Timing (WOMBAT; Australian Institute of Health Innovation, Macquarie University, Sydney, NSW, Australia) software. Descriptive statistics were used to determine the proportion of total observed time spent in tasks. RESULTS: Weekend JMO predominantly spent time in indirect care (32.0%), direct care (23.0%) and professional communication (22.1%). JMO spent 20.9% of time multitasking and were interrupted, on average, every 9 min. Weekend JMO spent significantly more time in direct care compared with weekdays (13.0%; P < 0.001) and nights (14.3%; P < 0.001). Weekend JMO spent significantly less time on breaks (8.5%), with less than 1 h in an 11-h shift, compared with JMO during weekdays (16.4%; P = 0.004) and nights (27.6%; P < 0.001). Weekend JMO were interrupted at a higher rate (6.6/h) than on weekdays (rate ratio (RR) 2.9, 95% confidence intervals (CI) 2.6, 3.3) or nights (RR 5.1, 95% CI 4.2, 6.1). Multitasking on weekends (20.9%) was comparable to weekdays (18.9%; P = 0.19) but significantly higher than nights (6.4%; P < 0.001). CONCLUSION: On weekends, JMO had few breaks, were interrupted frequently and engaged in high levels of multitasking. This pattern of JMO work could be a potential contributing factor to the weekend effect in terms of JMO abilities to respond safely and adequately to care demands.
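The interruption comparisons above are expressed as rate ratios with confidence intervals. A minimal sketch of the standard log-scale calculation; the event counts and observation times below are hypothetical, chosen only to produce rates of the same order as those reported:

```python
import math

def rate_ratio_ci(events1, time1, events2, time2, z=1.96):
    """Rate ratio (group 1 / group 2) with a Wald 95% CI on the log scale."""
    rr = (events1 / time1) / (events2 / time2)
    se = math.sqrt(1 / events1 + 1 / events2)  # SE of log(rate ratio)
    return rr, math.exp(math.log(rr) - z * se), math.exp(math.log(rr) + z * se)

# Hypothetical: 330 interruptions in 50 observed weekend hours (6.6/h)
# versus 114 interruptions in 50 observed weekday hours (2.28/h).
rr, lo, hi = rate_ratio_ci(330, 50, 114, 50)
print(f"RR {rr:.1f} (95% CI {lo:.1f}, {hi:.1f})")
```

The CI width depends on the event counts, not just the rates, which is why longer observation periods yield tighter intervals around the same rate ratio.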


Subject(s)
Delivery of Health Care/standards , Medical Staff, Hospital/statistics & numerical data , Practice Patterns, Physicians'/statistics & numerical data , Time and Motion Studies , Workload/statistics & numerical data , Adult , Australia , Communication , Female , Humans , Male , Regression Analysis , Young Adult
9.
Int J Nurs Stud ; 56: 9-16, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26775214

ABSTRACT

BACKGROUND: High profile safety failures have demonstrated that recognising early warning signs of clinical and physiological deterioration can prevent or reduce harm resulting from serious adverse events. Early warning scoring systems are now routinely used in many places to detect deteriorating patients and escalate their care. Timely and accurate vital signs monitoring is critical for ensuring patient safety, as it provides the data for early warning scoring systems, but little is known about current monitoring practices. OBJECTIVE: To establish a profile of nurses' vital signs monitoring practices, related dialogue, and adherence to health service protocol in New South Wales, Australia. METHODS: Direct observations of nurses' working practices were conducted in two wards. The observations focused on times of the day when vital signs were generally measured. Patient interactions were recorded if occurring any time during the observation periods. Participants (n=42) included nursing staff on one chronic disease medical and one acute surgical ward in a large urban teaching hospital in New South Wales. RESULTS: We observed 441 patient interactions. Measurement of vital signs occurred in 52% of interactions. The minimum five vital signs measures required by New South Wales Health policy were taken in only 6-21% of instances of vital signs monitoring. Vital signs were documented immediately on 93% of vitals-taking occasions and documented according to the policy in the patient's chart on 89% of these occasions. Nurse-patient interactions were initiated for the purpose of taking vital signs in 49% of interactions, with nurse-patient discourse observed during 88% of all interactions. Nurse-patient dialogue led to additional care being provided to patients in 12% of interactions. CONCLUSION: The selection of which vital signs to measure, and the responses to them, appears to rely on nurses' clinical judgement or time availability rather than on policy-mandated frequency.
The prevalence of incomplete sets of vital signs may limit identification of deteriorating patients. The findings from this study present an important baseline profile against which to evaluate the impact of introducing continuous monitoring approaches on current hospital practice.


Subject(s)
Monitoring, Physiologic , Nurse-Patient Relations , Nursing Staff, Hospital , Vital Signs , Humans , New South Wales , Qualitative Research
10.
J R Soc Interface ; 13(114): 20150930, 2016 Jan.
Article in English | MEDLINE | ID: mdl-26763327

ABSTRACT

Leptosporangiate ferns have evolved an ingenious cavitation catapult to disperse their spores. The mechanism relies almost entirely on the annulus, a row of 12-25 cells, which successively: (i) stores energy by evaporation of the cells' content, (ii) triggers the catapult by internal cavitation, and (iii) controls the time scales of energy release to ensure efficient spore ejection. The confluence of these three biomechanical functions within the confines of a single structure suggests a level of sophistication that goes beyond most man-made devices where specific structures or parts rarely serve more than one function. Here, we study in detail the three phases of spore ejection in the sporangia of the fern Polypodium aureum. For each of these phases, we have written the governing equations and measured the key parameters. For the opening of the sporangium, we show that the structural design of the annulus is particularly well suited to inducing bending deformations in response to osmotic volume changes. Moreover, the measured parameters for the osmoelastic design lead to a near-optimal speed of spore ejection (approx. 10 m s⁻¹). Our analysis of the trigger mechanism by cavitation points to a critical cavitation pressure of approximately -100 ± 14 bar, a value that matches the most negative pressures recorded in the xylem of plants. Finally, using high-speed imaging, we elucidated the physics leading to the sharp separation of time scales (30 versus 5000 µs) in the closing dynamics. Our results highlight the importance of the precise tuning of the parameters without which the function of the leptosporangium as a catapult would be severely compromised.


Subject(s)
Polypodium/anatomy & histology , Polypodium/physiology , Sporangia/anatomy & histology , Sporangia/physiology , Spores
11.
Int J Biometeorol ; 60(2): 255-67, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26045330

ABSTRACT

Fall armyworm, Spodoptera frugiperda (J.E. Smith), is a highly mobile insect pest of a wide range of host crops. However, this pest of tropical origin cannot survive extended periods of freezing temperature but must migrate northward each spring if it is to re-infest cropping areas in temperate regions. The northward limit of the winter-breeding region for North America extends to southern regions of Texas and Florida, but infestations are regularly reported as far north as Québec and Ontario provinces in Canada by the end of summer. Recent genetic analyses have characterized migratory pathways from these winter-breeding regions, but knowledge is lacking on the atmosphere's role in influencing the timing, distance, and direction of migratory flights. The Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model was used to simulate migratory flight of fall armyworm moths from distinct winter-breeding source areas. Model simulations identified regions of dominant immigration from the Florida and Texas source areas and overlapping immigrant populations in the Alabama-Georgia and Pennsylvania-Mid-Atlantic regions. This simulated migratory pattern corroborates a previous migratory map based on the distribution of fall armyworm haplotype profiles. We found a significant regression between the simulated first week of moth immigration and first week of moth capture (for locations which captured ≥ 10 moths), which on average indicated that the model simulated first immigration 2 weeks before first captures in pheromone traps. The results contribute to knowledge of fall armyworm population ecology on a continental scale and will aid in the prediction and interpretation of inter-annual variability of insect migration patterns including those in response to climatic change and adoption rates of transgenic cultivars.


Subject(s)
Animal Migration , Models, Theoretical , Spodoptera , Animals , Female , Male , Seasons , United States , Zea mays
12.
Appl Clin Inform ; 6(3): 443-53, 2015.
Article in English | MEDLINE | ID: mdl-26448790

ABSTRACT

OBJECTIVES: To assess the impact of introducing a new Picture Archiving and Communication System (PACS) and Radiology Information System (RIS) on: (i) Medical Imaging work processes; and (ii) turnaround times (TATs) for x-ray and CT scan orders initiated in the Emergency Department (ED). METHODS: We employed a mixed method study design comprising: (i) semi-structured interviews with Medical Imaging Department staff; and (ii) retrospectively extracted ED data before (March/April 2010) and after (March/April 2011 and 2012) the introduction of a new PACS/RIS. TATs were calculated as: processing TAT (median time from image ordering to examination) and reporting TAT (median time from examination to final report). RESULTS: Reporting TAT for x-rays decreased significantly after introduction of the new PACS/RIS; from a median of 76 hours to 38 hours per order (p<.0001) for patients discharged from the ED, and from 84 hours to 35 hours (p<.0001) for patients admitted to hospital. Medical Imaging staff reported that the changeover to the new PACS/RIS led to gains in efficiency, particularly regarding the accessibility of images and patient-related information. Nevertheless, assimilation of the new PACS/RIS with existing Departmental work processes was considered inadequate and in some instances unsafe. Issues highlighted related to the synchronization of work tasks (e.g., porter arrangements) and the material set up of the work place (e.g., the number and location of computers). CONCLUSIONS: The introduction of new health IT can be a "double-edged sword" providing improved efficiency but at the same time introducing potential hazards affecting the effectiveness of the Medical Imaging Department.


Subject(s)
Diagnostic Imaging , Medical Informatics/organization & administration , Radiology Information Systems , Workflow , Access to Information , Humans , Research Design , Time Factors
13.
Yearb Med Inform ; 10(1): 47-54, 2015 Aug 13.
Article in English | MEDLINE | ID: mdl-26293851

ABSTRACT

OBJECTIVES: To examine if human factors methods were applied in the design, development, and evaluation of mobile applications developed to facilitate aspects of patient-centered care coordination. METHODS: We searched MEDLINE and EMBASE (2013-2014) for studies describing the design or the evaluation of a mobile health application that aimed to support patients' active involvement in the coordination of their care. RESULTS: 34 papers met the inclusion criteria. Applications ranged from tools that supported self-management of specific conditions (e.g. asthma) to tools that provided coaching or education. Twelve of the 15 papers describing the design or development of an app reported the use of a human factors approach. The most frequently used methods were interviews and surveys, which often included an exploration of participants' current use of information technology. Sixteen papers described the evaluation of a patient application in practice. All of them adopted a human factors approach, typically an examination of the use of app features and/or surveys or interviews which enquired about patients' views of the effects of using the app on their behaviors (e.g. medication adherence), knowledge, and relationships with healthcare providers. No study in our review assessed the impact of mobile applications on health outcomes. CONCLUSION: The potential of mobile health applications to assist patients to more actively engage in the management of their care has resulted in a large number of applications being developed. Our review showed that human factors approaches are nearly always adopted to some extent in the design, development, and evaluation of mobile applications.


Subject(s)
Mobile Applications , Patient-Centered Care/organization & administration , Self Care , Humans , Patient Care Management
14.
Intern Med J ; 45(6): 609-17, 2015 Jun.
Article in English | MEDLINE | ID: mdl-25828553

ABSTRACT

BACKGROUND: Australia has a statutory incident reporting system for radiopharmaceutical maladministrations, but additional research into registry data is required for the purpose of quality improvement in nuclear medicine. AIMS: We (i) used control charts to identify factors contributing to special cause variation (indicating higher than expected rates) in maladministrations and (ii) evaluated the impact of heterogeneous notification criteria and extent of underreporting among jurisdictions and individual facilities, respectively. METHODS: Anonymised summaries of Australian Radiation Incident Register reports permitted calculation of national monthly maladministration notification rates for 2007-2012 and preparation of control charts. Multivariate logistic regression assessed the association of population, insurance and regulatory characteristics with maladministration notifications in each Australian State and Territory. Maladministration notification rates from two facilities with familiarity of notification processes and commitment to radiation protection were compared with those elsewhere. RESULTS: Special cause variation occurred in only 3 months, but contributed to 21% of all incidents (42 of 197 patients), mainly because of 'clusters' of maladministrations (n = 24) arising from errors in bulk radiopharmaceutical dispensing. Maladministration notification rates varied significantly between jurisdictions (0 to 12.2 maladministrations per 100 000 procedures; P < 0.05) and individual facilities (31.7 vs 5.8 per 100 000; χ² = 40, 1 degree of freedom; P < 0.001). CONCLUSIONS: Unexpected increases in maladministration notifications predominantly relate to incident 'clusters' affecting multiple patients. The bulk preparation of radiopharmaceuticals is a vulnerable process and merits additional safeguards. Maladministration notification rates in Australia are heterogeneous.
Adopting uniform maladministration notification criteria among States and Territories and methods to overcome underreporting are warranted.
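Control charts for notification rates of this kind are typically u-charts: events per unit of exposure, with limits that widen as exposure shrinks, and points above the upper limit flagged as special cause variation. A minimal sketch; the monthly counts and procedure volumes below are hypothetical, not the registry's data:

```python
import math

def u_chart_limits(counts, exposures, sigmas=3.0):
    """Centre line and per-period control limits for a u-chart
    (events per unit exposure, e.g. maladministrations per procedure)."""
    ubar = sum(counts) / sum(exposures)  # pooled centre line
    limits = []
    for n in exposures:
        spread = sigmas * math.sqrt(ubar / n)  # limits widen with less exposure
        limits.append((max(0.0, ubar - spread), ubar + spread))
    return ubar, limits

# Hypothetical monthly maladministration counts and procedure volumes
counts = [3, 2, 5, 1, 24, 6]     # month 5 contains a dispensing 'cluster'
exposures = [40_000] * 6         # procedures per month

ubar, limits = u_chart_limits(counts, exposures)
flagged = [i for i, (c, (lo, hi)) in enumerate(zip(counts, limits), start=1)
           if not lo <= c / exposures[i - 1] <= hi]
print(flagged)  # → [5] (the cluster month shows special cause variation)
```

A single batch error affecting many patients inflates one month's count far beyond the pooled rate, which is exactly the 'cluster' signature the study describes.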


Subject(s)
Medical Errors , Nuclear Medicine/standards , Quality Improvement/standards , Radiopharmaceuticals/adverse effects , Risk Management/standards , Australia/epidemiology , Female , Humans , Male , Medical Errors/legislation & jurisprudence , Medical Errors/prevention & control , Nuclear Medicine/legislation & jurisprudence , Quality Improvement/legislation & jurisprudence , Registries , Risk Management/legislation & jurisprudence
15.
Exp Neurol ; 263: 235-43, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25447937

ABSTRACT

Nitric oxide (NO) is a key signalling molecule in the regulation of cerebral blood flow. This review summarises current evidence regarding the role of NO in the regulation of cerebral blood flow at rest, under physiological conditions, and after brain injury, focusing on subarachnoid haemorrhage, traumatic brain injury, ischaemic stroke, and the period following cardiac arrest. We also review the role of NO in the response to hypoxic insult in the developing brain. NO depletion in ischaemic brain tissue plays a pivotal role in the development of subsequent morbidity and mortality through microcirculatory disturbance and disordered blood flow regulation. NO derived from endothelial nitric oxide synthase (eNOS) appears to have neuroprotective properties. However, NO derived from inducible nitric oxide synthase (iNOS) may have neurotoxic effects. Cerebral NO donor agents, for example sodium nitrite, appear to replicate the effects of eNOS-derived NO, and therefore have neuroprotective properties. This is true in both the adult and immature brain. We conclude that these agents should be further investigated as targeted pharmacotherapy to protect against secondary brain injury.


Subject(s)
Brain Injuries/metabolism , Cerebrovascular Circulation/physiology , Nitric Oxide/metabolism , Signal Transduction/physiology , Animals , Brain Injuries/pathology , Brain Injuries/physiopathology , Humans , Nitric Oxide Synthase/metabolism
16.
Intern Med J ; 44(10): 986-90, 2014 Oct.
Article in English | MEDLINE | ID: mdl-24989476

ABSTRACT

BACKGROUND: Previous work has examined the impact of technology on information sharing and communication between doctors and patients in general practice consultations, but very few studies have explored this in hospital settings. AIMS: To assess if, and how, senior clinicians use an iPad to share information (e.g. patient test results) with patients during ward rounds, and to explore patients' and doctors' experiences of information sharing events. METHODS: Ten senior doctors were shadowed on ward rounds on general wards during interactions with 525 patients over 77.3 h, seven senior doctors were interviewed, and 180 patients completed a short survey. RESULTS: Doctors reported that information sharing with patients is critical to the delivery of high-quality healthcare, but were not observed using the iPad to share information with patients on ward rounds. Patients did not think the iPad had affected their engagement with doctors on rounds. Ward rounds were observed to follow set routines, and patient interactions were brief. CONCLUSIONS: Although the iPad potentially creates new opportunities for information sharing and patient engagement, the ward round may not present the most appropriate context for this.


Subject(s)
Computers, Handheld/statistics & numerical data , Electronic Health Records/statistics & numerical data , Information Dissemination , Patient Satisfaction/statistics & numerical data , Physicians , Quality of Health Care/statistics & numerical data , Attitude of Health Personnel , Attitude to Computers , Communication , Health Care Surveys , Humans , Physician-Patient Relations , Practice Patterns, Physicians'/statistics & numerical data , Teaching Rounds
17.
Cogn Affect Behav Neurosci ; 14(2): 443-72, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24920442

ABSTRACT

Recent years have seen a rejuvenation of interest in studies of motivation-cognition interactions arising from many different areas of psychology and neuroscience. The present issue of Cognitive, Affective, & Behavioral Neuroscience provides a sampling of some of the latest research from a number of these different areas. In this introductory article, we provide an overview of the current state of the field, in terms of key research developments and candidate neural mechanisms receiving focused investigation as potential sources of motivation-cognition interaction. However, our primary goal is conceptual: to highlight the distinct perspectives taken by different research areas, in terms of how motivation is defined, the relevant dimensions and dissociations that are emphasized, and the theoretical questions being targeted. Together, these distinctions present both challenges and opportunities for efforts aiming toward a more unified and cross-disciplinary approach. We identify a set of pressing research questions calling for this sort of cross-disciplinary approach, with the explicit goal of encouraging integrative and collaborative investigations directed toward them.


Subject(s)
Cognition/physiology , Motivation/physiology , Animals , Humans , Neuropsychological Tests
18.
Int J Biometeorol ; 58(5): 931-40, 2014 Jul.
Article in English | MEDLINE | ID: mdl-23748420

ABSTRACT

Corn earworm (Lepidoptera: Noctuidae) (CEW) populations infesting one crop production area may rapidly migrate and infest distant crop production areas. Although entomological radars have detected corn earworm moth migrations, the spatial extent of the radar coverage has been limited to a small horizontal view above crop production areas. The Weather Service Radar (version 88D) (WSR-88D) continuously monitors the radar-transmitted energy reflected by, and radial speed of, biota as well as precipitation over areas that may encompass crop production areas. We analyzed data from the WSR-88D radar (S-band) at Brownsville, Texas, and related these data to aerial concentrations of CEW estimated by a scanning entomological radar (X-band) and wind velocity measurements from rawinsonde and pilot balloon ascents. The WSR-88D radar reflectivity was positively correlated (r² = 0.21) with the aerial concentration of corn earworm-size insects measured by a scanning X-band radar. WSR-88D radar constant altitude plan position indicator estimates of wind velocity were positively correlated with wind speed (r² = 0.56) and wind direction (r² = 0.63) measured by pilot balloons and rawinsondes. The results reveal that WSR-88D radar measurements of insect concentration and displacement speed and direction can be used to estimate the migratory flux of corn earworms and other nocturnal insects, information that could benefit areawide pest management programs. In turn, identification of the effects of spatiotemporal patterns of migratory flights of corn earworm-size insects on WSR-88D radar measurements may lead to the development of algorithms that increase the accuracy of WSR-88D radar measurements of reflectivity and wind velocity for operational meteorology.
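The r² values reported in this abstract are coefficients of determination for simple linear fits between paired radar measurements. A minimal sketch of that computation, using invented numbers in place of the study's reflectivity and insect-concentration data:

```python
# Illustrative only: the paired values below are made up and stand in for
# WSR-88D reflectivity and X-band insect-concentration estimates.
reflectivity = [5.0, 8.0, 12.0, 15.0, 20.0, 25.0]
concentration = [1.0, 3.0, 2.5, 6.0, 5.5, 9.0]

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x
    (the square of Pearson's correlation coefficient)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

print(round(r_squared(reflectivity, concentration), 2))
```

An r² of 0.21, as reported for reflectivity versus insect concentration, means the linear fit explains about a fifth of the variance, a modest but usable relationship for estimating migratory flux.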


Subject(s)
Moths , Radar , Animal Migration , Animals , Mexico , Population Density , Refractometry , Texas , Wind
19.
Appl Clin Inform ; 5(4): 971-87, 2014.
Article in English | MEDLINE | ID: mdl-25589911

ABSTRACT

INTRODUCTION: Electronic medication administration record (eMAR) systems are promoted as a potential intervention to enhance medication safety in residential aged care facilities (RACFs). The purpose of this study was to conduct an in-practice evaluation of an eMAR being piloted in one Australian RACF before its roll out, and to provide recommendations for system improvements. METHODS: A multidisciplinary team conducted direct observations of workflow (n=34 hours) in the RACF site and the community pharmacy. Semi-structured interviews (n=5) with RACF staff and the community pharmacist were conducted to investigate their views of the eMAR system. Data were analysed using a grounded theory approach to identify challenges associated with the design of the eMAR system. RESULTS: The current eMAR system does not offer an end-to-end solution for medication management. Many steps, including prescribing by doctors and communication with the community pharmacist, are still performed manually using paper charts and fax machines. Five major challenges associated with the design of the eMAR system were identified: limited interactivity; inadequate flexibility; problems related to information layout and semantics; the lack of relevant decision support; and system maintenance issues. We suggest recommendations to improve the design of the eMAR system and to optimise existing workflows. DISCUSSION: Immediate value can be achieved by improving system interactivity, reducing inconsistencies in data entry design and offering dedicated organisational support to minimise connectivity issues. Longer-term benefits can be achieved by adding decision support features and establishing system interoperability requirements with stakeholder groups (e.g. community pharmacies) prior to system roll out. In-practice evaluations of technologies like the eMAR system have great value in identifying design weaknesses which inhibit optimal system use.


Subject(s)
Clinical Pharmacy Information Systems , Electronic Health Records , Residential Facilities/statistics & numerical data , Decision Support Systems, Clinical , Humans , Semantics
20.
Intern Med J ; 43(12): 1321-6, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23800071

ABSTRACT

BACKGROUND: It is imperative to understand the current work practices of hospital personnel to inform efforts and secure resources towards the improvement of hospital systems. Research examining doctors' work during night-shifts is limited. AIM: To describe and quantify the night-shift work practices of junior doctors. METHODS: An observational time and motion study was conducted. Eight resident doctors in four general wards were observed for 96 h during night shifts (Monday-Friday, 2200-0800). RESULTS: Doctors spent the highest proportion (28%; 95% CI 21-35) of their time performing social/personal tasks (e.g. sleeping, eating) and indirect care (24%; 95% CI 22-25) (e.g. reviewing notes, ordering tests). Work-related discussion comprised 15% (95% CI 13-17), and most took place at the beginning of the night. Medication-related tasks consumed a small proportion of time (4%; 95% CI 3-4) but attracted a higher level of multitasking and interruptions than most other tasks. On average, 2 h of every shift were spent at a computer and 1.3 h with patient notes. Doctors spent 72% of the night-shift alone, multitasked 6.4% of the time and were interrupted, on average, once every 46 min. CONCLUSIONS: This study provides new data about junior doctors' work at night. Relative to doctors during the day, greater proportions of time were devoted to social/personal tasks (including sleep) and indirect care, but a similar proportion to direct care. Multitasking and interruptions were minimal. Computer activities were an integral part of work. Handovers were observed at the beginning but not the completion of the night shift, which may have implications for patient safety.


Subject(s)
Hospitals, Teaching/statistics & numerical data , Medical Staff, Hospital/statistics & numerical data , Time and Motion Studies , Work Schedule Tolerance , Hospitals, Teaching/methods , Humans , Medical Staff, Hospital/psychology , Time Factors , Work Schedule Tolerance/psychology