1.
JMIR AI ; 2: e48628, 2023 Sep 13.
Article in English | MEDLINE | ID: mdl-38875535

ABSTRACT

BACKGROUND: Infusion failure may have severe consequences for patients receiving critical, short-half-life infusions. Continued interruptions to infusions can lead to subtherapeutic therapy. OBJECTIVE: This study aims to identify and rank determinants of the longevity of continuous infusions administered through syringe drivers, using nonlinear predictive models. Additionally, this study aims to evaluate key factors influencing infusion longevity and develop and test a model for predicting the likelihood of achieving successful infusion longevity. METHODS: Data were extracted from the event logs of smart pumps containing information on care profiles, medication types and concentrations, occlusion alarm settings, and the final infusion cessation cause. These data were then used to fit 5 nonlinear models and evaluate the best explanatory model. RESULTS: Random forest was the best-fit predictor, with an F1-score of 80.42, compared to 5 other models (mean F1-score 75.06; range 67.48-79.63). When applied to infusion data in an individual syringe driver data set, the predictor model found that the final medication concentration and medication type were of less significance to infusion longevity compared to the rate and care unit. For low-rate infusions, rates ranging from 2 to 2.8 mL/hr performed best for achieving a balance between infusion longevity and fluid load per infusion, with an occlusion versus no-occlusion ratio of 0.553. Rates between 0.8 and 1.2 mL/hr exhibited the poorest performance with a ratio of 1.604. Higher rates, up to 4 mL/hr, performed better in terms of occlusion versus no-occlusion ratios. CONCLUSIONS: This study provides clinicians with insights into the specific types of infusion that warrant more intense observation or proactive management of intravenous access; additionally, it can offer valuable information regarding the average duration of uninterrupted infusions that can be expected in these care areas. 
Optimizing rate settings to improve infusion longevity for continuous infusions, achieved through compounding to create customized concentrations for individual patients, may be possible in light of the study's outcomes. The study also highlights the potential of machine learning nonlinear models in predicting outcomes and life spans of specific therapies delivered via medical devices.
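The occlusion versus no-occlusion ratio reported per rate band can be computed by a simple grouping pass over pump event records; the records and band edges below are illustrative assumptions, not the study's data:

```python
from collections import defaultdict

# Hypothetical pump event records: (rate in mL/hr, infusion ended in occlusion?).
# These values are invented; the study's real event logs are far richer.
events = [
    (1.0, True), (1.0, True), (1.2, False),
    (2.4, False), (2.6, True), (2.8, False),
    (3.6, False), (4.0, False), (4.0, True),
]

# Illustrative rate bands, loosely following the ranges quoted in the abstract.
BANDS = [(0.8, 1.2), (2.0, 2.8), (2.9, 4.0)]

def band_of(rate):
    for band in BANDS:
        if band[0] <= rate <= band[1]:
            return band
    return None

counts = defaultdict(lambda: [0, 0])  # band -> [occluded, ran to completion]
for rate, occluded in events:
    band = band_of(rate)
    if band is not None:
        counts[band][0 if occluded else 1] += 1

# Occlusion versus no-occlusion ratio per band (the abstract's key metric).
ratios = {band: occ / non for band, (occ, non) in counts.items()}
```

With these invented records, the low-rate band shows the worst ratio, mirroring the pattern the study reports.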

2.
JMIR Hum Factors ; 9(4): e37905, 2022 Oct 11.
Article in English | MEDLINE | ID: mdl-36222805

ABSTRACT

BACKGROUND: Outpatient pharmacy management aims for improved patient safety, improved quality of service, and cost reduction. The Six Sigma method improves quality by eliminating variability, with the goal of a nearly error-free process. Automation of pharmacy tasks potentially offers greater efficiency and safety. OBJECTIVE: The goal was to measure the impact that the integration of automation made on service, safety and efficiency, staff reallocation and reorientation, and workflow in the outpatient pharmacy department. The Six Sigma problem definition to be resolved was as follows: The current system of outpatient dispensing denies quality to patients in terms of waiting time and contact time with pharmacy professionals, incorporates risks to the patient in terms of mislabeling of medications and the incomplete dispensing of prescriptions, and is potentially wasteful in terms of time and resources. METHODS: We described the process of introducing automation to a large outpatient pharmacy department in a university hospital. The Six Sigma approach was used as it focuses on continuous improvement and also produces a road map that integrates tracking and monitoring into its process. A review of activity in the outpatient department focused on non-value-added (NVA) pharmacist tasks, improving the patient experience and patient safety. Metrics to measure the impact of change were established, and a process map analysis with turnaround times (TATs) for each stage of service was created. Discrete events were selected for correction, improvement, or mitigation. From the review, the team selected key outcome metrics, including storage, picking and delivery dispensing rates, patient and prescription load per day, average packs and lines per prescription, and lines held. Our goal was total automation of stock management. We deployed 2 robotic dispensing units to feed 9 dispensing desks.
The automated units were integrated with hospital information technology (HIT) that supports appointments, medication records, and prescriptions. RESULTS: Postautomation, the total patient time in the department, including the time interacting with the pharmacist for medication education and counseling, dropped from 17.093 to 11.812 digital minutes, with an appreciable increase in patient-pharmacist time. The percentage of incomplete prescriptions dispensed versus orders decreased from 3.0% to 1.83%. The dispensing error rate dropped from 1.00% to 0.24%. Assessed via a "basket" of medications, wastage cost was reduced by 83.9%. During implementation, it was found that NVA tasks that were replaced by automated processes were responsible for an extensive loss of pharmacist time. The productivity ratio postautomation was 1.26. CONCLUSIONS: The Six Sigma methodology allowed for rapid transformation of the medication management process. The risk priority numbers (RPNs) for the "wrong patient-wrong medication error" reduced by a ratio of 5.25:1 and for "patient leaves unit with inadequate counseling" postautomation by 2.5:1. Automation allowed for ring-fencing of patient-pharmacist time. This time needs to be structured for optimal effectiveness.
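The risk priority numbers quoted in the conclusions follow the standard FMEA formula (RPN = severity × occurrence × detection). The sketch below shows the calculation; the individual scores are invented for illustration and chosen only so that the before/after ratio matches the reported 5.25:1:

```python
def rpn(severity, occurrence, detection):
    """Failure-mode risk priority number (standard FMEA definition)."""
    return severity * occurrence * detection

# Invented scores for the "wrong patient-wrong medication" failure mode,
# NOT the study's actual FMEA scores.
before = rpn(severity=8, occurrence=7, detection=3)  # manual dispensing
after = rpn(severity=8, occurrence=2, detection=2)   # post-automation
print(before, after, before / after)  # 168 32 5.25
```

Note that severity is held constant: automation reduces how often the failure occurs and improves its detectability, but not how harmful it would be if it reached the patient.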

3.
JMIR Form Res ; 6(4): e36710, 2022 Apr 26.
Article in English | MEDLINE | ID: mdl-35471247

ABSTRACT

BACKGROUND: There is a paucity of information in the literature on core nursing staff knowledge of the requirements of specific intravenous administration lines for medications regularly given in critical care. There is also a lack of well-researched and appropriate information in the literature for intravenous administration line selection, and the need for filtration, protection from light, and other line-material selection precautions for many critical and noncritical medications used in these settings to maintain their potency and efficacy. OBJECTIVE: We aimed to assess the knowledge gap of clinicians with respect to intravenous administration line set material requirements for critical care medications. METHODS: Data were drawn from a clinician knowledge questionnaire, a region-wide database of administered infusions, and regional data on standard and special intravenous administration line consumption for 1 year (2019-2020) from an enterprise resource planning system log. The clinician knowledge questionnaire was validated with 3 groups (n=35) and then released for a general survey of critical care nurses (n=72) by assessing response dispersal and interrater reliability (Cronbach α=.889). Correct answers were determined by referencing available literature, with consensus between the team's pharmacists. Percentage deviations from correct answers (which had multiple possible selections) were calculated for control and test groups. We reviewed all 3 sources of information to identify the gap between required usage and real usage, and the impact of knowledge deficits on this disparity. RESULTS: Percentage deviations from correct answers were substantial in the control groups and extensive in the test group for all medications tested (percentage deviation range -43% to 93%), with the exception of total parenteral nutrition.
Respondents scored poorly on questions about medications requiring light protection, and there was a difference of 2.75% between actual consumption of lines and expected consumption based on medication type requirement. Confusion over the requirements for low-sorbing lines, light protection of infusions, and the requirement for filtration of specific solutions was evident in all evidence sources. The consumption of low-sorbing lines (125,090/1,454,440, 8.60%) was larger than the regional data of medication usage data would suggest as being appropriate (15,063/592,392, 2.54%). CONCLUSIONS: There is no single source of truth for clinicians on the interactions of critical care intravenous medications and administration line materials, protection from light, and filtration. Nursing staff showed limited knowledge of these requirements. To reduce clinical variability in this area, it is desirable to have succinct easy-to-access information available for clinicians to make decisions on which administration line type to use for each medication. The study's results will be used to formulate solutions for bedside delivery of accurate information on special intravenous line requirements for critical care medications.

4.
JMIR Hum Factors ; 8(4): e29180, 2021 Nov 02.
Article in English | MEDLINE | ID: mdl-34456182

ABSTRACT

BACKGROUND: The forms of automation available to the oncology pharmacy range from compounding robotic solutions through to combination workflow software, which can scale up to cover the entire workflow from prescribing to administration. A solution that offers entire workflow management for oncology is desirable because (in terms of cytotoxic delivery of a regimen to a patient) the chain that starts with prescription and the assay of the patient's laboratory results and ends with administration has multiple potential safety gaps and choke points. OBJECTIVE: The aim of this study was to show how incremental change to a core compounding workflow software solution has helped an organization meet goals of improved patient safety; increasing the number of oncology treatments; improving documentation; and improving communication between oncologists, pharmacists, and nurses. We also aimed to illustrate how using this technology flow beyond the pharmacy has extended medication safety to the patient's bedside through the deployment of a connected solution for confirming and documenting right patient-right medication transactions. METHODS: A compounding workflow software solution was introduced for both preparation and documentation, with pharmacist verification of the order, gravimetric checks, and step-by-step on-screen instructions displayed in the work area for the technician. The software supported the technician during compounding by proposing the required drug vial size, diluents, and consumables. Out-of-tolerance concentrations were auto-alerted via an integrated gravimetric scale. A patient-medication label was created. Integration was undertaken between a prescribing module and the compounding module to reduce the risk of transcription errors.
The deployment of wireless-connected handheld barcode scanners was then made to allow nurses to use the patient-medication label on each compounded product and to scan patient identification bands to ensure right patient-right prescription. RESULTS: Despite an increase in compounding, with a growth of 12% per annum and no increase in pharmacy headcount, we doubled our output to 14,000 medications per annum through the application of the compounding solution. The use of a handheld barcode scanning device for nurses reduced the time for medication administration from ≈6 minutes per item to 41 seconds, with a mean average saving of 5 minutes and 19 seconds per item. When calculated against our throughput of 14,000 items per annum (current production rate via pharmacy), this gives a saving of 3 hours and 24 minutes of nursing time per day, equivalent to 0.425 full-time nurses per annum. CONCLUSIONS: The addition of prescribing, compounding, and administration software solutions to our oncology medication chain has increased detection and decreased the risk of error at each stage of the process. The double-checks that the system has built in by virtue of its own systems and through the flow of control of drugs and dosages from physician to pharmacist to nurse allow it to integrate fully with our human systems of risk management.
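The nursing-time figures in the results follow from simple arithmetic on the abstract's own numbers; the sketch below reproduces them, with the 8-hour shift length taken as an assumption since the abstract does not state it:

```python
# Figures from the abstract: ~6 minutes per item before barcode scanning,
# 41 seconds after, against an annual throughput of 14,000 items.
SECONDS_BEFORE = 6 * 60   # assumes "≈6 minutes" means exactly 360 s
SECONDS_AFTER = 41
ITEMS_PER_YEAR = 14_000

saving_per_item = SECONDS_BEFORE - SECONDS_AFTER           # 319 s = 5 min 19 s
saving_per_day = saving_per_item * ITEMS_PER_YEAR / 365    # in seconds

hours, rem = divmod(saving_per_day, 3600)
print(f"{int(hours)} h {round(rem / 60)} min of nursing time saved per day")

# Full-time-equivalent saving, assuming an 8-hour nursing shift.
fte = saving_per_day / (8 * 3600)
print(round(fte, 3))  # 0.425
```

Running this reproduces the abstract's 3 h 24 min per day and 0.425 FTE figures.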

5.
JMIR Hum Factors ; 8(3): e28381, 2021 Sep 01.
Article in English | MEDLINE | ID: mdl-34304149

ABSTRACT

BACKGROUND: We describe the introduction, use, and evaluation of an automation and integration pharmacy development program in a private facility in Saudi Arabia. The project was specifically undertaken to increase throughput, reduce medication dispensing error rates, improve patient satisfaction, and free up pharmacists' time to allow for increased face-to-face consultations with patients. OBJECTIVE: We forecasted growth of our outpatient service at 25% per annum over 5- and 10-year horizons and set out to prepare our outpatient pharmacy service to meet this demand. Initial project goals were set as a 50% reduction in the average patient wait time, a 15% increase in patient satisfaction regarding pharmacy wait time and pharmacy services, a 25% increase in pharmacist productivity, and zero dispensing errors. This was expected to be achieved within 10 months of go-live. Realignment of pharmacist activity toward counseling and medication review with patients was a secondary goal, along with the rapid development of a reputation in the served community for patient-centered care. METHODS: Preimplementation data for patient wait time for dispensing of prescribed medications as a specific measure of patient satisfaction were gathered as part of wider ongoing data collection in this field. Pharmacist activity and productivity in terms of patient interaction time were gathered. Reported and discovered dispensing errors per 1000 prescriptions were also aggregated. All preimplementation data were gathered over an 11-month period. RESULTS: From go-live, data were gathered on the above metrics in 1-month increments. At the 10-month point, there had been a 53% reduction in the average wait time, a 20% increase in patient satisfaction regarding pharmacy wait time, with a 22% increase in overall patient satisfaction regarding pharmacy services, and a 33% increase in pharmacist productivity. A zero dispensing error rate was reported.
CONCLUSIONS: The robotic pharmacy solution studied was highly effective, but a robust upstream supply chain is vital to ensure stock levels, particularly when automated filling is planned. The automation solution must also be seamlessly and completely integrated into the facility's software systems for appointments, medication records, and prescription generation in order to garner its full benefits. Overall patient satisfaction with pharmacy services is strongly influenced by wait time, and follow-up studies are required to identify how to use this positive effect and make optimal use of freed-up pharmacist time. The extra time spent by pharmacists with patients and the opportunity for complete overview of the patient's medication history, which full integration provides, may allow us to address challenging issues such as medication nonadherence. Reduced wait times may also allow for smaller prescription fill volumes and more frequent outpatient department visits, allowing patients to have increased contact time with pharmacists.

6.
JMIR Hum Factors ; 7(3): e20364, 2020 Aug 11.
Article in English | MEDLINE | ID: mdl-32667895

ABSTRACT

BACKGROUND: There is a paucity of quantitative evidence in the current literature on the incidence of wrong medication and wrong dose administration of intravenous medications by clinicians. The difficulties of obtaining reliable data are related to the fact that at this stage of the medication administration chain, detection of errors is extremely difficult. Smart pump medication library logs and their reporting software record medication and dose selections made by users, as well as cancellations of selections and the time between these actions. Analysis of these data adds quantitative data to the detection of these kinds of errors. OBJECTIVE: We aimed to establish, in a reproducible and reliable study, baseline data to show how metrics in the set-up and programming phase of intravenous medication administration can be produced from medication library near-miss error reports from infusion pumps. METHODS: We performed a 12-month retrospective review of medication library reports from infusion pumps from across a facility to obtain metrics on the set-up phase of intravenous medication administration. Cancelled infusions and resolutions of all infusion alerts by users were analyzed. Decision times of clinicians were calculated from the time-date stamps of the pumps' logs. RESULTS: Incorrect medication selections represented 3.45% (10,017/290,807) of all medication library alerts and 22.40% (10,017/44,721) of all cancelled infusions. Of these cancelled medications, all high-risk medications, oncology medications, and all intravenous medications delivered to pediatric patients and neonates required a two-nurse check according to the local policy. Wrong dose selection was responsible for 2.93% (8533/290,807) of all alarms and 19.08% (8533/44,721) of infusion cancellations. Average error recognition to cancellation and correction times were 27.00 s (SD 22.25) for medication error correction and 26.52 s (SD 24.71) for dose correction. 
The mean character count of medications corrected from initial lookalike-soundalike selection errors was 13.04, with a heavier distribution toward higher character counts. The position of the word/phrase error was spread among name beginning (6991/10,017, 69.79%), middle (2144/10,017, 21.40%), and end (882/10,017, 8.80%). CONCLUSIONS: The study identified a high number of lookalike-soundalike near miss errors, with cancellation of one medication being rapidly followed by the programming of a second. This phenomenon was largely centered on initial misreadings of the beginning of the medication name, with some incidences of misreading in the middle and end portions of medication nomenclature. The value of an infusion pump showing the entire medication name complete with TALLman lettering on the interface matching that of medication labeling is supported by these findings. The study provides a quantitative appraisal of an area that has been resistant to study and measurement, which is the number of intravenous medication administration errors of wrong medication and wrong dose that occur in clinical settings.
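The decision-time metric in the methods (the time from a medication selection to its cancellation, read off the pumps' time-date stamps) can be sketched as a pairing pass over an event log; the log format and values here are hypothetical simplifications of a smart-pump log:

```python
from datetime import datetime
from statistics import mean, stdev

# Hypothetical log rows: (timestamp, event, infusion id). Real pump logs
# carry many more fields (medication, dose, care profile, alarm codes).
log = [
    ("2020-03-01 08:00:00", "SELECT", "A"),
    ("2020-03-01 08:00:25", "CANCEL", "A"),
    ("2020-03-01 09:10:00", "SELECT", "B"),
    ("2020-03-01 09:10:31", "CANCEL", "B"),
    ("2020-03-01 10:05:00", "SELECT", "C"),
    ("2020-03-01 10:05:28", "CANCEL", "C"),
]

selected = {}   # open selections awaiting a possible cancellation
deltas = []     # selection-to-cancellation times in seconds
for stamp, event, infusion in log:
    t = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
    if event == "SELECT":
        selected[infusion] = t
    elif event == "CANCEL" and infusion in selected:
        deltas.append((t - selected.pop(infusion)).total_seconds())

print(f"mean {mean(deltas):.2f} s (SD {stdev(deltas):.2f})")  # mean 28.00 s (SD 3.00)
```

The study's reported correction times (≈27 s with an SD of ≈22-25 s) are aggregates of exactly this kind of timestamp difference, computed over a year of logs.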

7.
JMIR Hum Factors ; 6(3): e14123, 2019 Aug 12.
Article in English | MEDLINE | ID: mdl-31407667

ABSTRACT

BACKGROUND: Alarm fatigue commonly leads to a reduced response to alarms. Appropriate and timely response to intravenous pump alarms is crucial to infusion continuity. The difficulty of filtering out critical short half-life infusion alarms from nonurgent alarms is a key challenge for risk management for clinicians. Critical care areas provide ample opportunities for intravenous medication error with the frequent administration of high-alert, critical short half-life infusions that require rigorous maintenance for continuity of delivery. Most serious medication errors in critical care occur during the execution of treatment, with performance-level failures outweighing rule-based or knowledge-based mistakes. OBJECTIVE: One objective of this study was to establish baseline data for the types and frequency of alarms that critical care clinicians are exposed to from a variety of infusion devices, including both large volume pumps and syringe drivers. Another objective was to identify the volume of these alarms that specifically relate to critical short half-life infusions and to evaluate user response times to alarms from infusion devices delivering these particular infusions. METHODS: The event logs of 1183 infusion pumps used in critical care environments and in general care areas within the European region were mined for a range of alarm states. The study then focused on a selection of infusion alarms from devices delivering critical short half-life infusions that would warrant rapid attention from clinicians in order to avoid potentially harmful prolonged infusion interruption. The reaction time of clinicians to infusion-interruption states and alarms for the selected critical short half-life infusions was then calculated. RESULTS: Initial analysis showed a mean average of 4.50 alarms per infusion in the general critical care pump population as opposed to the whole hospital rate of 1.39. 
In the pediatric intensive care unit (PICU) group, the alarms per infusion value was significantly above the mean average for all critical care areas, with 8.61 alarms per infusion. Infusion-interruption of critical short half-life infusions was found to be a significant problem in all areas of the general critical care pump population, with a significant number of downstream (ie, vein and access) occlusion events noted. While the mean and median response times to critical short half-life infusion interruptions were generally within the half-lives of the selected medications, there was a high prevalence of outliers in terms of reaction times for all the critical short half-life infusions studied. CONCLUSIONS: This study gives an indication of what might be expected in critical care environments in terms of the volume of general infusion alarms and critical short half-life infusion alarms, as well as for clinician reaction times to critical short half-life infusion-interruption events. This study also identifies potentially problematic areas of the hospital for alarm fatigue and for particular issues of infusion and infusion-line management. Application of the proposed protocols can help create benchmarks for pump alarm management and clinician reaction times. These protocols can be applied to studies on the impact of alarm fatigue and for the evaluation of protocols, infusion-monitoring strategies, and infusion pump-based medication safety software aimed at reducing alarm fatigue and ensuring the maintenance of critical short half-life infusions. Given the frequency of infusion alarms seen in this study, the risk of alarm fatigue due to the white noise of pump alarms present in critical care, to which clinicians are constantly exposed, is very high. 
Furthermore, the added difficulties of maintaining critical short half-life infusions, and other infusions in specialist areas, are made clear by the high ratio of downstream occlusion to infusion starts in the neonatal intensive care unit (NICU). The ability to quantitatively track the volume of alarms and clinician reaction times contributes to a greater understanding of the issues of alarm fatigue in intensive care units. This can be applied to clinical audit, can allow for targeted training to reduce nuisance alarms, and can aid in planning for improvement in the key area of maintenance of steady-state plasma levels of critical short half-life infusions. One clear conclusion is that the medication administration rights should be extended to include right maintenance and ensured delivery continuity of critical short half-life infusions.
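The alarms-per-infusion benchmark used throughout the results is a plain ratio of event counts per care area; the counts below are invented so that the ratios reproduce the figures quoted in the abstract (1.39 hospital-wide, 4.50 in critical care, 8.61 in the PICU):

```python
# Hypothetical (alarm count, infusion count) per care area; the real study
# mined these totals from the event logs of 1183 pumps.
areas = {
    "hospital-wide": (139_000, 100_000),
    "critical care": (45_000, 10_000),
    "PICU": (8_610, 1_000),
}

alarms_per_infusion = {area: alarms / infusions
                       for area, (alarms, infusions) in areas.items()}
for area, rate in alarms_per_infusion.items():
    print(f"{area}: {rate:.2f} alarms per infusion")
```

Normalizing by infusion count rather than comparing raw alarm totals is what makes areas with very different infusion volumes comparable.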

8.
J Environ Manage ; 181: 770-778, 2016 Oct 01.
Article in English | MEDLINE | ID: mdl-27444723

ABSTRACT

Previous studies have demonstrated both beneficial and detrimental effects on soil properties from biochar incorporation. Several biochars, with different feedstock origins, were evaluated for their effectiveness at improving soil quality of a sandy agricultural soil. A pot trial was used to investigate aggregate stability and microbial activity, pore water trace element mobility and micronutrient concentrations in grain of spring wheat after incorporation of three biochars. The feedstocks for biochar production were selected because they were established UK waste products, namely oversize woody material from green waste composting facilities, and rhododendron and soft wood material from forest clearance operations. Biochars were incorporated into the soil at a rate of 5% v/v. Aggregate stability was improved following addition of oversize biochar whilst microbial activity increased in all treatments. Dissolved organic carbon (DOC) concentrations in soil pore water from biochar-treated soils were raised, whilst micronutrient concentrations in wheat grain grown in the treated soils were significantly reduced. It was concluded that incorporation of biochar into temperate agricultural soils requires caution as it may result in reductions of essential grain micronutrients required for human health, whilst the effect on aggregate stability may be linked to organic carbon functional groups on biochar surfaces and labile carbon released from the char into the soil system.


Subjects
Charcoal/chemistry, Charcoal/pharmacokinetics, Soil/chemistry, Triticum/chemistry, Waste Products, Agriculture, Biological Availability, Carbon/analysis, Carbon/chemistry, Rhododendron, Soil Microbiology, Trace Elements/pharmacokinetics, Triticum/growth & development, United Kingdom, Wood
9.
Am J Disaster Med ; 9(4): 273-85, 2014.
Article in English | MEDLINE | ID: mdl-25672330

ABSTRACT

OBJECTIVE: Delineation of the advantages and problems related to the use of forward-site operating room-, Intensive Care Unit (ICU)-, radiography-, and mass casualty-enabled disaster vehicles for site evacuation, patient stabilization, and triage. SETTING: The vehicles discussed have six ventilated ICU spaces, two ORs, on-site radiography, 21 intermediate acuity spaces with stretchers, and 54 seated minor acuity spaces. Each space has piped oxygen with an independent vehicle-loaded supply. The vehicles are operated by the Dubai Corporate Ambulance Services. Their support hospital is the main trauma center for the Emirate of Dubai and provides the vehicles' surgical, intensivist, anesthesia, and nursing staff. The disaster vehicles have been deployed 264 times in the last 5 years (these figures do not include deployments for drills). INTERVENTIONS: Introducing this new service required extensive initial planning and ongoing analysis of the performance of the disaster vehicles that offer ambulance services and receiving hospitals a large array of possibilities in terms of triage, stabilization of priority I and II patients, and management of priority III patients. PRELIMINARY RESULTS: In both drills and disasters, the vehicles were valuable in forward triage and stabilization and in the transport of large numbers of priority III patients. This has avoided the depletion of emergency transport available for priority I and II patients. CONCLUSIONS: The successful utilization of disaster vehicles requires seamless cooperation between the hospital staffing the vehicles and the ambulance service deploying them. They are particularly effective during preplanned deployments to high-risk situations. These vehicles also potentially provide self-sufficient refuges for forward teams in hostile environments.


Subjects
Disaster Planning, Emergency Medical Services/organization & administration, Mass Casualty Incidents, Mobile Health Units, Motor Vehicles, Triage, Critical Care, Diagnostic Imaging, Humans, Operating Rooms, Program Development, Program Evaluation, Transportation of Patients
10.
Am J Disaster Med ; 6(1): 39-46, 2011.
Article in English | MEDLINE | ID: mdl-21466028

ABSTRACT

OBJECTIVES: Delineation of the problem of physician role during disaster activations both for disaster responders and for general physicians in a Middle East state facility. SETTING: The hospital described has 500 medical-surgical beds, 59 intensive care unit beds, eight operating rooms (ORs), and 60 emergency room (ER) beds. Its ER sees 150,000 presentations per year and between 11 and 26 multitrauma cases per day. Most casualties are the result of industrial accidents (50.5 percent) and road traffic accidents (34 percent). It is the principal trauma center for Dubai, UAE. The hospital is also the designated primary regional responder for medical, chemical, and biological events. Its disaster plan has been activated 10 times in the past 3 years and it is consistently over its bed capacity. INTERVENTIONS: A review of the activity of physicians during disaster activations revealed problems of role identification, conflict, and lack of training. Interventions included training nonacute teams in reverse triaging and responder teams in coordinated emergency care. Both actions were fostered and controlled by a Disaster Control Centre and its Committee. RESULTS: Clear identification of medical leadership in disaster situations, introduction of a process of reverse triage to meet surge based on an ethical framework, and improvement of flow through the ER and OR. CONCLUSIONS: Reverse triage can be made to work in the Middle East despite its lack of primary healthcare infrastructure. Lessons from the restructuring of responder teams may be applicable to the deployment to prehospital environments of hospital teams, and further development of audit tools is required to measure improvement in these areas.


Subjects
Disaster Planning/organization & administration, Disasters, Emergency Medical Services/organization & administration, Physician's Role, Triage/organization & administration, Emergency Service, Hospital/organization & administration, Hospital Bed Capacity, Humans, Leadership, Operating Rooms/organization & administration, United Arab Emirates