Results 1 - 13 of 13
1.
Air Med J ; 37(3): 203-205, 2018.
Article in English | MEDLINE | ID: mdl-29735234

ABSTRACT

A 30-year-old woman, gravida 1, para 2, in her second trimester presented to the local emergency department complaining of an atraumatic headache described as the worst headache of her life. While undergoing evaluation, she became unresponsive with signs of herniation, including a blown pupil and bradycardia. Emergent imaging identified an intracerebral hemorrhage requiring immediate surgical decompression. The patient was transferred by helicopter to tertiary care. Upon arrival, the patient was taken directly to the operating room and underwent a decompressive craniotomy. This article reviews the considerations for transporting pregnant patients with intracerebral hemorrhage.


Subject(s)
Air Ambulances , Cerebral Hemorrhage/complications , Encephalocele/complications , Pregnancy Complications/therapy , Adult , Cerebral Hemorrhage/surgery , Cerebral Hemorrhage/therapy , Decompressive Craniectomy , Emergency Service, Hospital , Encephalocele/surgery , Encephalocele/therapy , Female , Headache/complications , Headache/etiology , Humans , Pregnancy , Pregnancy Complications/surgery , Pregnancy Trimester, Second , Tertiary Care Centers
2.
Prehosp Emerg Care ; 14(4): 433-8, 2010.
Article in English | MEDLINE | ID: mdl-20608878

ABSTRACT

INTRODUCTION: Firefighters who become lost, disoriented, or trapped in a burning building may die after running out of air in their self-contained breathing apparatus (SCBA). An emergency escape device has been developed that attaches to the firefighter's mask in place of the SCBA regulator. The device filters out particulate matter and a number of hazardous components of smoke (but does not provide oxygen), providing additional time to escape after the firefighter runs out of SCBA air. OBJECTIVE: To field-test the device under realistic fire conditions to 1) ascertain whether it provides adequate protection from carbon monoxide (CO) and 2) examine firefighters' impressions of the device and its use. METHODS: A wood-frame house was fitted with atmospheric monitors, and levels of CO, oxygen, and hydrogen cyanide were continuously recorded. After informed consent was obtained, firefighters wearing the escape device instead of their usual SCBA regulators entered the burning structure and spent 10 minutes breathing through the device. A breath CO analyzer was used to estimate (±3 ppm) each subject's carboxyhemoglobin level immediately upon exiting the building, vital signs and pulse oximetry were assessed, and each firefighter was asked for general impressions of the device. RESULTS: Thirteen subjects were enrolled (all male, mean age 42.5 years, mean weight 94 kg). The mean peak CO level at the floor in the rooms where the subjects were located was 546 ppm, and ceiling CO measurements ranged from 679 ppm to the meters' maximum of 1,000 ppm, indicating substantial CO exposure. The firefighters' mean carboxyhemoglobin level was 1.15% (range 0.8%-2.1%) immediately after exit. All pulse oximetry readings were 95% or greater. No subject reported problems or concerns regarding the device, no symptoms suggestive of smoke inhalation or toxicity were reported, and all subjects expressed interest in carrying the device while on duty.
CONCLUSION: The emergency escape device provides excellent protection from CO in realistic fire scenarios with substantial exposure to toxic gases, and the firefighters studied had a positive impression of the device and its use.


Subject(s)
Filtration/instrumentation , Fires , Occupational Exposure , Respiratory Protective Devices , Adult , Equipment Design , Humans , Male , Middle Aged , Prospective Studies , Safety Management
3.
Prehosp Emerg Care ; 13(4): 536-40, 2009.
Article in English | MEDLINE | ID: mdl-19731169

ABSTRACT

INTRODUCTION: No existing mass casualty triage system has been scientifically scrutinized or validated. A recent work group sponsored by the Centers for Disease Control and Prevention, using a combination of expert opinion and the extremely limited research data available, created the SALT (sort-assess-lifesaving interventions-treat/transport) triage system to serve as a national model. An airport crash drill was used to pilot test the SALT system. OBJECTIVE: To assess the accuracy and speed with which trained paramedics can triage victims using this new system. METHODS: Investigators created 50 patient scenarios with a wide range of injuries and severities, and two additional uninjured victims were added at the time of the drill. Students wearing moulage and coached on how to portray their injuries served as "victims." Assuming proper application of the SALT system, the patient scenarios were designed such that 16 patients would be triaged as T1/red/immediate, 12 as T2/yellow/delayed, 14 as T3/green/minimal, and 10 as T4/black/dead. Paramedics were trained to proficiency in the SALT system one week prior to the drill using a 90-minute didactic/practical session, and were given "flash cards" showing the triage algorithm to be used if needed during the drill. Observers blinded to the study purpose timed and recorded the triage process for each patient during the drill. Simple descriptive statistics were used to analyze the data. RESULTS: The two paramedics assigned to the role of triage officers applied the SALT algorithm correctly to 41 of the 52 patients (78.8% accuracy). Seven patients intended to be T2 were triaged as T1, and two patients intended to be T3 were triaged as T2, for an overtriage rate of 13.5%. Two patients intended to be T2 were triaged as T3, for an undertriage rate of 3.8%. Triage times were recorded by the observers for 42 of the 52 patients, with a mean of 15 seconds per patient (range 5-57 seconds). 
CONCLUSIONS: The SALT mass casualty triage system can be applied quickly in the field and appears to be safe, as measured by a low undertriage rate. There was, however, significant overtriage. Further refinement is needed, and effect on patient outcomes needs to be evaluated.
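The accuracy and undertriage figures follow directly from the drill counts; a minimal arithmetic check (a sketch, assuming the rates are expressed as percentages of all 52 patients — the function and variable names are illustrative, not from the paper):

```python
# Reproduce the reported SALT drill metrics from the raw counts.
def rate(count: int, total: int) -> float:
    """Percentage of `total`, rounded to one decimal place."""
    return round(100.0 * count / total, 1)

TOTAL = 52          # 50 scripted scenarios + 2 uninjured victims
CORRECT = 41        # patients triaged into the intended SALT category
UNDERTRIAGED = 2    # intended-T2 patients triaged as T3

accuracy_pct = rate(CORRECT, TOTAL)          # 78.8, as reported
undertriage_pct = rate(UNDERTRIAGED, TOTAL)  # 3.8, as reported
```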


Subject(s)
Mass Casualty Incidents , Triage/organization & administration , Disaster Planning , Efficiency, Organizational , Emergency Medical Services , Emergency Medical Technicians , Humans , Pilot Projects , Task Performance and Analysis
4.
Prehosp Emerg Care ; 12(4): 479-85, 2008.
Article in English | MEDLINE | ID: mdl-18924012

ABSTRACT

OBJECTIVES: Emergency medical dispatch (EMD) protocols are intended to match response resources with patient needs. In a small city that previously sent a first-responder basic life support (BLS) engine company lights-and-siren response to every emergency medical services (EMS) call, regardless of nature or severity, an EMD system was implemented in order to reduce the number of such responses. The study objectives were to determine the effects of the EMD system on first-responder call volume and to assess the safety of the system. METHODS: This was a prospective, before-after trial. Using computer-assisted dispatch (CAD) records, all EMS calls in the 120 days before implementation of the EMD protocol and the 120 days after implementation were identified (excluding a one-month wash-in period). In the "after" phase, patient care reports of a random sample of cases in which an ambulance was dispatched with no first responders were manually reviewed to assess whether there might have been any benefit to first-responder dispatch. Given the lack of accepted clinical criteria for need for first responders, the investigators' clinical judgment was used. Paired t-tests were used to compare groups. RESULTS: There were 9,820 EMS calls in the "before" phase, with 8,278 first-responder engine runs (84.3%), and 9,943 EMS calls in the "after" phase, with 3,804 first-responder engine runs (39.1%). The first-responder companies were dispatched to a median of 5.65 runs/day (range 1.1-12.7) in the "before" phase, and 3.17 runs/day (range 0.6-5.0) in the "after" phase (p = 0.0008 by paired t-test). Review of 1,816 "after" phase ambulance-only patient care reports (PCRs) found ten (0.55%) in which first-responder dispatch might have been beneficial, but review of EMS and emergency department (ED) records found no adverse outcomes in these ten patients. CONCLUSIONS: This study suggests that a formal EMD system can reduce first-responder call volume by roughly one-half.
The system appears to be safe for patients, with an undertriage rate of about one-half of one percent.
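The headline figures can be verified from the reported counts; a quick sketch (variable names are illustrative, not from the paper):

```python
# First-responder workload reduction and undertriage rate from the reported counts.
before_runs = 8278          # engine runs, 120-day "before" phase
after_runs = 3804           # engine runs, 120-day "after" phase
reviewed_pcrs = 1816        # ambulance-only PCRs reviewed in the "after" phase
possibly_beneficial = 10    # cases where first-responder dispatch might have helped

reduction_pct = round(100 * (before_runs - after_runs) / before_runs, 1)  # 54.0
undertriage_pct = round(100 * possibly_beneficial / reviewed_pcrs, 2)     # 0.55
```

The 54.0% drop in engine runs is the "roughly one-half" of the conclusion, and 0.55% is the "about one-half of one percent" undertriage rate.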


Subject(s)
Efficiency, Organizational , Emergency Medical Service Communication Systems/organization & administration , Emergency Medical Technicians/statistics & numerical data , Emergency Service, Hospital/statistics & numerical data , Safety Management/methods , Prospective Studies , Triage
5.
Prehosp Emerg Care ; 12(2): 236-40, 2008.
Article in English | MEDLINE | ID: mdl-18379923

ABSTRACT

INTRODUCTION: Existing mass casualty triage systems do not consider the possibility of chemical, biological, or radiologic/nuclear (CBRN) contamination of the injured patients. A system that can triage injured patients who are or may be contaminated by CBRN material, developed through expert opinion, was pilot-tested at an airport disaster drill. The study objective was to determine the system's speed and accuracy. METHODS: For a drill involving a plane crash with release of organophosphate material from the cargo hold, 56 patient scenarios were generated, with some involving signs and symptoms of organophosphate toxicity in addition to physical trauma. Prior to the drill, the investigators examined each scenario to determine the "correct" triage categorization, assuming proper application of the proposed system, and trained the paramedics who were expected to serve as triage officers at the drill. During the drill, the medics used the CBRN triage system to triage the 56 patients, with two observers timing and recording the events of the triage process. The IRB deemed the study exempt from full review. RESULTS: The two triage officers applied the CBRN system correctly to 49 of the 56 patients (87.5% accuracy). One patient intended to be T2 (yellow) was triaged as T1 (red), for an over-triage rate of 1.8%. Five patients intended to be T1 were triaged as T2, and one patient intended to be T2 was triaged as T3 (green), for an under-triage rate of 10.7%. All six under-triage cases were due to failure to recognize or account for signs of organophosphate toxidrome in applying the triage system. For the 27 patients for whom times were recorded, triage was accomplished in a mean of 19 seconds (range 4-37, median 17). CONCLUSIONS: The chemical algorithm of the proposed CBRN-capable mass casualty triage system can be applied rapidly by trained paramedics, but a significant under-triage rate (10.7%) was seen in this pilot test. 
Further refinement and testing are needed, and effect on outcome must be studied.
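The over- and under-triage rates are simple fractions of the 56 scripted patients; a minimal check (names are illustrative, not from the paper):

```python
# CBRN drill rates as fractions of the 56 scripted patients.
TOTAL = 56
accuracy_pct = round(100 * 49 / TOTAL, 1)      # 87.5, as reported
overtriage_pct = round(100 * 1 / TOTAL, 1)     # 1.8, as reported
undertriage_pct = round(100 * 6 / TOTAL, 1)    # 10.7, as reported
```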


Subject(s)
Hazardous Substances/isolation & purification , Mass Casualty Incidents , Triage/standards , Weapons of Mass Destruction , Humans , Organophosphates/isolation & purification , Pilot Projects , Triage/organization & administration
6.
Prehosp Emerg Care ; 11(1): 14-8, 2007.
Article in English | MEDLINE | ID: mdl-17169870

ABSTRACT

OBJECTIVE: Emergency medical dispatch (EMD) protocols should match response resources with patient needs. We tested a protocol sending only a commercial ambulance, without fire department first responders (FR), to all non-cardiac-arrest EMS calls at a physician-staffed HMO facility. Study objectives were to determine how often FR provided patient care at such facilities and whether EMD implementation could conserve FR resources without compromising patient care. METHODS: All EMS dispatches to this facility in the 4 months before implementation of the EMD protocol and 4 months after implementation were identified through dispatch records, and all FR and ambulance patient care reports were reviewed. In the "after" phase, all cases needing ALS transport were reviewed to examine whether there would have been benefit to FR dispatch. RESULTS: Of 242 dispatches in the "before" phase, BLS FR responded to 156 (64%), and ALS FR to 117 (48%). BLS FR provided patient care in 2 cases, and ALS FR in 17. Of 227 dispatches in the "after" phase, BLS FR responded to 10 (4%), and ALS FR to 10 (4%); all but one were protocol violations. BLS FR provided care in one case, and ALS FR in three. Review of the 93 "after" cases requiring ALS transport found none where FR presence would have been beneficial. CONCLUSIONS: First responders rarely provided patient care when responding to EMS calls at a physician-staffed medical facility. Implementation of an EMD protocol can safely reduce the number of FR responses to unscheduled ambulance calls at such a facility.


Subject(s)
Emergency Medical Services , Emergency Medical Technicians/statistics & numerical data , Health Facilities , Professional Role , Advanced Cardiac Life Support , Connecticut , Humans , Retrospective Studies
7.
Prehosp Emerg Care ; 10(2): 272-5, 2006.
Article in English | MEDLINE | ID: mdl-16531388

ABSTRACT

OBJECTIVE: To determine if a single mailing from the local volunteer fire department can increase the number of homes with proper, visible address numbering. Proper numbering is essential in rapidly locating a house during an emergency response. METHODS: The study was conducted at a suburban/rural fire department providing EMS and fire suppression services to a 22-square-mile area with residential mailboxes located at the street. During a hazard identification pre-plan tour, each house was examined and assigned a classification: (A) No number visible on the house or mailbox (improper); (B) Number on only one side of the mailbox (improper); (C) Number on both sides or the end of the mailbox, or visible on the house (proper). The homeowners of all residences with improper numbering (A or B) were sent a one-page letter, discussing the need for proper numbering. The tour was repeated six weeks later to determine whether deficiencies had been corrected. A 25% improvement was prospectively set as the target. RESULTS: During the pre-plan tour, 73 houses were classified as type A, 454 as type B, and 1706 as type C. At the re-visit, 135 (26%) of the type A and B homes had been properly numbered. Correction of deficiencies was better at type A homes (37, or 51%) than at type B homes (98, or 22%) (p < 0.001 by chi-square). CONCLUSION: For houses with improper numbering, a single mailing from the fire department can be effective in encouraging residents to post proper numbers.
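The reported significance can be checked with a Pearson chi-square on the 2×2 correction table; a stdlib-only sketch (the abstract does not state which chi-square variant was used, so the uncorrected statistic here is an assumption):

```python
# 2x2 table: rows = home type (A, B); columns = (corrected, not corrected).
#   Type A: 37 corrected, 36 not (of 73)
#   Type B: 98 corrected, 356 not (of 454)

def chi_square_2x2(a: int, b: int, c: int, d: int) -> float:
    """Uncorrected Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

stat = chi_square_2x2(37, 36, 98, 356)
# The df = 1 critical value at p = 0.001 is 10.828; the statistic here is
# well above it, consistent with the reported p < 0.001.
```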


Subject(s)
Correspondence as Topic , Efficiency, Organizational , Emergency Medical Services , Housing , Reminder Systems , Connecticut , Information Systems
8.
Prehosp Disaster Med ; 20(5): 282-9, 2005.
Article in English | MEDLINE | ID: mdl-16295164

ABSTRACT

The end of the Cold War vastly altered the worldwide political landscape. With the loss of a main competitor, the United States (US) military has had to adapt its strategic, operational, and tactical doctrines to an ever-increasing variety of non-traditional missions, including humanitarian operations. Complex emergencies (CEs) are defined in this paper from a political and military perspective, various factors that contribute to their development are described, and issues resulting from the employment of US military forces are discussed. A model was developed to illustrate the course of a humanitarian emergency and the potential impact of a military response. The US interventions in Haiti, Northern Iraq, Kosovo, Somalia, Bosnia, and Rwanda serve as examples. A CE develops when there is civil conflict, loss of national governmental authority, a mass population movement, and massive economic failure, each leading to a general decline in food security. The military can alleviate a CE in four ways: (1) provide security for relief efforts; (2) enforce negotiated settlements; (3) provide security for non-combatants; and/or (4) employ logistical capabilities. The model incorporates Norton and Miskel's taxonomy of identifying failing states and helps illustrate the factors that lead to a CE. The model can be used to determine if and when military intervention will have the greatest impact. The model demonstrates that early military intervention and mission assignment within the core competencies of the forces can reverse the course of a CE. Further study will be needed to verify the model.


Subject(s)
Disaster Planning/organization & administration , Emergencies , Military Science/methods , Models, Organizational , Altruism , Humans , Relief Work/organization & administration , Rescue Work/organization & administration , United States , Warfare
10.
Prehosp Emerg Care ; 9(1): 8-13, 2005.
Article in English | MEDLINE | ID: mdl-16036821

ABSTRACT

OBJECTIVES: Carboxyhemoglobin (COHb) levels can be estimated by chemical analysis of exhaled alveolar breath. Such noninvasive measurement could be used on the fireground to screen both firefighters (FFs) and victims. The purpose of this study was to assess the feasibility of using a hand-held carbon monoxide (CO) monitoring device to screen for CO toxicity in FFs under field conditions. METHODS: Informed consent was obtained from all participants. Using a hand-held breath CO detection device, COHb readings were collected at baseline, and then as FFs exited burning buildings after performing interior fire attack and overhaul with self-contained breathing apparatus (SCBA) during live-fire training. Ambient CO levels were occasionally measured in interior areas where the FFs were working to assess the degree of CO exposure. RESULTS: Baseline COHb readings of 64 FFs ranged from 0% to 3% (mean 1%, median 1%). One hundred eighty-four COHb readings were collected during training exercises. The mean and median COHb levels were 1%. The maximum value in a FF wearing SCBA was 3%; values of 14%, 5%, and 4% were measured in instructors who were not properly wearing SCBA. Ambient CO readings during fire attack ranged from 75 to 1,290 ppm, and ambient CO readings during overhaul ranged from 0 to 130 ppm. When the device was used for interior CO monitoring, washout time limited its utility for COHb monitoring in FFs. CONCLUSIONS: A hand-held CO monitoring device adapted for estimation of COHb levels by exhaled breath analysis can feasibly be deployed on the fireground to assess CO exposure in FFs.


Subject(s)
Air Pollutants, Occupational/analysis , Carbon Monoxide Poisoning/diagnosis , Carboxyhemoglobin/analysis , Fires , Adult , Breath Tests , Carbon Monoxide/analysis , Carbon Monoxide Poisoning/etiology , Chi-Square Distribution , Equipment Design , Equipment Safety , Female , Humans , Male , Middle Aged , Probability , Sampling Studies , Sensitivity and Specificity
11.
Neurobiol Learn Mem ; 81(3): 185-99, 2004 May.
Article in English | MEDLINE | ID: mdl-15082020

ABSTRACT

Glucocorticoid receptor activation within the basolateral amygdala (BLA) during fear conditioning may mediate enhancement in rats chronically exposed to stress levels of corticosterone. Male Sprague-Dawley rats received corticosterone (400 µg/ml) in their drinking water (days 1-21), a manipulation that was previously shown to cause hippocampal CA3 dendritic retraction. Subsequently, rats were adapted to the fear conditioning chamber (day 22), then trained (day 23), and tested for conditioned fear to context and tone (day 25). Training consisted of two tone (20 s) and footshock (500 ms, 0.25 mA) pairings. In Experiment 1, muscimol (4.4 nmol/0.5 µl/side), a GABAergic agonist, was microinfused to temporarily inactivate the BLA during training. Rats given chronic corticosterone showed enhanced freezing to context, but not tone, compared to vehicle-supplemented rats. Moreover, BLA inactivation impaired contextual and tone conditioning, regardless of corticosterone treatment. In Experiment 2, RU486 (0, 0.3, and 3.0 ng/0.2 µl/side) was infused on the training day to antagonize glucocorticoid receptors in the BLA. Corticosterone treatment enhanced fear conditioning to context and tone when analyzed together, but not separately. Moreover, RU486 (3.0 ng/side) selectively exacerbated freezing to context in chronic corticosterone-exposed rats only, but failed to alter tone conditioning. Serum corticosterone levels were negatively correlated with contextual, not tone, conditioning. Altogether, these findings suggest that chronic corticosterone influences fear conditioning differently than chronic stress as shown previously, that chronic exposure to corticosteroids alters BLA functioning in a non-linear fashion, and that contextual conditioning is influenced more than tone conditioning by chronic corticosterone and BLA glucocorticoid receptor stimulation.


Subject(s)
Amygdala/physiology , Conditioning, Classical/physiology , Corticosterone/physiology , Fear/physiology , Hippocampus/physiology , Receptors, Glucocorticoid/antagonists & inhibitors , Receptors, Steroid/antagonists & inhibitors , Amygdala/drug effects , Animals , Conditioning, Classical/drug effects , Corticosterone/administration & dosage , Dendrites/drug effects , Dendrites/pathology , Dose-Response Relationship, Drug , Drug Administration Schedule , Fear/drug effects , GABA Agonists/pharmacology , Hippocampus/cytology , Hippocampus/drug effects , Hormone Antagonists/administration & dosage , Male , Mifepristone/administration & dosage , Muscimol/pharmacology , Rats , Rats, Sprague-Dawley
13.
Anal Chem ; 74(13): 3160-7, 2002 Jul 01.
Article in English | MEDLINE | ID: mdl-12141678

ABSTRACT

Mitoxantrone is an anticancer agent whose concentration in blood must be known during therapy. Current methods of analysis are cumbersome, requiring a pretreatment stage. A method based on surface-enhanced resonance Raman scattering (SERRS) has been developed using a flow cell and silver colloid as the SERRS substrate. It is simple, sensitive, fast, and reliable. Both blood plasma and serum can be analyzed directly, but fresh serum is preferred here due to reduced fluorescence in the clinical samples available. Fluorescence is reduced further by the dilution of the serum in the flow cell and by quenching of surface-adsorbed material by the silver. The effectiveness of the latter process is dependent on the contact time between the serum and the silver. The linear range encompasses the range of concentrations detected previously in patient samples using HPLC methods. In a comparative study of a series of samples taken from a patient at different times, there is good agreement between the results obtained by HPLC and SERRS, with no significant difference between them at the 95% confidence level. The limit of detection in serum using the final optimized procedure for SERRS was 4.0 × 10⁻¹¹ M (0.02 ng/mL) mitoxantrone. The ease with which the SERRS analysis can be carried out makes it the preferred choice of technique for mitoxantrone analysis.
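The quoted mass detection limit is just the molar limit scaled by molar mass; a quick conversion sketch (the ≈444.5 g/mol figure assumes mitoxantrone free base, C22H28N4O6, and is not taken from the paper):

```python
# Convert the reported molar limit of detection to a mass concentration.
molar_lod = 4.0e-11      # mol/L, reported LOD
molar_mass = 444.5       # g/mol, assumed (mitoxantrone free base)

g_per_litre = molar_lod * molar_mass   # grams of mitoxantrone per litre of serum
ng_per_ml = g_per_litre * 1e6          # 1 g/L = 1e6 ng/mL
# Rounded to two decimals this gives 0.02, matching the reported 0.02 ng/mL.
```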


Subject(s)
Antineoplastic Agents/analysis , Mitoxantrone/analysis , Antineoplastic Agents/blood , Calibration , Chromatography, High Pressure Liquid , Humans , Indicators and Reagents , Mitoxantrone/blood , Surface Plasmon Resonance