Results 1 - 20 of 22
1.
Ann Emerg Med ; 68(4): 509-10, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27666350
2.
J Inherit Metab Dis ; 38(2): 305-14, 2015 Mar.
Article in English | MEDLINE | ID: mdl-24715333

ABSTRACT

BACKGROUND: Enzyme-replacement therapy (ERT) in Pompe disease--an inherited metabolic disorder caused by acid α-glucosidase deficiency and characterized in infants by generalized muscle weakness and cardiomyopathy--can be complicated by immune responses. Infants who do not produce any endogenous acid α-glucosidase, so-called CRIM-negative patients, reportedly develop a strong immune response. We report the clinical outcome of our Dutch infants in relation to their CRIM status and immune response. METHODS: Eleven patients were genotyped and their CRIM status was determined. Antibody formation and clinical outcome were assessed for a minimum of 4 years. RESULTS: ERT was commenced between 0.1 and 8.3 months of age, and patients were treated for 0.3 to 13.7 years. All patients developed antibodies. Those with a high antibody titer (above 1:31,250) had a poor response. The antibody titers varied substantially between patients and did not strictly correlate with the patients' CRIM status. Patients who started ERT beyond 2 months of age tended to develop higher titers than those who started earlier. All three CRIM-negative patients in our study succumbed by the age of 4 years, seemingly irrespective of the magnitude of their antibody titers. CONCLUSION: Antibody formation is a common response to ERT in classic infantile Pompe disease and counteracts the effect of treatment. The counteracting effect seems to be determined by the antibody:enzyme molecular stoichiometry. The immune response may be minimized by an early start of ERT and by immune modulation, as proposed by colleagues. CRIM-negative status itself seems to be associated with poor outcome.


Subject(s)
Antibodies/blood , Enzyme Replacement Therapy , Glycogen Storage Disease Type II/drug therapy , alpha-Glucosidases/therapeutic use , Age Factors , Biomarkers/blood , Cells, Cultured , Child, Preschool , Disease Progression , Female , Genetic Predisposition to Disease , Glycogen Storage Disease Type II/diagnosis , Glycogen Storage Disease Type II/enzymology , Glycogen Storage Disease Type II/immunology , Glycogen Storage Disease Type II/mortality , Humans , Infant , Infant, Newborn , Male , Mutation , Netherlands , Phenotype , Recombinant Proteins/immunology , Recombinant Proteins/therapeutic use , Risk Factors , Time Factors , Transfection , Treatment Outcome , alpha-Glucosidases/deficiency , alpha-Glucosidases/genetics , alpha-Glucosidases/immunology
3.
Am J Cardiol ; 113(10): 1599-605, 2014 May 15.
Article in English | MEDLINE | ID: mdl-24792735

ABSTRACT

The Immediate Myocardial Metabolic Enhancement During Initial Assessment and Treatment in Emergency care Trial of very early intravenous glucose-insulin-potassium (GIK) for acute coronary syndromes (ACS) in out-of-hospital emergency medical service (EMS) settings showed 80% reduction in infarct size at 30 days, suggesting potential longer-term benefits. Here we report 1-year outcomes. Prespecified 1-year end points of this randomized, placebo-controlled, double-blind, effectiveness trial included all-cause mortality and composites including cardiac arrest, mortality, or hospitalization for heart failure (HF). Of 871 participants randomized to GIK versus placebo, death occurred within 1 year in 11.6% versus 13.5%, respectively (unadjusted hazard ratio [HR] 0.83, 95% confidence interval [CI] 0.57 to 1.23, p = 0.36). The composite of cardiac arrest or 1-year mortality was 12.8% versus 17.0% (HR 0.71, 95% CI 0.50 to 1.02, p = 0.06). The composite of hospitalization for HF or mortality within 1 year was 17.2% versus 17.2% (HR 0.98, 95% CI 0.70 to 1.37, p = 0.92). The composite of mortality, cardiac arrest, or HF hospitalization within 1 year was 18.1% versus 20.4% (HR 0.85, 95% CI 0.62 to 1.16, p = 0.30). In patients presenting with suspected ST elevation myocardial infarction, HRs for 1-year mortality and the 3 composites were, respectively, 0.65 (95% CI 0.33 to 1.27, p = 0.21), 0.52 (95% CI 0.30 to 0.92, p = 0.03), 0.63 (95% CI 0.35 to 1.16, p = 0.14), and 0.51 (95% CI 0.30 to 0.87, p = 0.01). In patients with suspected acute coronary syndromes, serious end points generally were lower with GIK than placebo, but the differences were not statistically significant. However, in those with ST elevation myocardial infarction, the composites of cardiac arrest or 1-year mortality, and of cardiac arrest, mortality, or HF hospitalization within 1 year, were significantly reduced.
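The 1-year analyses above are built on composite end points (e.g., cardiac arrest or death within 1 year) compared between arms with unadjusted hazard ratios. As an illustration only, the sketch below constructs such a composite from simulated follow-up data and fits an unadjusted Cox model with the lifelines package; the event rates, variable names, and arm assignments are hypothetical and do not reproduce the trial's data.

```python
# Illustrative only: composite end point + unadjusted Cox model on simulated follow-up data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 871                                   # number of randomized participants in the abstract
gik = rng.integers(0, 2, n)               # 1 = GIK, 0 = placebo (simulated assignment)
death_time = rng.exponential(8.0, n)      # years to death (simulated)
arrest_time = rng.exponential(10.0, n)    # years to cardiac arrest (simulated)

# Composite end point: first occurrence of cardiac arrest or death, censored at 1 year.
composite = np.minimum(death_time, arrest_time)
event = (composite <= 1.0).astype(int)
time = np.minimum(composite, 1.0)

df = pd.DataFrame({"time": time, "event": event, "gik": gik})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # the 'gik' row gives the unadjusted hazard ratio exp(coef) with its 95% CI
```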


Subject(s)
Acute Coronary Syndrome/drug therapy , After-Hours Care/methods , Outpatients , Acute Coronary Syndrome/diagnosis , Acute Coronary Syndrome/mortality , Adult , Cardioplegic Solutions , Cause of Death/trends , Double-Blind Method , Electrocardiography , Female , Follow-Up Studies , Glucose/administration & dosage , Heart Arrest/mortality , Heart Arrest/prevention & control , Humans , Infusions, Intravenous , Insulin/administration & dosage , Male , Middle Aged , Myocardium/metabolism , Potassium/administration & dosage , Retrospective Studies , Survival Rate/trends , Time Factors , Treatment Outcome , United States/epidemiology
4.
Orphanet J Rare Dis ; 8: 182, 2013 Nov 19.
Article in English | MEDLINE | ID: mdl-24245577

ABSTRACT

BACKGROUND: Pompe disease has a broad clinical spectrum, in which the phenotype is partially explained by the genotype. The aim of this study was to describe phenotypical variation among siblings with non-classic Pompe disease. We hypothesized that siblings and families with the same genotype share more similar phenotypes than the total population of non-classic Pompe patients, and that this might reveal genotype-phenotype correlations. METHODS: We identified all Dutch families in which two or three siblings were diagnosed with Pompe disease and described genotype, acid α-glucosidase activity, age at symptom onset, presenting symptoms, specific clinical features, mobility and ventilator dependency. RESULTS: We identified 22 families comprising two or three siblings. All carried the most common mutation c.-32-13T>G in combination with another pathogenic mutation. The median age at symptom onset was 33 years (range 1-62 years). Within sibships, symptom onset was either in childhood or in adulthood. The median variation in symptom onset between siblings was nine years (range 0-31 years). Presenting symptoms were similar across siblings in 14 out of 22 families. Limb girdle weakness was most frequently reported. In some families, ptosis or bulbar weakness was present in all siblings. A large variation in disease severity (based on wheelchair/ventilator dependency) was observed in 11 families. This variation did not always result from a difference in duration of the disease, since a third of the less affected siblings had a longer course of the disease. Enzyme activity could not explain this variation either. In most families, male patients were more severely affected. Finally, symptom onset varied substantially in 12 families despite the same GAA genotype. CONCLUSION: In most families with non-classic Pompe disease, siblings share a similar phenotype regarding symptom onset, presenting symptoms and specific clinical features. However, in some families the course and severity of disease varied substantially. This phenotypical variation was also observed in families with identical GAA genotypes. The commonalities and differences indicate that, besides genotype, other factors such as epigenetic and environmental effects influence the clinical presentation and disease course.


Subject(s)
Glycogen Storage Disease Type II/genetics , Adolescent , Adult , Child , Child, Preschool , Female , Genotype , Glycogen Storage Disease Type II/pathology , Glycogen Storage Disease Type II/physiopathology , Humans , Infant , Male , Middle Aged , Phenotype , Siblings , Young Adult , alpha-Glucosidases/genetics
5.
Prehosp Emerg Care ; 17(3): 293-8, 2013.
Article in English | MEDLINE | ID: mdl-23510381

ABSTRACT

INTRODUCTION: Prior data from our institution suggested that our paramedics can accurately interpret ST-segment elevation myocardial infarction (STEMI) on prehospital 12-lead electrocardiograms (ECGs), and that activation of the cardiac catheterization laboratory by paramedics immediately upon diagnosing STEMI at the scene could potentially decrease door-to-balloon (D2B) times. A "field activation" protocol was thus initiated in May 2010. This study examined D2B times and compliance with the national 90-minute D2B performance benchmark in the first 14 months. HYPOTHESIS: We hypothesized that D2B times would be shorter, and 90-minute compliance better, when the catheterization laboratory was activated by emergency medical services (EMS), compared with when either EMS failed to activate the catheterization laboratory or when the STEMI patient arrived by means other than EMS. METHODS: For this prospective, observational study, EMS and hospital data were reviewed for consecutive STEMI patients at a single hospital between May 2010 and July 2011. Patients were categorized as: 1) EMS field activations, 2) patients transported by EMS without EMS catheterization laboratory activation (e.g., ambulance from outside our area, paramedic missed STEMI/protocol violation), or 3) walk-in STEMI patients. Data were manipulated in Excel; means with standard deviations (SDs) and 95% confidence intervals (95% CIs) were determined; and analysis of variance (ANOVA) with Dunnett's correction was used to compare groups. RESULTS: There were 38 EMS field activations, 47 nonactivation EMS STEMI arrivals, and 28 walk-in STEMI patients. The mean (±SD) D2B times were 37 (±17), 87 (±40), and 80 (±23) minutes, respectively. D2B time was better for the EMS field activations than for either nonactivation EMS transports (difference of means 35.3 min, 95% CI 22.3-48.3 min, p < 0.001) or walk-in patients (difference of means 37.0 min, 95% CI 21.8-52.2 min, p < 0.001). Compliance with the 90-minute D2B benchmark was 100%, 72%, and 68%, respectively, and was better for the EMS field activations than for either of the other groups (p < 0.001). CONCLUSIONS: In the system studied, EMS field activation of the catheterization laboratory for patients with STEMI is associated with shorter D2B times and better compliance with 90-minute benchmarks than ED activation for either walk-in STEMI patients or STEMI patients arriving by EMS without field activation. Improvements are needed in compliance with the field activation protocol to maximize these benefits. Key words: emergency medical services; emergency medical technicians; electrocardiography; myocardial infarction; heart catheterization.
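The group comparison described here is a one-way ANOVA followed by Dunnett's test, with the EMS field-activation group as the reference. A minimal sketch of that analysis on simulated door-to-balloon times is shown below (it assumes SciPy ≥ 1.11, which provides scipy.stats.dunnett); the simulated values only mimic the reported group sizes, means, and SDs and are not the study data.

```python
# Illustrative only: one-way ANOVA plus Dunnett's test on simulated D2B times.
import numpy as np
from scipy.stats import f_oneway, dunnett

rng = np.random.default_rng(1)
field  = rng.normal(37, 17, 38)   # EMS field activations (reference group)
no_act = rng.normal(87, 40, 47)   # EMS transports without field activation
walkin = rng.normal(80, 23, 28)   # walk-in STEMI patients

print(f_oneway(field, no_act, walkin))        # overall ANOVA F-test

# Dunnett's test: each comparison group versus the field-activation reference
res = dunnett(no_act, walkin, control=field)
print(res.pvalue)                             # adjusted p-values
print(res.confidence_interval())              # simultaneous 95% CIs for the mean differences
```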


Subject(s)
Cardiac Catheterization , Electrocardiography , Emergency Medical Services/standards , Emergency Service, Hospital/standards , Myocardial Infarction/diagnosis , Aged , Angioplasty, Balloon, Coronary , Benchmarking , Efficiency, Organizational , Female , Humans , Laboratories, Hospital , Male , Middle Aged , Prospective Studies , Time Factors , Treatment Outcome , Triage
6.
Mol Genet Metab ; 107(3): 448-55, 2012 Nov.
Article in English | MEDLINE | ID: mdl-23040796

ABSTRACT

Since the introduction of enzyme replacement therapy for Pompe disease, awareness and early diagnosis have gained importance. Because the therapy is most effective when started early and methods for dried bloodspot screening for Pompe disease are currently being explored, neonatal screening is receiving increasing attention. The objective of this study was to investigate the gains that might be achieved with earlier diagnosis by neonatal screening. For this purpose, we analyzed the health and functional status of non-screened patients with Pompe disease at the time of diagnosis. Previously collected clinical data and results of an international patient-reported questionnaire were used. Cross-sectional data of 53 patients with Pompe disease diagnosed between 1999 and 2009 (aged 0-64 years) were analyzed. According to the World Health Organization's International Classification of Functioning, Disability and Health, the following domains are described: body function, activity, participation and contextual factors. In all patients with classic infantile Pompe disease, cardiac function, hearing, muscle strength and motor development were considerably impaired at the time of clinical diagnosis. The use of oxygen and/or nasogastric tube-feeding was reported in more than 70% of these cases. Most children, adolescents and adults had advanced muscle weakness and impaired respiratory function at the time of their diagnosis, causing varying degrees of handicap. About 12% of them used a walking device and/or respiratory support at the time of diagnosis. The severely impaired health status reported here provides a strong argument for earlier diagnosis and for further exploring the potential of neonatal screening for Pompe disease.


Subject(s)
Glycogen Storage Disease Type II/diagnosis , Muscle Weakness/diagnosis , Neonatal Screening/statistics & numerical data , Adolescent , Adult , Child , Child, Preschool , Cross-Sectional Studies , Early Diagnosis , Female , Glycogen Storage Disease Type II/pathology , Health Status , Humans , Infant , Infant, Newborn , Male , Middle Aged , Motor Activity , Muscle Weakness/pathology , Surveys and Questionnaires , Time Factors
7.
Expert Opin Pharmacother ; 13(16): 2281-99, 2012 Nov.
Article in English | MEDLINE | ID: mdl-23009070

ABSTRACT

INTRODUCTION: Lysosomal storage disorders (LSDs) are clinically heterogeneous disorders that result primarily from lysosomal accumulation of macromolecules in various tissues. LSDs are always progressive and often lead to severe symptoms and premature death. The identification of the underlying genetic and enzymatic defects has prompted the development of various treatment options. AREAS COVERED: To describe the current treatment options for LSDs, the authors provide a focused overview of their pathophysiology. They discuss the current applications and challenges of enzyme-replacement therapy, stem-cell therapy, gene therapy, chaperone therapy and substrate-reduction therapy, as well as future therapeutic prospects. EXPERT OPINION: Over recent decades, considerable progress has been made in the treatment of LSDs and in the outcome of patients. None of the current options are completely curative yet. They are complicated by the difficulty of efficiently targeting all affected tissues (particularly the central nervous system), by the difficulty of reaching sufficiently high enzyme levels in the target tissues, and by their high costs. The pathways leading from the genetic mutation to the clinical symptoms should be further elucidated, as they might prompt the development of new and ultimately curative therapies.


Subject(s)
Lysosomal Storage Diseases/therapy , Animals , Enzyme Replacement Therapy , Genetic Therapy , Hematopoietic Stem Cell Transplantation , Humans , Lysosomal Storage Diseases/physiopathology , Molecular Chaperones/therapeutic use
8.
Clin Chem ; 58(7): 1139-47, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22623745

ABSTRACT

BACKGROUND: Urinary excretion of the tetrasaccharide 6-α-D-glucopyranosyl-maltotriose (Glc4) is increased in various clinical conditions associated with increased turnover or storage of glycogen, making Glc4 a potential biomarker for glycogen storage diseases (GSD). We developed an ultraperformance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) assay to detect Glc4 in urine without interference of the Glc4 isomer maltotetraose (M4). METHODS: Urine samples, diluted in 0.1% ammonium hydroxide containing the internal standard acarbose, were filtered, and the filtrate was analyzed by UPLC-MS/MS. RESULTS: We separated and quantified acarbose, M4, and Glc4 using the ion pairs m/z 644/161, 665/161, and 665/179, respectively. Response of Glc4 was linear up to 1500 µmol/L and the limit of quantification was 2.8 µmol/L. Intra- and interassay CVs were 18.0% and 18.4% (10 µmol/L Glc4), and 10.5% and 16.2% (200 µmol/L Glc4). Glc4 in control individuals (n = 116) decreased with increasing age from a mean value of 8.9 mmol/mol to 1.0 mmol/mol creatinine. M4 was present in 5% of urine samples. Mean Glc4 concentrations per age group in untreated patients with Pompe disease (GSD type II) (n = 66) were significantly higher, ranging from 39.4 to 10.3 mmol/mol creatinine (P < 0.001-0.005). The diagnostic sensitivity of Glc4 for GSD-II was 98.5% and the diagnostic specificity 92%. Urine Glc4 was also increased in GSD-III (8 of 9), GSD-IV (2 of 3) and GSD-IX (6 of 10) patients. CONCLUSIONS: The UPLC-MS/MS assay of Glc4 in urine was discriminative between Glc4 and M4 and confirmed the diagnosis in >98% of GSD-II cases.
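Diagnostic sensitivity and specificity here are simple proportions of correctly classified patients and controls. The sketch below recomputes them from counts consistent with the abstract: 65 of 66 Pompe patients above the age-specific cut-off gives the reported 98.5% sensitivity, while the 107/9 split among the 116 controls is an assumption inferred from the reported 92% specificity.

```python
# Recomputing the reported diagnostic performance of urinary Glc4 for GSD-II.
# tp/fn follow directly from the abstract; tn/fp are an inferred (approximate) split of the 116 controls.

def sensitivity(tp: int, fn: int) -> float:
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    return tn / (tn + fp)

tp, fn = 65, 1     # Pompe patients above / below the cut-off
tn, fp = 107, 9    # controls below / above the cut-off (assumed split)

print(f"sensitivity = {sensitivity(tp, fn):.1%}")   # ~98.5%
print(f"specificity = {specificity(tn, fp):.1%}")   # ~92%
```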


Subject(s)
Glycogen Storage Disease/urine , Glycogen/metabolism , Oligosaccharides/urine , Adolescent , Adult , Age Factors , Aged , Child , Child, Preschool , Chromatography, Liquid , Glycogen Storage Disease Type II/urine , Glycogen Storage Disease Type III/urine , Glycogen Storage Disease Type IV/urine , Humans , Infant , Infant, Newborn , Maltose/analogs & derivatives , Maltose/urine , Middle Aged , Reference Values , Spectrometry, Mass, Electrospray Ionization , Tandem Mass Spectrometry , Young Adult
9.
PLoS One ; 7(3): e32692, 2012.
Article in English | MEDLINE | ID: mdl-22403698

ABSTRACT

BACKGROUND: As the United States embraces electronic health records (EHRs), improved emergency medical services (EMS) information systems are also a priority; however, little is known about the experiences of EMS agencies as they adopt and implement electronic patient care report (e-PCR) systems. We sought to characterize motivations for adoption of e-PCR systems, challenges associated with adoption and implementation, and emerging implementation strategies. METHODS: We conducted a qualitative study using semi-structured in-depth interviews with EMS agency leaders. Participants were recruited through a web-based survey of National Association of EMS Physicians (NAEMSP) members, a didactic session at the 2010 NAEMSP Annual Meeting, and snowball sampling. Interviews lasted approximately 30 minutes, were recorded and professionally transcribed. Analysis was conducted by a five-person team, employing the constant comparative method to identify recurrent themes. RESULTS: Twenty-three interviewees represented 20 EMS agencies from the United States and Canada; 14 EMS agencies were currently using e-PCR systems. The primary reason for adoption was the potential for e-PCR systems to support quality assurance efforts. Challenges to e-PCR system adoption included those common to any health information technology project, as well as challenges unique to the prehospital setting, including: fear of increased ambulance run times leading to decreased ambulance availability, difficulty integrating with existing hospital information systems, and unfunded mandates requiring adoption of e-PCR systems. Three recurring strategies emerged to improve e-PCR system adoption and implementation: 1) identify creative funding sources; 2) leverage regional health information organizations; and 3) build internal information technology capacity. CONCLUSION: EMS agencies are highly motivated to adopt e-PCR systems to support quality assurance efforts; however, adoption and implementation of e-PCR systems has been challenging for many. Emerging strategies from EMS agencies and others that have successfully implemented EHRs may be useful in expanding e-PCR system use and facilitating this transition for other EMS agencies.


Subject(s)
Electronic Health Records/statistics & numerical data , Emergency Medical Services/statistics & numerical data , Patient Care/statistics & numerical data , Data Collection , Electronic Health Records/economics , Electronic Health Records/legislation & jurisprudence , Emergency Medical Services/economics , Emergency Medical Services/legislation & jurisprudence , Hospital Information Systems , Hospitalization , Humans , Motivation , Patient Care/economics , Time Factors
10.
JAMA ; 307(18): 1925-33, 2012 May 09.
Article in English | MEDLINE | ID: mdl-22452807

ABSTRACT

CONTEXT: Laboratory studies suggest that in the setting of cardiac ischemia, immediate intravenous glucose-insulin-potassium (GIK) reduces ischemia-related arrhythmias and myocardial injury. Clinical trials have not consistently shown these benefits, possibly due to delayed administration. OBJECTIVE: To test out-of-hospital emergency medical service (EMS) administration of GIK in the first hours of suspected acute coronary syndromes (ACS). DESIGN, SETTING, AND PARTICIPANTS: Randomized, placebo-controlled, double-blind effectiveness trial in 13 US cities (36 EMS agencies), from December 2006 through July 31, 2011, in which paramedics, aided by electrocardiograph (ECG)-based decision support, randomized 911 patients (871 enrolled; mean age, 63.6 years; 71.0% men) with a high probability of ACS. INTERVENTION: Intravenous GIK solution (n = 411) or identical-appearing 5% glucose placebo (n = 460) administered by paramedics in the out-of-hospital setting and continued for 12 hours. MAIN OUTCOME MEASURES: The prespecified primary end point was progression of ACS to myocardial infarction (MI) within 24 hours, as assessed by biomarkers and ECG evidence. Prespecified secondary end points included survival at 30 days and a composite of prehospital or in-hospital cardiac arrest or in-hospital mortality, analyzed by intent-to-treat and by presentation with ST-segment elevation. RESULTS: There was no significant difference in the rate of progression to MI among patients who received GIK (n = 200; 48.7%) vs those who received placebo (n = 242; 52.6%) (odds ratio [OR], 0.88; 95% CI, 0.66-1.13; P = .28). Thirty-day mortality was 4.4% with GIK vs 6.1% with placebo (hazard ratio [HR], 0.72; 95% CI, 0.40-1.29; P = .27). The composite of cardiac arrest or in-hospital mortality occurred in 4.4% with GIK vs 8.7% with placebo (OR, 0.48; 95% CI, 0.27-0.85; P = .01). Among patients with ST-segment elevation (163 with GIK and 194 with placebo), progression to MI was 85.3% with GIK vs 88.7% with placebo (OR, 0.74; 95% CI, 0.40-1.38; P = .34); 30-day mortality was 4.9% with GIK vs 7.7% with placebo (HR, 0.63; 95% CI, 0.27-1.49; P = .29). The composite outcome of cardiac arrest or in-hospital mortality was 6.1% with GIK vs 14.4% with placebo (OR, 0.39; 95% CI, 0.18-0.82; P = .01). Serious adverse events occurred in 6.8% (n = 28) with GIK vs 8.9% (n = 41) with placebo (P = .26). CONCLUSIONS: Among patients with suspected ACS, out-of-hospital administration of intravenous GIK, compared with glucose placebo, did not reduce progression to MI. Compared with placebo, GIK administration was not associated with improvement in 30-day survival but was associated with lower rates of the composite outcome of cardiac arrest or in-hospital mortality. TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT00091507.
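For orientation on how the primary comparison is expressed, the sketch below computes an unadjusted odds ratio and Wald 95% confidence interval from the 2x2 counts given in the abstract (200 of 411 GIK vs 242 of 460 placebo patients progressing to MI). The published OR of 0.88 comes from the trial's prespecified analysis, so this unadjusted calculation is not expected to reproduce it exactly.

```python
# Illustrative only: unadjusted odds ratio and Wald 95% CI from the 2x2 counts in the abstract.
import math

a, b = 200, 411 - 200   # GIK arm: progressed to MI / did not
c, d = 242, 460 - 242   # placebo arm: progressed to MI / did not

or_unadj = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(or_unadj) - 1.96 * se_log_or)
hi = math.exp(math.log(or_unadj) + 1.96 * se_log_or)
print(f"unadjusted OR = {or_unadj:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```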


Subject(s)
Acute Coronary Syndrome/drug therapy , Cardioplegic Solutions/therapeutic use , Myocardial Infarction/prevention & control , Acute Coronary Syndrome/mortality , Aged , Allied Health Personnel , Angina, Unstable/complications , Angina, Unstable/drug therapy , Decision Support Techniques , Double-Blind Method , Electrocardiography , Emergency Medical Services , Female , Glucose/therapeutic use , Heart Arrest/prevention & control , Hospital Mortality , Humans , Insulin/therapeutic use , Male , Middle Aged , Myocardial Infarction/etiology , Odds Ratio , Potassium/therapeutic use , Survival Analysis , Treatment Outcome
11.
Conn Med ; 75(5): 261-8, 2011 May.
Article in English | MEDLINE | ID: mdl-21678837

ABSTRACT

OBJECTIVES: To assess the association of helmet use with motorcycle crash mortality and to identify characteristics of riders who do not wear helmets in Connecticut crashes. METHODS: Police crash data for Connecticut motorcycle crashes from 2001 to 2007 were analyzed. Bivariate analyses and multivariable logistic regressions were performed, including age, gender, seating position, road type, season, time of day, and recklessness. RESULTS: Of the 9,214 crashes with helmet-use data available, helmets were worn in 4,072 (44.2%). Riding without a helmet, age ≥18, riding on interstate or state roads, riding in the evening or at night, and reckless riding were associated with higher odds of fatality. Predictors of nonhelmet use included male gender, riding as a passenger, age <18 or 30 to 59 years, and riding in the summer, in the evening or at night, and on U.S., state, and local roads. CONCLUSION: Current crash data affirm that helmet use reduces motorcycle crash mortality in Connecticut. A set of factors helps predict nonhelmeted riders to whom safety training could be targeted.
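The analysis described is a multivariable logistic regression of crash fatality on rider and crash characteristics. Below is a minimal sketch of that kind of model using statsmodels formula syntax on a simulated data frame; the column names (fatal, helmet, night, reckless, age_group, road_type) are hypothetical stand-ins, not the study's actual variables, and the data are random.

```python
# Illustrative only: multivariable logistic regression in the spirit of the crash analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "fatal":     rng.integers(0, 2, n),    # 1 = fatal crash (simulated outcome)
    "helmet":    rng.integers(0, 2, n),    # 1 = helmet worn
    "night":     rng.integers(0, 2, n),    # 1 = evening/night crash
    "reckless":  rng.integers(0, 2, n),    # 1 = reckless riding cited
    "age_group": rng.choice(["<18", "18-29", "30-59", "60+"], n),
    "road_type": rng.choice(["local", "state", "interstate"], n),
})

fit = smf.logit("fatal ~ helmet + night + reckless + C(age_group) + C(road_type)", data=df).fit(disp=0)
print(np.exp(fit.params))   # odds ratios for each predictor
print(fit.conf_int())       # 95% CIs on the log-odds scale
```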


Subject(s)
Accidents, Traffic/statistics & numerical data , Craniocerebral Trauma/prevention & control , Head Protective Devices/statistics & numerical data , Motorcycles , Accidents, Traffic/legislation & jurisprudence , Accidents, Traffic/mortality , Adolescent , Adult , Age Factors , Connecticut/epidemiology , Craniocerebral Trauma/mortality , Female , Humans , Logistic Models , Male , Middle Aged , Motorcycles/legislation & jurisprudence , Motorcycles/statistics & numerical data , Risk Factors , Risk-Taking , Sex Factors
12.
Prehosp Emerg Care ; 15(2): 149-57, 2011.
Article in English | MEDLINE | ID: mdl-21294627

ABSTRACT

Some of the most intractable challenges in prehospital medicine include response time optimization, inefficiencies at the emergency medical services (EMS)-emergency department (ED) interface, and the ability to correlate field interventions with patient outcomes. Information technology (IT) can address these and other concerns by ensuring that system and patient information is received when and where it is needed, is fully integrated with prior and subsequent patient information, and is securely archived. Some EMS agencies have begun adopting information technologies, such as wireless transmission of 12-lead electrocardiograms, but few agencies have developed a comprehensive plan for management of their prehospital information and integration with other electronic medical records. This perspective article highlights the challenges and limitations of integrating IT elements without a strategic plan, and proposes an open, interoperable, and scalable prehospital information technology (PHIT) architecture. The two core components of this PHIT architecture are 1) routers with broadband network connectivity to share data between ambulance devices and EMS system information services and 2) an electronic patient care report to organize and archive all electronic prehospital data. To successfully implement this comprehensive PHIT architecture, data and technology requirements must be based on best available evidence, and the system must adhere to health data standards as well as privacy and security regulations. Recent federal legislation prioritizing health information technology may position federal agencies to help design and fund PHIT architectures.
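The architecture described centers on an electronic patient care report that aggregates and archives data from ambulance devices. Purely as a sketch of what that aggregation step could look like, here is a minimal Python data model; the class and field names are hypothetical and are not drawn from any specific e-PCR standard or product.

```python
# Hypothetical sketch of an e-PCR record that collects timestamped device observations.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DeviceObservation:
    device: str            # e.g., "12-lead ECG", "pulse oximeter"
    observed_at: datetime
    payload: dict          # raw or summarized device output

@dataclass
class EPCRRecord:
    incident_id: str
    unit_id: str
    observations: list[DeviceObservation] = field(default_factory=list)

    def add_observation(self, device: str, payload: dict) -> None:
        """Attach a timestamped device observation to the record."""
        self.observations.append(
            DeviceObservation(device, datetime.now(timezone.utc), payload))

record = EPCRRecord(incident_id="2011-000123", unit_id="Medic-7")
record.add_observation("12-lead ECG", {"interpretation": "possible STEMI"})
record.add_observation("pulse oximeter", {"spo2": 96})
print(len(record.observations), "observations archived")
```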


Subject(s)
Computer Systems , Emergency Medical Services/organization & administration , Medical Informatics/organization & administration , Medical Records Systems, Computerized/organization & administration , Wireless Technology/organization & administration , Congresses as Topic , Electrocardiography/instrumentation , Humans , Patient Care , Privacy , Time , United States
13.
Prehosp Emerg Care ; 14(4): 433-8, 2010.
Article in English | MEDLINE | ID: mdl-20608878

ABSTRACT

INTRODUCTION: Firefighters who become lost, disoriented, or trapped in a burning building may die after running out of air in their self-contained breathing apparatus (SCBA). An emergency escape device has been developed that attaches to the firefighter's mask in place of the SCBA regulator. The device filters out particulate matter and a number of hazardous components of smoke (but does not provide oxygen), providing additional time to escape after the firefighter runs out of SCBA air. OBJECTIVE: To field-test the device under realistic fire conditions to 1) ascertain whether it provides adequate protection from carbon monoxide (CO) and 2) examine firefighters' impressions of the device and its use. METHODS: A wood-frame house was fitted with atmospheric monitors, and levels of CO, oxygen, and hydrogen cyanide were continuously recorded. After informed consent was obtained, firefighters wearing the escape device instead of their usual SCBA regulators entered the burning structure and spent 10 minutes breathing through the device. A breath CO analyzer was used to estimate (±3 ppm) each subject's carboxyhemoglobin level immediately upon exiting the building, vital signs and pulse oximetry were assessed, and each firefighter was asked for general impressions of the device. RESULTS: Thirteen subjects were enrolled (all male, mean age 42.5 years, mean weight 94 kg). The mean peak CO level at the floor in the rooms where the subjects were located was 546 ppm, and ceiling CO measurements ranged from 679 ppm to the meters' maximum of 1,000 ppm, indicating substantial CO exposure. The firefighters' mean carboxyhemoglobin level was 1.15% (range 0.8%-2.1%) immediately after exit. All pulse oximetry readings were 95% or greater. No subject reported problems or concerns regarding the device, no symptoms suggestive of smoke inhalation or toxicity were reported, and all subjects expressed interest in carrying the device while on duty. CONCLUSION: The emergency escape device provides excellent protection from CO in realistic fire scenarios with substantial exposure to toxic gases, and the firefighters studied had a positive impression of the device and its use.


Subject(s)
Filtration/instrumentation , Fires , Occupational Exposure , Respiratory Protective Devices , Adult , Equipment Design , Humans , Male , Middle Aged , Prospective Studies , Safety Management
14.
Prehosp Emerg Care ; 14(2): 153-8, 2010.
Article in English | MEDLINE | ID: mdl-20095828

ABSTRACT

BACKGROUND: Prompt reperfusion in ST-segment elevation myocardial infarction (STEMI) saves lives. Although studies have shown that paramedics can reliably interpret STEMI on prehospital 12-lead electrocardiograms (p12ECGs), prehospital activation of the cardiac catheterization laboratory by emergency medical services (EMS) has not yet gained widespread acceptance. OBJECTIVE: To quantify the potential reduction in time to percutaneous coronary intervention (PCI) by early prehospital activation of the cardiac catheterization laboratory in STEMI. METHODS: This prospective, observational study enrolled all patients diagnosed with STEMI by paramedics in a mid-sized regional EMS system. Patients were enrolled if: 1) the paramedic interpreted STEMI on the p12ECG, 2) the Acute Cardiac Ischemia Time-Insensitive Predictive Instrument (ACI-TIPI) score was 75% or greater, and 3) the patient was transported to either of two area PCI centers. Data recorded included the time of initial EMS "STEMI alert" from the scene, time of arrival at the emergency department (ED), and time of actual catheterization laboratory activation by the ED physician, all using synchronized clocks. The primary outcome measure was the time difference between the STEMI alert and the actual activation (i.e., potential time savings). The false-positive rate (patients incorrectly diagnosed with STEMI by paramedics) was also calculated and compared with a locally accepted false-positive rate of 10%. RESULTS: Twelve patients were enrolled prior to early termination of the study. The mean and median potential time reductions were 15 and 11 minutes, respectively (range 7-29 minutes). There was one false STEMI alert (8.3% false-positive rate) for a patient with a right bundle branch block who subsequently had a non-ST-segment elevation myocardial infarction. The study was terminated when our cardiologists adopted a prehospital catheterization laboratory activation protocol based on our initial data. CONCLUSION: Important reductions in time to reperfusion seem possible by activation of the catheterization laboratory by EMS from the scene, with an acceptably low false-positive rate in this small sample. This type of clinical research can inform multidisciplinary policies and bring about meaningful clinical practice changes.
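The primary outcome is simply the elapsed time between the paramedic's scene "STEMI alert" and the ED physician's later catheterization laboratory activation, measured on synchronized clocks. A small sketch of that calculation is below; the timestamps are invented for illustration and are not the study data.

```python
# Illustrative only: potential time savings = ED activation time minus scene "STEMI alert" time.
from datetime import datetime
from statistics import mean, median

pairs = [
    ("2009-06-01 10:02", "2009-06-01 10:13"),   # (scene STEMI alert, ED activation) - made-up times
    ("2009-06-03 14:45", "2009-06-03 15:14"),
    ("2009-06-07 08:10", "2009-06-07 08:17"),
]

fmt = "%Y-%m-%d %H:%M"
savings_min = [
    (datetime.strptime(act, fmt) - datetime.strptime(alert, fmt)).total_seconds() / 60
    for alert, act in pairs
]
print(f"mean savings = {mean(savings_min):.0f} min, median = {median(savings_min):.0f} min")
```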


Subject(s)
Cardiac Catheterization , Electrocardiography/instrumentation , Emergency Medical Technicians , Myocardial Infarction/therapy , Humans , Myocardial Infarction/physiopathology , Observation , Prospective Studies , Time Factors , United States
15.
Prehosp Emerg Care ; 13(4): 536-40, 2009.
Article in English | MEDLINE | ID: mdl-19731169

ABSTRACT

INTRODUCTION: No existing mass casualty triage system has been scientifically scrutinized or validated. A recent work group sponsored by the Centers for Disease Control and Prevention, using a combination of expert opinion and the extremely limited research data available, created the SALT (sort-assess-lifesaving interventions-treat/transport) triage system to serve as a national model. An airport crash drill was used to pilot test the SALT system. OBJECTIVE: To assess the accuracy and speed with which trained paramedics can triage victims using this new system. METHODS: Investigators created 50 patient scenarios with a wide range of injuries and severities, and two additional uninjured victims were added at the time of the drill. Students wearing moulage and coached on how to portray their injuries served as "victims." Assuming proper application of the SALT system, the patient scenarios were designed such that 16 patients would be triaged as T1/red/immediate, 12 as T2/yellow/delayed, 14 as T3/green/minimal, and 10 as T4/black/dead. Paramedics were trained to proficiency in the SALT system one week prior to the drill using a 90-minute didactic/practical session, and were given "flash cards" showing the triage algorithm to be used if needed during the drill. Observers blinded to the study purpose timed and recorded the triage process for each patient during the drill. Simple descriptive statistics were used to analyze the data. RESULTS: The two paramedics assigned to the role of triage officers applied the SALT algorithm correctly to 41 of the 52 patients (78.8% accuracy). Seven patients intended to be T2 were triaged as T1, and two patients intended to be T3 were triaged as T2, for an overtriage rate of 13.5%. Two patients intended to be T2 were triaged as T3, for an undertriage rate of 3.8%. Triage times were recorded by the observers for 42 of the 52 patients, with a mean of 15 seconds per patient (range 5-57 seconds). CONCLUSIONS: The SALT mass casualty triage system can be applied quickly in the field and appears to be safe, as measured by a low undertriage rate. There was, however, significant overtriage. Further refinement is needed, and effect on patient outcomes needs to be evaluated.


Subject(s)
Mass Casualty Incidents , Triage/organization & administration , Disaster Planning , Efficiency, Organizational , Emergency Medical Services , Emergency Medical Technicians , Humans , Pilot Projects , Task Performance and Analysis
16.
Prehosp Emerg Care ; 13(1): 85-9, 2009.
Article in English | MEDLINE | ID: mdl-19145531

ABSTRACT

The application of Advanced Cardiac Life Support (ACLS) in severe hypothermic cardiac arrest remains controversial. While the induction of mild hypothermia has been shown to improve outcomes in patients already resuscitated from cardiac arrest, it is unknown whether ACLS protocols are effective during the resuscitation of the severely hypothermic cardiac arrest patient. We describe a case of a 47-year-old man who was successfully resuscitated from a ventricular fibrillation (VF) arrest with a core body temperature of 26.4 degrees C. The patient had been found unresponsive in a bathtub of cold water following an apparent suicide attempt. An incorrect pronouncement of death by the fire department delayed his transport to the hospital by more than four hours. Once in the emergency department (ED), the patient sustained a VF cardiac arrest and was successfully defibrillated using ACLS protocols. He ultimately survived his hospitalization with near-complete neurologic recovery. In this case report, we discuss the application of ACLS to the resuscitation of the hypothermic cardiac arrest patient as well as the issues involved in the prehospital determination of death.


Subject(s)
Advanced Cardiac Life Support/methods , Heart Arrest/therapy , Hypothermia/therapy , Rewarming , Ventricular Fibrillation/therapy , Electric Countershock , Heart Arrest/complications , Humans , Hypothermia/complications , Male , Middle Aged , Positive-Pressure Respiration , Suicide, Attempted
17.
Prehosp Emerg Care ; 12(3): 297-301, 2008.
Article in English | MEDLINE | ID: mdl-18584495

ABSTRACT

INTRODUCTION: Firefighters are taught that heat, oxygen deprivation, and carbon monoxide (CO) are the primary threats to life in residential structure fires, and they are taught to search for victims on the fire floor first, and then floors above. The objective of this study was to gather data regarding oxygen, CO, and heat conditions inside a realistic house fire, to examine the validity of these teachings. METHODS: During six live-burn training evolutions in a two-story wood-frame house, metering for oxygen levels, CO levels, and temperature was conducted. Except where noted, all readings were taken 24 inches off the floor, to simulate the location of a crawling victim or firefighter. Readings were hand-recorded on a convenience basis by firefighters stationed outside the building, near the meters. RESULTS: Of the 35 oxygen levels recorded, the lowest was 18.2%, with only 12 readings below 20%. Three of 16 first-floor readings were below 20%, whereas nine of 19 second-floor readings were below 20% (p=0.07). First- and second-floor readings were comparable (mean 20.3% vs. 19.9%, p=0.11). Except for one reading of 1,870 ppm, all CO readings at the ceiling exceeded the 2,000-ppm limit of the meters. Of the 34 CO levels recorded 24 inches off the floor, 29 (76%) exceeded the permissible exposure limit of 50 ppm, with the highest reading being 1,424 ppm, well above the "immediately dangerous to life and health" level of 1,200 ppm. None of the 20 CO levels recorded on the first floor exceeded the 30-minute exposure limit of 800 ppm, whereas seven of 14 second-floor readings exceeded this limit (p<0.001). While ceiling temperatures frequently exceeded the 1,000 degrees F limit of the meters, none of 16 readings taken 24 inches off the floor exceeded 137 degrees F. First- and second-floor temperatures were comparable (mean 88.5 degrees F vs. 90.1 degrees F, p=0.9). CONCLUSIONS: In residential structure fires, CO poses a greater threat to victims and firefighters than does oxygen deprivation or heat. Emergency medical services personnel should consider CO toxicity in all fire victims. Conditions on the floor above a fire are at least as adverse as those on the fire floor.
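The floor-to-floor comparisons above contrast proportions of readings exceeding a threshold (e.g., oxygen below 20%, CO above 800 ppm). The abstract does not state which statistical test was used; as one reasonable choice, the sketch below applies Fisher's exact test to the reported 2x2 counts, so its p-values need not match the published ones.

```python
# Illustrative only: first- vs. second-floor comparisons using Fisher's exact test.
from scipy.stats import fisher_exact

# Oxygen below 20%: 3 of 16 first-floor readings vs. 9 of 19 second-floor readings
odds_o2, p_o2 = fisher_exact([[3, 16 - 3], [9, 19 - 9]])
print(f"oxygen <20%: p = {p_o2:.3f}")

# CO above the 800-ppm 30-minute limit: 0 of 20 first-floor vs. 7 of 14 second-floor readings
odds_co, p_co = fisher_exact([[0, 20], [7, 14 - 7]])
print(f"CO >800 ppm: p = {p_co:.4f}")
```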


Subject(s)
Carbon Monoxide Poisoning/prevention & control , Emergency Medical Services , Fires , Safety Management , Smoke Inhalation Injury/prevention & control , Carbon Monoxide/analysis , Emergency Medical Technicians/education , Hot Temperature/adverse effects , Humans , Hypoxia/prevention & control , Inservice Training , Oxygen/analysis , Residence Characteristics , United States
18.
J Forensic Leg Med ; 15(5): 343-5, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18511013

ABSTRACT

This case report describes an unusual presentation of a gunshot wound (GSW) and stresses the importance of gathering complete clinical, scene, and historical information whenever possible. Sufficient details regarding an injured patient's mechanism of injury (MOI) should be elicited by the treating physician when the patient's hemodynamic status allows. A careful physical exam is essential, as are appropriate laboratory investigations and diagnostic imaging. We present a case of a single GSW found on physical exam but multiple projectiles found on imaging studies. The history of present illness, scene findings, and trial transcripts clarify the patient's presentation.


Subject(s)
Forensic Ballistics , Radiography, Thoracic , Wounds, Gunshot/pathology , Adult , Humans , Liver/injuries , Liver/pathology , Male , Thoracic Injuries/pathology
19.
Prehosp Emerg Care ; 12(2): 236-40, 2008.
Article in English | MEDLINE | ID: mdl-18379923

ABSTRACT

INTRODUCTION: Existing mass casualty triage systems do not consider the possibility of chemical, biological, or radiologic/nuclear (CBRN) contamination of the injured patients. A system that can triage injured patients who are or may be contaminated by CBRN material, developed through expert opinion, was pilot-tested at an airport disaster drill. The study objective was to determine the system's speed and accuracy. METHODS: For a drill involving a plane crash with release of organophosphate material from the cargo hold, 56 patient scenarios were generated, with some involving signs and symptoms of organophosphate toxicity in addition to physical trauma. Prior to the drill, the investigators examined each scenario to determine the "correct" triage categorization, assuming proper application of the proposed system, and trained the paramedics who were expected to serve as triage officers at the drill. During the drill, the medics used the CBRN triage system to triage the 56 patients, with two observers timing and recording the events of the triage process. The IRB deemed the study exempt from full review. RESULTS: The two triage officers applied the CBRN system correctly to 49 of the 56 patients (87.5% accuracy). One patient intended to be T2 (yellow) was triaged as T1 (red), for an over-triage rate of 1.8%. Five patients intended to be T1 were triaged as T2, and one patient intended to be T2 was triaged as T3 (green), for an under-triage rate of 10.7%. All six under-triage cases were due to failure to recognize or account for signs of organophosphate toxidrome in applying the triage system. For the 27 patients for whom times were recorded, triage was accomplished in a mean of 19 seconds (range 4-37, median 17). CONCLUSIONS: The chemical algorithm of the proposed CBRN-capable mass casualty triage system can be applied rapidly by trained paramedics, but a significant under-triage rate (10.7%) was seen in this pilot test. Further refinement and testing are needed, and effect on outcome must be studied.
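The accuracy and over-/under-triage rates are straightforward proportions of the 56 scenario patients. The short sketch below reproduces them from the counts given in the abstract, assuming all rates use the full 56 patients as the denominator.

```python
# Recomputing the drill's triage performance from the counts in the abstract.
total = 56
correct = 49
over_triaged = 1         # one T2 (yellow) patient triaged as T1 (red)
under_triaged = 5 + 1    # five T1 patients triaged as T2, one T2 patient triaged as T3 (green)

print(f"accuracy     = {correct / total:.1%}")        # 87.5%
print(f"over-triage  = {over_triaged / total:.1%}")   # 1.8%
print(f"under-triage = {under_triaged / total:.1%}")  # 10.7%
```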


Subject(s)
Hazardous Substances/isolation & purification , Mass Casualty Incidents , Triage/standards , Weapons of Mass Destruction , Humans , Organophosphates/isolation & purification , Pilot Projects , Triage/organization & administration
20.
Prehosp Emerg Care ; 12(2): 225-35, 2008.
Article in English | MEDLINE | ID: mdl-18379922

ABSTRACT

OBJECTIVE: To develop experimental models to study uncompensable heat stress (UCHS) in working firefighters (FFs). METHODS: FFs ingested core temperature (Tc) capsules prior to performing sequential tasks either at 40 degrees C in personal protective ensemble (PPE) or at 18 degrees C without PPE. Both trials were conducted in an environmental chamber with FFs using self-contained breathing apparatus (SCBA). RESULTS: FFs exercising in heat and PPE reproduced UCHS conditions. For every FF in both trials for whom the capsules worked, Tc was elevated, and Tc(max) occurred after completion of the study protocol. Trials with PPE resulted in a mean maximum temperature of 38.94 degrees C (±0.37 degrees C); Tc(max) reached 40.4 degrees C. Without PPE, maximum Tc averaged 37.79 degrees C (±0.07 degrees C). Heat storage values ranged from 131 to 1205 kJ, averaging 578 kJ (±151.47 kJ) with PPE and 210.83 kJ (±21.77 kJ) without PPE. CONCLUSIONS: An experimental model has been developed that simulates the initial phases of an interior fire attack to study the physiology of UCHS in FFs. The hot environment and PPE increase maximum Tc and heat storage beyond that attributable to the exertion required to perform the tasks and may decrease time to volitional fatigue. This model will permit controlled studies to optimize work-rest cycles, rehab conditions, and physical conditioning of FFs.
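Body heat storage in kJ is conventionally estimated from the change in mean body temperature, body mass, and the average specific heat of body tissue (about 3.47 kJ per kg per degree C), with mean body temperature taken as a weighted combination of core and skin temperatures. The abstract does not give the formula actually used, so the sketch below shows only one common approach, with hypothetical inputs.

```python
# One conventional estimate of body heat storage; the study's exact method is not stated in the abstract.
# S = c_body * mass * change in mean body temperature, with Tbody = 0.8*Tcore + 0.2*Tskin (warm conditions).
C_BODY = 3.47  # kJ per kg per degree C, commonly used average specific heat of body tissue

def heat_storage_kj(mass_kg: float,
                    tcore_start: float, tcore_end: float,
                    tskin_start: float, tskin_end: float,
                    core_weight: float = 0.8) -> float:
    tb_start = core_weight * tcore_start + (1 - core_weight) * tskin_start
    tb_end = core_weight * tcore_end + (1 - core_weight) * tskin_end
    return C_BODY * mass_kg * (tb_end - tb_start)

# Hypothetical firefighter: 90 kg, core 37.0 -> 38.9 degrees C, skin 33.0 -> 37.5 degrees C during a PPE trial
print(f"{heat_storage_kj(90, 37.0, 38.9, 33.0, 37.5):.0f} kJ stored")
```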


Subject(s)
Employment , Fires , Heat Stress Disorders/physiopathology , Adolescent , Adult , Body Mass Index , Female , Heat Stress Disorders/etiology , Humans , Male , Middle Aged , Monitoring, Ambulatory/methods , Occupational Exposure , Task Performance and Analysis