Results 1 - 5 of 5
1.
Appl Clin Inform; 4(4): 569-82, 2013.
Article in English | MEDLINE | ID: mdl-24454583

ABSTRACT

BACKGROUND: In determining whether clinical decision support (CDS) should be interruptive or non-interruptive, CDS designers need more guidance to balance the potential for interruptive CDS to overburden clinicians against the potential for non-interruptive CDS to be overlooked by clinicians.

OBJECTIVE: (1) To compare the performance achieved by clinicians using interruptive CDS versus similar, non-interruptive CDS. (2) To compare the performance achieved using non-interruptive CDS among clinicians exposed to interruptive CDS versus clinicians not exposed to interruptive CDS.

METHODS: We studied 42 emergency medicine physicians working in a large hospital where an interruptive CDS for identifying patients requiring contact isolation was replaced by a similar, but non-interruptive, CDS. The first primary outcome was the change in sensitivity in identifying these patients associated with the conversion from an interruptive to a non-interruptive CDS. The second primary outcome was the difference in sensitivities yielded by the non-interruptive CDS when used by providers who had and who had not been exposed to the interruptive CDS. The reference standard was an epidemiologist-designed, structured, objective assessment.

RESULTS: In identifying patients needing contact isolation, the interruptive CDS-physician dyad had a sensitivity of 24% (95% CI: 17%-32%), versus a sensitivity of 14% (95% CI: 9%-21%) for the non-interruptive CDS-physician dyad (p = 0.04). Users of the non-interruptive CDS with prior exposure to the interruptive CDS were more sensitive than those without such exposure (14% [95% CI: 9%-21%] versus 7% [95% CI: 3%-13%], p = 0.05).

LIMITATIONS: As with all observational studies, we cannot confirm that our analysis controlled for every important difference between time periods and physician groups.

CONCLUSIONS: Interruptive CDS affected clinicians more than non-interruptive CDS. Designers of CDS might explicitly weigh the benefits of interruptive CDS against its associated increase in clinician burden. Further research should study the longer-term effects of clinician exposure to interruptive CDS, including whether it may improve clinician performance when using a similar, subsequent non-interruptive CDS.


Subject(s)
Decision Support Systems, Clinical; Patient Isolation; Humans; Physicians; Retrospective Studies
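
As a rough illustration of the statistics reported in the abstract above, the sketch below computes sensitivity with a Wilson score 95% confidence interval and a pooled two-proportion z-test in Python. The counts, denominators, and names are hypothetical placeholders chosen only to land near the reported percentages; they are not the study's data, and the paper does not state which interval or test it used, so those choices are assumptions.

    from math import sqrt
    from statistics import NormalDist

    Z = 1.959964  # two-sided 95% critical value of the standard normal

    def sensitivity_wilson_ci(true_pos, false_neg, z=Z):
        """Sensitivity = TP / (TP + FN), with a Wilson score 95% interval."""
        n = true_pos + false_neg
        p = true_pos / n
        denom = 1 + z * z / n
        centre = (p + z * z / (2 * n)) / denom
        half = (z / denom) * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return p, centre - half, centre + half

    def two_proportion_z(x1, n1, x2, n2):
        """Pooled two-proportion z statistic for H0: p1 == p2."""
        pooled = (x1 + x2) / (n1 + n2)
        se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        return (x1 / n1 - x2 / n2) / se

    # Hypothetical counts (patients identified, patients missed) -- NOT study data.
    interruptive = (29, 91)       # sensitivity ~24%
    non_interruptive = (17, 103)  # sensitivity ~14%

    for label, (tp, fn) in [("interruptive CDS-physician dyad", interruptive),
                            ("non-interruptive CDS-physician dyad", non_interruptive)]:
        sens, lo, hi = sensitivity_wilson_ci(tp, fn)
        print(f"{label}: sensitivity {sens:.0%} (95% CI {lo:.0%}-{hi:.0%})")

    z = two_proportion_z(interruptive[0], sum(interruptive),
                         non_interruptive[0], sum(non_interruptive))
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    print(f"two-proportion z = {z:.2f}, p = {p_value:.3f}")
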
2.
Prehosp Emerg Care; 5(1): 23-8, 2001.
Article in English | MEDLINE | ID: mdl-11194065

ABSTRACT

OBJECTIVE: To characterize the reasons for and effects of diversions of advanced life support (ALS) ambulances in a large urban area with a high concentration of receiving hospitals.

METHODS: A retrospective study was performed in a large urban region during a consecutive three-month period. Diversion was defined as the ALS transport of a patient to an emergency department (ED) other than the designated primary receiving facility. Case-matched concurrent cohorts of patients who were and were not diverted were studied to establish emergency medical services (EMS) time intervals, including total prehospital interval (TPI), on-scene interval (OSI), and patient transfer interval (PTI); age; gender; Glasgow Coma Score (GCS); ALS interventions; and insurance status. The reasons for diversion and the chief complaints of diverted patients were also studied.

RESULTS: During the study period, 2,534 ALS runs occurred, of which 147 (5.8%) were diverted. Twenty-four (16.3%) diversions had incomplete run times, leaving 123 (83.7%) for analysis. The most common chief complaints of diverted patients were shortness of breath (SOB), chest pain (CP), and altered mental status (AMS). The most common reason for diversion was special consideration (SC), defined as a diversion requested by a patient, family member, law enforcement officer, or private medical doctor. Diverted ambulances had significant increases in TPI, 36.4 [95% confidence interval (95% CI) 35.1-37.7] vs. 33.4 [95% CI 32.13-34.7], and PTI, 10.3 [95% CI 9.4-11.2] vs. 7.9 [95% CI 7.2-8.6], compared with nondiverted ambulances. Further analysis demonstrated that SC diversions accounted for all of the increases in TPI (p<0.001) and PTI (p<0.001) when compared with other types of diversions and nondiverted transports.

CONCLUSION: "Special consideration" was the most common reason for diversion in this study. Special consideration diversions increased TPI and PTI, causing delays in arrival to the ED and decreased ALS ambulance availability.


Subject(s)
Ambulances/statistics & numerical data; Emergency Medical Services/organization & administration; Patient Transfer/statistics & numerical data; Urban Health Services/statistics & numerical data; Utilization Review; Adult; Advanced Cardiac Life Support; Aged; Catchment Area, Health; Emergencies/classification; Emergency Medical Services/statistics & numerical data; Female; Humans; Los Angeles/epidemiology; Male; Middle Aged; Retrospective Studies; Time and Motion Studies
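
The interval comparison in the abstract above lends itself to a small Python sketch: derive TPI, OSI, and PTI from per-run timestamps, then summarize each group with a mean and a normal-approximation 95% confidence interval. The record layout, field names, and values below are hypothetical, and the paper does not describe its exact computation, so this only illustrates the interval definitions.

    from datetime import datetime
    from math import sqrt
    from statistics import mean, stdev

    FMT = "%H:%M"

    def intervals(run):
        """Return (TPI, OSI, PTI) in minutes for one ALS run.
        TPI: dispatch -> ED arrival; OSI: scene arrival -> scene departure;
        PTI: scene departure -> ED arrival."""
        t = {k: datetime.strptime(v, FMT) for k, v in run.items() if k != "diverted"}
        span = lambda a, b: (t[b] - t[a]).total_seconds() / 60
        return (span("dispatch", "ed_arrival"),
                span("scene_arrival", "scene_depart"),
                span("scene_depart", "ed_arrival"))

    def mean_ci(values, z=1.96):
        """Mean with a normal-approximation 95% confidence interval."""
        m = mean(values)
        half = z * stdev(values) / sqrt(len(values))
        return round(m, 1), round(m - half, 1), round(m + half, 1)

    runs = [  # hypothetical records, not study data
        {"dispatch": "10:00", "scene_arrival": "10:08", "scene_depart": "10:22",
         "ed_arrival": "10:38", "diverted": True},
        {"dispatch": "12:15", "scene_arrival": "12:24", "scene_depart": "12:40",
         "ed_arrival": "12:52", "diverted": True},
        {"dispatch": "11:00", "scene_arrival": "11:07", "scene_depart": "11:21",
         "ed_arrival": "11:32", "diverted": False},
        {"dispatch": "14:30", "scene_arrival": "14:36", "scene_depart": "14:52",
         "ed_arrival": "15:01", "diverted": False},
    ]

    for label, flag in [("diverted", True), ("nondiverted", False)]:
        tpi, osi, pti = zip(*(intervals(r) for r in runs if r["diverted"] == flag))
        print(label, "TPI mean (95% CI):", mean_ci(tpi),
              "PTI mean (95% CI):", mean_ci(pti))
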
5.
Oecologia; 71(1): 63-68, 1986 Dec.
Article in English | MEDLINE | ID: mdl-28312085

ABSTRACT

Laurel Sumac (Rhus laurina) is a dominant member of the coastal chaparral community of southern California that survives periodic wildfires by resprouting from a lignotuber (root crown). We investigated the physiological basis for resprouting by comparing shoot elongation, leaf nitrogen content, tissue water status, leaf conductance to water vapor diffusion, and photosynthetic rates of post-fire R. laurina to those of adjacent unburned shrubs. Resprouts had higher rates of shoot elongation, leaf conductance, and photosynthesis than mature, unburned shrubs. Leaf nitrogen contents were elevated in burned shrubs even though their leaves developed interveinal chlorosis. A comparison of soil water potential to predawn water potential indicated that roots of R. laurina remain active below 2 m during the first summer drought after wildfire. Our results support the hypothesis that lignotubers not only contain dormant buds that develop into aerial shoots after wildfire but also supply nutrient resources that enhance shoot elongation. Because R. laurina is relatively sensitive to drought, yet very successful in its rapid recovery after fire, maintaining an active root system after shoot removal may be the primary function of the massive lignotuber formed by this species.
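
The rooting-depth inference in the abstract above rests on the idea that predawn shoot water potential roughly equilibrates with the wettest soil layer the active roots can reach, so matching the predawn value against a soil water potential profile suggests how deep the active roots extend. A minimal Python sketch of that comparison, with entirely hypothetical numbers rather than the study's measurements, might look like:

    # Soil water potential profile (depth in m -> water potential in MPa;
    # more negative = drier), e.g. dry near the surface during summer drought.
    # All values are hypothetical, for illustration only.
    soil_profile = {0.5: -4.0, 1.0: -2.5, 1.5: -1.8, 2.0: -1.2, 2.5: -0.8}

    predawn_psi = -0.9  # hypothetical predawn shoot water potential, MPa

    # Find the shallowest depth whose soil water potential is no more negative
    # than the predawn value; active roots plausibly reach at least that deep.
    active_depths = [d for d, psi in sorted(soil_profile.items())
                     if psi >= predawn_psi]

    if active_depths:
        print(f"Roots likely active at roughly {active_depths[0]} m or deeper "
              f"(soil psi {soil_profile[active_depths[0]]} MPa >= "
              f"predawn {predawn_psi} MPa)")
    else:
        print("Predawn psi exceeds every measured soil layer; profile too dry or shallow")
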
