ABSTRACT
OBJECTIVE: Simulation stands to serve an important role in modern competency-based programs of assessment in postgraduate medical education. Our objective was to compare the performance of individual emergency medicine (EM) residents in a simulation-based resuscitation objective structured clinical examination (OSCE) using the Queen's Simulation Assessment Tool (QSAT), with portfolio assessment of clinical encounters using a modified in-training evaluation report (ITER), to understand in greater detail the inferences that may be drawn from a simulation-based OSCE assessment.
METHODS: A prospective observational study was employed to explore the use of a multicenter simulation-based OSCE for evaluation of resuscitation competence. EM residents from five Canadian academic sites participated in the OSCE. Video-recorded performances were scored by blinded raters using scenario-specific QSATs with domain-specific anchored scores (primary assessment, diagnostic actions, therapeutic actions, communication) and a global assessment score (GAS). Residents' portfolios were evaluated using a modified ITER subdivided by CanMEDS roles (medical expert, communicator, collaborator, leader, health advocate, scholar, and professional) and a GAS. Correlational and regression analyses were performed comparing components of the two assessment methods.
RESULTS: Portfolio review and ITER scoring were performed for 79 residents participating in the simulation-based OSCE. There was a significant positive correlation between total OSCE and ITER scores (r = 0.341). The strongest correlations were found between the ITER medical expert score and each of the OSCE GAS (r = 0.420), communication (r = 0.443), and therapeutic action (r = 0.484) domains. ITER medical expert score was a significant predictor of OSCE total (p = 0.002), and OSCE therapeutic action was a significant predictor of ITER total (p = 0.02).
CONCLUSIONS: Simulation-based resuscitation OSCEs and portfolio assessment captured by ITERs appear to measure differing aspects of competence, with weak to moderate correlation between measures of conceptually similar constructs. In a program of competency-based assessment of EM residents, a simulation-based OSCE using the QSAT shows promise as a tool for assessing the medical expert and communicator roles.
ABSTRACT
The Department of Emergency Medicine at Queen's University developed, implemented, and evaluated an interprofessional simulation-based competition called the Simulation Olympics, with the purpose of encouraging health care providers to practice resuscitation skills and fostering strong team-based attitudes. Eleven teams (N = 45) participated in the competition. Teams composed of nurses, respiratory therapists, and undergraduate and postgraduate medical trainees completed three standardized resuscitation scenarios in a high-fidelity simulation laboratory. Trained standardized actors and a dedicated technician were used for all scenarios. Judges evaluated team performance using standardized assessment tools. All participants (100%) completed an anonymous two-page questionnaire prior to the competition assessing baseline characteristics and evaluating participant attitudes, motivation, and barriers to participation. The majority of participants (71%) completed an evaluation form following the event focusing on highlights, barriers to participation, and desired future directions. Evaluations were uniformly positive in both short-answer feedback and attitudinal scoring measures. To our knowledge, the Simulation Olympics competition is the first of its kind in Canada to be offered at an academic teaching hospital.