1.
Eur J Radiol ; 147: 110109, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34968900

ABSTRACT

OBJECTIVES: To systematically evaluate the Queen's University diagnostic radiology residency program following its transition to a competency-based medical education (CBME) curriculum. METHODS: Rapid Evaluation methodology and the Core Components Framework were used to measure CBME implementation. A combination of interviews and focus groups was held with program leaders (n = 6), faculty (n = 10), both CBME-stream and traditional-stream residents (n = 6), and program staff (n = 2). Interviews and focus groups were transcribed and analyzed abductively. The study team met with program leaders to review common themes and plan potential adaptations. RESULTS: Strengths of CBME implementation included more frequent and timely feedback as well as the role of the Academic Advisor. However, frontline faculty felt insufficiently supported with regard to the theory and practical implementation of the new curriculum and found the assessment tools unintuitive. The circumstances surrounding the curricular implementation also generated some negative sentiment. Additional faculty and resident education workshops, along with changes to assessment tools for greater clarity, were identified as areas for improvement. Residents overall viewed the changes favorably, with traditional-stream residents indicating that they, too, desired more frequent feedback. CONCLUSIONS: Rapid Evaluation is an effective method for program assessment following curricular change in diagnostic radiology. A departmental champion driving enthusiasm for change from within may be valuable. Adequate resident and faculty education is key to maximizing the benefits of change and smoothing the transition. ADVANCES IN KNOWLEDGE: This study offers insights for other radiology training programs transitioning to a CBME framework and provides a structure for programmatic assessment.


Subject(s)
Internship and Residency , Radiology , Canada , Clinical Competence , Competency-Based Education , Curriculum , Humans , Radiology/education
3.
CJEM ; 20(1): 125-131, 2018 01.
Article in English | MEDLINE | ID: mdl-28443532

ABSTRACT

BACKGROUND: Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little has been published on how to publish successfully in this area. OBJECTIVE: We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. METHODS: Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms "research design," "quantitative," "quantitative methods," and "medical education." A hand search for additional references was completed during the full-text review. Titles and abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and either outlined reporting guidelines or provided recommendations on conducting quantitative research. One hundred articles were reviewed in parallel, with the first 30 used for calibration and the subsequent 70 used to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full-text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. RESULTS: Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. DISCUSSION: This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.
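
To illustrate the screening-agreement step described above, here is a minimal sketch of computing Cohen's kappa for two reviewers' include/exclude decisions. The decision data below are hypothetical; the abstract does not publish its raw screening data.

```python
# Minimal sketch (hypothetical data): chance-corrected agreement between
# two reviewers screening abstracts, expressed as Cohen's kappa.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[label] / n) * (freq_b[label] / n)
                   for label in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Hypothetical include/exclude decisions for 10 screened abstracts
reviewer_bt = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
reviewer_pc = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(f"kappa = {cohens_kappa(reviewer_bt, reviewer_pc):.2f}")  # kappa = 0.80
```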


Subject(s)
Biomedical Research , Education, Medical/economics , Emergency Medicine/education , Fellowships and Scholarships/standards , Guidelines as Topic/standards , Periodicals as Topic , Emergency Medicine/economics , Humans
4.
CJEM ; 20(1): 132-141, 2018 01.
Article in English | MEDLINE | ID: mdl-28511730

ABSTRACT

OBJECTIVES: Simulation-based education (SBE) is an important training strategy in emergency medicine (EM) postgraduate programs. This study sought to characterize the use of simulation in FRCPC-EM residency programs across Canada. METHODS: A national survey was administered to residents and knowledgeable program representatives (PRs) at all Canadian FRCPC-EM programs. Survey question themes included simulation program characteristics, the frequency of resident participation, the location and administration of SBE, institutional barriers, interprofessional involvement, content, assessment strategies, and attitudes about SBE. RESULTS: Resident and PR response rates were 63% (203/321) and 100% (16/16), respectively. Residents reported a median of 20 (range 0-150) hours of annual simulation training, with 52% of residents indicating that the time dedicated to simulation training met their needs. PRs reported the frequency of SBE sessions ranging from weekly to every 6 months, with 15 (94%) programs having an established simulation curriculum. Two (13%) of the programs used simulation for resident assessment, although 15 (94%) of PRs indicated that they would be comfortable with simulation-based assessment. The most common PR-identified barriers to administering simulation were a lack of protected faculty time (75%) and a lack of faculty experience with simulation (56%). Interprofessional involvement in simulation was strongly valued by both residents and PRs. CONCLUSIONS: SBE is frequently used by Canadian FRCPC-EM residency programs. However, there exists considerable variability in the structure, frequency, and timing of simulation-based activities. As programs transition to competency-based medical education, national organizations and collaborations should consider the variability in how SBE is administered.


Subject(s)
Education, Medical, Graduate/methods , Emergency Medicine/education , Internship and Residency/methods , Program Evaluation , Simulation Training/methods , Surveys and Questionnaires , Canada , Humans
5.
CJEM ; 20(2): 284-292, 2018 03.
Article in English | MEDLINE | ID: mdl-28521849

ABSTRACT

OBJECTIVE: Education scholarship can be conducted using a variety of methods, from quantitative experiments to qualitative studies. Qualitative methods are less commonly used in emergency medicine (EM) education research but are well-suited to explore complex educational problems and generate hypotheses. We aimed to review the literature to provide resources to guide educators who wish to conduct qualitative research in EM education. METHODS: We conducted a scoping review to outline: 1) a list of journals that regularly publish qualitative educational papers; 2) an aggregate set of quality markers for qualitative educational research and scholarship; and 3) a list of quality checklists for qualitative educational research and scholarship. RESULTS: We found nine journals that have published more than one qualitative educational research paper in EM. From the literature, we identified 39 quality markers that were grouped into 10 themes: Initial Grounding Work (preparation, background); Goals, Problem Statement, or Question; Methods (general considerations); Sampling Techniques; Data Collection Techniques; Data Interpretation and Theory Generation; Measures to Optimize Rigour and Trustworthiness; Relevance to the Field; Evidence of Reflective Practice; Dissemination and Reporting. Lastly, five quality checklists were found for guiding educators in reporting their qualitative work. CONCLUSION: Many problems that EM educators face are well-suited to exploration using qualitative methods. The results of our scoping review provide publication venues, quality indicators, and checklists that may be useful to EM educators embarking on qualitative projects.


Subject(s)
Education, Medical/organization & administration , Emergency Medicine/education , Fellowships and Scholarships/standards , Guidelines as Topic , Qualitative Research , Humans
6.
J Grad Med Educ ; 9(4): 503-508, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28824766

ABSTRACT

BACKGROUND: Postgraduate medical education programs would benefit from a robust process for training and assessment of competence in resuscitation early in residency. OBJECTIVE: To describe and evaluate the Nightmares Course, a novel, competency-based, transitional curriculum and assessment program in resuscitation medicine at Queen's University in Kingston, Ontario, Canada. METHODS: First-year residents participated in the longitudinal Nightmares Course at Queen's University during the 2015-2016 academic year. An expert working group developed the entrustable professional activity and curricular design for the course. Formative feedback was provided following each simulation-based session, and we employed a summative objective structured clinical examination (OSCE) utilizing a modified Queen's Simulation Assessment Tool. A generalizability study and resident surveys were performed to evaluate the course and assessment process. RESULTS: A total of 40 residents participated in the course, and 23 (58%) participated in the OSCE. Eight of 23 (35%) did not meet the predetermined competency threshold and required remediation. The OSCE demonstrated an acceptable phi coefficient of 0.73. The approximate costs were $240 per Nightmares session, $10,560 for the entire 44-session curriculum, and $3,900 for the summative OSCE. CONCLUSIONS: The Nightmares Course demonstrated feasibility and acceptability, and is applicable to a broad array of postgraduate medical education programs. The entrustment-based assessment detected several residents not meeting a minimum competency threshold, and directed them to additional training.
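
As a rough illustration of how a dependability (phi) coefficient like the 0.73 reported above can be derived from a fully crossed generalizability study, the sketch below estimates variance components from a residents-by-stations score matrix. All scores are simulated, and the station count and score scale are assumptions, not values from the course.

```python
# Simulated sketch of a one-facet generalizability study (residents x
# stations, fully crossed, one score per cell) yielding an absolute (phi)
# dependability coefficient.
import numpy as np

def phi_coefficient(scores):
    """Phi from variance components of a persons x stations ANOVA."""
    n_p, n_s = scores.shape
    grand = scores.mean()
    ss_p = n_s * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_s = n_p * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_res = ((scores - grand) ** 2).sum() - ss_p - ss_s
    ms_p = ss_p / (n_p - 1)
    ms_s = ss_s / (n_s - 1)
    ms_res = ss_res / ((n_p - 1) * (n_s - 1))
    var_p = max((ms_p - ms_res) / n_s, 0.0)  # person (true-score) variance
    var_s = max((ms_s - ms_res) / n_p, 0.0)  # station difficulty variance
    # Absolute error: station plus residual variance, averaged over stations
    return var_p / (var_p + (var_s + ms_res) / n_s)

rng = np.random.default_rng(0)
ability = rng.normal(3.5, 0.6, size=(23, 1))           # 23 OSCE participants
scores = ability + rng.normal(0.0, 0.5, size=(23, 4))  # 4 hypothetical stations
print(f"phi = {phi_coefficient(scores):.2f}")
```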


Subject(s)
Clinical Competence , Curriculum , Educational Measurement/methods , Internship and Residency , Resuscitation/education , Simulation Training/methods , Dreams , Educational Measurement/standards , Humans , Longitudinal Studies , Ontario , Simulation Training/standards
7.
CJEM ; 19(S1): S9-S15, 2017 May.
Article in English | MEDLINE | ID: mdl-28508740

ABSTRACT

A key skill for successful clinician educators is the effective dissemination of scholarly innovations and research. Although there are many ways to disseminate scholarship, the most accepted and rewarded form of educational scholarship is publication in peer-reviewed journals. This paper provides direction for emergency medicine (EM) educators interested in publishing their scholarship via traditional peer-reviewed avenues. It builds upon four literature reviews that aggregated recommendations for writing and publishing high-quality quantitative and qualitative research, innovations, and reviews. Based on the findings from these literature reviews, the recommendations were prioritized for importance and relevance to novice clinician educators by a broad community of medical educators. The top items from the expert vetting process were presented to the 2016 Canadian Association of Emergency Physicians (CAEP) Academic Symposium Consensus Conference on Education Scholarship. This community of EM educators identified the highest yield recommendations for junior medical education scholars. This manuscript elaborates upon the top recommendations identified through this consensus-building process.


Subject(s)
Education, Medical/economics , Emergency Medicine/education , Fellowships and Scholarships/standards , Guidelines as Topic , Societies, Medical/organization & administration , Congresses as Topic , Humans , Peer Review
8.
AEM Educ Train ; 1(4): 293-300, 2017 Oct.
Article in English | MEDLINE | ID: mdl-30051047

ABSTRACT

OBJECTIVE: Simulation stands to play an important role in modern competency-based programs of assessment in postgraduate medical education. Our objective was to compare the performance of individual emergency medicine (EM) residents in a simulation-based resuscitation objective structured clinical examination (OSCE) using the Queen's Simulation Assessment Tool (QSAT), with portfolio assessment of clinical encounters using a modified in-training evaluation report (ITER), to understand in greater detail the inferences that may be drawn from a simulation-based OSCE assessment. METHODS: A prospective observational study was employed to explore the use of a multicenter simulation-based OSCE for evaluation of resuscitation competence. EM residents from five Canadian academic sites participated in the OSCE. Video-recorded performances were scored by blinded raters using the scenario-specific QSATs with domain-specific anchored scores (primary assessment, diagnostic actions, therapeutic actions, communication) and a global assessment score (GAS). Residents' portfolios were evaluated using a modified ITER subdivided by CanMEDS roles (medical expert, communicator, collaborator, leader, health advocate, scholar, and professional) and a GAS. Correlational and regression analyses were performed comparing components of each of the assessment methods. RESULTS: Portfolio review and ITER scoring were performed for 79 residents participating in the simulation-based OSCE. There was a significant positive correlation between total OSCE and ITER scores (r = 0.341). The strongest correlations were found between the ITER medical expert score and each of the OSCE GAS (r = 0.420), communication (r = 0.443), and therapeutic action (r = 0.484) domains. ITER medical expert was a significant predictor of OSCE total (p = 0.002). OSCE therapeutic action was a significant predictor of ITER total (p = 0.02). CONCLUSIONS: Simulation-based resuscitation OSCEs and portfolio assessment captured by ITERs appear to measure differing aspects of competence, with weak to moderate correlation between measures of conceptually similar constructs. In a program of competency-based assessment of EM residents, a simulation-based OSCE using the QSAT shows promise as a tool for assessing the medical expert and communicator roles.
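
A minimal sketch of the kind of correlational and regression analysis described above, using simulated scores; the variable names and data are hypothetical, not the study's.

```python
# Simulated sketch: Pearson correlation and simple linear regression
# between an ITER domain score and an OSCE total, as in the analyses above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_residents = 79
iter_medical_expert = rng.normal(4.0, 0.5, n_residents)  # hypothetical ITER domain
osce_total = 10 + 3 * iter_medical_expert + rng.normal(0, 2.5, n_residents)

# Pearson correlation between the ITER domain and the OSCE total
r, p = stats.pearsonr(iter_medical_expert, osce_total)
print(f"r = {r:.3f}, p = {p:.4f}")

# Simple linear regression: does the ITER domain predict the OSCE total?
fit = stats.linregress(iter_medical_expert, osce_total)
print(f"slope = {fit.slope:.2f}, R^2 = {fit.rvalue**2:.2f}, p = {fit.pvalue:.4f}")
```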

10.
Simul Healthc ; 10(2): 98-105, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25710317

ABSTRACT

INTRODUCTION: The use of high-fidelity simulation is emerging as an effective approach to competency-based assessment in medical education. We aimed to develop and validate a modifiable anchored global assessment scoring tool for simulation-based Objective Structured Clinical Examinations (OSCEs) of resuscitation competence in postgraduate emergency medicine (EM) trainees. METHODS: The Queen's Simulation Assessment Tool was developed using a modified Delphi technique with a panel of EM physicians. Ten standardized resuscitation OSCE scenarios were administered to EM trainees, and their video-recorded performances were scored by 3 independent and blinded EM attending physicians using the Queen's Simulation Assessment Tool. Correlational analyses and analysis of variance were applied to measure the discriminatory capabilities and interrater reliability of each scenario. A fully crossed generalizability study was conducted for each examination. RESULTS: Emergency medicine postgraduate trainees at Queen's University (n = 19-25 per station) participated in the study over 3 years. Interrater reliability showed acceptable levels of agreement for each scenario (mean Spearman ρ = 0.75 [0.63-0.87]; mean intraclass correlation coefficient, 0.69 [0.58-0.87]). Discriminatory validity was strong, with senior residents outperforming junior residents in all but 1 of the 10 scenarios. Generalizability studies identified the trainee and trainee-by-scenario interaction effects as the largest contributors to variance, with G coefficients ranging from 0.67 to 0.84. Resident trainees reported being comfortable with assessment in the simulation environment (3.8/5) and found the simulation-based examination valuable to their learning (4.6/5). CONCLUSIONS: This study describes the development and validation of a novel modifiable anchored global assessment scoring tool for simulation-based OSCE assessment of resuscitation competence in postgraduate EM trainees.
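
A simulated sketch of the two interrater-reliability metrics reported above: mean pairwise Spearman rho and ICC(2,1) for three raters. The rater count matches the abstract, but the scores, trainee count, and scale are assumptions.

```python
# Simulated sketch: interrater reliability for 3 raters scoring the same
# trainees, as mean pairwise Spearman rho and ICC(2,1) (two-way random
# effects, absolute agreement, single rater; Shrout & Fleiss).
import numpy as np
from itertools import combinations
from scipy import stats

def icc_2_1(x):
    """ICC(2,1) for x of shape (n_subjects, k_raters)."""
    n, k = x.shape
    grand = x.mean()
    ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
    ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
    ss_e = ((x - grand) ** 2).sum() - (n - 1) * ms_r - (k - 1) * ms_c
    ms_e = ss_e / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

rng = np.random.default_rng(2)
true_score = rng.uniform(2.0, 5.0, 22)                             # 22 trainees
ratings = np.clip(true_score + rng.normal(0, 0.4, (3, 22)), 1, 5)  # 3 raters

rhos = [stats.spearmanr(ratings[i], ratings[j])[0]
        for i, j in combinations(range(3), 2)]
print(f"mean pairwise Spearman rho = {np.mean(rhos):.2f}")
print(f"ICC(2,1) = {icc_2_1(ratings.T):.2f}")
```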


Subject(s)
Education, Medical/methods , Educational Measurement/methods , Emergency Medicine/education , Resuscitation/education , Simulation Training/methods , Clinical Competence , Educational Measurement/standards , Humans , Internship and Residency , Observer Variation , Reproducibility of Results , Simulation Training/standards
11.
CJEM ; 14(3): 139-46, 2012 May.
Article in English | MEDLINE | ID: mdl-22575294

ABSTRACT

OBJECTIVE: We sought to develop and validate a three-station simulation-based Objective Structured Clinical Examination (OSCE) tool to assess emergency medicine resident competency in resuscitation scenarios. METHODS: An expert panel of emergency physicians developed three scenarios for use with high-fidelity mannequins. For each scenario, a corresponding assessment tool was developed with an essential actions (EA) checklist and a global assessment score (GAS). The scenarios were (1) unstable ventricular tachycardia, (2) respiratory failure, and (3) ST elevation myocardial infarction. Emergency medicine residents were videotaped completing the OSCE, and three clinician experts independently evaluated the videotapes using the assessment tool. RESULTS: Twenty-one residents completed the OSCE (nine residents in the College of Family Physicians of Canada-Emergency Medicine [CCFP-EM] program, six junior residents in the Fellow of the Royal College of Physicians of Canada-Emergency Medicine [FRCP-EM] program, and six senior residents in the FRCP-EM program). Interrater reliability for the EA scores was good but varied between scenarios (Spearman rho = [1] 0.68, [2] 0.81, [3] 0.41). Interrater reliability for the GAS was also good, with less variability (rho = [1] 0.64, [2] 0.56, [3] 0.62). When comparing GAS scores, senior FRCP residents outperformed CCFP-EM residents in all scenarios and junior residents in two of three scenarios (p < 0.001 to 0.01). Based on EA scores, senior FRCP residents outperformed CCFP-EM residents, but junior residents outperformed senior FRCP residents in scenario 1 and CCFP-EM residents in all scenarios (p = 0.006 to 0.04). CONCLUSION: This study outlines the creation of a high-fidelity simulation assessment tool for trainees in emergency medicine. A single-point GAS demonstrated stronger relational validity and more consistent reliability compared with an EA checklist. This preliminary work will provide a foundation for ongoing development of simulation-based assessment tools.
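
The abstract reports between-group differences without naming the statistical test used. As a purely illustrative sketch, a nonparametric Kruskal-Wallis comparison of GAS scores across the three training groups might look like the following; all scores are simulated, and only the group sizes come from the abstract.

```python
# Illustrative only: the abstract does not specify its statistical test.
# Kruskal-Wallis comparison of simulated GAS scores across the three
# resident groups (CCFP-EM, junior FRCP-EM, senior FRCP-EM).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
gas_ccfp = np.clip(rng.normal(3.2, 0.5, 9), 1, 5)    # 9 CCFP-EM residents
gas_junior = np.clip(rng.normal(3.5, 0.5, 6), 1, 5)  # 6 junior FRCP-EM
gas_senior = np.clip(rng.normal(4.2, 0.4, 6), 1, 5)  # 6 senior FRCP-EM

h, p = stats.kruskal(gas_ccfp, gas_junior, gas_senior)
print(f"H = {h:.2f}, p = {p:.4f}")
```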


Subject(s)
Educational Measurement/methods , Emergency Medicine/education , Internship and Residency , Manikins , Resuscitation/education , Canada , Cross-Sectional Studies , Educational Measurement/standards , Humans , Intubation, Intratracheal , Myocardial Infarction/therapy , Observer Variation , Reference Standards , Reproducibility of Results , Respiratory Insufficiency/therapy , Tachycardia, Ventricular/therapy , Videotape Recording