Results 1 - 20 of 25
1.
Perspect Med Educ ; 13(1): 201-223, 2024.
Article in English | MEDLINE | ID: mdl-38525203

ABSTRACT

Postgraduate medical education (PGME) is an essential societal enterprise that prepares highly skilled physicians for the health workforce. In recent years, PGME systems have been criticized worldwide for problems with variable graduate abilities, concerns about patient safety, and issues with teaching and assessment methods. In response, competency-based medical education approaches, with an emphasis on graduate outcomes, have been proposed as the direction for 21st-century health professions education. However, there are few published models of large-scale implementation of these approaches. We describe the rationale and design of a national, time-variable, competency-based, multi-specialty system for postgraduate medical education called Competence by Design. Fourteen innovations were bundled to create this new system, using the Van Melle Core Components of competency-based medical education as the basis for the transformation. The successful execution of this transformational training system shows that competency-based medical education can be implemented at scale. The lessons learned in the early implementation of Competence by Design can inform competency-based medical education innovation efforts across professions worldwide.


Subject(s)
Education, Medical , Medicine , Humans , Competency-Based Education/methods , Education, Medical/methods , Clinical Competence , Publications
2.
Perspect Med Educ ; 13(1): 95-107, 2024.
Article in English | MEDLINE | ID: mdl-38343556

ABSTRACT

Program evaluation is an essential, but often neglected, activity in any transformational educational change. Competence by Design (CBD) was a large-scale change initiative to implement a competency-based, time-variable educational system in Canadian postgraduate medical education. A program evaluation strategy was an integral part of the build and implementation plan for CBD from the beginning, providing insights into implementation progress, challenges, unexpected outcomes, and impact. The CBD program evaluation strategy was built upon a logic model and three pillars of evaluation: readiness to implement, fidelity and integrity of implementation, and outcomes of implementation. The strategy drew on both internally driven studies and studies performed by partners and other invested groups. A program evaluation dashboard was created to transparently display a real-time view of CBD implementation and to facilitate continuous adaptation and improvement. The findings of the program evaluation drove changes to all aspects of CBD implementation, aided engagement of partners, supported change management, and deepened our understanding of the journey required for transformational educational change in a complex national postgraduate medical education system. The CBD program evaluation strategy provides a framework for program evaluation for any large-scale change in health professions education.


Subject(s)
Competency-Based Education , Education, Medical , Humans , Canada , Program Evaluation , Curriculum
4.
Can Med Educ J ; 14(1): 4-12, 2023 03.
Article in English | MEDLINE | ID: mdl-36998506

ABSTRACT

Background: The CanMEDS physician competency framework will be updated in 2025. The revision occurs during a time of disruption and transformation to society, healthcare, and medical education caused by the COVID-19 pandemic and growing acknowledgement of the impacts of colonialism, systemic discrimination, climate change, and emerging technologies on healthcare and training. To inform this revision, we sought to identify emerging concepts in the literature related to physician competencies. Methods: Emerging concepts were defined as ideas discussed in the literature related to the roles and competencies of physicians that are absent or underrepresented in the 2015 CanMEDS framework. We conducted a literature scan, title and abstract review, and thematic analysis to identify emerging concepts. Metadata for all articles published in five medical education journals between October 1, 2018 and October 1, 2021 were extracted. Fifteen authors performed a title and abstract review to identify and label underrepresented concepts. Two authors thematically analyzed the results to identify emerging concepts. A member check was conducted. Results: Of the 4973 included articles, 1017 (20.5%) discussed an emerging concept. The thematic analysis identified ten themes: Equity, Diversity, Inclusion, and Social Justice; Anti-racism; Physician Humanism; Data-Informed Medicine; Complex Adaptive Systems; Clinical Learning Environment; Virtual Care; Clinical Reasoning; Adaptive Expertise; and Planetary Health. All themes were endorsed by the authorship team as emerging concepts. Conclusion: This literature scan identified ten emerging concepts to inform the 2025 revision of the CanMEDS physician competency framework. Open publication of this work will promote greater transparency in the revision process and support an ongoing dialogue on physician competence. Writing groups have been recruited to elaborate on each of the emerging concepts and how they could be further incorporated into CanMEDS 2025.




Subject(s)
COVID-19 , Education, Medical , Physicians , Humans , Pandemics , Clinical Competence , Education, Medical/methods
5.
Med Teach ; 45(8): 802-815, 2023 08.
Article in English | MEDLINE | ID: mdl-36668992

ABSTRACT

BACKGROUND: Competency-based medical education (CBME) received increased attention in the early 2000s from educators, clinicians, and policy makers as a way to address concerns about physician preparedness and patient safety in a rapidly changing healthcare environment. Opinions and perspectives around this shift in medical education vary and, to date, a systematic search and synthesis of the literature has yet to be undertaken. The aim of this scoping review is to present a comprehensive map of the literary conversations surrounding CBME. METHODS: Twelve different databases were searched from database inception up until 29 April 2020. Literary conversations were extracted into the following categories: perceived advantages, perceived disadvantages, challenges/uncertainties/skepticism, and recommendations related to CBME. RESULTS: Of the 5757 identified records, 387 were included in this review. Through thematic analysis, eight themes were identified in the literary conversations about CBME: credibility, application, community influence, learner impact, assessment, educational developments, organizational structures, and societal impacts of CBME. Content analysis supported the development of a heat map that provides a visual illustration of the frequency of these literary conversations over time. CONCLUSIONS: This review serves two purposes for the medical education research community. First, this review acts as a comprehensive historical record of the shifting perceptions of CBME as the construct was introduced and adopted by many groups in the global medical education community over time. Second, this review consolidates the many literary conversations about CBME that followed the initial proposal for this approach. These findings can facilitate understanding of CBME for multiple audiences both within and outside of the medical education research community.


Subject(s)
Education, Medical , Physicians , Humans , Competency-Based Education , Curriculum , Attitude
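
The heat map described in this scoping review is, at its core, a count of coded literary conversations per theme per publication year. The following is a minimal sketch of that kind of visualization; the theme labels are taken from the abstract, but the records, years, and tooling are invented for illustration and are not the authors' data or methods.

```python
# Hypothetical theme-by-year frequency heat map, similar in spirit to the
# content-analysis heat map described above. All records are invented.
import pandas as pd
import matplotlib.pyplot as plt

# One row per coded record (theme label, publication year).
records = pd.DataFrame({
    "theme": ["credibility", "assessment", "learner impact", "assessment",
              "credibility", "societal impacts", "assessment", "application"],
    "year": [2002, 2005, 2005, 2010, 2010, 2015, 2015, 2019],
})

# Cross-tabulate how often each theme appears in each year.
freq = pd.crosstab(records["theme"], records["year"])

fig, ax = plt.subplots()
im = ax.imshow(freq.values, aspect="auto", cmap="Reds")
ax.set_xticks(range(len(freq.columns)))
ax.set_xticklabels(freq.columns)
ax.set_yticks(range(len(freq.index)))
ax.set_yticklabels(freq.index)
ax.set_xlabel("Publication year")
ax.set_ylabel("Theme")
fig.colorbar(im, ax=ax, label="Number of records")
fig.tight_layout()
plt.show()
```
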
6.
Med Teach ; 44(8): 886-892, 2022 08.
Article in English | MEDLINE | ID: mdl-36083123

ABSTRACT

PURPOSE: Organizational readiness is critical for successful implementation of an innovation. We evaluated program readiness to implement Competence by Design (CBD), a model of competency-based medical education (CBME), among Canadian postgraduate training programs. METHODS: A survey of program directors was distributed 1 month prior to CBD implementation in 2019. Questions were informed by the R = MC² framework of organizational readiness and addressed program motivation, general capacity for change, and innovation-specific capacity. An overall readiness score was calculated. An ANOVA was conducted to compare overall readiness between disciplines. RESULTS: The survey response rate was 42% (n = 79). The mean overall readiness score was 74% (range 30%-98%). There was no difference in scores between disciplines. The majority of respondents agreed that successful implementation of CBD was a priority (74%), and that their leadership (94%) and faculty and residents (87%) were supportive of change. Fewer perceived that CBD was a move in the right direction (58%) and that implementation was a manageable change (53%). Curriculum mapping, competence committees, and programmatic assessment activities were completed by >90% of programs, while <50% had engaged off-service disciplines. CONCLUSION: Our study highlights important areas where programs excelled in their preparation for CBD, as well as common challenges that serve as targets for future intervention to improve program readiness for CBD implementation.


Subject(s)
Competency-Based Education , Education, Medical , Canada , Curriculum , Humans , Leadership
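
As a rough illustration of the analysis reported above (an overall readiness score summarized across programs and compared between disciplines with a one-way ANOVA), here is a minimal sketch. The item scaling, discipline groupings, and numbers are assumptions invented for illustration; they do not reproduce the study's instrument or results.

```python
# Hypothetical sketch: aggregate Likert survey items into an overall readiness
# percentage per program, then compare disciplines with a one-way ANOVA.
import numpy as np
from scipy.stats import f_oneway

def readiness_percent(item_scores, max_score=5):
    """Mean of Likert items (assumed scored 1..max_score) rescaled to 0-100%."""
    items = np.asarray(item_scores, dtype=float)
    return 100 * (items.mean() - 1) / (max_score - 1)

# Invented per-program item responses, grouped by (hypothetical) discipline.
surgery    = [readiness_percent(p) for p in ([4, 5, 4, 3], [3, 3, 4, 4], [5, 5, 4, 5])]
medicine   = [readiness_percent(p) for p in ([2, 3, 3, 3], [4, 4, 5, 4], [3, 4, 3, 3])]
pediatrics = [readiness_percent(p) for p in ([4, 4, 4, 4], [3, 5, 4, 4], [2, 3, 3, 4])]

all_scores = surgery + medicine + pediatrics
print(f"Mean overall readiness: {np.mean(all_scores):.0f}% "
      f"(range {min(all_scores):.0f}-{max(all_scores):.0f}%)")

# One-way ANOVA across disciplines, analogous to the comparison reported above.
f_stat, p_value = f_oneway(surgery, medicine, pediatrics)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
```
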
7.
Med Teach ; 44(7): 781-789, 2022 07.
Article in English | MEDLINE | ID: mdl-35199617

ABSTRACT

PURPOSE: This study evaluated the fidelity of competence committee (CC) implementation in Canadian postgraduate specialist training programs during the transition to competency-based medical education (CBME). METHODS: A national survey of CC chairs was distributed to all CBME training programs in November 2019. Survey questions were derived from guiding documents published by the Royal College of Physicians and Surgeons of Canada reflecting intended processes and design. RESULTS: Response rate was 39% (113/293) with representation from all eligible disciplines. Committee size ranged from 3 to 20 members, 42% of programs included external members, and 20% included a resident representative. Most programs (72%) reported that a primary review and synthesis of resident assessment data occurs prior to the meeting, with some data reviewed collectively during meetings. When determining entrustable professional activity (EPA) achievement, most programs followed the national specialty guidelines closely with some exceptions (53%). Documented concerns about professionalism, EPA narrative comments, and EPA entrustment scores were most highly weighted when determining resident progress decisions. CONCLUSIONS: Heterogeneity in CC implementation likely reflects local adaptations, but may also explain some of the variable challenges faced by programs during the transition to CBME. Our results offer educational leaders important fidelity data that can help inform the larger evaluation and transformation of CBME.


Subject(s)
Internship and Residency , Physicians , Canada , Clinical Competence , Competency-Based Education , Humans , Specialization
8.
Med Teach ; 43(7): 751-757, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34410891

ABSTRACT

The ongoing adoption of competency-based medical education (CBME) across health professions training draws focus to learner-centred educational design and the importance of fostering a growth mindset in learners, teachers, and educational programs. An emerging body of literature addresses the instructional practices and features of learning environments that foster the skills and strategies necessary for trainees to be partners in their own learning and progression to competence and to develop skills for lifelong learning. Aligned with this emerging area is an interest in Dweck's self theory and the concept of the growth mindset. The growth mindset is an implicit belief held by an individual that intelligence and abilities are changeable, rather than fixed and immutable. In this paper, we present an overview of the growth mindset and how it aligns with the goals of CBME. We describe the challenges associated with shifting away from the fixed mindset of most traditional medical education assumptions and practices and discuss potential solutions and strategies at the individual, relational, and systems levels. Finally, we present future directions for research to better understand the growth mindset in the context of CBME.


Subject(s)
Competency-Based Education , Education, Medical , Health Occupations , Humans , Learning
9.
Med Teach ; 43(7): 794-800, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34121596

ABSTRACT

There is an urgent need to capture the outcomes of the ongoing global implementation of competency-based medical education (CBME). However, the measurement of downstream outcomes following educational innovations such as CBME is fraught with challenges stemming from the complexities of medical training, the breadth and variability of inputs, and the difficulty of attributing outcomes to specific educational elements. In this article, we present a logic model for CBME to conceptualize an impact pathway relating to CBME and facilitate outcomes evaluation. We further identify six strategies to mitigate the challenges of outcomes measurement: (1) clearly identify the outcome of interest, (2) distinguish between outputs and outcomes, (3) carefully consider attribution versus contribution, (4) connect outcomes to the fidelity and integrity of implementation, (5) pay attention to unanticipated outcomes, and (6) embrace methodological pluralism. Embracing these challenges, we argue that careful and thoughtful evaluation strategies will move us forward in answering the all-important question: Are the desired outcomes of CBME being achieved?


Subject(s)
Competency-Based Education , Education, Medical , Humans
10.
Med Teach ; 43(7): 788-793, 2021 Jul.
Article in English | MEDLINE | ID: mdl-34038673

ABSTRACT

As the global transformation of postgraduate medical training continues, there are persistent calls for program evaluation efforts to understand the impact and outcomes of competency-based medical education (CBME) implementation. The measurement of a complex educational intervention such as CBME is challenging because of the multifaceted nature of activities and outcomes. What is needed, therefore, is an organizational taxonomy to both conceptualize and categorize multiple outcomes. In this manuscript we propose a taxonomy that builds on preceding works to organize CBME outcomes across three domains: focus (educational, clinical), level (micro, meso, macro), and timeline (training, transition to practice, practice). We also provide examples of how to conceptualize outcomes of educational interventions across medical specialties using this taxonomy. By proposing a shared language for outcomes of CBME, we hope that this taxonomy will help organize ongoing evaluation work and catalyze those seeking to engage in the evaluation effort to help understand the impact and outcomes of CBME.


Subject(s)
Curriculum , Education, Medical , Competency-Based Education , Humans , Language , Program Evaluation
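
To make the proposed three-domain taxonomy concrete, the sketch below tags two invented example outcomes along the focus, level, and timeline domains named in the abstract. The domain labels come from the abstract; the outcome descriptions and the data structure itself are hypothetical illustrations, not the authors' tooling.

```python
# Illustrative tagging of CBME outcomes along the three taxonomy domains
# named in the abstract (focus, level, timeline). Example outcomes are invented.
from dataclasses import dataclass
from enum import Enum

class Focus(Enum):
    EDUCATIONAL = "educational"
    CLINICAL = "clinical"

class Level(Enum):
    MICRO = "micro"   # individual learner or patient
    MESO = "meso"     # program or institution
    MACRO = "macro"   # system or society

class Timeline(Enum):
    TRAINING = "training"
    TRANSITION = "transition to practice"
    PRACTICE = "practice"

@dataclass
class Outcome:
    description: str
    focus: Focus
    level: Level
    timeline: Timeline

outcomes = [
    Outcome("Residents receive more frequent workplace feedback",
            Focus.EDUCATIONAL, Level.MICRO, Timeline.TRAINING),
    Outcome("Complication rates of new graduates in their first year of practice",
            Focus.CLINICAL, Level.MACRO, Timeline.PRACTICE),
]

for o in outcomes:
    print(f"{o.description}: {o.focus.value} / {o.level.value} / {o.timeline.value}")
```
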
11.
Acad Med ; 96(9): 1332-1336, 2021 09 01.
Article in English | MEDLINE | ID: mdl-33769339

ABSTRACT

PURPOSE: Competency-based assessment, using entrustable professional activities (EPAs), is rapidly being implemented worldwide without sufficient agreement on the essential elements of EPA-based assessment. The rapidity of implementation has left little time to understand what works in what circumstances and why or why not. The result is the attempted execution of a complex service intervention without a shared mental model for features needed to remain true to implementing an EPA assessment framework as intended. The purpose of this study was to identify the essential core components necessary to maintain integrity in the implementation of this intended intervention. METHOD: A formal consensus-building technique, the Delphi process, was used to identify core components for implementing an EPA-based assessment framework. Twelve EPA experts from the United States, Canada, and the Netherlands participated in this process in February and March 2020. In each Delphi round, participants rated possible core components on a scale from 1 to 6, with 1 reflecting the worst fit and 6 the best fit for EPA-based assessment implementation. Predetermined automatic inclusion and exclusion criteria for candidate core components were set at ≥ 80% of participants assigning a value of 5 or 6 and ≥ 80% assigning a value of 1 or 2, respectively. RESULTS: After 3 rounds, participants prioritized 10 of 19 candidate core components for inclusion: performance prediction, shared local mental model, workplace assessment, high-stakes entrustment decisions, outcomes based, value of the collective, informed clinical competency committee members, construct alignment, qualitative data, and entrustment decision consequences. The study closed after 3 rounds on the basis of the rankings and comments. CONCLUSIONS: Using the core components identified in this study advances efforts to implement an EPA assessment framework intervention as intended, which mitigates the likelihood of making an incorrect judgment that the intervention demonstrates negative results.


Subject(s)
Clinical Competence/standards , Competency-Based Education/standards , Educational Measurement/standards , Implementation Science , Outcome and Process Assessment, Health Care/standards , Canada , Consensus , Delphi Technique , Humans , Netherlands , United States
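
The Delphi inclusion and exclusion criteria described in the methods reduce to simple threshold checks on the distribution of 1-6 ratings. A minimal sketch of that rule follows, assuming invented ratings; the second component name and all votes below are placeholders, not the study's data.

```python
# Hypothetical sketch of the Delphi decision rule described above:
# include a candidate core component if >= 80% of participants rate it 5 or 6,
# exclude it if >= 80% rate it 1 or 2; otherwise carry it to the next round.

def classify(ratings, threshold=0.80):
    """Classify one candidate component from its 1-6 Delphi ratings."""
    n = len(ratings)
    high = sum(r >= 5 for r in ratings) / n
    low = sum(r <= 2 for r in ratings) / n
    if high >= threshold:
        return "include"
    if low >= threshold:
        return "exclude"
    return "carry to next round"

# Invented ratings from 12 participants for two candidate components.
votes = {
    "workplace assessment": [6, 5, 6, 6, 5, 5, 6, 6, 5, 6, 4, 5],
    "annual written exam":  [2, 1, 2, 3, 1, 2, 2, 1, 2, 2, 5, 1],
}

for component, ratings in votes.items():
    print(f"{component}: {classify(ratings)}")
```
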
12.
J Eval Clin Pract ; 26(4): 1087-1095, 2020 Aug.
Article in English | MEDLINE | ID: mdl-31820556

ABSTRACT

RATIONALE, AIMS, AND OBJECTIVES: Programmatic assessment has been identified as a system-oriented approach to achieving the multiple purposes for assessment within Competency-Based Medical Education (CBME, i.e., formative, summative, and program improvement). While there are well-established principles for designing and evaluating programs of assessment, few studies illustrate and critically interpret, what a system of programmatic assessment looks like in practice. This study aims to use systems thinking and the 'two communities' metaphor to interpret a model of programmatic assessment and to identify challenges and opportunities with operationalization. METHOD: An interpretive case study was used to investigate how programmatic assessment is being operationalized within one competency-based residency program at a Canadian university. Qualitative data were collected from residents, faculty, and program leadership via semi-structured group and individual interviews conducted at nine months post-CBME implementation. Data were analyzed using a combination of data-based inductive analysis and theory-derived deductive analysis. RESULTS: In this model, Academic Advisors had a central role in brokering assessment data between communities responsible for producing and using residents' performance information for decision making (i.e., formative, summative/evaluative, and program improvement). As system intermediaries, Academic Advisors were in a privileged position to see how the parts of the assessment system contributed to the functioning of the whole and could identify which system components were not functioning as intended. Challenges were identified with the documentation of residents' performance information (i.e., system inputs); use of low-stakes formative assessments to inform high-stakes evaluative judgments about the achievement of competence standards; and gaps in feedback mechanisms for closing learning loops. CONCLUSIONS: The findings of this research suggest that program stakeholders can benefit from a systems perspective regarding how their assessment practices contribute to the efficacy of the system as a whole. Academic Advisors are well positioned to support educational development efforts focused on overcoming challenges with operationalizing programmatic assessment.


Subject(s)
Competency-Based Education , Internship and Residency , Canada , Clinical Competence , Feedback , Humans , Learning
13.
Acad Med ; 95(5): 786-793, 2020 05.
Article in English | MEDLINE | ID: mdl-31625995

ABSTRACT

PURPOSE: Despite the broad endorsement of competency-based medical education (CBME), myriad difficulties have arisen in program implementation. The authors sought to evaluate the fidelity of implementation and identify early outcomes of CBME implementation using Rapid Evaluation to facilitate transformative change. METHOD: Case-study methodology was used to explore the lived experience of implementing CBME in the emergency medicine postgraduate program at Queen's University, Canada, using iterative cycles of Rapid Evaluation in 2017-2018. After the intended implementation was explicitly described, stakeholder focus groups and interviews were conducted at 3 and 9 months post-implementation to evaluate the fidelity of implementation and early outcomes. Analyses were abductive, using the CBME core components framework and data-driven approaches to understand stakeholders' experiences. RESULTS: In comparing planned with enacted implementation, important themes emerged with resultant opportunities for adaption. For example, lack of a shared mental model resulted in frontline difficulty with assessment and feedback and a concern that the granularity of competency-focused assessment may result in "missing the forest for the trees," prompting the return of global assessment. Resident engagement in personal learning plans was not uniformly adopted, and learning experiences tailored to residents' needs were slow to follow. CONCLUSIONS: Rapid Evaluation provided critical insights into the successes and challenges of operationalizing CBME. Implementing the practical components of CBME was perceived as a sprint, while realizing the principles of CBME and changing culture in postgraduate training was a marathon requiring sustained effort in the form of frequent evaluation and continuous faculty and resident development.


Subject(s)
Competency-Based Education/standards , Program Development/standards , Program Evaluation/methods , Time Factors , Canada , Competency-Based Education/statistics & numerical data , Focus Groups/methods , Humans , Interviews as Topic/methods , Program Development/statistics & numerical data , Program Evaluation/standards , Program Evaluation/statistics & numerical data , Qualitative Research
14.
Acad Med ; 94(7): 1002-1009, 2019 07.
Article in English | MEDLINE | ID: mdl-30973365

ABSTRACT

PURPOSE: The rapid adoption of competency-based medical education (CBME) provides an unprecedented opportunity to study implementation. Examining "fidelity of implementation"-that is, whether CBME is being implemented as intended-is hampered, however, by the lack of a common framework. This article details the development of such a framework. METHOD: A two-step method was used. First, a perspective indicating how CBME is intended to bring about change was described. Accordingly, core components were identified. Drawing from the literature, the core components were organized into a draft framework. Using a modified Delphi approach, the second step examined consensus amongst an international group of experts in CBME. RESULTS: Two different viewpoints describing how a CBME program can bring about change were found: production and reform. Because the reform model was most consistent with the characterization of CBME as a transformative innovation, this perspective was used to create a draft framework. Following the Delphi process, five core components of CBME curricula were identified: outcome competencies, sequenced progression, tailored learning experiences, competency-focused instruction, and programmatic assessment. With some modification in wording, consensus emerged amongst the panel of international experts. CONCLUSIONS: Typically, implementation evaluation relies on the creation of a specific checklist of practices. Given the ongoing evolution and complexity of CBME, this work, however, focused on identifying core components. Consistent with recent developments in program evaluation, where implementation is described as a developmental trajectory toward fidelity, identifying core components is presented as a fundamental first step toward gaining a more sophisticated understanding of implementation.


Subject(s)
Competency-Based Education/standards , Education, Medical/standards , Program Evaluation/methods , Competency-Based Education/methods , Education, Medical/methods , Humans
15.
Med Teach ; 41(7): 811-818, 2019 07.
Article in English | MEDLINE | ID: mdl-30955390

ABSTRACT

Purpose: Adopting competency-based medical education (CBME) is challenging in medicine. It mandates a change in processes and approach, and ultimately a change in institutional culture, with stakeholders ideally embracing and valuing the new processes. Adopting the transformational change model, this study describes the shift in assessment culture by Academic Advisors (AAs) and preceptors over three years of CBME implementation in one Department of Family Medicine. Methods: A qualitative grounded theory method was used for this two-part study. Interviews were conducted with 12 AAs in 2013 and nine AAs in 2016 using similar interview questions. Data were analyzed through a constant comparative method. Results: Three overarching themes emerged from the data: (1) specific identified shifts in assessment culture, (2) factors supporting the shifts in culture, and (3) outcomes related to the culture shift. Conclusions: In both parts of the study, participants noted that assessment took more time and effort. In Part 2, however, the effort was mitigated by a sense of value for all stakeholders. With support from the mandate of regulatory bodies, local leadership, the department, faculty development, and an electronic platform, a cultural transformation occurred in assessment that enhanced learning and teaching, the use of embedded standards for performance decisions, and the tracking and documentation of performance.


Subject(s)
Competency-Based Education/organization & administration , Education, Medical/organization & administration , Educational Measurement/methods , Competency-Based Education/standards , Education, Medical/standards , Educational Measurement/standards , Faculty, Medical/organization & administration , Grounded Theory , Humans , Leadership , Organizational Culture
16.
Teach Learn Med ; 31(3): 307-318, 2019.
Article in English | MEDLINE | ID: mdl-30554529

ABSTRACT

Problem: Medical educators recognize that professionalism is difficult to teach to students in lecture-based or faculty-led settings. An underused but potentially valuable alternative is to enroll near-peers to teach professionalism. Intervention: We describe a novel near-peer curriculum on professionalism developed at Queen's University School of Medicine. Senior medical students considered role models by their classmates were nominated to facilitate small-group seminars with junior students on topics in professionalism. Each session was preceded by brief pre-readings or prompts and engaged students in semistructured, open-ended discussion. Three 2-hour sessions have occurred annually. Context: The near-peer sessions are a required component (6 hours; 20%) of the 1st-year professionalism course at Queen's University (30 hours), which otherwise includes faculty-led seminars, lectures, and online modules. Senior facilitators are selected through a peer nomination process during their 3rd year of medical school. This format was chosen to create a highly regarded position to which students could aspire by demonstrating positive professionalism. Outcome: We performed a qualitative descriptive evaluation of the near-peer curriculum. Fifty-six medical students participated in 11 focus group interviews, which were coded and analyzed for themes inductively and deductively. Quantitative reviews of student feedback forms and a third-party thematic analysis were performed to triangulate results. Medical students preferred the near-peer-led discussion-based curriculum to faculty-led seminars and didactic or online formats. Junior students could describe specific examples of how the curriculum had influenced their behavior in academic, clinical, and personal settings. They cited senior near-peer facilitators as the strongest aspect of the curriculum for their social and cognitive congruence. Senior students who had facilitated sessions regarded the peer teaching experience as formative to their own understanding of professionalism. Lessons Learned: Formal medical curricula on professionalism should emphasize near-peer-led small-group discussion as it fosters a nuanced understanding of professionalism for both early level students and senior students acting as teachers.


Subject(s)
Education, Medical, Undergraduate/methods , Group Processes , Peer Group , Professionalism , Students, Medical/psychology , Female , Humans , Male , Ontario
17.
MedEdPublish (2016) ; 7: 85, 2018.
Article in English | MEDLINE | ID: mdl-38089223

ABSTRACT

Health professions education is undergoing a major paradigm shift to competency-based medical education. When shifts in thinking are profound and result in transformations of existing paradigms, there is often accompanying criticism. While competency-based medical education is an evidence-guided change in approach to curriculum and assessment, it is not immune to critique and concerns. Some criticisms are valid and must be addressed as competency-based medical education is implemented; other concerns raised about competency-based medical education highlight the importance of clarity in language and purpose when discussing new paradigms. In this commentary, we aim to offer a balanced view of competency-based medical education by presenting an overview of its origins and conceptual assumptions and acknowledging valid criticisms of the approach.

18.
Acad Med ; 92(8): 1151-1159, 2017 08.
Article in English | MEDLINE | ID: mdl-28746138

ABSTRACT

PURPOSE: To examine the effectiveness of co-learning, wherein faculty and trainees learn together, as a novel approach for building quality improvement (QI) faculty capacity. METHOD: From July 2012 through September 2015, the authors conducted 30 semistructured interviews with 23 faculty participants from the Co-Learning QI Curriculum of the Department of Medicine, Faculty of Medicine, University of Toronto, and collected descriptive data on faculty participation and resident evaluations of teaching effectiveness. Interviewees were from 13 subspecialty residency programs at their institution. RESULTS: Of the 56 faculty participants, the Co-Learning QI Curriculum trained 29 faculty mentors, 14 of whom taught formally. Faculty leads with an academic QI role, many of whom had prior QI training, reinforced their QI knowledge while also developing QI mentorship and teaching skills. Co-learning elements that contributed to QI teaching skills development included seeing first how the QI content is taught, learning through project mentorship, building experience longitudinally over time, a graded transition toward independent teaching, and a supportive program lead. Faculty with limited QI experience reported improved QI knowledge, skills, and project facilitation but were ambivalent about assuming a teacher role. Unplanned outcomes for both groups included QI teaching outside of the curriculum, applying QI principles to other work, networking, and strengthening one's QI professional role. CONCLUSIONS: The Co-Learning QI Curriculum was effective in improving faculty QI knowledge and skills and increased faculty capacity to teach and mentor QI. Findings suggest that a combination of curriculum and contextual factors were critical to realizing the curriculum's full potential.


Subject(s)
Education, Medical, Graduate/organization & administration , Faculty, Medical/education , Health Personnel/education , Internal Medicine/education , Internship and Residency/organization & administration , Quality Improvement , Staff Development/organization & administration , Adult , Curriculum , Female , Humans , Longitudinal Studies , Male , Middle Aged , Ontario , Organizational Innovation
19.
Acad Med ; 92(6): 752-758, 2017 06.
Article in English | MEDLINE | ID: mdl-28557934

ABSTRACT

Competency-based medical education (CBME) aims to bring about the sequential acquisition of competencies required for practice. Although it is being adopted in centers of medical education around the globe, there is little evidence concerning whether, in comparison with traditional methods, CBME produces physicians who are better prepared for the practice environment and contributes to improved patient outcomes. Consequently, the authors, an international group of collaborators, wrote this article to provide guidance regarding the evaluation of CBME programs. CBME is a complex service intervention consisting of multiple activities that contribute to the achievement of a variety of outcomes over time. For this reason, it is difficult to apply traditional methods of program evaluation, which require conditions of control and predictability, to CBME. To address this challenge, the authors describe an approach that makes explicit the multiple potential linkages between program activities and outcomes. Referred to as contribution analysis (CA), this theory-based approach to program evaluation provides a systematic way to make credible causal claims under conditions of complexity. Although CA has yet to be applied to medical education, the authors describe how a six-step model and a postulated theory of change could be used to examine the link between CBME, physicians' preparation for practice, and patient care outcomes. The authors argue that adopting the methods of CA, particularly the rigor in thinking required to link program activities, outcomes, and theory, will serve to strengthen understanding of the impact of CBME over time.


Subject(s)
Clinical Competence/standards , Competency-Based Education/standards , Curriculum/standards , Education, Medical, Undergraduate/standards , Thinking , Adult , Female , Humans , Male , Young Adult
20.
Acad Med ; 91(2): 191-8, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26630606

ABSTRACT

The decision to trust a medical trainee with the critical responsibility to care for a patient is fundamental to clinical training. When carefully and deliberately made, such decisions can serve as significant stimuli for learning and also shape the assessment of trainees. Holding back entrustment decisions too much may hamper the trainee's development toward unsupervised practice. When carelessly made, however, they jeopardize patient safety. Entrustment decision-making processes, therefore, deserve careful analysis. Members (including the authors) of the International Competency-Based Medical Education Collaborative conducted a content analysis of the entrustment decision-making process in health care training during a two-day summit in September 2013 and subsequently reviewed the pertinent literature to arrive at a description of the critical features of this process, which informs this article. The authors discuss theoretical backgrounds and terminology of trust and entrustment in the clinical workplace. The competency-based movement and the introduction of entrustable professional activities force educators to rethink the grounds for assessment in the workplace. Anticipating a decision to grant autonomy at a designated level of supervision appears to align better with health care practice than do most current assessment practices. The authors distinguish different modes of trust and entrustment decisions and elaborate five categories, each with related factors, that determine when decisions to trust trainees are made: the trainee, supervisor, situation, task, and the relationship between trainee and supervisor. The authors' aim in this article is to lay a theoretical foundation for a new approach to workplace training and assessment.


Subject(s)
Clinical Competence , Competency-Based Education/methods , Decision Making , Education, Medical, Graduate/methods , Internship and Residency/methods , Interprofessional Relations , Humans