Results: 1 - 3 of 3
1.
BMC Med Educ; 22(1): 771, 2022 Nov 09.
Article in English | MEDLINE | ID: mdl-36352441

ABSTRACT

INTRODUCTION: One of the challenges in medical education is effectively assessing basic science knowledge retention. National Board of Medical Examiners (NBME) clerkship subject exam performance reflects the basic science knowledge accrued during preclinical education. The aim of this study was to determine whether students' retention of basic science knowledge during the clerkship years can be analyzed using a cognitive diagnostic assessment (CDA) of NBME subject exam data. METHODS: We acquired a customized NBME item analysis report of our institution's pediatric clerkship subject exams for the period 2017-2020 and developed a question-by-content Q-matrix by identifying the skills necessary to master each content area. As a pilot study, students' content mastery in 12 major basic science content areas was analyzed using a CDA model called DINA (deterministic input, noisy "and" gate). RESULTS: The results allowed us to identify strong and weak basic science content areas for students in the pediatric clerkship. For example, "Reproductive systems" and "Skin and subcutaneous tissue" showed student mastery of 83.8 ± 2.2% and 60.7 ± 3.2%, respectively. CONCLUSIONS: Our pilot study demonstrates how this new technique can be applied to quantitatively measure students' basic science knowledge retention during any clerkship. Combined data from all the clerkships will allow comparisons of specific content areas and identification of individual variations between different clerkships. In addition, the same technique can be used to analyze internal assessments, thereby creating an opportunity for longitudinal tracking of student performance. Detailed analyses like these can guide specific curricular changes and drive continuous quality improvement in the undergraduate medical school curriculum.


Subjects
Clinical Clerkship, Undergraduate Medical Education, Medical Students, Humans, Child, Educational Measurement, Coroners and Medical Examiners, Pilot Projects, Curriculum, Clinical Competence, Undergraduate Medical Education/methods
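
For readers unfamiliar with the DINA model referenced in the abstract above, the sketch below illustrates its "noisy-and" item response function. The Q-matrix, slip, and guess values are hypothetical placeholders, not the authors' data or analysis; the sketch only shows how a binary mastery profile combines with a Q-matrix to predict the probability of answering each exam item correctly.

import numpy as np

# Minimal illustration of the DINA item response function:
# a student answers item j correctly with probability (1 - slip_j)
# if they have mastered every skill the Q-matrix requires for that
# item, and with probability guess_j (a lucky guess) otherwise.

# Hypothetical Q-matrix: 4 exam items x 3 basic-science content areas
# (1 = the item requires mastery of that content area).
Q = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
])

# Hypothetical per-item slip and guess parameters.
slip = np.array([0.10, 0.15, 0.20, 0.10])
guess = np.array([0.25, 0.20, 0.15, 0.30])

def correct_response_prob(alpha, Q, slip, guess):
    """P(correct) for each item, given a 0/1 mastery profile alpha."""
    # eta_j = 1 only if every skill required by item j is mastered ("noisy and").
    eta = np.all((Q == 0) | (alpha == 1), axis=1).astype(float)
    return (1 - slip) ** eta * guess ** (1 - eta)

# A student who has mastered the first two content areas but not the third.
alpha = np.array([1, 1, 0])
print(correct_response_prob(alpha, Q, slip, guess))
# -> [0.9  0.85 0.8  0.3 ]

In a full CDA such as the one described in the abstract, the slip and guess parameters and the distribution of mastery profiles would be estimated from the item response data rather than assumed, and the Q-matrix would map each NBME item to its basic science content areas.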
2.
MedEdPublish (2016); 12: 39, 2022.
Article in English | MEDLINE | ID: mdl-37064860

ABSTRACT

Coronavirus disease 2019 (COVID-19) drastically disrupted daily life and abruptly forced curricular modifications in undergraduate medical education. Regardless of their level of preparedness, medical schools moved instruction online for students to access remotely. Similarly, accreditation visits by the Liaison Committee on Medical Education (LCME) were moved from in-person to virtual formats during these chaotic times. Little guidance was available for transitioning to the new process. Medical schools scheduled for a virtual survey visit were required to pivot without prior experience in preparing for and conducting these high-stakes online visits. New processes needed to be developed to successfully navigate a virtual accreditation visit. To date, there has been nothing in the literature from those who have served on LCME survey teams or from medical schools that have undergone a virtual survey visit. This article recounts one medical school's experience with its 2021 LCME virtual visit and makes recommendations to consider when planning for such a significant event. The future of virtual visits is also considered, as the format has benefits, including the elimination of travel and the associated time and cost. Still, the perspectives of others who have participated in a virtual LCME accreditation visit should be studied. While the LCME will return to in-person visits in 2022-23, it is important for medical schools to share their experiences and lessons learned from their virtual accreditation visits should the need arise to reinstate virtual visits in the future.

3.
MedEdPublish (2016); 10: 37, 2021.
Article in English | MEDLINE | ID: mdl-38486513

ABSTRACT

This article was migrated. The article was marked as recommended. Objective: Reviewing assessment questions to ensure their quality is critical for properly assessing student performance. The purpose of this study was to identify the processes used by medical schools to review questions used in internal assessments. Methods: The authors recruited professionals involved in the writing and/or review of questions for their medical school's internal assessments to participate in this study. The survey was administered electronically via an anonymous link, and participation was solicited through the DR-ED listserv, an electronic discussion group for medical educators. Responses were collected over a two-week period, and one reminder was sent to increase the response rate. The instrument comprised one demographic question, two closed-ended questions, and two open-ended questions. Results: Thirty-nine respondents completed the survey, of whom 22 provided the name of their institution/medical school. Of those who self-identified, no two respondents appeared to be from the same institution, and participants represented institutions across the United States, with two from other countries. The majority (n=32, 82%) of respondents indicated they had a process to review student assessment questions. Most participants reported that faculty and course/block directors were responsible for reviewing assessment questions, while some indicated that a committee or group of faculty was responsible for review. Most reported focusing equally on content/accuracy, formatting, and grammar. Over 81% (n=22) of respondents indicated they used NBME resources to guide review, and fewer than 19% (n=5) used internally developed writing guides. Conclusions: The results of this study show that medical schools use a wide range of item review strategies and a variety of tools to guide their review. These results offer insight to other medical schools that do not have processes in place to review assessment questions or that are looking to expand on current procedures.
