Results 1 - 3 of 3
1.
Clin Teach ; 13(5): 352-6, 2016 Oct.
Article in English | MEDLINE | ID: mdl-26450346

ABSTRACT

BACKGROUND: Single-best answer (SBA) questions are widely used for assessment in medical schools; however, clinical staff often have neither the time nor the incentive to develop high-quality material for revision purposes. A student-led approach to producing formative SBA questions offers a potential solution. METHODS: Cardiff University School of Medicine students created a bank of SBA questions through a previously described staged approach, involving student question-writing, peer review and targeted senior clinician input. We arranged the questions into discrete tests and posted these online. Student volunteer performance on these tests from the 2012/13 cohort of final-year medical students was recorded and compared with these students' performance in medical school finals (knowledge and objective structured clinical examinations, OSCEs). In addition, we compared the performance of students who participated in question-writing groups with that of the rest of the cohort on the summative SBA assessment. RESULTS: Performance in the end-of-year summative clinical knowledge SBA paper correlated strongly with performance in the formative student-written SBA test (r = ~0.60, p < 0.01). There was no significant correlation between summative OSCE scores and formative student-written SBA test scores. Students who wrote and reviewed questions scored higher than average in the end-of-year summative clinical knowledge SBA paper. CONCLUSION: Student-written SBAs predict performance in end-of-year SBA examinations, and can therefore provide a valuable revision resource. There is potential for student-written questions to be incorporated into summative examinations.
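The correlation analysis reported above can be sketched as follows. This is a minimal illustration of computing Pearson's r between formative and summative scores; the score values and the `pearson_r` helper are invented for illustration and are not the study's data or code.

```python
# Sketch: Pearson correlation between formative (student-written SBA) test
# scores and summative exam scores. Scores are invented, not the study's data.
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

formative = [62, 55, 71, 48, 66, 59, 74, 51]   # formative SBA test scores (%)
summative = [65, 52, 69, 50, 70, 57, 78, 49]   # summative SBA paper scores (%)
print(f"r = {pearson_r(formative, summative):.2f}")
```

In practice a library routine such as `scipy.stats.pearsonr` would also return the p-value reported in the abstract.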


Subjects
Education, Medical/standards; Educational Measurement/methods; Students, Medical; Humans
2.
Adv Health Sci Educ Theory Pract ; 21(3): 571-85, 2016 Aug.
Article in English | MEDLINE | ID: mdl-26597452

ABSTRACT

In UK medical schools, five-option single-best answer (SBA) questions are the most widely accepted format of summative knowledge assessment. However, writing SBA questions with four effective incorrect options is difficult and time consuming, and consequently many SBAs contain a high frequency of implausible distractors. Previous research has suggested that fewer than five options could therefore be used for assessment without deterioration in quality. Despite an existing body of empirical research in this area, however, evidence from undergraduate medical education is sparse. The study investigated the frequency of non-functioning distractors in a sample of 480 summative SBA questions at Cardiff University. Distractor functionality was analysed, and then various question models were tested to investigate the impact of reducing the number of distractors per question on examination difficulty, reliability, discrimination and pass rates. A survey questionnaire was additionally administered to 108 students (33% response rate) to gain insight into their perceptions of these models. The simulation of various exam models revealed that, for four- and three-option SBA models, pass rates, reliability, and mean item discrimination remained relatively constant. The average percentage mark, however, consistently increased by 1-3% under the four- and three-option models. The questionnaire survey revealed that the student body had mixed views towards the proposed format change. This study is one of the first to comprehensively investigate distractor performance in SBA examinations in undergraduate medical education. It provides evidence to suggest that using three-option SBA questions would maximise efficiency whilst maintaining, or possibly improving, psychometric quality, through allowing a greater number of questions per exam paper.
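The distractor-functionality analysis described above can be sketched roughly as follows. A common criterion in the assessment literature, assumed here since the abstract does not state the study's exact threshold, counts a distractor as non-functioning when fewer than 5% of examinees select it; the function name and the response counts are illustrative.

```python
# Sketch: flag non-functioning distractors in an SBA item.
# Assumption: a distractor is "non-functioning" if chosen by < 5% of examinees
# (a conventional threshold; the study's exact criterion is not given here).

def nonfunctioning_distractors(option_counts, correct, threshold=0.05):
    """Return labels of distractors chosen by fewer than `threshold` of examinees.

    option_counts: dict mapping option label -> number of examinees choosing it.
    correct: label of the keyed (correct) option, excluded from the check.
    """
    total = sum(option_counts.values())
    return [opt for opt, n in option_counts.items()
            if opt != correct and n / total < threshold]

# Illustrative five-option item answered by 200 students:
counts = {"A": 120, "B": 45, "C": 25, "D": 7, "E": 3}
print(nonfunctioning_distractors(counts, correct="A"))  # → ['D', 'E']
```

Running such a check across a question bank gives the per-item distractor counts needed to simulate four- and three-option models.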


Subjects
Educational Measurement/methods; Education, Medical/methods; Education, Medical/standards; Humans; Reproducibility of Results; Schools, Medical; Surveys and Questionnaires
3.
Teach Learn Med ; 27(2): 182-8, 2015.
Article in English | MEDLINE | ID: mdl-25893940

ABSTRACT

PROBLEM: Multiple-choice questions (MCQs) are the main method of assessing medical student knowledge. As a result, there is high demand from medical students for formative MCQs. However, teaching staff rarely have the time or incentive to develop high-quality formative questions, focusing instead on material for high-stakes assessments. INTERVENTION: We have developed a novel student-led approach involving an interactive online question database, created by medical students for medical students. We adopted a staged approach to create an online bank of formative MCQs. First, students write MCQs following a standardized format. Questions are then peer-reviewed by other students, who discuss relevant clinical topics, guidelines, and journals to improve question quality. The questions are then scrutinized by specialist doctors and academics. Next, questions are piloted online. Finally, question performance is evaluated statistically. This 5-stage student-led process produced a bank of more than 200 MCQs in three months. CONTEXT: This intervention was carried out by two final-year medical student leads at Cardiff University School of Medicine, UK. Final-year students were recruited to write and peer-review questions, and senior content specialists were recruited from the department. After piloting and evaluation of the questions, the question bank was made available as a learning resource to all medical students at Cardiff University. OUTCOME: Objective analysis of the created MCQs (discrimination indices and distractor analysis) indicated that the random sample of questions piloted was of high quality. When the questions were made available as online tests to approximately 600 students, usage data revealed that 2,800 tests were taken over a 3-month period, indicating that the resource was popular. In addition, subjective feedback from student question writers/reviewers, gathered via free-text feedback forms, was invariably positive. We plan to continue the question-generation process in Cardiff and would encourage other medical schools to adopt this approach. LESSONS LEARNED: Our 5-stage approach can generate a large volume of high-quality MCQs with minimal teaching staff input, addressing student demand for formative MCQs. The project's benefits go beyond the creation of the resource, as involving students in the writing, review, and presentation of questions is itself pedagogically useful.
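The discrimination indices mentioned in the statistical evaluation stage can be sketched with the classical upper-lower index: the proportion of high scorers answering an item correctly minus the proportion of low scorers doing so. The 27% group split is a conventional choice assumed here, and the data and function name are illustrative, not the study's.

```python
# Sketch: classical upper-lower discrimination index for one MCQ item.
# Assumption: top/bottom 27% group split (a common convention; the study's
# exact method is not stated in the abstract).

def discrimination_index(item_correct, total_scores, fraction=0.27):
    """item_correct[i] is 1 if student i answered the item correctly;
    total_scores[i] is that student's overall test score."""
    n = len(total_scores)
    k = max(1, round(n * fraction))
    order = sorted(range(n), key=lambda i: total_scores[i])  # ascending by score
    lower, upper = order[:k], order[-k:]
    p_upper = sum(item_correct[i] for i in upper) / k
    p_lower = sum(item_correct[i] for i in lower) / k
    return p_upper - p_lower

# Illustrative data: 10 students, one item.
scores = [55, 40, 70, 35, 60, 45, 80, 50, 65, 75]   # overall test scores
item   = [1, 0, 1, 0, 1, 0, 1, 0, 1, 1]             # 1 = item answered correctly
print(discrimination_index(item, scores))  # → 1.0 (item discriminates perfectly)
```

Values near zero (or negative) would flag items for revision during the piloting stage.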


Subjects
Education, Medical, Undergraduate/methods; Educational Measurement/methods; Students, Medical; Surveys and Questionnaires/standards; Adult; Female; Humans; Internet; Learning; Male; Peer Group