1.
Med Teach ; 32(6): 480-5, 2010.
Article in English | MEDLINE | ID: mdl-20515377

ABSTRACT

This collaborative project between the National Board of Medical Examiners and four schools in the UK is investigating the feasibility and utility of a cross-school progress testing program drawing on test material recently retired from the United States Medical Licensing Examination (USMLE) Step 2 Clinical Knowledge (CK) examination. This article describes the design of the progress test; the process used to build, translate (localize), review, and finalize test forms; the approach taken to web-based test administration; and the procedure used to calculate and report scores. Results to date have demonstrated that it is feasible to use test items written for the US licensing examination as a base for developing progress test forms for use in the UK. Some content areas can be localized more readily than others, and care is clearly needed in the review and revision of test materials to ensure that they are clinically appropriate and suitably phrased for use in the UK. Involvement of content experts in reviewing and vetting the test material is essential, and it is clearly desirable to supplement expert review with quality control procedures based on item statistics as a final check on the appropriateness of individual test items.


Subject(s)
Educational Measurement/standards , International Cooperation , Schools, Medical , Humans , Internet , Licensure, Medical , United Kingdom , United States
2.
Med Teach ; 32(6): 516-20, 2010.
Article in English | MEDLINE | ID: mdl-20515385

ABSTRACT

This collaborative project between the National Board of Medical Examiners (NBME) and Case Western Reserve University (CWRU) School of Medicine explored the design and use of cumulative achievement tests in basic science education. In cumulative achievement testing, integrative end-of-unit tests are deliberately constructed to systematically retest topics covered in previous units as well as material from the just-completed unit. CWRU faculty developed and administered a series of six web-based cumulative achievement tests using retired United States Medical Licensing Examination (USMLE) Step 1 test material and tools provided by NBME's Customized Assessment Services, and trends in student performance were examined as the new CWRU basic science curriculum unfolded. This article provides background information about test design and administration, as well as samples of score-reporting information for students and faculty. While firm conclusions about the effectiveness of cumulative achievement testing are not warranted after a pilot test at a single school, preliminary results suggest that cumulative achievement testing may be an effective complement to progress testing, with the former used to encourage retention of already-covered material and the latter used to assess growth toward the knowledge and skills expected of a graduating student.


Subject(s)
Cooperative Behavior , Education, Medical, Undergraduate , Educational Measurement , Clinical Competence/standards , Humans , Licensure, Medical , Program Development , United States
3.
Adv Health Sci Educ Theory Pract ; 11(1): 61-8, 2006 Feb.
Article in English | MEDLINE | ID: mdl-16583285

ABSTRACT

PURPOSE: In conjunction with curricular changes, a process to develop integrated examinations was implemented. Pre-established guidelines were provided favoring vignettes, clinically relevant material, and application of knowledge rather than simple recall. Questions were read aloud in a committee that included all course directors and a reviewer with National Board of Medical Examiners (NBME) item-writing and review experience. This study examines the effectiveness of this process in improving the quality of in-house examinations.

METHODS: Five hundred and twenty items were randomly selected from two academic years for initial comparison: 270 from 2000-2001 and 250 from 2001-2002. The first set of items represented the style, content, and format used when courses and tests were departmentally/discipline based, assembled by course directors, and administered separately. The latter group represented the same characteristics when courses and tests were organ-system based, committee reviewed, and administered in an integrated examination. Items were randomized, blinded for year of origin, and rated by three NBME staff members with extensive item-review experience. A five-point rating scale was used: one indicated a technically flawed item assessing recall of an isolated fact; five indicated a technically unflawed item assessing application of knowledge. To assess continued improvement, a follow-up set of 250 items from the 2002-2003 academic year was submitted to the same three reviewers, who were not informed of the purpose or origin of this set of test items.

RESULTS: The mean rating for items from 2000-2001 was 2.51 +/- 1.27; the analogous value for 2001-2002 was 3.16 +/- 1.33 (t = 5.83; p < 0.0001), and for 2002-2003, 3.59 +/- 1.15 (t = 10.11; p < 0.0001).

CONCLUSION: Pre-established guidelines and an interdisciplinary review process resulted in improved item quality for in-house examinations.


Subject(s)
Education, Medical/standards , Educational Measurement/standards , Internship and Residency/standards , Management Quality Circles , Professional Staff Committees , Education, Medical/organization & administration , Florida , Guidelines as Topic , Humans , Internship and Residency/organization & administration , Peer Review , Program Evaluation , Psychiatry/education , Specialty Boards