Results 1 - 6 of 6
1.
Teach Learn Med ; 26(4): 373-8, 2014.
Article in English | MEDLINE | ID: mdl-25318033

ABSTRACT

BACKGROUND: The Comprehensive Clinical Science Self-Assessment (CCSSA) is a web-administered multiple-choice examination covering content typically taught during the core clinical clerkships in medical school. Because the content of CCSSA items resembles the content of items on Step 2 Clinical Knowledge (CK), CCSSA is intended as a tool to help students assess whether they are prepared for Step 2 CK and to familiarize themselves with its content, format, and pacing. PURPOSES: This study examined the relationship between performance on the National Board of Medical Examiners® CCSSA and performance on the United States Medical Licensing Examination® Step 2 CK for U.S./Canadian (USMGs) and international medical school students/graduates (IMGs). METHODS: The study included 9,789 participants who took CCSSA prior to their first Step 2 CK attempt. Linear and logistic regression analyses investigated the relationship between CCSSA performance and performance on Step 2 CK for both USMGs and IMGs. RESULTS: CCSSA scores explained 58% of the variation in first Step 2 CK scores for USMGs and 60% of the variation for IMGs; the relationship differed somewhat between the two groups, as indicated by statistically different intercepts and slopes for the regression lines fit to each group. Logistic regression results showed that examinees in both groups with low CCSSA scores were at higher risk of failing their first Step 2 CK attempt. CONCLUSIONS: Results suggest that CCSSA can provide students with a valuable practice tool and a realistic self-assessment of their readiness to take Step 2 CK.
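Several of the studies listed here report how much of the variation in Step scores a self-assessment explains. As an illustration only, the sketch below shows how that "percent of variation explained" (R²) is computed for a simple linear regression of a licensing-exam score on a self-assessment score. The data are synthetic and the slope, intercept, and noise level are invented; none of this reproduces the studies' actual data or models.

```python
import random

# Synthetic stand-ins for self-assessment and licensing-exam scores.
# The linear relationship (slope 0.9, intercept 30) and noise level
# are assumptions for illustration, not values from any study.
random.seed(0)
self_assessment = [random.gauss(500, 80) for _ in range(1000)]
step_score = [0.9 * x + 30 + random.gauss(0, 55) for x in self_assessment]

# Ordinary least-squares fit of step_score on self_assessment.
n = len(self_assessment)
mean_x = sum(self_assessment) / n
mean_y = sum(step_score) / n
sxx = sum((x - mean_x) ** 2 for x in self_assessment)
sxy = sum((x - mean_x) * (y - mean_y)
          for x, y in zip(self_assessment, step_score))
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# R^2 = 1 - SS_residual / SS_total: the fraction of score variation
# the regression explains (the studies report values like 0.58-0.67).
ss_res = sum((y - (slope * x + intercept)) ** 2
             for x, y in zip(self_assessment, step_score))
ss_tot = sum((y - mean_y) ** 2 for y in step_score)
r_squared = 1 - ss_res / ss_tot
print(f"R^2 = {r_squared:.2f}")
```

With these made-up parameters the fit explains roughly 60% of the variation, comparable in magnitude to the R² values the abstracts report; changing the noise level changes R² accordingly.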


Subject(s)
Clinical Competence , Education, Medical, Undergraduate/standards , Educational Measurement/methods , Self-Assessment , Clinical Clerkship , Female , Humans , Internet , Male , United States , Young Adult
2.
Acad Med ; 85(10 Suppl): S98-101, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20881715

ABSTRACT

BACKGROUND: This study examined the relationship between performance on the National Board of Medical Examiners Comprehensive Basic Science Self-Assessment (CBSSA) and performance on United States Medical Licensing Examination Step 1. METHOD: The study included 12,224 U.S. and Canadian medical school students who took CBSSA prior to their first Step 1 attempt. Linear and logistic regression analyses investigated the relationship between CBSSA performance and performance on Step 1, and how that relationship was related to interval between exams. RESULTS: CBSSA scores explained 67% of the variation in first Step 1 scores as the sole predictor variable and 69% of the variation when time between CBSSA attempt and first Step 1 attempt was also included as a predictor. Logistic regression results showed that examinees with low scores on CBSSA were at higher risk of failing their first Step 1 attempt. CONCLUSIONS: Results suggest that CBSSA can provide students with a realistic self-assessment of their readiness to take Step 1.


Subject(s)
Educational Measurement/methods , Educational Status , Licensure, Medical , Science/education , Self-Evaluation Programs , Canada , Clinical Competence , Education, Medical, Undergraduate , Humans , Regression Analysis , United States
3.
Med Teach ; 32(6): 503-8, 2010.
Article in English | MEDLINE | ID: mdl-20515382

ABSTRACT

BACKGROUND: Though progress tests have been used for several decades in various medical education settings, few studies have offered analytic frameworks that practitioners could use to model growth of knowledge as a function of curricular and other variables of interest. AIM: To explore the use of one form of progress testing in clinical education by modeling growth of knowledge in various disciplines and by assessing the impact of recent training (core rotation order) on performance, using hierarchical linear modeling (HLM) and analysis of variance (ANOVA) frameworks. METHODS: This study included performances across four test administrations occurring between July 2006 and July 2007 for 130 students from a US medical school who graduated in 2008. Measures-nested-in-examinees HLM growth curve analyses were run to estimate clinical science knowledge growth over time, and repeated measures ANOVAs were run to assess the effect of recent training on performance. RESULTS: Core rotation order was related to growth rates for total and pediatrics scores only. Additionally, scores were higher in a given discipline if training in that discipline had occurred immediately prior to the test administration. CONCLUSIONS: This study provides a useful progress testing framework for assessing medical students' growth of knowledge across their clinical science education and the related impact of training.


Subject(s)
Clinical Medicine/education , Educational Measurement/methods , Schools, Medical , Clinical Clerkship , Pilot Projects , United States
4.
Am J Obstet Gynecol ; 193(5): 1773-9, 2005 Nov.
Article in English | MEDLINE | ID: mdl-16260232

ABSTRACT

OBJECTIVE: The objective of this study was to investigate whether the essential elements of the Association of Professors of Gynecology and Obstetrics (APGO) Medical Student Educational Objectives were adequately represented on the National Board of Medical Examiners (NBME) obstetrics and gynecology subject examination, and whether the topics tested on that examination were covered by the APGO objectives. STUDY DESIGN: The Undergraduate Medical Education Committee of APGO and the NBME staff separately reviewed the same 2 NBME obstetrics and gynecology subject examinations. The questions were mapped to the 15 essential elements of the APGO educational objectives, and comparisons were made to assess how well they matched. RESULTS: All the essential elements of the educational objectives were covered by the NBME subject examination. Of the questions on the examination, 99% were deemed appropriate for medical students, with 70% of the questions mapping to "Priority 1" objectives. CONCLUSION: The NBME examination provides an appropriate assessment of mastery of what a medical student should learn, as represented by the APGO Medical Student Educational Objectives.


Subject(s)
Education, Medical, Undergraduate/standards , Gynecology/education , Obstetrics/education , Educational Measurement , Faculty , Societies, Medical , Specialty Boards , United States
5.
Acad Med ; 79(10 Suppl): S55-7, 2004 Oct.
Article in English | MEDLINE | ID: mdl-15383390

ABSTRACT

PROBLEM STATEMENT AND BACKGROUND: This study examined the extent to which performance on the NBME® Comprehensive Basic Science Self-Assessment (CBSSA) and NBME Comprehensive Clinical Science Self-Assessment (CCSSA) can be used to project performance on USMLE Step 1 and Step 2 examinations, respectively. METHOD: Subjects were 1,156 U.S./Canadian medical students who took either (1) the CBSSA and Step 1, or (2) the CCSSA and Step 2, between April 2003 and January 2004. Regression analyses examined the relationship between each self-assessment and the corresponding USMLE Step as a function of test administration conditions. RESULTS: The CBSSA explained 62% of the variation in Step 1 scores, while the CCSSA explained 56% of Step 2 score variation. In both samples, Standard-Paced conditions produced better estimates of future Step performance than Self-Paced ones. CONCLUSIONS: Results indicate that self-assessment examinations provide an accurate basis for predicting performance on the associated Step, with some variation in predictive accuracy across test administration conditions.


Subject(s)
Clinical Competence , Educational Measurement/methods , Licensure, Medical , Self-Evaluation Programs , Students, Medical , Canada , Cohort Studies , Education, Medical, Undergraduate , Feedback , Forecasting , Humans , Internet , Science/education , Time Factors , United States
6.
Anesth Analg ; 95(6): 1476-82, table of contents, 2002 Dec.
Article in English | MEDLINE | ID: mdl-12456404

ABSTRACT

UNLABELLED: A key element in developing a process to determine knowledge and ability in applying perioperative echocardiography has been an examination. We report on the development of a certifying examination in perioperative echocardiography. In addition, we tested the hypothesis that examination performance is related to clinical experience in echocardiography. Since 1995, more than 1200 participants have taken the examination, and more than 70% have passed. Overall examination performance was positively related to more than 3 months of training (or equivalent) in echocardiography and to performing and interpreting at least six examinations a week. We concluded that the certifying examination in perioperative echocardiography is a valid tool to help determine individual knowledge in the application of perioperative echocardiography. IMPLICATIONS: This report describes the process involved in developing the certifying transesophageal echocardiography examination and identifies correlates of examination performance.


Subject(s)
Anesthesiology/education , Certification , Echocardiography, Transesophageal , Humans , Knowledge