Results 1 - 11 of 11
1.
Acad Med ; 87(10): 1443-9, 2012 Oct.
Article in English | MEDLINE | ID: mdl-22914524

ABSTRACT

PURPOSE: To investigate whether a spaced-education (SE) game can be an effective means of teaching core content to medical students and a reliable and valid method of assessing their knowledge.

METHOD: This nine-month trial (2008-2009) enrolled students from three U.S. medical schools. The SE game consisted of 100 validated multiple-choice questions with explanations in preclinical/clinical domains. Students were e-mailed two questions daily. Adaptive game mechanics re-sent questions in three or six weeks if answered incorrectly or correctly, respectively. Questions expired if not answered on time (appointment dynamic). Students retired questions by answering each correctly twice consecutively (progression dynamic). Posting of relative performance fostered competition. Main outcome measures were baseline and completion scores.

RESULTS: Seven hundred thirty-one students enrolled. Median baseline score was 53% (interquartile range [IQR] 16) and varied significantly by year (P<.001, dmax=2.08), school (P<.001, dmax=0.75), and gender (P<.001, d=0.38). Median completion score was 93% (IQR 12) and varied significantly by year (P=.001, dmax=1.12), school (P<.001, dmax=0.34), and age (P=.019, dmax=0.43). Scores did not differ significantly between years 3 and 4. Seventy percent of enrollees (513/731) requested to participate in future SE games.

CONCLUSIONS: An SE game is an effective and well-accepted means of teaching core content and a reliable and valid method to assess student knowledge. SE games may be valuable tools to identify and remediate students who could benefit from additional educational support.
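The appointment and progression dynamics described in this abstract can be sketched as a small scheduler. This is a hypothetical reconstruction of the mechanics as described, not the study's actual software; the class, field names, and day-based intervals are illustrative assumptions.

```python
from dataclasses import dataclass

# Re-send intervals from the abstract: three weeks after an incorrect
# answer, six weeks after a correct one (expressed here in days).
RESEND_INCORRECT_DAYS = 21
RESEND_CORRECT_DAYS = 42

@dataclass
class Question:
    qid: int
    streak: int = 0          # consecutive correct answers
    next_send_day: int = 0   # appointment dynamic: when to re-send
    retired: bool = False    # progression dynamic: done when True

def record_answer(q: Question, day: int, correct: bool) -> None:
    """Update a question's schedule after an answer."""
    if correct:
        q.streak += 1
        if q.streak >= 2:    # retired after two correct in a row
            q.retired = True
            return
        q.next_send_day = day + RESEND_CORRECT_DAYS
    else:
        q.streak = 0         # an incorrect answer resets the streak
        q.next_send_day = day + RESEND_INCORRECT_DAYS
```

Under these assumptions, a question answered correctly, then incorrectly, must again be answered correctly twice in a row before it leaves the pool.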


Subject(s)
Computer-Assisted Instruction/methods , Education, Medical, Undergraduate/methods , Educational Measurement/methods , Electronic Mail , Surveys and Questionnaires , Adult , Competitive Behavior , Female , Humans , Male , Models, Educational , Multivariate Analysis , Prospective Studies , Reproducibility of Results , United States
2.
Ann Surg ; 256(1): 33-8, 2012 Jul.
Article in English | MEDLINE | ID: mdl-22664558

ABSTRACT

OBJECTIVE: To assess the efficacy of a "spaced-education" game as a method of continuing medical education (CME) among physicians across the globe.

BACKGROUND: The efficacy of educational games for CME has yet to be established. We created a novel online educational game by incorporating game mechanics into "spaced education" (SE), an evidence-based method of online CME.

METHODS: This 34-week randomized trial enrolled practicing urologists across the globe. The SE game consisted of 40 validated multiple-choice questions and explanations on urology clinical guidelines. Enrollees were randomized to 2 cohorts: cohort A physicians were sent 2 questions via an automated e-mail system every 2 days, and cohort B physicians were sent 4 questions every 4 days. Adaptive game mechanics re-sent the questions in 12 or 24 days if answered incorrectly or correctly, respectively. Questions expired if not answered on time (appointment dynamic). Physicians retired questions by answering each correctly twice in a row (progression dynamic). Competition was fostered by posting relative performance among physicians. Main outcome measures were baseline scores (percentage of questions answered correctly upon initial presentation) and completion scores (percentage of questions retired).

RESULTS: A total of 1470 physicians from 63 countries enrolled. Median baseline score was 48% (interquartile range [IQR] 17) and, in multivariate analyses, was found to vary significantly by region (Cohen dmax = 0.31, P = 0.001) and age (dmax = 0.41, P < 0.001). Median completion score was 98% (IQR 25) and varied significantly by age (dmax = 0.21, P < 0.001) and American Board of Urology certification (d = 0.10, P = 0.033) but not by region (multivariate analyses). Question clustering reduced physicians' performance (d = 0.43, P < 0.001). Seventy-six percent of enrollees (1111/1470) requested to participate in future SE games.

CONCLUSIONS: An online SE game can substantially improve guidelines knowledge and is a well-accepted method of global CME delivery.


Subject(s)
Education, Medical, Continuing/methods , Teaching/methods , Urology/education , Adult , Educational Measurement/methods , Electronic Mail , Humans , Middle Aged , Multivariate Analysis , Online Systems , Retention, Psychology
3.
J Am Coll Surg ; 214(3): 367-73, 2012 Mar.
Article in English | MEDLINE | ID: mdl-22225647

ABSTRACT

BACKGROUND: While games are frequently used in resident education, there is little evidence supporting their efficacy. We investigated whether a spaced-education (SE) game can be both a reliable and valid method of assessing residents' knowledge and an effective means of teaching core content.

STUDY DESIGN: The SE game consisted of 100 validated multiple-choice questions and explanations on core urology content. Residents were sent 2 questions each day via e-mail. Adaptive game mechanics re-sent the questions in 2 or 6 weeks if answered incorrectly or correctly, respectively. Questions expired if not answered on time (appointment dynamic). Residents retired questions by answering each correctly twice in a row (progression dynamic). Competition was fostered by posting relative performance among residents. Main outcome measures were baseline scores (percentage of questions answered correctly on initial presentation) and completion scores (percentage of questions retired).

RESULTS: Nine hundred thirty-one US and Canadian residents enrolled in the 45-week trial. Cronbach alpha reliability for the SE baseline scores was 0.87. Baseline scores (median 62%, interquartile range [IQR] 17%) correlated with scores on the 2008 American Urological Association in-service examination (ISE08), the 2009 American Board of Urology qualifying examination (QE09), and the ISE09 (r = 0.76, 0.46, and 0.64, respectively; all p < 0.001). Baseline scores varied by sex, country, medical degree, and year of training (all p ≤ 0.001). Completion scores (median 100%, IQR 2%) correlated with ISE08 and ISE09 scores (r = 0.35, p < 0.001 for both). Seventy-two percent of enrollees (667 of 931) requested to participate in future SE games.

CONCLUSIONS: An SE game is a reliable and valid means of assessing residents' knowledge and a well-accepted method by which residents can master core content.


Subject(s)
Educational Measurement/methods , Internship and Residency , Online Systems , Adult , Canada , Female , Humans , Male , Prospective Studies , United States , Urology/education , Video Games
4.
J Urol ; 186(2): 634-7, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21683391

ABSTRACT

PURPOSE: The American Urological Association In-Service Examination and the American Board of Urology Qualifying Examination are written multiple-choice tests that cover all domains of urology. We investigated whether In-Service Examination performance could identify chief residents who scored in the lowest quartile on the Qualifying Examination.

MATERIALS AND METHODS: All urology chief residents in the United States and Canada in 2008 and 2009 were eligible to participate in this study. In-Service Examination 2008 and Qualifying Examination 2009 performance data were obtained from the American Urological Association and the American Board of Urology, respectively. Data were analyzed with the Pearson correlation and descriptive statistics.

RESULTS: Of the 257 American and Canadian chief residents who completed the Qualifying Examination 2009, 194 (75%) enrolled in this study and were included in the analysis. Overall In-Service Examination 2008 scores correlated significantly with Qualifying Examination 2009 scores (r=0.55, p<0.001), accounting for 30% of score variance. Substantial variability in In-Service Examination-Qualifying Examination rankings was notable among individual residents. An In-Service Examination 2008 cutoff percentile rank of 40% identified chief residents in the lowest quartile on the Qualifying Examination 2009 with 71% sensitivity and 77% specificity, corresponding to positive and negative likelihood ratios of 3.1 and 0.4, respectively.

CONCLUSIONS: The substantial variability of In-Service Examination-Qualifying Examination performance among individual chief residents limits the In-Service Examination's predictive utility. A single In-Service Examination score should not be used to make a high-stakes judgment about an individual resident. In-Service Examination scores should be used as one part of an overall evaluation program to prospectively identify residents who could benefit from additional educational support.
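The likelihood ratios reported in this abstract follow directly from the sensitivity and specificity by the standard definitions; a quick check of the arithmetic (variable names are illustrative):

```python
sensitivity = 0.71  # lowest-quartile residents correctly flagged by the cutoff
specificity = 0.77  # other residents correctly not flagged

# Standard definitions: LR+ = sens / (1 - spec); LR- = (1 - sens) / spec
lr_positive = sensitivity / (1 - specificity)
lr_negative = (1 - sensitivity) / specificity

print(round(lr_positive, 1), round(lr_negative, 1))  # 3.1 0.4
```

So a below-cutoff In-Service score roughly triples the odds that a resident lands in the lowest Qualifying Examination quartile, while an above-cutoff score reduces those odds by about 60%.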


Subject(s)
Clinical Competence , Education, Medical, Graduate , Internship and Residency , Urology/education , Adult , Canada , Educational Measurement , Female , Forecasting , Humans , Male , United States
5.
Acad Med ; 86(3): 300-6, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21248600

ABSTRACT

PURPOSE: U.S. medical students will soon complete only one licensure examination sequence, given near the end of medical school. Thus, schools are challenged to identify poorly performing students before this high-stakes test and to help them retain knowledge across the duration of medical school. The authors investigated whether online spaced education progress-testing (SEPT) could achieve both aims.

METHOD: Participants were 2,648 students from four U.S. medical schools; 120 multiple-choice questions and explanations in preclinical and clinical domains were developed and validated. For 34 weeks, students randomized to longitudinal progress-testing alone (LPTA) received four new questions (with answers/explanations) each week. Students randomized to SEPT received the identical four questions each week, plus two-week and six-week cycled reviews of the questions/explanations. During weeks 31-34, the initial 40 questions were re-sent to students to assess longer-term retention.

RESULTS: Of the 1,067 students enrolled, the 120-question progress test was completed by 446 (84%) and 392 (74%) of the LPTA and SEPT students, respectively. Cronbach alpha reliability was 0.87. Scores were 39.9%, 51.9%, 58.7%, and 58.8% for students in years 1-4, respectively. Performance correlated with Step 1 and Step 2 Clinical Knowledge scores (r = 0.52 and 0.57, respectively; P < .001) and prospectively identified students scoring below the mean on Step 1 with 75% sensitivity, 77% specificity, and 41% positive predictive value. Cycled reviews generated a 170% increase in learning retention relative to baseline (P < .001, effect size 0.95).

CONCLUSIONS: SEPT can identify poorly performing students and improve their longer-term knowledge retention.
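Unlike sensitivity and specificity, the positive predictive value reported above depends on how common below-mean Step 1 scores were in the tested sample. A sketch of the standard Bayes relation, using the reported figures to back out the implied pretest prevalence (all names and the round-trip check are illustrative, not from the study):

```python
sens, spec, ppv = 0.75, 0.77, 0.41  # figures reported in the abstract

def ppv_from_prevalence(p: float) -> float:
    """Positive predictive value via Bayes' theorem for prevalence p."""
    return sens * p / (sens * p + (1 - spec) * (1 - p))

# Invert the relation: posttest odds = pretest odds * LR+
lr_positive = sens / (1 - spec)
posttest_odds = ppv / (1 - ppv)
pretest_odds = posttest_odds / lr_positive
implied_prevalence = pretest_odds / (1 + pretest_odds)  # roughly 0.18
```

The implied prevalence of about 18% is consistent with "below the mean" flagging a minority of this sample; at a 50% prevalence, the same sensitivity and specificity would yield a much higher PPV.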


Subject(s)
Clinical Competence , Computer-Assisted Instruction , Education, Distance , Education, Medical/organization & administration , Needs Assessment , Adult , Female , Humans , Longitudinal Studies , Male , Predictive Value of Tests , Reproducibility of Results , Retention, Psychology , United States , Young Adult
6.
J Am Coll Surg ; 211(3): 331-337.e1, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20800189

ABSTRACT

BACKGROUND: Retention of learning from surgical training is often limited, especially if the knowledge and skills are used infrequently. Using histopathology diagnostic skills as an experimental system, we compared knowledge transfer and retention between bolus Web-based teaching (WBT) modules and online spaced education, a novel e-mail-based method of online education founded on the spacing effect.

STUDY DESIGN: All US urology residents were eligible to participate. Enrollees were randomized to 1 of 2 cohorts. Cohort 1 residents received 3 cycles/repetitions of spaced education on prostate-testis histopathology (weeks 1 to 16) and 3 WBT modules on bladder-kidney (weeks 14 to 16). Cohort 2 residents received 3 cycles of spaced education on bladder-kidney (weeks 1 to 16) and 3 WBT modules on prostate-testis (weeks 14 to 16). Each daily spaced-education e-mail presented a clinical scenario with a histopathology image and asked for a diagnosis. Participants received immediate feedback after submitting their answers. Each cycle/repetition was 4 weeks long and consisted of 20 questions with unique images. WBT used the identical content and delivery system, with questions aggregated into three 20-question modules. Long-term retention of all 4 topics was assessed during weeks 18 to 45.

RESULTS: Seven hundred twenty-four urology residents enrolled. Spaced education and WBT were completed by 77% and 66% of residents, respectively. Spaced education and WBT generated mean long-term score increases of 15.2% (SD 15.3%) and 3.4% (SD 16.3%), respectively (p < 0.01). Spaced education increased long-term learning efficiency 4-fold.

CONCLUSIONS: Online spaced education generates transfer of histopathology diagnostic skills and substantially improves their long-term retention. Additional research is needed to determine how spaced education can optimize learning, transfer, and retention of surgical skills.


Subject(s)
Clinical Competence , Computer-Assisted Instruction , Diagnosis , Electronic Mail , Internet , Internship and Residency , Retention, Psychology , Urology/education , Adult , Female , Humans , Internship and Residency/methods , Internship and Residency/organization & administration , Internship and Residency/trends , Male , Pathology/education , Time Factors , United States
7.
J Urol ; 177(4): 1481-7, 2007 Apr.
Article in English | MEDLINE | ID: mdl-17382760

ABSTRACT

PURPOSE: We investigated whether an online educational program based on spacing-effect principles could significantly improve the acquisition and retention of medical knowledge.

MATERIALS AND METHODS: In this randomized, controlled trial involving urology residents in the United States and Canada, participants randomized to cohort 1 (bolus education) were e-mailed a validated set of 96 study questions on 4 urology topic areas in June 2005. Residents in cohort 2 (spaced education) were sent daily educational e-mails during 27 weeks (June to December 2005), each of which contained 1 or 2 study questions presented in a repeating, spaced pattern. In November 2005 participants completed the Urology In-Service Examination. Participants were also randomized to 1 of 5 outcome cohorts, which completed a 32-item online test at staggered time points (1 to 14 weeks) after completion of the spaced-education program.

RESULTS: Of 537 participants, 400 (74%) completed the online staggered tests and 515 (96%) completed the In-Service Examination. Residents in the spaced-education cohort demonstrated significantly greater online test scores than those in the bolus cohort (ANOVA p <0.001). One-way ANOVA with trend analysis revealed that online test scores for the spaced-education cohort remained stable, with no significant differences over time, while test scores in the bolus cohort demonstrated a significant linear decrease (p = 0.007). The specific learning gains attributable to spaced education were robust when controlling for use of the study materials, but they did not generalize to higher scores on the In-Service Examination.

CONCLUSIONS: Online spaced education improves the acquisition and retention of clinical knowledge.


Subject(s)
Internet , Internship and Residency/methods , Urology/education , Canada , Humans , United States
8.
Acad Med ; 81(3): 224-30, 2006 Mar.
Article in English | MEDLINE | ID: mdl-16501262

ABSTRACT

PURPOSE: To investigate the impact of an adjuvant Web-based teaching program on medical students' learning during clinical rotations.

METHOD: From April 2003 to May 2004, 351 students completing clinical rotations in surgery-urology at four U.S. medical schools were invited to volunteer for the study. Web-based teaching cases were developed covering four core urologic topics. Students were block randomized to receive Web-based teaching on two of the four topics. Before and after a designated duration at each institution (ranging from one to three weeks), students completed a validated 28-item Web-based test (Cronbach's alpha = .76) covering all four topics. The test was also administered to a subset of students at one school at the conclusion of their third year to measure long-term learning.

RESULTS: Eighty-one percent of all eligible students (286/351) volunteered to participate in the study, 73% of whom (210/286) completed the Web-based program. Compared to controls, Web-based teaching significantly increased test scores in the four topics at each medical school (p < .001, mixed analysis of variance), corresponding to a Cohen's d effect size of 1.52 (95% confidence interval [CI], 1.23-1.80). Learning efficiency was increased three-fold by Web-based teaching (Cohen's d effect size 1.16; 95% CI 1.13-1.19). Students tested a median of 4.8 months later demonstrated significantly higher scores for Web-based teaching compared to non-Web-based teaching (p = .007, paired t-test). Limited learning was noted in the absence of Web-based teaching.

CONCLUSIONS: This randomized controlled trial provides Class I evidence that Web-based teaching as an adjunct to clinical experiences can significantly and durably improve medical students' learning.
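The effect sizes quoted in this abstract are Cohen's d values, which express a score difference in pooled standard-deviation units. A minimal sketch of the standard pooled-SD formula; the function and the example numbers are illustrative, not the study's data or exact computation:

```python
import math

def cohens_d(m1: float, s1: float, n1: int,
             m2: float, s2: float, n2: int) -> float:
    """Cohen's d for two groups, using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical example: treatment mean 10 (SD 2, n 50) vs control
# mean 8 (SD 2, n 50) gives d = 1.0, i.e. a one-SD improvement.
print(cohens_d(10, 2, 50, 8, 2, 50))
```

By the conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), the d of 1.52 reported above is a very large effect.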


Subject(s)
Education, Medical/methods , Internet , Learning , Female , General Surgery/education , Humans , Male , Schools, Medical , Students, Medical , Urology/education
9.
J Urol ; 172(1): 278-81, 2004 Jul.
Article in English | MEDLINE | ID: mdl-15201794

ABSTRACT

PURPOSE: After the development and implementation of a novel urology curriculum for medical students, we evaluated urological learning by medical students using a validated measure of learning in the 4 clinical areas of benign prostatic hyperplasia, erectile dysfunction, prostate cancer, and prostate specific antigen screening.

MATERIALS AND METHODS: Third-year medical students completed an online validated pre-test and post-test immediately before and after the mandatory 1-week clinical rotation in urology. Online pre-surveys and post-surveys were also administered. Overall student participation was 90% (37 of 41), with 63% of students (26 of 41) completing all 4 tests and surveys.

RESULTS: Student overall test scores improved significantly upon completion of the 1-week clinical rotation in urology (p <0.001). A trend toward increased learning by male students was identified (p = 0.07). Significant variation in exposure to outpatient clinics and in the performance of physical examination skills was observed among the different teaching sites.

CONCLUSIONS: This study demonstrates significant learning by medical students during their 1-week clinical rotation in urology. Further data are needed to confirm the trend toward increased learning by males and to elucidate its etiology. Scheduling changes have been implemented to address the inconsistencies across clinical sites.


Subject(s)
Clinical Clerkship/standards , Curriculum , Program Development , Urology/education , Adult , Boston , Clinical Competence , Competency-Based Education , Educational Measurement , Female , Humans , Male , Models, Educational , Program Evaluation
10.
J Urol ; 172(1): 282-5, 2004 Jul.
Article in English | MEDLINE | ID: mdl-15201795

ABSTRACT

PURPOSE: To date, published efforts to assess and improve medical student learning in urology have been limited by the lack of an assessment tool with which to measure student learning. We report the development of a validated measure of medical student learning in urology.

MATERIALS AND METHODS: Four core topics in clinical urology were selected as the focus of test development: prostate cancer, screening with prostate specific antigen, benign prostatic hyperplasia, and erectile dysfunction. Detailed curricula and multiple-choice questions were created for each topic. Content validity of the curriculum and the 28-item examination was established by a panel of 2 urologists and 2 medical physicians. Instrument reliability was determined by administering the test online to third-year surgery students. Test construct validity was established through its administration to 19 urology residents and attending physicians.

RESULTS: Reliability of the 28-item test instrument, measured by Cronbach's alpha, was 0.76, and its 1-week test-retest reliability was 0.72. All urology experts performed well on the test. Mean urological expert scores were significantly higher than mean student post-test scores (24.9 +/- 2.1 vs 17.8 +/- 3.8, 2-tailed t test p <0.001). Urological experts with more residency training had higher scores than those with less.

CONCLUSIONS: This study documents the development of a validated measure of medical student learning in urology. This validated instrument has the potential to improve educational quality control at medical schools and to facilitate the development of effective, evidence-based teaching methods.
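The Cronbach's alpha figure used above to quantify the 28-item test's reliability is the standard internal-consistency statistic, computed from an examinee-by-item score matrix. A minimal illustrative implementation (not the study's analysis code):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (n_examinees, n_items) score matrix,
    e.g. 0/1 responses on a 28-item multiple-choice test.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

When items covary strongly (examinees who do well on one item do well on the others), the total-score variance dominates the summed item variances and alpha approaches 1; values around 0.76, as reported here, indicate acceptable internal consistency for a classroom-level instrument.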


Subject(s)
Clinical Clerkship/standards , Clinical Competence , Educational Measurement , Urology/education , Adult , Female , Humans , Male , Models, Educational , Reproducibility of Results
11.
Article in English | MEDLINE | ID: mdl-15141127

ABSTRACT

PROBLEM STATEMENT AND BACKGROUND: While the psychometric properties of Objective Structured Clinical Examinations (OSCEs) have been studied, their latent structures have not been well characterized. This study examines a factor analytic model of a comprehensive OSCE and addresses implications for the measurement of clinical performance.

METHODS: An exploratory maximum likelihood factor analysis with a Promax rotation was used to derive latent structures for the OSCE.

RESULTS: A model with two correlated factors fit the data well. The first factor, related to Physical Examination and History-Taking, was labeled information gathering, while the second, related to Differential Diagnosis/Clinical Reasoning and Patient Interaction, was labeled reasoning and information dissemination. Case Management did not contribute to either factor. The factors accounted for a total of 61.6% of the variance in the skills variables.

CONCLUSIONS: Recognizing the psychometric components of OSCEs may support and enhance the use of OSCEs for measuring the clinical competency of medical students.


Subject(s)
Clinical Competence/standards , Education, Medical, Undergraduate/standards , Educational Measurement/methods , Psychometrics , Factor Analysis, Statistical , Humans