Results 1 - 3 of 3
1.
Med Teach; 36(1): 68-72, 2014 Jan.
Article in English | MEDLINE | ID: mdl-24195470

ABSTRACT

BACKGROUND: Educators need efficient and effective means of tracking students' clinical experiences to monitor their progress toward competency goals. AIM: To validate an electronic scoring system that rates medical students' clinical notes for relevance to priority topics of the medical school curriculum. METHOD: The Vanderbilt School of Medicine Core Clinical Curriculum enumerates 25 core clinical problems (CCPs) that graduating medical students must understand. Medical students upload clinical notes pertinent to each CCP to a web-based dashboard, but criteria for determining a note's relevance, and consistent uploading practices by students, have been lacking. The Vanderbilt Learning Portfolio (VLP) system automates both tasks by rating each note's relevance to each CCP and uploading the note to the student's electronic dashboard. We validated this electronic scoring system by comparing the relevance of 265 clinical notes written by third-year medical students to each of the 25 CCPs, as scored by VLP versus an expert panel of raters. RESULTS: For 16 of the 25 CCPs, we established a threshold score at which VLP's relevance ratings achieved 75% positive predictive value relative to expert opinion. DISCUSSION: Automated scoring of students' clinical notes provides a novel, efficient, and standardized means of tracking students' progress toward institutional competency goals.


Subject(s)
Clinical Clerkship/standards; Clinical Competence/standards; Educational Measurement/standards; Electronic Health Records/standards; Clinical Clerkship/methods; Clinical Clerkship/organization & administration; Education, Medical, Undergraduate/organization & administration; Education, Medical, Undergraduate/standards; Educational Measurement/methods; Electronic Health Records/organization & administration; Humans; Natural Language Processing; Students, Medical; Tennessee
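
The RESULTS step above selects a score threshold at which the system's relevance calls reach 75% positive predictive value against expert ratings. A minimal sketch of that selection, with hypothetical scores, expert labels, and a hypothetical find_threshold helper (not the VLP implementation):

# Minimal sketch of threshold selection against expert ratings (Python).
# All data and helper names are hypothetical, not taken from VLP.

def positive_predictive_value(scores, expert_relevant, threshold):
    """PPV = truly relevant notes / all notes the system flags as relevant."""
    flagged = [relevant for score, relevant in zip(scores, expert_relevant)
               if score >= threshold]
    return sum(flagged) / len(flagged) if flagged else 0.0

def find_threshold(scores, expert_relevant, target_ppv=0.75):
    """Return the lowest threshold whose PPV meets the target, else None."""
    for threshold in sorted(set(scores)):
        if positive_predictive_value(scores, expert_relevant, threshold) >= target_ppv:
            return threshold
    return None

# Toy data: system scores for one CCP and expert yes/no relevance judgments.
scores = [0.91, 0.85, 0.40, 0.78, 0.22, 0.66]
expert = [True, True, False, True, False, False]
print(find_threshold(scores, expert))  # 0.66: 3 of the 4 flagged notes are relevant
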
2.
AMIA Annu Symp Proc; 2010: 157-61, 2010 Nov 13.
Article in English | MEDLINE | ID: mdl-21346960

ABSTRACT

Accurate assessment and evaluation of medical curricula has long been a goal of medical educators. Current methods rely on manually entered keywords and trainee-recorded logs of case exposure. In this study, we used natural language processing to compare the clinical content coverage of a four-year medical curriculum with the electronic medical record notes written by clinical trainees. Content coverage was compared for each of 25 agreed-upon core clinical problems (CCPs) and seven categories of infectious diseases. Most CCPs were covered in both corpora. Lecture curricula more frequently represented rare diseases, and several areas of low content coverage were identified, primarily related to outpatient complaints. Such methods may prove useful for future curriculum evaluations and revisions.


Subject(s)
Curriculum; Natural Language Processing; Education, Medical, Undergraduate; Electronic Health Records; Humans
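
A minimal sketch of the corpus comparison this abstract describes, using simple keyword matching as a stand-in for the study's concept-level natural language processing; the CCP term lists and the two tiny corpora are hypothetical:

# Sketch: fraction of documents in each corpus that mention each core
# clinical problem (CCP). Keyword matching stands in for real concept NLP.

from collections import Counter

# Hypothetical CCP -> indicator terms (a real system maps text to concepts).
CCP_TERMS = {
    "chest pain": ["chest pain", "angina"],
    "pneumonia": ["pneumonia", "consolidation"],
    "depression": ["depression", "depressed mood"],
}

def coverage(documents):
    """Fraction of documents mentioning each CCP at least once."""
    counts = Counter()
    for doc in documents:
        text = doc.lower()
        for ccp, terms in CCP_TERMS.items():
            if any(term in text for term in terms):
                counts[ccp] += 1
    return {ccp: counts[ccp] / len(documents) for ccp in CCP_TERMS}

lectures = ["Today's lecture covers angina and acute chest pain",
            "Community-acquired pneumonia: consolidation on imaging"]
notes = ["Patient with depressed mood, started on sertraline",
         "CXR shows consolidation consistent with pneumonia"]

lecture_cov, note_cov = coverage(lectures), coverage(notes)
for ccp in CCP_TERMS:
    print(f"{ccp}: lectures {lecture_cov[ccp]:.2f}, notes {note_cov[ccp]:.2f}")
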
3.
J Gen Intern Med; 23(7): 979-84, 2008 Jul.
Article in English | MEDLINE | ID: mdl-18612728

ABSTRACT

OBJECTIVE: To determine whether integrating an automated electronic clinical portfolio into clinical clerkships can improve the quality of feedback given to students on their patient write-ups and the quality of the write-ups themselves. DESIGN: The authors conducted a single-blinded, randomized controlled study of an electronic clinical portfolio that automatically collects all of a student's clinical notes and notifies their teachers (attending and resident physicians) via e-mail. Third-year medical students were randomized to use the electronic portfolio or traditional paper means. Teachers in the portfolio group provided feedback directly on the student's write-up through a web-based application; teachers in the control group wrote their feedback in the margins of the paper copy. Outcomes were teacher and student assessments of the frequency and quality of feedback on write-ups, expert assessment of the quality of student write-ups at the end of the clerkship, and participant assessment of the value of the electronic portfolio system. RESULTS: Teachers reported giving more frequent and detailed feedback when using the portfolio system (p = 0.01). Seventy percent of students who used the portfolio system, versus 39% of students in the control group (p = 0.001), reported receiving feedback on more than half of their write-ups. Write-ups by portfolio students were rated of similar quality to those by control students. Teachers and students agreed that the system was a valuable teaching tool and easy to use. CONCLUSIONS: An electronic clinical portfolio that automatically collects students' clinical notes is associated with improved teacher feedback on write-ups and with write-ups of similar quality.


Subject(s)
Clinical Clerkship; Internal Medicine/education; Internet; Medical History Taking; Adult; Educational Measurement; Feedback, Psychological; Female; Humans; Male; Students, Medical
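
The 70% versus 39% comparison (p = 0.001) reported above is the kind of result a two-proportion z-test can produce. A minimal sketch using only the Python standard library; the group sizes below are hypothetical, since this record does not give the actual n per study arm:

# Two-sided z-test for a difference between two proportions.
# Group sizes are hypothetical illustrations, not the study's actual counts.

from math import sqrt, erfc

def two_proportion_z_test(x1, n1, x2, n2):
    """Return (z, two-sided p) for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, erfc(abs(z) / sqrt(2))

# Hypothetical: 35/50 portfolio students vs 19/49 controls reported
# feedback on more than half of their write-ups (70% vs ~39%).
z, p = two_proportion_z_test(35, 50, 19, 49)
print(f"z = {z:.2f}, p = {p:.4f}")  # roughly z = 3.12, p = 0.0018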