Results 1 - 3 of 3
1.
Korean Journal of Medical Education ; : 35-47, 2016.
Article in English | WPRIM | ID: wpr-76112

ABSTRACT

PURPOSE: The purpose of this study was to investigate the reliability and validity of a new clinical performance examination (CPX) for assessing clinical reasoning skills, and to evaluate students' clinical reasoning ability. METHODS: Third-year medical students (n=313) in the Busan-Gyeongnam consortium in 2014 were included in the study. One of 12 stations was developed to assess clinical reasoning abilities. The scenario and checklists of the station were revised by six experts. The chief complaint of the case was rhinorrhea, accompanied by fever, headache, and vomiting. Checklists focused on identification of the main problem and a systematic approach to it. Students interviewed the patient and recorded a subjective, objective, assessment, plan (SOAP) note within 15 minutes. Two professors assessed each student simultaneously. We performed statistical analysis on the scores and a survey. RESULTS: The Cronbach α of the subject station was 0.878, and the Cohen κ coefficient between graders was 0.785. Students agreed that the CPX was an adequate tool for evaluating their performance, but some graders argued that the CPX failed to secure its validity because of their limited understanding of the case. One hundred eight students (34.5%) identified the essential problem early, and only 58 (18.5%) performed systematic history taking and physical examination. One hundred seventy-three (55.3%) communicated the correct diagnosis to the patient. Most had trouble writing SOAP notes. CONCLUSION: To secure reliability and validity, interrater agreement must be ensured. Students' clinical reasoning skills were insufficient. Students need to be trained in problem identification, reasoning skills, and accurate record-keeping.
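The abstract reports Cronbach's α for internal consistency and Cohen's κ for interrater agreement but does not show how they are computed. A minimal illustrative sketch (not the authors' actual analysis; the data shapes are assumptions) of both statistics:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_examinees, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    where k is the number of checklist items.
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # sample variance per item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def cohen_kappa(rater1, rater2, labels):
    """Cohen's kappa between two raters' categorical ratings.

    kappa = (p_observed - p_expected) / (1 - p_expected).
    """
    r1, r2 = np.asarray(rater1), np.asarray(rater2)
    p_obs = np.mean(r1 == r2)                    # observed agreement
    p_exp = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in labels)
    return (p_obs - p_exp) / (1 - p_exp)
```

With perfectly consistent items, `cronbach_alpha` returns 1.0; with two raters in full agreement, `cohen_kappa` returns 1.0, and chance-level agreement yields 0.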


Subject(s)
Humans , Checklist , Clinical Competence , Communication , Comprehension , Education, Medical, Undergraduate , Educational Measurement/standards , Medical History Taking , Medical Records , Observer Variation , Physical Examination , Physician-Patient Relations , Problem-Based Learning , Reproducibility of Results , Republic of Korea , Schools, Medical , Students, Medical , Surveys and Questionnaires , Thinking , Universities
2.
Korean Journal of Medical Education ; : 237-241, 2016.
Article in English | WPRIM | ID: wpr-32281

ABSTRACT

PURPOSE: The purpose of this study was to evaluate the current status of medical students' documentation of patient medical records. METHODS: We checked the completeness, appropriateness, and accuracy of 95 Subjective-Objective-Assessment-Plan (SOAP) notes documented by third-year medical students who participated in clinical skill tests on December 1, 2014. Students were required to complete the SOAP note within 15 minutes of an encounter with a standardized patient (SP) complaining of rhinorrhea and worried about meningitis. RESULTS: Of the 95 SOAP notes reviewed, 36.8% were not signed. Only 27.4% documented the patient's symptoms under the Objective component, although all students completed the Subjective notes appropriately. A possible diagnosis was assessed by 94.7% of students. Plans were described in 94.7% of the SOAP notes. Over half of the students planned workups for diagnosis (56.7%) and treatment (52.6%). Accurate documentation of the symptoms, physical findings, diagnoses, and plans was provided in 78.9%, 9.5%, 62.1%, and 38.0% of the notes, respectively. CONCLUSION: Our results showed that third-year medical students' SOAP notes were not complete, appropriate, or accurate. The most significant problems with completeness were the omission of students' signatures and inappropriate documentation of the physical examinations conducted. An education and assessment program for complete and accurate medical recording needs to be developed.


Subject(s)
Humans , Clinical Competence , Diagnosis , Education , Medical Records , Meningitis , Physical Examination , Pilot Projects , Soaps , Students, Medical
3.
Korean Journal of Medical Education ; : 327-336, 2013.
Article in Korean | WPRIM | ID: wpr-95756

ABSTRACT

PURPOSE: The purpose of this study was to judge the quality of the clinical skills assessment in the Busan-Gyeongnam Consortium. METHODS: Fourth-year medical students (n=350 in 2012 and n=419 in 2013) in the Busan-Gyeongnam Consortium were included in the study. The examination consisted of 6 clinical performance examination (CPX) stations and 6 objective structured clinical examination (OSCE) stations. The students were divided into groups to take the exam at 4 sites over 3 days. The overall reliability was estimated by the Cronbach α coefficient across stations, and the case reliability by the α across checklist items. Analysis of variance and between-group variation were used to evaluate the variation in examinee performance across different days and sites. RESULTS: The mean total CPX/OSCE score was 67.0 points. The overall α across stations was 0.66 in 2012 and 0.61 in 2013. The α across items within a station ranged from 0.54 to 0.86 in the CPX and from 0.51 to 0.92 in the OSCE. There was no significant increase in scores across the different days. The mean scores differed across sites in 30 of 48 stations, but the between-group variances were under 30%, except in 2 cases. CONCLUSION: The overall reliability was below 0.70, and standardization of the exam sites was unclear. To improve the quality of the exam, case development, item design, training of standardized patients and assessors, and standardization of sites are necessary. Above all, we need to develop a well-organized matrix to measure the quality of the exam.
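The abstract quantifies site effects as "between-group variances under 30%" but gives no formula. One common way to express this share, sketched here purely for illustration (a one-way layout with sites as groups is an assumption, not the authors' stated model), is η², the between-group sum of squares divided by the total sum of squares:

```python
import numpy as np

def eta_squared(groups):
    """Fraction of score variance explained by group membership (eta squared).

    groups: list of 1-D score arrays, one per exam site (or day).
    Returns SS_between / SS_total from a one-way ANOVA decomposition.
    """
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_scores = np.concatenate(groups)
    grand_mean = all_scores.mean()
    ss_total = ((all_scores - grand_mean) ** 2).sum()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    return ss_between / ss_total
```

Identical group means give η² = 0 (no site effect); fully separated groups give η² = 1. Under this reading, "under 30%" means η² < 0.30 for the station's scores across sites.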


Subject(s)
Humans , Checklist , Clinical Competence , Outcome and Process Assessment, Health Care , Psychometrics , Schools, Medical