Results 1 - 4 of 4
1.
Cureus ; 11(5): e4610, 2019 May 07.
Article in English | MEDLINE | ID: mdl-31312537

ABSTRACT

BACKGROUND: Challenges in bedside teaching may be overcome by using high-fidelity simulators to teach the cardiac physical exam. The purpose of this study was to compare the ability of first-year medical students (MS1) to perform a cardiac physical exam and make the correct diagnosis after instruction using standardized patients (SPs) versus a cardiac simulator (Harvey, Laerdal Medical Corporation, NY, US). METHODS: Thirty-two MS1 were randomized to a teaching module using either SPs or Harvey. Their examination performance and ability to make the correct diagnosis were evaluated during a posttest objective structured clinical examination (OSCE) with real patients. RESULTS: No difference in the mean OSCE score was observed (SP: M=62.2% vs. Harvey: M=57.2%, p=0.32). The SP group achieved a higher frequency of correct diagnoses (M=61.5% SP vs. M=21.0% Harvey, p=0.03). Student feedback indicated that Harvey offered superior clinical findings; however, 34.4% of students requested a combination of teaching modalities rather than either method alone. CONCLUSIONS: Performance in examination skills did not differ between the SP and Harvey groups, but the SP group demonstrated an improved ability to arrive at a unifying diagnosis. A combined teaching program may be ideal for transferability to patients.

2.
J Grad Med Educ ; 2(1): 111-7, 2010 Mar.
Article in English | MEDLINE | ID: mdl-21975896

ABSTRACT

BACKGROUND: The Accreditation Council for Graduate Medical Education requires fellows in many specialties to demonstrate attainment of 6 core competencies, yet relatively few validated assessment tools currently exist. We present our initial experience with the design and implementation of a standardized patient (SP) exercise during gastroenterology fellowship that facilitates appraisal of all core clinical competencies. METHODS: Fellows evaluated an SP trained to portray an individual referred for evaluation of abnormal liver tests. The encounters were independently graded by the SP and a faculty preceptor for patient care, professionalism, and interpersonal and communication skills using quantitative checklist tools. Trainees' consultation notes were scored using predefined key elements (medical knowledge) and subjected to a coding audit (systems-based practice). Practice-based learning and improvement was addressed via verbal feedback from the SP and self-assessment of the videotaped encounter. RESULTS: Six trainees completed the exercise. Second-year fellows received significantly higher scores in medical knowledge (55.0 ± 4.2 [standard deviation], P = .05) and patient care skills (19.5 ± 0.7, P = .04) from a faculty evaluator compared with first-year trainees (46.2 ± 2.3 and 14.7 ± 1.5, respectively). Scores correlated by Spearman rank (0.82, P = .03) with the results of the Gastroenterology Training Examination. Ratings of the fellows by the SP did not differ by level of training, nor did they correlate with faculty scores. Fellows viewed the exercise favorably, with most indicating they would alter their practice based on the experience. CONCLUSIONS: An SP exercise is an efficient and effective tool for assessing core clinical competencies during fellowship training.

3.
BMC Med Educ ; 6: 22, 2006 Apr 25.
Article in English | MEDLINE | ID: mdl-16638135

ABSTRACT

BACKGROUND: Mock oral board exams, fashioned after the live patient hour of the American Board of Psychiatry and Neurology exam, are commonly part of resident assessment during residency training. Exams using real patients selected from clinics or hospitals are not standardized and do not allow comparisons of resident performance across the residency program. We sought to create a standardized patient mock oral board exam that would allow comparison of residents' clinical performance. METHODS: Three cases were created and used for this mock oral board exercise with trained standardized patients. Residents from the University of Cincinnati and Indiana University participated in the exam. Residents were scored by attending physician examiners who directly observed the encounter with the standardized patient. The standardized patient also assessed each resident. A post-test survey was administered to ascertain participants' satisfaction with the examination process. RESULTS: Resident scores were grouped within one standard deviation of the mean, with the exception of one resident who was also subjectively judged to have "failed" the exam. In exams with two faculty evaluators, scores were highly correlated. The survey showed general satisfaction with the examination process. CONCLUSION: Standardized patients can be used for mock oral boards in the live patient format. Our initial experience with this examination process was positive. Further testing is needed to determine whether this examination format is more reliable and valid than traditional methods of assessing resident competency.


Subject(s)
Biological Psychiatry/education, Educational Measurement/methods, Internship and Residency/standards, Neurology/education, Patient Simulation, Specialty Boards, Attitude of Health Personnel, Clinical Competence, Humans, Indiana, Nervous System Diseases/diagnosis, Nervous System Diseases/therapy, Ohio, Physician-Patient Relations, Pilot Projects, Program Evaluation, Truth Disclosure
4.
Acad Med ; 77(3): 267, 2002 Mar.
Article in English | MEDLINE | ID: mdl-11891169

ABSTRACT

The authors assessed feedback-agreement scores designed to measure the effectiveness of feedback between faculty and students in clinical medicine.


Subject(s)
Clinical Clerkship, Faculty, Medical, Feedback, Internal Medicine/education, Hospitals, University, Humans, Medical History Taking, Physical Examination