1.
Am J Surg; 214(1): 152-157, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28501285

ABSTRACT

BACKGROUND: We describe initial success in designing and implementing an objective evaluation for opening and closing a simulated abdomen.

METHODS: (1) An assessment for laparotomy was created using peer-reviewed literature, texts, and the input of academic surgeons nationally; (2) the assessment was evaluated for construct validity, comparing the videotaped performance of laparotomy by surgical experts and novices on a viscoelastic model; and (3) the Basics of Open Laparotomy Training (BOLT) curriculum was piloted with junior residents to evaluate efficacy at improving performance.

RESULTS: Experts performed better than novices in opening (.94 vs .51; P < .001), closing (.85 vs .16; P < .001), and overall performance (.88 vs .27; P < .001). Novices caused bowel injury more frequently (5 vs 1; P < .05) and took longer to open the abdomen (6:06 vs 3:43; P = .01). After completing the BOLT curriculum, novices improved in opening (1.00 vs .50; P = .014), closing (.80 vs .10; P = .014), and overall score (.87 vs .23; P = .014).

CONCLUSIONS: We demonstrate construct validity of an evaluation tool for simulated laparotomy, and pilot efforts with the BOLT curriculum have shown promise.


Subject(s)
Clinical Competence , Curriculum , Educational Measurement , Laparotomy/education , Simulation Training , Abdomen/surgery , Computer Simulation , Delphi Technique , Humans , Internship and Residency , Pilot Projects , United States
2.
J Surg Educ; 71(6): e11-5, 2014.
Article in English | MEDLINE | ID: mdl-25155640

ABSTRACT

OBJECTIVES: The American Board of Surgery Certifying Examination (ABSCE) is an oral examination designed to evaluate a resident's ability to apply cognitive knowledge to the management of a broad range of clinical problems. In this study, we analyze our 5-year experience with a Philadelphia-wide mock oral examination (PMOE).

SETTING: The PMOE is organized by the Metropolitan Philadelphia Chapter of the American College of Surgeons and offered annually to all postgraduate year 4/5 residents from the 8 participating Philadelphia general surgery programs. Each examinee is scheduled for 3 consecutive 30-minute examinations given by 2 examiners per room. Overall performance is graded for each interaction using the ABSCE scoring method. Participants are given their "pass/fail" status, and they receive written examiner feedback.

DESIGN: From 2008 to 2013, deidentified examinee scores from both the PMOE and the ABSCE were reviewed; overall pass/fail status was compared using the chi-square statistic for significance. Examinee feedback from 2009 to 2013 was reviewed by 3 independent raters and characterized as commenting on cognitive knowledge, clinical management, or communication skills. These categorical data were then correlated with pass/fail status and examined using unpaired t tests for significance.

RESULTS: From 2009 to 2013, 189 residents participated in the PMOE with an overall pass rate of 53%, compared with the ABSCE pass rate of 76% for 113 examinees from the Philadelphia area from 2008 to 2013 (χ² = 18.8, p < 0.01). A total of 2273 comments were reviewed and categorized from 2009 to 2013. Examinees who failed the PMOE received significantly more feedback pertaining to cognitive knowledge than examinees who passed the examination (p = 0.04).

CONCLUSION: The PMOE provides residents an opportunity to receive feedback on their performance on a representation of the ABSCE that may be more rigorous than the actual certifying examination. Deficits in cognitive knowledge are a significant determinant of performance on a city-wide mock oral examination.
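The pass-rate comparison above (53% of 189 PMOE examinees vs 76% of 113 ABSCE examinees) can be checked approximately with a 2x2 chi-square test of independence. The sketch below estimates the pass/fail counts from the reported percentages, so its statistic differs somewhat from the paper's χ² = 18.8, which was presumably computed from the exact counts:

```python
# 2x2 chi-square test of independence (no continuity correction),
# using the closed-form formula for a 2x2 contingency table:
#   chi2 = n * (ad - bc)^2 / ((a+b) * (c+d) * (a+c) * (b+d))
# Counts below are ESTIMATED from the abstract's percentages, not raw data.
a, b = 100, 89   # PMOE: ~53% of 189 passed, remainder failed (estimated)
c, d = 86, 27    # ABSCE: ~76% of 113 passed, remainder failed (estimated)

n = a + b + c + d
chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# The critical value for p = 0.01 at 1 degree of freedom is 6.63, so the
# difference in pass rates is significant at p < 0.01 even with these
# approximate counts.
print(f"chi2 = {chi2:.2f}")  # roughly 16; exceeds the 6.63 cutoff
```

With Yates' continuity correction the statistic would be slightly smaller but still well beyond the p < 0.01 threshold, consistent with the abstract's conclusion that the mock examination is harder to pass than the ABSCE.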


Subject(s)
Certification , General Surgery/education , Internship and Residency , Communication , Feedback , Humans