Results 1 - 5 of 5
1.
Med Educ Online; 24(1): 1591257, 2019 Dec.
Article in English | MEDLINE | ID: mdl-30935299

ABSTRACT

BACKGROUND: Clinical reasoning is an essential skill to be learned during medical education. A developmental framework for the assessment and measurement of this skill has not yet been described in the literature. OBJECTIVE: The authors describe the creation and pilot implementation of a rubric designed to assess the development of clinical reasoning skills in pre-clinical medical education. DESIGN: The multi-disciplinary course team used Backwards Design to develop course goals, objectives, and assessments for a new Clinical Reasoning Course. The team focused on behaviors that students were expected to demonstrate, identifying each as a 'desired result' element and aligning these with three levels of performance: emerging, acquiring, and mastering. RESULTS: The first draft of the rubric was reviewed and piloted by faculty using sample student entries; this provided feedback on ease of use and appropriateness. After the first semester, the course team evaluated whether the rubric distinguished between different levels of student performance in each competency. Descriptive analysis of mid-semester and end-of-semester assessments of student performance revealed that over half the students received higher competency scores at the end of the semester. CONCLUSION: The assessment rubric allowed students in the early stages of clinical reasoning development to understand their trajectory and provided faculty a framework from which to give meaningful feedback. The multi-disciplinary background of the course team supported a systematic and robust course and assessment design process. The authors strongly encourage other colleges to support the use of collaborative and multi-disciplinary course teams.
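To make the three-level scale concrete, here is a minimal sketch of one way such a rubric and the mid- versus end-of-semester comparison could be represented; the competency name and student scores below are hypothetical, not data from the study.

```python
# Ordinal rubric levels from the abstract, lowest to highest.
LEVELS = {"emerging": 1, "acquiring": 2, "mastering": 3}

# Hypothetical per-student scores: competency -> (mid-semester, end-of-semester).
students = {
    "student_01": {"hypothesis generation": ("emerging", "acquiring")},
    "student_02": {"hypothesis generation": ("acquiring", "acquiring")},
    "student_03": {"hypothesis generation": ("acquiring", "mastering")},
}

def improved(mid: str, end: str) -> bool:
    """True if the end-of-semester level is higher than the mid-semester level."""
    return LEVELS[end] > LEVELS[mid]

# Descriptive count of students whose competency score rose over the semester.
n_improved = sum(
    improved(mid, end)
    for scores in students.values()
    for mid, end in scores.values()
)
print(f"{n_improved}/{len(students)} students scored higher at semester end")
```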


Subject(s)
Clinical Decision-Making/methods; Education, Medical/methods; Educational Measurement/methods; Problem Solving; Clinical Competence; Curriculum; Humans; Learning
2.
Adv Med Educ Pract; 9: 307-315, 2018.
Article in English | MEDLINE | ID: mdl-29765259

ABSTRACT

OBJECTIVE: Non-medical knowledge-based sub-competencies (multitasking, professionalism, accountability, patient-centered communication, and team management) are challenging for a supervising emergency medicine (EM) physician to evaluate in real-time on shift while also managing a busy emergency department (ED). This study examines residents' perceptions of having a medical education specialist shadow them and evaluate their non-medical knowledge-based skills. METHODS: Medical education specialists shadowed postgraduate year 1 and postgraduate year 2 EM residents during an ED shift once per academic year. To increase meaningful feedback to the residents, these specialists evaluated resident performance in selected non-medical knowledge-based Accreditation Council for Graduate Medical Education (ACGME) sub-competencies and provided residents with direct, real-time feedback, followed by a written evaluation sent via email. Evaluations provided specific references to examples of behaviors observed during the shift and connected these back to ACGME competencies and milestones. RESULTS: Twelve residents participated in this shadow experience (six postgraduate year 1 and six postgraduate year 2). Two residents emailed the medical education specialists ahead of the scheduled shadow shift requesting specific feedback. When queried, five residents voluntarily requested that their feedback be included in their formal biannual review. Residents received milestone scores and narrative feedback on the non-medical knowledge-based ACGME sub-competencies and indicated that the shadow experience and subsequent feedback were valuable. CONCLUSION: Medical education specialists who observe residents over the course of an entire shift and evaluate non-medical knowledge-based skills are perceived by EM residents to provide meaningful feedback and add valuable information for the biannual review process.

3.
Intern Emerg Med; 11(6): 843-52, 2016 Sep.
Article in English | MEDLINE | ID: mdl-26892405

ABSTRACT

The skill of delivering bad news is difficult to teach and evaluate. Residents may practice in simulated settings; however, this may not translate to confidence or competence during real experiences. We investigated the acceptability and feasibility of social workers as evaluators of residents' delivery of bad news during patient encounters, and assessed the attitudes of both groups regarding this process. From August 2013 to June 2014, emergency medicine residents completed self-assessments after delivering bad news. Social workers completed evaluations after observing these conversations. The assessment tools were designed by modifying the global Breaking Bad News Assessment Scale. Residents and social workers completed post-study surveys. Thirty-seven evaluations were received: 20 completed by social workers and 17 resident self-evaluations. Social workers reported discussing plans with residents prior to conversations 90% of the time (18/20; 95% CI 64.5-97.8). Social workers who had previously observed a resident delivering bad news reported that the resident was more skilled on subsequent encounters 90% of the time (95% CI 42.2-99). Both social workers and residents felt that prior training or experience was important. First-year residents valued advice from social workers less than advice from attending physicians, whereas more experienced residents perceived advice from social workers to be equivalent to that of attending physicians (40% versus 2.9%, p = 0.002). Social worker assessment of residents' abilities to deliver bad news is feasible and acceptable to both groups. This formalized self-assessment and evaluation process highlights the importance of social workers' involvement in the delivery of bad news, and in the teaching of this skill. This method may also be used as direct observation for resident milestone assessment.
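As an illustration of the interval reporting above, the sketch below computes a 95% confidence interval for the 18/20 proportion; the paper does not state its CI method, so the bounds shown (Wilson score and Clopper-Pearson exact) are assumptions and may differ slightly from the published 64.5-97.8.

```python
from scipy.stats import binomtest

# 18 of 20 observed conversations had a plan discussed beforehand.
result = binomtest(k=18, n=20)

# Two common interval methods; neither is necessarily the one the paper used.
for method in ("wilson", "exact"):
    ci = result.proportion_ci(confidence_level=0.95, method=method)
    print(f"{method}: {ci.low:.3f} - {ci.high:.3f}")
```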


Subject(s)
Communication; Emergency Medicine/methods; Internship and Residency; Physician-Patient Relations; Social Workers/psychology; Adult; Aged; Female; Humans; Interprofessional Relations; Male; Middle Aged; Self-Assessment; Surveys and Questionnaires; Workforce
4.
Adv Med Educ Pract; 5: 275-9, 2014.
Article in English | MEDLINE | ID: mdl-25187750

ABSTRACT

BACKGROUND: The transition from medical student to first-year intern can be challenging. The stress of increased responsibilities, the gap between performance expectations and varying levels of clinical skills, and the need to adapt to a new institutional space and culture can make this transition overwhelming. Orientation programs are intended to help new residents prepare for their new training environment. OBJECTIVE: To ease our interns' transition, we piloted a novel clinical primer course. We expected this course to provide an introduction to basic clinical knowledge and procedures without reducing the time allotted for mandatory orientation activities, and to help the interns feel better prepared for their clinical duties. METHODS: First-year Emergency Medicine residents were invited to participate in this primer course, called the Introductory Clinician Development Series (or "intern boot camp"), which provided optional lecture and procedural skills instruction prior to their participation in the mandatory orientation curriculum and assumption of clinical responsibilities. Participating residents completed postcourse surveys asking for feedback on the experience. RESULTS: Survey responses indicated that the intern boot camp helped first-year residents feel more prepared for their clinical shifts in the Emergency Department. CONCLUSION: An optional clinical introductory series can preserve mandatory orientation activities and clinical shifts while easing the transition from medical student to clinician.

5.
J Grad Med Educ; 6(2): 335-7, 2014 Jun.
Article in English | MEDLINE | ID: mdl-24949143

ABSTRACT

BACKGROUND: The Residency Review Committee for Emergency Medicine mandates conference participation, but tracking attendance is difficult and fraught with errors. Feedback on didactic sessions, if not collected in real time, is challenging to obtain. OBJECTIVE: We assessed whether an audience response system (ARS) would (1) encourage residents to arrive on time for lectures, and (2) increase anonymous real-time audience feedback. METHODS: The ARS (Poll Everywhere) provided date/time-stamped responses to polls from residents, including a question to verify attendance and questions to gather immediate, anonymous postconference evaluations. The Fisher exact test was used to compare proportions. RESULTS: The proportion of residents who completed evaluations was 8.75% before the institution of the ARS and 59.42% after (P < .001). The proportion of faculty who completed evaluations was 6.12% before the ARS and 85.71% after (P < .001). The proportion of residents who reported they had attended the conference session was 55% for the 3 weeks prior to initiating the ARS, decreasing to 46.67% for the 3 weeks during which the ARS was used to take attendance (P = .46). The proportion of faculty who reported attending the conference was 5.56% for the 3 weeks prior to ARS initiation, decreasing to 4.44% for the 3 weeks while using the ARS (P = .81). CONCLUSIONS: Audience response systems are an effective way to verify attendance and tardiness, eliminating the subjective effect of attendance takers' leniency and increasing completion of evaluations for didactic sessions.
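For readers who want to reproduce this style of analysis, here is a minimal sketch of a Fisher exact test on the resident evaluation-completion comparison; the abstract reports only percentages, so the 2x2 counts below are reconstructed to match them (7/80 = 8.75%, 41/69 = 59.42%) and are assumptions, not the study's actual denominators.

```python
from scipy.stats import fisher_exact

# Rows: before ARS, after ARS; columns: evaluations completed, not completed.
# Counts are illustrative reconstructions of the reported percentages.
table = [[7, 80 - 7],
         [41, 69 - 41]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2e}")  # p << .001, consistent with the report
```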
