Results 1 - 6 of 6
1.
J Healthc Leadersh; 11: 75-80, 2019.
Article in English | MEDLINE | ID: mdl-31354375

ABSTRACT

Purpose: This study examined whether change in physician engagement affected outpatient or resident physician satisfaction using common US measures.

Methods: Surveys were administered by Advisory Board Survey Solutions for staff physician engagement, by Press Ganey for the Clinician and Group Consumer Assessment of Healthcare Providers and Systems (CGCAHPS) survey of outpatient satisfaction, and by the Accreditation Council for Graduate Medical Education (ACGME) for the ACGME Resident/Fellow Survey. Survey sample sizes were 685, 697, and 763 for physician engagement and 621, 625, and 618 for resident satisfaction in 2014-2016, respectively; only respondent counts were available for CGCAHPS (24,302, 34,328, and 43,100 for 2014-2016, respectively). Two groups were analyzed across 3 years: (1) percentage of "engaged" staff physicians versus percentage of outpatient top box scores for physician communication, and (2) percentage of "engaged" staff physicians versus percentage of residents "positive" on program evaluation. For resident evaluation of faculty, the number of programs that met/exceeded ACGME national compliance scores was compared. Univariate chi-squared tests compared data between 2014, 2015, and 2016.

Results: For 2014-2016, "engaged" physicians increased from 34% (169/497) to 44% (227/515) to 48% (260/542) (P<0.001), whereas CGCAHPS top box scores for physician communication remained unchanged at 90.9% (22,091/24,302), 90.8% (31,088/34,328), and 90.9% (39,178/43,100) (P=0.869). For the second group, "engaged" physicians increased from 33% (204/617) to 46% (318/692) to 50% (351/701) (P<0.001), and residents "positive" on program evaluation increased from 86% (534/618) in 2014 to 89% (556/624) in 2015 and 89% (550/615) in 2016 (P=0.174). The number of specialties that met/exceeded national compliance for all five faculty evaluation items grew from 44% (11/25) in 2014 to 68% (17/25) in 2015 and 64% (16/25) in 2016 (P=0.182).

Conclusion: For our medical group, improvement in physician engagement over time did not coincide with meaningful change in outpatient experience with physician communication or in resident satisfaction with program and faculty.
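
The univariate chi-squared comparison of engagement rates can be reproduced directly from the counts the abstract reports. A minimal sketch in Python using scipy; the variable names are illustrative, not from the study:

```python
# Reproduce the reported chi-squared test on "engaged" physician counts,
# 2014-2016 (the abstract reports P<0.001 for this comparison).
from scipy.stats import chi2_contingency

engaged = [169, 227, 260]   # "engaged" staff physicians per year
totals  = [497, 515, 542]   # physicians rated per year
table = [engaged, [t - e for t, e in zip(totals, engaged)]]  # engaged vs not, by year

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```

The same two-row layout applies to the other proportions compared across years in the abstract.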

2.
J Grad Med Educ; 3(4): 524-8, 2011 Dec.
Article in English | MEDLINE | ID: mdl-23205202

ABSTRACT

OBJECTIVE: We describe a collaboration between the graduate medical education office and the Henry Ford Health System's Office of Clinical Quality and Safety to create an institution-wide communication skills curriculum pertinent to the institution's safety and patient- and family-centered care initiatives.

METHODS: A multidisciplinary committee provided oversight for the curriculum design and used sentinel event and other quality and safety data to identify specific target areas. The curriculum consisted of 3 courses: "Informed Consent," "Sharing Bad News," and "Disclosure of Unanticipated Events." Each course included 3 components: a multimedia online module; small group discussions led by the program director that focused on the use of communication scripts; and 2 objective structured clinical examinations (OSCEs) requiring residents to demonstrate use of the communication scripts. All first-year residents (N = 145) and faculty (N = 30) from 20 residency programs participated in this initiative. Evaluation of the residents consisted of a self-assessment, the standardized patients' assessment of the residents' performance, and faculty assessment of resident performance with verbal feedback.

RESULTS: Survey data showed that residents found the courses valuable and identified communication scripts they would use in clinical settings. Focus groups with faculty highlighted that the resident debriefing sessions gave faculty insight into residents' communication skills early in their training.

CONCLUSION: Our institutional curriculum was developed collaboratively and used an evidence-based approach to teach communication skills relevant to institutional safety and quality initiatives. Other institutions may wish to adopt our strategy of departmental collaboration and alignment of resident education with institutional initiatives.

3.
J Grad Med Educ; 3(4): 550-3, 2011 Dec.
Article in English | MEDLINE | ID: mdl-23205207

ABSTRACT

BACKGROUND: Multiple factors affect residency education, including duty-hour restrictions and documentation requirements for regulatory compliance. We designed a work sampling study to determine the proportion of time residents spend in structured education, direct patient care, indirect patient care that must be completed by a physician, indirect patient care that may be delegated to other health care workers, and personal activities while on an inpatient general practice unit.

METHODS: The 3-month study in 2009 involved 14 categorical internal medicine residents who volunteered to use personal digital assistants to self-report their location and primary tasks while on an inpatient general practice unit.

RESULTS: Residents reported spending most of their time at workstations (43%) and less time in patient rooms (20%). By task, residents spent 39% of their time on indirect patient care that must be completed by a physician, 31% on structured education, 17% on direct patient care, 9% on indirect patient care that may be delegated to other health care workers, and 4% on personal activities. From these data we estimated that residents spend 34 minutes per patient per day completing indirect patient care tasks, compared with 15 minutes per patient per day in direct patient care.

CONCLUSIONS: This single-institution time study objectively quantified how and where internal medicine residents spend their time on a general practice unit, showing that residents spend less time on direct patient care than on other activities.
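
The per-patient estimates follow from the sampled task shares: 39% physician-only indirect care versus 17% direct care is the same ratio as 34 versus 15 minutes. A minimal sketch of that arithmetic in Python; the 49-minute total is an assumption back-solved for illustration, not a figure reported by the study:

```python
# Split an assumed total of patient-linked minutes per patient per day
# in proportion to the sampled task shares reported in the abstract.
share_indirect_md = 0.39   # indirect care that must be done by a physician
share_direct      = 0.17   # direct patient care
assumed_total_min = 49.0   # hypothetical: combined minutes/patient/day for these two tasks

combined = share_indirect_md + share_direct
print(f"indirect: {assumed_total_min * share_indirect_md / combined:.0f} min/patient/day")
print(f"direct:   {assumed_total_min * share_direct / combined:.0f} min/patient/day")
# -> approximately the 34 and 15 minutes reported above
```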

4.
Jt Comm J Qual Patient Saf; 36(5): 203-8, 2010 May.
Article in English | MEDLINE | ID: mdl-20480752

ABSTRACT

BACKGROUND: The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project requires training programs to use external measures, such as quality of care indicators, to assess their effectiveness. A practical quality improvement (QI) process was implemented at Henry Ford Hospital to enhance the training program's educational effectiveness and clinical outcomes.

METHODS: A QI process consisting of a modified Plan-Do-Study-Act (PDSA) cycle was applied to residency and fellowship curricula in a medical intensive care unit (MICU). The PDSA activities focused on improving clinical outcomes but also outlined educational goals for residents and fellows, defined teaching methods, and determined assessment methods for the ACGME curricula. The improvement process linked clinical outcomes to specific competency-based educational objectives. Residents and fellows received instruction on QI and applied the new curricula to their clinical training in the MICU.

RESULTS: Two of seven MICU clinical outcomes initially performed below national benchmarks: iatrogenic pneumothorax rate and sepsis-specific mortality. During the QI process, clinical outcomes in both areas improved. Training program directors used the MICU clinical outcomes as indicators of their programs' educational effectiveness. They also assessed individual trainee performance in QI initiatives through direct observation and review of trainees' written summaries of these projects.

CONCLUSIONS: Training programs can use hospital-tracked clinical outcomes to analyze their educational strengths and weaknesses and to enhance their curricula accordingly. Linking competency-based learning objectives for trainees to the clinical outcomes of their patients can improve physician education and patient care.


Subject(s)
Internship and Residency/standards; Outcome Assessment, Health Care; Quality Assurance, Health Care/methods; Humans; Iatrogenic Disease; Intensive Care Units; Internship and Residency/organization & administration; Michigan/epidemiology; Pneumothorax/epidemiology; Sepsis/mortality
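
Checking a tracked outcome against a national benchmark, as described in the abstract above, reduces to a one-sample proportion test. A minimal sketch in Python; the event counts and the 0.5% benchmark rate are hypothetical placeholders, not figures from the study:

```python
# Test whether an observed complication rate exceeds a national benchmark.
# All numbers below are hypothetical placeholders for illustration.
from scipy.stats import binomtest

events = 9          # hypothetical: iatrogenic pneumothorax events observed
procedures = 800    # hypothetical: procedures performed in the same period
benchmark = 0.005   # hypothetical: national benchmark rate (0.5%)

result = binomtest(events, procedures, benchmark, alternative="greater")
print(f"observed rate = {events / procedures:.3%}, p = {result.pvalue:.3f}")
```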
5.
J Grad Med Educ; 2(2): 165-9, 2010 Jun.
Article in English | MEDLINE | ID: mdl-21975614

ABSTRACT

BACKGROUND: This study examined the psychometric properties of the Kalamazoo Essential Elements Communication Checklist (Adapted) (KEECC-A), which addresses 7 key elements of physician communication identified in the Kalamazoo Consensus Statement, in a sample of 135 residents in multiple specialties at a large urban medical center in 2008-2009. The KEECC-A was used by residents, standardized patients, and faculty as the assessment tool in a broader institutional curriculum initiative.

METHODS: Three separate KEECC-A scores (self-ratings, faculty ratings, and standardized patient ratings) were calculated for each resident to assess the internal consistency and factor structure of the checklist. In addition, we analyzed KEECC-A ratings by gender and by US versus international medical graduates, and collected American Board of Internal Medicine Patient Satisfaction Questionnaire (PSQ) scores for a subsample of internal medicine residents (n = 28) to examine the relationship between this measure and the KEECC-A ratings to provide evidence of convergent validity.

RESULTS: The KEECC-A ratings generated by faculty, standardized patients, and the residents themselves demonstrated a high degree of internal consistency. Factor analyses of the 3 different sets of KEECC-A ratings produced a consistent single-factor structure. We could not examine the relationship between the KEECC-A and the PSQ because of substantial range restriction in PSQ scores. No differences were seen in the communication scores of men versus women. Faculty rated US graduates significantly higher than international medical graduates.

CONCLUSION: Our study provides evidence for the reliability and validity of the KEECC-A as a measure of physician communication skills. The KEECC-A appears to be a psychometrically sound, user-friendly communication tool, linked to an expert consensus statement, that can be quickly and accurately completed by multiple raters across diverse specialties.
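
Internal consistency of a multi-item checklist such as the KEECC-A is conventionally summarized with Cronbach's alpha; the abstract does not specify its exact procedure, so the sketch below shows the standard computation on random placeholder ratings, not the study's data:

```python
# Cronbach's alpha for an (n_residents x n_items) matrix of checklist ratings.
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = ratings.shape[1]
    item_var_sum = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(135, 7)).astype(float)  # placeholder: 135 residents, 7 items
print(f"alpha = {cronbach_alpha(demo):.2f}")  # near 0 for random data; real scales aim for >= 0.7
```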
