2.
Med Educ; 38(9): 958-68, 2004 Sep.
Article in English | MEDLINE | ID: mdl-15327677

ABSTRACT

CONTEXT: Standardised assessments of practising doctors are receiving growing support, but theoretical and logistical issues pose serious obstacles.
OBJECTIVES: To obtain reference performance levels from experienced doctors on computer-based case simulation (CCS) and standardised patient (SP)-based methods, and to evaluate the utility of these methods in diagnostic assessment.
SETTING AND PARTICIPANTS: The study was carried out at a military tertiary care facility and involved 54 residents and credentialed staff from the emergency medicine, general surgery and internal medicine departments.
MAIN OUTCOME MEASURES: Doctors completed 8 CCS and 8 SP cases targeted at doctors entering the profession. Standardised patient performances were compared with archived Year 4 medical student data.
RESULTS: Although staff doctors and residents performed well on both CCS and SP cases, scores varied widely on all cases. There were no significant differences between the scores of participants from different specialties or with varying levels of experience. Among participants who completed both CCS and SP testing (n = 44), there was a moderate positive correlation between CCS and SP checklist scores, and a negative correlation between doctor experience and SP checklist scores. Whereas the time students spent with SPs varied little with the clinical task, doctors appeared to spend more time on communication/counselling cases than on cases involving acute or chronic medical problems.
CONCLUSION: Computer-based case simulations and standardised patient-based assessments may be useful as part of a multimodal programme to evaluate practising doctors. Additional study is needed on SP standard-setting and scoring methods. Establishing empirical likelihoods for a range of performances on assessments of this kind should be a priority.


Subject(s)
Clinical Competence/standards; Physicians/standards; Adult; Computer-Assisted Instruction/methods; Education, Medical, Continuing/methods; Educational Measurement/standards; Employee Performance Appraisal/standards; Female; Humans; Male; Patient Simulation
4.
Ann Emerg Med; 43(6): 756-69, 2004 Jun.
Article in English | MEDLINE | ID: mdl-15159710

ABSTRACT

In response to public pressure for greater accountability from the medical profession, a transformation is occurring in the approach to medical education and the assessment of physician competency. Over the past 5 years, the Accreditation Council for Graduate Medical Education (ACGME) has implemented the Outcomes and General Competencies projects to better ensure that physicians are appropriately trained in the knowledge and skills of their specialties. Concurrently, the American Board of Medical Specialties and its member boards, including the American Board of Emergency Medicine (ABEM), have embraced the competency concept. The core competencies have been integral to ABEM's development of Emergency Medicine Continuous Certification and of the Model of Clinical Practice of Emergency Medicine (Model). ABEM has used the Model as a significant part of its blueprint for the written and oral certification examinations in emergency medicine, and it fully supports the effort to further define and integrate the ACGME core competencies into the training of emergency medicine specialists. To incorporate these competencies into our specialty, the Residency Review Committee-Emergency Medicine formed an Emergency Medicine Competency Taskforce (Taskforce) to determine how these general competencies fit within the Model. This article represents a consensus of the Taskforce, with input from multiple organizations in emergency medicine. It provides a framework for organizations such as the Council of Emergency Medicine Residency Directors (CORD) and the Society for Academic Emergency Medicine to develop an emergency medicine curriculum, and for the Residency Review Committee-Emergency Medicine to revise program requirements. In this report, we describe the approach taken by the Taskforce to integrate the ACGME core competencies into the Model. Ultimately, as competency-based assessment is implemented in emergency medicine training, program directors, governing bodies such as the ACGME, and individual patients can be assured that physicians are competent in emergency medicine.


Subject(s)
Accreditation; Clinical Competence/standards; Education, Medical, Graduate/standards; Emergency Medicine/education; Internship and Residency/standards; Curriculum; Humans; Models, Educational; Patient Care; Problem-Based Learning; United States
5.
Med Educ; 36(10): 942-8, 2002 Oct.
Article in English | MEDLINE | ID: mdl-12390462

ABSTRACT

INTRODUCTION: Increasing attention is being directed towards finding ways of assessing how well doctors perform in clinical practice. Current approaches rely on strategies directed at individuals only, but, in real life, doctors' work is characterised by multiple complex professional interactions. These interactions involve different kinds of teams and are embedded within the overall context and systems of care. In addition to individual factors, therefore, we propose that the performance of doctors in health care teams and systems will also affect the overall quality of patient care. Assessing these dimensions, however, poses a number of challenges.
STRATEGIES: Using the profile of a UK National Health Service surgeon as an example, the authors illustrate the team structures to which he or she may relate. These include formal teams, such as those found in the operating theatre, and those formed through various professional and collegial partnerships. The authors then propose a model for assessing doctors' performance in teams and systems, which incorporates the educational principle of continuous feedback to enhance future performance.
DISCUSSION: To implement the proposed model, a wide range of professional, educational and regulatory bodies must collaborate. This has a number of important implications for the future roles and relationships of these bodies, which are discussed. A strong and constructive partnership will be essential if the full potential of a more inclusive and representative assessment approach is to be realised.


Subject(s)
Clinical Competence/standards; Education, Medical/standards; Patient Care Team/standards; Physicians, Family/standards; Humans; Interprofessional Relations; Quality of Health Care/standards; United Kingdom