Results 1 - 4 of 4
1.
Acad Emerg Med ; 22(7): 838-44, 2015 Jul.
Article in English | MEDLINE | ID: mdl-26112031

ABSTRACT

OBJECTIVES: The Accreditation Council for Graduate Medical Education (ACGME) Milestones describe behavioral markers for the progressive acquisition of competencies during residency. As a key component of the Next Accreditation System, all residents are evaluated for the acquisition of specialty-specific Milestones. The objective was to determine the validity and reliability of the emergency medicine (EM) Milestones. METHODS: The ACGME and the American Board of Emergency Medicine performed this single-event observational study. The data included the initial EM Milestones performance ratings of all categorical EM residents submitted to the ACGME from October 31, 2013, to January 6, 2014. Mean performance ratings were determined for all 23 subcompetencies for every year of residency training. The internal consistency (reliability) of the Milestones was determined using a standardized Cronbach's alpha coefficient. Exploratory factor analysis was conducted to determine how the subcompetencies were interrelated. RESULTS: EM Milestone performance ratings were obtained for 100% of EM residents (n = 5,805) from 162 residency programs. The mean performance ratings of the aggregate and individual subcompetency scores showed discrimination between residency years, and the factor structure further supported the validity of the EM Milestones. The reliability was α = 0.96 within each year of training. CONCLUSIONS: The EM Milestones demonstrated validity and reliability as an assessment instrument for competency acquisition. EM residents can be assured that this evaluation process is valid and reliable; faculty can be confident that the Milestones are psychometrically sound; and stakeholders can know that the Milestones are a nationally standardized, objective measure of specialty-specific competency acquisition.
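The reliability figure above is a standardized Cronbach's alpha across the 23 subcompetencies. As an illustration only (the sample size, rating values, and correlation structure below are invented, not the study's data), a minimal sketch of computing a standardized alpha from a residents-by-subcompetencies rating matrix:

```python
import numpy as np

def standardized_cronbach_alpha(ratings: np.ndarray) -> float:
    """Standardized Cronbach's alpha from an (n_residents, n_items) matrix.

    Uses the mean inter-item correlation r_bar:
        alpha = k * r_bar / (1 + (k - 1) * r_bar)
    """
    k = ratings.shape[1]
    corr = np.corrcoef(ratings, rowvar=False)   # k x k correlation matrix
    off_diag = corr[~np.eye(k, dtype=bool)]     # drop the diagonal of 1s
    r_bar = off_diag.mean()
    return k * r_bar / (1 + (k - 1) * r_bar)

# Hypothetical data: 200 residents rated on 23 subcompetencies, with a
# shared ability factor so the items correlate (as real ratings would).
rng = np.random.default_rng(0)
ability = rng.normal(size=(200, 1))
ratings = ability + 0.5 * rng.normal(size=(200, 23))
print(round(standardized_cronbach_alpha(ratings), 2))
```

With many correlated items, alpha approaches 1, which is why a 23-item instrument with a strong common factor can plausibly reach the α = 0.96 reported.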


Subject(s)
Accreditation/standards , Clinical Competence/standards , Educational Measurement/methods , Emergency Medicine/education , Internship and Residency/standards , Humans , Reproducibility of Results
2.
Acad Emerg Med ; 21(5): 532-7, 2014 May.
Article in English | MEDLINE | ID: mdl-24842504

ABSTRACT

OBJECTIVES: The American Board of Emergency Medicine (ABEM) Maintenance of Certification (MOC) program is a four-step process that includes the Continuous Certification (ConCert) examination. The ConCert examination is a validated, summative examination that assesses medical knowledge and clinical reasoning. ABEM began administering the ConCert examination in 1989. The ConCert examination must be passed at least every 10 years to maintain certification. This study was undertaken to determine longitudinal physician performance on the ConCert examination. METHODS: In this longitudinal review, ConCert examination performance was compared among residency-trained emergency physicians (EPs) over multiple examination cycles. Longitudinal analysis was performed using a growth curve model for unbalanced data to determine the growth trajectories of EP performance over time and whether medical knowledge changed. By including initial certification qualifying examination scores, the longitudinal analysis corrected for intrinsic variance in physician ability. RESULTS: There were 15,085 first-time testing episodes from 1989 to 2012 involving three examination cycles. The mean adjusted examination score for all physicians taking the ConCert examination in the first cycle was 85.9 (95% confidence interval [CI] = 85.8 to 85.9), the second-cycle mean score was 86.2 (95% CI = 86.0 to 86.3), and the third-cycle mean score was 85.4 (95% CI = 85.0 to 85.8). Using the first examination cycle as the reference, the growth curve model yielded a coefficient of +0.3 for the second cycle (p < 0.001) and -0.5 for the third cycle (p = 0.02). Initial qualifying (written) examination scores were significant predictors of ConCert examination scores. CONCLUSIONS: Over time, EP performance on the ConCert examination was maintained. These results suggest that EPs maintain medical knowledge over the course of their careers, as measured by a validated, summative medical knowledge assessment.
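The analysis above estimates cycle effects relative to the first cycle while adjusting for each physician's initial qualifying score. A simplified sketch of that idea, using ordinary least squares with cycle dummies on fully synthetic data (the real study used a growth curve model for unbalanced panels; every number below is invented, with true cycle shifts set to echo the reported +0.3 and -0.5):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical panel: 500 physicians, each with an initial qualifying
# score and between one and three ConCert cycles (unbalanced data).
n = 500
qual = rng.normal(85, 3, size=n)
rows = []
for i in range(n):
    for cycle in (1, 2, 3)[: rng.integers(1, 4)]:
        shift = {1: 0.0, 2: 0.3, 3: -0.5}[cycle]   # assumed true effects
        score = 0.5 * qual[i] + 43 + shift + rng.normal(0, 1)
        rows.append((qual[i], cycle, score))

qual_x, cycle, y = map(np.array, zip(*rows))

# Design matrix: intercept, qualifying score, dummies for cycles 2 and 3
X = np.column_stack([
    np.ones_like(y),
    qual_x,
    (cycle == 2).astype(float),
    (cycle == 3).astype(float),
])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print({"cycle2": round(beta[2], 2), "cycle3": round(beta[3], 2)})
```

Because the qualifying score enters the model, the cycle coefficients estimate within-ability change over time rather than differences in who retests, which is the point of the adjustment described in the abstract.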


Subject(s)
Certification/standards , Clinical Competence/standards , Emergency Medicine/standards , Adult , Certification/statistics & numerical data , Confidence Intervals , Emergency Medicine/statistics & numerical data , Humans , Longitudinal Studies , Middle Aged , United States
3.
J Emerg Med ; 45(6): 935-41, 2013 Dec.
Article in English | MEDLINE | ID: mdl-23937810

ABSTRACT

BACKGROUND: The Lifelong Learning and Self-assessment (LLSA) component of the American Board of Emergency Medicine (ABEM) Maintenance of Certification (MOC) program is a self-assessment exercise for physicians. Beginning in 2011, an optional continuing medical education (CME) activity was added. OBJECTIVES: As part of the CME activity option for the LLSA, a survey was used to determine the relevancy of the LLSA readings and the degree to which medical knowledge garnered through the LLSA activity would modify clinical care. METHODS: Survey results from the 2011 LLSA CME activity were reviewed. The survey was composed of seven items, including questions about the relevancy of the readings and the impact on the physician's clinical practice. The questions used a 5-point Likert scale, and the data underwent descriptive analysis. RESULTS: There were 2841 physicians who took the LLSA test during the study period, of whom 1354 (47.7%) opted to participate in the 2011 LLSA CME activity. All participants completed surveys. Respondents reported that the LLSA readings were relevant to the overall clinical practice of Emergency Medicine (69.6% strongly relevant, 28.1% some relevance, and 2.3% little or no relevance) and that the readings provided information likely to help them change their clinical practices (high likelihood 38.8%, some likelihood 53.0%, little or no change 8.2%). CONCLUSIONS: The LLSA component of the ABEM MOC program is relevant to the clinical practice of Emergency Medicine. Through this program, physicians gain new knowledge about the practice of Emergency Medicine, some of which is reported to change physicians' clinical practices.
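The descriptive analysis above collapses 5-point Likert responses into the three bands reported. A small sketch of that tallying step, on invented responses and with an assumed band mapping (4-5 = strong, 3 = some, 1-2 = little or none; the study's actual mapping is not stated in the abstract):

```python
from collections import Counter

# Hypothetical 5-point Likert responses (5 = most relevant)
responses = [5, 4, 5, 3, 5, 2, 4, 5, 5, 3]

# Assumed collapse into the three reported bands
bands = Counter(
    "strong" if r >= 4 else "some" if r == 3 else "little/none"
    for r in responses
)
total = len(responses)
summary = {band: round(100 * count / total, 1) for band, count in bands.items()}
print(summary)  # → {'strong': 70.0, 'some': 20.0, 'little/none': 10.0}
```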


Subject(s)
Attitude of Health Personnel , Education, Medical, Continuing/standards , Emergency Medicine/education , Adult , Certification/standards , Clinical Competence , Female , Humans , Male , Middle Aged , Prospective Studies , Surveys and Questionnaires
4.
Acad Emerg Med ; 20(7): 730-5, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23859587

ABSTRACT

OBJECTIVES: The Accreditation Council for Graduate Medical Education (ACGME) and the American Board of Medical Specialties sought to define milestones for skill and knowledge acquisition during residency training. Milestones are significant, objective, observable events. The milestones are listed within a structure derived from the ACGME general competencies; major groups of milestones are called "subcompetencies." The original 24 subcompetencies containing 255 milestones for emergency medicine (EM) were developed by a multiorganizational group representing most EM stakeholder groups. To ensure that the milestones reflected EM resident progress throughout training, the EM Milestones Working Group (EM MWG) sought to validate the individual milestones. METHODS: A computer-based survey was sent to all EM residency programs. The survey period began on April 30, 2012, and concluded on May 15, 2012. Respondents were asked to assign each milestone to a specific level of skill or knowledge acquisition, ranging from beginning resident to accomplished clinician. Two different survey forms divided the milestones into two groups of 12 subcompetencies each; forms were randomly assigned to programs. RESULTS: Five respondents (the program director and four key faculty) were requested from each of the 159 residency programs. Responses were received from 96 programs (60.4%). Of the 795 survey recipients, 28 were excluded due to prior exposure to the EM milestones. Of the remaining 767 potential respondents, 281 completed the survey (36.6%) within the 16-day period. Based on the survey results, the working group adjusted the milestones as follows: one entire subcompetency (teaching) was eliminated, six new milestones were created, 34 milestones were eliminated, 26 milestones were reassigned to a lower level, and 20 were reassigned to a higher level. Nineteen milestones were edited for greater clarity. The final result was 227 discrete milestones among 23 subcompetencies. CONCLUSIONS: The EM milestones were validated through a milestone assignment process using a computer-based survey completed by program directors and key faculty. Milestones were revised in accordance with the results to better align each milestone with its performance level.
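The reassignment step above compares where faculty respondents place each milestone against its currently proposed level. A toy sketch of one plausible decision rule (the median-based rule, milestone names, and all tallies below are assumptions for illustration; the working group's actual criteria are not described in the abstract):

```python
import statistics

def review(levels, proposed):
    """Assumed rule: reassign when the median respondent-assigned
    level differs from the currently proposed level."""
    median = statistics.median(levels)
    if median < proposed:
        return "reassign lower"
    if median > proposed:
        return "reassign higher"
    return "keep"

# Hypothetical tallies: milestone -> (levels assigned by faculty on a
# 1-5 scale, currently proposed level)
survey = {
    "M1": ([1, 1, 2, 1, 1], 2),   # faculty place it lower than proposed
    "M2": ([3, 3, 3, 4, 3], 3),   # agreement with the proposed level
    "M3": ([5, 4, 5, 5, 4], 4),   # faculty place it higher
}
for name, (levels, proposed) in survey.items():
    print(name, review(levels, proposed))
```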


Subject(s)
Accreditation/standards , Clinical Competence , Emergency Medicine/education , Internet , Cross-Sectional Studies , Education, Medical, Graduate/standards , Female , Humans , Internship and Residency/standards , Male , Quality of Health Care/standards , United States