Results 1 - 4 of 4
1.
Pract Radiat Oncol ; 3(1): 74-78, 2013.
Article in English | MEDLINE | ID: mdl-24674266

ABSTRACT

PURPOSE: Oral examinations are used in certifying examinations by many medical specialty boards. They represent daily clinical practice situations more realistically than do written or computer-based tests. However, concerns have repeatedly been raised in the literature regarding objectivity, fairness, extraneous factors arising from interpersonal interactions, item bias, reliability, and validity. In this study, the reliability of the oral component of the radiation oncology certifying examination administered in May 2010 was analyzed.

METHODS AND MATERIALS: One hundred fifty-two candidates rotated through 8 examination stations. Each station consisted of a hotel room equipped with a computer and software that displayed images appropriate to the content areas. Each candidate had a 25-30 minute face-to-face encounter with an oral examiner who was a content expert in one of the following areas: gastrointestinal; gynecology; genitourinary; lymphoma/leukemia/transplant/myeloma; head/neck/skin; breast; central nervous system/pediatrics; or lung/sarcoma. This design is typically referred to as a repeated measures (subject-by-treatment) design, although the oral examination was a routine event without any experimental manipulation.

RESULTS: The reliability coefficient was obtained by applying Feldt and Charter's simple computational alternative to analysis-of-variance formulas, which yielded a KR-20, or Cronbach's coefficient alpha, of 0.81.

CONCLUSIONS: An experimental design to develop a blueprint to improve the consistency of evaluation is suggested.
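The abstract reports a reliability coefficient computed via Feldt and Charter's shortcut, which is numerically equivalent to Cronbach's coefficient alpha. As a minimal sketch of that equivalent computation (the score matrix below is invented for illustration, not the examination's actual data):

```python
# Hedged sketch: the study used Feldt and Charter's computational
# alternative to ANOVA formulas; this shows the equivalent standard
# Cronbach's alpha on a candidates-by-stations score matrix.

def variance(xs):
    # Sample variance; alpha is unchanged as long as the same
    # estimator is used for item and total variances.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(scores):
    # scores: one row per candidate, one column per examination station
    k = len(scores[0])
    stations = list(zip(*scores))            # per-station score columns
    item_var = sum(variance(col) for col in stations)
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)
```

When every station ranks candidates identically, the formula returns 1.0; diverging station scores pull alpha toward 0, so the reported 0.81 indicates substantial inter-station consistency.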

2.
J Am Coll Radiol ; 9(2): 121-8, 2012 Feb.
Article in English | MEDLINE | ID: mdl-22305698

ABSTRACT

The ABR performs a practice analysis every 3 years, in accordance with its strategic plan, in an effort to strengthen the content validity of its qualifying and certifying examinations as well as its maintenance of certification examinations. A nationwide survey of diagnostic radiologists was conducted in July 2010 to determine the critically important and frequently performed activities in 12 clinical categories. The survey instrument was distributed electronically to 17,721 members of the ACR, with a unique identification code for each individual. A 5-point scale was established for both the frequency and importance variables. The frequency scale ranged from 1 to 5 as follows: 1 = not applicable, 2 = occasionally, 3 = monthly, 4 = weekly, and 5 = daily. The importance scale also ranged from 1 to 5: 1 = not applicable, 2 = not important, 3 = somewhat important, 4 = important, and 5 = essential. A total of 2,909 diagnostic radiologists (19.32%) participated. Of these, 2,233 (76.76%) indicated that they spent ≥50% of their time in clinical practice. Because of the brevity of its list of activities, the gastrointestinal category is presented in this article. As illustrated, the list of activities weighted according to importance and frequency could become the foundation for developing a more detailed blueprint for the gastrointestinal category of the certifying examinations in diagnostic radiology. Findings on demographic information are also presented.
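The abstract describes weighting activities by importance and frequency but does not give the weighting formula. A minimal sketch, assuming a simple product of mean frequency and mean importance ratings on the two 1-5 scales described above (the activity names and responses below are invented for illustration, not the ABR's actual data or method):

```python
# Hypothetical sketch: rank survey activities by the product of their
# mean frequency and mean importance ratings. The weighting scheme is
# an assumption for illustration; the article's method may differ.

def mean(xs):
    return sum(xs) / len(xs)

def rank_activities(responses):
    # responses: {activity: [(frequency, importance), ...]}, one
    # (frequency, importance) pair per survey respondent
    weight = {
        act: mean([f for f, _ in pairs]) * mean([i for _, i in pairs])
        for act, pairs in responses.items()
    }
    return sorted(weight, key=weight.get, reverse=True)
```

Activities rated both frequent (weekly/daily) and important (important/essential) rise to the top of such a list, which is the property an examination blueprint needs.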


Subject(s)
Certification; Practice Patterns, Physicians'/statistics & numerical data; Professional Practice/statistics & numerical data; Radiology/education; Radiology/standards; Specialty Boards; Workload/statistics & numerical data; Data Collection; Educational Measurement; United States
3.
J Am Coll Radiol ; 8(3): 199-202, 2011 Mar.
Article in English | MEDLINE | ID: mdl-21371671

ABSTRACT

This report was prepared by those closely involved in the radiation oncology initial qualification examinations. Its primary purpose is to disseminate information concerning the test preparation, test administration, scoring, and reporting processes of the ABR. The authors hope that this information will be helpful to radiology residents, program directors, and other interested parties.


Subject(s)
Education, Medical, Graduate/standards; Educational Measurement/standards; Radiology/education; Radiology/standards; Certification; Clinical Competence; Curriculum; Humans; Internship and Residency; Specialty Boards; United States; Writing
4.
AJR Am J Roentgenol ; 195(1): 10-2, 2010 Jul.
Article in English | MEDLINE | ID: mdl-20566793

ABSTRACT

The purpose of this article is to inform radiology residents, program directors, and other interested parties of the processes involved in developing, administering, and scoring the American Board of Radiology (ABR) diagnostic radiology qualifying (written) examinations. The residents, once certified, will have a lifelong professional relationship with the ABR. It is important for the ABR to be transparent about the processes it uses to ensure that its examinations are fair, valid, and reliable so that residents and their program directors have accurate information about these high-stakes examinations.


Subject(s)
Certification; Education, Medical, Graduate/standards; Educational Measurement/standards; Radiology/education; Radiology/standards; Clinical Competence; Curriculum; Humans; Internship and Residency; Specialty Boards; United States; Writing