1.
J Dent Educ ; 81(8): 978-985, 2017 Aug.
Article in English | MEDLINE | ID: mdl-28765442

ABSTRACT

Critical thinking skills are essential for the successful dentist, yet few explicit skillsets in critical thinking have been developed and published in peer-reviewed literature. The aims of this article are to 1) offer an assessable critical thinking teaching model with the expert's thought process as the outcome, learning guide, and assessment instrument and 2) offer three critical thinking skillsets following this model: for geriatric risk assessment, technology decision making, and situation analysis/reflections. For the objective component, the student demonstrates delivery of each step in the thought process. For the subjective component, the student is judged to have grasped the principles as applied to the patient or case. This article describes the framework and the results of pilot tests in which students in one year at this school used the model in the three areas, earning scores of 90% or above on the assessments. The model was thus judged successful in enabling students to demonstrate critical thinking skillsets in the course settings. Students consistently delivered each step of the thought process and were nearly as consistent in grasping the principles behind each step. As more critical thinking skillsets are implemented, a reinforcing network develops.
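As a reading aid, here is a minimal sketch of the two-component scoring logic the abstract describes, assuming an objective checklist plus a subjective rater judgment; the step names, the equal weighting, and the function itself are illustrative assumptions, not the published instrument:

def score_skillset(steps_delivered, principles_grasped):
    # Objective component: fraction of expert thought-process steps delivered.
    objective = sum(steps_delivered.values()) / len(steps_delivered)
    # Subjective component: fraction of steps whose underlying principles
    # the rater judged the student to have grasped.
    subjective = sum(principles_grasped.values()) / len(principles_grasped)
    # Equal weighting is an assumption made for illustration only.
    return 0.5 * objective + 0.5 * subjective

steps = {"gather history": True, "identify risks": True, "plan care": True}
grasp = {"gather history": True, "identify risks": False, "plan care": True}
print(score_skillset(steps, grasp))  # 0.833..., i.e. below a 90% cut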


Subject(s)
Education, Dental/methods , Educational Measurement , Learning , Models, Educational , Students, Dental/psychology , Thinking , Aged , Clinical Decision-Making , Geriatric Assessment , Humans , Risk Assessment
2.
J Dent Educ ; 78(3): 359-67, 2014 Mar.
Article in English | MEDLINE | ID: mdl-24609338

ABSTRACT

Introducing critical thinking and evidence-based dentistry (EBD) content into an established dental curriculum can be a difficult and challenging process. Over the past three years, the University of Iowa College of Dentistry has developed and implemented a progressive four-year integrated critical thinking and EBD curriculum. The objective of this article is to describe the development and implementation process to make it available as a model for other dental schools contemplating introduction of critical thinking and EBD into their curricula. The newly designed curriculum builds on an existing problem-based learning foundation that introduces critical thinking and the scientific literature in the D1 year, exposes students to the rationale and resources for practicing EBD in the D2 and D3 years, and provides opportunities to practice critical thinking and apply the five-step EBD process in the D2, D3, and D4 years. All curricular content is online, and D3 and D4 EBD activities are integrated within existing clinical responsibilities. The curricular content, student resources, and student activities are described.


Subject(s)
Curriculum , Education, Dental , Evidence-Based Dentistry/education , Learning , Teaching/methods , Thinking , Clinical Competence , Computer-Assisted Instruction , Educational Measurement/methods , Feedback , Humans , Iowa , Models, Educational , Online Systems , Problem-Based Learning , Program Development , Program Evaluation , Teaching Materials
3.
J Dent Educ ; 75(2): 160-8, 2011 Feb.
Article in English | MEDLINE | ID: mdl-21293038

ABSTRACT

A concise overview of an institution's aspirations for its students becomes increasingly elusive as dental education's emphases evolve toward priorities such as critical thinking and adaptation to new technology. The purpose of this article is to offer a learner-oriented matrix that gives a focus for discussion and an overview of an institution's educational outcomes. On one axis of the matrix, common educational outcomes are listed: knowledge, technical skills, critical thinking, ethical and professional values, patient and practice management, and social responsibility awareness. On the other axis, methodologies are listed: definition, cultivation strategies, measures (summative/formative, objective/subjective), institutional coordination, and competency determination. Completing the matrix yields an overview of the process by which students reach these outcomes. Each institution would likely complete the matrix differently and, ideally, with active discussion. While the matrix can first be used to establish "Where are we now?" for an institution, it can also be a starting point for more extensive matrices and further discussion. Vertical and horizontal analyses of the matrix provide a unique lens for viewing the institution's learning environment.
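For concreteness, a minimal sketch of the matrix as a data structure; the two axes are taken from the abstract, while the placeholder cell text and helper functions are illustrative assumptions:

outcomes = ["knowledge", "technical skills", "critical thinking",
            "ethical and professional values",
            "patient and practice management",
            "social responsibility awareness"]
methodologies = ["definition", "cultivation strategies", "measures",
                 "institutional coordination", "competency determination"]

# Each cell starts empty; an institution fills it in through discussion.
matrix = {o: {m: "TBD" for m in methodologies} for o in outcomes}

def horizontal(outcome):
    # One outcome read across all methodologies.
    return matrix[outcome]

def vertical(methodology):
    # One methodology read across all outcomes.
    return {o: matrix[o][methodology] for o in outcomes}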


Subject(s)
Education, Dental , Learning , Models, Educational , Students, Dental , Clinical Competence , Curriculum , Dental Care , Dentist-Patient Relations , Education, Dental/standards , Educational Measurement , Educational Technology , Ethics, Dental , Evidence-Based Dentistry/education , Faculty, Dental , Humans , Practice Management, Dental , Schools, Dental/standards , Social Responsibility , Social Values , Teaching/methods , Thinking
4.
Teach Learn Med ; 22(4): 241-5, 2010 Oct.
Article in English | MEDLINE | ID: mdl-20936568

ABSTRACT

BACKGROUND: The medical education research literature consistently recommends a structured format for the medical school preadmission interview. There is, however, little direct evidence to support this recommendation. PURPOSE: To shed further light on this issue, the present study examines the respective reliability contributions from the structured and unstructured interview components at the University of Iowa. METHODS: We conducted three univariate G studies on ratings from 3,043 interviews and one multivariate G study using responses from 168 applicants who interviewed twice. RESULTS: When interrater and test-retest reliability were examined, the unstructured format proved more reliable in both instances. Yet combining measures from the two interview formats yielded a more reliable score than using either alone. CONCLUSIONS: At least from a reliability perspective, the popular advice regarding interview structure may need to be reconsidered. Issues related to validity, fairness, and reliability should be carefully weighed when designing the interview process.
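For readers unfamiliar with generalizability theory, a minimal sketch of the G-coefficient logic behind such studies, using made-up variance components rather than the study's estimates:

def g_coefficient(var_person, var_error, n_conditions):
    # Applicant (true-score) variance over true-plus-error variance;
    # averaging over n raters or occasions shrinks the error term.
    return var_person / (var_person + var_error / n_conditions)

# Invented components for illustration only.
print(g_coefficient(var_person=0.30, var_error=0.50, n_conditions=2))
print(g_coefficient(var_person=0.40, var_error=0.50, n_conditions=2))

Combining the two formats behaves like adding conditions over which error is averaged, which is one intuition for why the composite score was more reliable than either format alone.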


Subject(s)
Education, Medical/standards , Interviews as Topic/methods , School Admission Criteria , Educational Measurement , Educational Status , Humans , Models, Statistical , Multivariate Analysis , Reproducibility of Results , United States
5.
Eval Health Prof ; 33(3): 365-85, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20801977

ABSTRACT

For medical schools, the increasing presence of women makes it especially important that potential sources of gender bias be identified and removed from student evaluation methods. Our study looked for patterns of gender bias in adjective data used to inform our Medical Student Performance Evaluations (MSPEs). Multigroup confirmatory factor analysis (CFA) was used to model the latent structure of the adjectives attributed to students (n = 657) and to test for systematic scoring errors by gender. Gender bias was evident in two areas: (a) women were more likely than comparable men to be described as "compassionate," "sensitive," and "enthusiastic," and (b) men were more likely than comparable women to be seen as "quick learners." The gender gap in "quick learner" attribution grows with increasing student proficiency; men's rate of increase is over twice that of women. Technical and nontechnical approaches for ameliorating the impact of gender bias on student recommendations are suggested.
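The authors used multigroup CFA; as a simpler stand-in, the sketch below checks the same interaction pattern (the "quick learner" gap growing with proficiency) with a logistic regression on simulated data. The data, column names, and effect sizes are all invented for illustration:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
proficiency = rng.integers(1, 5, n)
is_male = rng.integers(0, 2, n)
# Simulated bias pattern: attribution rises with proficiency, faster for men.
logit_p = -2.0 + 0.4 * proficiency + 0.4 * proficiency * is_male
p = 1.0 / (1.0 + np.exp(-logit_p))
df = pd.DataFrame({
    "quick_learner": rng.binomial(1, p),
    "proficiency": proficiency,
    "is_male": is_male,
})

# A positive proficiency:is_male coefficient echoes the reported pattern.
fit = smf.logit("quick_learner ~ proficiency * is_male", data=df).fit()
print(fit.params)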


Subject(s)
Educational Measurement/methods , Gender Identity , Prejudice , Students, Medical/statistics & numerical data , Confidence Intervals , Data Interpretation, Statistical , Educational Status , Female , Humans , Likelihood Functions , Male , Multivariate Analysis , Sex Factors , Task Performance and Analysis
6.
Adv Health Sci Educ Theory Pract ; 11(2): 145-53, 2006 May.
Article in English | MEDLINE | ID: mdl-16729242

ABSTRACT

BACKGROUND: Grading standards vary widely across undergraduate institutions. If, during the medical school admissions process, GPA is considered without reference to the institution attended, it will disadvantage applicants from undergraduate institutions employing rigorous grading standards. METHOD: A regression-based GPA institutional equating method using historical MCAT and GPA information is described. Classes selected from eight applicant pools demonstrate the impact of the GPA adjustment. The validity of the adjustment is examined by comparing the correlation of adjusted and unadjusted GPAs with USMLE scores and medical school grades. RESULTS: The adjusted GPA demonstrated significantly improved congruence with MCAT estimates of applicant preparedness. The adjustment changed selection decisions for 21% of those admitted. The adjusted GPA enhanced prediction of USMLE scores and medical school grades only for students from institutions that required large adjustments. CONCLUSION: Unlike other indices, the adjustment described uses the same metric as GPA and is based only on an institution's history of preparing medical school applicants. The institutional adjustment is consequential in selection, significantly enhances congruence with a standardized measure of academic preparedness, and may enhance the validity of the GPA.
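A minimal sketch of a regression-based equating step in the spirit of the method described, assuming a pooled MCAT-to-GPA regression and a per-institution mean residual; the data, pooling strategy, and numbers are illustrative assumptions, not the published procedure:

import numpy as np

def institution_adjustment(hist_gpa, hist_mcat, pooled_slope, pooled_intercept):
    # Mean amount by which an institution's historical GPAs exceed what
    # the pooled MCAT->GPA regression predicts (lenient grading > 0).
    predicted = pooled_intercept + pooled_slope * np.asarray(hist_mcat)
    return float(np.mean(np.asarray(hist_gpa) - predicted))

def adjusted_gpa(gpa, adjustment):
    # Subtract the leniency estimate so applicants sit on a common metric.
    return gpa - adjustment

# Toy pooled regression fit and one institution's historical applicants.
slope, intercept = 0.08, 1.2
adj = institution_adjustment([3.9, 3.8, 3.7], [28, 27, 26], slope, intercept)
print(adjusted_gpa(3.85, adj))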


Subject(s)
Education, Medical, Undergraduate , Educational Measurement/standards , School Admission Criteria , Humans , United States
7.
Teach Learn Med ; 18(1): 4-8, 2006.
Article in English | MEDLINE | ID: mdl-16354132

ABSTRACT

BACKGROUND: Researchers generally recommend a structured format for the medical school preadmission interview (MSPI). However, the relative benefits of various elements of structure remain unexamined. PURPOSE: In this study, we compared the performance of a highly structured interview format with a semistructured format. Specifically, we examined how the reliability of interview ratings is likely to change when using the same versus different questions for each applicant being interviewed. METHOD: Variance components from a generalizability (G) study of a structured interview are used in decision studies to compare the relative efficiency of using the same versus different questions for each applicant. RESULTS: Using different questions for each interviewee is practically as reliable as using the same questions for all applicants (G = .55 vs. .57, respectively). CONCLUSIONS: Because there are a number of drawbacks to using the same questions for all applicants (i.e., security and validity) and little advantage in terms of increased reliability, the semistructured question format should be considered when conducting the MSPI. One suggested way to implement a semistructured interview is to present each applicant with a set of questions randomly drawn from a pool of interview questions.
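A minimal sketch of the decision-study comparison, assuming standard G-theory error terms: with the same questions for everyone (crossed design), question difficulty drops out of the relative error, while with different questions per applicant (nested design) it joins the error. The variance components are invented to echo the reported .57 vs. .55 pattern, not taken from the study:

def g_crossed(var_p, var_pq, n_q):
    # Same questions for all applicants: person x question crossed design.
    return var_p / (var_p + var_pq / n_q)

def g_nested(var_p, var_q, var_pq, n_q):
    # Different questions per applicant: question variance joins the error.
    return var_p / (var_p + (var_q + var_pq) / n_q)

# Invented components chosen to echo the reported pattern.
var_p, var_q, var_pq, n_q = 0.40, 0.16, 1.80, 6
print(g_crossed(var_p, var_pq, n_q))        # ~0.57
print(g_nested(var_p, var_q, var_pq, n_q))  # ~0.55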


Subject(s)
School Admission Criteria , Schools, Medical , Students, Medical , Educational Measurement , Humans , Interviews as Topic , Reproducibility of Results , Surveys and Questionnaires
8.
Article in English | MEDLINE | ID: mdl-15141132

ABSTRACT

PURPOSE: Determining the valid and fair use of the interview for medical school admissions is contingent upon a demonstration of the reproducibility of interview scores. This study seeks to establish the generalizability of interview scores, first assessing the existing research evidence, and then analyzing data from a non-experimental independent replications research design. METHODS: Multivariate and univariate generalizability analyses are conducted using data from a structured interview obtained from a population of medical school applicants over two years. RESULTS: The existing literature does not provide sufficient evidence regarding interview reliability. In this study, interview scores derived from a standardized interview were found to display low to moderate levels of reliability. Interview scores do not appear to possess the level of precision found with other measures commonly used to facilitate admissions decisions. DISCUSSION/CONCLUSION: Given the results obtained, the fairness of using the interview as a highly influential component of the admission process is called into question. Methods for using interview data in a psychometrically defensible fashion are discussed. Specifically, attention to decision reliability provides guidance on how interview scores can best be integrated into the admissions process.


Subject(s)
Education, Medical, Undergraduate , Interviews as Topic/standards , School Admission Criteria , Educational Measurement , Humans , Multivariate Analysis , Reproducibility of Results
9.
Teach Learn Med ; 15(2): 116-22, 2003.
Article in English | MEDLINE | ID: mdl-12708069

ABSTRACT

PURPOSE: The underrepresentation of certain minorities within medical education and the medical profession continues to be a problem. A review of the relevant research literature suggests current strategies are inadequate to address this important problem. Psychometric issues that distinguish the unique concerns of medical education must be defined. SUMMARY: A new model that may attain diversity goals and meet standards related to validity and legality is presented. Admissions data describing applicants for one year at a large midwestern medical college are analyzed. The impact of selection techniques on both majority and underrepresented minority applicants is presented. CONCLUSION: The results of the analyses support further research designed to meet the quantitative objectives implied by diversity goals and suggest initiatives aimed at enhancing minority representation within medical education.


Subject(s)
Cultural Diversity , Education, Medical , Minority Groups , School Admission Criteria , Schools, Medical/organization & administration , Humans , Psychometrics , United States
10.
Teach Learn Med ; 14(1): 29-33, 2002.
Article in English | MEDLINE | ID: mdl-11865746

ABSTRACT

BACKGROUND: An earlier study demonstrated that constrained optimization could be used to accurately translate admission goals. Constrained optimization differs from weighting equations in that it does not assign a rank ordering. Although constrained optimization is conceptually superior, some procedures within the admissions process require a rank ordering of applicants. PURPOSE: The purpose of this study is to describe and evaluate the use of two discriminant analysis procedures to obtain a rank-order list by generating weights that model constrained optimization procedures. This study also evaluates the usefulness of an additional method that does not require a rank ordering. METHODS: Premium Solver selected a class from the 1998/99 applicant pool. A discriminant analysis was used to generate a discriminant function for modeling the dichotomous group classification selection variable. These weights were then applied and the discriminant function values calculated. The success of the procedure was evaluated by examining rank orders and the magnitude of the correlation and R-square statistic. RESULTS: Discriminant analysis accounted for 70% of the decision variance generated by the constrained optimization procedure. Using real data allowed an estimate of the number of students affected by inconsistent outcomes. CONCLUSION: Discriminant analysis could be used to manage an alternate list; however, it will be based on somewhat different criteria than the initial selection procedure. Each method evaluated has advantages and disadvantages, and the selection of one method over another depends on what outcomes are most valued by the college.
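A minimal sketch of the modeling step described above: fit a discriminant function to the accept/reject labels produced by a constrained selection, then check how well the resulting scores reproduce those decisions. The selection rule and data here are simulated stand-ins, not the study's Premium Solver run:

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))  # e.g. GPA, MCAT, interview score (assumed)

# Toy "constrained optimization": admit the top 100 on a composite,
# subject to a floor on the third predictor.
composite = X @ np.array([0.5, 0.4, 0.1])
eligible = X[:, 2] > -1.0
order = np.argsort(np.where(eligible, composite, -np.inf))[::-1]
admitted = np.zeros(n, dtype=bool)
admitted[order[:100]] = True

# Discriminant weights modeling the dichotomous selection variable.
lda = LinearDiscriminantAnalysis().fit(X, admitted)
scores = X @ lda.coef_.ravel()

# R-square between discriminant scores and the selection decisions.
r = np.corrcoef(scores, admitted.astype(float))[0, 1]
print("R-square:", r ** 2)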


Subject(s)
Discriminant Analysis , School Admission Criteria , Schools, Medical/organization & administration , Iowa