1.
Evol Med Public Health ; 11(1): 353-362, 2023.
Article in English | MEDLINE | ID: mdl-37881688

ABSTRACT

Background and objectives: Universities throughout the USA increasingly offer undergraduate courses in evolutionary medicine (EvMed), which creates a need for pedagogical resources. Several resources offer course content (e.g. textbooks), and a previous study identified EvMed core principles to help instructors set learning goals. However, assessment tools are not yet available. In this study, we address this need by developing an assessment that measures students' ability to apply EvMed core principles to various health-related scenarios.

Methodology: The EvMed Assessment (EMA) consists of questions containing a short description of a health-related scenario followed by several likely/unlikely items. We evaluated the assessment's validity and reliability using a variety of qualitative (expert reviews and student interviews) and quantitative (Cronbach's α and classical test theory) methods. We iteratively revised the assessment through several rounds of validation. We then administered the assessment to undergraduates in EvMed and Evolution courses at multiple institutions.

Results: We used results from the pilot to create the EMA final draft. After conducting quantitative validation, we deleted items that failed to meet performance criteria and revised items that exhibited borderline performance. The final version of the EMA consists of six core questions containing 25 items, and five supplemental questions containing 20 items.

Conclusions and implications: The EMA is a pedagogical tool supported by a wide range of validation evidence. Instructors can use it as a pre/post measure of student learning in an EvMed course to inform curriculum revision, or as a test bank to draw upon when developing in-class assessments, quizzes or exams.
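The abstract above cites Cronbach's α as its quantitative reliability statistic. As a rough illustrative sketch only (not the authors' code, and using made-up data), α can be computed for a matrix of student responses as k/(k-1) · (1 - Σ item variances / variance of total scores):

```python
# Illustrative sketch of Cronbach's alpha, the internal-consistency
# statistic mentioned in the abstract. Data below are hypothetical
# likely/unlikely (0/1) responses, NOT from the EMA study.
from statistics import pvariance

def cronbach_alpha(scores):
    """scores: list of per-student lists of item scores."""
    k = len(scores[0])                        # number of items
    items = list(zip(*scores))                # transpose to per-item columns
    item_var = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical matrix: 4 students answering 3 binary items.
responses = [
    [1, 1, 1],
    [1, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
]
print(round(cronbach_alpha(responses), 3))  # → 0.6
```

Values closer to 1 indicate that the items measure a common construct consistently; instruments are typically revised when α falls below a chosen threshold, which is the kind of item-level pruning the Results paragraph describes.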

2.
CBE Life Sci Educ ; 22(4): ar41, 2023 12.
Article in English | MEDLINE | ID: mdl-37751506

ABSTRACT

Researchers who study student acceptance of evolution rely on surveys that are designed to measure evolution acceptance. It is important for these surveys to measure evolution acceptance accurately and in isolation from other constructs, so that researchers can accurately determine what leads to low acceptance. The Inventory of Student Evolution Acceptance (I-SEA) and the Generalized Acceptance of EvolutioN Evaluation (GAENE) are two surveys that were developed to improve upon the limitations of earlier surveys. Yet neither survey has been extensively tested for response process validity, which can assess the extent to which students use constructs other than their acceptance of evolution to answer survey items. In this study, we examined the response process validity of the I-SEA and GAENE by conducting cognitive interviews with 60 undergraduate students. Interviews revealed that both surveys retain certain response process issues. The I-SEA conflated knowledge about and acceptance of evolution for a subset of students. The GAENE measured evolution acceptance inconsistently because students interpreted "evolution" in different ways; it also measured willingness to advocate for evolution in addition to acceptance. Researchers can use these findings to better inform their survey choice when designing future studies, and to further improve the measurement of evolution acceptance.


Subject(s)
Knowledge , Students , Humans , Surveys and Questionnaires
3.
CBE Life Sci Educ ; 21(1): ar10, 2022 03.
Article in English | MEDLINE | ID: mdl-35044845

ABSTRACT

Hundreds of articles have explored the extent to which individuals accept evolution, and the Measure of Acceptance of the Theory of Evolution (MATE) is the most often used survey. However, research indicates the MATE has limitations, and it has not been updated since its creation more than 20 years ago. In this study, we revised the MATE using information from cognitive interviews with 62 students that revealed response process errors with the original instrument. We found that students answered items on the MATE based on constructs other than their acceptance of evolution, which led to answer choices that did not fully align with their actual acceptance. Specifically, students answered items based on their understanding of evolution and the nature of science, as well as on differing definitions of evolution. We revised items on the MATE, conducted 29 cognitive interviews on the revised version, and administered it to 2881 students in 22 classes. We provide response process validity evidence for the new measure through cognitive interviews with students, structural validity evidence through a Rasch dimensionality analysis, and concurrent validity evidence through correlations with other measures of evolution acceptance. Researchers can now measure student evolution acceptance using this new version of the survey, which we have called the MATE 2.0.


Subject(s)
Students , Humans , Psychometrics , Reproducibility of Results , Surveys and Questionnaires