Effects of mode and medium in reading comprehension tests on cognitive load
Computers & Education; 192, 2023.
Article in English | Web of Science | ID: covidwho-2121113
ABSTRACT
Monitoring students' competences is vital in order to inform teachers, administrators, and policymakers about students' learning progress and achievement heterogeneity. For educational researchers, it is a matter of accountability to use state-of-the-art designs to make assessments as reliable, valid, and efficient as possible. Digital media have recently been described as a "third space" for learning, located between learning at home and at school (McDougall & Potter, 2019; Potter & McDougall, 2017). This view has gained new relevance in light of the COVID-19 pandemic, which forced schools across the globe into distance learning and thus further increased reliance on computers for schooling (Helm et al., 2021). Large-scale assessments such as PIRLS and PISA have started to incorporate computers into their test administration (Hußmann et al., 2017; Yamamoto, Shin, & Khorramdel, 2019). Administering tests on the computer comes with the option to use testing modes that adapt to each examinee's ability (Davey, 2011; Frey et al., 2017).

Reading comprehension assessments in elementary school are of particular interest, because reading comprehension affects students' educational futures and their ability to participate in society and life in general (OECD, 2019; Wigfield et al., 2016). As such, it is one of the most important skills that children are taught in elementary school. There, children move from learning individual letters to decoding the meaning of words, on to understanding the content of a sentence, paragraph, or text (Becker, McElvany, & Kortenbruck, 2010). Between grades two and three, children tend to begin reading fluently (see Chall, 1983). Still, reading ability in fourth-grade students is considerably heterogeneous, with only 34% of US students participating in the National Assessment of Educational Progress (NAEP) reaching reading proficiency (as defined by the NAEP), and 35% not reaching the basic reading level (National Center for Education Statistics, 2019).

Computer-based tests (CBTs) and computer-adaptive tests (CATs) can potentially improve the assessment of reading comprehension over paper-and-pencil tests (PPTs). CBTs have higher technical demands, but offer more control over the test situation (i.e., exposure control or time limits) and provide additional quality-control data (e.g., rapid-guessing detection) and process data (e.g., log files) compared to PPTs. In addition, adaptive item selection can make a test design much more efficient (Davey, 2011). However, for beginning readers, the effects of reading on screen versus reading on paper, and the potential effects of adaptive item selection that adjusts to individual ability, are not well researched. Existing research has focused on test score equivalence between different formats of test administration; research into the test experience itself has been limited. Inspired by basic research on instructional design and cognitive psychology, it has been suggested that taking a CBT, CAT, or PPT can lead to different test experiences (Colwell, 2013; Ortner, Weisskopf, & Koch, 2014). An important aspect of the test experience is an examinee's cognitive load and its relation to working memory. Reading comprehension is closely related to working memory (Daneman & Carpenter, 1980), as comprehension relies on remembering previous words in a sentence or paragraph. This holds true for elementary school children (Seigneuric, Ehrlich, Oakhill, & Yuill, 2000).

Cognitive load is the strain that carrying out a task, such as learning or reading, puts on a person's working memory (Sweller, van Merriënboer, & Paas, 2019). Cognitive load theory (CLT) is often applied in instructional […]
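The abstract mentions that adaptive item selection can make a test design more efficient, but does not describe the mechanics. As a purely illustrative aside (not taken from the paper), the following Python sketch shows the core loop of a computer-adaptive test under an assumed two-parameter logistic (2PL) IRT model: at each step, the not-yet-administered item with maximum Fisher information at the current ability estimate is presented. The item pool, its parameter values, and the simplistic ability update are hypothetical placeholders.

# Illustrative sketch (not from the paper): maximum-information item selection,
# the core idea behind computer-adaptive testing (CAT), under a 2PL IRT model.
# Item parameters and the ability-update rule are invented for demonstration.

import math

def p_correct(theta, a, b):
    """2PL probability of a correct response given ability theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items, administered):
    """Pick the not-yet-administered item with maximum information at theta."""
    candidates = [i for i in range(len(items)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, *items[i]))

# Toy item pool: (discrimination a, difficulty b) pairs -- hypothetical values.
pool = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5)]

theta, administered = 0.0, set()
for _ in range(3):
    i = next_item(theta, pool, administered)
    administered.add(i)
    correct = True  # in a real CAT this would be the examinee's scored response
    # Crude ability update for illustration; operational CATs use ML/EAP estimation.
    theta += 0.5 if correct else -0.5
    print(f"administered item {i}, updated theta = {theta:.2f}")

Operational CAT systems replace the crude fixed-step update with maximum-likelihood or EAP ability estimation and add exposure control and content balancing; targeting items to the current ability estimate is what yields the efficiency gain over fixed-form tests referred to in the abstract.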
Keywords

Full text: Available | Collection: Databases of international organizations | Database: Web of Science | Type of study: Experimental Studies | Language: English | Journal: Computers & Education | Year: 2023 | Document Type: Article

