Results 1 - 2 of 2
1.
Int J STEM Educ; 9(1): 16, 2022.
Article in English | MEDLINE | ID: mdl-35194544

ABSTRACT

BACKGROUND: In recent years, research on online learning platforms has grown rapidly in volume. More and more researchers use these platforms to run A/B tests on the impact of different designs, and multiple scientific communities have emerged around studying the big data these platforms make available. However, it is not yet fully understood how each type of research influences future scientific discourse within the broader field. To address this gap, this paper presents the first scientometric study of how researchers build on the contributions of these two types of online learning platform research (A/B testing and learning analytics), particularly in STEM education. We selected a pair of papers, one reporting an A/B test and the other a learning analytics (LA) study of data from an online STEM education platform, published in the same year, by the same research group, at the same conference. We then analyzed each paper that cited these two papers, coding from the paper text (with inter-rater reliability checks) the reason for each citation.

RESULTS: After statistically comparing the frequency of each category of citation between the two papers, we found that the A/B test paper was self-cited more often and that citing papers built directly on its work more frequently, whereas the LA paper was more often cited without discussion.

CONCLUSIONS: The A/B test paper therefore appeared to have had a larger impact on future work than the LA paper, even though the LA paper had a higher total citation count and a lower degree of self-citation. This paper also establishes a novel method for understanding how different types of research make different contributions to learning analytics and to the broader online learning research space of STEM education.

2.
Psychon Bull Rev; 18(4): 798-804, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21557027

ABSTRACT

Words have been shown to influence many cognitive tasks, including category learning. Most demonstrations of these effects have focused on instances in which words facilitate performance. One possibility is that words augment representations, predicting an across-the-board benefit of words during category learning. We propose instead that words shift attention to dimensions that have been historically predictive in similar contexts. Under this account, there should be cases in which words are detrimental to performance. The results of two experiments show that words impair learning of object categories under some conditions. Experiment 1 shows that words hurt performance when learning to categorize by texture. Experiment 2 shows that words also hurt when learning to categorize by brightness, leading participants to attend selectively to shape when both shape and hue could be used to correctly categorize stimuli. We suggest that both the positive and negative effects of words have developmental origins in the history of word usage during category learning. [corrected]


Subject(s)
Association Learning , Language , Concept Formation , Humans , Photic Stimulation , Psycholinguistics