Results 1 - 3 of 3
1.
Dev Sci; 26(2): e13290, 2023 Mar.
Article in English | MEDLINE | ID: mdl-35617054

ABSTRACT

Most research on early language learning focuses on the objects that infants see and the words they hear in their daily lives, although growing evidence suggests that motor development is also closely tied to language development. To study the real-time behaviors required for learning new words during free-flowing toy play, we measured infants' visual attention and manual actions on to-be-learned toys. Parents and 12-to-26-month-old infants wore wireless head-mounted eye trackers, allowing them to move freely around a home-like lab environment. After the play session, infants were tested on their knowledge of object-label mappings. We found that how often parents named objects during play did not predict learning; instead, it was infants' attention during and around a labeling utterance that predicted whether an object-label mapping was learned. More specifically, infant visual attention alone did not predict word learning. Instead, coordinated, multimodal attention (when infants' hands and eyes were attending to the same object) predicted word learning. Our results implicate a causal pathway through which infants' bodily actions play a critical role in early word learning.


Subject(s)
Language Development; Learning; Infant; Humans; Child, Preschool; Verbal Learning; Parents; Eye
2.
Infancy; 27(6): 1154-1178, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36073790

ABSTRACT

Parental responsiveness to infant behaviors is a strong predictor of infants' language and cognitive outcomes. The mechanisms underlying this effect, however, are relatively unknown. We examined the effects of parent speech on infants' visual attention, manual actions, hand-eye coordination, and dyadic joint attention during parent-infant free play. We report on two studies that used head-mounted eye trackers in increasingly naturalistic laboratory environments. In Study 1, 12-to-24-month-old infants and their parents played on the floor of a seminaturalistic environment with 24 toys. In Study 2, a different sample of dyads played in a home-like laboratory with 10 toys and no restrictions on their movement. In both studies, we present evidence that responsive parent speech extends the duration of infants' multimodal attention. This social "boost" of parent speech impacts multiple behaviors that have been linked to later outcomes: visual attention, manual actions, hand-eye coordination, and joint attention. Further, the amount that parents talked during the interaction was negatively related to the effects of parent speech on infant attention. Together, these results provide evidence of a trade-off between quantity of speech and its effects, suggesting multiple pathways through which parents impact infants' multimodal attention to shape the moment-by-moment dynamics of an interaction.


Subject(s)
Parents; Speech; Humans; Infant; Child, Preschool; Parents/psychology; Infant Behavior; Play and Playthings; Language
3.
J Vis Exp; (140), 2018 Oct 05.
Article in English | MEDLINE | ID: mdl-30346402

ABSTRACT

Infants and toddlers view the world, at a basic sensory level, in a fundamentally different way from their parents. This is largely due to biological constraints: infants have different body proportions than adults, and their ability to control their own head movements is less developed. Such constraints limit the visual input available to them. This protocol aims to provide guiding principles for researchers using head-mounted cameras to understand the changing visual input experienced by the developing infant. Successful use of this protocol will allow researchers to design and execute studies of the developing child's visual environment in the home or laboratory. From this method, researchers can compile an aggregate view of all the possible items in a child's field of view, although the method does not directly measure what the child is actually looking at. By combining this approach with machine learning, computer vision algorithms, and hand-coding, researchers can produce a high-density dataset that illustrates the changing visual ecology of the developing infant.


Subject(s)
Child Development; Video Recording/instrumentation; Video Recording/methods; Vision, Ocular/physiology; Child, Preschool; Female; Hand/physiology; Humans; Infant; Male; Visual Perception/physiology