1.
Behav Brain Sci ; 47: e138, 2024 Jun 27.
Article in English | MEDLINE | ID: mdl-38934458

ABSTRACT

Elizabeth Spelke's What Babies Know is a scholarly presentation of core knowledge theory and a masterful compendium of empirical evidence that supports it. Unfortunately, Spelke's principal theoretical assumption is that core knowledge is simply the innate product of cognitive evolution. As such, her theory fails to explicate the developmental mechanisms underlying the emergence of the cognitive systems on which that knowledge depends.


Subject(s)
Child Development , Cognition , Humans , Infant , Child Development/physiology , Cognition/physiology , Knowledge
2.
Dev Psychol ; 60(1): 135-143, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37917490

ABSTRACT

We presented 28 Spanish monolingual and 28 Catalan-Spanish close-language bilingual 5-year-old children with a video of a talker speaking in the children's native language and a nonnative language and examined the temporal dynamics of their selective attention to the talker's eyes and mouth. When the talker spoke in the children's native language, monolinguals attended equally to the eyes and mouth throughout the trial, whereas close-language bilinguals first attended more to the mouth and then distributed attention equally between the eyes and mouth. In contrast, when the talker spoke in a nonnative language (English), both monolinguals and bilinguals initially attended more to the mouth and then gradually shifted to a pattern of equal attention to the eyes and mouth. These results indicate that specific early linguistic experience has differential effects on young children's deployment of selective attention to areas of a talker's face during the initial part of an audiovisual utterance.


Subject(s)
Multilingualism , Speech Perception , Humans , Child, Preschool , Language , Language Development , Linguistics
4.
J Exp Child Psychol ; 232: 105676, 2023 08.
Article in English | MEDLINE | ID: mdl-37018972

ABSTRACT

The timing of the developmental emergence of holistic face processing and its sensitivity to experience in early childhood are somewhat controversial topics. To investigate holistic face perception in early childhood, we used an online testing platform and administered a two-alternative forced-choice task to 4-, 5-, and 6-year-old children. The children saw pairs of composite faces and needed to decide whether the faces were the same or different. To determine whether experience with masked faces may have negatively affected holistic processing, we also administered a parental questionnaire to assess the children's exposure to masked faces during the COVID-19 pandemic. We found that all three age groups performed holistic face processing when the faces were upright (Experiment 1) but not when the faces were inverted (Experiment 2), that response accuracy increased with age, and that response accuracy was not related to degree of exposure to masked faces. These results indicate that holistic face processing is relatively robust in early childhood and that short-term exposure to partially visible faces does not negatively affect young children's holistic face perception.


Subject(s)
COVID-19 , Child Development , Facial Recognition , Pandemics , Facial Recognition/physiology , COVID-19/epidemiology , Humans , Male , Female , Child, Preschool , Child Development/physiology , Child , Surveys and Questionnaires , Parents , Masks
5.
Cognition ; 228: 105226, 2022 11.
Article in English | MEDLINE | ID: mdl-35882100

ABSTRACT

Extraction of meaningful information from multiple talkers relies on perceptual segregation. The temporal synchrony statistics inherent in everyday audiovisual (AV) speech offer a powerful basis for perceptual segregation. We investigated the developmental emergence of synchrony-based perceptual segregation of multiple talkers in 3-7-year-old children. Children either saw four identical or four different faces articulating temporally jittered versions of the same utterance and heard the audible version of the same utterance either synchronized with one of the talkers or desynchronized with all of them. Eye tracking revealed that selective attention to the temporally synchronized talking face increased while attention to the desynchronized faces decreased with age and that attention to the talkers' mouth primarily drove responsiveness. These findings demonstrate that the temporal synchrony statistics inherent in fluent AV speech assume an increasingly greater role in perceptual segregation of the multisensory clutter created by multiple talking faces in early childhood.


Subject(s)
Speech Perception , Child , Child, Preschool , Eye-Tracking Technology , Face , Humans , Mouth , Visual Perception
6.
Infancy ; 27(5): 963-971, 2022 09.
Article in English | MEDLINE | ID: mdl-35833310

ABSTRACT

Infants start tracking auditory-only non-adjacent dependencies (NAD) between 15 and 18 months of age. Given that audiovisual speech, normally available in a talker's mouth, is perceptually more salient than auditory speech and that it facilitates speech processing and language acquisition, we investigated whether 15-month-old infants' NAD learning is modulated by attention to a talker's mouth. Infants performed an audiovisual NAD learning task while we recorded their selective attention to the eyes, mouth, and face of an actress while she spoke an artificial language that followed an AXB structure (tis-X-bun; nal-X-gor) during familiarization. At test, the actress spoke the same language (grammatical trials; tis-X-bun; nal-X-gor) or a novel one that violated the AXB structure (ungrammatical trials; tis-X-gor; nal-X-bun). Overall, total duration of looking did not differ during the familiar and novel test trials but the time-course of selective attention to the talker's face and mouth revealed that the novel trials maintained infants' attention to the face more than did the familiar trials. Crucially, attention to the mouth increased during the novel test trials while it did not change during the familiar test trials. These results indicate that the multisensory redundancy of audiovisual speech facilitates infants' discrimination of non-adjacent dependencies.


Subject(s)
Language Development , Speech Perception , Female , Humans , Infant , Language , Speech
7.
Mind Brain Educ ; 16(1): 62-74, 2022 Feb.
Article in English | MEDLINE | ID: mdl-35273650

ABSTRACT

Looking to the mouth of a talker early in life predicts expressive communication. We hypothesized that looking at a talker's mouth may signal that infants are ready for increased supported joint engagement and that it subsequently facilitates prelinguistic vocal development and translates to broader gains in expressive communication. We tested this hypothesis in 50 infants aged 6-18 months with heightened and general population-level likelihood of autism diagnosis (Sibs-autism and Sibs-NA, respectively). We measured infants' gaze to a speaker's face using an eye tracking task, supported joint engagement during parent-child free play sessions, vocal complexity during a communication sample, and broader expressive communication. Looking at the mouth was indirectly associated with expressive communication via increased higher-order supported joint engagement and vocal complexity. This indirect effect did not vary according to sibling status. This study provides preliminary insights into the mechanisms by which looking at the mouth may influence expressive communication development.

8.
Cognition ; 214: 104743, 2021 09.
Article in English | MEDLINE | ID: mdl-33940250

ABSTRACT

Social interactions often involve a cluttered multisensory scene consisting of multiple talking faces. We investigated whether audiovisual temporal synchrony can facilitate perceptual segregation of talking faces. Participants either saw four identical or four different talking faces producing temporally jittered versions of the same visible speech utterance and heard the audible version of the same speech utterance. The audible utterance was either synchronized with the visible utterance produced by one of the talking faces or not synchronized with any of them. Eye tracking indicated that participants exhibited a marked preference for the synchronized talking face, that they gazed more at the mouth than the eyes overall, that they gazed more at the eyes of an audiovisually synchronized than a desynchronized talking face, and that they gazed more at the mouth when all talking faces were audiovisually desynchronized. These findings demonstrate that audiovisual temporal synchrony plays a major role in perceptual segregation of multisensory clutter and that adults rely on differential scanning strategies of a talker's eyes and mouth to discover sources of multisensory coherence.


Subject(s)
Speech Perception , Visual Perception , Adult , Eye , Face , Humans , Mouth
9.
Infancy ; 25(2): 151-164, 2020 03.
Article in English | MEDLINE | ID: mdl-32749059

ABSTRACT

Little is known about the effects of olfaction on visual processing during infancy. We investigated whether and how an infant's own mother's body odor or another mother's body odor affects 4-month-old infants' looking at their mother's face when it is paired with a stranger's face. In Experiment 1, infants were exposed to their mother's body odor or to a control odor, while in Experiment 2, infants were exposed to a stranger mother's body odor while their visual preferences were recorded. Results revealed that infants looked more at the stranger's face in the presence of the control odor but that they looked more at their mother's face in the presence of any mother's body odor. This effect was due to a reduction of looking at the stranger's face. These findings suggest that infants react similarly to the body odor of any mother and add to the growing body of evidence indicating that olfactory stimulation represents a pervasive aspect of infant multisensory perception.


Subject(s)
Face , Mother-Child Relations , Odorants , Analysis of Variance , Female , Humans , Infant , Male , Mothers , Photic Stimulation
10.
Infant Behav Dev ; 54: 80-84, 2019 02.
Article in English | MEDLINE | ID: mdl-30634137

ABSTRACT

We investigated whether attention to a talker's eyes in 12-month-old infants is related to their communication and social abilities. We measured infant attention to a talker's eyes and mouth with a Tobii eye tracker and examined the correlation between attention to the talker's eyes and scores on the Adaptive Behavior Questionnaire from the Bayley Scales of Infant and Toddler Development (BSID-III). Results indicated a positive relationship between eye gaze and scores on the Social and Communication subscales of the BSID-III.


Subject(s)
Attention/physiology , Communication , Facial Expression , Fixation, Ocular/physiology , Social Skills , Speech Perception/physiology , Female , Humans , Infant , Male , Photic Stimulation/methods
11.
Dev Sci ; 22(3): e12755, 2019 05.
Article in English | MEDLINE | ID: mdl-30251757

ABSTRACT

Previous findings indicate that bilingual Catalan/Spanish-learning infants attend more to the highly salient audiovisual redundancy cues normally available in a talker's mouth than do monolingual infants. Presumably, greater attention to such cues renders the challenge of learning two languages easier. Spanish and Catalan are, however, rhythmically and phonologically close languages. This raises the possibility that bilinguals only rely on redundant audiovisual cues when their languages are close. To test this possibility, we exposed 15-month-old and 4- to 6-year-old close-language bilinguals (Spanish/Catalan) and distant-language bilinguals (Spanish/"other") to videos of a talker uttering Spanish or Catalan (native) and English (non-native) monologues and recorded eye-gaze to the talker's eyes and mouth. At both ages, the close-language bilinguals attended more to the talker's mouth than the distant-language bilinguals. This indicates that language proximity modulates selective attention to a talker's mouth during early childhood and suggests that reliance on the greater salience of audiovisual speech cues depends on the difficulty of the speech-processing task.


Subject(s)
Attention/physiology , Cues , Fixation, Ocular/physiology , Mouth/physiology , Multilingualism , Child , Child, Preschool , Eye , Female , Humans , Infant , Language , Language Development , Male , Speech Perception
13.
Dev Cogn Neurosci ; 34: 75-81, 2018 11.
Article in English | MEDLINE | ID: mdl-30099263

ABSTRACT

Classic views of multisensory processing suggest that cortical sensory regions are specialized. More recent views argue that cortical sensory regions are inherently multisensory. To date, there are no published neuroimaging data that directly test these claims in infancy. Here we used fNIRS to show that temporal and occipital cortex are functionally coupled in 3.5-5-month-old infants (N = 65), and that the extent of this coupling during a synchronous, but not an asynchronous, audiovisual event predicted whether occipital cortex would subsequently respond to sound-only information. These data suggest that multisensory experience may shape cortical dynamics to adapt to the ubiquity of synchronous multisensory information in the environment, and invoke the possibility that adaptation to the environment can also reflect broadening of the computational range of sensory systems.


Subject(s)
Auditory Perception/physiology , Brain Mapping/methods , Spectroscopy, Near-Infrared/methods , Visual Perception/physiology , Female , Humans , Infant , Male
14.
J Exp Child Psychol ; 172: 189-200, 2018 08.
Article in English | MEDLINE | ID: mdl-29627481

ABSTRACT

Previous studies have found that when monolingual infants are exposed to a talking face speaking in a native language, 8- and 10-month-olds attend more to the talker's mouth, whereas 12-month-olds no longer do so. It has been hypothesized that the attentional focus on the talker's mouth at 8 and 10 months of age reflects reliance on the highly salient audiovisual (AV) speech cues for the acquisition of basic speech forms and that the subsequent decline of attention to the mouth by 12 months of age reflects the emergence of basic native speech expertise. Here, we investigated whether infants may redeploy their attention to the mouth once they fully enter the word-learning phase. To test this possibility, we recorded eye gaze in monolingual English-learning 14- and 18-month-olds while they saw and heard a talker producing an English or Spanish utterance in either an infant-directed (ID) or adult-directed (AD) manner. Results indicated that the 14-month-olds attended more to the talker's mouth than to the eyes when exposed to the ID utterance and that the 18-month-olds attended more to the talker's mouth when exposed to the ID and the AD utterance. These results show that infants redeploy their attention to a talker's mouth when they enter the word acquisition phase and suggest that infants rely on the greater perceptual salience of redundant AV speech cues to acquire their lexicon.


Subject(s)
Attention/physiology , Cues , Fixation, Ocular/physiology , Speech Perception/physiology , Face , Female , Humans , Infant , Male , Mouth , Verbal Learning/physiology
15.
Dev Psychobiol ; 60(3): 243-255, 2018 04.
Article in English | MEDLINE | ID: mdl-29457647

ABSTRACT

Recursive, hierarchically organized serial patterns provide the underlying structure in many cognitive and motor domains including speech, language, music, social interaction, and motor action. We investigated whether learning of hierarchical patterns emerges in infancy by habituating 204 infants to different hierarchical serial patterns and then testing for discrimination and generalization of such patterns. Results indicated that 8- to 10-month-old and 12- to 14-month-old infants exhibited sensitivity to the difference between hierarchical and non-hierarchical structure but that 4- to 6-month-old infants did not. These findings demonstrate that the ability to perceive, learn, and generalize recursive, hierarchical, pattern rules emerges in infancy and add to growing evidence that general-purpose pattern learning mechanisms emerge during the first year of life.


Subject(s)
Child Development/physiology , Generalization, Psychological/physiology , Pattern Recognition, Visual/physiology , Recognition, Psychology/physiology , Female , Humans , Infant , Male
16.
Dev Sci ; 21(4): e12604, 2018 Jul.
Article in English | MEDLINE | ID: mdl-28944541

ABSTRACT

We tested 4-6- and 10-12-month-old infants to investigate whether the often-reported decline in infant sensitivity to other-race faces may reflect responsiveness to static or dynamic/silent faces rather than a general process of perceptual narrowing. Across three experiments, we tested discrimination of either dynamic own-race or other-race faces which were either accompanied by a speech syllable, no sound, or a non-speech sound. Results indicated that 4-6- and 10-12-month-old infants discriminated own-race as well as other-race faces accompanied by a speech syllable, that only the 10-12-month-olds discriminated silent own-race faces, and that 4-6-month-old infants discriminated own-race and other-race faces accompanied by a non-speech sound but that 10-12-month-old infants only discriminated own-race faces accompanied by a non-speech sound. Overall, the results suggest that the other-race effect (ORE) reported to date reflects infant responsiveness to static or dynamic/silent faces rather than a general process of perceptual narrowing.


Subject(s)
Face , Facial Recognition , Racial Groups , Recognition, Psychology/physiology , Speech , Child Development , Female , Humans , Infant , Male , Self Concept
17.
PLoS One ; 12(1): e0169325, 2017.
Article in English | MEDLINE | ID: mdl-28060872

ABSTRACT

Early multisensory perceptual experiences shape the abilities of infants to perform socially-relevant visual categorization, such as the extraction of gender, age, and emotion from faces. Here, we investigated whether multisensory perception of gender is influenced by infant-directed (IDS) or adult-directed (ADS) speech. Six-, 9-, and 12-month-old infants saw side-by-side silent video-clips of talking faces (a male and a female) and heard either a soundtrack of a female or a male voice telling a story in IDS or ADS. Infants participated in only one condition, either IDS or ADS. Consistent with earlier work, infants displayed advantages in matching female relative to male faces and voices. Moreover, the new finding that emerged in the current study was that extraction of gender from face and voice was stronger at 6 months with ADS than with IDS, whereas at 9 and 12 months, matching did not differ for IDS versus ADS. The results indicate that the ability to perceive gender in audiovisual speech is influenced by speech manner. Our data suggest that infants may extract multisensory gender information developmentally earlier when looking at adults engaged in conversation with other adults (i.e., ADS) than when adults are directly talking to them (i.e., IDS). Overall, our findings imply that the circumstances of social interaction may shape early multisensory abilities to perceive gender.


Subject(s)
Auditory Perception , Speech , Visual Perception , Voice , Acoustic Stimulation , Adult , Child Development , Female , Hearing , Humans , Infant , Male , Photic Stimulation , Speech Perception
18.
Dev Sci ; 20(3)2017 05.
Article in English | MEDLINE | ID: mdl-26743437

ABSTRACT

Previous studies have found that infants shift their attention from the eyes to the mouth of a talker when they enter the canonical babbling phase after 6 months of age. Here, we investigated whether this increased attentional focus on the mouth is mediated by audio-visual synchrony and linguistic experience. To do so, we tracked eye gaze in 4-, 6-, 8-, 10-, and 12-month-old infants while they were exposed either to desynchronized native or desynchronized non-native audiovisual fluent speech. Results indicated that, regardless of language, desynchronization disrupted the usual pattern of relative attention to the eyes and mouth found in response to synchronized speech at 10 months but not at any other age. These findings show that audio-visual synchrony mediates selective attention to a talker's mouth just prior to the emergence of initial language expertise and that it declines in importance once infants become native-language experts.


Subject(s)
Attention/physiology , Mouth , Speech , Eye , Humans , Infant , Language , Speech Perception , Visual Perception
19.
Trends Neurosci ; 39(8): 567-579, 2016 08.
Article in English | MEDLINE | ID: mdl-27282408

ABSTRACT

Multisensory processes are fundamental in scaffolding perception, cognition, learning, and behavior. How and when stimuli from different sensory modalities are integrated rather than treated as separate entities is poorly understood. We review how the relative reliance on stimulus characteristics versus learned associations dynamically shapes multisensory processes. We illustrate the dynamism in multisensory function across two timescales: one long term that operates across the lifespan and one short term that operates during the learning of new multisensory relations. In addition, we highlight the importance of task contingencies. We conclude that these highly dynamic multisensory processes, based on the relative weighting of stimulus characteristics and learned associations, provide both stability and flexibility to brain functions over a wide range of temporal scales.


Subject(s)
Aging/physiology , Learning/physiology , Perception/physiology , Animals , Humans
20.
Cognition ; 147: 100-5, 2016 Feb.
Article in English | MEDLINE | ID: mdl-26649759

ABSTRACT

We investigated whether the audiovisual speech cues available in a talker's mouth elicit greater attention when adults have to process speech in an unfamiliar language vs. a familiar language. Participants performed a speech-encoding task while watching and listening to videos of a talker in a familiar language (English) or an unfamiliar language (Spanish or Icelandic). Attention to the mouth increased in monolingual subjects in response to an unfamiliar language condition but did not in bilingual subjects when the task required speech processing. In the absence of an explicit speech-processing task, subjects attended equally to the eyes and mouth in response to both familiar and unfamiliar languages. Overall, these results demonstrate that language familiarity modulates selective attention to the redundant audiovisual speech cues in a talker's mouth in adults. When our findings are considered together with similar findings from infants, they suggest that this attentional strategy emerges very early in life.


Subject(s)
Attention/physiology , Language , Recognition, Psychology/physiology , Speech Perception/physiology , Cues , Eye , Eye Movements/physiology , Humans , Mouth , Multilingualism , Speech/physiology