Results 1 - 10 of 10
1.
Behav Res Methods; 55(8): 3984-4001, 2023 Dec.
Article in English | MEDLINE | ID: mdl-36538168

ABSTRACT

One of the most precise methods to establish psychometric functions and estimate threshold and slope parameters is the constant stimuli procedure. The wide range of predetermined stimulus values presented to observers allows psychometric functions to be fully characterized, but makes this procedure time-consuming. Adaptive procedures enable reliable threshold estimation while reducing the number of trials by concentrating stimulus presentations around the observer's presumed threshold: the stimulus value for the next trial depends on the observer's responses to previous trials (a simplified illustration of this kind of Bayesian updating follows this entry). One recent improvement of these procedures is to also estimate the slope (related to discrimination sensitivity). The Bayesian QUEST+ procedure (Watson, Journal of Vision, 17(3), 10, 2017), a generalization and extension of the QUEST procedure, includes this refinement. Surprisingly, this procedure is rarely used. Our goal was to empirically assess its precision in evaluating size, orientation, and temporal perception in three yes/no discrimination tasks of increasing demands. In a total of 72 adult participants, we compared points of subjective equivalence (PSEs) or simultaneity (PSSs), as well as discrimination sensitivity, obtained with the QUEST+, constant stimuli, and simple up-down staircase procedures. While PSEs did not differ between procedures, sensitivity estimates obtained with the 64-trial QUEST+ procedure were overestimated (i.e., just-noticeable differences, or JNDs, were underestimated). Overall, agreement between procedures was good, and was best for the easiest tasks. This study empirically confirmed that the QUEST+ procedure can be considered a method of choice to accelerate PSE estimation, while keeping in mind that sensitivity estimation should be handled with caution.


Subject(s)
Visual Perception , Adult , Humans , Psychophysics/methods , Sensory Thresholds/physiology , Bayes Theorem , Psychometrics/methods , Visual Perception/physiology
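To make the logic of these adaptive procedures concrete, here is a minimal, self-contained sketch of grid-based Bayesian estimation of a threshold (PSE) and slope for a yes/no task, in the spirit of QUEST-style methods. It is not the authors' code and not the full QUEST+ rule (QUEST+ selects each stimulus by minimizing expected posterior entropy; here the next stimulus is simply placed at the current posterior-mean threshold). The logistic psychometric function, grid ranges, simulated observer, and 64-trial run are all illustrative assumptions.

```python
# Minimal sketch of a grid-based Bayesian adaptive procedure for a yes/no task.
# Illustrative only: not the authors' implementation and not the full QUEST+
# entropy-minimization rule.
import numpy as np

rng = np.random.default_rng(0)

# Hypothesis grid over threshold (PSE) and slope of a logistic psychometric function.
thresholds = np.linspace(-2.0, 2.0, 81)          # candidate PSE values (arbitrary units)
slopes = np.linspace(0.5, 8.0, 40)               # candidate slope values
T, S = np.meshgrid(thresholds, slopes, indexing="ij")
posterior = np.ones_like(T) / T.size             # flat prior over (threshold, slope)

def p_yes(stimulus, thresh, slope):
    """Logistic probability of a 'yes' response."""
    return 1.0 / (1.0 + np.exp(-slope * (stimulus - thresh)))

# Simulated observer with a true threshold and slope (for demonstration only).
true_thresh, true_slope = 0.3, 3.0

stimuli = np.linspace(-2.0, 2.0, 41)             # allowed stimulus values

for trial in range(64):                          # 64 trials, matching the short QUEST+ runs
    # Place the next stimulus at the current posterior-mean threshold
    # (QUEST+ instead minimizes expected posterior entropy over candidate stimuli).
    next_stim = stimuli[np.argmin(np.abs(stimuli - np.sum(posterior * T)))]

    # Simulated yes/no response from the hypothetical observer.
    response = rng.random() < p_yes(next_stim, true_thresh, true_slope)

    # Bayesian update of the joint posterior over (threshold, slope).
    likelihood = p_yes(next_stim, T, S) if response else 1.0 - p_yes(next_stim, T, S)
    posterior *= likelihood
    posterior /= posterior.sum()

pse_hat = np.sum(posterior * T)
slope_hat = np.sum(posterior * S)
print(f"estimated PSE = {pse_hat:.2f}, estimated slope = {slope_hat:.2f}")
```

Note that concentrating trials near the threshold is efficient for estimating the PSE but yields comparatively little information about the slope, which is consistent with the abstract's caution about sensitivity (JND) estimates from short adaptive runs.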
2.
Infancy; 26(4): 647-659, 2021 Jul.
Article in English | MEDLINE | ID: mdl-33988894

ABSTRACT

During their first year, infants attune to the faces and language(s) that are frequent in their environment. The present study investigates the impact of language familiarity on how French-learning 9- and 12-month-olds recognize own-race faces. In Experiment 1, infants were familiarized with the talking face of a Caucasian bilingual German-French speaker reciting a nursery rhyme in French (native condition) or in German (non-native condition). In the test phase, face recognition was assessed by presenting a picture of the face of the speaker infants had been familiarized with, side by side with a novel face. At 9 and 12 months, neither infants in the native condition nor those in the non-native condition clearly recognized the speaker's face. In Experiment 2, we familiarized infants with a still picture of the speaker's face, along with the auditory speech stream. This time, both 9- and 12-month-olds recognized the face of the speaker they had been familiarized with, but only if she spoke in their native language. This study shows that, at least from 9 months of age, language modulates the way faces are recognized.


Subject(s)
Child Development , Facial Recognition , Language , Recognition, Psychology , Female , Humans , Infant , Male
3.
Iperception; 10(1): 2041669519830414, 2019.
Article in English | MEDLINE | ID: mdl-30834097

ABSTRACT

The face own-age bias refers to the better ability to recognize faces from one's own age group compared with other age groups. Here we examined whether an own-age advantage also occurs for face sex categorization. We tested 7- and 9-year-olds' and adults' ability to correctly categorize the sex of 7- and 9-year-old and adult faces without external cues, such as hair. Results indicated that participants of all ages easily classified the sex of adult faces. They also succeeded in classifying the sex of child faces, but their performance was poorer than for adult faces. In adults, processing time increased and a response bias (toward the "male" response) emerged for child faces. In children, response times remained constant and no bias was observed. Experience with a specific category of faces thus seems to offer some advantage in speed of processing. Overall, sex categorization is more challenging for child faces than for adult faces because of their reduced sexually dimorphic facial characteristics.

4.
J Exp Child Psychol; 172: 189-200, 2018 Aug.
Article in English | MEDLINE | ID: mdl-29627481

ABSTRACT

Previous studies have found that when monolingual infants are exposed to a talking face speaking in a native language, 8- and 10-month-olds attend more to the talker's mouth, whereas 12-month-olds no longer do so. It has been hypothesized that the attentional focus on the talker's mouth at 8 and 10 months of age reflects reliance on the highly salient audiovisual (AV) speech cues for the acquisition of basic speech forms and that the subsequent decline of attention to the mouth by 12 months of age reflects the emergence of basic native speech expertise. Here, we investigated whether infants may redeploy their attention to the mouth once they fully enter the word-learning phase. To test this possibility, we recorded eye gaze in monolingual English-learning 14- and 18-month-olds while they saw and heard a talker producing an English or Spanish utterance in either an infant-directed (ID) or adult-directed (AD) manner. Results indicated that the 14-month-olds attended more to the talker's mouth than to the eyes when exposed to the ID utterance and that the 18-month-olds attended more to the talker's mouth when exposed to the ID and the AD utterance. These results show that infants redeploy their attention to a talker's mouth when they enter the word acquisition phase and suggest that infants rely on the greater perceptual salience of redundant AV speech cues to acquire their lexicon.


Subject(s)
Attention/physiology , Cues , Fixation, Ocular/physiology , Speech Perception/physiology , Face , Female , Humans , Infant , Male , Mouth , Verbal Learning/physiology
5.
PLoS One; 12(1): e0169325, 2017.
Article in English | MEDLINE | ID: mdl-28060872

ABSTRACT

Early multisensory perceptual experiences shape infants' ability to perform socially relevant visual categorization, such as the extraction of gender, age, and emotion from faces. Here, we investigated whether multisensory perception of gender is influenced by infant-directed (IDS) or adult-directed (ADS) speech. Six-, 9-, and 12-month-old infants saw side-by-side silent video clips of talking faces (a male and a female) and heard a soundtrack of either a female or a male voice telling a story in IDS or ADS. Each infant participated in only one condition, either IDS or ADS. Consistent with earlier work, infants displayed an advantage in matching female relative to male faces and voices. Moreover, the new finding of the current study was that extraction of gender from face and voice was stronger at 6 months with ADS than with IDS, whereas at 9 and 12 months matching did not differ between IDS and ADS. These results indicate that the ability to perceive gender in audiovisual speech is influenced by speech manner. Our data suggest that infants may extract multisensory gender information developmentally earlier when looking at adults engaged in conversation with other adults (i.e., ADS) than when adults are talking directly to them (i.e., IDS). Overall, our findings imply that the circumstances of social interaction may shape early multisensory abilities to perceive gender.


Subject(s)
Auditory Perception , Speech , Visual Perception , Voice , Acoustic Stimulation , Adult , Child Development , Female , Hearing , Humans , Infant , Male , Photic Stimulation , Speech Perception
6.
Dev Sci; 20(3), 2017 May.
Article in English | MEDLINE | ID: mdl-26743437

ABSTRACT

Previous studies have found that infants shift their attention from the eyes to the mouth of a talker when they enter the canonical babbling phase after 6 months of age. Here, we investigated whether this increased attentional focus on the mouth is mediated by audio-visual synchrony and linguistic experience. To do so, we tracked eye gaze in 4-, 6-, 8-, 10-, and 12-month-old infants while they were exposed either to desynchronized native or desynchronized non-native audiovisual fluent speech. Results indicated that, regardless of language, desynchronization disrupted the usual pattern of relative attention to the eyes and mouth found in response to synchronized speech at 10 months but not at any other age. These findings show that audio-visual synchrony mediates selective attention to a talker's mouth just prior to the emergence of initial language expertise and that it declines in importance once infants become native-language experts.


Subject(s)
Attention/physiology , Mouth , Speech , Eye , Humans , Infant , Language , Speech Perception , Visual Perception
7.
Dev Psychobiol; 57(5): 637-642, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25952509

ABSTRACT

Perceptual narrowing has been observed in human infants for monkey faces: 6-month-olds can discriminate between them, whereas infants from 9 months of age onward have difficulty doing so. The source of this difficulty has not been clearly identified. It could be due to the structural characteristics of monkey faces, particularly the key facial features that differ from human faces. The current study investigated whether the information conveyed by the eyes is of particular importance. We examined whether the presence of Caucasian human eyes in monkey faces allows recognition to be maintained in 6-month-olds and facilitates recognition in 9- and 12-month-olds. Our results revealed that the presence of human eyes in monkey faces maintains recognition of those faces at 6 months of age and partially facilitates recognition at 9 months of age, but not at 12 months of age. The findings are interpreted in the context of perceptual narrowing and suggest that the attenuation of other-species face processing is not reversed by the presence of human eyes.


Subject(s)
Facial Recognition , Psychology, Child , Age Factors , Animals , Child Development , Discrimination, Psychological , Eye/anatomy & histology , Female , Haplorhini , Humans , Infant , Male , Photic Stimulation
8.
Infant Behav Dev; 37(4): 676-681, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25243612

ABSTRACT

At 3-4 months of age, infants respond to gender information in human faces. Specifically, young infants display a visual preference for female over male faces. In three experiments using a visual preference task, we investigated the role of hairline information in this bias. In Experiment 1, we presented male and female composite faces with similar hairstyles to 4-month-olds and observed a preference for female faces. In Experiment 2, the same faces were presented without hairline cues, and the preference was eliminated. In Experiment 3, which used the same cropping to eliminate hairline cues but presented feminized female faces and masculinized male faces, infants again showed no preference for female faces. The findings show that hairline information is important in young infants' preferential orientation toward female faces.


Subject(s)
Face , Hair , Infant Behavior/psychology , Adult , Cues , Female , Gender Identity , Humans , Infant , Male , Observer Variation , Photic Stimulation , Reproducibility of Results , Sex Characteristics
9.
Infant Behav Dev; 37(4): 644-651, 2014 Nov.
Article in English | MEDLINE | ID: mdl-25238663

ABSTRACT

The present study examined whether infant-directed (ID) speech facilitates intersensory matching of audio-visual fluent speech in 12-month-old infants. German-learning infants' ability to match auditory and visual German and French fluent speech was assessed using a variant of the intermodal matching procedure, with auditory and visual speech information presented sequentially. In Experiment 1, the sentences were spoken in an adult-directed (AD) manner. Results showed that 12-month-old infants did not exhibit matching performance for either the native or the non-native language. However, Experiment 2 revealed that when ID speech stimuli were used, infants did perceive the relation between auditory and visual speech attributes, but only for their native language. Thus, the findings suggest that ID speech might influence the intersensory perception of fluent speech, and they shed further light on multisensory perceptual narrowing.


Subject(s)
Speech Perception/physiology , Acoustic Stimulation , Adult , Auditory Perception/physiology , Female , Humans , Infant , Language , Male , Photic Stimulation , Visual Perception/physiology
10.
PLoS One; 9(2): e89275, 2014.
Article in English | MEDLINE | ID: mdl-24586651

ABSTRACT

The present study examined when and how the ability to cross-modally match audio-visual fluent speech develops in 4.5-, 6-, and 12-month-old German-learning infants. In Experiment 1, 4.5- and 6-month-old infants' ability to match native (German) and non-native (French) audio-visual fluent speech was assessed by presenting auditory and visual speech information sequentially, that is, in the absence of temporal synchrony cues. The results showed that 4.5-month-old infants were capable of matching native as well as non-native audio and visual speech stimuli, whereas 6-month-olds perceived the audio-visual correspondence of native-language stimuli only. This suggests that intersensory matching narrows for fluent speech between 4.5 and 6 months of age. In Experiment 2, auditory and visual speech information was presented simultaneously, thereby providing temporal synchrony cues. Here, 6-month-olds matched native as well as non-native speech, indicating that temporal synchrony cues facilitate the intersensory perception of non-native fluent speech. Intriguingly, despite the fact that the audio and visual stimuli cohered temporally, 12-month-olds matched the non-native language only. Results are discussed with regard to multisensory perceptual narrowing during the first year of life.


Subject(s)
Association Learning , Auditory Perception/physiology , Discrimination Learning , Language , Speech/physiology , Visual Perception/physiology , Acoustic Stimulation , Child Development , Female , France , Germany , Humans , Infant , Language Development , Male