1.
Dev Cogn Neurosci ; 54: 101094, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35248819

ABSTRACT

Time-resolved multivariate pattern analysis (MVPA), a popular technique for analyzing magneto- and electro-encephalography (M/EEG) neuroimaging data, quantifies the extent to which, and the time course over which, neural representations support the discrimination of relevant stimulus dimensions. As EEG is widely used for infant neuroimaging, time-resolved MVPA of infant EEG data is a particularly promising tool for infant cognitive neuroscience. MVPA has recently been applied to common infant imaging methods such as EEG and fNIRS. In this tutorial, we provide and describe code to implement time-resolved, within-subject MVPA with infant EEG data. An example implementation of time-resolved MVPA based on linear SVM classification is described, with accompanying code in Matlab and Python. Results from a test dataset indicated that in both infants and adults this method reliably produced above-chance accuracy for classifying stimulus images. Extensions of the classification analysis are presented including both geometric- and accuracy-based representational similarity analysis, implemented in Python. Common choices of implementation are presented and discussed. As the amount of artifact-free EEG data contributed by each participant is lower in studies of infants than in studies of children and adults, we also explore and discuss the impact of varying participant-level inclusion thresholds on resulting MVPA findings in these datasets.
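The core time-resolved analysis described in this tutorial can be sketched in a few lines of Python. The sketch below is illustrative only, not the paper's accompanying code: it uses NumPy alone and substitutes a nearest-class-mean linear classifier for the linear SVM named in the abstract, and all array shapes, variable names, and the synthetic data are assumptions.

```python
import numpy as np

def nearest_mean_classify(train_X, train_y, test_X):
    """Assign each test pattern to the class with the closest training mean
    (Euclidean distance). A simple linear stand-in for a linear SVM."""
    classes = np.unique(train_y)
    means = np.stack([train_X[train_y == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(test_X[:, None, :] - means[None, :, :], axis=2)
    return classes[np.argmin(dists, axis=1)]

def time_resolved_decoding(X, y, n_folds=4, seed=0):
    """X: trials x channels x timepoints; y: per-trial labels.
    Returns cross-validated decoding accuracy at each timepoint."""
    rng = np.random.default_rng(seed)
    n_trials, _, n_times = X.shape
    folds = np.array_split(rng.permutation(n_trials), n_folds)
    acc = np.zeros(n_times)
    for t in range(n_times):
        correct = 0
        for k, test_idx in enumerate(folds):
            train_idx = np.concatenate([f for j, f in enumerate(folds) if j != k])
            pred = nearest_mean_classify(X[train_idx, :, t], y[train_idx],
                                         X[test_idx, :, t])
            correct += np.sum(pred == y[test_idx])
        acc[t] = correct / n_trials
    return acc

# Synthetic demo: two stimulus classes, separable only after "stimulus onset"
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 32, 50))   # 80 trials, 32 channels, 50 timepoints
y = np.repeat([0, 1], 40)
X[y == 1, :, 20:] += 1.0            # class signal appears from timepoint 20
acc = time_resolved_decoding(X, y)
```

On this simulated data, accuracy hovers near chance (0.5) before the signal appears and rises well above chance afterwards, which is the signature pattern the tutorial's analyses look for.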


Subject(s)
Brain , Cognitive Neuroscience , Adult , Child , Electroencephalography/methods , Humans , Infant , Multivariate Analysis , Neuroimaging/methods
2.
Neuropsychologia ; 156: 107838, 2021 Jun 18.
Article in English | MEDLINE | ID: mdl-33775702

ABSTRACT

Adults exhibit relative behavioral difficulties in processing inanimate, artificial faces compared to real human faces, with implications for using artificial faces in research and designing artificial social agents. However, the developmental trajectory of inanimate face perception is unknown. To address this gap, we used electroencephalography to investigate inanimate face processing in cross-sectional groups of 5-10-year-old children and adults. A face inversion manipulation was used to test whether face animacy processing relies on expert face processing strategies. Groups of 5-7-year-olds (N = 18), 8-10-year-olds (N = 18), and adults (N = 16) watched pictures of real or doll faces presented in an upright or inverted orientation. Analyses of event-related potentials revealed larger N170 amplitudes in response to doll faces, irrespective of age group or face orientation. Thus, the N170 is sensitive to face animacy by 5-7 years of age, but such sensitivity may not reflect high-level, expert face processing. Multivariate pattern analyses of the EEG signal additionally assessed whether animacy information could be reliably extracted during face processing. Face orientation, but not face animacy, could be reliably decoded from occipitotemporal channels in children and adults. Face animacy could be decoded from whole scalp channels in adults, but not children. Together, these results suggest that 5-10-year-old children exhibit some sensitivity to face animacy over occipitotemporal regions that is comparable to adults.


Subject(s)
Electroencephalography , Evoked Potentials , Adult , Child , Child, Preschool , Cross-Sectional Studies , Humans , Orientation , Pattern Recognition, Visual , Photic Stimulation
3.
Dev Cogn Neurosci ; 47: 100882, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33246304

ABSTRACT

The processing of facial emotion is an important social skill that develops throughout infancy and early childhood. Here we investigate the neural underpinnings of the ability to process facial emotion across changes in facial identity in cross-sectional groups of 5- and 7-month-old infants. We simultaneously measured neural metabolic, behavioral, and autonomic responses to happy, fearful, and angry faces of different female models using functional near-infrared spectroscopy (fNIRS), eye-tracking, and heart rate measures. We observed significant neural activation to these facial emotions in a distributed set of frontal and temporal brain regions, and longer looking to the mouth region of angry faces compared to happy and fearful faces. No differences in looking behavior or neural activations were observed between 5- and 7-month-olds, although several exploratory, age-independent associations between neural activations and looking behavior were noted. Overall, these findings suggest more developmental stability than previously thought in responses to emotional facial expressions of varying identities between 5 and 7 months of age.


Subject(s)
Fear , Happiness , Child, Preschool , Cross-Sectional Studies , Facial Expression , Female , Humans , Infant
4.
Dev Cogn Neurosci ; 45: 100860, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32932205

ABSTRACT

Tools from computational neuroscience have facilitated the investigation of the neural correlates of mental representations. However, access to the representational content of neural activations early in life has remained limited. We asked whether patterns of neural activity elicited by complex visual stimuli (animals, human body) could be decoded from EEG data gathered from 12-15-month-old infants and adult controls. We assessed pairwise classification accuracy at each time-point after stimulus onset, for individual infants and adults. Classification accuracies rose above chance in both groups, within 500 ms. In contrast to adults, neural representations in infants were not linearly separable across visual domains. Representations were similar within, but not across, age groups. These findings suggest a developmental reorganization of visual representations between the second year of life and adulthood and provide a promising proof-of-concept for the feasibility of decoding EEG data within-subject to assess how the infant brain dynamically represents visual objects.
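The within- versus across-group comparison of representational structure described above can be illustrated with a toy example. This is a hedged NumPy sketch on simulated data, not the study's pipeline: it builds correlation-distance representational dissimilarity matrices (RDMs) from condition-mean patterns and correlates their lower triangles; the group labels, dimensions, and noise levels are all invented for illustration.

```python
import numpy as np

def condition_rdm(patterns):
    """patterns: conditions x features. Returns a correlation-distance RDM."""
    z = (patterns - patterns.mean(1, keepdims=True)) / patterns.std(1, keepdims=True)
    return 1.0 - (z @ z.T) / patterns.shape[1]

def rdm_similarity(rdm_a, rdm_b):
    """Pearson correlation between the lower triangles of two RDMs."""
    i, j = np.tril_indices_from(rdm_a, k=-1)
    a = rdm_a[i, j] - rdm_a[i, j].mean()
    b = rdm_b[i, j] - rdm_b[i, j].mean()
    return float((a @ b) / np.sqrt((a @ a) * (b @ b)))

rng = np.random.default_rng(0)
shared = rng.normal(size=(8, 100))    # 8 conditions x 100 "channels" (invented)
infant_a = shared + 0.3 * rng.normal(size=shared.shape)   # same geometry + noise
infant_b = shared + 0.3 * rng.normal(size=shared.shape)
adult = rng.normal(size=(8, 100))     # unrelated representational geometry
within = rdm_similarity(condition_rdm(infant_a), condition_rdm(infant_b))
across = rdm_similarity(condition_rdm(infant_a), condition_rdm(adult))
```

On this simulation, `within` is high and `across` is near zero, mirroring the reported pattern of representations that are similar within, but not across, age groups.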


Subject(s)
Brain/physiology , Pattern Recognition, Visual/physiology , Adolescent , Adult , Female , Humans , Infant , Male , Young Adult
5.
Front Integr Neurosci ; 14: 14, 2020.
Article in English | MEDLINE | ID: mdl-32327979

ABSTRACT

Individuals with Tuberous Sclerosis Complex (TSC) have atypical white matter integrity and neural connectivity in the brain, including language pathways. To explore functional activity associated with auditory and language processing in individuals with TSC, we used electroencephalography (EEG) to examine basic auditory correlates of detection (P1, N2, N4) and discrimination (mismatch negativity, MMN) of speech and non-speech stimuli for children with TSC and age- and sex-matched typically developing (TD) children. Children with TSC (TSC group) and without TSC (typically developing, TD group) participated in an auditory MMN paradigm containing two blocks of vowels (/a/ and /u/) and two blocks of tones (800 Hz and 400 Hz). Continuous EEG data were collected. Multivariate pattern analysis (MVPA) was used to explore functional specificity of neural auditory processing. Speech-specific P1, N2, and N4 waveform components of the auditory evoked potential (AEP) were compared, and the mismatch response was calculated for both speech and tones. MVPA showed that the TD group, but not the TSC group, demonstrated above-chance pairwise decoding between speech and tones. The AEP component analysis suggested that while the TD group had an increased P1 amplitude in response to vowels compared to tones, the TSC group did not show this enhanced response to vowels. Additionally, the TD group had a greater N2 amplitude in response to vowels, but not tones, compared to the TSC group. The TSC group also demonstrated a longer N4 latency to vowels compared to tones, which was not seen in the TD group. No group differences were observed in the MMN response. In this study we identified features of the auditory response to speech sounds, but not acoustically matched tones, which differentiate children with TSC from TD children.

6.
Dev Psychopathol ; 32(3): 961-974, 2020 Aug.
Article in English | MEDLINE | ID: mdl-31345275

ABSTRACT

Individual differences in social-emotional functioning emerge early and have long-term implications for developmental adaptation and competency. Research is needed that specifies multiple early risk factors and outcomes simultaneously to demonstrate specificity. Using multigroup longitudinal path analysis in a sample of typically developing children (N = 541), we examined child temperament dimensions (surgency, negative affectivity, and regulation/effortful control) and maternal anxiety in infancy and at age 2 as predictors of child externalizing, internalizing, dysregulation, and competence behaviors at age 3. Four primary patterns emerged. First, there was stability in temperament dimensions and maternal anxiety from infancy to age 3. Second, negative affectivity was implicated in internalizing problems and surgency in externalizing problems. Third, effortful control at age 2 was a potent mediator of the effect of maternal anxiety in infancy on age 3 outcomes. Fourth, there was suggestive evidence for transactional effects between maternal anxiety and child effortful control. Most pathways operated similarly for boys and girls, with some differences, particularly for surgency. These findings expand our understanding of the roles of specific temperamental domains and postnatal maternal anxiety in a range of social-emotional outcomes in the preschool period, and have implications for efforts to enhance the development of young children's social-emotional functioning and reduce risk for later psychopathology.


Subject(s)
Problem Behavior , Temperament , Anxiety , Child , Child, Preschool , Emotions , Female , Humans , Male , Social Adjustment
7.
Annu Int Conf IEEE Eng Med Biol Soc ; 2018: 295-298, 2018 Jul.
Article in English | MEDLINE | ID: mdl-30440396

ABSTRACT

This study presents the implementation of a within-subject neural decoder, based on support vector machines, and its application to the classification of distributed patterns of hemodynamic activation, measured with functional near-infrared spectroscopy (fNIRS) in children, in response to meaningful and meaningless auditory stimuli. Classification accuracy nominally exceeds chance level for the majority of the participants, but fails to reach statistical significance. Future work should investigate whether individual differences in classification accuracy relate to other characteristics of the children, such as their cognitive, speech, or reading abilities.
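A standard way to test whether nominally above-chance accuracy reaches statistical significance, as at issue in this abstract, is a label-permutation test. The sketch below is a hypothetical NumPy illustration: it uses a nearest-class-mean classifier in place of the study's support vector machine, and the simulated "fNIRS features," trial counts, and effect size are all assumptions.

```python
import numpy as np

def loo_accuracy(X, y):
    """Leave-one-out accuracy of a nearest-class-mean classifier."""
    n = len(y)
    correct = 0
    for i in range(n):
        mask = np.arange(n) != i
        classes = np.unique(y[mask])
        means = np.stack([X[mask & (y == c)].mean(axis=0) for c in classes])
        pred = classes[np.argmin(np.linalg.norm(means - X[i], axis=1))]
        correct += int(pred == y[i])
    return correct / n

def permutation_p(X, y, n_perm=200, seed=0):
    """P-value: fraction of label permutations scoring at least the observed
    accuracy (with a +1 correction so p is never exactly zero)."""
    rng = np.random.default_rng(seed)
    observed = loo_accuracy(X, y)
    null = np.array([loo_accuracy(X, rng.permutation(y)) for _ in range(n_perm)])
    return observed, (1 + np.sum(null >= observed)) / (n_perm + 1)

rng = np.random.default_rng(2)
X = rng.normal(size=(24, 20))   # 24 trials x 20 channel features (invented)
y = np.repeat([0, 1], 12)
X[y == 1] += 0.8                # class-specific hemodynamic signal (invented)
obs, p = permutation_p(X, y)
```

With small per-subject trial counts, as is typical in child fNIRS studies, the null distribution is wide, which is one reason nominally above-chance accuracies can fail to reach significance.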


Subject(s)
Spectroscopy, Near-Infrared , Child , Hemodynamics , Humans , Multivariate Analysis , Support Vector Machine
8.
Dev Psychol ; 54(12): 2240-2247, 2018 Dec.
Article in English | MEDLINE | ID: mdl-30335429

ABSTRACT

Early facial emotion recognition is hypothesized to be critical to later social functioning. However, relatively little is known about the typical intensity thresholds for recognizing facial emotions in preschoolers, between 2 and 4 years of age. This study employed a behavioral sorting task to examine the recognition of happy, fearful, and angry expressions of varying intensity in a large sample of 3-year-old children (N = 208). Thresholds were similar for all expressions; accuracy, however, was significantly lower for fear. Fear and anger expressions above threshold were significantly more confused with one another than with other expressions. In contrast, neutral faces were significantly more often interpreted as happy than as angry or fearful. These results provide a comparison point for future studies of early facial emotion recognition in typical and atypical populations of children in this age group.


Subject(s)
Child Development/physiology , Emotions/physiology , Facial Expression , Facial Recognition/physiology , Recognition, Psychology/physiology , Child, Preschool , Female , Humans , Male
9.
Sci Rep ; 8(1): 13277, 2018 Sep 05.
Article in English | MEDLINE | ID: mdl-30185919

ABSTRACT

Face perception abilities in humans exhibit a marked expertise in distinguishing individual human faces at the expense of individual faces from other species (the other-species effect). In particular, one behavioural effect of such specialization is that human adults search for and find categories of non-human faces faster and more accurately than a specific non-human face, and vice versa for human faces. However, a recent visual search study showed that neural responses (event-related potentials, ERPs) were identical when finding either a non-human or human face. We used time-resolved multivariate pattern analysis of the EEG data from that study to investigate the dynamics of neural representations during a visual search for own-species (human) or other-species (non-human ape) faces, with greater sensitivity than traditional ERP analyses. The location of each target (i.e., right or left) could be decoded from the EEG, with similar accuracy for human and non-human faces. However, the neural patterns associated with searching for an exemplar versus a category target differed for human faces compared to non-human faces: Exemplar representations could be more reliably distinguished from category representations for human than non-human faces. These findings suggest that the other-species effect modulates the nature of representations, but preserves the attentional selection of target items based on these representations.


Subject(s)
Evoked Potentials, Visual/physiology , Facial Recognition/physiology , Pattern Recognition, Visual/physiology , Adult , Animals , Attention/physiology , Brain Mapping , Electroencephalography/methods , Evoked Potentials/physiology , Face , Female , Hominidae , Humans , Male , Photic Stimulation , Recognition, Psychology/physiology , Visual Perception/physiology
10.
Neurophotonics ; 5(1): 011003, 2018 Jan.
Article in English | MEDLINE | ID: mdl-28840167

ABSTRACT

This study uses representational similarity-based neural decoding to test whether semantic information elicited by words and pictures is encoded in functional near-infrared spectroscopy (fNIRS) data. In experiment 1, subjects passively viewed eight audiovisual word and picture stimuli for 15 min. Blood oxygen levels were measured using the Hitachi ETG-4000 fNIRS system with a posterior array over the occipital lobe and a left lateral array over the temporal lobe. Each participant's response patterns were abstracted to representational similarity space and compared to the group average (excluding that subject, i.e., leave-one-out cross-validation) and to a distributional model of semantic representation. Mean accuracy for both decoding tasks significantly exceeded chance. In experiment 2, we compared three group-level models by averaging the similarity structures from sets of eight participants in each group. In these models, the posterior array was accurately decoded by the semantic model, while the lateral array was accurately decoded in the between-groups comparison. Our findings indicate that semantic representations are encoded in the fNIRS data, preserved across subjects, and decodable by an extrinsic representational model. These results represent a first attempt to link the functional response pattern measured by fNIRS to higher-level representations of how words are related to each other.
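The leave-one-out, similarity-based decoding described above can be illustrated with a pairwise matching scheme: for each pair of conditions, the held-out subject's two similarity rows are matched to the group average's rows, and decoding counts as correct when the true pairing correlates higher than the swapped pairing. This is a hedged NumPy sketch on simulated RDMs, not the study's actual pipeline; the number of conditions, subjects, and noise level are assumptions.

```python
import numpy as np

def corr(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

def pairwise_match_accuracy(subj_rdm, group_rdm):
    """For each condition pair (i, j), match the held-out subject's similarity
    rows i and j to the group's rows, using the remaining conditions as
    features; correct when the true pairing correlates higher."""
    n = subj_rdm.shape[0]
    hits, total = 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            keep = [k for k in range(n) if k not in (i, j)]
            si, sj = subj_rdm[i, keep], subj_rdm[j, keep]
            gi, gj = group_rdm[i, keep], group_rdm[j, keep]
            hits += (corr(si, gi) + corr(sj, gj)) > (corr(si, gj) + corr(sj, gi))
            total += 1
    return hits / total

def noisy(rdm, rng, sd):
    e = rng.normal(size=rdm.shape)
    return rdm + sd * (e + e.T) / 2      # keep the noisy RDM symmetric

rng = np.random.default_rng(3)
base = rng.normal(size=(8, 8))
base = (base + base.T) / 2               # shared "true" similarity structure
subj_rdms = [noisy(base, rng, 0.5) for _ in range(9)]
group = np.mean(subj_rdms[1:], axis=0)   # leave-one-out group average
acc = pairwise_match_accuracy(subj_rdms[0], group)
```

Because the subjects share a common similarity structure, the held-out subject's rows match the group average well above the 0.5 chance level.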

11.
Proc Biol Sci ; 284(1862), 2017 Sep 13.
Article in English | MEDLINE | ID: mdl-28878060

ABSTRACT

Human adults show an attentional bias towards fearful faces, an adaptive behaviour that relies on amygdala function. This attentional bias emerges in infancy between 5 and 7 months, but the underlying developmental mechanism is unknown. To examine possible precursors, we investigated whether 3.5-, 6- and 12-month-old infants show facilitated detection of fearful faces in noise, compared to happy faces. Happy or fearful faces, mixed with noise, were presented to infants (N = 192), paired with pure noise. We applied multivariate pattern analyses to several measures of infant looking behaviour to derive a criterion-free, continuous measure of face detection evidence in each trial. Analyses of the resulting psychometric curves supported the hypothesis of a detection advantage for fearful faces compared to happy faces, from 3.5 months of age and across all age groups. Overall, our data show a readiness to detect fearful faces (compared to happy faces) in younger infants that developmentally precedes the previously documented attentional bias to fearful faces in older infants and adults.
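The psychometric-curve analysis described above can be illustrated by fitting a logistic detection function and comparing thresholds between expression conditions. The sketch below is hypothetical: it replaces the paper's MVPA-derived, criterion-free detection evidence with a simulated signal-strength variable, and the "fear" and "happy" thresholds are invented so that the fear condition has the detection advantage reported in the abstract.

```python
import numpy as np

def fit_logistic_threshold(intensity, detected, slopes, thresholds):
    """Grid-search maximum-likelihood fit of P(detect) = 1/(1+exp(-k*(x-t))).
    Returns the best-fitting (threshold, slope) pair."""
    best, best_ll = None, -np.inf
    for k in slopes:
        for t in thresholds:
            p = 1.0 / (1.0 + np.exp(-k * (intensity - t)))
            p = np.clip(p, 1e-9, 1 - 1e-9)
            ll = np.sum(detected * np.log(p) + (1 - detected) * np.log(1 - p))
            if ll > best_ll:
                best_ll, best = ll, (t, k)
    return best

rng = np.random.default_rng(4)
x = np.tile(np.linspace(0.0, 1.0, 10), 40)   # simulated signal-strength levels

def simulate(threshold, slope=10.0):
    """Bernoulli detection outcomes from a 'true' logistic psychometric curve."""
    p = 1.0 / (1.0 + np.exp(-slope * (x - threshold)))
    return (rng.random(x.size) < p).astype(float)

slopes = np.linspace(2.0, 20.0, 10)
thresholds = np.linspace(0.0, 1.0, 41)
t_fear, _ = fit_logistic_threshold(x, simulate(0.35), slopes, thresholds)
t_happy, _ = fit_logistic_threshold(x, simulate(0.55), slopes, thresholds)
```

A lower fitted threshold for the fearful-face condition corresponds to the detection advantage: less signal is needed before faces are reliably detected in noise.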


Subject(s)
Attention , Facial Expression , Fear , Happiness , Face , Humans , Infant
12.
PLoS One ; 10(6): e0129812, 2015.
Article in English | MEDLINE | ID: mdl-26068460

ABSTRACT

Young infants are typically thought to prefer looking at smiling expressions. Although some accounts suggest that the preference is automatic and universal, we hypothesized that it is not rigid and may be influenced by other face dimensions, most notably the face's gender. Infants are sensitive to the gender of faces; for example, 3-month-olds raised by female caregivers typically prefer female over male faces. We presented neutral versus smiling pairs of faces from the same female or male individuals to 3.5-month-old infants (n = 25), controlling for low-level cues. Infants looked longer at the smiling face when faces were female but longer at the neutral face when faces were male, i.e., there was an effect of face gender on the looking preference for smiling. The results indicate that a preference for smiling in 3.5-month-olds is limited to female faces, possibly reflective of differential experience with male and female faces.


Subject(s)
Discrimination, Psychological , Face , Infant Behavior/psychology , Smiling , Visual Perception/physiology , Female , Humans , Infant , Male , Sex Factors
13.
Dev Psychobiol ; 57(5): 637-42, 2015 Jul.
Article in English | MEDLINE | ID: mdl-25952509

ABSTRACT

Perceptual narrowing has been observed in human infants for monkey faces: 6-month-olds can discriminate between them, whereas older infants from 9 months of age display difficulty discriminating between them. The source of the difficulty that infants from 9 months of age have in processing monkey faces has not been clearly identified. It could be due to the structural characteristics of monkey faces, particularly the key facial features that differ from human faces. The current study aimed to investigate whether the information conveyed by the eyes is of importance. We examined whether the presence of Caucasian human eyes in monkey faces allows recognition to be maintained in 6-month-olds and facilitates recognition in 9- and 12-month-olds. Our results revealed that the presence of human eyes in monkey faces maintains recognition for those faces at 6 months of age and partially facilitates recognition of those faces at 9 months of age, but not at 12 months of age. The findings are interpreted in the context of perceptual narrowing and suggest that the attenuation of processing of other-species faces is not reversed by the presence of human eyes.


Subject(s)
Facial Recognition , Psychology, Child , Age Factors , Animals , Child Development , Discrimination, Psychological , Eye/anatomy & histology , Female , Haplorhini , Humans , Infant , Male , Photic Stimulation
14.
Front Psychol ; 6: 346, 2015.
Article in English | MEDLINE | ID: mdl-25859238

ABSTRACT

Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus driven or top-down belief driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward "male" responding in children as young as 5-6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1-2). The developmental course of the angry-male bias, along with its extension to other-race faces, combine to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with both stimulus-driven and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling.
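A minimal version of a PCA-based gender-categorization simulation of the kind mentioned in this abstract might look as follows. This is an assumption-laden sketch, not the paper's Experiment 3: the "faces" are random feature vectors, the "masculine" and anger-related features are invented for illustration, and gender is read out by nearest centroid in the top principal components.

```python
import numpy as np

def fit_pca(X, n_components):
    """PCA via SVD: return (mean, top principal components)."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:n_components]

def transform(X, mu, comps):
    return (X - mu) @ comps.T

rng = np.random.default_rng(5)
n, d = 50, 60
male = rng.normal(size=(n, d))
male[:, :10] += 1.5                  # hypothetical "masculine" features
female = rng.normal(size=(n, d))
angry_female = female.copy()
angry_female[:, :10] += 0.8          # anger shifts those same features

mu, comps = fit_pca(np.vstack([male, female]), 5)
zm = transform(male, mu, comps).mean(axis=0)    # "male" centroid in PC space
zf = transform(female, mu, comps).mean(axis=0)  # "female" centroid

def male_rate(faces):
    """Fraction of faces categorized 'male' by nearest centroid."""
    z = transform(faces, mu, comps)
    return float(np.mean(np.linalg.norm(z - zm, axis=1)
                         < np.linalg.norm(z - zf, axis=1)))

rate_neutral = male_rate(female)
rate_angry = male_rate(angry_female)
```

When anger and masculinity load on overlapping features, as assumed here, angry female faces drift toward the male centroid, reproducing an angry-male bias from the representation alone.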

15.
Conscious Cogn ; 33: 1-15, 2015 May.
Article in English | MEDLINE | ID: mdl-25497406

ABSTRACT

Humans readily introspect upon their thoughts and their behavior, but how reliable are these subjective reports? In the present study, we explored the consistencies of and differences between the observer's subjective report and actual behavior within a single trial. On each trial of a serial search task, we recorded eye movements and the participants' beliefs of where their eyes moved. The comparison of reported versus real eye movements revealed that subjects successfully reported a subset of their eye movements. Limits in subjective reports stemmed from both the number and the type of eye movements. Furthermore, subjects sometimes reported eye movements they actually never made. A detailed examination of these reports suggests that they could reflect covert shifts of attention during overt serial search. Our data provide quantitative and qualitative measures of observers' subjective reports and reveal experimental effects of visual search that would otherwise be inaccessible.


Subject(s)
Fixation, Ocular , Adult , Attention , Eye Movement Measurements , Eye Movements , Female , Humans , Male , Pattern Recognition, Visual , Young Adult