1.
Sensors (Basel) ; 23(18)2023 Sep 16.
Article in English | MEDLINE | ID: mdl-37765987

ABSTRACT

There have been sustained efforts toward using naturalistic methods in developmental science to measure infant behaviors in the real world from an egocentric perspective because statistical regularities in the environment can shape and be shaped by the developing infant. However, there is no user-friendly and unobtrusive technology to densely and reliably sample life in the wild. To address this gap, we present the design, implementation and validation of the EgoActive platform, which addresses limitations of existing wearable technologies for developmental research. EgoActive records the active infants' egocentric perspective of the world via a miniature wireless head-mounted camera concurrently with their physiological responses to this input via a lightweight, wireless ECG/acceleration sensor. We also provide software tools to facilitate data analyses. Our validation studies showed that the cameras and body sensors performed well. Families also reported that the platform was comfortable, easy to use and operate, and did not interfere with daily activities. The synchronized multimodal data from the EgoActive platform can help tease apart complex processes that are important for child development to further our understanding of areas ranging from executive function to emotion processing and social learning.
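The abstract above does not detail how the head-camera and ECG/acceleration streams are synchronized. As a purely illustrative sketch (not the EgoActive software itself), the following Python snippet shows one common way to align timestamped ECG samples with video frames recorded at a known frame rate; all names and parameters are hypothetical assumptions.

```python
import numpy as np

def align_ecg_to_frames(ecg_t, ecg_v, video_start_t, fps, n_frames):
    """Assign to each video frame the ECG sample nearest in time.

    ecg_t: 1-D numpy array of ECG sample timestamps (seconds, common clock)
    ecg_v: 1-D numpy array of ECG sample values
    video_start_t: timestamp of the first video frame (seconds)
    fps: video frame rate
    n_frames: number of frames in the recording
    Returns an array of length n_frames with the nearest ECG value per frame.
    """
    frame_times = video_start_t + np.arange(n_frames) / fps
    # Candidate insertion points of each frame time into the ECG time vector
    idx = np.searchsorted(ecg_t, frame_times)
    idx = np.clip(idx, 1, len(ecg_t) - 1)
    # Pick whichever neighbouring ECG sample is closer in time
    prev_closer = (frame_times - ecg_t[idx - 1]) < (ecg_t[idx] - frame_times)
    idx = np.where(prev_closer, idx - 1, idx)
    return ecg_v[idx]
```

Nearest-sample alignment of this kind assumes both devices share (or have been mapped onto) a common clock; wireless sensors typically also need an offset and drift correction step before such alignment is meaningful.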


Subject(s)
Wearable Electronic Devices , Infant , Child , Humans , Software , Technology , Autonomic Nervous System
2.
Infancy ; 28(4): 820-835, 2023.
Article in English | MEDLINE | ID: mdl-36917082

ABSTRACT

Human body postures provide perceptual cues that can be used to discriminate and recognize emotions. It was previously found that 7-month-olds' fixation patterns discriminated fear from other emotion body expressions, but it is not clear whether they also process the emotional content of those expressions. The emotional content of visual stimuli can increase arousal level, resulting in pupil dilations. To provide evidence that infants also process the emotional content of expressions, we analyzed variations in pupil size in response to emotion stimuli. Forty-eight 7-month-old infants viewed adult body postures expressing anger, fear, happiness and neutral expressions, while their pupil size was measured. There was a significant emotion effect between 1040 and 1640 ms after image onset, when fear elicited larger pupil dilations than neutral expressions. A similar trend was found for anger expressions. Our results suggest that infants have increased arousal to negative-valence body expressions. Thus, in combination with previous fixation results, the pupil data show that infants as young as 7 months can perceptually discriminate static body expressions and process the emotional content of those expressions. The results extend information about infant processing of emotion expressions conveyed through other means (e.g., faces).
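The emotion effect reported above rests on comparing mean pupil size between conditions within a fixed post-onset time window (1040-1640 ms). Below is a minimal sketch of that kind of window analysis; the window bounds come from the abstract, while the data layout, function names and paired t test are assumptions rather than the authors' actual pipeline.

```python
import numpy as np
from scipy import stats

def mean_pupil_in_window(pupil, times, t_start=1.040, t_end=1.640):
    """Average pupil diameter within a post-stimulus time window.

    pupil: array (n_trials, n_samples) of baseline-corrected pupil diameters
    times: array (n_samples,) of sample times in seconds relative to image onset
    """
    mask = (times >= t_start) & (times <= t_end)
    return pupil[:, mask].mean(axis=1)

# Hypothetical per-infant condition means (fear vs. neutral), paired comparison:
# fear_means, neutral_means = ...  # arrays of length n_infants
# t, p = stats.ttest_rel(fear_means, neutral_means)
```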


Subject(s)
Emotions , Pupil , Adult , Humans , Infant , Pupil/physiology , Emotions/physiology , Fear , Anger , Arousal/physiology
3.
J Exp Child Psychol ; 224: 105497, 2022 12.
Article in English | MEDLINE | ID: mdl-35850023

ABSTRACT

Body movements provide a rich source of emotional information during social interactions. Although the ability to perceive biological motion cues related to those movements begins to develop during infancy, processing those cues to identify emotions likely continues to develop into childhood. Previous studies used posed or exaggerated body movements, which might not reflect the kind of body expressions children experience. The current study used an event-related potential (ERP) priming paradigm to investigate the development of emotion recognition from more naturalistic body movements. Point-light displays (PLDs) of male adult bodies expressing happy or angry emotional movements while narrating a story were used as prime stimuli, whereas audio recordings of the words "happy" and "angry" spoken with an emotionally neutral prosody were used as targets. We recorded the ERPs time-locked to the onset of the auditory target from 3- and 6-year-old children, and we compared amplitude and latency of the N300 and N400 responses between the two age groups in the different prime-target conditions. There was an overall effect of prime for the N300 amplitude, with more negative-going responses for happy PLDs compared with angry PLDs. There was also an interaction between prime and target for the N300 latency, suggesting that all children were sensitive to the emotional congruency between body movements and words. For the N400 component, there was only an interaction among age, prime, and target for latency, suggesting an age-dependent modulation of this component when prime and target did not match in emotional information. Overall, our results suggest that the emergence of more complex emotion processing of body expressions occurs around 6 years of age, but it is not fully developed at this point in ontogeny.


Subject(s)
Electroencephalography , Evoked Potentials , Adult , Anger , Child , Child, Preschool , Cues , Emotions/physiology , Evoked Potentials/physiology , Facial Expression , Female , Humans , Male
5.
Sci Rep ; 11(1): 14744, 2021 07 20.
Article in English | MEDLINE | ID: mdl-34285305

ABSTRACT

Previous research has demonstrated that the tendency to form first impressions from facial appearance emerges early in development. We examined whether social referencing is one route through which these consistent first impressions are acquired. In Study 1, we show that 5- to 7-year-old children are more likely to choose a target face previously associated with positive non-verbal signals as more trustworthy than a face previously associated with negative non-verbal signals. In Study 2, we show that children generalise this learning to novel faces that resemble those who have previously been the recipients of positive non-verbal behaviour. Taken together, these data show one means through which individuals within a community could acquire consistent, and potentially inaccurate, first impressions of others' faces. In doing so, they highlight a route through which cultural transmission of first impressions can occur.


Subject(s)
Learning , Child , Facial Recognition , Female , Humans , Male , Photic Stimulation , Social Perception
6.
Biol Psychol ; 160: 108047, 2021 03.
Article in English | MEDLINE | ID: mdl-33596461

ABSTRACT

Recent findings indicate that 7-month-old infants perceive and represent the sounds inherent to moving human bodies. However, it is not known whether infants integrate auditory and visual information in representations of specific human actions. To address this issue, we used ERPs to investigate infants' neural sensitivity to the correspondence between sounds and images of human actions. In a cross-modal priming paradigm, 7-month-olds were presented with the sounds generated by two types of human body movement, walking and handclapping, after watching the kinematics of those actions in either a congruent or incongruent manner. ERPs recorded from frontal, central and parietal electrodes in response to action sounds indicate that 7-month-old infants perceptually link the visual and auditory cues of human actions. However, at this age these percepts do not seem to be integrated into cognitive multimodal representations of human actions.


Subject(s)
Cues , Evoked Potentials , Acoustic Stimulation , Auditory Perception , Humans , Infant , Movement , Photic Stimulation
7.
Infant Behav Dev ; 60: 101473, 2020 08.
Article in English | MEDLINE | ID: mdl-32739668

ABSTRACT

The human body is an important source of information to infer a person's emotional state. Research with adult observers indicates that the posture of the torso, arms and hands provides important perceptual cues for recognising anger, fear and happy expressions. Much less is known about whether infants process body regions differently for different body expressions. To address this issue, we used eye tracking to investigate whether infants' visual exploration patterns differed when viewing body expressions. Forty-eight 7-month-old infants were randomly presented with static images of adult female bodies expressing anger, fear and happiness, as well as an emotionally-neutral posture. Facial cues to emotional state were removed by masking the faces. We measured the proportion of looking time, proportion and number of fixations, and duration of fixations on the head, upper body and lower body regions for the different expressions. We showed that infants explored the upper body more than the lower body. Importantly, infants at this age fixated differently on different body regions depending on the expression of the body posture. In particular, infants spent a larger proportion of their looking times and had longer fixation durations on the upper body for fear relative to the other expressions. These results extend and replicate information about infant processing of emotional expressions displayed by human bodies, and they support the hypothesis that infants' visual exploration of human bodies is driven by the upper body.


Subject(s)
Emotions/physiology , Exploratory Behavior/physiology , Eye Movements/physiology , Eye-Tracking Technology , Gestures , Posture/physiology , Adult , Facial Expression , Female , Humans , Infant , Male , Photic Stimulation/methods , Random Allocation , Recognition, Psychology/physiology
8.
Cortex ; 117: 323-335, 2019 08.
Article in English | MEDLINE | ID: mdl-31200126

ABSTRACT

In human adults, the auditory representation of others' actions can activate specific areas of the motor and premotor cortices. Here, we examined the early origins of the neural processing of action sounds to investigate whether and how infants rely on auditory information to understand their close social environment. Sensorimotor activity, as indexed by µ rhythm suppression, was measured using electroencephalography in 14-month-old infants who listened to hand- and foot-produced action sounds (i.e., footsteps and clapping) and to mechanical sounds (i.e., blender). Footstep sounds elicited activation at midline electrodes over the foot area (Cz), but not at the lateralized clusters over the hand areas (C3 and C4). Greater activation in response to clapping compared to blender and footstep sounds was recorded at electrodes in the left central cluster, over the hand sensorimotor cortex (i.e., C3), but extended to some extent over the midline electrode cluster. Furthermore, our results underscore the role of natural locomotor experience in shaping sensorimotor activation, since infants who gained more walking experience exhibited stronger sensorimotor activation for footstep sounds over left central electrodes. Taken together, the current results provide the first evidence that action sounds produced by another person can elicit sensorimotor activation during infancy.


Subject(s)
Auditory Perception/physiology , Brain Waves/physiology , Sensorimotor Cortex/physiology , Electroencephalography , Female , Humans , Infant , Learning/physiology , Male , Sound
9.
PLoS One ; 13(12): e0208524, 2018.
Article in English | MEDLINE | ID: mdl-30521593

ABSTRACT

The current research explored toddlers' gaze fixation during a scene showing a person expressing sadness after a ball is stolen from her. The relation between the duration of gaze fixation on different parts of the person's sad face (e.g., eyes, mouth) and theory of mind skills was examined. Eye tracking data indicated that before the actor experienced the negative event, toddlers divided their fixation equally between the actor's happy face and other distracting objects, but looked longer at the face after the ball was stolen and she expressed sadness. The strongest predictor of increased focus on the sad face versus other elements of the scene was toddlers' ability to predict others' emotional reactions when outcomes fulfilled (happiness) or failed to fulfill (sadness) desires, whereas toddlers' visual perspective-taking skills predicted their more specific focusing on the actor's eyes and, for boys only, mouth. Furthermore, gender differences emerged in toddlers' fixation on parts of the scene. Taken together, these findings suggest that top-down processes are involved in the scanning of emotional facial expressions in toddlers.


Subject(s)
Attention , Facial Expression , Visual Perception , Child, Preschool , Emotions , Face , Female , Fixation, Ocular , Happiness , Humans , Male , Photic Stimulation , Sadness , Sex Factors
10.
Sci Rep ; 8(1): 17152, 2018 11 21.
Article in English | MEDLINE | ID: mdl-30464309

ABSTRACT

Infants are sensitive to and converge emotionally with peers' distress. It is unclear whether these responses extend to positive affect and whether observing peer emotions motivates infants' behaviors. This study investigates 8-month-olds' asymmetric frontal EEG during peers' cry and laughter, and its relation to approach and withdrawal behaviors. Participants observed videos of infants crying or laughing during two separate sessions. Frontal EEG alpha power was recorded during the first session, while infants' behaviors and emotional expressions were recorded during the second session. Facial and vocal expressions of affect suggest that infants converge emotionally with their peers' distress and, to a certain extent, with their happiness. At the group level, the crying peer elicited right-lateralized frontal activity. However, those infants with reduced right and increased left frontal activity in this situation were more likely to approach their peer. Overall, 8-month-olds did not show asymmetric frontal activity in response to peer laughter. However, infants who tended to look longer at their happy peer were more likely to respond with left-lateralized frontal activity. The link between variations in left frontal activity and simple approach behaviors indicates the presence of a motivational dimension to infants' responses to distressed peers.
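Asymmetric frontal activity of the kind described above is commonly quantified as a frontal alpha asymmetry index: the difference of log-transformed alpha power at homologous right and left frontal electrodes, where higher values indicate relatively greater left frontal activity because alpha power is inversely related to cortical activation. The sketch below shows that conventional index; it is a generic illustration, not necessarily the exact computation used in this study.

```python
import numpy as np

def frontal_alpha_asymmetry(left_alpha_power, right_alpha_power):
    """Conventional frontal asymmetry index: ln(right) - ln(left) alpha power.

    Higher (more positive) values indicate relatively greater LEFT frontal
    activity, the pattern conventionally linked to approach motivation.
    """
    return np.log(right_alpha_power) - np.log(left_alpha_power)

# Example with hypothetical per-infant mean alpha power at F3 (left) and F4 (right):
# fa = frontal_alpha_asymmetry(f3_alpha, f4_alpha)
```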


Subject(s)
Emotions , Frontal Lobe/physiology , Infant Behavior , Interpersonal Relations , Electroencephalography , Facial Expression , Female , Humans , Infant , Male , Video Recording
11.
JMIR Ment Health ; 5(3): e10153, 2018 Aug 08.
Article in English | MEDLINE | ID: mdl-30089610

ABSTRACT

BACKGROUND: Research in psychology has shown that the way a person walks reflects that person's current mood (or emotional state). Recent studies have used mobile phones to detect emotional states from movement data. OBJECTIVE: The objective of our study was to investigate the use of movement sensor data from a smartwatch to infer an individual's emotional state. We present our findings from a user study with 50 participants. METHODS: The experimental design is a mixed-design study: within-subjects (emotions: happy, sad, and neutral) and between-subjects (stimulus type: audiovisual "movie clips" and audio "music clips"). Each participant experienced both emotions in a single stimulus type. All participants walked 250 m while wearing a smartwatch on one wrist and a heart rate monitor strap on the chest. They also had to answer a short questionnaire (20 items; Positive Affect and Negative Affect Schedule, PANAS) before and after experiencing each emotion. The data obtained from the heart rate monitor served as supplementary information to our data. We performed time series analysis on data from the smartwatch and a t test on questionnaire items to measure the change in emotional state. Heart rate data were analyzed using one-way analysis of variance. We extracted features from the time series using sliding windows and used these features to train and validate classifiers that determined an individual's emotion. RESULTS: Overall, 50 young adults participated in our study; of them, 49 were included for the affective PANAS questionnaire and 44 for the feature extraction and building of personal models. Participants reported feeling less negative affect after watching sad videos or after listening to sad music, P<.006. For the task of emotion recognition using classifiers, our results showed that personal models outperformed personal baselines and achieved median accuracies higher than 78% across all conditions of the study design for binary classification of happiness versus sadness. CONCLUSIONS: Our findings show that we are able to detect changes in emotional state as well as in behavioral responses with data obtained from the smartwatch. Together with the high accuracies achieved across all users for classification of happy versus sad emotional states, this is further evidence for the hypothesis that movement sensor data can be used for emotion recognition.
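The METHODS above describe sliding-window feature extraction from the smartwatch movement time series followed by per-participant ("personal") classifiers for happy versus sad. The snippet below is a hedged sketch of that general pipeline; the window length, feature set, and random-forest classifier are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def window_features(signal, win=128, step=64):
    """Slide a window over a 1-D numpy movement signal (e.g., accelerometer
    magnitude) and compute simple statistical features per window:
    mean, standard deviation, min, max, and RMS."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), w.min(), w.max(),
                      np.sqrt((w ** 2).mean())])
    return np.array(feats)

# Hypothetical personal model for one participant:
# X_happy = window_features(happy_walk)   # happy_walk: accel magnitude while "happy"
# X_sad = window_features(sad_walk)       # sad_walk: accel magnitude while "sad"
# X = np.vstack([X_happy, X_sad])
# y = np.array([1] * len(X_happy) + [0] * len(X_sad))
# scores = cross_val_score(RandomForestClassifier(), X, y, cv=5)
```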

12.
Biol Psychol ; 135: 117-127, 2018 05.
Article in English | MEDLINE | ID: mdl-29596956

ABSTRACT

Infants' ability to process others' emotional expressions is fundamental for their social development. While infants' processing of emotions expressed by faces and speech has been more extensively investigated, less is known about how infants process non-verbal vocalizations of emotions. Here, we recorded frontal N100, P200, and LPC event-related potentials (ERPs) from 8-month-old infants listening to sounds of other infants crying, laughing, and coughing. Infants' temperament was measured via parental report. Results showed that processing of emotional information from non-verbal vocalizations was associated with more negative N100 and greater LPC amplitudes for peers' crying sounds relative to positive and neutral sounds. Temperament was further related to the N100, P200, and LPC difference scores between conditions. One important finding was that infants with a better ability to regulate arousal exhibited increased sustained processing of peers' cry sounds compared to both laughter and cough sounds. These results emphasize the relevance of considering temperamental characteristics in understanding the development of infant emotion information processing, as well as for formulating comprehensive theoretical models of typical and atypical social development.


Subject(s)
Arousal/physiology , Crying/psychology , Evoked Potentials/physiology , Individuality , Laughter/psychology , Auditory Perception , Emotions/physiology , Female , Humans , Infant , Interpersonal Relations , Male , Temperament/physiology
13.
Sci Rep ; 7(1): 17500, 2017 12 13.
Article in English | MEDLINE | ID: mdl-29235500

ABSTRACT

Viewing facial expressions often evokes facial responses in the observer. These spontaneous facial reactions (SFRs) are believed to play an important role in social interactions. However, their developmental trajectory and the underlying neurocognitive mechanisms are still poorly understood. In the current study, 4- and 7-month-old infants were presented with facial expressions of happiness, anger, and fear. Electromyography (EMG) was used to measure activation in muscles relevant for forming these expressions: zygomaticus major (smiling), corrugator supercilii (frowning), and frontalis (forehead raising). The results indicated no selective activation of the facial muscles for the expressions in 4-month-old infants. For 7-month-old infants, evidence for selective facial reactions was found especially for happy faces (leading to increased zygomaticus major activation) and fearful faces (leading to increased frontalis activation), while angry faces did not show a clear differential response. These results suggest that emotional SFRs may be the result of complex neurocognitive mechanisms which lead to partial mimicry but are also likely to be influenced by evaluative processes. Such mechanisms seem to undergo important developments at least until the second half of the first year of life.


Subject(s)
Emotions , Facial Expression , Facial Muscles/physiology , Social Behavior , Electromyography , Emotions/physiology , Facial Muscles/growth & development , Facial Recognition/physiology , Female , Humans , Infant , Male , Psychological Tests , Psychology, Child
14.
Neuropsychologia ; 91: 99-108, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27480921

ABSTRACT

Perceiving emotion from multiple modalities enhances the perceptual sensitivity of an individual. This allows more accurate judgments of others' emotional states, which is crucial to appropriate social interactions. It is known that body expressions effectively convey emotional messages, although fewer studies have examined how this information is combined with the auditory cues. The present study used event-related potentials (ERP) to investigate the interaction between emotional body expressions and vocalizations. We also examined emotional congruency between auditory and visual information to determine how preceding visual context influences later auditory processing. Consistent with prior findings, a reduced N1 amplitude was observed in the audiovisual condition compared to an auditory-only condition. While this component was not sensitive to the modality congruency, the P2 was sensitive to the emotionally incompatible audiovisual pairs. Further, the direction of these congruency effects was different in terms of facilitation or suppression based on the preceding contexts. Overall, the results indicate a functionally dissociated mechanism underlying two stages of emotional processing whereby N1 is involved in cross-modal processing, whereas P2 is related to assessing a unifying perceptual content. These data also indicate that emotion integration can be affected by the specific emotion that is presented.


Subject(s)
Brain/physiology , Emotions , Motion Perception/physiology , Social Perception , Speech Perception/physiology , Electroencephalography , Evoked Potentials , Female , Gestures , Humans , Male , Neuropsychological Tests , Young Adult
15.
Curr Biol ; 26(14): R663-4, 2016 07 25.
Article in English | MEDLINE | ID: mdl-27458908

ABSTRACT

Emotional facial expressions are thought to have evolved because they play a crucial role in species' survival. From infancy, humans develop dedicated neural circuits [1] to exhibit and recognize a variety of facial expressions [2]. But there is increasing evidence that culture specifies when and how certain emotions can be expressed - social norms - and that the mature perceptual mechanisms used to transmit and decode the visual information from emotional signals differ between Western and Eastern adults [3-5]. Specifically, the mouth is more informative for transmitting emotional signals in Westerners and the eye region for Easterners [4], generating culture-specific fixation biases towards these features [5]. During development, it is recognized that cultural differences can be observed at the level of emotional reactivity and regulation [6] and in the culturally dominant modes of attention [7]. Nonetheless, to our knowledge no study has explored whether culture shapes the processing of facial emotional signals early in development. The data we report here show that, by 7 months, infants from both cultures visually discriminate facial expressions of emotion by relying on culturally distinct fixation strategies, resembling those used by the adults from the environment in which they develop [5].


Subject(s)
Culture , Emotions , Visual Perception , Attention , Facial Expression , Humans , Infant , Japan , United Kingdom
16.
J Exp Child Psychol ; 144: 1-14, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26687335

ABSTRACT

Rapid facial reactions (RFRs) to observed emotional expressions are proposed to be involved in a wide array of socioemotional skills, from empathy to social communication. Two of the most persuasive theoretical accounts propose RFRs to rely either on motor resonance mechanisms or on more complex mechanisms involving affective processes. Previous studies demonstrated that presentation of facial and bodily expressions can generate rapid changes in adult and school-age children's muscle activity. However, to date there is little to no evidence to suggest the existence of emotional RFRs from infancy to preschool age. To investigate whether RFRs are driven by motor mimicry or could also be a result of emotional appraisal processes, we recorded facial electromyographic (EMG) activation from the zygomaticus major and frontalis medialis muscles to presentation of static facial and bodily expressions of emotions (i.e., happiness, anger, fear, and neutral) in 3-year-old children. Results showed no specific EMG activation in response to bodily emotion expressions. However, observing others' happy faces led to increased activation of the zygomaticus major and decreased activation of the frontalis medialis, whereas observing others' angry faces elicited the opposite pattern of activation. This study suggests that RFRs are the result of complex mechanisms in which both affective processes and motor resonance may play an important role.


Subject(s)
Electromyography/methods , Emotions/physiology , Facial Expression , Facial Muscles/physiology , Imitative Behavior/physiology , Nonverbal Communication/physiology , Posture/physiology , Child, Preschool , Female , Humans , Male
17.
Dev Cogn Neurosci ; 12: 134-44, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25732377

ABSTRACT

Recent evidence suggests that human adults perceive human action sounds as a distinct category from human vocalizations, environmental, and mechanical sounds, activating different neural networks (Engel et al., 2009; Lewis et al., 2011). Yet, little is known about the development of such specialization. Using event-related potentials (ERP), this study investigated neural correlates of 7-month-olds' processing of human action (HA) sounds in comparison to human vocalizations (HV), environmental (ENV), and mechanical (MEC) sounds. Relative to the other categories, HA sounds led to increased positive amplitudes between 470 and 570 ms post-stimulus onset at left anterior temporal locations, while HV led to increased negative amplitudes at the more posterior temporal locations in both hemispheres. Collectively, human-produced sounds (HA+HV) led to significantly different response profiles compared to non-living sound sources (ENV+MEC) at parietal and frontal locations in both hemispheres. Overall, by 7 months of age human action sounds are being differentially processed in the brain, consistent with a dichotomy for processing living versus non-living things. This provides novel evidence regarding the typical categorical processing of socially relevant sounds.


Subject(s)
Acoustic Stimulation , Auditory Perception/physiology , Brain/physiology , Evoked Potentials, Auditory , Brain Mapping , Electroencephalography , Female , Frontal Lobe/physiology , Humans , Infant , Male , Parietal Lobe/physiology , Temporal Lobe/physiology
18.
J Exp Child Psychol ; 129: 55-67, 2015 Jan.
Article in English | MEDLINE | ID: mdl-25247632

ABSTRACT

From a very young age, infants perceive others' actions as goal directed. Yet, the processes underlying this competence are still debated. In this study, we investigated whether (a) 4- and 6-month-old infants and adults discriminate the biomechanical properties of the human hand within an action context, (b) the manipulation of the biomechanics of hand movements has an impact on the ability to anticipate the goal of an action, and (c) the emergence of motor experience with grasping is related to infants' ability to discriminate the biomechanics of hand movements and to anticipate the action goal. The 6-month-olds discriminated between biomechanically possible and impossible grasps, and in some (but not all) instances they made more anticipatory gaze shifts toward the goal of the possible action. Both the 4- and 6-month-olds' processing of biomechanical properties of the hand were significantly related to their ability to anticipate the goal of a grasping action. Importantly, those 4-month-olds with higher precision grasping skills manifested faster anticipatory gazes toward the goal of the action. These findings suggest that multiple sources of information from an action scene are interdependent and that both perceptual information and motor experience with an action are relevant for on-line prediction of the final goal of the action.


Subject(s)
Motion Perception , Movement , Age Factors , Biomechanical Phenomena , Female , Goals , Humans , Infant , Male , Motion , Psychology, Child , Young Adult
19.
Soc Cogn Affect Neurosci ; 8(4): 432-7, 2013 Apr.
Article in English | MEDLINE | ID: mdl-22317745

ABSTRACT

The ability to infer other people's mental states such as desires, emotions, intentions and beliefs is essential for successful social interactions, and it is usually referred to as theory of mind (ToM). In particular, the ability to detect and understand that people have beliefs about reality that may be false is considered an important hallmark of ToM. This experiment reports on the results of 18 participants who viewed photographic sequences of an actress performing actions as a consequence of true and false beliefs. Consistent with prior work, results from the passive viewing of stimuli depicting true belief indicated an increased response over frontal, central and parietal regions when compared with the amplitude for the false belief condition. These results show that (i) frontal activity is required for processing false belief tasks and (ii) parietal effects reported in previous studies to reflect specific cognitive process of monitoring others' beliefs can be elicited in the absence of an explicit instruction for mentalizing.


Subject(s)
Concept Formation/physiology , Culture , Mental Processes/physiology , Theory of Mind/physiology , Adolescent , Adult , Brain/physiology , Cognition/physiology , Emotions/physiology , Female , Humans , Male , Reality Testing , Young Adult
20.
PLoS One ; 6(11): e27132, 2011.
Article in English | MEDLINE | ID: mdl-22110605

ABSTRACT

It has been suggested that infants resonate emotionally to others' positive and negative affect displays, and that these responses become stronger towards emotions with negative valence around the age of 12 months. In this study we measured 6- and 12-month-old infants' changes in pupil diameter when presented with the image and sound of peers experiencing happiness, distress and an emotionally neutral state. For all participants, the perception of another's distress triggered larger pupil diameters. Perceiving others' happiness also induced larger pupil diameters, but for shorter time intervals. Importantly, we also found evidence for an asymmetry in autonomic arousal towards positive versus negative emotional displays. Larger pupil sizes for another's distress compared to another's happiness were recorded shortly after stimulus onset for the older infants, and in a later time window for the 6-month-olds. These findings suggest that arousal responses for negative as well as for positive emotions are present in the second half of the first postnatal year. Importantly, an asymmetry with stronger responses for negative emotions seems to be already present at this age.


Subject(s)
Emotions/physiology , Pupil , Female , Humans , Infant , Male , Photic Stimulation , Time Factors , Visual Perception/physiology