Results 1 - 20 of 60
1.
PLoS One ; 19(1): e0290765, 2024.
Article in English | MEDLINE | ID: mdl-38194416

ABSTRACT

A close relationship between emotional contagion and spontaneous facial mimicry has been theoretically proposed and is supported by empirical data. Facial expressions are essential in terms of both emotional and motor synchrony. Previous studies have demonstrated that trait emotional empathy enhances spontaneous facial mimicry, but the relationship between autistic traits and spontaneous mimicry remains controversial. Moreover, previous studies presented faces that were static or videotaped, which may lack the "liveliness" of real-life social interactions. We addressed this limitation by using an image relay system to present live performances and pre-recorded videos of smiling or frowning dynamic facial expressions to 94 healthy female participants. We assessed their subjective experiential valence and arousal ratings to infer the amplitude of emotional contagion. We measured the electromyographic activities of the zygomaticus major and corrugator supercilii muscles to estimate spontaneous facial mimicry. Individual difference measures included trait emotional empathy (empathic concern) and the autism-spectrum quotient. We did not find that live performances enhanced the modulatory effect of trait differences on emotional contagion or spontaneous facial mimicry. However, we found that high trait empathic concern was associated with stronger emotional contagion and corrugator mimicry. We found no two-way interaction between the autism-spectrum quotient and emotional condition, suggesting that autistic traits did not modulate emotional contagion or spontaneous facial mimicry. Our findings imply that previous findings regarding the relationship between emotional empathy and emotional contagion/spontaneous facial mimicry using videos and photos could be generalized to real-life interactions.
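
The abstract does not spell out how mimicry was quantified from the EMG recordings. As a rough, hypothetical sketch (the function name, baseline/stimulus windows, sampling rate, and simulated values are assumptions, not the authors' pipeline), spontaneous facial mimicry is often indexed as the baseline-corrected change in rectified muscle activity during stimulus presentation:

```python
import numpy as np

def mimicry_index(emg, fs, baseline_s=1.0, stim_s=3.0):
    """Baseline-corrected EMG response for one trial.

    emg: 1-D rectified EMG trace covering the baseline and stimulus periods.
    fs: sampling rate in Hz.
    Returns the percent change of the stimulus-period mean relative to baseline.
    """
    n_base = int(baseline_s * fs)
    n_stim = int(stim_s * fs)
    baseline = emg[:n_base].mean()
    response = emg[n_base:n_base + n_stim].mean()
    return 100.0 * (response - baseline) / baseline

# Hypothetical single trial: 1 s baseline + 3 s stimulus at 1000 Hz (fake data).
rng = np.random.default_rng(0)
trial = np.abs(rng.normal(10.0, 1.0, 4000))
trial[1000:] += 2.0  # simulated mimicry-related increase during the stimulus
print(f"zygomaticus change: {mimicry_index(trial, fs=1000):.1f} %")
```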


Subject(s)
Autistic Disorder , Empathy , Female , Humans , Social Interaction , Emotions , Smiling
2.
Emotion ; 24(1): 15-26, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37227829

ABSTRACT

Coherence between subjective experience and bodily responses in emotion is assumed to have a positive influence on well-being, which might be particularly valuable in late adulthood. Previous studies of young adults' continuous subjective, behavioral, and physiological responses to emotional films reported emotional mind-body coherence. In contrast, research regarding emotional coherence in older adults has been scarce. In this study, we examined emotional coherence in older adults between continuous valence ratings and behavioral responses (facial electromyography [EMG] of the corrugator supercilii and zygomatic major muscles), as well as between continuous arousal ratings and physiological measures (electrodermal activity [EDA] and fingertip temperature), in response to four emotion-eliciting film clips (anger, sadness, contentment, and amusement) and an emotionally neutral clip. Intraindividual cross-correlation analyses revealed that the coherence between valence ratings and corrugator EMG activity for the anger-eliciting film was weaker in older adults than in young adults, who had completed an identical experiment. Age differences also emerged in the coherence of arousal ratings with EDA and fingertip temperature while participants watched the anger-eliciting and contentment-eliciting films, respectively: negative correlations were found for older adults, whereas positive correlations were found for young adults. These results indicate that emotional mind-body coherence differs somewhat, both quantitatively and qualitatively, between older and young adults.
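
The intraindividual cross-correlation between a continuous rating series and an EMG series can be illustrated with a minimal sketch; the 1 Hz sampling, the lag range, and the simulated data below are illustrative assumptions, not the study's actual parameters or results:

```python
import numpy as np

def lagged_correlations(ratings, emg, max_lag):
    """Pearson correlations between two equally sampled series at lags of
    -max_lag..+max_lag samples (a positive lag pairs emg[t] with ratings[t + lag])."""
    corrs = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            r, e = ratings[lag:], emg[:len(emg) - lag]
        else:
            r, e = ratings[:lag], emg[-lag:]
        corrs[lag] = np.corrcoef(r, e)[0, 1]
    return corrs

# Hypothetical per-participant series sampled at 1 Hz over a 120-s film clip.
rng = np.random.default_rng(1)
valence = np.cumsum(rng.normal(size=120))
corrugator = -valence + rng.normal(scale=2.0, size=120)  # fake coherent signal
cc = lagged_correlations(valence, corrugator, max_lag=5)
peak_lag = max(cc, key=lambda k: abs(cc[k]))
print(f"peak |r| = {abs(cc[peak_lag]):.2f} at lag {peak_lag} s")
```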


Subject(s)
Anger , Emotions , Young Adult , Humans , Aged , Adult , Emotions/physiology , Facial Muscles/physiology , Face , Sadness , Electromyography
3.
Front Psychol ; 14: 1284739, 2023.
Article in English | MEDLINE | ID: mdl-38078263

ABSTRACT

Speedy detection of faces with emotional value plays a fundamental role in social interactions. A few previous studies using a visual search paradigm have reported that individuals with high autistic traits (ATs), who are characterized by deficits in social interactions, demonstrated decreased detection performance for emotional facial expressions. However, whether ATs modulate the rapid detection of faces with emotional value remains inconclusive, because emotional facial expressions involve salient visual features (e.g., the U-shaped mouth of a happy expression) that can facilitate visual attention. To disentangle the effects of visual factors from the rapid detection of emotional faces, we examined the rapid detection of neutral faces associated with emotional value among young adults with varying degrees of ATs in a visual search task. In the experiment, participants performed a learning task wherein neutral faces were paired with monetary reward, monetary punishment, or no monetary outcome, such that the neutral faces acquired positive, negative, or no emotional value, respectively. During the subsequent visual search task, previously learned neutral faces were presented as discrepant faces among newly presented neutral distractor faces, and the participants were asked to detect the discrepant faces. The results demonstrated a significant negative association between the degree of ATs and the advantage in detecting punishment-associated neutral faces. This indicates decreased detection of faces with negative value in individuals with higher ATs, which may contribute to their difficulty in making prompt responses in social situations.
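
A minimal sketch of how such a detection advantage and its association with autistic traits could be computed (the variable names, sample size, and simulated values are hypothetical, and the authors' actual statistical model may differ):

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant mean reaction times (s) in the visual search task
# and autism-spectrum quotient (AQ) scores; values are made up for illustration.
rng = np.random.default_rng(2)
n = 40
aq = rng.integers(5, 35, size=n).astype(float)
rt_no_outcome = rng.normal(0.80, 0.05, size=n)
# Simulate a punishment-detection advantage that shrinks as AQ increases.
rt_punished = rt_no_outcome - (0.06 - 0.0015 * aq) + rng.normal(0, 0.02, size=n)

# Detection advantage: how much faster punishment-associated faces are found.
advantage = rt_no_outcome - rt_punished

r, p = stats.pearsonr(aq, advantage)
print(f"AQ vs punishment-detection advantage: r = {r:.2f}, p = {p:.3f}")
```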

4.
Biol Sex Differ ; 14(1): 84, 2023 11 14.
Article in English | MEDLINE | ID: mdl-37964327

ABSTRACT

BACKGROUND: Rapid detection of faces with emotional meaning is essential for understanding the emotions of others, possibly promoting successful interpersonal relationships. Although a few studies have examined sex differences in the ability to detect emotional faces, it remains unclear whether faces with emotional meaning capture the attention of females and males differently, because emotional faces have visual saliency that modulates visual attention. To overcome this issue, we tested the rapid detection of neutral faces with and without learned emotional value, which are free from differences in visual saliency. We examined sex differences in the rapid detection of neutral female and male faces associated with emotional value. METHODS: First, young adult female and male participants completed an associative learning task in which neutral faces were associated with monetary rewards, monetary punishments, or no monetary outcomes, such that the neutral faces acquired positive, negative, or no emotional value, respectively. Then, they engaged in a visual search task in which previously learned neutral faces were presented as discrepant faces among newly presented neutral distractor faces. During the visual search task, the participants were required to rapidly identify the discrepant faces. RESULTS: Female and male participants exhibited comparable learning abilities. The visual search results demonstrated that female participants rapidly detected neutral faces associated with emotional value irrespective of the sex of the faces presented, whereas male participants showed this ability only for male faces. CONCLUSIONS: Our results demonstrated that sex differences in the ability to rapidly detect neutral faces with emotional value were modulated by the sex of those faces. The results suggest greater sensitivity to faces with emotional significance in females, which might enrich interpersonal communication, regardless of the sex of the interaction partner.


Speedy detection of faces with emotional meaning plays a fundamental role in social interactions. However, it is unclear whether females and males differ in their ability to rapidly detect neutral faces associated with newly acquired emotional meaning/value. This study examined sex differences in the rapid detection of neutral female and male faces associated with emotional value after associative learning. During learning, neutral faces were paired with monetary reward or punishment, such that they acquired positive or negative emotional value, respectively. In a subsequent visual search task, previously learned neutral faces were presented as discrepant faces among newly presented neutral faces, and the participants had to rapidly identify the discrepant faces. The results showed that, among female participants, neutral faces associated with reward and punishment were detected more rapidly than neutral faces not associated with monetary outcomes, irrespective of the sex of the face stimuli. By contrast, male participants showed this rapid detection only for neutral male faces. The results suggest enhanced sensitivity to faces with emotional meaning among females, which is consistent with the notion of greater sensitivity to emotional/social information in females.


Subject(s)
Emotions , Sex Characteristics , Young Adult , Humans , Male , Female , Learning , Interpersonal Relations
5.
Sci Rep ; 13(1): 20727, 2023 11 25.
Article in English | MEDLINE | ID: mdl-38007578

ABSTRACT

The conscious perception of emotional facial expressions plays an indispensable role in social interaction. However, previous psychological studies have reported inconsistent findings regarding whether conscious awareness is greater for emotional expressions than for neutral expressions. Furthermore, whether this phenomenon is attributable to emotional or visual factors remains unknown. To investigate these issues, we conducted five psychological experiments to test the conscious perception of emotional and neutral facial expressions using the match-to-sample paradigm. Facial stimuli were momentarily presented in the peripheral visual fields while participants read simultaneously presented letters in the central visual fields. The participants selected a perceived face from nine samples. The results of all experiments demonstrated that emotional expressions were more accurately identified than neutral expressions. Furthermore, Experiment 4 showed that angry expressions were identified more accurately than anti-angry expressions, which expressed neutral emotions with comparable physical changes to angry expressions. Experiment 5, testing the interaction between emotional expression and face direction, showed that angry expressions looking toward participants were more accurately identified than those looking away from participants, even though they were physically identical. These results suggest that the conscious awareness of emotional facial expressions is enhanced by their emotional significance.


Subject(s)
Emotions , Facial Expression , Humans , Anger , Social Perception , Consciousness
6.
Front Med (Lausanne) ; 10: 1059203, 2023.
Article in English | MEDLINE | ID: mdl-37305136

ABSTRACT

Background: Humanitude approaches have shown positive effects in elderly care. However, the behavioral and neural underpinnings of empathic characteristics in Humanitude-care experts remain unknown. Methods: We investigated the empathic characteristics of a Humanitude-care expert (YG) and those of age-, sex-, and race-matched controls (n = 13). In a behavioral study, we measured subjective valence and arousal ratings and facial electromyography (EMG) of the corrugator supercilii and zygomatic major muscles while participants passively observed dynamic facial expressions associated with anger and happiness and their randomized mosaic patterns. In a functional magnetic resonance imaging (MRI) study, we measured brain activity while participants passively observed the same dynamic facial expressions and mosaics. In a structural MRI study, we acquired structural MRI data and analyzed gray matter volume. Results: Our behavioral data showed that YG experienced higher subjective arousal and showed stronger facial EMG activity congruent with stimulus facial expressions compared with controls. The functional MRI data demonstrated that YG showed stronger activity in the ventral premotor cortex (PMv; covering the precentral gyrus and inferior frontal gyrus) and posterior middle temporal gyrus in the right hemisphere in response to dynamic facial expressions versus dynamic mosaics compared with controls. The structural MRI data revealed higher regional gray matter volume in the right PMv in YG than in controls. Conclusion: These results suggest that Humanitude-care experts have behavioral and neural characteristics associated with empathic social interactions.

7.
Hum Brain Mapp ; 44(8): 3057-3071, 2023 06 01.
Article in English | MEDLINE | ID: mdl-36895114

ABSTRACT

Observing and understanding others' emotional facial expressions, possibly through motor synchronization, plays a primary role in face-to-face communication. To understand the underlying neural mechanisms, previous functional magnetic resonance imaging (fMRI) studies investigated brain regions involved in both the observation and execution of emotional facial expressions and found that the neocortical motor regions constituting the action observation/execution matching system, or mirror neuron system, were active. However, it remains unclear (1) whether other regions in the limbic system, cerebellum, and brainstem could also be involved in the observation/execution matching system for processing facial expressions, and (2) if so, whether these regions could constitute a functional network. To investigate these issues, we performed fMRI while participants observed dynamic facial expressions of anger and happiness and while they executed facial muscle activity associated with angry and happy facial expressions. Conjunction analyses revealed that, in addition to neocortical regions (i.e., the right ventral premotor cortex and right supplementary motor area), the bilateral amygdala, right basal ganglia, bilateral cerebellum, and right facial nerve nucleus were activated during both the observation and execution tasks. Group independent component analysis revealed that a functional network component involving the aforementioned regions was activated during both tasks. The data suggest that the motor synchronization of emotional facial expressions involves a widespread observation/execution matching network encompassing the neocortex, limbic system, basal ganglia, cerebellum, and brainstem.
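
As an illustration of the conjunction-analysis logic described above, here is a minimum-statistic sketch on made-up arrays; it is an assumption-laden toy, not the authors' fMRI pipeline, threshold, or software (which would typically be SPM or a similar package):

```python
import numpy as np

def conjunction_map(t_obs, t_exe, t_crit):
    """Minimum-statistic conjunction of two whole-brain t-maps.

    A voxel survives only if it exceeds the critical t value in BOTH the
    observation and execution contrasts.
    """
    t_min = np.minimum(t_obs, t_exe)
    return t_min > t_crit

# Hypothetical 3-D t-maps on a small grid (real maps would come from an fMRI package).
rng = np.random.default_rng(3)
t_observation = rng.normal(size=(4, 4, 4))
t_execution = rng.normal(size=(4, 4, 4))
shared = conjunction_map(t_observation, t_execution, t_crit=1.7)
print(f"{shared.sum()} voxels active in both observation and execution")
```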


Subject(s)
Facial Expression , Neocortex , Humans , Brain Mapping/methods , Emotions/physiology , Happiness , Magnetic Resonance Imaging/methods
8.
Sensors (Basel) ; 23(3)2023 Jan 19.
Article in English | MEDLINE | ID: mdl-36772176

ABSTRACT

Pleasant touch is an important aspect of social interactions and is widely used as a caregiving technique. To address the problems resulting from a lack of available human caregivers, previous research has attempted to develop robots that can perform this kind of pleasant touch. However, it remains unclear whether robots can provide a pleasant touch in a manner similar to humans. To investigate this issue, we compared the effect of the speed of gentle strokes on the back, delivered by human and robot agents, on the emotional responses of human participants (n = 28). A robot or a human stroked the participants' backs at two different speeds (2.6 and 8.5 cm/s). Participants' subjective emotional reactions (valence and arousal ratings) and physiological reactions (facial electromyography [EMG] recorded from the corrugator supercilii and zygomatic major muscles, and skin conductance responses) were measured. The subjective ratings demonstrated that the 8.5 cm/s stroke was more pleasant and arousing than the 2.6 cm/s stroke for both human and robot agents. The corrugator supercilii EMG showed that the 8.5 cm/s stroke resulted in reduced activity, relative to the 2.6 cm/s stroke, in response to both human and robot strokes. These results demonstrate similar speed-dependent modulation of subjective and physiological positive emotional responses to stroking across human and robot agents and suggest that robots can provide a pleasant touch similar to that of humans.
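
A minimal sketch of the kind of within-participant speed comparison the abstract describes; the paired t-test, the simulated EMG values, and the collapsing across agents are illustrative assumptions (the study may have used a full agent-by-speed factorial analysis):

```python
import numpy as np
from scipy import stats

# Hypothetical within-participant mean corrugator EMG (baseline-corrected, a.u.)
# for each stroke speed, collapsed across human and robot agents.
rng = np.random.default_rng(4)
n = 28
emg_slow = rng.normal(1.0, 0.4, size=n)                    # 2.6 cm/s
emg_fast = emg_slow - 0.3 + rng.normal(0.0, 0.2, size=n)   # 8.5 cm/s (simulated reduction)

t, p = stats.ttest_rel(emg_slow, emg_fast)
print(f"corrugator EMG, 2.6 vs 8.5 cm/s: t({n - 1}) = {t:.2f}, p = {p:.3f}")
```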


Subject(s)
Robotics , Touch Perception , Humans , Touch/physiology , Touch Perception/physiology , Emotions/physiology , Facial Muscles/physiology , Electromyography
9.
Front Psychol ; 14: 1292178, 2023.
Article in English | MEDLINE | ID: mdl-38264418

ABSTRACT

Touch care has clinically positive effects on older adults. Touch can be delivered using robots, addressing the lack of caregivers. A recent study of younger participants showed that stroking touch delivered by a robot produced subjectively and physiologically positive emotional responses similar to those evoked by human touch. However, whether robotic touch can elicit similar responses in older adults remains unknown. We investigated this topic by assessing subjective ratings (valence and arousal) and physiological signals [corrugator and zygomatic electromyography (EMG) and skin conductance response (SCR)] in response to gentle stroking motions delivered to the backs of older participants by robot and human agents at two different speeds: 2.6 and 8.5 cm/s. As in the recent study, participants were told that only the robot would stroke them. We compared the responses of the younger participants (using their data from the previous study) and the older participants when the two agents (a robot and a human) stroked them. Subjectively, data from both younger and older participants showed that 8.5 cm/s stroking was more positive and arousing than 2.6 cm/s stroking for both human and robot agents. Physiologically, data from both younger and older participants showed that 8.5 cm/s stroking induced weaker corrugator EMG activity and stronger SCR activity than 2.6 cm/s stroking for both agents. These results demonstrate that the overall patterns of the older group's responses were similar to those of the younger group, and suggest that robot-delivered stroking touch can elicit pleasant emotional responses in older adults.

10.
Nutrients ; 14(22)2022 Nov 09.
Article in English | MEDLINE | ID: mdl-36432423

ABSTRACT

BACKGROUND: Subjective-physiological emotional coherence is thought to be associated with enhanced well-being, and a relationship between subjective-physiological emotional coherence and superior nutritional status has been suggested in older populations. However, no study has examined subjective-physiological emotional coherence among older adults while tasting food. Accordingly, the present study compared subjective-physiological emotional coherence during food consumption between older and younger adults. METHODS: Participants consumed bite-sized gel-type foods with different flavors and provided their subjective ratings of the foods while their physiological responses (facial electromyography [EMG] of the corrugator supercilii, masseter, and suprahyoid muscles, and other autonomic nervous system signals) were simultaneously measured. RESULTS: Our primary findings were that (1) the ratings of liking, wanting, and valence were negatively correlated with corrugator EMG activity in both older and young adult participants; (2) the positive association between masseter EMG activity and ratings of wanting/valence was weaker in the older than in the young adult group; and (3) arousal ratings were negatively correlated with corrugator EMG activity in the older group only. CONCLUSIONS: These results demonstrate commonalities and differences in subjective-physiological emotional coherence during food intake between older and young adults.


Subject(s)
Arousal , Emotions , Young Adult , Humans , Aged , Emotions/physiology , Electromyography
11.
Front Psychol ; 13: 856336, 2022.
Article in English | MEDLINE | ID: mdl-36237662

ABSTRACT

When building personal relationships, it is important to select optimal partners, even at a first meeting. This study was inspired by the idea that people who smile are considered more trustworthy and attractive. However, this may not always be true in daily life. Previous studies have used a relatively simple method of judging others, presenting a photograph of a single person's face. To move beyond this approach and examine more complex situations, we presented participants with the faces of two people confronting each other and asked them to make judgments from a third-person perspective. Across three experiments, participants judged which of the two persons was more appropriate for forming an alliance, more trustworthy, or more attractive, respectively. In all experiments, images were shown for a short (500 ms) or a long (5 s) duration. In all three experiments, the results showed that participants were more likely to choose persons with happy faces than those with neutral, sad, or angry faces when the image presentation was short. In contrast, facial expressions did not affect these judgments when the image presentation was long. Instead, judgments were correlated with personality impressions estimated from the models' neutral faces in a single-person presentation. These results suggest that although facial expressions can affect judgments of others when observing two-person confrontations from a third-person perspective, when participants have more time to elaborate their judgments, they rely on more than facial expressions.

12.
Neuroimage ; 263: 119655, 2022 11.
Article in English | MEDLINE | ID: mdl-36182055

ABSTRACT

Facial expressions are indispensable in daily human communication. Previous neuroimaging studies investigating facial expression processing have presented pre-recorded stimuli and lacked live face-to-face interaction. Our paradigm alternated between presenting participants with real-time performances by a model and pre-recorded videos of dynamic facial expressions. Simultaneous functional magnetic resonance imaging (fMRI) and facial electromyography recordings, as well as post-scan valence and arousal ratings, were acquired from 44 female participants. Live facial expressions enhanced the subjective valence and arousal ratings as well as facial muscular responses. Live performances showed greater engagement of the right posterior superior temporal sulcus (pSTS), right inferior frontal gyrus (IFG), right amygdala, and right fusiform gyrus, and modulated the effective connectivity within the right mirror neuron system (IFG, pSTS, and right inferior parietal lobule). A support vector machine algorithm could classify multivoxel activation patterns in brain regions involved in dynamic facial expression processing within the mentalizing network (anterior and posterior cingulate cortex). These results indicate that live social interaction modulates the activity and connectivity of the right mirror neuron system and enhances spontaneous mimicry, further facilitating emotional contagion.
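
A minimal sketch of support-vector-machine decoding of multivoxel activation patterns, as mentioned above; the trial counts, voxel counts, labels, and cross-validation scheme are illustrative assumptions, not the study's actual analysis:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical multivoxel patterns: rows are trials, columns are voxels from a
# region of interest; labels code the experimental condition (0 or 1).
rng = np.random.default_rng(5)
n_trials, n_voxels = 80, 500
X = rng.normal(size=(n_trials, n_voxels))
y = rng.integers(0, 2, size=n_trials)

# Linear SVM with feature scaling, evaluated by 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```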


Subject(s)
Mirror Neurons , Humans , Female , Brain Mapping/methods , Brain/physiology , Emotions/physiology , Temporal Lobe/physiology , Magnetic Resonance Imaging/methods , Facial Expression
13.
Sci Rep ; 12(1): 6884, 2022 04 27.
Article in English | MEDLINE | ID: mdl-35477945

ABSTRACT

A gentle touch is an essential part of human interaction that produces a positive care effect. Previous robotics studies have shown that robots can reproduce a gentle touch that elicits similarly positive emotional responses in humans. However, whether the positive emotional effects of a robot's touch can be enhanced by combining it with speech in a multimodal approach remains unclear. This study supports the hypothesis that multimodal interaction combining a robot's gentle touch with speech enhances positive emotional responses. Here, we conducted an experiment using a robotic arm to perform a gentle touch combined with speech and compared three conditions: touch alone, speech alone, and touch with speech. We assessed participants' subjective emotional responses using ratings of valence, arousal, and human likeness. Furthermore, we recorded facial electromyography (EMG) from the corrugator supercilii and zygomaticus major muscles and measured skin conductance levels (SCLs) as physiological emotional responses. Our results show that touch combined with speech elicited higher subjective valence and arousal ratings, and stronger zygomaticus major EMG and SCL activity, than touch alone. The results suggest that the positive emotional effects of robotic touch can be boosted by combining it with speech.


Subject(s)
Robotics , Touch Perception , Emotions/physiology , Facial Muscles/physiology , Humans , Speech , Touch Perception/physiology
14.
J Gerontol B Psychol Sci Soc Sci ; 77(7): 1219-1228, 2022 07 05.
Article in English | MEDLINE | ID: mdl-35137048

ABSTRACT

OBJECTIVES: Previous studies using visual search paradigms have provided inconsistent results regarding rapid detection of emotional faces among older adults. Furthermore, it is uncertain whether the emotional significance of the faces contributes to efficient searches for emotional faces due to the possible confounding effects of visual saliency. We addressed this issue by excluding the influence of visual factors and examined older adults' ability to detect faces with emotional meaning. METHOD: We used an associative learning procedure in which neutral faces were paired with monetary reward or punishment, such that the neutral faces acquired positive or negative emotional value. Older participants completed the associative learning task and then engaged in a visual search task, in which previously learned neutral faces were presented as discrepant faces among newly presented neutral distractor faces. Data of young adults from a previous study that used identical experimental procedures were also analyzed. RESULTS: Older participants exhibited lower learning ability than young participants. However, older adults who were successful at learning were able to detect neutral faces associated with reward or punishment more rapidly than those without monetary outcomes, similar to the pattern observed for young adults. DISCUSSION: The results suggest that acquired emotional value promotes the detection of value-associated neutral faces among older adults who succeed at learning. It is therefore possible that the ability to detect faces that evoke emotions is preserved in older adults.


Subject(s)
Emotions , Learning , Aged , Emotions/physiology , Facial Expression , Humans , Reward
15.
Atten Percept Psychophys ; 84(3): 815-828, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35169990

ABSTRACT

An ensemble, or statistical summary, can be extracted from facial expressions presented simultaneously in different spatial locations. However, how such complicated objects are represented in the mind is not clear. It is known that the aftereffect of facial expressions, in which prolonged viewing of facial expressions biases the perception of subsequent facial expressions of the same category, occurs only when a visual representation is formed. Using this methodology, we examined whether an ensemble can be represented with visualized information. Experiment 1 revealed that the presentation of multiple facial expressions biased the perception of subsequent facial expressions toward less happy as much as the presentation of a single face did. Experiment 2 compared, as adaptation stimuli, a set of faces comprising strong and weak intensities of emotional expression with individual faces. The results indicated that perceptual biases were found after the presentation of the four faces and after a strong single face, but not after a weak single face. Experiment 3 employed angry expressions, a category distinct from the test expressions, as the adaptation stimulus; no aftereffect was observed. Finally, Experiment 4 clearly demonstrated the perceptual bias with a higher number of faces. Altogether, these results indicate that an ensemble average extracted from multiple faces leads to the perceptual bias, and this effect is similar in its properties to that of a single face. This supports the idea that an ensemble of faces is represented with visualized information, as a single face is.
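
One common way to quantify such an expression aftereffect, offered here only as an illustrative sketch (the morph levels, response proportions, and logistic fit are hypothetical, not this study's stimuli or analysis), is to fit psychometric functions before and after adaptation and compare the points of subjective equality (PSE):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Probability of a 'happy' response as a function of morph level."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# Hypothetical proportions of 'happy' responses at seven morph levels
# (0 = clearly sad, 1 = clearly happy), before and after adapting to happy faces.
morph = np.linspace(0, 1, 7)
p_before = np.array([0.02, 0.08, 0.25, 0.55, 0.80, 0.95, 0.99])
p_after = np.array([0.01, 0.04, 0.12, 0.35, 0.65, 0.90, 0.98])

(pse_before, _), _ = curve_fit(logistic, morph, p_before, p0=[0.5, 10.0])
(pse_after, _), _ = curve_fit(logistic, morph, p_after, p0=[0.5, 10.0])
print(f"aftereffect = PSE shift of {pse_after - pse_before:.2f} morph units toward 'happy'")
```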


Subject(s)
Facial Recognition , Figural Aftereffect , Adaptation, Physiological , Anger , Emotions , Facial Expression , Humans
16.
Cogn Emot ; 36(3): 546-559, 2022 05.
Article in English | MEDLINE | ID: mdl-34923928

ABSTRACT

Swift detection of faces with emotional meaning underlies fruitful social relationships. Although previous studies using a visual search paradigm have demonstrated rapid detection of emotional facial expressions, whether it is attributable to emotional/motivational significance remains to be clarified. We examined this issue by excluding the influence of visual factors on the rapid detection of faces with emotional meaning. First, participants engaged in an associative learning task wherein neutral faces were associated with monetary rewards, monetary punishments, or zero outcomes, in order for the neutral faces to acquire positive, negative, and no emotional value, respectively. Then, during the visual search task, the participants detected a target neutral face associated with high reward or punishment from among newly presented neutral faces. In Experiment 1, neutral faces associated with high reward and punishment values were more rapidly detected than those without monetary outcomes. In Experiment 2, highly rewarded and highly punished neutral faces were more rapidly detected than neutral faces associated with low monetary reward/punishment. Analyses of ratings confirmed that the learned neutral faces acquired emotional value, and the reaction times were negatively related to arousal ratings. These results suggest that emotional/motivational significance promotes the rapid detection of emotional faces.


Subject(s)
Emotions , Facial Expression , Humans , Motivation , Reward , Social Perception
17.
Nutrients ; 13(12)2021 Nov 24.
Article in English | MEDLINE | ID: mdl-34959773

ABSTRACT

Sensing subjective hedonic or emotional experiences during eating using physiological activity is practically and theoretically important. A recent psychophysiological study has reported that facial electromyography (EMG) measured from the corrugator supercilii muscles was negatively associated with hedonic ratings, including liking, wanting, and valence, during the consumption of solid foods. However, the study protocol prevented participants from natural mastication (crushing of food between the teeth) during physiological data acquisition, which could hide associations between hedonic experiences and masticatory muscle activity during natural eating. We investigated this issue by assessing participants' subjective ratings (liking, wanting, valence, and arousal) and recording physiological measures, including EMG of the corrugator supercilii, zygomatic major, masseter, and suprahyoid muscles while they consumed gel-type solid foods (water-based gellan gum jellies) of diverse flavors. Ratings of liking, wanting, and valence were negatively correlated with corrugator supercilii EMG and positively correlated with masseter and suprahyoid EMG. These findings imply that subjective hedonic experiences during food consumption can be sensed using EMG signals from the brow and masticatory muscles.
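
A minimal sketch of the rating-EMG association reported above; the Spearman correlation, trial counts, and simulated values are illustrative assumptions (the published analysis may instead use multilevel modeling across participants):

```python
import numpy as np
from scipy import stats

# Hypothetical single-participant data: one row per food sample, with a liking
# rating (1-9) and mean rectified EMG amplitude (a.u.) during consumption.
rng = np.random.default_rng(7)
n_foods = 30
liking = rng.integers(1, 10, size=n_foods).astype(float)
corrugator = 5.0 - 0.3 * liking + rng.normal(0, 0.5, n_foods)  # simulated negative link
masseter = 2.0 + 0.4 * liking + rng.normal(0, 0.5, n_foods)    # simulated positive link

for name, emg in [("corrugator", corrugator), ("masseter", masseter)]:
    r, p = stats.spearmanr(liking, emg)
    print(f"liking vs {name} EMG: rho = {r:.2f}, p = {p:.3f}")
```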


Subject(s)
Eating/physiology , Eating/psychology , Eyebrows/physiology , Masticatory Muscles/physiology , Philosophy , Adult , Electromyography , Facial Muscles/physiology , Female , Humans , Male , Mastication/physiology , Young Adult
18.
Front Psychol ; 12: 722108, 2021.
Article in English | MEDLINE | ID: mdl-34489826

ABSTRACT

Aims: We aimed to assess the psychometric properties of a Japanese version of the Actions and Feelings Questionnaire (J-AFQ), an 18-item self-report measure of non-verbal emotional communication, and to examine its transcultural properties. Methods: The J-AFQ was administered to 500 Japanese adults (age 20-49, 250 male), alongside the Japanese Broad Autism Phenotype Questionnaire (BAPQ-J) and Empathy Quotient (EQ-J). These were compared with a group of 597 British and Irish participants (age 16-18, 148 male). The validity of the J-AFQ was assessed by confirmatory factor analysis and by its convergence with the BAPQ-J and EQ-J using Pearson correlations. Internal consistency and differential item functioning (DIF) were assessed and compared between the Japanese and UK/Irish participants. Results: Reverse-worded items (RWIs) showed poor item-total correlations, but excluding these left a 13-item version of the J-AFQ with good internal consistency and content validity. Consistent with the English version, J-AFQ scores correlated with higher EQ and lower BAPQ scores. However, comparing across cultures, J-AFQ scores were significantly lower in the Japanese sample, and there was evidence of important DIF by country in over half of the J-AFQ items. Conclusion: Cultural differences in attitudes to self-report, as well as increased acquiescence to RWIs also seen in previous studies, limit the value of the 18-item instrument in Japanese culture. However, the 13-item J-AFQ is a valid and reliable measure of motor empathy which, alongside the English version, offers promise for research in motor cognition and non-verbal emotional communication across cultures.
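
The internal consistency and item-total correlations mentioned above can be illustrated with a small sketch; the simulated 5-point responses and sample size are assumptions, not the study's data, and real analyses would typically use a dedicated psychometrics package:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def corrected_item_total(items):
    """Correlation of each item with the sum of the remaining items."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, i], total - items[:, i])[0, 1]
                     for i in range(items.shape[1])])

# Hypothetical 5-point responses from 500 respondents to 18 questionnaire items.
rng = np.random.default_rng(8)
latent = rng.normal(size=(500, 1))
scores = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(500, 18))), 1, 5)

print(f"alpha = {cronbach_alpha(scores):.2f}")
print("corrected item-total r:", np.round(corrected_item_total(scores), 2))
```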

19.
Sci Rep ; 11(1): 5757, 2021 03 11.
Article in English | MEDLINE | ID: mdl-33707605

ABSTRACT

Emotion sensing using physiological signals in real-life situations can be practically valuable. Previous studies have developed wearable devices that record autonomic nervous system activity, which reflects emotional arousal. However, no study has determined whether emotional valence can be assessed using wearable devices. To address this issue, we developed a wearable device to record facial electromyography (EMG) from the corrugator supercilii (CS) and zygomatic major (ZM) muscles. To validate the device, in Experiment 1 we used a traditional wired device and our wearable device to record participants' facial EMG while they viewed emotional films. Participants then viewed the films again and continuously rated their recalled subjective valence during the first viewing. The facial EMG signals recorded using both wired and wearable devices showed that CS and ZM activities were, respectively, negatively and positively correlated with continuous valence ratings. In Experiment 2, we used the wearable device to record participants' facial EMG while they played Wii Bowling games and assessed their cued-recall continuous valence ratings. CS and ZM activities were correlated negatively and positively, respectively, with continuous valence ratings. These data suggest that facial EMG signals recorded by a wearable device can be used to assess subjective emotional valence in future naturalistic studies.


Subject(s)
Electromyography/instrumentation , Emotions/physiology , Face/physiology , Wearable Electronic Devices , Arousal/physiology , Female , Humans , Male , Multilevel Analysis , Regression Analysis , Young Adult
20.
Nutrients ; 13(1)2020 Dec 22.
Article in English | MEDLINE | ID: mdl-33375209

ABSTRACT

The physiological correlates of hedonic/emotional experiences to visual food stimuli are of theoretical and practical interest. Previous psychophysiological studies have shown that facial electromyography (EMG) signals were related to subjective hedonic ratings in response to food images. However, because other data showed positive correlations between hedonic ratings and objective nutritional values of food, whether the facial EMG reactions to food images could reflect the hedonic evaluation or nutritional assessment of food remains unknown. To address this issue, we measured subjective hedonic ratings (liking, wanting, valence, and arousal) and physiological signals (facial EMG of the corrugator supercilii, zygomatic major, masseter, and suprahyoid muscles, skin potential responses, and heart rates) while participants observed food images that had objective nutritional information (caloric, carbohydrate, fat, and protein contents). The results revealed that zygomatic major EMG activity was positively correlated with ratings of liking, wanting, and valence, but not with any objective nutritional value. These data indicate that facial EMG signals in response to food images reflect subjective hedonic experiences, but not objective nutritional values, associated with the food item.


Subject(s)
Electromyography/methods , Facial Muscles/physiology , Food , Nutritive Value , Asian People , Emotions , Facial Expression , Female , Heart Rate , Humans , Nutrition Assessment , Young Adult