Results 1 - 20 of 15,653
1.
J Psychiatry Neurosci ; 49(3): E145-E156, 2024.
Article in English | MEDLINE | ID: mdl-38692692

ABSTRACT

BACKGROUND: Neuroimaging studies have revealed abnormal functional interaction during the processing of emotional faces in patients with major depressive disorder (MDD), thereby enhancing our comprehension of the pathophysiology of MDD. However, it is unclear whether there is abnormal directional interaction among face-processing systems in patients with MDD. METHODS: A group of patients with MDD and a healthy control group underwent a face-matching task during functional magnetic resonance imaging. Dynamic causal modelling (DCM) analysis was used to investigate effective connectivity between 7 regions in the face-processing systems. We used a Parametric Empirical Bayes model to compare effective connectivity between patients with MDD and controls. RESULTS: We included 48 patients and 44 healthy controls in our analyses. Both groups showed higher accuracy and faster reaction times in the shape-matching condition than in the face-matching condition. However, no significant behavioural or brain activation differences were found between the groups. Using DCM, we found that, compared with controls, patients with MDD showed decreased self-connection in the right dorsolateral prefrontal cortex (DLPFC), amygdala, and fusiform face area (FFA) across task conditions; increased intrinsic connectivity from the right amygdala to the bilateral DLPFC, right FFA, and left amygdala, suggesting increased intrinsic connectivity centred on the amygdala in the right side of the face-processing systems; both increased and decreased positive intrinsic connectivity in the left side of the face-processing systems; and a comparable task-modulation effect on connectivity. LIMITATIONS: Our study did not include longitudinal neuroimaging data, and region-of-interest selection in the DCM analysis was limited. CONCLUSION: Our findings provide evidence for a complex pattern of alterations in the face-processing systems in patients with MDD, potentially involving the right amygdala to a greater extent. The results confirm some previous findings and highlight the crucial role of regions on both sides of the face-processing systems in the pathophysiology of MDD.
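The group comparison of effective-connectivity parameters described above can be illustrated with a minimal sketch. This is a simplified frequentist stand-in for the Parametric Empirical Bayes step (in practice run with SPM's DCM/PEB tools, not reproduced here); the per-subject coupling matrices, group sizes, and correction scheme are illustrative assumptions.

```python
# Sketch: group comparison of DCM intrinsic (A-matrix) connectivity
# parameters between MDD patients and controls. A simplified frequentist
# analogue of the PEB step, NOT SPM's spm_dcm_peb implementation.
import numpy as np
from scipy import stats

n_regions = 7  # 7 face-processing ROIs, as in the study
rng = np.random.default_rng(0)

# Hypothetical per-subject A-matrices (region-to-region coupling)
A_mdd = rng.normal(0.0, 0.1, size=(48, n_regions, n_regions))  # 48 patients
A_hc = rng.normal(0.0, 0.1, size=(44, n_regions, n_regions))   # 44 controls

# Two-sample t-test on every directed connection (diagonal entries are the
# self-connections), then a simple Bonferroni correction.
t, p = stats.ttest_ind(A_mdd, A_hc, axis=0)
p_corr = np.minimum(p * n_regions**2, 1.0)
print("significant connections after correction:", int((p_corr < 0.05).sum()))
```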


Subject(s)
Amygdala , Depressive Disorder, Major , Facial Recognition , Magnetic Resonance Imaging , Humans , Depressive Disorder, Major/physiopathology , Depressive Disorder, Major/diagnostic imaging , Male , Female , Adult , Facial Recognition/physiology , Amygdala/diagnostic imaging , Amygdala/physiopathology , Brain/diagnostic imaging , Brain/physiopathology , Neural Pathways/physiopathology , Neural Pathways/diagnostic imaging , Bayes Theorem , Young Adult , Brain Mapping , Facial Expression , Middle Aged , Reaction Time/physiology
2.
Sci Rep ; 14(1): 10371, 2024 05 06.
Article in English | MEDLINE | ID: mdl-38710806

ABSTRACT

Emotion is a human experience that can influence an individual's quality of life in both positive and negative ways. The ability to distinguish different types of emotion can help researchers estimate a patient's current state or the probability of future disease. Recognizing emotions from images is problematic because people can conceal their feelings by modifying their facial expressions. This has led researchers to consider electroencephalography (EEG) signals for more accurate emotion detection. However, the complexity of EEG recordings and of data analysis using conventional machine learning algorithms has caused inconsistent emotion recognition. Hybrid deep learning models and related techniques have therefore become common, owing to their ability to analyze complicated data and achieve higher performance by integrating diverse features of the models. At the same time, researchers prioritize models with fewer parameters that still achieve high average accuracy. This study improves the Convolutional Fuzzy Neural Network (CFNN) for emotion recognition from EEG signals to achieve a reliable detection system. Initially, the pre-processing and feature-extraction phases are implemented to obtain noiseless and informative data. Then, the CFNN with a modified architecture is trained to classify emotions. Several parametric and comparative experiments were performed. The proposed model achieved reliable performance for emotion recognition, with average accuracies of 98.21% and 98.08% for valence (pleasantness) and arousal (intensity), respectively, outperforming state-of-the-art methods.
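A minimal sketch of the pre-processing and feature-extraction stages described above, assuming DEAP-style EEG (128 Hz, 32 channels) and standard band-power features; a plain SVM stands in for the authors' modified CFNN, whose fuzzy-convolutional architecture is not reproduced here.

```python
# Sketch: band-pass filtering, Welch band-power features, and a baseline
# classifier for EEG valence/arousal labels. All data are synthetic.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

FS = 128  # sampling rate in Hz (DEAP-style assumption)

def bandpass(x, lo, hi, fs=FS, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def band_powers(trial, fs=FS):
    """Mean Welch power in the classic EEG bands, per channel."""
    bands = [(4, 8), (8, 13), (13, 30), (30, 45)]  # theta..gamma
    f, pxx = welch(trial, fs=fs, nperseg=fs * 2)
    return np.concatenate(
        [pxx[:, (f >= lo) & (f < hi)].mean(axis=1) for lo, hi in bands])

# Hypothetical data: 200 trials x 32 channels x 8 s of denoised EEG
rng = np.random.default_rng(1)
trials = rng.standard_normal((200, 32, FS * 8))
y = rng.integers(0, 2, 200)  # high/low valence labels

X = np.array([band_powers(bandpass(t, 4, 45)) for t in trials])
print(cross_val_score(SVC(), X, y, cv=5).mean())
```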


Subject(s)
Electroencephalography , Emotions , Fuzzy Logic , Neural Networks, Computer , Humans , Electroencephalography/methods , Emotions/physiology , Male , Female , Adult , Algorithms , Young Adult , Signal Processing, Computer-Assisted , Deep Learning , Facial Expression
3.
PLoS One ; 19(5): e0302782, 2024.
Article in English | MEDLINE | ID: mdl-38713700

ABSTRACT

Parents with a history of childhood maltreatment may be more likely to respond inadequately to their child's emotional cues, such as crying or screaming, due to previous exposure to prolonged stress. While studies have investigated parents' physiological reactions to their children's vocal expressions of emotions, less attention has been given to their responses when perceiving children's facial expressions of emotions. The present study aimed to determine whether viewing facial expressions of emotions in children induces cardiovascular changes in mothers (hypo- or hyper-arousal) and whether these differ as a function of childhood maltreatment. A total of 104 mothers took part in this study. Their experiences of childhood maltreatment were measured using the Childhood Trauma Questionnaire (CTQ). Participants' electrocardiogram signals were recorded during a task in which they viewed a landscape video (baseline) and images of children's faces expressing different intensities of emotion. Heart rate variability (HRV) was extracted from the recordings as an indicator of parasympathetic reactivity. Participants showed one of two profiles: in one group of mothers, HRV decreased when they were presented with images of children's facial expressions of emotions, while in the other group HRV increased. However, the magnitude of HRV change was not significantly different between the two groups. The interaction between HRV group and the severity of maltreatment experienced was marginal. Results suggested that experiences of childhood emotional abuse were more common in mothers whose HRV increased during the task. Therefore, more severe childhood experiences of emotional abuse could be associated with mothers' cardiovascular hyperreactivity. Maladaptive cardiovascular responses could have a ripple effect, influencing how mothers react to their children's facial expressions of emotions. That reaction could affect the quality of their interaction with their child. Providing interventions that help parents regulate their physiological and behavioral responses to stress might be helpful, especially if they have experienced childhood maltreatment.
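As an illustration of the HRV extraction step, here is a minimal sketch computing RMSSD, a common time-domain parasympathetic index, from a raw ECG trace. The abstract does not state which HRV metric was used, so the metric, sampling rate, and peak-detection thresholds are all assumptions.

```python
# Sketch: R-peak detection and RMSSD from a raw ECG trace.
import numpy as np
from scipy.signal import find_peaks

FS = 500  # ECG sampling rate in Hz (assumed)

def rmssd(ecg, fs=FS):
    # Crude R-peak detection: peaks above 60% of the maximum, >= 0.4 s apart
    peaks, _ = find_peaks(ecg, height=0.6 * ecg.max(), distance=int(0.4 * fs))
    rr = np.diff(peaks) / fs * 1000.0          # R-R intervals in ms
    return np.sqrt(np.mean(np.diff(rr) ** 2))  # RMSSD in ms

# Toy ECG: 60 s of noise with a jittered ~70 bpm spike train superimposed
t = np.arange(0, 60, 1 / FS)
ecg = 0.05 * np.random.randn(t.size)
beats = np.cumsum(0.857 + 0.03 * np.random.randn(70))  # jittered beat times
ecg[np.searchsorted(t, beats[beats < 59.5])] += 1.0
print(f"RMSSD = {rmssd(ecg):.1f} ms")
```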


Subject(s)
Emotions , Facial Expression , Heart Rate , Mothers , Humans , Female , Adult , Heart Rate/physiology , Child , Emotions/physiology , Mothers/psychology , Emotional Abuse/psychology , Male , Electrocardiography , Child Abuse/psychology , Mother-Child Relations/psychology , Surveys and Questionnaires
4.
Sci Rep ; 14(1): 10607, 2024 05 08.
Article in English | MEDLINE | ID: mdl-38719866

ABSTRACT

Guilt is a negative emotion elicited by realizing one has caused actual or perceived harm to another person. One of guilt's primary functions is to signal that one is aware of the harm that was caused and regrets it, an indication that the harm will not be repeated. Verbal expressions of guilt are often deemed insufficient by observers when not accompanied by nonverbal signals such as facial expression, gesture, posture, or gaze. Some research has investigated isolated nonverbal expressions of guilt; however, none to date has explored multiple nonverbal channels simultaneously. This study explored facial expression, gesture, posture, and gaze during the real-time experience of guilt when response demands are minimal. Healthy adults completed a novel task involving watching videos designed to elicit guilt, as well as comparison emotions. During the video task, participants were continuously recorded to capture nonverbal behaviour, which was then analyzed via automated facial expression software. We found that while feeling guilt, individuals engaged less in several nonverbal behaviours than they did while experiencing the comparison emotions. This may reflect the highly social aspect of guilt, suggesting that an audience is required to prompt a guilt display, or may suggest that guilt does not have clear nonverbal correlates.


Subject(s)
Facial Expression , Guilt , Humans , Male , Female , Adult , Young Adult , Nonverbal Communication/psychology , Emotions/physiology , Gestures
5.
Sci Rep ; 14(1): 10491, 2024 05 07.
Article in English | MEDLINE | ID: mdl-38714729

ABSTRACT

Dogs (Canis lupus familiaris) are the domestically bred descendants of wolves (Canis lupus). However, selective breeding has profoundly altered the facial morphologies of dogs compared to their wolf ancestors. We demonstrate that these morphological differences limit the ability of dogs to successfully produce the same affective facial expressions as wolves. We decoded facial movements of captive wolves during social interactions involving nine separate affective states. We used linear discriminant analyses to predict affective states based on combinations of facial movements. The resulting confusion matrix demonstrates that specific combinations of facial movements predict nine distinct affective states in wolves; this is the first assessment of this many affective facial expressions in wolves. However, comparative analyses with kennelled rescue dogs revealed a reduced ability to predict their affective states. Critically, there was very low predictive power for specific affective states, with confusion occurring between negative and positive states, such as Friendly and Fear. We show that the varying facial morphologies of dogs (specifically non-wolf-like morphologies) limit their ability to produce the same range of affective facial expressions as wolves. Confusion between positive and negative states could be detrimental to human-dog interactions, although our analyses also suggest dogs likely use vocalisations to compensate for limitations in facial communication.
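The pipeline named above (linear discriminant analysis over coded facial movements, summarised in a confusion matrix) can be sketched as follows; the facial-movement features and state labels are hypothetical placeholders for the upstream coding step.

```python
# Sketch: LDA predicting affective state from facial-movement features,
# evaluated with a cross-validated confusion matrix. All data are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
n_obs, n_features, n_states = 450, 20, 9  # 9 affective states, as in wolves
X = rng.standard_normal((n_obs, n_features))  # hypothetical movement codes
y = rng.integers(0, n_states, n_obs)          # hypothetical state labels

pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=5)
print(confusion_matrix(y, pred))  # rows: true state, columns: predicted
```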


Subject(s)
Domestication , Emotions , Facial Expression , Wolves , Animals , Wolves/physiology , Dogs , Emotions/physiology , Male , Female , Behavior, Animal/physiology , Humans
6.
Cereb Cortex ; 34(5)2024 May 02.
Article in English | MEDLINE | ID: mdl-38715407

ABSTRACT

Facial palsy can result in a serious complication known as facial synkinesis, causing both physical and psychological harm to patients. There is growing evidence that patients with facial synkinesis have brain abnormalities, but the brain mechanisms and underlying imaging biomarkers remain unclear. Here, we employed functional magnetic resonance imaging (fMRI) to investigate brain function in 31 patients with unilateral post-facial-palsy synkinesis and 25 healthy controls during different facial expression movements and at rest. Combining surface-based mass-univariate analysis and multivariate pattern analysis, we identified diffuse activation and intrinsic connection patterns in the primary motor cortex and the somatosensory cortex on the patients' affected side. Further, we distinguished post-facial-palsy synkinesis patients from healthy subjects with favorable accuracy using a support vector machine based on both task-related and resting-state functional magnetic resonance imaging data. Together, these findings indicate the potential of the identified functional reorganizations to serve as neuroimaging biomarkers for facial synkinesis diagnosis.
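A minimal sketch of the classification step described above: a support vector machine separating patients from controls on fMRI-derived features. The feature matrix, linear kernel, and leave-one-out validation scheme are illustrative assumptions, not the authors' exact setup.

```python
# Sketch: SVM classification of patients vs. controls from fMRI features.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(3)
X = rng.standard_normal((56, 100))    # 31 patients + 25 controls, 100 features
y = np.r_[np.ones(31), np.zeros(25)]  # 1 = patient, 0 = control

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"classification accuracy: {acc:.2f}")
```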


Subject(s)
Facial Paralysis , Magnetic Resonance Imaging , Synkinesis , Humans , Magnetic Resonance Imaging/methods , Facial Paralysis/physiopathology , Facial Paralysis/diagnostic imaging , Facial Paralysis/complications , Male , Female , Synkinesis/physiopathology , Adult , Middle Aged , Young Adult , Facial Expression , Biomarkers , Motor Cortex/physiopathology , Motor Cortex/diagnostic imaging , Brain Mapping , Somatosensory Cortex/diagnostic imaging , Somatosensory Cortex/physiopathology , Brain/diagnostic imaging , Brain/physiopathology , Support Vector Machine
7.
Food Res Int ; 183: 114158, 2024 May.
Article in English | MEDLINE | ID: mdl-38760149

ABSTRACT

The elderly population holds significance among consumers because many of them experience alterations in taste and smell or suffer from physical disorders. These factors can lead to reduced food intake, malnutrition, and, consequently, serious health problems. Therefore, there is a need to develop tailored products for seniors, offering both nutrition and appealing foods with easily consumable textures. Among the various characteristics of food, appearance stands out as one of the most critical aspects influencing food preferences and choices. Surprisingly, there is limited knowledge about how food shape affects the holistic emotional responses of seniors. The objective of this study was to investigate the impact of food shape on the emotional responses of seniors. This exploration involved the use of explicit methods, such as self-reported questionnaires, and implicit methods, including the measurement of skin conductance responses and facial expressions, as well as their combination. To this end, we recruited 50 individuals (54% women) from the senior population, aged between 55 and 75 years. These participants evaluated two food products with identical sensory characteristics in terms of taste, texture, and flavor; however, the products differed in shape. We measured their degree of liking and emotional responses using a 7-point hedonic scale and EsSense25, in conjunction with galvanic skin response and facial expressions, which served as behavioural and physiological measures. Multivariate analysis allowed us to examine sample configurations by gender and to establish associations between variables. The combination of implicit and explicit methods led to better discrimination of samples of the same category than the use of either method independently. Although both samples elicited equivalent liking, they evoked distinct emotional responses, measured at cognitive, physiological, and behavioural levels. In general, men and women experienced different emotions while observing, smelling, handling, or consuming the samples, both consciously and unconsciously. This new knowledge could be valuable when designing food products for this demographic. The ultimate goal is to engage consumers and enhance their enjoyment of the food experience by offering more visually appealing food options.


Subject(s)
Emotions , Food Preferences , Humans , Female , Male , Aged , Middle Aged , Food Preferences/physiology , Food Preferences/psychology , Facial Expression , Galvanic Skin Response/physiology , Taste , Surveys and Questionnaires
8.
Cogn Sci ; 48(5): e13451, 2024 05.
Article in English | MEDLINE | ID: mdl-38742266

ABSTRACT

Anxiety shifts visual attention and perceptual mechanisms, preparing oneself to detect potentially threatening information more rapidly. Although this has been demonstrated for threat-related social stimuli such as fearful expressions, it remains unexplored whether these effects encompass other social cues of danger, such as aggressive gestures and actions. To this end, we recruited a total of 65 participants and asked them to identify, as quickly and accurately as possible, potentially aggressive actions depicted by an agent. By introducing and manipulating the occurrence of electric shocks, we induced safe and threatening conditions. In addition, the association between electric shocks and aggression was also manipulated. Our results showed that participants had improved sensitivity, with no change in criterion, when detecting aggressive gestures in threat compared with safe conditions. Furthermore, drift diffusion model analysis showed that, under threat, participants exhibited faster evidence accumulation toward the correct perceptual decision. Lastly, the relationship between threat source and aggression appeared not to impact any of the effects described above. Overall, our results indicate that the benefits gained from states of anxiety, such as increased sensitivity toward threat and greater evidence accumulation, transfer to social stimuli other than facial expressions that are capable of signaling danger.
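The sensitivity and criterion measures contrasted above come from signal detection theory; a minimal sketch with made-up hit and false-alarm rates illustrating higher sensitivity (d') under threat with an unchanged criterion (c):

```python
# Sketch: signal-detection sensitivity (d') and criterion (c) from hit and
# false-alarm rates. The rates below are illustrative, not the study's data.
from scipy.stats import norm

def sdt_indices(hit_rate, fa_rate):
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)
    d_prime = z_h - z_f             # sensitivity
    criterion = -0.5 * (z_h + z_f)  # response bias
    return d_prime, criterion

for label, (h, f) in {"safe": (0.80, 0.20), "threat": (0.88, 0.12)}.items():
    d, c = sdt_indices(h, f)
    print(f"{label:6s}  d'={d:.2f}  c={c:.2f}")
```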


Subject(s)
Aggression , Fear , Humans , Aggression/psychology , Male , Female , Young Adult , Adult , Anxiety/psychology , Social Perception , Attention , Facial Expression , Cues , Electroshock
9.
BMC Psychol ; 12(1): 279, 2024 May 17.
Article in English | MEDLINE | ID: mdl-38755731

ABSTRACT

OBJECTIVE: Somatic symptom disorder (SSD) is characterized by one or more distressing or disabling somatic symptoms accompanied by an excessive amount of time, energy, and emotion devoted to the symptoms. These manifestations of SSD have been linked to alterations in the perception and appraisal of bodily signals. We hypothesized that patients with SSD would exhibit changes in interoceptive accuracy (IA), particularly when emotional processing is involved. METHODS: Twenty-three patients with SSD and 20 healthy controls were recruited. IA was assessed using the heartbeat perception task. The task was performed both in the absence of stimuli and in the presence of emotional interference, i.e., photographs of faces with an emotional expression. IA was examined for correlations with measures related to somatic symptoms, including resting-state heart rate variability (HRV). RESULTS: There was no significant difference in the absolute values of IA between patients with SSD and healthy controls, regardless of condition. However, the difference in IA between the no-interference condition and the neutral-face condition was greater in patients with SSD than in healthy controls (p = 0.039). The IA of patients with SSD also showed significant correlations with low-frequency HRV (p = 0.004) and high-frequency HRV (p = 0.007). CONCLUSION: Patients with SSD showed greater changes in IA when neutral facial interference was present. These results suggest that bodily awareness is more affected by emotionally ambiguous stimuli in patients with SSD than in healthy controls.
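Heartbeat perception tasks are typically scored with a Schandry-style counting-accuracy formula; since the abstract does not give the exact formula, the widely used variant below is an assumption.

```python
# Sketch: heartbeat-counting interoceptive accuracy (Schandry-style score).
import numpy as np

def interoceptive_accuracy(recorded, counted):
    """1 - mean normalised counting error; higher values = better accuracy."""
    recorded = np.asarray(recorded, float)
    counted = np.asarray(counted, float)
    return float(np.mean(1 - np.abs(recorded - counted) / recorded))

# Example: actual vs. reported heartbeats in three counting windows
print(interoceptive_accuracy([35, 45, 60], [30, 40, 58]))  # ~0.90
```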


Subject(s)
Emotions , Heart Rate , Interoception , Humans , Female , Male , Interoception/physiology , Adult , Heart Rate/physiology , Emotions/physiology , Middle Aged , Medically Unexplained Symptoms , Somatoform Disorders/psychology , Somatoform Disorders/physiopathology , Facial Expression
10.
PLoS One ; 19(5): e0302705, 2024.
Article in English | MEDLINE | ID: mdl-38758739

ABSTRACT

Neuropsychological research aims to unravel how diverse individuals' brains exhibit similar functionality when exposed to the same stimuli. The evocation of consistent responses when different subjects watch the same emotionally evocative stimulus has been observed through modalities like fMRI, EEG, physiological signals, and facial expressions. We refer to the quantification of these shared, consistent signals across subjects at each time instant along the temporal dimension as Consistent Response Measurement (CRM). CRM is widely explored through fMRI and occasionally with EEG, physiological signals, and facial expressions, using metrics like Inter-Subject Correlation (ISC). However, fMRI tools are expensive and constrained, while EEG and physiological signals are prone to facial artifacts and environmental conditions (such as temperature, humidity, and the subjects' health). In this research, facial expression videos are used as a cost-effective and flexible alternative for CRM, minimally affected by external conditions. By employing computer vision-based automated facial keypoint tracking, a new metric similar to ISC, called the Average t-statistic, is introduced. Unlike existing facial expression-based methodologies that measure CRM via secondary indicators like inferred emotions and keypoint- or ICA-based features, the Average t-statistic is closely associated with the direct measurement of consistent facial muscle movement using the Facial Action Coding System (FACS). This is evidenced in the DISFA dataset, where the time series of the Average t-statistic has a high correlation (R2 = 0.78) with a metric called AU consistency, which directly measures facial muscle movement through FACS coding of video frames. The simplicity of recording facial expressions with the automated Average t-statistic expands the applications of CRM, such as measuring engagement in online learning and customer interactions, and flagging atypical responses in healthcare conditions like stroke, autism, and depression. To promote further research, we have made the code repository publicly available.
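One plausible reading of the proposed Average t-statistic is sketched below: a one-sample t-test across subjects at each time point of a keypoint-derived signal, averaged over the temporal dimension. This is an interpretation of the metric described in the abstract, not the authors' released code.

```python
# Sketch: an ISC-style "Average t-statistic" over facial-keypoint time series.
# At each frame, a t-test across subjects asks whether the keypoint signal
# deviates consistently from baseline; the t-values are averaged over time.
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(4)
n_subjects, n_frames = 30, 1000
# Hypothetical per-subject keypoint displacement (one summary value per frame)
signals = rng.standard_normal((n_subjects, n_frames))
signals[:, 400:500] += 1.0  # a shared "consistent response" window

t_per_frame = ttest_1samp(signals, popmean=0.0, axis=0).statistic
print("Average t-statistic:", t_per_frame.mean())
```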


Subject(s)
Emotions , Facial Expression , Humans , Emotions/physiology , Female , Male , Adult , Video Recording , Movement/physiology , Young Adult , Magnetic Resonance Imaging/methods , Electroencephalography/methods
11.
Sci Rep ; 14(1): 11686, 2024 05 22.
Article in English | MEDLINE | ID: mdl-38777852

ABSTRACT

Pain is rarely communicated alone, as it is often accompanied by emotions such as anger or sadness. Communicating these affective states involves shared representations. However, how an individual conceptually represents these combined states must first be tested. The objective of this study was to measure the interaction between pain and negative emotions on two types of facial representations of these states, namely visual (i.e., interactive virtual agents; VAs) and sensorimotor (i.e., one's own production of facial configurations). Twenty-eight participants (15 women) read short written scenarios involving either pain alone or a combined experience of pain and a negative emotion (anger, disgust, fear, or sadness). They produced facial configurations representing these experiences on the faces of the VAs and on their own faces (own production or imitation of the VAs). The results suggest that affective states related to a direct threat to the body (i.e., anger, disgust, and pain) share a similar facial representation, while those that present no immediate danger (i.e., fear and sadness) differ. Although visual and sensorimotor representations of these states provide congruent affective information, they are differently influenced by factors associated with the communication cycle. These findings contribute to our understanding of pain communication in different affective contexts.


Subject(s)
Emotions , Facial Expression , Pain , Humans , Female , Male , Pain/psychology , Pain/physiopathology , Adult , Emotions/physiology , Young Adult , Anger/physiology , Affect/physiology , Fear/psychology , Sadness/psychology
12.
Int J Psychophysiol ; 200: 112358, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38710371

ABSTRACT

Recent studies have shown that the processing of neutral facial expressions can be modulated by the valence and self-relevance of preceding verbal evaluations. However, these studies have not distinguished the dimension of verbal evaluations (i.e., morality vs. competence). In fact, there is an ongoing controversy about whether morality or competence receives more weight. Therefore, using the ERP technique, the current study aimed to address this issue by comparing the influence of morality and competence evaluations on behavioral and neural responses to neutral facial expressions when these evaluations varied in contextual valence and self-relevance. Our ERP results revealed that early EPN amplitudes were larger for neutral faces after evaluations about the self relative to evaluations about the senders. Moreover, the EPN was more negative after a competence evaluation than after a morality evaluation when these evaluations were positive, while this effect was absent when the evaluations were negative. The late LPP was larger after a morality evaluation than after a competence evaluation when the evaluations were negative and directed at the self; however, no significant LPP difference between morality and competence evaluations was observed when the evaluations were positive. The present study extends previous work by showing that early and late stages of face processing are affected by the evaluation dimension in a top-down manner and are further modulated by contextual valence and self-relevance.
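EPN and LPP effects of the kind reported above are commonly quantified as mean amplitudes in fixed post-stimulus windows of trial-averaged epochs; a minimal single-channel sketch, with the window bounds and sampling rate as illustrative assumptions:

```python
# Sketch: mean ERP amplitude in EPN-like (~200-300 ms) and LPP-like
# (~400-700 ms) windows from stimulus-locked epochs. Data are synthetic.
import numpy as np

FS = 500   # sampling rate in Hz (assumed)
T0 = 0.2   # epoch starts 200 ms before stimulus onset (assumed)

def mean_amplitude(epochs, t_start, t_end, fs=FS, baseline_s=T0):
    """epochs: (n_trials, n_samples) single-channel, stimulus-locked data."""
    i0 = int((baseline_s + t_start) * fs)
    i1 = int((baseline_s + t_end) * fs)
    erp = epochs.mean(axis=0)  # average across trials
    return erp[i0:i1].mean()   # mean amplitude in the window

epochs = np.random.randn(80, int(1.2 * FS))  # 80 trials, -200..1000 ms
print("EPN window:", mean_amplitude(epochs, 0.20, 0.30))
print("LPP window:", mean_amplitude(epochs, 0.40, 0.70))
```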


Subject(s)
Electroencephalography , Evoked Potentials , Morals , Humans , Female , Male , Young Adult , Adult , Evoked Potentials/physiology , Facial Expression , Facial Recognition/physiology , Photic Stimulation/methods , Adolescent , Self Concept
13.
Sci Rep ; 14(1): 11571, 2024 05 21.
Article in English | MEDLINE | ID: mdl-38773125

ABSTRACT

This study examines how the primary emotions of anger, happiness, sadness, and fear are expressed through drawings. Moving beyond the well-researched color-emotion link, it explores under-examined aspects like spatial concepts and drawing styles. Employing Python and OpenCV for objective analysis, we convert subjective perceptions into measurable data across 728 digital images from 182 university students. The most frequently chosen prominent colors were red for anger (73.11%), yellow for happiness (17.8%), blue for sadness (51.1%), and black for fear (40.7%). Happiness led with the highest saturation (68.52%) and brightness (75.44%) percentages, while fear recorded the lowest in both categories (47.33% saturation, 48.78% brightness). Fear, however, topped the color-fill percentage (35.49%), with happiness at the lowest (25.14%). Tangible imagery prevailed (71.43-83.52%), with abstract styles peaking in fear representations (28.57%). Facial expressions were a common element (41.76-49.45%). The study achieved 81.3% predictive accuracy for anger, higher than the 71.3% overall average. Future research can build on these results by improving technological methods to quantify more aspects of drawing content. Investigating a more comprehensive array of emotions and examining factors that influence emotional drawing styles will further our understanding of visual-emotional communication.
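A minimal sketch of the kind of OpenCV measurement described above, computing saturation, brightness, colour-fill percentage, and a crude dominant hue for one drawing; the synthetic input image and the thresholds are hypothetical, not the authors' parameters.

```python
# Sketch: HSV-based colour statistics for a drawing, via OpenCV.
import cv2
import numpy as np

# Hypothetical drawing: a synthetic red-ish canvas stands in for a loaded
# image (in practice: img = cv2.imread("drawing.png"))
img = np.full((200, 200, 3), (40, 40, 200), dtype=np.uint8)  # BGR red-ish
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
h, s, v = cv2.split(hsv)

# Saturation / brightness as percentages of their 0-255 range
print(f"saturation: {s.mean() / 255 * 100:.1f}%")
print(f"brightness: {v.mean() / 255 * 100:.1f}%")

# Colour fill: share of pixels that are neither near-white nor near-black
filled = (s > 30) & (v > 40) & (v < 245)
print(f"colour fill: {filled.mean() * 100:.1f}%")

# Crude dominant-hue estimate over the filled region
hist = cv2.calcHist([h], [0], filled.astype(np.uint8), [180], [0, 180])
print(f"dominant hue bin: {int(hist.argmax())} (OpenCV hue range is 0-179)")
```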


Subject(s)
Emotions , Facial Expression , Humans , Emotions/physiology , Male , Female , Young Adult , Happiness , Anger/physiology , Adult , Fear/psychology , Sadness
14.
Sci Rep ; 14(1): 11617, 2024 05 21.
Article in English | MEDLINE | ID: mdl-38773183

ABSTRACT

It has been argued that experiencing the pain of others motivates helping. Here, we investigate how somatic feelings experienced while witnessing the pain of others contribute to costly helping decisions, by contrasting the choices and brain activity of participants who report such somatic feelings (self-reported mirror-pain synesthetes) with those who do not. Participants in the fMRI scanner witnessed a confederate receiving pain stimulations whose intensity they could reduce by donating money. The pain intensity could be inferred either from the facial expressions of the confederate in pain (Face condition) or from the kinematics of the pain-receiving hand (Hand condition). Our results show that self-reported mirror-pain synesthetes increase their donation more steeply as the intensity of the observed pain increases, and their somatosensory brain activity (SII and the adjacent IPL) was more tightly associated with donation in the Hand condition. Across all participants, activation in the insula, SII, TPJ, pSTS, amygdala, and MCC correlated with the trial-by-trial donation made in the Face condition, while SI and MTG activation correlated with the donation in the Hand condition. These results further inform us about the role that somatic feelings, experienced while witnessing the pain of others, play in situations of costly helping.


Subject(s)
Magnetic Resonance Imaging , Pain , Humans , Female , Male , Adult , Pain/psychology , Pain/physiopathology , Young Adult , Brain/physiopathology , Brain/diagnostic imaging , Brain/physiology , Brain Mapping , Facial Expression , Helping Behavior , Hand/physiology
15.
Int J Med Inform ; 187: 105469, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38723429

ABSTRACT

BACKGROUND: Human Emotion Recognition (HER) has been a popular field of study in recent years. Despite the great progress made so far, relatively little attention has been paid to the use of HER in autism. People with autism are known to face problems with daily social communication and with the prototypical interpretation of emotional responses, which are most frequently conveyed via facial expressions. This poses significant practical challenges to the application of regular HER systems, which are normally developed for and by neurotypical people. OBJECTIVE: This study reviews the literature on the use of HER systems in autism, particularly with respect to sensing technologies and machine learning methods, in order to identify existing barriers and possible future directions. METHODS: We conducted a systematic review of articles published between January 2011 and June 2023 according to the 2020 PRISMA guidelines. Manuscripts were identified by searching the Web of Science and Scopus databases. Manuscripts were included when they related to emotion recognition, used sensors and machine learning techniques, and involved children, young people, or adults with autism. RESULTS: The search yielded 346 articles. A total of 65 publications met the eligibility criteria and were included in the review. CONCLUSIONS: Studies predominantly used facial expression techniques as the emotion recognition method. Consequently, video cameras were the most widely used devices across studies, although a growing trend in the use of physiological sensors has been observed lately. Happiness, sadness, anger, fear, disgust, and surprise were the most frequently addressed emotions. Classical supervised machine learning techniques were primarily used, rather than unsupervised approaches or more recent deep learning models. Studies focused on autism in a broad sense, but limited efforts have been directed towards more specific disorders of the spectrum. Privacy and security issues were seldom addressed, and when they were, at an insufficient level of detail.


Subject(s)
Autistic Disorder , Emotions , Facial Expression , Machine Learning , Humans , Autistic Disorder/psychology , Child
16.
Comput Methods Programs Biomed ; 250: 108195, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38692251

ABSTRACT

BACKGROUND AND OBJECTIVE: Timely stroke treatment can limit brain damage and improve outcomes, which depends on early recognition of the symptoms. However, stroke cases are often missed by first-responder paramedics. One of the earliest external signs of stroke appears in facial expressions. METHODS: We propose a computerized analysis of facial expressions using action units to distinguish between Post-Stroke and healthy people. Action units enable the analysis of subtle and specific facial movements and are interpretable in terms of facial expressions. RGB videos from the Toronto Neuroface Dataset, recorded during standard orofacial examinations of 14 people with post-stroke (PS) and 11 healthy controls (HC), were used in this study. Action units were computed using XGBoost trained on the HC data, and classification was performed using regression analysis for each of the nine facial expressions. The analysis was performed without manual intervention. RESULTS: The results were evaluated using leave-one-out validation. Accuracy was 82% for the Kiss and Spread expressions, with a best sensitivity of 91% in differentiating PS from HC. The features corresponding to mouth muscles were the most suitable. CONCLUSIONS: This pilot study has shown that our method can detect PS based on two simple facial expressions. However, it needs to be tested in real-world conditions, with people of different ethnicities, and with smartphone use. The method has the potential to support computerized assessment of videos by first responders using a smartphone to perform screening tests, which could facilitate the timely start of treatment.
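A minimal sketch of the evaluation described above: leave-one-out classification of Post-Stroke versus healthy controls from per-expression action-unit features. Logistic regression stands in for the abstract's "regression analysis"; the AU extraction step (there based on XGBoost) is assumed to have happened upstream, and the feature values are synthetic.

```python
# Sketch: leave-one-out classification of PS vs. HC from action-unit features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import accuracy_score, recall_score

rng = np.random.default_rng(5)
X = rng.standard_normal((25, 17))     # 14 PS + 11 HC, 17 AU intensities
y = np.r_[np.ones(14), np.zeros(11)]  # 1 = Post-Stroke, 0 = control

pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                         cv=LeaveOneOut())
print("accuracy:", accuracy_score(y, pred))
print("sensitivity (PS):", recall_score(y, pred))
```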


Subject(s)
Facial Expression , Stroke , Humans , Pilot Projects , Female , Male , Middle Aged , Aged , Case-Control Studies , Video Recording
17.
Autism Res ; 17(5): 934-946, 2024 May.
Article in English | MEDLINE | ID: mdl-38716802

ABSTRACT

Autistic people exhibit atypical use of prior information when processing simple perceptual stimuli; yet, it remains unclear whether and how these difficulties in using priors extend to complex social stimuli. Here, we compared autistic people without accompanying intellectual disability and nonautistic people in their ability to acquire an "emotional prior" of a facial expression and update this prior to a different facial expression of the same identity. Participants performed a two-interval same/different discrimination task between two facial expressions. To study the acquisition of the prior, we examined how discrimination was modified by the contraction of the perceived facial expressions toward the average of presented stimuli (i.e., regression to the mean). At first, facial expressions surrounded one average emotional prior (mostly sad or angry), and then the average switched (to mostly angry or sad, accordingly). Autistic people exhibited challenges in facial discrimination, and yet acquired the first prior, demonstrating typical regression-to-the-mean effects. However, unlike nonautistic people, autistic people did not update their perception to the second prior, suggesting they are less flexible in updating an acquired prior of emotional expressions. Our findings shed light on the perception of emotional expressions, one of the most pressing challenges in autism.


Subject(s)
Anger , Autistic Disorder , Facial Expression , Humans , Female , Male , Adult , Anger/physiology , Autistic Disorder/psychology , Young Adult , Learning/physiology , Social Perception , Adolescent , Emotions/physiology , Discrimination, Psychological/physiology
18.
Sci Rep ; 14(1): 12250, 2024 05 28.
Article in English | MEDLINE | ID: mdl-38806507

ABSTRACT

Mona Lisa's ambiguous expression, oscillating between melancholy and contentment, has captivated viewers for centuries, prompting diverse explanations. This article proposes a novel interpretation grounded in the psychological theory of perceptual organisation. Central to the investigation is the "Ambiguity-Nuance", a subtly shaded, blended region framing the upper part of the lips, hypothesised to influence perceived expression through perceptual organisation. Through carefully crafted artwork and systematic manipulations of Mona Lisa reproductions, experiments reveal how alterations in the perceptual relationships of the Ambiguity-Nuance yield significant shifts in perceived expression, explaining why Mona Lisa's appearance changes and under which conditions she looks content versus melancholic. These findings underscore the pivotal role of psychological principles in shaping ambiguous expressions in the Mona Lisa, and extend to other portraits by Leonardo, namely La Bella Principessa and Scapigliata. This study sheds light on the intersection of psychology and art, offering new perspectives on timeless masterpieces.


Subject(s)
Smiling , Humans , Female , Facial Expression , Famous Persons , Paintings
19.
J Vis ; 24(5): 14, 2024 May 01.
Article in English | MEDLINE | ID: mdl-38814935

ABSTRACT

Facial color influences the perception of facial expressions, and emotional expressions bias how facial color is remembered. However, it remains unclear whether facial expressions affect everyday facial color memory. The memory color effect demonstrates that knowledge about typical colors affects the perception of the actual color of given objects. To investigate facial color memory, we examined whether the memory color effect for faces varies depending on facial expression. We calculated the subjective achromatic point of each facial expression stimulus and compared how far it was shifted from the actual achromatic point across facial expression conditions. We hypothesized that if the memory of facial color is influenced by the color associated with a facial expression (e.g., anger with a warm color, fear with a cold color), the subjective achromatic point would vary with facial expression. In Experiment 1, 13 participants adjusted the color of facial expression stimuli (anger, neutral, and fear) and a banana stimulus to be achromatic. No significant differences in the subjective achromatic point between facial expressions were observed. We then conducted Experiment 2 with 23 participants, because Experiment 1 did not account for sensitivity to color changes on faces: humans perceive greater color differences in faces than in non-face objects. Participants selected which of two presented colors the expression stimulus appeared to be. The results indicated that the subjective achromatic points of anger and fear faces shifted significantly in the opposite color direction compared with neutral faces in the brief-presentation condition. This research suggests that the memory color of faces differs depending on facial expression and supports the idea that the perception of emotional expressions can bias facial color memory.
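The subjective achromatic point can be summarised, for instance, as the mean adjusted chromaticity and its distance from the true neutral point; the colour space (CIELAB a*b*) and the summary statistic below are assumptions, as the abstract does not specify them.

```python
# Sketch: quantifying the subjective achromatic point in CIELAB a*b* and its
# shift from the true achromatic point (a* = b* = 0). Data are hypothetical.
import numpy as np

def achromatic_shift(settings_ab):
    """settings_ab: (n_trials, 2) array of adjusted (a*, b*) values."""
    mean_ab = np.asarray(settings_ab, float).mean(axis=0)
    return mean_ab, float(np.linalg.norm(mean_ab))  # direction and magnitude

# Hypothetical adjustments for an 'anger' face: shifted toward -a* (greenish),
# i.e., opposite to the warm colour associated with anger
anger_settings = [(-1.2, 0.3), (-0.8, -0.1), (-1.5, 0.4)]
direction, magnitude = achromatic_shift(anger_settings)
print("mean (a*, b*):", direction, " shift magnitude:", magnitude)
```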


Subject(s)
Color Perception , Facial Expression , Memory , Humans , Male , Female , Young Adult , Color Perception/physiology , Adult , Memory/physiology , Photic Stimulation/methods , Emotions/physiology , Anger/physiology , Facial Recognition/physiology
20.
Transl Psychiatry ; 14(1): 211, 2024 May 27.
Article in English | MEDLINE | ID: mdl-38802372

ABSTRACT

Lamotrigine is an effective mood stabiliser, largely used for the management and prevention of depression in bipolar disorder. The neuropsychological mechanisms by which lamotrigine acts to relieve symptoms, as well as its neural effects on emotional processing, remain unclear. The primary objective of the current study was to investigate the impact of an acute dose of lamotrigine on the neural response to a well-characterised fMRI task probing implicit emotional processing relevant to negative bias. Thirty-one healthy participants were administered either a single dose of lamotrigine (300 mg, n = 14) or placebo (n = 17) in a randomized, double-blind design. Inside the 3 T MRI scanner, participants completed a covert emotional faces gender discrimination task. Brain activations showing significant group differences were identified using voxel-wise general linear model (GLM) nonparametric permutation testing, with threshold-free cluster enhancement (TFCE) and a family-wise error (FWE)-corrected cluster significance threshold of p < 0.05. Participants receiving lamotrigine were more accurate at identifying the gender of fearful (but not happy or angry) faces. A network of regions associated with emotional processing, including the amygdala, insula, and anterior cingulate cortex (ACC), was significantly less activated in the lamotrigine group than in the placebo group across emotional facial expressions. A single dose of lamotrigine reduced activation in limbic areas in response to faces with both positive and negative expressions, suggesting a valence-independent effect. However, at the behavioural level, lamotrigine appeared to reduce the distracting effect of fear on face discrimination. Such effects may be relevant to the mood-stabilising effects of lamotrigine.


Subject(s)
Emotions , Facial Expression , Healthy Volunteers , Lamotrigine , Magnetic Resonance Imaging , Triazines , Humans , Lamotrigine/pharmacology , Lamotrigine/administration & dosage , Male , Female , Adult , Double-Blind Method , Emotions/drug effects , Triazines/pharmacology , Triazines/administration & dosage , Young Adult , Brain/drug effects , Brain/diagnostic imaging , Facial Recognition/drug effects , Gyrus Cinguli/drug effects , Gyrus Cinguli/diagnostic imaging , Amygdala/drug effects , Amygdala/diagnostic imaging , Antimanic Agents/pharmacology , Antimanic Agents/administration & dosage