Results 1 - 20 of 23,052
1.
PLoS One ; 19(5): e0300984, 2024.
Article in English | MEDLINE | ID: mdl-38709789

ABSTRACT

Mentalizing describes the ability to imagine mental states underlying behavior. Furthermore, mentalizing allows one to identify, reflect on, and make sense of one's emotional state, as well as to communicate one's emotions to oneself and others. Existing self-report measures do not capture the process of mentalizing emotions in oneself and others. Therefore, the Mentalizing Emotions Questionnaire (MEQ; current version in German) was developed. In Study 1 (N = 510), we explored the factor structure of the MEQ with an Exploratory Factor Analysis, which identified one principal factor (R2 = .65) and three subfactors: the overall factor was mentalizing emotions, and the three subdimensions were self, communicating, and other. In Study 2 (N = 509), we tested and confirmed the factor structure of the 16-item MEQ in a Confirmatory Factor Analysis (CFI = .959, RMSEA = .078, SRMR = .04) and evaluated its psychometric properties, which showed excellent internal consistency (α = .92 - .95) and good validity. The MEQ is a valid and reliable instrument that assesses the ability to mentalize emotions and shows incremental validity relative to related constructs such as empathy, going beyond other mentalization questionnaires.


Subject(s)
Emotions , Mentalization , Psychometrics , Self Report , Humans , Male , Female , Emotions/physiology , Adult , Surveys and Questionnaires , Mentalization/physiology , Psychometrics/methods , Young Adult , Middle Aged , Factor Analysis, Statistical , Adolescent , Theory of Mind , Empathy/physiology , Reproducibility of Results
2.
PLoS One ; 19(5): e0302782, 2024.
Article in English | MEDLINE | ID: mdl-38713700

ABSTRACT

Parents with a history of childhood maltreatment may be more likely to respond inadequately to their child's emotional cues, such as crying or screaming, due to previous exposure to prolonged stress. While studies have investigated parents' physiological reactions to their children's vocal expressions of emotions, less attention has been given to their responses when perceiving children's facial expressions of emotions. The present study aimed to determine if viewing facial expressions of emotions in children induces cardiovascular changes in mothers (hypo- or hyper-arousal) and whether these differ as a function of childhood maltreatment. A total of 104 mothers took part in this study. Their experiences of childhood maltreatment were measured using the Childhood Trauma Questionnaire (CTQ). Participants' electrocardiogram signals were recorded during a task in which they viewed a landscape video (baseline) and images of children's faces expressing different intensities of emotion. Heart rate variability (HRV) was extracted from the recordings as an indicator of parasympathetic reactivity. Participants presented two profiles: one group of mothers had a decreased HRV when presented with images of children's facial expressions of emotions, while the other group's HRV increased. However, HRV change was not significantly different between the two groups. The interaction between HRV groups and the severity of maltreatment experienced was marginal. Results suggested that experiences of childhood emotional abuse were more common in mothers whose HRV increased during the task. Therefore, more severe childhood experiences of emotional abuse could be associated with mothers' cardiovascular hyperreactivity. Maladaptive cardiovascular responses could have a ripple effect, influencing how mothers react to their children's facial expressions of emotions. That reaction could affect the quality of their interaction with their child. Providing interventions that help parents regulate their physiological and behavioral responses to stress might be helpful, especially if they have experienced childhood maltreatment.
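The abstract does not specify which HRV index was extracted; a standard time-domain index of parasympathetic activity is RMSSD, the root mean square of successive differences between R-R intervals. A minimal sketch (the interval values below are illustrative, not the study's data):

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between R-R intervals (ms),
    a standard time-domain HRV index of parasympathetic activity."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.diff(rr)  # differences between consecutive beats
    return float(np.sqrt(np.mean(diffs ** 2)))

# Illustrative R-R series (ms): a baseline segment vs. a task segment
baseline = [820, 810, 835, 825, 840, 815, 830]
task = [800, 795, 805, 798, 802, 799, 801]

# Lower RMSSD during the task would indicate reduced parasympathetic activity
print(rmssd(baseline) > rmssd(task))
```

In practice the R-R series would be derived from R-peak detection on the ECG recording; the comparison above only illustrates how a decreased versus increased HRV profile could be quantified.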


Subject(s)
Emotions , Facial Expression , Heart Rate , Mothers , Humans , Female , Adult , Heart Rate/physiology , Child , Emotions/physiology , Mothers/psychology , Emotional Abuse/psychology , Male , Electrocardiography , Child Abuse/psychology , Mother-Child Relations/psychology , Surveys and Questionnaires
4.
Sci Rep ; 14(1): 10607, 2024 05 08.
Article in English | MEDLINE | ID: mdl-38719866

ABSTRACT

Guilt is a negative emotion elicited by realizing one has caused actual or perceived harm to another person. One of guilt's primary functions is to signal that one is aware of the harm that was caused and regrets it, an indication that the harm will not be repeated. Verbal expressions of guilt are often deemed insufficient by observers when not accompanied by nonverbal signals such as facial expression, gesture, posture, or gaze. Some research has investigated isolated nonverbal expressions in guilt; however, none to date has explored multiple nonverbal channels simultaneously. This study explored facial expression, gesture, posture, and gaze during the real-time experience of guilt when response demands are minimal. Healthy adults completed a novel task involving watching videos designed to elicit guilt, as well as comparison emotions. During the video task, participants were continuously recorded to capture nonverbal behaviour, which was then analyzed via automated facial expression software. We found that while feeling guilt, individuals engaged less in several nonverbal behaviours than they did while experiencing the comparison emotions. This may reflect the highly social aspect of guilt, suggesting that an audience is required to prompt a guilt display, or may suggest that guilt does not have clear nonverbal correlates.


Subject(s)
Facial Expression , Guilt , Humans , Male , Female , Adult , Young Adult , Nonverbal Communication/psychology , Emotions/physiology , Gestures
5.
Sci Rep ; 14(1): 10491, 2024 05 07.
Article in English | MEDLINE | ID: mdl-38714729

ABSTRACT

Dogs (Canis lupus familiaris) are the domestically bred descendants of wolves (Canis lupus). However, selective breeding has profoundly altered the facial morphologies of dogs compared to their wolf ancestors. We demonstrate that these morphological differences limit the abilities of dogs to successfully produce the same affective facial expressions as wolves. We decoded facial movements of captive wolves during social interactions involving nine separate affective states. We used linear discriminant analyses to predict affective states based on combinations of facial movements. The resulting confusion matrix demonstrates that specific combinations of facial movements predict nine distinct affective states in wolves; this is the first assessment of this many affective facial expressions in wolves. However, comparative analyses with kennelled rescue dogs revealed a reduced ability to predict affective states. Critically, there was very low predictive power for specific affective states, with confusion occurring between negative and positive states, such as Friendly and Fear. We show that the varying facial morphologies of dogs (specifically non-wolf-like morphologies) limit their ability to produce the same range of affective facial expressions as wolves. Confusion among positive and negative states could be detrimental to human-dog interactions, although our analyses also suggest dogs likely use vocalisations to compensate for limitations in facial communication.


Subject(s)
Domestication , Emotions , Facial Expression , Wolves , Animals , Wolves/physiology , Dogs , Emotions/physiology , Male , Female , Behavior, Animal/physiology , Humans
6.
Hum Brain Mapp ; 45(7): e26703, 2024 May.
Article in English | MEDLINE | ID: mdl-38716714

ABSTRACT

The default mode network (DMN) lies towards the heteromodal end of the principal gradient of intrinsic connectivity, maximally separated from the sensory-motor cortex. It supports memory-based cognition, including the capacity to retrieve conceptual and evaluative information from sensory inputs, and to generate meaningful states internally; however, the functional organisation of DMN that can support these distinct modes of retrieval remains unclear. We used fMRI to examine whether activation within subsystems of DMN differed as a function of retrieval demands, or the type of association to be retrieved, or both. In a picture association task, participants retrieved semantic associations that were either contextual or emotional in nature. Participants were asked to avoid generating episodic associations. In the generate phase, these associations were retrieved from a novel picture, while in the switch phase, participants retrieved a new association for the same image. Semantic context and emotion trials were associated with dissociable DMN subnetworks, indicating that a key dimension of DMN organisation relates to the type of association being accessed. The frontotemporal and medial temporal DMN showed a preference for emotional and semantic contextual associations, respectively. Relative to the generate phase, the switch phase recruited clusters closer to the heteromodal apex of the principal gradient-a cortical hierarchy separating unimodal and heteromodal regions. There were no differences in this effect between association types. Instead, memory switching was associated with a distinct subnetwork associated with controlled internal cognition. These findings delineate distinct patterns of DMN recruitment for different kinds of associations yet common responses across tasks that reflect retrieval demands.


Subject(s)
Default Mode Network , Emotions , Magnetic Resonance Imaging , Mental Recall , Semantics , Humans , Male , Female , Adult , Young Adult , Emotions/physiology , Default Mode Network/physiology , Default Mode Network/diagnostic imaging , Mental Recall/physiology , Cerebral Cortex/physiology , Cerebral Cortex/diagnostic imaging , Nerve Net/physiology , Nerve Net/diagnostic imaging , Brain Mapping , Pattern Recognition, Visual/physiology
7.
PLoS One ; 19(5): e0301085, 2024.
Article in English | MEDLINE | ID: mdl-38718018

ABSTRACT

Psychopathy is a severe personality disorder marked by a wide range of emotional deficits, including a lack of empathy, emotion dysregulation, and alexithymia. Previous research has largely examined these emotional impairments in isolation, ignoring their influence on each other. Thus, we examined the concurrent interrelationship between emotional impairments in psychopathy, with a particular focus on the mediating role of alexithymia. Using path analyses with cross-sectional data from a community sample (N = 315) and a forensic sample (N = 50), our results yielded a statistically significant mediating effect of alexithymia on the relationship between psychopathy and empathy (community and forensic) and between psychopathy and emotion dysregulation (community). Moreover, replacing psychopathy with its three dimensions (i.e., meanness, disinhibition, and boldness) in the community sample revealed that boldness may function as an adaptive trait, with lower levels of alexithymia counteracting deficits in empathy and emotion dysregulation. Overall, our findings indicate that psychopathic individuals' limited understanding of their own emotions contributes to their lack of empathy and emotion dysregulation. This underscores the potential benefits of improving emotional awareness in the treatment of individuals with psychopathy.


Subject(s)
Affective Symptoms , Antisocial Personality Disorder , Empathy , Humans , Affective Symptoms/psychology , Affective Symptoms/physiopathology , Empathy/physiology , Male , Adult , Female , Antisocial Personality Disorder/psychology , Cross-Sectional Studies , Middle Aged , Emotions/physiology , Emotional Regulation/physiology , Young Adult
8.
PLoS One ; 19(5): e0303144, 2024.
Article in English | MEDLINE | ID: mdl-38718035

ABSTRACT

Charitable fundraising increasingly relies on online crowdfunding platforms. Project images in charitable crowdfunding use emotional appeals to promote helping behavior. Negative emotions are commonly used to motivate helping behavior because the image of a happy child may not motivate donors to donate as willingly. However, some research has found that happy images can be more beneficial. These contradictory results suggest that the emotional valence of project imagery, and how fundraisers can frame project images effectively, remain debatable. Thus, we compared and analyzed brain activation differences in the prefrontal cortex governing human emotions depending on donation decisions using functional near-infrared spectroscopy, a neuroimaging device. We advance existing theory on charitable behavior by demonstrating that little correlation exists in donation intentions and brain activity between negative and positive project images, which is consistent with survey results on donation intentions by victim image. We also discovered quantitative brain hemodynamic signal variations between donors and nondonors, which can be used to predict and detect donors' brain functioning via functional connectivity, that is, the statistical dependence between the time series of electrophysiological activity and oxygenated hemodynamic levels in the prefrontal cortex. These findings are critical in developing future marketing strategies for online charitable crowdfunding platforms, especially project images.
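The functional connectivity described here, statistical dependence between time series, is often operationalized as a pairwise Pearson correlation matrix across channel signals. A minimal NumPy sketch with synthetic prefrontal-channel series (the signals, coupling structure, and seed are illustrative assumptions, not the study's data or exact method):

```python
import numpy as np

# Synthetic oxygenated-hemoglobin time series for 3 prefrontal channels
rng = np.random.default_rng(1)
n = 300
shared = rng.standard_normal(n)           # common hemodynamic component
ch1 = shared + 0.3 * rng.standard_normal(n)  # channels 1 and 2 are coupled
ch2 = shared + 0.3 * rng.standard_normal(n)
ch3 = rng.standard_normal(n)              # channel 3 is independent

# Functional connectivity as the pairwise Pearson correlation matrix
fc = np.corrcoef(np.vstack([ch1, ch2, ch3]))
print(fc[0, 1] > fc[0, 2])  # coupled channels correlate more strongly
```

Comparing such matrices between donor and nondonor groups is one way signal variations like those reported could be turned into a predictive feature.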


Subject(s)
Emotions , Fund Raising , Spectroscopy, Near-Infrared , Humans , Emotions/physiology , Spectroscopy, Near-Infrared/methods , Fund Raising/methods , Female , Male , Adult , Charities , Prefrontal Cortex/physiology , Prefrontal Cortex/diagnostic imaging , Intention , Young Adult , Brain Mapping/methods , Crowdsourcing , Brain/physiology , Brain/diagnostic imaging
9.
Sci Rep ; 14(1): 10369, 2024 05 06.
Article in English | MEDLINE | ID: mdl-38710748

ABSTRACT

Emotions experienced within sleep mentation (dreaming) affect mental functioning in waking life. There have been attempts at enhancing dream emotions using olfactory stimulation. Odors readily acquire affective value, but to profoundly influence emotional processing, they should bear personal significance for the perceiver rather than be generally pleasant. The main objective of the present sleep laboratory study was to examine whether prolonged nocturnal exposure to self-selected, preferred ambient room odor while asleep influences emotional aspects of sleep mentation and valence of post-sleep core affect. We asked twenty healthy participants (12 males, mean age 25 ± 4 years) to pick a commercially available scented room diffuser cartridge that most readily evoked positively valenced mental associations. In weekly intervals, the participants attended three sessions. After the adaptation visit, they were administered the odor exposure and odorless control condition in a balanced order. Participants were awakened five minutes into the first rapid eye movement (REM) stage that took place after 2:30 a.m. and, if they had been dreaming, they were asked to rate their mental sleep experience for pleasantness, emotional charge, and magnitude of positive and negative emotions and also to evaluate their post-sleep core affect valence. With rs < 0.20, no practically or statistically significant differences existed between exposure and control in any outcome measures. We conclude that in young, healthy participants, the practical value of olfactory stimulation with self-selected preferred scents for enhancement of dream emotions and post-sleep core affect valence is very limited.


Subject(s)
Dreams , Emotions , Odorants , Humans , Male , Adult , Female , Dreams/physiology , Dreams/psychology , Young Adult , Emotions/physiology , Sleep/physiology , Smell/physiology , Sleep, REM/physiology , Wakefulness/physiology
10.
Sci Rep ; 14(1): 10334, 2024 05 06.
Article in English | MEDLINE | ID: mdl-38710774

ABSTRACT

Effective interventions that support blood donor retention are needed. Yet, integrating an intervention into the time-pressed and operationally sensitive context of a blood donation center requires justification for disruptions to an optimized process. This research provides evidence that virtual reality (VR) paradigms can serve as a research environment in which interventions can be tested prior to being delivered in blood donation centers. Study 1 (N = 48) demonstrated that 360°-video VR blood donation environments elicit a similar profile of emotional experience to a live donor center. Presence and immersion were high, and cybersickness symptoms low. Study 2 (N = 134) was an experiment deploying the 360°-video VR environments to test the impact of an intervention on emotional experience and intentions to donate. Participants in the intervention condition who engaged in a suite of tasks drawn from the process model of emotion regulation (including attentional deployment, positive reappraisal, and response modulation) reported more positive emotion than participants in a control condition, which in turn increased intentions to donate blood. By showing the promise for benefitting donor experience via a relatively low-cost and low-resource methodology, this research supports the use of VR paradigms to trial interventions prior to deployment in operationally sensitive field settings.


Subject(s)
Blood Donors , Virtual Reality , Humans , Blood Donors/psychology , Male , Female , Adult , Young Adult , Emotions/physiology , Intention , Middle Aged , Adolescent , Blood Donation
11.
Sci Rep ; 14(1): 10371, 2024 05 06.
Article in English | MEDLINE | ID: mdl-38710806

ABSTRACT

Emotion can influence an individual's quality of life in both positive and negative ways. The ability to distinguish different types of emotion can lead researchers to estimate the current situation of patients or the probability of future disease. Recognizing emotions from images is unreliable because people can conceal their feelings by modifying their facial expressions. This led researchers to consider electroencephalography (EEG) signals for more accurate emotion detection. However, the complexity of EEG recordings and data analysis using conventional machine learning algorithms has led to inconsistent emotion recognition. Therefore, utilizing hybrid deep learning models and other techniques has become common due to their ability to analyze complicated data and achieve higher performance by integrating diverse features of the models. At the same time, researchers prioritize models with fewer parameters that still achieve the highest average accuracy. This study improves the Convolutional Fuzzy Neural Network (CFNN) for emotion recognition using EEG signals to achieve a reliable detection system. Initially, the pre-processing and feature extraction phases are implemented to obtain noiseless and informative data. Then, the CFNN with modified architecture is trained to classify emotions. Several parametric and comparative experiments are performed. The proposed model achieved reliable performance for emotion recognition with an average accuracy of 98.21% and 98.08% for valence (pleasantness) and arousal (intensity), respectively, and outperformed state-of-the-art methods.
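The abstract does not detail its feature-extraction phase; a common first step in EEG emotion recognition is computing per-channel band power (e.g., alpha, beta) before classification. A minimal sketch using NumPy's FFT, where the synthetic signal, sampling rate, and band limits are assumptions for illustration, not the authors' pipeline:

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean spectral power of `signal` within `band` (Hz) via the periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(psd[mask].mean())

fs = 128                       # sampling rate (Hz), common in EEG emotion datasets
t = np.arange(0, 4, 1.0 / fs)  # one 4-second epoch
# Synthetic channel: strong 10 Hz (alpha) oscillation plus noise
rng = np.random.default_rng(0)
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))

alpha = band_power(eeg, fs, (8, 13))   # dominates for this synthetic signal
beta = band_power(eeg, fs, (13, 30))
print(alpha > beta)
```

A feature vector of such band powers per channel is what a classifier, whether a conventional algorithm or a hybrid network like the CFNN, would typically consume.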


Subject(s)
Electroencephalography , Emotions , Fuzzy Logic , Neural Networks, Computer , Humans , Electroencephalography/methods , Emotions/physiology , Male , Female , Adult , Algorithms , Young Adult , Signal Processing, Computer-Assisted , Deep Learning , Facial Expression
12.
Sci Rep ; 14(1): 12200, 2024 May 28.
Article in English | MEDLINE | ID: mdl-38806616

ABSTRACT

Common inputs synchronize various biological systems, including human physical and cognitive processes. This mechanism potentially explains collective human emotions in theater as unintentional behavioral synchronization. However, the inter-subject correlation of physiological signals among individuals is small. Based on findings on the common-input synchronization of nonlinear systems, we hypothesized that individual differences in perceptual and cognitive systems reduce the reliability of physiological responses to aesthetic stimuli and, thus, disturb synchronization. We tested this by comparing the inter- and intra-subject Pearson's correlation coefficients and nonlinear phase synchronization, calculated using instantaneous heart rate data measured while appreciating music. The results demonstrated that inter-subject correlations were consistently lower than intra-subject correlations, regardless of participants' music preferences and daily moods. Further, music-induced heart rate synchronization depends on the reliability of physiological responses to musical pieces rather than mood or motivation. This study lays the foundation for future empirical research on collective emotions in theater.
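The inter- versus intra-subject comparison described above can be illustrated by modeling instantaneous heart-rate series as a shared stimulus response plus individual noise, then comparing Pearson's r; phase-synchronization analysis is more involved and is omitted here. The signals, noise levels, and seed below are illustrative assumptions, not the study's data:

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    return float(np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1])

rng = np.random.default_rng(42)
t = np.linspace(0, 60, 600)             # 60 s of instantaneous HR samples
stimulus = np.sin(2 * np.pi * t / 15)   # common input driven by the music

def listening_session(noise_scale):
    """One session's HR: shared stimulus response plus individual variability."""
    return 70 + 3 * stimulus + noise_scale * rng.standard_normal(len(t))

# Intra-subject: same listener twice (small individual variability)
r_intra = pearson(listening_session(0.5), listening_session(0.5))
# Inter-subject: two listeners with larger individual differences
r_inter = pearson(listening_session(2.0), listening_session(2.0))

print(r_intra > r_inter)
```

Under this toy model, larger individual differences in the response to the common input directly lower the inter-subject correlation, mirroring the paper's hypothesis about reduced response reliability disturbing synchronization.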


Subject(s)
Heart Rate , Music , Humans , Music/psychology , Heart Rate/physiology , Male , Female , Adult , Young Adult , Emotions/physiology , Reproducibility of Results , Affect/physiology
13.
Georgian Med News ; (348): 47-53, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38807390

ABSTRACT

Emotional intelligence (EI) is an important psychological aspect that has a significant impact on the diagnosis and psychotherapy of mental disorders. It includes the ability to effectively recognise, understand, and regulate one's own emotions, as well as the ability to perceive and interact with the emotions of others. The purpose of the study was to assess and compare the role of EI in different methods of diagnosing and treating mental disorders, as well as its impact on therapy outcomes. The study found that the development of EI improves therapy outcomes by increasing patients' emotional awareness and self-regulation. In addition, minimising the likelihood of relapse in mental illness is associated with patients' ability to cope with stress and overcome difficult circumstances. Developing emotional intelligence can also improve patients' well-being by enhancing their interpersonal relationships, expanding their social network, and mitigating feelings of social isolation. The results of the study indicate that EI should be taken into account in clinical practice and that new psychotherapeutic techniques can be developed to improve the outcomes of the treatment of mental disorders.


Subject(s)
Emotional Intelligence , Mental Disorders , Psychotherapy , Humans , Mental Disorders/therapy , Mental Disorders/diagnosis , Mental Disorders/psychology , Psychotherapy/methods , Male , Adaptation, Psychological , Female , Emotions/physiology
14.
Sci Rep ; 14(1): 11571, 2024 05 21.
Article in English | MEDLINE | ID: mdl-38773125

ABSTRACT

This study delves into the expression of the primary emotions anger, happiness, sadness, and fear through drawings. Moving beyond the well-researched color-emotion link, it explores under-examined aspects such as spatial concepts and drawing styles. Employing Python and OpenCV for objective analysis, we convert subjective perceptions into measurable data through 728 digital images from 182 university students. The most frequently chosen prominent color for each emotion was red for anger (73.11%), yellow for happiness (17.8%), blue for sadness (51.1%), and black for fear (40.7%). Happiness led with the highest saturation (68.52%) and brightness (75.44%) percentages, while fear recorded the lowest in both categories (47.33% saturation, 48.78% brightness). Fear, however, topped in color fill percentage (35.49%), with happiness at the lowest (25.14%). Tangible imagery prevailed (71.43-83.52%), with abstract styles peaking in fear representations (28.57%). Facial expressions were a common element (41.76-49.45%). The study achieved an 81.3% predictive accuracy for anger, higher than the 71.3% overall average. Future research can build on these results by improving technological methods to quantify more aspects of drawing content. Investigating a more comprehensive array of emotions and examining factors influencing emotional drawing styles will further our understanding of visual-emotional communication.
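The saturation, brightness, and color-fill percentages reported above can be approximated from HSV statistics of each image. A minimal sketch using NumPy and the standard-library colorsys in place of OpenCV; the near-white "blank paper" threshold and the toy image are assumptions, not the study's exact procedure:

```python
import colorsys
import numpy as np

def hsv_stats(rgb_image):
    """Mean saturation/brightness (%) and fill percentage of an RGB uint8 image.
    Fill = share of non-white pixels, a rough proxy for the drawn area."""
    pixels = rgb_image.reshape(-1, 3) / 255.0
    hsv = np.array([colorsys.rgb_to_hsv(*p) for p in pixels])
    non_white = (pixels < 0.95).any(axis=1)  # blank paper is near-white
    return {
        "saturation_pct": float(hsv[:, 1].mean() * 100),
        "brightness_pct": float(hsv[:, 2].mean() * 100),
        "fill_pct": float(non_white.mean() * 100),
    }

# Toy 2x2 "drawing": one pure-red stroke on white paper
img = np.array([[[255, 0, 0], [255, 255, 255]],
                [[255, 255, 255], [255, 255, 255]]], dtype=np.uint8)
print(hsv_stats(img))
```

On real scans, OpenCV's `cv2.cvtColor(img, cv2.COLOR_BGR2HSV)` would replace the per-pixel colorsys loop for speed; the statistics computed are the same in kind.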


Subject(s)
Emotions , Facial Expression , Humans , Emotions/physiology , Male , Female , Young Adult , Happiness , Anger/physiology , Adult , Fear/psychology , Sadness
15.
Sci Rep ; 14(1): 11590, 2024 05 21.
Article in English | MEDLINE | ID: mdl-38773178

ABSTRACT

Human interaction is immersed in laughter; though genuine and posed laughter are acoustically distinct, they are both crucial socio-emotional signals. In this novel study, autistic and non-autistic adults explicitly rated the affective properties of genuine and posed laughter. Additionally, we explored whether their self-reported everyday experiences with laughter differ. Both groups could differentiate between these two types of laughter. However, autistic adults rated posed laughter as more authentic and emotionally arousing than non-autistic adults, perceiving it to be similar to genuine laughter. Autistic adults reported laughing less, deriving less enjoyment from laughter, and experiencing difficulty in understanding the social meaning of other people's laughter compared to non-autistic people. Despite these differences, autistic adults reported using laughter socially as often as non-autistic adults, leveraging it to mediate social contexts. Our findings suggest that autistic adults show subtle differences in their perception of laughter, which may be associated with their struggles in comprehending the social meaning of laughter, as well as their diminished frequency and enjoyment of laughter in everyday scenarios. By combining experimental evidence with first-person experiences, this study suggests that autistic adults likely employ different strategies to understand laughter in everyday contexts, potentially leaving them socially vulnerable in communication.


Subject(s)
Autistic Disorder , Laughter , Humans , Laughter/psychology , Male , Adult , Female , Autistic Disorder/psychology , Autistic Disorder/physiopathology , Young Adult , Emotions/physiology , Middle Aged
16.
Sci Rep ; 14(1): 11686, 2024 05 22.
Article in English | MEDLINE | ID: mdl-38777852

ABSTRACT

Pain is rarely communicated alone, as it is often accompanied by emotions such as anger or sadness. Communicating these affective states involves shared representations. However, how an individual conceptually represents these combined states must first be tested. The objective of this study was to measure the interaction between pain and negative emotions on two types of facial representations of these states, namely visual (i.e., interactive virtual agents; VAs) and sensorimotor (i.e., one's production of facial configurations). Twenty-eight participants (15 women) read short written scenarios involving only pain or a combined experience of pain and a negative emotion (anger, disgust, fear, or sadness). They produced facial configurations representing these experiences on the faces of the VAs and on their face (own production or imitation of VAs). The results suggest that affective states related to a direct threat to the body (i.e., anger, disgust, and pain) share a similar facial representation, while those that present no immediate danger (i.e., fear and sadness) differ. Although visual and sensorimotor representations of these states provide congruent affective information, they are differently influenced by factors associated with the communication cycle. These findings contribute to our understanding of pain communication in different affective contexts.


Subject(s)
Emotions , Facial Expression , Pain , Humans , Female , Male , Pain/psychology , Pain/physiopathology , Adult , Emotions/physiology , Young Adult , Anger/physiology , Affect/physiology , Fear/psychology , Sadness/psychology
17.
PLoS One ; 19(5): e0304107, 2024.
Article in English | MEDLINE | ID: mdl-38781193

ABSTRACT

AIM: In a previous study, we reported that watching two-dimensional videos of earthquakes significantly reduced sympathetic nerve activity in healthy young adults. In the present study, we aimed to investigate emotional responses to earthquakes using immersive virtual reality (VR), which can provide a more realistic experience. METHODS: In total, 24 healthy young adults (12 males, 21.4 ± 0.2 years old) participated. Participants watched earthquake and neutral videos while wearing a head-mounted display and a near-infrared spectroscopy (NIRS) probe, during which physiological signals, including pulse rate and cerebral blood flow (CBF) in the dorsolateral prefrontal cortex, were measured. We also analyzed changes in sympathetic and parasympathetic indices and obtained seven emotion ratings: valence, arousal, dominance, fear, astonishment, anxiety, and panic. RESULTS: The VR earthquake videos evoked negative subjective emotions, and the pulse rate significantly decreased. Sympathetic nerve activity tended to decrease, whereas CBF in the left prefrontal cortex showed a slight, nonsignificant increase. CONCLUSIONS: This study showed that measurements combining NIRS and immersive VR have the potential to capture emotional responses to different stimuli.


Subject(s)
Earthquakes , Emotions , Heart Rate , Spectroscopy, Near-Infrared , Virtual Reality , Humans , Male , Spectroscopy, Near-Infrared/methods , Emotions/physiology , Female , Young Adult , Heart Rate/physiology , Cerebrovascular Circulation/physiology , Adult , Prefrontal Cortex/physiology , Arousal/physiology
18.
Curr Biol ; 34(9): R340-R343, 2024 05 06.
Article in English | MEDLINE | ID: mdl-38714159

ABSTRACT

The posterior cerebellum is emerging as a key structure for social cognition. A new study causally demonstrates its early involvement during emotion perception and functional connectivity with the posterior superior temporal sulcus, a cortical hub of the social brain.


Subject(s)
Cerebellum , Social Perception , Humans , Cerebellum/physiology , Emotions/physiology , Social Cognition , Temporal Lobe/physiology
19.
Autism Res ; 17(5): 934-946, 2024 May.
Article in English | MEDLINE | ID: mdl-38716802

ABSTRACT

Autistic people exhibit atypical use of prior information when processing simple perceptual stimuli; yet, it remains unclear whether and how these difficulties in using priors extend to complex social stimuli. Here, we compared autistic people without accompanying intellectual disability and nonautistic people in their ability to acquire an "emotional prior" of a facial expression and update this prior to a different facial expression of the same identity. Participants performed a two-interval same/different discrimination task between two facial expressions. To study the acquisition of the prior, we examined how discrimination was modified by the contraction of the perceived facial expressions toward the average of presented stimuli (i.e., regression to the mean). At first, facial expressions surrounded one average emotional prior (mostly sad or angry), and then the average switched (to mostly angry or sad, accordingly). Autistic people exhibited challenges in facial discrimination, and yet acquired the first prior, demonstrating typical regression-to-the-mean effects. However, unlike nonautistic people, autistic people did not update their perception to the second prior, suggesting they are less flexible in updating an acquired prior of emotional expressions. Our findings shed light on the perception of emotional expressions, one of the most pressing challenges in autism.


Subject(s)
Anger , Autistic Disorder , Facial Expression , Humans , Female , Male , Adult , Anger/physiology , Autistic Disorder/psychology , Young Adult , Learning/physiology , Social Perception , Adolescent , Emotions/physiology , Discrimination, Psychological/physiology
20.
Cogn Sci ; 48(5): e13453, 2024 05.
Article in English | MEDLINE | ID: mdl-38742274

ABSTRACT

"Autonomous Sensory Meridian Response" (ASMR) refers to a sensory-emotional experience that was first explicitly identified and named within the past two decades in online discussion boards. Since then, there has been mounting psychological and neural evidence of a clustering of properties common to the phenomenon of ASMR, including convergence on the set of stimuli that trigger the experience, the properties of the experience itself, and its downstream effects. Moreover, psychological instruments have begun to be developed and employed in an attempt to measure it. Based on this empirical work, we make the case that despite its nonscientific origins, ASMR is a good candidate for being a real kind in the cognitive sciences. The phenomenon appears to have a robust causal profile and may also have an adaptive evolutionary history. We also argue that a more thorough understanding of the distinctive type of phenomenal experience involved in an ASMR episode can shed light on the functions of consciousness, and ultimately undermine certain "cognitive" theories of consciousness. We conclude that ASMR should be the subject of more extensive scientific investigation, particularly since it may also have the potential for therapeutic applications.


Subject(s)
Consciousness , Humans , Consciousness/physiology , Emotions/physiology , Sensation/physiology