Results 1 - 4 of 4
1.
Sensors (Basel); 24(8), 2024 Apr 19.
Article in English | MEDLINE | ID: mdl-38676235

ABSTRACT

Most human emotion recognition methods largely depend on classifying stereotypical facial expressions that represent emotions. However, such facial expressions do not necessarily correspond to actual emotional states and may instead reflect communicative intentions. In other cases, emotions are hidden, cannot be expressed, or have lower arousal manifested by less pronounced facial expressions, as may occur during passive video viewing. This study improves an emotion classification approach developed in a previous study, which classifies emotions remotely from short facial video data without relying on stereotypical facial expressions or contact-based methods. In this approach, we remotely sense transdermal cardiovascular spatiotemporal facial patterns associated with different emotional states and analyze these data via machine learning. In this paper, we propose several improvements, including better remote heart rate estimation via preliminary skin segmentation, an improved heartbeat peak-and-trough detection process, and better emotion classification accuracy achieved by employing an appropriate deep learning classifier using only RGB camera input data. We used the dataset obtained in the previous study, which contains facial videos of 110 participants who passively viewed 150 short videos eliciting five emotion types (amusement, disgust, fear, sexual arousal, and no emotion) while three cameras with different wavelength sensitivities (visible spectrum, near-infrared, and longwave infrared) recorded them simultaneously. From the short facial videos, we extracted unique high-resolution spatiotemporal, physiologically affected features and examined them as input features with different deep learning approaches.
An EfficientNet-B0 model type was able to classify participants' emotional states with an overall average accuracy of 47.36% using a single input spatiotemporal feature map obtained from a regular RGB camera.
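The pipeline above depends on detecting heartbeat peaks and troughs in a remotely sensed pulse signal before any classification takes place. As a minimal illustrative sketch (pure Python on a synthetic signal; not the authors' implementation, whose detection process operates on camera-derived transdermal signals), peak-based heart rate estimation might look like:

```python
import math

def detect_peaks(signal, min_distance):
    """Return indices of local maxima separated by at least min_distance samples."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:
            if not peaks or i - peaks[-1] >= min_distance:
                peaks.append(i)
    return peaks

def estimate_heart_rate(signal, fps):
    """Estimate beats per minute from the mean inter-peak interval of a pulse signal."""
    # Require peaks at least 0.4 s apart, i.e. cap the detectable rate at 150 bpm.
    peaks = detect_peaks(signal, min_distance=int(0.4 * fps))
    if len(peaks) < 2:
        return None
    intervals = [(b - a) / fps for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic 10 s pulse wave sampled at 30 fps with a 1.2 Hz beat (72 bpm).
fps = 30
signal = [math.sin(2 * math.pi * 1.2 * t / fps) for t in range(10 * fps)]
print(round(estimate_heart_rate(signal, fps)))  # → 72
```

A real remote-PPG signal would be noisier than this clean sine wave, which is presumably why the paper's preliminary skin segmentation matters: it restricts the averaged pixels to regions that actually carry the cardiovascular signal.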


Subjects
Deep Learning , Emotions , Facial Expression , Heart Rate , Humans , Emotions/physiology , Heart Rate/physiology , Video Recording/methods , Image Processing, Computer-Assisted/methods , Face/physiology , Female , Male
3.
Sci Rep; 12(1): 11188, 2022 Jul 01.
Article in English | MEDLINE | ID: mdl-35778591

ABSTRACT

We describe a new method for remote emotional state assessment using multispectral face videos, and present our findings: unique transdermal, cardiovascular and spatiotemporal facial patterns associated with different emotional states. The method does not rely on stereotypical facial expressions but utilizes different wavelength sensitivities (visible spectrum, near-infrared, and long-wave infrared) to gauge correlates of autonomic nervous system activity spatially and temporally distributed across the human face (e.g., blood flow, hemoglobin concentration, and temperature). We conducted an experiment where 110 participants viewed 150 short emotion-eliciting videos and reported their emotional experience, while three cameras recorded facial videos with multiple wavelengths. Spatiotemporal multispectral features from the multispectral videos were used as inputs to a machine learning model that was able to classify participants' emotional state (i.e., amusement, disgust, fear, sexual arousal, or no emotion) with satisfactory results (average ROC AUC score of 0.75), while providing feature importance analysis that allows the examination of facial occurrences per emotional state. We discuss findings concerning the different spatiotemporal patterns associated with different emotional states as well as the different advantages of the current method over existing approaches to emotion detection.
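The reported average ROC AUC of 0.75 is an aggregate over the five emotion classes. As a rough illustration of how such a multiclass score can be computed (a one-vs-rest macro average via the Mann-Whitney rank statistic; the class names below are from the study, but the scores and the averaging choice are assumptions for the sketch, not the authors' code):

```python
def roc_auc(labels, scores):
    """Binary ROC AUC via the Mann-Whitney U statistic: the probability
    that a randomly chosen positive outranks a randomly chosen negative."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    if not pos or not neg:
        raise ValueError("need both positive and negative examples")
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def macro_ovr_auc(labels, score_matrix, classes):
    """One-vs-rest ROC AUC averaged over classes (macro average)."""
    aucs = []
    for k, c in enumerate(classes):
        binary = [1 if y == c else 0 for y in labels]
        aucs.append(roc_auc(binary, [row[k] for row in score_matrix]))
    return sum(aucs) / len(aucs)

# Toy example with three of the study's classes and made-up model scores.
classes = ["amusement", "disgust", "fear"]
labels = ["amusement", "disgust", "fear", "amusement", "fear"]
scores = [[0.8, 0.1, 0.1],
          [0.2, 0.6, 0.2],
          [0.1, 0.2, 0.7],
          [0.2, 0.3, 0.5],   # misclassified "amusement" sample
          [0.3, 0.3, 0.4]]
print(round(macro_ovr_auc(labels, scores, classes), 3))  # → 0.861
```

The rank-based formulation makes explicit why AUC suits this setting: it measures how well the model orders samples per emotion, independently of any decision threshold.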


Subjects
Cardiovascular System , Disgust , Emotions , Fear , Humans , Videotape Recording
4.
J Exp Psychol Gen; 149(11): 2154-2168, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32309988

ABSTRACT

Previous studies suggested that the 2016 presidential election gave rise to pathological levels of election-related distress in liberal Americans; however, it has also been suggested that public and professional discourse have increasingly overgeneralized concepts of trauma and psychopathology. In light of this, in the current research we utilized an array of big-data measures and asked whether a political loss in a participatory democracy can indeed lead to psychopathology. We observed that liberals report being more depressed when asked directly about the effects of the election; however, more indirect measures show a short-lived or nonexistent effect. We examined self-report measures of clinical depression with and without a reference to the election (Studies 1A & 1B), analyzed Twitter discourse and measured users' levels of depression using a machine-learning-based model (Study 2), conducted time-series analysis of depression-related search behavior on Google (Study 3), examined the proportion of antidepressant consumption in Medicaid data (Study 4), and analyzed daily surveys of hundreds of thousands of Americans (Study 5); at the aggregate level, the empirical data reject the accounts of "Trump Depression." We discuss possible interpretations of the discrepancies between the direct and indirect measures. The current investigation demonstrates how big-data sources can provide an unprecedented view of the psychological consequences of political events and sheds light on the complex relationship between the political and the personal spheres. (PsycInfo Database Record (c) 2020 APA, all rights reserved).


Subjects
Depression/psychology , Politics , Adolescent , Adult , Aged , Aged, 80 and over , Big Data , Female , Humans , Male , Middle Aged , Self Report , Surveys and Questionnaires , United States , Young Adult