1.
Sci Rep ; 13(1): 20190, 2023 11 18.
Article in English | MEDLINE | ID: mdl-37980446

ABSTRACT

The COVID-19 pandemic has led to a surge in video content consumption, but measuring viewers' empathy towards the content has been limited to subjective evaluations or attached physiological apparatus. In this study, we introduced a novel non-contact physiological method for measuring empathy towards video content by assessing the synchronization of facial micromovements between the subject and the object (i.e., person) within the media. We recorded facial micromovements and heart rate variability (HRV) remotely using a camera while 62 subjects watched one video each, designed and validated to elicit one of four two-dimensional emotions: pleasant-aroused, pleasant-relaxed, unpleasant-aroused, and unpleasant-relaxed. We also collected the subjects' self-assessed emotions and empathy using a questionnaire. The results confirmed that the stimuli effectively induced the intended arousal in the subjects, as evidenced by both self-reported emotions and HRV responses, which suggested that higher arousal was associated with stronger sympathetic nervous system activity. A closer examination of HRV indicators, such as SDNN and Total Power, showed an amplification during the unpleasant state. We interpret this as the body's dynamic response to stressors, underlining the autonomic nervous system's proactive role in responding to such stimuli. In a broader context, our results emphasized that while subjects showed heightened empathy during aroused conditions, the introduction of stressors, represented by unpleasant content, dampened this empathetic response. These findings demonstrate the potential of non-contact physiological methods for measuring empathy toward video content.
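For reference, the HRV indicators named above (SDNN and Total Power) are standard time- and frequency-domain measures computed from the series of intervals between successive heartbeats. A minimal sketch in Python, not the authors' code, assuming a 1-D array of RR intervals in milliseconds and the conventional 0.003-0.4 Hz total-power band:

```python
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def hrv_metrics(rr_ms):
    """Return SDNN (ms) and Total Power (ms^2) from RR intervals in ms."""
    rr_ms = np.asarray(rr_ms, dtype=float)
    sdnn = np.std(rr_ms, ddof=1)                  # SDNN: std of NN intervals

    # Resample the irregularly spaced RR series to a uniform 4 Hz grid,
    # a common convention before spectral HRV analysis.
    t = np.cumsum(rr_ms) / 1000.0                 # beat times in seconds
    fs = 4.0
    t_uniform = np.arange(t[0], t[-1], 1.0 / fs)
    rr_uniform = interp1d(t, rr_ms, kind="cubic")(t_uniform)

    f, pxx = welch(rr_uniform - rr_uniform.mean(), fs=fs,
                   nperseg=min(len(rr_uniform), 256))
    band = (f >= 0.003) & (f <= 0.4)              # conventional total-power band
    total_power = np.trapz(pxx[band], f[band])
    return sdnn, total_power
```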


Subject(s)
Empathy , Pandemics , Humans , Emotions/physiology , Arousal/physiology , Surveys and Questionnaires
2.
Sensors (Basel) ; 23(11)2023 May 29.
Article in English | MEDLINE | ID: mdl-37299888

ABSTRACT

In the era of user-generated content (UGC) and virtual interactions within the metaverse, empathic digital content has become increasingly important. This study aimed to quantify human empathy levels when people are exposed to digital media. To assess empathy, we analyzed brain wave activity and eye movements in response to emotional videos. Forty-seven participants watched eight emotional videos, and we collected their brain activity and eye movement data during viewing. After each video session, participants provided subjective evaluations. Our analysis focused on the relationship between brain activity and eye movement in recognizing empathy. The findings revealed the following: (1) Participants were more inclined to empathize with videos depicting pleasant-aroused and unpleasant-relaxed emotions. (2) Saccades and fixations, key components of eye movement, occurred simultaneously with activity in specific channels of the prefrontal and temporal lobes. (3) Eigenvalues of brain activity and pupil changes showed synchronization between the right pupil and certain channels in the prefrontal, parietal, and temporal lobes during empathic responses. These results suggest that eye movement characteristics can serve as an indicator of the cognitive empathic process when engaging with digital content. Furthermore, the observed changes in pupil size result from a combination of emotional and cognitive empathy elicited by the videos.
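One simple way to quantify the kind of pupil-EEG synchronization reported in finding (3) is a windowed Pearson correlation between the right-pupil diameter trace and a per-channel EEG band-power envelope. The sketch below is an illustration under that assumption, not the authors' pipeline; the signal names and window length are hypothetical.

```python
import numpy as np

def windowed_sync(pupil, eeg_power, fs, win_s=2.0):
    """Windowed Pearson correlation between a right-pupil diameter trace
    and one EEG channel's band-power envelope, both sampled at rate fs."""
    win = int(win_s * fs)
    n_windows = min(len(pupil), len(eeg_power)) // win
    corrs = []
    for i in range(n_windows):
        p = pupil[i * win:(i + 1) * win]
        e = eeg_power[i * win:(i + 1) * win]
        if np.std(p) > 0 and np.std(e) > 0:       # skip flat windows
            corrs.append(np.corrcoef(p, e)[0, 1])
    return np.asarray(corrs)  # higher mean |r| suggests stronger synchronization
```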


Subject(s)
Empathy , Eye Movements , Humans , Internet , Emotions/physiology , Temporal Lobe , Brain/physiology
3.
Sensors (Basel) ; 22(18)2022 Sep 06.
Article in English | MEDLINE | ID: mdl-36146082

ABSTRACT

Simultaneous activation of brain regions (i.e., brain connectivity features) is an essential mechanism of brain activity in emotion recognition of visual content. The occipital cortex is involved in visual processing, while the frontal lobe processes cranial nerve signals to control higher emotions. However, recognition of emotion in visual content also merits the analysis of eye movement features, because the pupils, iris, and other eye structures are connected to the nerves of the brain. We hypothesized that when viewing video content, the activation features of brain connections are significantly related to eye movement characteristics. We investigated the relationship between brain connectivity (strength and directionality) and eye movement features (left and right pupils, saccades, and fixations) while 47 participants viewed an emotion-eliciting video arranged on a two-dimensional emotion model (valence and arousal). We found that the connectivity eigenvalues of long-distance connections among the prefrontal lobe, temporal lobe, parietal lobe, and center are related to cognitive activity involving high valence. In addition, saccade movement was correlated with long-distance occipital-frontal connectivity. Finally, short-distance connectivity results showed emotional fluctuations caused by unconscious stimulation.
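The "connectivity eigenvalues" mentioned above can be illustrated by eigendecomposing a channel-by-channel connectivity matrix. A minimal sketch, assuming simple Pearson correlation as the connectivity estimate (the study's actual connectivity measure, including its directionality component, is not specified here):

```python
import numpy as np

def connectivity_eigenvalues(eeg):
    """eeg: array of shape (channels, samples). Build a channel-by-channel
    correlation matrix as a simple undirected connectivity estimate and
    return the matrix plus its eigenvalues, largest first."""
    conn = np.corrcoef(eeg)
    eigvals = np.linalg.eigvalsh(conn)[::-1]      # eigvalsh: real, ascending
    return conn, eigvals

# Hypothetical usage: relate the dominant eigenvalue per trial to that
# trial's mean saccade amplitude, echoing the brain-eye link reported above.
# r = np.corrcoef(dominant_eigvals, saccade_amplitudes)[0, 1]
```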


Subject(s)
Eye Movements , Magnetic Resonance Imaging , Brain , Brain Mapping , Emotions/physiology , Humans , Magnetic Resonance Imaging/methods , Parietal Lobe/physiology
4.
Sensors (Basel) ; 22(5)2022 Feb 22.
Article in English | MEDLINE | ID: mdl-35270846

ABSTRACT

The success of digital content depends largely on whether viewers empathize with its stories and narratives. Researchers have investigated the elements that may elicit empathy from viewers. The empathic response involves affective and cognitive processes and is expressed through multiple verbal and nonverbal modalities. Specifically, eye movements communicate emotions and intentions and may reflect empathic status. This study explores how eye movement features change when a viewer empathizes with a video's content. Seven eye movement feature variables (change of pupil diameter; peak pupil dilation; very short, mid-length, and overly long fixation durations; saccadic amplitude; and saccadic count) were extracted from 47 participants who viewed eight videos (four empathic and four non-empathic) distributed across a two-dimensional emotion axis (arousal and valence). The results showed that viewers' saccadic amplitude and peak pupil dilation, among the eye movement features, increased in the empathic condition. Fixation time and pupil size change showed limited significance, and whether there were asymmetric responses between the left and right pupils remained inconclusive. Our investigation suggests that saccadic amplitude and peak pupil dilation are reliable measures for recognizing whether viewers empathize with content. The findings provide physiological evidence, based on eye movements, that both affective and cognitive processes accompany empathy during media consumption.
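A feature-extraction sketch for the seven variables above, assuming pre-segmented gaze data; the fixation-duration thresholds (150 ms and 900 ms) and the 10% baseline window are illustrative assumptions, not values from the study:

```python
import numpy as np

def gaze_features(pupil_diam, fixations_ms, saccade_amps):
    """pupil_diam: pupil diameter over time (mm); fixations_ms: fixation
    durations (ms); saccade_amps: saccade amplitudes (deg)."""
    baseline = pupil_diam[:max(1, len(pupil_diam) // 10)].mean()
    return {
        "pupil_change": pupil_diam[-1] - pupil_diam[0],
        "peak_pupil_dilation": pupil_diam.max() - baseline,
        "fix_very_short": np.mean(fixations_ms < 150),   # proportion < 150 ms
        "fix_mid": np.mean((fixations_ms >= 150) & (fixations_ms <= 900)),
        "fix_over_long": np.mean(fixations_ms > 900),
        "saccadic_amplitude": saccade_amps.mean(),
        "saccadic_count": len(saccade_amps),
    }
```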


Subject(s)
Empathy , Saccades , Emotions , Eye Movements , Humans , Pupil/physiology
5.
Sensors (Basel) ; 21(23)2021 Nov 24.
Article in English | MEDLINE | ID: mdl-34883820

ABSTRACT

Tracking consumer empathy is one of the biggest challenges for advertisers. Although numerous studies have shown that consumers' empathy affects purchasing, there are few quantitative and unobtrusive methods for assessing whether a viewer shares congruent emotions with an advertisement. This study proposes a non-contact method for measuring empathy by evaluating the synchronization of micro-movements between consumers and the people within the media. Thirty participants viewed 24 advertisements classified as either empathic or non-empathic. For each viewing, we recorded facial data and subjective empathy scores. We recorded the facial micro-movements, which reflect ballistocardiographic (BCG) motion through the carotid artery, remotely using a camera without any sensor attached to the participant. Synchronization in cardiovascular measures (e.g., heart rate) is known to indicate higher levels of empathy. Through cross-entropy analysis, we found that the more similar the micro-movements between the participant and the person in the advertisement, the higher the participant's empathy scores for the advertisement. The study suggests that non-contact BCG methods can be utilized in cases where sensor attachment is infeasible (e.g., measuring empathy between a viewer and media content) and can complement subjective empathy scales.
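Cross-entropy between two micro-movement signals can be computed by discretizing each signal into a histogram over a shared range; the sketch below illustrates this under assumed bin counts and signal names and is not the authors' implementation.

```python
import numpy as np

def cross_entropy(viewer_motion, actor_motion, bins=32, eps=1e-12):
    """Cross-entropy H(p, q) between histogram distributions of the viewer's
    and the on-screen person's micro-movement amplitudes; lower values mean
    more similar (synchronized) movement."""
    lo = min(viewer_motion.min(), actor_motion.min())
    hi = max(viewer_motion.max(), actor_motion.max())
    p, _ = np.histogram(viewer_motion, bins=bins, range=(lo, hi))
    q, _ = np.histogram(actor_motion, bins=bins, range=(lo, hi))
    p = p / p.sum()
    q = q / q.sum()
    return float(-np.sum(p * np.log(q + eps)))
```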


Subject(s)
Ballistocardiography , Empathy , Emotions , Heart Rate , Humans , Movement
6.
Sensors (Basel) ; 21(21)2021 Oct 26.
Article in English | MEDLINE | ID: mdl-34770419

ABSTRACT

Watching videos online has become part of a relaxed lifestyle. The music in videos has a subtle influence on human emotions, perception, and imagination, and can make people feel relaxed, sad, and so on. It is therefore particularly important for people who make advertising videos to understand the relationship between the physical elements of music and empathy characteristics. The purpose of this paper is to analyze the music features in an advertising video and extract the music features that make people empathize. This paper combines two methods, the MFCC power spectrum and RGB analysis of spectrogram images, to find the audio feature vector. In the spectral analysis, the eigenvectors obtained range from blue (low range) to green (medium range) to red (high range). A random forest classifier is used to classify the extracted features, and the trained model is used to monitor an advertisement empathy system in real time. The optimal model achieved a training accuracy of 99.173% and a test accuracy of 86.171%, deemed correct by comparing the three models of audio feature value analysis. The contributions of this study can be summarized as follows: (1) low-frequency, high-amplitude audio in a video is more likely to elicit empathy than high-frequency, high-amplitude audio; (2) frequency and amplitude were found to be important attributes for describing waveforms, based on the characteristics of the machine learning classifier; (3) a new audio extraction method for inducing human empathy is proposed: the feature values extracted from spectrogram images of the audio have the greatest ability to arouse human empathy.
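A minimal sketch of the MFCC-plus-random-forest pipeline described above, assuming librosa for feature extraction and scikit-learn's RandomForestClassifier; file paths, labels, and hyperparameters are placeholders, and the spectrogram-image RGB features are omitted for brevity:

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def audio_features(path, n_mfcc=13):
    """Mean MFCC vector as a compact spectral descriptor of one audio clip."""
    y, sr = librosa.load(path, sr=22050)          # mono, resampled
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)                      # average over time frames

def train_empathy_classifier(paths, labels):
    """paths: audio files of ad clips; labels: 1 = empathic, 0 = non-empathic."""
    X = np.array([audio_features(p) for p in paths])
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2,
                                              random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_tr, y_tr)
    return clf, clf.score(X_te, y_te)             # model and test accuracy
```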


Subject(s)
Empathy , Music , Emotions , Humans , Image Processing, Computer-Assisted , Machine Learning
7.
Sensors (Basel) ; 21(20)2021 Oct 12.
Article in English | MEDLINE | ID: mdl-34695976

ABSTRACT

Remote sensing of vital signs has been developed to improve the measurement environment by using a camera instead of a skin-contact sensor. The camera-based method rests on two concepts: color and motion. The color-based method, remote photoplethysmography (RPPG), measures the color variation of the face generated by the reflectance of blood, whereas the motion-based method, remote ballistocardiography (RBCG), measures the subtle motion of the head generated by the heartbeat. The main challenge of remote sensing is overcoming the noise of illumination variance and motion artifacts. Studies on remote sensing have focused on blind source separation (BSS) of RGB colors or of the motions of multiple facial points to overcome this noise, but they have still been limited in real-world applications. This study hypothesized that combining colors and motions through BSS can improve the accuracy and feasibility of remote sensing in daily life. Thus, this study proposed a fusion method that estimates heart rate from RPPG and RBCG using BSS methods such as ensemble averaging (EA), principal component analysis (PCA), and independent component analysis (ICA). The proposed method was verified by comparing it with previous RPPG and RBCG methods on three datasets varying in illumination variance and motion artifacts. The three main contributions of this study are as follows: (1) the proposed method, based on both RPPG and RBCG, improved remote sensing by combining the benefits of each measurement; (2) the proposed method was demonstrated by comparison with previous methods; and (3) the proposed method was tested under various measurement conditions for more practical applications.
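A sketch of one such fusion, using ICA as the BSS step: stack the RGB color traces and the head-motion traces into one multichannel signal, unmix it, and select the component whose spectrum is most dominated by the heart-rate band. The function below is an illustration under those assumptions (FastICA from scikit-learn; the 0.75-2.5 Hz band follows common remote heart-rate practice), not the authors' exact method.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import FastICA

def fused_heart_rate(color_traces, motion_traces, fs):
    """color_traces: (3, n) mean RGB of the facial region; motion_traces:
    (k, n) vertical trajectories of facial points. Stack both modalities,
    unmix with ICA, and pick the component whose spectrum is most dominated
    by the 0.75-2.5 Hz heart-rate band."""
    X = np.vstack([color_traces, motion_traces]).astype(float)
    X = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)
    sources = FastICA(n_components=X.shape[0],
                      random_state=0).fit_transform(X.T).T

    best_bpm, best_ratio = None, -np.inf
    for s in sources:
        f, pxx = welch(s, fs=fs, nperseg=min(len(s), 512))
        band = (f >= 0.75) & (f <= 2.5)
        ratio = pxx[band].max() / pxx.sum()       # in-band peak dominance
        if ratio > best_ratio:
            best_ratio = ratio
            best_bpm = 60.0 * f[band][np.argmax(pxx[band])]
    return best_bpm                               # beats per minute
```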


Subject(s)
Ballistocardiography , Photoplethysmography , Algorithms , Artifacts , Heart Rate , Signal Processing, Computer-Assisted
8.
Sensors (Basel) ; 19(23)2019 Dec 02.
Article in English | MEDLINE | ID: mdl-31810275

ABSTRACT

Embodied emotion is associated with interaction among a person's physiological responses, behavioral patterns, and environmental factors. However, most methods for determining embodied emotion have considered only fragmentary independent variables, not their interconnections. This study suggests a method for determining embodied emotion that considers interactions among three life-logging factors: physiological response, behavioral pattern, and an environmental factor. The physiological response was analyzed through heart rate variability (HRV) variables. The behavioral pattern was calculated from features of Global Positioning System (GPS) locations that indicate spatiotemporal properties. The environmental factor was analyzed as ambient noise, an external stimulus. These data were mapped to the emotion reported at the time, evaluated on a seven-point scale for arousal level and valence level according to Russell's model of emotion. The data were collected from 79 participants in daily life over two weeks. After pre-processing, relationships among the data were analyzed by multiple regression analysis. As a result, significant differences in the arousal and valence levels of emotion were observed based on these relationships. The contributions of this study can be summarized as follows: (1) emotion was recognized in real life, allowing more practical application; (2) the interactions that determine the levels of arousal and positive emotion were distinguished by analyzing relationships among individuals' life-log data. Through this, it was verified that emotion can change according to the interaction among the three factors, which had been overlooked in previous emotion recognition research.
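A minimal sketch of such a multiple regression, assuming per-epoch arrays for an HRV variable, a GPS-derived mobility feature, and ambient noise level; statsmodels is used for the fit, and the explicit pairwise interaction terms reflect the study's focus on factor interactions (variable names are hypothetical):

```python
import numpy as np
import statsmodels.api as sm

def fit_arousal_model(hrv, mobility, noise_db, arousal):
    """Regress 7-point arousal ratings on three life-log factors and their
    pairwise interactions; each argument is a 1-D array per logged epoch."""
    X = np.column_stack([hrv, mobility, noise_db,
                         hrv * mobility, hrv * noise_db, mobility * noise_db])
    X = sm.add_constant(X)                        # intercept term
    model = sm.OLS(arousal, X).fit()
    return model                                  # model.summary() lists p-values
```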


Subject(s)
Emotions/physiology , Female , Geographic Information Systems , Humans , Male , Photoplethysmography
9.
Sensors (Basel) ; 19(15)2019 Jul 24.
Article in English | MEDLINE | ID: mdl-31344939

ABSTRACT

With the development of vision-based measurement, heart rate can be measured comfortably using a camera, without skin contact. Despite its potential, vision-based measurement is still limited by the noise of illumination variance and motion artifacts. Remote ballistocardiography (BCG) estimates heart rate from the ballistocardiographic head movements generated by the flow of blood through the carotid arteries. It is robust to illumination variance but remains limited by motion artifacts such as facial expressions and voluntary head motions. Recent studies on remote BCG focus on improving signal extraction by minimizing motion artifacts; they simply estimate the heart rate from the cardiac signal using peak detection and the fast Fourier transform (FFT). However, heart rate estimation based on peak detection and FFT depends on robust signal estimation, so if the cardiac signal is contaminated with noise, the heart rate cannot be estimated accurately. This study aimed to develop a novel method to improve heart rate estimation from ballistocardiographic head movements using unsupervised clustering. First, ballistocardiographic head movements were measured from facial video by detecting facial points with the good-features-to-track (GFTT) algorithm and tracking them with the Kanade-Lucas-Tomasi (KLT) tracker. Second, the cardiac signal was extracted from the head movements by a bandpass filter and principal component analysis (PCA), and the relative power density (RPD) was extracted from its power spectrum between 0.75 Hz and 2.5 Hz. Third, unsupervised clustering was performed to construct a model that estimates heart rate from the RPD, using a dataset consisting of the RPD and the heart rate measured by electrocardiogram (ECG). Finally, the heart rate was estimated from the RPD using the model. The proposed method was verified by comparing it with previous methods using peak detection and the FFT. As a result, the proposed method estimated heart rate more accurately than previous methods in three experiments with different levels of motion artifacts (facial expressions and voluntary head motions). The four main contributions are as follows: (1) unsupervised clustering improved heart rate estimation by overcoming motion artifacts (i.e., facial expressions and voluntary head motions); (2) the proposed method was verified by comparison with previous methods using peak detection and the FFT; (3) the proposed method can be combined with existing vision-based measurements to improve their performance; (4) the proposed method was tested in three experiments in a realistic environment including motion artifacts, increasing the possibility of non-contact measurement in daily life.
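The first step, GFTT detection plus KLT tracking, can be sketched with OpenCV as below; cv2.goodFeaturesToTrack and cv2.calcOpticalFlowPyrLK are the standard OpenCV APIs for these algorithms, while the parameter values and the restriction to vertical coordinates are illustrative assumptions. The subsequent bandpass filtering, PCA, and RPD clustering steps are omitted.

```python
import cv2
import numpy as np

def track_head_motion(video_path, max_pts=50):
    """Detect facial-region points with GFTT and track them frame-to-frame
    with the KLT tracker, returning vertical trajectories as the raw
    ballistocardiographic signal (shape: points x frames)."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=max_pts,
                                  qualityLevel=0.01, minDistance=7)
    traj = [pts[:, 0, 1].copy()]                  # y-coordinates only
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev, gray, pts, None)
        traj.append(nxt[:, 0, 1].copy())          # a full implementation would
        prev, pts = gray, nxt                     # drop points with status == 0
    cap.release()
    return np.array(traj).T
```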


Subject(s)
Ballistocardiography , Head Movements/physiology , Heart Rate/physiology , Movement/physiology , Electrocardiography , Humans , Signal Processing, Computer-Assisted