1.
J Biomol Struct Dyn; 1-12, 2023 Sep 18.
Article in English | MEDLINE | ID: mdl-37723894

ABSTRACT

Determining the structure-odor relationship has long been a challenging task. The main difficulty in investigating the correlation between a molecular structure and its associated odor is the ambiguous and obscure nature of verbally defined odor descriptors, particularly when the odorant molecules come from different sources. With recent developments in machine learning (ML), ML and data-analytic techniques are increasingly used for quantitative structure-activity relationship (QSAR) modelling in chemistry, enabling knowledge discovery where traditional Edisonian (trial-and-error) methods have not been useful. Predicting the smell of odorant molecules is one such task, as olfaction remains one of the least understood senses. In this study, an XGBoost odor-prediction model was built to classify the smells of odorant molecules from their SMILES strings. We first collected a dataset of 1278 odorant molecules annotated with seven basic odor descriptors and then calculated 1875 physicochemical properties for each molecule. To retain only the relevant physicochemical features, principal component analysis (PCA) was employed for feature reduction. The ML model developed in this study predicted all seven basic smells with high precision (>99%) and high sensitivity (>99%) on an independent test dataset. The results were also compared with three recent studies and indicate that the XGBoost-PCA model outperformed the other models in predicting common odor descriptors. The methodology and ML model developed in this study may help in understanding the structure-odor relationship.
Communicated by Ramaswamy H. Sarma.
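
The pipeline described above can be approximated in a few lines. The sketch below is a hypothetical reconstruction, not the authors' code: it substitutes RDKit's built-in descriptor list for the 1875 properties computed in the paper, assumes every SMILES string parses and carries a single odor-class label, and uses illustrative XGBoost hyperparameters.

```python
# Sketch: SMILES -> physicochemical descriptors -> PCA -> XGBoost classifier.
import numpy as np
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

def featurize(smiles_list):
    """Compute RDKit physicochemical descriptors per molecule
    (a stand-in for the 1875 properties used in the paper);
    assumes every SMILES string parses successfully."""
    rows = []
    for smi in smiles_list:
        mol = Chem.MolFromSmiles(smi)
        rows.append([fn(mol) for _, fn in Descriptors.descList])
    return np.array(rows)

# smiles: list of SMILES strings; y: integer odor labels 0..6 (hypothetical)
X = featurize(smiles)
X_red = PCA(n_components=0.95).fit_transform(X)  # keep 95% of the variance

X_tr, X_te, y_tr, y_te = train_test_split(X_red, y, test_size=0.2,
                                          random_state=42, stratify=y)
clf = XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```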

2.
Brain Sci; 12(8), 2022 Aug 19.
Article in English | MEDLINE | ID: mdl-36009166

ABSTRACT

While naturalistic stimuli, such as movies, better represent the complexity of the real world and are perhaps crucial to understanding the dynamics of emotion processing, research on emotion with naturalistic stimuli remains limited. In particular, we need to understand the temporal dynamics of emotion processing, their relationship to different dimensions of emotional experience, and the functional-connectivity dynamics that unfold during, or just before, such experiences. To address these questions, we recorded participants' EEG and asked them to mark the temporal location of their emotional experiences while they watched videos. We also obtained self-assessment ratings for the emotional multimedia stimuli. We calculated dynamic functional connectivity (DFC) patterns in all frequency bands, including information about hubs in the network. The change in functional networks was quantified as temporal variability, which was then used in regression analysis to evaluate whether the temporal variability in DFC (tvDFC) could predict different dimensions of emotional experience. We observed that connectivity patterns in the upper beta band differentiated emotion categories better during, or prior to, the reported emotional experience. The temporal variability in functional connectivity dynamics was related primarily to emotional arousal, followed by dominance. Hubs in the functional networks were found across the right frontal and bilateral parietal lobes, regions that have been reported to facilitate affect, interoception, action, and memory-related processing. Because our study used naturalistic emotional videos resembling real life, it contributes significantly to understanding the dynamics of emotion processing. The results support constructivist theories of emotional experience and show that changes in dynamic functional connectivity can predict aspects of our emotional experience.
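
As a rough illustration of how tvDFC could be computed and related to emotion ratings, the sketch below uses sliding-window correlations of band-filtered EEG as a stand-in connectivity measure; the band limits, window parameters, and variable names (trials, arousal) are assumptions rather than the authors' exact pipeline.

```python
# Sketch: sliding-window connectivity per band -> temporal variability ->
# regression of tvDFC against arousal ratings.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LinearRegression

def bandpass(data, lo, hi, fs, order=4):
    """Zero-phase band-pass filter along the last (time) axis."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

def tv_dfc(eeg, fs, band=(20.0, 30.0), win=2.0, step=0.5):
    """eeg: (n_channels, n_samples). Correlate channels in sliding windows,
    then average the across-window std of every edge (one tvDFC score).
    The upper-beta band limits here are an assumption."""
    x = bandpass(eeg, band[0], band[1], fs)
    w, s = int(win * fs), int(step * fs)
    mats = [np.corrcoef(x[:, i:i + w])
            for i in range(0, x.shape[1] - w + 1, s)]
    mats = np.array(mats)                    # (n_windows, n_ch, n_ch)
    iu = np.triu_indices(x.shape[0], k=1)    # unique edges only
    return mats[:, iu[0], iu[1]].std(axis=0).mean()

# trials: iterable of (n_channels, n_samples) arrays; arousal: ratings
tv = np.array([tv_dfc(trial, fs=128) for trial in trials]).reshape(-1, 1)
model = LinearRegression().fit(tv, arousal)
print("R^2:", model.score(tv, arousal))
```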

3.
Brain Sci; 12(6), 2022 May 29.
Article in English | MEDLINE | ID: mdl-35741588

ABSTRACT

Our brain continuously interacts with the body as we engage with the world. Although we are mostly unaware of internal bodily processes, such as our heartbeats, they may be influenced by, and in turn influence, our perception and emotional feelings. While cardiac interoceptive activity and its interaction with brain activity during emotion processing have recently received attention, cardiac-brain interactions have rarely been investigated with more ecologically valid, naturalistic emotional stimuli. We also do not understand how an essential aspect of emotion, context familiarity, influences affective feelings and relates to the statistical interaction between cardiac and brain activity. Hence, to answer these questions, we designed an exploratory study in which ECG and EEG signals were recorded during emotional events while participants watched emotional movie clips. Participants also rated how familiar each stimulus felt on a familiarity scale. Linear mixed-effects modelling was performed with ECG power and familiarity as predictors of EEG power. We focused on three brain regions: prefrontal (PF), frontocentral (FC), and parieto-occipital (PO). The analyses showed that the interaction between the power of cardiac activity in the mid-frequency range and the power in specific EEG bands depends on familiarity, such that the interaction is stronger with high familiarity. In addition, the results indicate that arousal is predicted by the cardiac-brain interaction, which also depends on familiarity. The results support emotional theories that emphasize context dependency and interoception. Multimodal studies with more realistic stimuli would further enable us to understand and predict different aspects of emotional experience.
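
A minimal sketch of such a mixed-effects analysis follows, assuming hypothetical column names (eeg_power, ecg_power, familiarity, subject) and a random intercept per participant; the authors' exact model specification may differ.

```python
# Sketch: EEG power ~ ECG power x familiarity, random intercept per subject.
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per emotional event, with (hypothetical) columns:
#   eeg_power    - EEG band power in a region of interest (e.g., PF)
#   ecg_power    - mid-frequency cardiac (ECG) power
#   familiarity  - self-reported familiarity rating
#   subject      - participant ID (random-effects grouping factor)
df = pd.read_csv("events.csv")

# Fixed effects: ECG power, familiarity, and their interaction;
# random intercept per subject accounts for repeated measures.
model = smf.mixedlm("eeg_power ~ ecg_power * familiarity",
                    data=df, groups=df["subject"])
result = model.fit()
print(result.summary())  # the ecg_power:familiarity coefficient tests the
                         # familiarity-dependent cardiac-brain interaction
```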

4.
Neuroimage; 102(Pt 1): 162-72, 2014 Nov 15.
Article in English | MEDLINE | ID: mdl-24269801

ABSTRACT

The purpose of this paper is twofold: (i) to investigate emotion representation models and establish the possibility of a model with a minimal number of continuous dimensions, and (ii) to recognize and predict emotion from measured physiological signals using a multiresolution approach. The multimodal physiological signals are electroencephalogram (EEG) (32 channels) and peripheral signals (8 channels: galvanic skin response (GSR), blood volume pressure, respiration pattern, skin temperature, electromyogram (EMG), and electrooculogram (EOG)), as given in the DEAP database. We discuss theories of emotion modelling based on (i) basic emotions, (ii) the cognitive appraisal and physiological response approach, and (iii) the dimensional approach, and we propose a three-dimensional continuous representation model for emotions. A clustering experiment on the given valence, arousal, and dominance values of various emotions was performed to validate the proposed model. A novel approach for multimodal fusion of information from a large number of channels to classify and predict emotions is also proposed. The discrete wavelet transform, a classical transform for multiresolution signal analysis, is used in this study. Experiments were performed to classify different emotions with four classifiers. The average accuracies are 81.45%, 74.37%, 57.74%, and 75.94% for the SVM, MLP, KNN, and MMC classifiers, respectively. The best accuracy is for 'Depressing', at 85.46% using SVM. The 32 EEG channels are treated as independent modes, and features from each channel are given equal importance; some channel data may be correlated, yet they may still contain complementary information. In comparison with results reported by others, the high accuracy of 85% with 13 emotions and 32 subjects clearly demonstrates the potential of our multimodal fusion approach.
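
A simplified sketch of the multiresolution feature-extraction and classification step is shown below, using PyWavelets for the discrete wavelet transform and an SVM from scikit-learn; the wavelet family, decomposition level, sub-band statistics, and variable names (trials, labels) are illustrative assumptions rather than the paper's settings.

```python
# Sketch: DWT sub-band features per channel, concatenated with equal
# weight across channels (the fusion step), then SVM classification.
import numpy as np
import pywt
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def dwt_features(signal, wavelet="db4", level=4):
    """Summarize each multiresolution sub-band with simple statistics
    (mean absolute value, standard deviation, energy)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs:  # approximation followed by detail coefficients
        feats += [np.mean(np.abs(c)), np.std(c), np.sum(c ** 2)]
    return feats

# trials: (n_trials, n_channels, n_samples); labels: emotion classes.
# Every channel is treated as an independent mode with equal importance.
X = np.array([[f for ch in trial for f in dwt_features(ch)]
              for trial in trials])
clf = SVC(kernel="rbf", C=1.0)
print("mean CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```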


Subject(s)
Emotions/classification; Emotions/physiology; Models, Neurological; Electrophysiological Phenomena; Humans; Wavelet Analysis