1.
Article in English | MEDLINE | ID: mdl-38082676

ABSTRACT

Raising awareness of environmental challenges is an important issue for researchers and scientists. As public opinion remains ambiguous, implicit attitudes toward climate change must be investigated. A custom Single-Category Implicit Association Test (SC-IAT), a variant of the Implicit Association Test, was developed to assess climate change beliefs. It was administered to 20 subjects while their eye movements were tracked with a smart-glasses system. Eye gaze patterns were analysed to understand whether they could reflect implicit attitudes toward nature. Recurrence Quantification Analysis was applied to extract 13 features from the eye-tracking data, which were then used for statistical analysis. Significant differences were found between target stimuli (words related to climate change) and bad attributes in reaction time, and between target stimuli and good attributes in diagonal length entropy, suggesting that eye tracking may provide an alternative source of information to electroencephalography for modeling and predicting implicit attitudes.


Subject(s)
Attitude; Eye-Tracking Technology; Humans; Eye Movements; Fixation, Ocular; Reaction Time
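
As an illustration of the method described in the abstract above, the following sketch shows how recurrence-based gaze features such as recurrence rate, determinism, and diagonal length entropy could be computed with NumPy. It is not the authors' pipeline: the recurrence radius of 0.05, the minimum diagonal length of 2, and the simulated gaze samples are assumptions made purely for this example.

"""
Minimal sketch (not the authors' code) of extracting a few Recurrence
Quantification Analysis (RQA) features from a gaze trajectory.
"""
import numpy as np


def recurrence_matrix(gaze_xy: np.ndarray, radius: float) -> np.ndarray:
    """Binary recurrence matrix: 1 where two gaze samples are closer than `radius`."""
    dists = np.linalg.norm(gaze_xy[:, None, :] - gaze_xy[None, :, :], axis=-1)
    return (dists <= radius).astype(int)


def diagonal_lengths(rm: np.ndarray, min_len: int = 2) -> np.ndarray:
    """Lengths of diagonal line segments above the main diagonal."""
    n = rm.shape[0]
    lengths = []
    for k in range(1, n):                      # upper-triangle diagonals
        run = 0
        for v in np.append(np.diag(rm, k), 0):  # sentinel 0 closes the last run
            if v:
                run += 1
            elif run >= min_len:
                lengths.append(run)
                run = 0
            else:
                run = 0
    return np.array(lengths)


def rqa_features(gaze_xy: np.ndarray, radius: float = 0.05) -> dict:
    rm = recurrence_matrix(gaze_xy, radius)
    n = rm.shape[0]
    off_diag = rm.sum() - n                    # recurrent points excluding self-matches
    rec_rate = off_diag / (n * (n - 1))
    lens = diagonal_lengths(rm)
    determinism = lens.sum() / (off_diag / 2) if off_diag else 0.0
    # Shannon entropy of the diagonal line length distribution ("diagonal length entropy")
    if lens.size:
        _, counts = np.unique(lens, return_counts=True)
        p = counts / counts.sum()
        entropy = float(-(p * np.log(p)).sum())
    else:
        entropy = 0.0
    return {"recurrence_rate": rec_rate, "determinism": determinism,
            "diag_length_entropy": entropy}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gaze = rng.random((300, 2))                # stand-in for normalized (x, y) gaze samples
    print(rqa_features(gaze))

In a real analysis, gaze would hold the recorded gaze coordinates for one trial, and the resulting features would feed the statistical comparisons between stimulus categories.
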
2.
Article in English | MEDLINE | ID: mdl-38082982

ABSTRACT

This work reports on physiological electroencephalographic (EEG) correlates of cognitive and emotional processes during the discrimination between synthetic and real face visual stimuli. Human perception of manipulated data has been addressed in the literature from several perspectives. Researchers have investigated how deepfakes alter people's performance in face-processing tasks such as face recognition. Although recent studies have shown that humans, on average, can still correctly recognize synthetic faces, this study investigates whether those findings still hold given the latest advances in AI-based synthetic image generation. Specifically, 18-channel EEG signals from 21 healthy subjects were analyzed during a visual experiment in which synthetic and real emotional stimuli were administered. Consistent with recent literature, participants were able to discriminate real faces from synthetic ones, correctly classifying about 77% of all images. Preliminary, encouraging results showed statistically significant differences in brain activation for both stimulus (synthetic vs. real) classification and emotional response.


Subject(s)
Emotions; Recognition, Psychology; Humans; Recognition, Psychology/physiology; Emotions/physiology; Brain/physiology; Electroencephalography; Brain Mapping
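
To make the kind of comparison described in the abstract above concrete, here is a minimal sketch of a per-subject band-power contrast between real-face and synthetic-face trials, followed by a paired t-test. It is not the authors' analysis: the 250 Hz sampling rate, the 8-13 Hz alpha band, Welch's method, and the simulated data are assumptions chosen only for illustration.

"""
Minimal sketch (assumed pipeline, not the authors' code): per-subject EEG band
power for "real" vs "synthetic" face trials, compared with a paired t-test.
"""
import numpy as np
from scipy.signal import welch
from scipy.stats import ttest_rel

FS = 250            # sampling rate in Hz (assumption)
BAND = (8.0, 13.0)  # alpha band in Hz (assumption)


def band_power(trial: np.ndarray, fs: int, band: tuple) -> float:
    """Mean Welch power in `band`, averaged over channels, for one trial (channels x samples)."""
    freqs, psd = welch(trial, fs=fs, nperseg=fs)     # psd shape: channels x frequencies
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(psd[:, mask].mean())


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Stand-in data: 21 subjects x 18 channels x 2 s of EEG per condition
    real = rng.standard_normal((21, 18, 2 * FS))
    synthetic = rng.standard_normal((21, 18, 2 * FS))

    power_real = np.array([band_power(t, FS, BAND) for t in real])
    power_synth = np.array([band_power(t, FS, BAND) for t in synthetic])

    res = ttest_rel(power_real, power_synth)         # paired comparison across subjects
    print(f"paired t = {res.statistic:.2f}, p = {res.pvalue:.3f}")

The same pattern can be repeated per channel or per band, with an appropriate correction for multiple comparisons, to localize where the real-versus-synthetic differences emerge.
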
3.
Brain Sci. 2023 Aug 23;13(9).
Article in English | MEDLINE | ID: mdl-37759834

ABSTRACT

The human brain's role in face processing (FP) and decision making for social interactions depends on recognizing faces accurately. However, the prevalence of deepfakes (AI-generated images) poses challenges in discerning real from synthetic identities. This study investigated healthy individuals' cognitive and emotional engagement in a visual discrimination task involving real and deepfake human faces expressing positive, negative, or neutral emotions. Electroencephalographic (EEG) data were collected from 23 healthy participants using a 21-channel dry-EEG headset; power spectrum and event-related potential (ERP) analyses were performed. Results revealed statistically significant activations in specific brain areas depending on the authenticity and emotional content of the stimuli. Power spectrum analysis highlighted a right-hemisphere predominance in the theta, alpha, high-beta, and gamma bands for real faces, while deepfakes mainly affected the frontal and occipital areas in the delta band. ERP analysis hinted at the possibility of discriminating between real and synthetic faces, as the N250 (200-300 ms after stimulus onset) peak latency decreased when observing real faces in the right frontal (LF) and left temporo-occipital (LTO) areas, and also between emotions, as the P100 (90-140 ms) peak amplitude was higher in the right temporo-occipital (RTO) area for happy faces than for neutral and sad ones.
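
For readers unfamiliar with the ERP measures named above, the sketch below extracts a P100 peak amplitude (90-140 ms) and an N250 peak latency (200-300 ms) from a trial-averaged waveform. It is not the authors' code: the 500 Hz sampling rate, the -0.2 s epoch start, and the simulated epochs are assumptions for the example; real data would come from the dry-EEG recordings.

"""
Minimal sketch (assumed analysis, not the authors' code): P100 peak amplitude
and N250 peak latency from a trial-averaged ERP waveform.
"""
import numpy as np

FS = 500      # sampling rate in Hz (assumption)
TMIN = -0.2   # epoch starts 200 ms before stimulus onset (assumption)


def window_indices(t_start: float, t_end: float, fs: int, tmin: float) -> slice:
    """Convert a latency window in seconds to sample indices within the epoch."""
    return slice(int((t_start - tmin) * fs), int((t_end - tmin) * fs))


def p100_amplitude(erp: np.ndarray) -> float:
    """Maximum (positive) amplitude in the 90-140 ms window."""
    return float(erp[window_indices(0.090, 0.140, FS, TMIN)].max())


def n250_latency(erp: np.ndarray) -> float:
    """Latency in seconds of the most negative point in the 200-300 ms window."""
    win = window_indices(0.200, 0.300, FS, TMIN)
    idx = int(np.argmin(erp[win])) + win.start
    return idx / FS + TMIN


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # Stand-in: 60 epochs x 350 samples (-0.2 to 0.5 s) for one channel,
    # e.g. a temporo-occipital electrode; values are arbitrary units.
    epochs = rng.standard_normal((60, int(0.7 * FS)))
    erp = epochs.mean(axis=0)                  # average across trials
    print(f"P100 amplitude: {p100_amplitude(erp):.3f} (a.u.)")
    print(f"N250 latency:   {n250_latency(erp) * 1000:.0f} ms")

Comparing these per-subject measures across conditions (real vs. deepfake, or happy vs. neutral vs. sad) would then proceed with standard paired statistics, as in the previous sketch.
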
