Results 1 - 3 of 3
1.
J Autism Dev Disord ; 52(1): 73-88, 2022 Jan.
Article in English | MEDLINE | ID: mdl-33638804

ABSTRACT

This study examined social-pragmatic inferencing, visual social attention, and physiological reactivity to complex social scenes. Participants were autistic young adults (n = 14) and a control group of young adults (n = 14) without intellectual disability. Results indicate between-group differences in social-pragmatic inferencing, moment-level social attention, and heart rate variability (HRV) reactivity. A key finding suggests associations between increased moment-level social attention to facial emotion expressions, better social-pragmatic inferencing, and greater HRV suppression in autistic young adults. Supporting previous research, better social-pragmatic inferencing was associated with fewer autistic traits.


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Attention , Emotions , Facial Expression , Humans , Young Adult
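The abstract above reports HRV reactivity and "HRV suppression" during the task. As a minimal sketch of how such a comparison is typically quantified, the snippet below computes RMSSD, a common time-domain HRV index, from RR intervals at baseline and during a task; the specific metric, variable names, and interval values are illustrative assumptions, not details taken from the study.

```python
# Hypothetical sketch: RMSSD (root mean square of successive differences)
# as a time-domain HRV index. "HRV suppression" means HRV is lower during
# the task than at rest. All values below are made up for illustration.
import math

def rmssd(rr_intervals_ms):
    """RMSSD over a series of RR intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

baseline = [800, 810, 790, 805, 795, 812]  # resting RR intervals (ms)
task = [780, 782, 779, 781, 780, 783]      # RR intervals while viewing scenes

# Suppression shows up as a drop in the index relative to baseline
print(rmssd(baseline) > rmssd(task))  # True: variability decreases under load
```

In practice the interval series comes from R-peak detection on an ECG trace, and frequency-domain indices (e.g. high-frequency power) are often reported alongside RMSSD.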
3.
PLoS One ; 10(9): e0138198, 2015.
Article in English | MEDLINE | ID: mdl-26407322

ABSTRACT

Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks, such as video genre classification and content-based image retrieval. Recently, there has been increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene, such as its valence. In order to determine the emotional category of images using eye movements, existing methods often learn a classifier using several features extracted from eye movements. Although it has been shown that eye movement is potentially useful for recognition of scene valence, the contribution of each feature is not well studied. To address this issue, we study the contribution of features extracted from eye movements to the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We utilize a machine learning approach to analyze the performance of the features, learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that 'saliency map', 'fixation histogram', 'histogram of fixation duration', and 'histogram of saccade slope' are the most contributing features. The selected features signify the influence of fixation information and the angular behavior of eye movements in the recognition of the valence of images.


Subject(s)
Emotions/physiology , Eye Movements/physiology , Recognition, Psychology , Visual Perception/physiology , Attention/physiology , Female , Fixation, Ocular/physiology , Humans , Male , Observer Variation , Photic Stimulation , Recognition, Psychology/physiology , User-Computer Interface
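The abstract describes building histogram features from eye movements and fusing them before training an SVM. As a sketch of the early-fusion step only, the snippet below computes two per-image saccade histograms and concatenates them into a single descriptor; the bin counts, value ranges, and function names are illustrative assumptions rather than the paper's actual settings.

```python
# Sketch of early feature fusion for valence classification from eye movements:
# per-image histogram features are concatenated into one vector that a
# classifier (e.g. an SVM) can consume. Bin counts and ranges are assumptions.
import numpy as np

def saccade_features(amplitudes_deg, orientations_rad, bins=8):
    """Two histogram features for one image: saccade length and orientation."""
    h_len, _ = np.histogram(amplitudes_deg, bins=bins, range=(0, 20))
    h_ori, _ = np.histogram(orientations_rad, bins=bins, range=(-np.pi, np.pi))
    # Normalize so images with different saccade counts are comparable
    h_len = h_len / max(h_len.sum(), 1)
    h_ori = h_ori / max(h_ori.sum(), 1)
    return h_len, h_ori

def fuse(feature_sets):
    """Early fusion: concatenate the per-feature vectors into one descriptor."""
    return np.concatenate(feature_sets)

amps = [1.5, 3.0, 7.2, 2.1]   # saccade amplitudes (degrees), made-up data
oris = [0.1, -1.2, 2.8, 0.5]  # saccade directions (radians), made-up data
descriptor = fuse(saccade_features(amps, oris))
print(descriptor.shape)  # (16,): one fused feature vector per image
```

With one such descriptor per labeled image, a standard SVM (e.g. `sklearn.svm.SVC`) can then be trained on the fused vectors; comparing classifiers trained on individual features versus fused ones is how per-feature contribution is typically assessed.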