Results 1 - 4 of 4
1.
PLoS One; 17(10): e0275281, 2022.
Article in English | MEDLINE | ID: mdl-36301975

ABSTRACT

The study of gaze perception has largely focused on a single cue (the eyes) in two-dimensional settings. While this literature suggests that 2D gaze perception is shaped by atypical development, as in Autism Spectrum Disorder (ASD), gaze perception is in reality contextually sensitive, perceived as an emergent feature conveyed by the rotation of both the pupils and the head. We examined gaze perception in this integrative context, across development, among children and adolescents who were typically developing or had ASD, using both 2D and 3D stimuli. We found that both groups used head and pupil rotations to judge gaze on a 2D face. When evaluating the gaze of a physically present, 3D robot, however, the same observers with ASD used eye cues less than their typically developing peers. This demonstrates that emergent gaze perception is a slowly developing process that is surprisingly intact, albeit weakened, in ASD, and illustrates how new technology can bridge visual and clinical science.


Subject(s)
Autism Spectrum Disorder; Child; Adolescent; Humans; Fixation, Ocular; Pupil; Cues; Perception
2.
Front Robot AI; 9: 965369, 2022.
Article in English | MEDLINE | ID: mdl-35880215

ABSTRACT

[This corrects the article DOI: 10.3389/frobt.2022.855819.].

3.
Front Robot AI; 9: 855819, 2022.
Article in English | MEDLINE | ID: mdl-35677082

ABSTRACT

Children with Autism Spectrum Disorder (ASD) experience deficits in verbal and nonverbal communication skills, including motor control, turn-taking, and emotion recognition. Innovative technology, such as socially assistive robots, has been shown to be a viable method for autism therapy. This paper presents a novel robot-based music-therapy platform for modeling and improving the social responses and behaviors of children with ASD. Our autonomous social interactive system consists of three modules. Module one provides an autonomous positioning system for the robot, NAO, to properly localize and play the instrument (a xylophone) using its arms. Module two allows NAO to play customized songs composed by individual users. Module three provides a real-life music-therapy experience to the users. We adopted the Short-Time Fourier Transform and the Levenshtein distance to fulfill the design requirements of 1) "music detection" and 2) "smart scoring and feedback", which allow NAO to understand the music being played and to provide additional practice and oral feedback to the users as applicable. We designed and implemented six Human-Robot Interaction (HRI) sessions, including four intervention sessions. Nine children with ASD and seven typically developing children participated in a total of fifty HRI experimental sessions. Using our platform, we collected and analyzed data on social behavioral changes and on emotion recognition from Electrodermal Activity (EDA) signals. The results of our experiments demonstrate that most of the participants were able to complete motor-control tasks with 70% accuracy. Six of the nine participants with ASD showed stable turn-taking behavior when playing music. The results of automated emotion classification using Support Vector Machines show that emotional arousal in the ASD group can be detected and recognized well from EDA bio-signals. In summary, the results of our data analyses, including emotion classification from EDA signals, indicate that the proposed robot-based music-therapy platform is an attractive and promising assistive tool for improving fine motor control and turn-taking skills in children with ASD.
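
For readers who want a concrete sense of how the "music detection" and "smart scoring and feedback" requirements described in this abstract might be realized, the sketch below (Python; not the authors' published code) maps the dominant STFT frequency of each audio frame to the nearest xylophone note and scores the played sequence against a target song using a Levenshtein distance. The note table, power threshold, and the score_performance helper are illustrative assumptions, not details taken from the paper.

# Illustrative sketch: STFT-based note detection plus Levenshtein-distance
# scoring, under assumed note frequencies and thresholds.
import numpy as np
from scipy.signal import stft

# Hypothetical xylophone bar frequencies (Hz) for one C-major octave.
NOTE_FREQS = {"C": 261.6, "D": 293.7, "E": 329.6, "F": 349.2,
              "G": 392.0, "A": 440.0, "B": 493.9, "C5": 523.3}

def detect_notes(audio, fs, frame_len=2048, min_power=1e-3):
    """Map each STFT frame's dominant frequency to the nearest note name."""
    freqs, _, Zxx = stft(audio, fs=fs, nperseg=frame_len)
    power = np.abs(Zxx)
    notes = []
    for frame in power.T:
        if frame.max() < min_power:          # treat quiet frames as silence
            continue
        peak_freq = freqs[int(np.argmax(frame))]
        name = min(NOTE_FREQS, key=lambda n: abs(NOTE_FREQS[n] - peak_freq))
        if not notes or notes[-1] != name:   # collapse repeated frames
            notes.append(name)
    return notes

def levenshtein(a, b):
    """Dynamic-programming edit distance between two note sequences."""
    dp = np.arange(len(b) + 1)
    for i, ca in enumerate(a, start=1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, start=1):
            prev, dp[j] = dp[j], min(dp[j] + 1,          # deletion
                                     dp[j - 1] + 1,      # insertion
                                     prev + (ca != cb))  # substitution
    return int(dp[-1])

def score_performance(played, target):
    """Return a 0-1 similarity score that could drive practice feedback."""
    dist = levenshtein(played, target)
    return 1.0 - dist / max(len(played), len(target), 1)

# Example: a child plays the opening of "Twinkle Twinkle" with one wrong bar.
target = ["C", "C", "G", "G", "A", "A", "G"]
played = ["C", "C", "G", "G", "B", "A", "G"]
print(score_performance(played, target))   # ~0.86 -> could trigger extra practice

A score near 1.0 would indicate a close match to the target song, while lower scores could prompt the robot to repeat the passage or give oral feedback, in the spirit of the "smart scoring and feedback" module the abstract describes.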

4.
Dev Sci; 23(2): e12886, 2020 Mar.
Article in English | MEDLINE | ID: mdl-31271685

ABSTRACT

Gaze is an emergent visual feature. A person's gaze direction is perceived not just from the rotation of their eyes but also from the rotation of their head. At least among adults, this integrative process appears to be flexible, such that one feature can be weighted more heavily than the other depending on the circumstances. Yet it is unclear how this weighting might vary across individuals or across development. When children perceive emergent gaze, do they prioritize cues from the head and eyes similarly to adults? Is the perception of gaze among individuals with autism spectrum disorder (ASD) emergent, or does it rely on a single feature? Sixty adults (M = 29.86 years of age), thirty-seven typically developing children and adolescents (M = 9.3 years of age; range = 7-15), and eighteen children with ASD (M = 9.72 years of age; range = 7-15) viewed faces with leftward, rightward, or direct head rotations combined with leftward or rightward pupil rotations, and then indicated whether the face was looking leftward or rightward. All individuals, across development and ASD status, used head rotation to infer gaze direction, albeit with some individual differences. The use of pupil rotation, however, was heavily dependent on age. Finally, children with ASD used pupil rotation significantly less than typically developing (TD) children when inferring gaze direction, even after accounting for age. Our approach provides a novel framework for understanding individual and group differences in gaze as it is actually perceived: as an emergent feature. Furthermore, this study begins to address an important gap in the ASD literature by taking a first look at emergent gaze perception in this population.


Subject(s)
Fixation, Ocular/physiology; Visual Perception/physiology; Adolescent; Adult; Autism Spectrum Disorder/physiopathology; Child; Cues; Face; Female; Humans; Male; Pupil; Rotation