1.
Sensors (Basel) ; 22(7)2022 Apr 04.
Article in English | MEDLINE | ID: mdl-35408387

ABSTRACT

Teaching is an activity that requires understanding the class's reaction in order to evaluate the effectiveness of the teaching methodology. This is easy to achieve in small classrooms, but it can be challenging in classes of 50 or more students. This paper proposes a novel Internet of Things (IoT) system to aid teachers in their work, based on the redundant use of non-invasive techniques such as facial expression recognition and physiological data analysis. Facial expression recognition is performed using a Convolutional Neural Network (CNN), while physiological data are obtained via Photoplethysmography (PPG). Using Russell's model, the most relevant of Ekman's facial expressions recognized by the CNN are grouped into active and passive. Thresholding and windowing operations are then applied so that the results from both sources can be compared and analyzed. Using a window size of 100 samples, both sources detected an attention level of about 55.5% in the in-presence lecture tests. Comparing results from in-presence and pre-recorded remote lectures shows that, once validated against the physiological data, facial expressions alone appear sufficient to determine students' level of attention for in-presence lectures.
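The windowing and thresholding step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: frame-level labels grouped into active (1) and passive (0) via Russell's model are aggregated over 100-sample windows, and a window counts as attentive when the active fraction exceeds a threshold. The function names and the 0.5 threshold are assumptions for illustration only.

```python
def attention_per_window(labels, window_size=100, threshold=0.5):
    """labels: sequence of 1 (active expression) / 0 (passive expression).

    Returns one boolean per full window: True if the fraction of
    active samples in that window meets the threshold.
    """
    windows = []
    for start in range(0, len(labels) - window_size + 1, window_size):
        chunk = labels[start:start + window_size]
        active_fraction = sum(chunk) / window_size
        windows.append(active_fraction >= threshold)
    return windows


def attention_level(labels, window_size=100, threshold=0.5):
    """Overall attention level as the percentage of attentive windows."""
    flags = attention_per_window(labels, window_size, threshold)
    return 100.0 * sum(flags) / len(flags) if flags else 0.0
```

With this scheme, a session whose first 100-sample window is 60% active and whose second window is fully passive yields an attention level of 50%, i.e. one attentive window out of two.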


Subject(s)
Facial Recognition , Internet of Things , Facial Expression , Humans , Neural Networks, Computer , Photoplethysmography
2.
Front Robot AI ; 6: 18, 2019.
Article in English | MEDLINE | ID: mdl-33501034

ABSTRACT

As the market becomes increasingly competitive, factories are required not only to enhance product quality but also to reduce manufacturing and maintenance times. In an industrial context, modern factories are composed of many automated systems, such as industrial robots, which can perform different tasks. Although industrial robots are becoming more powerful and efficient, human workers are still required for operations such as training and maintenance procedures. The proposed research aims to assess a remote interaction system in an industrial training collaborative mixed-reality (CMR) environment, in which a remote expert explains a training procedure to an unskilled local user. Remote and local users interact through different systems: the remote operator gives assistance using an immersive Virtual Reality (VR) device, whereas the local user interacts through a wearable Augmented Reality (AR) device. A comparison between an interaction based on the presence of a virtual human and one based on abstract icons is proposed. In the first case, a virtual 3D representation of the remote technician is shown to the local user in AR: the remote technician can pinpoint the components involved in the training procedure, and the local user can visualize the instructions through animations of the virtual avatar. In the second case, the local user does not see a 3D representation of the remote technician; instead, different 3D models, such as animated icons, are displayed to the local operator in AR depending on the component pinpointed by the remote technician in the virtual environment. Each 3D icon indicates to the local user which component has to be manipulated at the current step of the procedure. Preliminary results suggest that the interface requiring fewer resources to develop and manage should be preferred: although the virtual avatar may improve the remote technician's sense of presence in the no-audio condition, the use of abstract metaphors appears to be of primary importance for successfully completing an industrial task.
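The interaction loop described above can be sketched as a simple event dispatch: when the remote VR expert pinpoints a component, the local AR client renders feedback according to the experimental condition (virtual avatar vs. abstract icon). This is a purely hypothetical sketch; the paper does not describe its implementation, and all names and the icon mapping are illustrative assumptions.

```python
from dataclasses import dataclass


@dataclass
class PinpointEvent:
    """Component selected by the remote technician in the VR environment."""
    component: str
    step: int  # current step of the training procedure


def render_instruction(event, condition, icon_for):
    """Return a description of what the local AR device displays.

    condition: "avatar" shows the animated 3D representation of the
    remote technician; "icons" shows an abstract animated icon mapped
    to the pinpointed component.
    """
    if condition == "avatar":
        return f"avatar points at {event.component} (step {event.step})"
    if condition == "icons":
        return f"show icon '{icon_for[event.component]}' on {event.component}"
    raise ValueError(f"unknown condition: {condition}")
```

The dispatch makes the study's comparison explicit: both conditions consume the same pinpoint events, differing only in how the instruction is rendered to the local user.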
