1.
PLoS One ; 18(9): e0287006, 2023.
Article in English | MEDLINE | ID: mdl-37773958

ABSTRACT

It is well known that lighting conditions have an important influence on the automatic recognition of human expressions. Although the impact of lighting on the perception of emotions has been studied in different works, facial expression databases do not consider intentional lighting. In this work, a new database of facial expressions performed by virtual characters under four different lighting configurations is presented. This database, named UIBVFEDPlus-Light, is an extension of the previously published UIBVFED virtual facial expression dataset. It includes 100 characters, four lighting configurations, and a software application that allows one to interactively visualize the expressions and manage their intensity and lighting condition. In addition, a use case is described to show how this work can pose new challenges to facial expression and emotion recognition techniques under everyday lighting environments, thus opening new research perspectives in this area.
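A minimal sketch of how such a dataset might be indexed for experiments, assuming a hypothetical directory layout of character/lighting/expression image files; the actual UIBVFEDPlus-Light distribution may be organised differently, so the glob pattern below would need adjusting:

```python
# Hypothetical sketch: index images organised as
# <root>/<character>/<lighting_config>/<expression>.png.
# The real UIBVFEDPlus-Light layout may differ.
from pathlib import Path

def index_dataset(root):
    """Return (character, lighting, expression, path) tuples."""
    records = []
    for path in Path(root).glob("*/*/*.png"):
        expression = path.stem
        lighting = path.parent.name
        character = path.parent.parent.name
        records.append((character, lighting, expression, path))
    return records

records = index_dataset("UIBVFEDPlus-Light")
print(f"{len(records)} images indexed")
```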


Subject(s)
Facial Expression , Facial Recognition , Humans , Lighting , Emotions , Software , Recognition, Psychology
2.
Sensors (Basel) ; 23(5)2023 Feb 23.
Article in English | MEDLINE | ID: mdl-36904689

ABSTRACT

We developed a mobile application for cervical rehabilitation that uses a non-invasive camera-based head-tracker sensor to monitor neck movements. The intended user population should be able to use the mobile application on their own mobile device, but mobile devices have different camera sensors and screen dimensions that could affect user performance and neck movement monitoring. In this work, we studied the influence of mobile device type on camera-based monitoring of neck movements for rehabilitation purposes. We conducted an experiment to test whether the characteristics of a mobile device affect neck movements when using the mobile application with the head-tracker. The experiment consisted of using our application, which contains an exergame, on three mobile devices. We used wireless inertial sensors to measure the real-time neck movements performed while using the different devices. The results showed that the effect of device type on neck movements was not statistically significant. We included sex as a factor in the analysis, but there was no statistically significant interaction between the sex and device variables. Our mobile application proved to be device-agnostic, which will allow intended users to use the mHealth application regardless of the type of device. Future work can therefore continue with the clinical evaluation of the developed application to test the hypothesis that the use of the exergame will improve therapeutic adherence in cervical rehabilitation.
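A hedged sketch of the kind of two-way analysis the abstract describes (device type × sex on a neck-movement measure), using statsmodels. The CSV file and column names are assumptions for illustration, not the paper's actual data or analysis script:

```python
# Two-way ANOVA sketch: device type x sex on neck ROM.
# File and column names (rom_deg, device, sex) are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.read_csv("neck_rom_measurements.csv")  # hypothetical file
model = ols("rom_deg ~ C(device) * C(sex)", data=data).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
# A non-significant C(device) row would support the device-agnostic claim.
print(anova_table)
```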


Subject(s)
Mobile Applications , Telemedicine , Computers, Handheld
3.
Sensors (Basel) ; 23(1)2022 Dec 23.
Article in English | MEDLINE | ID: mdl-36616728

ABSTRACT

Recognizing facial expressions has been a persistent goal in the scientific community. Since the rise of artificial intelligence, convolutional neural networks (CNNs) have become popular for recognizing facial expressions, as images can be used directly as input. Current CNN models can achieve high recognition rates, but they give no clue about their reasoning process. Explainable artificial intelligence (XAI) has been developed as a means to help interpret the results obtained by machine learning models. When dealing with images, one of the most widely used XAI techniques is LIME, which highlights the areas of the image that contribute to a classification. The CEM method emerged as an alternative to LIME, providing explanations in a way that is natural for human classification: besides highlighting what is sufficient to justify a classification, it also identifies what should be absent to maintain that classification and to distinguish it from another. This study presents the results of comparing LIME and CEM applied to complex images such as facial expression images. While CEM could be used to explain the results on images described by a small number of features, LIME would be the method of choice when dealing with images described by a large number of features.
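A minimal sketch of the LIME image-explanation workflow referred to above, using the `lime` package. The classifier here is a random placeholder standing in for a trained facial-expression CNN (an assumption); in practice `predict_fn` must map a batch of images to class probabilities:

```python
# LIME image explanation sketch; predict_fn is a placeholder
# for a trained facial-expression CNN.
import numpy as np
from lime import lime_image
from skimage.segmentation import mark_boundaries

def predict_fn(images):
    # Placeholder: random probabilities over 7 basic expressions.
    rng = np.random.default_rng(0)
    return rng.dirichlet(np.ones(7), size=len(images))

image = np.random.rand(224, 224, 3)  # stand-in for a face image
explainer = lime_image.LimeImageExplainer()
explanation = explainer.explain_instance(
    image, predict_fn, top_labels=1, hide_color=0, num_samples=1000
)
label = explanation.top_labels[0]
temp, mask = explanation.get_image_and_mask(
    label, positive_only=True, num_features=5, hide_rest=False
)
highlighted = mark_boundaries(temp, mask)  # regions supporting the class
```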


Subject(s)
Artificial Intelligence , Facial Expression , Humans , Machine Learning , Neural Networks, Computer
4.
Sensors (Basel) ; 21(6)2021 Mar 23.
Article in English | MEDLINE | ID: mdl-33806813

ABSTRACT

Vision-based interfaces are used for monitoring human motion. In particular, camera-based head-trackers interpret the movement of the user's head to interact with devices. Neck pain is one of the most significant musculoskeletal conditions in terms of prevalence and years lived with disability. A common treatment is therapeutic exercise, which requires high motivation and adherence to treatment. In this work, we conduct an exploratory experiment to validate the use of a non-invasive camera-based head-tracker for monitoring neck movements. We do so by means of an exergame for performing the rehabilitation exercises on a mobile device. The experiments performed to explore its feasibility were: (1) validating the neck range of motion (ROM) that the camera-based head-tracker was able to detect; (2) ensuring the application's safety in terms of the neck ROM it demands from the user. The results not only confirmed safety, in terms of ROM requirements for different preset patient profiles according to the previously established safety parameters, but also demonstrated the effectiveness of the camera-based head-tracker for monitoring neck movements for rehabilitation purposes.
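A hypothetical sketch of the ROM safety check described above: comparing the neck angles an exergame demands against preset per-profile limits. The profile names and degree values are illustrative assumptions, not the paper's clinical parameters:

```python
# Hypothetical per-profile safe ROM limits, in degrees.
SAFE_ROM_DEG = {
    "healthy": {"flexion": 50, "extension": 60, "rotation": 80},
    "mild":    {"flexion": 40, "extension": 45, "rotation": 60},
    "severe":  {"flexion": 25, "extension": 30, "rotation": 40},
}

def is_within_safe_rom(profile, demanded):
    """True if every demanded angle stays within the profile's limits."""
    limits = SAFE_ROM_DEG[profile]
    return all(demanded[m] <= limits[m] for m in demanded)

# Example: peak angles the exergame asked of the user in one session.
session_peaks = {"flexion": 22, "extension": 28, "rotation": 35}
print(is_within_safe_rom("severe", session_peaks))  # True
```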


Subject(s)
Mobile Applications , Exercise Therapy , Head Movements , Humans , Neck , Range of Motion, Articular