Results 1 - 2 of 2
1.
Sensors (Basel); 20(17), 2020 Aug 27.
Article in English | MEDLINE | ID: mdl-32867182

ABSTRACT

An essential aspect of the interaction between people and computers is the recognition of facial expressions. A key issue in this process is selecting relevant features to classify facial expressions accurately. This study examines the selection of optimal geometric features to classify six basic facial expressions: happiness, sadness, surprise, fear, anger, and disgust. Inspired by the Facial Action Coding System (FACS) and the Moving Picture Experts Group 4 standard (MPEG-4), an initial set of 89 features was proposed. These features are normalized distances and angles in 2D and 3D, computed from 22 facial landmarks. To select a minimum set of features with maximum classification accuracy, two selection methods and four classifiers were tested. The first selection method, principal component analysis (PCA), obtained 39 features; the second, a genetic algorithm (GA), obtained 47 features. The experiments ran on the Bosphorus and UIVBFED data sets, with 86.62% and 93.92% median accuracy, respectively. Our main finding is that the reduced feature set obtained by the GA is the smallest among methods of comparable accuracy, which has implications for reducing recognition time.
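
The feature-extraction and PCA steps described above can be illustrated with a short sketch. This is not the authors' code: the landmark indices, the normalization reference (an inter-ocular distance), the distance/angle pairs, and the 95% variance threshold are hypothetical choices used only to show the idea of normalized geometric features followed by PCA reduction (here in 2D, using NumPy and scikit-learn).

```python
# Minimal sketch (illustrative only): normalized geometric features from
# 2D facial landmarks, reduced with PCA. Landmark indices, the reference
# length, and the variance threshold are assumptions, not the paper's values.
import numpy as np
from sklearn.decomposition import PCA

def normalized_distance(p, q, ref):
    """Euclidean distance between landmarks p and q, divided by a
    reference length (e.g., inter-ocular distance) for scale invariance."""
    return np.linalg.norm(p - q) / ref

def angle_at(a, b, c):
    """Angle in radians at vertex b formed by landmarks a-b-c."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def geometric_features(landmarks):
    """landmarks: (22, 2) array of 2D facial landmark coordinates."""
    ref = np.linalg.norm(landmarks[0] - landmarks[1])   # assumed eye-corner pair
    feats = []
    for i, j in [(4, 5), (6, 7), (8, 9)]:               # hypothetical landmark pairs
        feats.append(normalized_distance(landmarks[i], landmarks[j], ref))
    for a, b, c in [(4, 10, 5), (6, 11, 7)]:            # hypothetical landmark triplets
        feats.append(angle_at(landmarks[a], landmarks[b], landmarks[c]))
    return np.array(feats)

# Stack per-face feature vectors and keep enough principal components to
# explain 95% of the variance (assumed threshold).
X = np.vstack([geometric_features(np.random.rand(22, 2)) for _ in range(100)])
X_reduced = PCA(n_components=0.95).fit_transform(X)
```

A GA-based selector, by contrast, would search over binary feature masks and score each mask by classifier accuracy; the sketch above covers only the PCA path.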


Subject(s)
Automated Facial Recognition , Emotions , Facial Expression , Humans
2.
Games Health J; 8(5): 313-325, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31287734

ABSTRACT

This systematic review analyzes the state of the art regarding interaction modalities used in serious games for upper limb rehabilitation. A systematic search was performed in the IEEE Xplore and Web of Science databases, and the PRISMA and QualSyst protocols were used to filter and assess the articles. Articles had to meet the following inclusion criteria: be written in English; be at least four pages long; use or develop serious games; focus on upper limb rehabilitation; and be published between 2007 and 2017. Of 121 articles initially retrieved, 33 met the inclusion criteria. Three interaction modalities were found: vision systems (42.4%), complementary vision systems (30.3%), and no-vision systems (27.2%). Vision systems and no-vision systems obtained a similar mean QualSyst score (86%), followed by complementary vision systems (85.7%). Almost half of the studies used vision systems as the interaction modality (42.4%), and almost half used the Kinect sensor to capture body movements (48.48%). The shoulder was the most treated body part in the studies (19%). A key limitation of vision systems and complementary vision systems is that their performance can be affected by lighting conditions. A main limitation of no-vision systems is that the range of motion, in angles of the body movement, might not be measured accurately. Due to the limited number of studies, fruitful areas for further research include serious games focused on finger rehabilitation and trauma injuries, game difficulty adaptation based on the user's muscle strength and posture, and multisensor data fusion across interaction modalities.
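
As an illustration of the range-of-motion measurement mentioned above, the following sketch estimates an elbow's range of motion from a sequence of 3D joint positions, such as a Kinect-style skeleton stream might provide. The joint names, array shapes, and synthetic data are assumptions for illustration, not code from any reviewed study.

```python
# Minimal sketch (illustrative only): elbow range of motion from tracked
# 3D joint positions. Joint layout and data shapes are assumed.
import numpy as np

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by segments b->a and b->c."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

def elbow_range_of_motion(shoulder, elbow, wrist):
    """Each argument: (n_frames, 3) array of 3D positions for one joint.
    Returns the observed elbow-angle range of motion in degrees."""
    angles = [joint_angle(s, e, w) for s, e, w in zip(shoulder, elbow, wrist)]
    return max(angles) - min(angles)

# Usage with synthetic data (100 frames of 3D positions per joint).
n = 100
rom = elbow_range_of_motion(np.random.rand(n, 3),
                            np.random.rand(n, 3) + 2.0,
                            np.random.rand(n, 3) + 4.0)
print(f"Observed elbow range of motion: {rom:.1f} degrees")
```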


Subject(s)
Games, Experimental , Rehabilitation/methods , Upper Extremity , Exercise Therapy/methods , Exercise Therapy/standards , Exercise Therapy/trends , Humans , Rehabilitation/standards , Rehabilitation/trends