Results 1 - 2 of 2
1.
PLoS One ; 17(9): e0273649, 2022.
Article in English | MEDLINE | ID: mdl-36094924

ABSTRACT

This paper presents a new large-scale signer-independent dataset for Kazakh-Russian Sign Language (KRSL) for the purposes of Sign Language Processing. We envision it serving as a new benchmark dataset for performance evaluation of Continuous Sign Language Recognition (CSLR) and Translation (CSLT) tasks. The proposed FluentSigners-50 dataset consists of 173 sentences performed by 50 KRSL signers, resulting in 43,250 video samples. Dataset contributors recorded the videos in real-life settings against a wide variety of backgrounds, using various devices such as smartphones and web cameras. As a result, the distance to the camera, camera angle, aspect ratio, video quality, and frame rate vary across contributors. Additionally, the proposed dataset contains a high degree of linguistic and inter-signer variability and is therefore a better training set for recognizing real-life sign language. A FluentSigners-50 baseline is established using two state-of-the-art methods, Stochastic CSLR and TSPNet. To this end, we carefully prepared three benchmark train-test splits for model evaluation in terms of signer independence, age independence, and unseen sentences. FluentSigners-50 is publicly available at https://krslproject.github.io/FluentSigners-50/.
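
The abstract does not spell out how the benchmark splits are constructed, but the core idea of a signer-independent split is to partition signers, not individual videos, so that no signer appears in both the training and test sets. Below is a minimal Python sketch of that idea; the metadata file name and its column names (signer_id, video_path) are illustrative assumptions, not the dataset's documented layout.

    # Minimal sketch of a signer-independent train/test split for a
    # FluentSigners-50-style dataset. The CSV name and column names
    # ("signer_id", "video_path") are assumptions for illustration.
    import csv
    import random

    def signer_independent_split(metadata_csv, test_fraction=0.2, seed=0):
        """Split video samples so that no signer appears in both sets."""
        with open(metadata_csv, newline="") as f:
            rows = list(csv.DictReader(f))

        # Partition at the signer level, not the video level.
        signers = sorted({row["signer_id"] for row in rows})
        random.Random(seed).shuffle(signers)

        n_test = max(1, int(len(signers) * test_fraction))
        test_signers = set(signers[:n_test])

        train = [r for r in rows if r["signer_id"] not in test_signers]
        test = [r for r in rows if r["signer_id"] in test_signers]
        return train, test

    # Hypothetical usage with an assumed metadata file:
    train_rows, test_rows = signer_independent_split("fluentsigners50_metadata.csv")
    print(len(train_rows), len(test_rows))

The same pattern extends to the other two splits: group by signer age for age independence, or hold out whole sentence IDs for the unseen-sentences condition.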


Subject(s)
Benchmarking , Sign Language , Humans , Language , Linguistics
2.
PLoS One ; 15(6): e0233731, 2020.
Article in English | MEDLINE | ID: mdl-32484837

ABSTRACT

Facial expressions in sign languages are used to express grammatical functions, such as question marking, but can also be used to express emotions (either the signer's own or in constructed action contexts). Emotions and grammatical functions can utilize the same articulators, and the combinations can be congruent or incongruent. For instance, surprise and polar questions can be marked by raised eyebrows, while anger is usually marked by lowered eyebrows. We investigated what happens when different emotions (neutral/surprise/anger) are combined with different sentence types (statement/polar question/wh-question) in Kazakh-Russian Sign Language (KRSL), replicating studies previously conducted for other sign languages. We asked 9 native signers (5 deaf, 4 hearing children of deaf adults) to sign 10 simple sentences in 9 conditions (3 emotions × 3 sentence types). We used OpenPose software to track eyebrow position in the video recordings. We found that emotions and sentence types influence eyebrow position in KRSL: eyebrows are raised for polar questions and surprise, and lowered for anger. There are also some interactions between the two factors, as well as some differences between hearing and deaf native signers, namely a smaller effect of polar questions for the deaf group and a different interaction between emotions and wh-question marking in the two groups. We thus find evidence of complex influences on the non-manual behavior of sign language users, and showcase a quantitative approach to this field.
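
As a rough illustration of the quantitative approach described above, the sketch below derives a per-frame eyebrow-position value from OpenPose's face-keypoint JSON output (a flat x, y, confidence list of 70 points following the 68-point facial-landmark layout plus two pupil points). The specific measure, mean eyebrow-to-eye vertical offset normalized by inter-ocular distance, is an illustrative assumption, not necessarily the metric used in the study.

    # Sketch: per-frame eyebrow-raise value from one OpenPose face-keypoint
    # JSON file. The normalization choice below is an assumption, not the
    # study's documented measure.
    import json

    # Indices in OpenPose's 70-point face model (68-point layout + 2 pupils).
    EYEBROWS = range(17, 27)   # both eyebrows
    RIGHT_EYE = range(36, 42)
    LEFT_EYE = range(42, 48)

    def mean_y(pts, idxs):
        return sum(pts[i][1] for i in idxs) / len(idxs)

    def eyebrow_height(openpose_json_path):
        """Return a normalized eyebrow-raise value for one frame, or None."""
        with open(openpose_json_path) as f:
            data = json.load(f)
        if not data["people"]:
            return None
        flat = data["people"][0]["face_keypoints_2d"]  # x, y, confidence triplets
        pts = [(flat[i], flat[i + 1]) for i in range(0, len(flat), 3)]

        eye_y = (mean_y(pts, RIGHT_EYE) + mean_y(pts, LEFT_EYE)) / 2
        brow_y = mean_y(pts, EYEBROWS)

        # Normalize by inter-ocular distance so the value is scale-invariant.
        r_cx = sum(pts[i][0] for i in RIGHT_EYE) / 6
        l_cx = sum(pts[i][0] for i in LEFT_EYE) / 6
        interocular = abs(l_cx - r_cx)
        if interocular == 0:
            return None
        # Image y grows downward, so raised eyebrows yield a larger value.
        return (eye_y - brow_y) / interocular

Averaging this value over frames within each condition would then allow comparisons across the 3 × 3 emotion-by-sentence-type design.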


Subject(s)
Emotions , Facial Expression , Sign Language , Adult , Eyebrows/physiology , Female , Humans , Kazakhstan , Male