1.
Front Big Data; 6: 1170820, 2023.
Article in English | MEDLINE | ID: mdl-36968617

ABSTRACT

[This corrects the article DOI: 10.3389/fdata.2022.879389.].

2.
Front Big Data; 5: 879389, 2022.
Article in English | MEDLINE | ID: mdl-36111178

ABSTRACT

Human Activity Recognition (HAR) is a prominent application in mobile computing and the Internet of Things (IoT) that aims to detect human activities from multimodal sensor signals generated by diverse body movements. Human physical activities are typically composed of simple actions (such as "arm up", "arm down", and "arm curl"), referred to as semantic features. Such abstract semantic features, in contrast to high-level activities ("walking", "sitting", etc.) and low-level signals (raw sensor readings), can be developed manually to assist activity recognition. Although effective, this manual approach relies heavily on human domain expertise and is not scalable. In this paper, we address this limitation by proposing a machine learning method, SemNet, based on deep belief networks. SemNet automatically constructs semantic features representative of axial bodily movements. Experimental results show that SemNet outperforms baseline approaches and is capable of learning features that correlate highly with manually defined semantic attributes. Furthermore, our experiments using a different model, a deep convolutional LSTM, on household activities illustrate the broader applicability of semantic attribute interpretation to diverse deep neural network approaches. These empirical results not only demonstrate that such a deep learning technique is semantically meaningful and superior to its handcrafted counterpart, but also provide a better understanding of the deep learning methods used for Human Activity Recognition.
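For readers unfamiliar with the deep convolutional LSTM family of models mentioned in the abstract, the sketch below shows a generic classifier of this kind for windowed multimodal sensor data. It is a minimal illustration under assumed input shapes and hyperparameters (9 sensor channels, 128-sample windows, 6 activity classes); it is not the paper's SemNet implementation and is not taken from the article.

# Hypothetical sketch: a generic DeepConvLSTM-style classifier for windowed
# multimodal sensor data shaped (batch, time, channels). All shapes and
# hyperparameters below are illustrative assumptions, not from the paper.
import torch
import torch.nn as nn

class DeepConvLSTM(nn.Module):
    def __init__(self, n_channels=9, n_classes=6, conv_filters=64, lstm_hidden=128):
        super().__init__()
        # 1D convolutions along the time axis extract local motion patterns
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, conv_filters, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(conv_filters, conv_filters, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # The LSTM models temporal dependencies across the convolved sequence
        self.lstm = nn.LSTM(conv_filters, lstm_hidden, num_layers=2, batch_first=True)
        self.classifier = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, channels) -> Conv1d expects (batch, channels, time)
        x = self.conv(x.transpose(1, 2)).transpose(1, 2)
        out, _ = self.lstm(x)
        # Classify the activity from the hidden state at the last time step
        return self.classifier(out[:, -1, :])

# Example: 8 windows of 128 time steps with 9 channels (e.g., an assumed
# 3-axis accelerometer + gyroscope + magnetometer configuration)
logits = DeepConvLSTM()(torch.randn(8, 128, 9))
print(logits.shape)  # torch.Size([8, 6])

In models of this shape, the convolutional layers play the role of low-level feature extractors over raw signals, while the recurrent layers capture how those features evolve over a window; the semantic-attribute analysis described in the abstract concerns interpreting what such intermediate layers learn.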
