Int J Environ Res Public Health ; 18(17)2021 08 27.
Article in English | MEDLINE | ID: covidwho-1374402


People today often focus on their work while neglecting their health, which can have serious long-term consequences. Remote health monitoring through telemedicine can help people discover potential health threats in time. During the COVID-19 pandemic, remote health monitoring makes it possible to obtain and analyze biomedical signals, including human body temperature, without direct body contact, which is of great significance for safe and efficient health monitoring. Existing remote biomedical signal monitoring methods cannot effectively analyze time series data. This paper designs a remote biomedical signal monitoring framework combining Internet of Things (IoT), 5G communication, and artificial intelligence techniques. In the proposed framework, IoT devices collect biomedical signals at the perception layer. The signals are then transmitted over the 5G network to a cloud server, where a GRU-AE deep learning model is deployed. Notably, the proposed GRU-AE model can analyze multi-dimensional biomedical signals as time series. Finally, this paper conducts a 24-week monitoring experiment with 2000 subjects of different ages to obtain real data. Compared with the traditional biomedical signal monitoring method based on an AutoEncoder model, the GRU-AE model performs better. This research plays an important role in promoting the development of biomedical signal monitoring techniques and can be effectively applied to various remote health monitoring scenarios.

COVID-19, Internet of Things, Artificial Intelligence, Humans, Pandemics, SARS-CoV-2
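The GRU-AE architecture described in the abstract (a GRU-based autoencoder that reconstructs multi-dimensional biomedical time series, with reconstruction error serving as an anomaly signal) can be sketched as follows. This is a minimal illustrative sketch in PyTorch; the layer sizes, window length, and scoring rule are assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn

class GRUAE(nn.Module):
    """GRU autoencoder sketch for multi-dimensional time-series signals."""
    def __init__(self, n_features, hidden_size=64):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden_size, batch_first=True)
        self.decoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.output = nn.Linear(hidden_size, n_features)

    def forward(self, x):
        # x: (batch, seq_len, n_features)
        _, h = self.encoder(x)                       # h: (1, batch, hidden)
        # repeat the latent state across time steps and decode
        z = h.transpose(0, 1).repeat(1, x.size(1), 1)
        out, _ = self.decoder(z)
        return self.output(out)                      # reconstruction of x

# Anomaly score per window = mean squared reconstruction error;
# the window/feature dimensions below are illustrative only.
model = GRUAE(n_features=4)
x = torch.randn(8, 24, 4)          # 8 windows, 24 time steps, 4 signals
recon = model(x)
score = ((recon - x) ** 2).mean(dim=(1, 2))  # higher = more anomalous
```

After training on normal signals, windows whose reconstruction error exceeds a threshold would be flagged for attention.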
Electronics ; 10(15):1769, 2021.
Article in English | MDPI | ID: covidwho-1325621


Emotion-aware music recommendation has gained increasing attention in recent years, as music has the ability to regulate human emotions. Exploiting emotional information has the potential to improve recommendation performance. However, conventional studies represented emotion as discrete categories and could not predict users' emotional states at time points with no user activity data, let alone account for the influence of social events. In this study, we proposed an emotion-aware music recommendation method using deep neural networks (emoMR). We modeled a representation of music emotion using low-level audio features and music metadata, and modeled users' emotional states using an artificial emotion generation model with endogenous and exogenous factors capable of expressing the influence of events on emotions. The two models were trained using a designed deep neural network architecture (emoDNN) to predict, in continuous form, the emotions of music tracks and the emotion preferences of users. Based on these models, we proposed a hybrid approach combining content-based and collaborative filtering to generate emotion-aware music recommendations. Experimental results show that emoMR outperforms the baseline algorithms on the metrics of Precision, Recall, F1, and HitRate. We also tested the performance of emoMR on two major events (the death of Yuan Longping and the Coronavirus Disease 2019 (COVID-19) cases in Zhejiang). Results show that emoMR takes advantage of event information and outperforms the other baseline algorithms.
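The hybrid scoring idea in the abstract (blending a content-based score derived from continuous emotion representations with a collaborative-filtering score) can be sketched as below. This is a minimal illustration, not emoMR's actual method: the cosine-similarity content score, the linear blend weight `alpha`, and the two-dimensional (valence, arousal) emotion vectors are all assumptions for the example.

```python
import numpy as np

def content_score(user_emotion, track_emotions):
    # Cosine similarity between the user's predicted emotion vector
    # and each track's continuous emotion representation.
    u = user_emotion / np.linalg.norm(user_emotion)
    t = track_emotions / np.linalg.norm(track_emotions, axis=1, keepdims=True)
    return t @ u

def hybrid_recommend(user_emotion, track_emotions, cf_scores, alpha=0.5, k=3):
    # alpha balances the content-based and collaborative signals (assumed form)
    scores = (alpha * content_score(user_emotion, track_emotions)
              + (1 - alpha) * cf_scores)
    return np.argsort(scores)[::-1][:k]   # indices of the top-k tracks

user = np.array([0.8, 0.2])                 # e.g. (valence, arousal)
tracks = np.array([[0.9, 0.1], [0.1, 0.9], [0.5, 0.5]])
cf = np.array([0.2, 0.9, 0.4])              # per-track CF scores
top = hybrid_recommend(user, tracks, cf)
```

With `alpha=1.0` the ranking reduces to pure content-based emotional similarity; with `alpha=0.0` it reduces to pure collaborative filtering.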