1.
Sensors (Basel); 22(22), 2022 Nov 08.
Article in English | MEDLINE | ID: mdl-36433190

ABSTRACT

Packet loss is a major problem for wireless networks and has significant effects on the perceived quality of many Internet services. Packet loss models are used to understand the behavior of packet losses, which are caused by several factors, e.g., interference, coexistence, fading, collisions, and insufficient or excessive memory buffers. Among these models, the Gilbert-Elliot (GE) model, based on a two-state Markov chain, is the most widely used in communication networks. However, research has shown that the GE model is inadequate to represent the real behavior of packet losses in Wi-Fi networks. Other models instead rely on variables that describe the state of the network; in this last category, variables of a single network layer are used, usually the physical one. In this article, we propose a new packet loss model for Wi-Fi that simultaneously considers the temporal behavior of losses and the variables that describe the state of the network. In addition, the model uses two important variables, the signal-to-noise ratio and the network occupation, which no packet loss model available for Wi-Fi networks takes into account simultaneously. The proposed model uses the well-known Hidden Markov Model (HMM), which facilitates training and forecasting. At each state of the HMM, the burst length of losses is characterized using probability distributions. The model was evaluated by comparing computer simulations against real data samples, using the log-log complementary distribution of burst length. We also compared the proposed model with competing models through mean square error (MSE) analysis on a validation sample collected from a real network. The results demonstrate that the proposed model outperforms the currently available packet loss models for Wi-Fi networks.
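
To make the baseline concrete, below is a minimal sketch of the two-state Gilbert-Elliot (GE) model that the abstract cites as the most common packet loss model. The transition probabilities p (Good to Bad) and r (Bad to Good) and the per-state loss probabilities are illustrative assumptions, not parameters reported in the paper.

import random

# Minimal Gilbert-Elliot sketch: a two-state Markov chain where each state
# has its own packet loss probability. All numeric values are illustrative.
def simulate_gilbert_elliott(n_packets, p=0.05, r=0.30,
                             loss_prob_good=0.01, loss_prob_bad=0.80,
                             seed=42):
    """Return a list of 0/1 loss indicators for n_packets transmissions."""
    rng = random.Random(seed)
    state = "good"
    losses = []
    for _ in range(n_packets):
        # The packet is lost with the loss probability of the current state.
        loss_prob = loss_prob_good if state == "good" else loss_prob_bad
        losses.append(1 if rng.random() < loss_prob else 0)
        # Two-state Markov chain transition.
        if state == "good" and rng.random() < p:
            state = "bad"
        elif state == "bad" and rng.random() < r:
            state = "good"
    return losses

if __name__ == "__main__":
    trace = simulate_gilbert_elliott(10_000)
    print("overall loss rate:", sum(trace) / len(trace))

Traces generated this way reproduce bursty losses but, as the abstract argues, do not capture network-state variables such as the signal-to-noise ratio or the network occupation that the proposed HMM-based model adds.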

2.
Entropy (Basel); 23(8), 2021 Jul 25.
Article in English | MEDLINE | ID: mdl-34441088

ABSTRACT

Quality of service (QoS) requirements for live streaming are stricter than those for video on demand (VoD), since live streaming is more sensitive to variations in delay, jitter, and packet loss. Dynamic Adaptive Streaming over HTTP (DASH) is the most popular technology for live streaming and VoD and has been massively deployed on the Internet. DASH is an over-the-top application that uses unmanaged networks to distribute content with the best possible quality. For VoD, it typically uses large reception buffers to keep playback seamless. However, large buffers cannot be used in live streaming services because of the delay they induce. Hence, network congestion caused by undersized queues can decrease the user-perceived video quality. Active Queue Management (AQM) emerges as an alternative to control congestion in a router's queue, pressuring TCP traffic sources to reduce their transmission rate when incipient congestion is detected. As a consequence, the DASH client tends to decrease the quality of the streamed video. In this article, we evaluate the performance of recent AQM strategies for real-time adaptive video streaming and propose a new AQM algorithm using Long Short-Term Memory (LSTM) neural networks to improve the user-perceived video quality. The LSTM forecasts the queue delay trend, allowing earlier packet discard to avoid network congestion. The results show that the proposed method outperforms the competing AQM algorithms, especially in congested network scenarios.
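
The following is a hedged sketch of the general idea described in the abstract: an LSTM forecasts the near-future queue delay from recent samples, and the AQM discards an arriving packet early when the forecast exceeds a target delay. The network shape, window size, target_delay_ms, and the names QueueDelayForecaster and should_drop are illustrative assumptions, not the architecture or parameters used in the paper.

import torch
import torch.nn as nn

class QueueDelayForecaster(nn.Module):
    """Illustrative LSTM that predicts the next queue-delay sample (ms)."""
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, delays):
        # delays: (batch, window, 1) recent queue-delay samples in ms
        out, _ = self.lstm(delays)
        return self.head(out[:, -1, :])  # predicted next queue delay (ms)

def should_drop(model, recent_delays_ms, target_delay_ms=20.0):
    """Drop decision sketch: discard if the forecast exceeds the target."""
    with torch.no_grad():
        x = torch.tensor(recent_delays_ms, dtype=torch.float32).view(1, -1, 1)
        forecast = model(x).item()
    return forecast > target_delay_ms

if __name__ == "__main__":
    model = QueueDelayForecaster()  # untrained; for illustration only
    window = [12.0, 14.5, 17.0, 19.5, 22.0]  # rising delay trend (ms)
    print("drop packet:", should_drop(model, window))

In practice the forecaster would be trained on queue-delay traces before use; dropping on a predicted rather than a measured delay is what allows the earlier discard the abstract describes.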
