Results 1 - 4 of 4
1.
Sensors (Basel). 2022 Jan 30;22(3).
Article in English | MEDLINE | ID: mdl-35161829

ABSTRACT

Rapid innovation in wireless communications and microtechnology has led to the development of wireless sensor networks, which are deployed in settings such as battlefield surveillance, home security, and healthcare monitoring. Because the sensor nodes run on tiny, low-capacity batteries, however, these networks face power and target-coverage constraints, and considerable research has addressed these problems through various architectures and algorithms. This study uses the adaptive learning automata algorithm (ALAA), a machine learning method that provides an energy-efficient scheduling scheme: each sensor node in the network is equipped with a learning automaton that allows it to select its appropriate state, active or sleep, at any given moment. Several experiments were conducted to evaluate the proposed method, varying different parameters to verify that the schedule consistently covers all targets while consuming less power. The experimental results indicate that the proposed method effectively schedules sensor nodes to monitor all targets with reduced energy consumption. Finally, we benchmarked our technique against the LADSC scheduling algorithm. All experimental data collected so far show that the proposed method satisfies the problem description and achieves the project's aim; it may therefore serve as a useful node-scheduling technique when constructing a real sensor network.
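
The sketch below illustrates the core idea in this abstract: each node runs a two-action learning automaton over {active, sleep}. It is a minimal, hypothetical Python rendering with an assumed linear reward-inaction update and a toy coverage map, not the paper's ALAA implementation.

```python
# Hypothetical sketch of learning-automata node scheduling (not the paper's
# ALAA): each node keeps P(active) and reinforces choices that either save
# energy without breaking coverage or restore coverage when it breaks.
import random

ALPHA = 0.1    # learning rate; an assumed value
ROUNDS = 300

class NodeAutomaton:
    def __init__(self):
        self.p_active = 0.5
        self.chose_active = False

    def choose(self):
        self.chose_active = random.random() < self.p_active
        return self.chose_active

    def reward(self):
        # Linear reward-inaction: move probability toward the chosen action.
        if self.chose_active:
            self.p_active += ALPHA * (1 - self.p_active)
        else:
            self.p_active -= ALPHA * self.p_active

def all_targets_covered(active, coverage):
    # A target is covered if at least one active node can sense it.
    return all(active & nodes for nodes in coverage.values())

# Toy topology: coverage maps each target to the node ids that can sense it.
coverage = {"t1": {0, 1}, "t2": {2, 3}}
automata = [NodeAutomaton() for _ in range(4)]

for _ in range(ROUNDS):
    active = {i for i, a in enumerate(automata) if a.choose()}
    if all_targets_covered(active, coverage):
        # Coverage held: sleeping nodes saved energy, so reinforce sleep.
        for i, a in enumerate(automata):
            if i not in active:
                a.reward()
    else:
        # Coverage broke: reinforce the nodes that stayed active.
        for i, a in enumerate(automata):
            if i in active:
                a.reward()

print([round(a.p_active, 2) for a in automata])
```

Over repeated rounds, redundant nodes drift toward sleep while every target stays covered, which is the scheduling behaviour the abstract describes.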


Subjects
Computer Communication Networks, Wireless Technology, Algorithms, Machine Learning, Physiologic Monitoring
2.
Sensors (Basel). 2022 Feb 02;22(3).
Article in English | MEDLINE | ID: mdl-35161892

ABSTRACT

Object detection is a vital step in satellite-imagery-based computer vision applications such as precision agriculture, urban planning, and defence. In satellite imagery, object detection is especially difficult because objects have low pixel resolution and small objects must be found in very large images (a single satellite image taken by Digital Globe comprises over 240 million pixels). Further challenges include class variation, varied object poses, high variance in object size, illumination changes, and dense backgrounds. This study compares the performance of existing deep learning algorithms for object detection in satellite imagery. We created a dataset of satellite imagery and used it to evaluate convolutional neural network-based frameworks: Faster RCNN (faster region-based convolutional neural network), YOLO (you only look once), SSD (single-shot detector), and SIMRDWN (satellite imagery multiscale rapid detection with windowed networks), analysing each approach in terms of accuracy and speed. The results show that SIMRDWN achieves 97% accuracy on high-resolution images, while Faster RCNN achieves 95.31% at standard resolution (1000 × 600). YOLOv3 achieves 94.20% at standard resolution (416 × 416), and SSD achieves 84.61% at standard resolution (300 × 300). In terms of speed and efficiency, YOLO is the clear leader: it processes an image in 170 to 190 milliseconds, whereas SIMRDWN takes 5 to 103 seconds per image and is therefore unsuitable for real-time surveillance.
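
As a hedged illustration of the kind of speed measurement reported above, the sketch below times a single inference pass of an off-the-shelf torchvision Faster R-CNN at the 1000 × 600 "standard" resolution. The image file name, resize step, and 0.5 confidence threshold are assumptions; this is not the authors' trained model or dataset.

```python
# Time one detection pass with a pretrained Faster R-CNN from torchvision.
import time
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

img = Image.open("satellite_tile.jpg").convert("RGB")  # hypothetical file
img = img.resize((1000, 600))                           # "standard" resolution
x = [to_tensor(img)]

with torch.no_grad():
    t0 = time.perf_counter()
    out = model(x)[0]                                   # boxes, labels, scores
    ms = (time.perf_counter() - t0) * 1000

keep = out["scores"] > 0.5                              # assumed threshold
print(f"{keep.sum().item()} detections in {ms:.1f} ms")
```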


Subjects
Neural Networks (Computer), Satellite Imagery, Algorithms, Machine Learning, Software
3.
Sensors (Basel). 2021 Nov 25;21(23).
Article in English | MEDLINE | ID: mdl-34883857

ABSTRACT

The smart grid (SG) is a contemporary electrical network that improves the grid's performance, reliability, stability, and energy efficiency. Integrating cloud and fog computing with the SG can increase its efficiency: combining the SG with cloud computing enhances resource allocation, and fog computing is integrated with cloud computing to minimise the burden on the cloud and further optimise resource allocation. Fog offers three essential capabilities: location awareness, low latency, and mobility. In this study, we present a cloud- and fog-based architecture for information management, in which fog computing makes the system more efficient by allocating virtual machines (VMs) through a load-balancing mechanism. We propose a novel approach, named BPSOSA, based on binary particle swarm optimisation with an inertia weight adjusted using simulated annealing. The inertia weight is an important factor in BPSOSA, as it adjusts the size of the search space in which the optimal solution is sought. BPSOSA is compared against round robin, the odds algorithm, and ant colony optimisation. In terms of response time, BPSOSA outperforms round robin, the odds algorithm, and ant colony optimisation by 53.99 ms, 82.08 ms, and 81.58 ms, respectively; in terms of processing time, it outperforms them by 52.94 ms, 81.20 ms, and 80.56 ms, respectively. Ant colony optimisation has slightly better cost efficiency than BPSOSA, but the difference is insignificant.
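
The following is a minimal sketch of binary PSO with a simulated-annealing-style cooling schedule driving the inertia weight, in the spirit of BPSOSA. The update rules, cooling rate, and the toy fog-versus-cloud task-placement objective are assumptions rather than the authors' formulation.

```python
# Binary PSO with an inertia weight annealed by a cooling schedule; each bit
# decides whether a task runs on fog (1) or cloud (0); minimise the makespan.
import math
import random

TASKS = [random.uniform(1, 10) for _ in range(20)]   # toy task costs
FOG_SPEED, CLOUD_SPEED = 2.0, 1.0                    # assumed relative speeds
D, SWARM, ITERS = len(TASKS), 15, 100
C1 = C2 = 2.0                                        # acceleration constants
W_MAX, W_MIN, T0, COOL = 0.9, 0.4, 1.0, 0.95         # inertia/cooling params
V_MAX = 6.0                                          # velocity clamp

def makespan(bits):
    # The slower of the two tiers determines the finish time.
    fog = sum(t for t, b in zip(TASKS, bits) if b) / FOG_SPEED
    cloud = sum(t for t, b in zip(TASKS, bits) if not b) / CLOUD_SPEED
    return max(fog, cloud)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

pos = [[random.randint(0, 1) for _ in range(D)] for _ in range(SWARM)]
vel = [[0.0] * D for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=makespan)[:]

for it in range(ITERS):
    # SA-style cooling shrinks the inertia weight from W_MAX towards W_MIN.
    temp = T0 * (COOL ** it)
    w = W_MIN + (W_MAX - W_MIN) * (temp / T0)
    for i in range(SWARM):
        for d in range(D):
            r1, r2 = random.random(), random.random()
            v = (w * vel[i][d]
                 + C1 * r1 * (pbest[i][d] - pos[i][d])
                 + C2 * r2 * (gbest[d] - pos[i][d]))
            vel[i][d] = max(-V_MAX, min(V_MAX, v))
            # Sigmoid transfer maps velocity to a bit-flip probability.
            pos[i][d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
        if makespan(pos[i]) < makespan(pbest[i]):
            pbest[i] = pos[i][:]
            if makespan(pbest[i]) < makespan(gbest):
                gbest = pbest[i][:]

print(f"best makespan found: {makespan(gbest):.2f}")
```

The cooling-driven inertia weight plays the role the abstract attributes to it: a large weight early on keeps the search space wide, and the annealed decay narrows it as the swarm converges.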


Subjects
Cloud Computing, Computer Systems, Algorithms, Reproducibility of Results
4.
Inf Fusion. 2021 Oct;74:50-64.
Article in English | MEDLINE | ID: mdl-35702568

ABSTRACT

Internet of things (IoT) applications in e-health can play a vital role in countering rapidly spreading diseases and in managing health emergency scenarios such as pandemics. Efficient disease control also requires monitoring the population's compliance with standard operating procedures (SOPs) in disease-prone areas, with a cost-effective mechanism for reporting and responding to any violation. However, IoT devices have limited resources, and the application requires delay-sensitive data transmission. Named Data Networking (NDN) can significantly reduce content-retrieval delays but inherits cache-overflow and network-congestion challenges. We are therefore motivated to present a novel smart COVID-19 pandemic-controlled eradication over NDN-IoT (SPICE-IT) mechanism. SPICE-IT introduces autonomous monitoring in indoor environments with an efficient pull-based reporting mechanism that records violations at local servers and a cloud server. An intelligent face-mask detection and temperature-monitoring mechanism examines every person, and the cloud server controls response actions from the centre with an adaptive decision-making mechanism. A long short-term memory (LSTM) based caching mechanism reduces cache overflow and overall network congestion.
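
A loose sketch of how an LSTM-driven caching decision might look is given below. The model shape, window length, synthetic request trace, and top-k admission rule are illustrative assumptions, not SPICE-IT's actual configuration.

```python
# Toy LSTM popularity predictor: forecast each content's next-interval request
# count from its recent history, then cache only the predicted-hot contents.
import torch
import torch.nn as nn

class PopularityLSTM(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # prediction per content: (batch, 1)

model = PopularityLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

# Synthetic per-content request histories: (contents, window, 1) counts.
hist = torch.randint(0, 20, (8, 10, 1)).float()
target = hist[:, -1, :] + torch.randn(8, 1)    # toy "next interval" labels

for _ in range(200):                           # brief training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(hist), target)
    loss.backward()
    opt.step()

# Admit only the top-k contents the model expects to stay popular, so the
# cache holds fewer, hotter items and overflows less often.
CACHE_SLOTS = 3
pred = model(hist).squeeze(1)
cache = torch.topk(pred, CACHE_SLOTS).indices.tolist()
print("cached content ids:", cache)
```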
