1.
Heliyon ; 9(11): e21947, 2023 Nov.
Article in English | MEDLINE | ID: mdl-38053860

ABSTRACT

As wireless communication grows, so does the need for smart, simple, affordable solutions. This need has prompted researchers to develop suitable network solutions ranging from wireless sensor networks (WSNs) to the Internet of Things (IoT), and with each innovation the need to improve on existing work has grown. Initially, network protocols were the focus of study and development. Today, however, IoT devices are deployed across many industries and collect massive amounts of data through complex applications, which makes IoT load-balancing research a necessity. Several studies have tried to address the communication overheads produced by heavy IoT network traffic, aiming to control network loads by spreading them evenly across IoT nodes. Eventually, practitioners began migrating IoT node data, and the applications processing it, to the cloud. The challenge is therefore to design a cloud-based load-balancing algorithm that meets the requirements of IoT network protocols. This paper presents such a method for controlling loads on cloud-integrated IoT networks. The proposed method analyses the requirements of both physical and virtual host machines in cloud computing environments, with the goal of designing a load balancer that improves network response time while reducing energy consumption. The proposed load-balancing algorithm can be easily integrated with existing IoT frameworks. Handling the load for cloud-based IoT architectures with the above method significantly boosts response time for the IoT network by 60 %. The proposed scheme also shows lower energy consumption (31 %), shorter execution time (24 %), reduced node shutdown time (45 %), and lower infrastructure cost (48 %) compared with existing frameworks. Based on the simulation results, it is concluded that the proposed framework offers an improved solution for IoT-based cloud load-balancing issues.
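The abstract does not publish the algorithm itself, so the following is only an illustrative sketch of the kind of dispatch policy it describes: tasks from IoT nodes are assigned to the cloud host (physical or virtual) with the most remaining capacity, spreading load evenly. The `Host` class and the least-loaded policy are assumptions for illustration, not the authors' actual method.

```python
# Illustrative sketch of even load spreading across cloud hosts.
# Host, its fields, and the least-loaded policy are assumptions;
# the paper's actual algorithm is not given in the abstract.

from dataclasses import dataclass


@dataclass
class Host:
    name: str
    capacity: int  # max concurrent tasks this (virtual) host accepts
    load: int = 0  # tasks currently assigned


def assign(hosts, n_tasks):
    """Dispatch n_tasks one at a time to the host with the most free headroom."""
    placement = []
    for _ in range(n_tasks):
        # pick the host with the largest remaining capacity
        target = max(hosts, key=lambda h: h.capacity - h.load)
        if target.capacity - target.load <= 0:
            raise RuntimeError("all hosts saturated")
        target.load += 1
        placement.append(target.name)
    return placement


hosts = [Host("vm-a", 3), Host("vm-b", 2)]
print(assign(hosts, 4))
```

A real balancer would also weigh response time and energy per host, as the paper's objectives suggest, but the headroom-based selection above captures the basic "spread loads evenly" idea.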

2.
Comput Intell Neurosci ; 2022: 9015778, 2022.
Article in English | MEDLINE | ID: mdl-35795732

ABSTRACT

In this paper, an autonomous brain tumor segmentation and detection model is developed using a convolutional neural network (CNN) combined with a local binary pattern (LBP) and a multilayered support vector machine (ML-SVM). Detection and classification of brain tumors is a key capability for aiding physicians, so an intelligent system must be designed with less manual work and more automated operation in mind. The collected images are first processed with image filtering techniques, followed by image intensity normalization, and then pass to the patch extraction stage, which yields patch-extracted images. During feature extraction, the RGB image is converted to a binary image by grayscale conversion via the colormap process, and this step is completed by the local binary pattern (LBP). A convolutional network is used to extract feature information, while the ML-SVM performs detection. CNNs are popular deep learning algorithms used in a wide variety of engineering applications. Finally, the classification approach used in this work determines the presence or absence of a brain tumor. For comparison, the entire work is tested against existing procedures using critical metrics such as dice similarity coefficient (DSC), Jaccard similarity index (JSI), sensitivity (SE), accuracy (ACC), specificity (SP), and precision (PR).
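The LBP step mentioned above can be sketched in a few lines: each pixel's 8 neighbors are compared with the center value and the comparison bits form an 8-bit code. This is a minimal pure-Python sketch of the standard 3x3 LBP operator; the neighbor ordering and the >= comparison are one common convention, assumed here since the abstract gives no details of the authors' implementation.

```python
# Minimal sketch of the 3x3 local binary pattern (LBP) operator.
# Each neighbor that is >= the center pixel contributes one set bit.

def lbp_code(img, y, x):
    """Return the 8-bit LBP code of pixel (y, x) in a grayscale image
    given as a list of lists. Assumes (y, x) is not on the border."""
    center = img[y][x]
    # clockwise neighbor offsets, starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= center:
            code |= 1 << bit
    return code


img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
print(lbp_code(img, 1, 1))  # prints 120
```

In a full pipeline the histogram of these codes over an image patch would serve as the texture feature fed alongside the CNN features.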


Subject(s)
Brain Neoplasms , Support Vector Machine , Algorithms , Benchmarking , Brain Neoplasms/diagnostic imaging , Engineering , Humans