Results 1 - 7 of 7
1.
Sci Rep; 14(1): 11703, 2024 May 22.
Article in English | MEDLINE | ID: mdl-38778085

ABSTRACT

Internet of Things (IoT) technology has revolutionized modern industrial sectors and has been incorporated into several vital application domains. However, security is often overlooked because IoT devices have limited resources. Intrusion detection methods are therefore crucial for detecting IoT attacks and responding to them adequately. Accordingly, the current study outlines a two-stage procedure for the detection and identification of intrusions. In the first stage, a binary classifier termed an Extra Tree (E-Tree) analyzes the flow of IoT data traffic within the network. In the second stage, an Ensemble Technique (ET) comprising E-Tree, a Deep Neural Network (DNN), and a Random Forest (RF) examines the intrusive events that have been flagged. The proposed approach is validated through performance analysis: the Bot-IoT, CICIDS2018, NSL-KDD, and IoTID20 datasets were used for an in-depth assessment. Experimental results showed that the suggested strategy was more effective than existing machine learning methods, registering improved accuracy, normalized accuracy, recall, and stability.
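
The abstract does not include an implementation, but the two-stage structure is straightforward to sketch. Below is a minimal illustration using scikit-learn, with an MLPClassifier standing in for the DNN and synthetic data standing in for the Bot-IoT/CICIDS2018/NSL-KDD/IoTID20 traffic; all names, sizes, and parameters are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier, VotingClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for labeled flow records: class 0 = benign, 1-3 = attack types.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
y_binary = (y > 0).astype(int)

# Stage 1: Extra-Tree binary classifier separates benign from malicious flows.
stage1 = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(X, y_binary)

# Stage 2: ensemble of E-Tree, DNN (MLP stand-in), and RF labels the attack type
# of the flows flagged as malicious in stage 1.
mask = y_binary == 1
stage2 = VotingClassifier(
    estimators=[("etree", ExtraTreesClassifier(n_estimators=100, random_state=0)),
                ("dnn", MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                                      random_state=0)),
                ("rf", RandomForestClassifier(n_estimators=100, random_state=0))],
    voting="soft",
).fit(X[mask], y[mask])

def classify_flow(x):
    """Return 'benign' or the predicted attack class for a single flow vector."""
    x = np.asarray(x).reshape(1, -1)
    return "benign" if stage1.predict(x)[0] == 0 else int(stage2.predict(x)[0])

print(classify_flow(X[0]))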

2.
Sensors (Basel); 23(24), 2023 Dec 16.
Article in English | MEDLINE | ID: mdl-38139716

ABSTRACT

Internet of Things (IoT) security has seen substantial research into Deep Learning (DL) techniques for detecting cyberattacks. Critical Infrastructures (CIs) must be able to detect cyberattacks quickly, close to the edge devices, in order to prevent service interruptions. DL approaches outperform shallow machine learning techniques in attack detection, making them a viable option for intrusion detection. However, because of the massive amount of IoT data and the computational requirements of DL models, transmission overheads prevent DL models from being deployed successfully closer to the devices. Current Intrusion Detection Systems (IDS) either use conventional techniques that were not trained on pertinent IoT data or are not designed for distributed edge-cloud deployment. A new edge-cloud-based IoT IDS is proposed to address these issues. It uses distributed processing to separate the dataset into subsets appropriate to different attack classes and performs attribute selection on time-series IoT data. An attack detection network is then trained, consisting of a Recurrent Neural Network (RNN) and a Bidirectional Long Short-Term Memory (Bi-LSTM) layer. The high-dimensional BoT-IoT dataset, which replicates massive amounts of genuine IoT attack traffic, is used to test the proposed model. Despite an 85 percent reduction in dataset size made possible by attribute selection, attack detection capability remained intact. The models built on the smaller dataset demonstrated higher recall (98.25%), F1-measure (99.12%), accuracy (99.56%), and precision (99.45%), with no loss in class discrimination performance compared to models trained on the entire attribute set. With the smaller attribute space, neither the RNN nor the Bi-LSTM model experienced underfitting or overfitting. The proposed DL-based IoT intrusion detection solution scales efficiently with large volumes of IoT data, making it an ideal candidate for edge-cloud deployment.
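
As a rough illustration of the detection network described (a recurrent layer followed by a bidirectional LSTM over windows of time-series flow features), the following Keras sketch assumes illustrative input dimensions, layer sizes, and class count; the paper's exact topology, feature set, and training setup are not reproduced here.

```python
from tensorflow.keras import layers, models

TIMESTEPS, N_FEATURES, N_CLASSES = 10, 15, 5  # assumed window, features, classes

model = models.Sequential([
    layers.Input(shape=(TIMESTEPS, N_FEATURES)),
    layers.SimpleRNN(64, return_sequences=True),    # recurrent feature extractor
    layers.Bidirectional(layers.LSTM(64)),          # context in both time directions
    layers.Dense(32, activation="relu"),
    layers.Dense(N_CLASSES, activation="softmax"),  # one output per attack class
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```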

3.
Comput Intell Neurosci; 2022: 4723124, 2022.
Article in English | MEDLINE | ID: mdl-36093501

ABSTRACT

The Internet of Things (IoT)-inspired drone environment has a growing influence on daily life, in the form of drone-based smart electricity monitoring, traffic routing, and personal healthcare. However, communication between drones and ground control systems must be protected to avoid potential vulnerabilities and to improve coordination among scattered UAVs in the IoT context. In the current paper, a distributed UAV scheme is proposed that uses blockchain technology, over a network topology resembling the IoT with a cloud server, to secure communications during data collection and transmission and to reduce the likelihood of attack by maliciously manipulated UAVs. Rather than relying on a traditional blockchain approach, a unique, safe, and lightweight blockchain architecture is proposed that reduces computing and storage requirements while retaining privacy and security advantages. In addition, a reputation-based consensus protocol is built to ensure the dependability of the decentralized network. Numerous transaction types are established to characterize diverse forms of data access. To validate the presented blockchain-based distributed system, performance evaluations estimate its statistical effectiveness in terms of temporal delay, packet flow efficacy, precision, specificity, sensitivity, and security efficiency.


Subject(s)
Blockchain , Computer Communication Networks , Computer Security , Delivery of Health Care , Unmanned Aerial Devices
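
The abstract leaves the blockchain internals at a high level. The sketch below is a minimal, self-contained illustration of a lightweight chain with reputation-weighted validator selection in the spirit of the scheme described; the block fields, reputation update rule, and selection policy are assumptions for illustration, not the paper's protocol or transaction types.

```python
import hashlib
import json
import time

class Block:
    def __init__(self, index, prev_hash, transactions, validator):
        self.header = {"index": index, "prev_hash": prev_hash,
                       "timestamp": time.time(), "validator": validator}
        self.transactions = transactions  # e.g. sensed-data or command records
        self.hash = hashlib.sha256(
            json.dumps([self.header, transactions], sort_keys=True).encode()
        ).hexdigest()

class ReputationChain:
    def __init__(self, uav_ids):
        self.reputation = {u: 1.0 for u in uav_ids}  # all UAVs start trusted
        self.chain = [Block(0, "0" * 64, [], "genesis")]

    def select_validator(self):
        # Highest-reputation UAV proposes the next block (a simple stand-in
        # for a reputation-based consensus round).
        return max(self.reputation, key=self.reputation.get)

    def append(self, transactions):
        v = self.select_validator()
        self.chain.append(Block(len(self.chain), self.chain[-1].hash,
                                transactions, v))
        self.reputation[v] += 0.1  # reward honest validation

    def penalize(self, uav_id):
        # Slash reputation of a UAV caught misbehaving.
        self.reputation[uav_id] = max(0.0, self.reputation[uav_id] - 0.5)

chain = ReputationChain(["uav1", "uav2", "uav3"])
chain.append([{"from": "uav2", "type": "telemetry", "payload": "gps,alt,temp"}])
print(chain.chain[-1].header["validator"], chain.chain[-1].hash[:12])
```
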
4.
BMC Med Imaging; 22(1): 120, 2022 Jul 05.
Article in English | MEDLINE | ID: mdl-35790901

ABSTRACT

COVID-19 is a disease that can lead to pneumonia, respiratory syndrome, septic shock, multiple organ failure, and death; the pandemic is viewed as an enormous threat to the human population. Deep convolutional neural networks (CNNs) have recently proved their ability to perform well in classification and dimension reduction tasks. Selecting hyper-parameters for these networks is critical because the search space expands exponentially as the number of layers increases. All existing approaches take a pre-trained or hand-designed architecture as input; none of them accounts for design and pruning throughout the process. In fact, a convolutional topology exists for any architecture, and each block of a CNN corresponds to an optimization problem with a large search space. However, there are no guidelines for designing an architecture for a specific purpose; such design is therefore highly subjective and heavily reliant on data scientists' knowledge and expertise. Motivated by this observation, we propose a topology optimization method for designing a convolutional neural network capable of classifying radiography images and detecting probable chest anomalies and infections, including COVID-19. Our method has been validated in a number of comparative studies against relevant state-of-the-art architectures.


Subject(s)
COVID-19 , COVID-19/diagnostic imaging , Humans , Neural Networks, Computer , Tomography, X-Ray Computed/methods , X-Rays
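
To make the idea of topology search concrete, here is a minimal Python sketch in which candidate CNN topologies are sampled, built, and scored, and the best is kept. The search space, the parameter-count placeholder objective, and the grayscale 224x224 input are illustrative assumptions; the paper's optimizer, scoring, and radiography data are not reproduced.

```python
import random
from tensorflow.keras import layers, models

def build_cnn(n_blocks, base_filters, input_shape=(224, 224, 1), n_classes=3):
    """Assemble a candidate CNN from a sampled topology."""
    net = models.Sequential([layers.Input(shape=input_shape)])
    for b in range(n_blocks):
        net.add(layers.Conv2D(base_filters * 2 ** b, 3, padding="same",
                              activation="relu"))
        net.add(layers.MaxPooling2D())
    net.add(layers.GlobalAveragePooling2D())
    net.add(layers.Dense(n_classes, activation="softmax"))
    return net

def score(model):
    # Placeholder objective: fewer parameters is better. A real search would
    # train each candidate briefly and return its validation accuracy.
    return -model.count_params()

best, best_score = None, float("-inf")
for _ in range(5):  # tiny random search over the topology space
    cand = build_cnn(n_blocks=random.randint(2, 5),
                     base_filters=random.choice([8, 16, 32]))
    if score(cand) > best_score:
        best, best_score = cand, score(cand)
print(len(best.layers), "layers,", -best_score, "parameters")
```
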
5.
Neural Comput Appl; 34(17): 15007-15029, 2022.
Article in English | MEDLINE | ID: mdl-35599971

ABSTRACT

Over the last decade, deep neural networks have shown great success in machine learning and computer vision. The convolutional neural network (CNN) is currently one of the most successful architectures, applied in a wide variety of domains, including pattern recognition, medical diagnosis, and signal processing. Despite CNNs' impressive performance, their architectural design remains a significant challenge for researchers and practitioners. Selecting hyperparameters is extremely important for these networks because the search space grows exponentially as the number of layers increases. In fact, all existing classical and evolutionary pruning methods take an already pre-trained or designed architecture as input; none of them takes pruning into account during the design process. However, to evaluate the quality and potential compactness of any generated architecture, filter pruning should be applied before evaluation on the data set to compute the classification error. For instance, a medium-quality architecture in terms of classification could become a very light and accurate architecture after pruning, and vice versa; many such cases are possible, and the number of possibilities is huge. This motivated us to frame the whole process as a bi-level optimization problem in which (1) architecture generation is performed at the upper level (with minimum NB and NNB), while (2) filter pruning is optimized at the lower level. Motivated by the success of evolutionary algorithms (EAs) in bi-level optimization, we use the newly suggested co-evolutionary migration-based algorithm (CEMBA) as the search engine for our bi-level architectural optimization problem. The performance of our suggested technique, called Bi-CNN-D-C (Bi-level Convolutional Neural Network Design and Compression), is evaluated on widely used image classification benchmarks: CIFAR-10, CIFAR-100, and ImageNet. Our proposed approach is validated by a set of comparative experiments against relevant state-of-the-art architectures.
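
To make the bi-level structure concrete, here is a minimal Python sketch: an outer loop proposes architectures (upper level) and an inner loop searches a pruning ratio for each candidate (lower level), so every architecture is scored in its pruned form. The random-search moves and placeholder fitness are illustrative stand-ins for CEMBA and for real training, not the authors' algorithm.

```python
import random

def evaluate(architecture, keep_ratio):
    """Placeholder fitness favoring smaller pruned networks. A real run would
    prune the filters, fine-tune, and return validation accuracy."""
    n_blocks, n_filters = architecture
    return -(n_blocks * n_filters * keep_ratio) + random.random()

def lower_level(architecture, trials=10):
    """Lower level: find a good pruning ratio for a fixed architecture."""
    best = max((random.uniform(0.2, 1.0) for _ in range(trials)),
               key=lambda r: evaluate(architecture, r))
    return best, evaluate(architecture, best)

def upper_level(generations=20):
    """Upper level: search architectures, scoring each by post-pruning fitness."""
    best_arch, best_fit = None, float("-inf")
    for _ in range(generations):
        arch = (random.randint(2, 8), random.choice([16, 32, 64, 128]))
        _, fit = lower_level(arch)
        if fit > best_fit:
            best_arch, best_fit = arch, fit
    return best_arch, best_fit

print(upper_level())
```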

6.
Sensors (Basel); 22(7), 2022 Mar 29.
Article in English | MEDLINE | ID: mdl-35408244

ABSTRACT

Drone advancements have ushered in new trends and possibilities in a variety of sectors, particularly for small-sized drones. Drones provide navigational inter-location services, made possible by the Internet of Things (IoT). Drone networks, however, are subject to privacy and security risks due to design flaws, so a protected network is necessary to achieve the desired performance. The goal of the current study is to examine recent privacy and security concerns affecting the network of drones (NoD), and it emphasizes the importance of a security-empowered drone network for preventing interception and intrusion. A hybrid machine learning technique combining logistic regression and random forest is used to classify data instances with maximal efficacy. By incorporating sophisticated artificial-intelligence-inspired techniques into the NoD framework, the proposed technique mitigates cybersecurity vulnerabilities while keeping the NoD protected and secure. For validation, the suggested technique is tested against a challenging dataset, registering enhanced performance in terms of temporal efficacy (34.56 s), statistical measures (precision 97.68%, accuracy 98.58%, recall 98.59%, F-measure 99.01%), reliability (94.69%), and stability (0.73).


Subject(s)
Internet of Things , Computer Security , Machine Learning , Reproducibility of Results , Unmanned Aerial Devices
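
A minimal sketch of the hybrid classifier described above, combining logistic regression and a random forest by soft voting in scikit-learn. The synthetic data stands in for the paper's drone-network traffic, and the exact fusion rule used in the study may differ from the voting scheme shown here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for NoD traffic records: 0 = normal, 1 = intrusion.
X, y = make_classification(n_samples=2000, n_features=25, n_informative=12,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Hybrid: averaged class probabilities from logistic regression and random forest.
hybrid = VotingClassifier(
    estimators=[("lr", make_pipeline(StandardScaler(),
                                     LogisticRegression(max_iter=1000))),
                ("rf", RandomForestClassifier(n_estimators=200, random_state=0))],
    voting="soft",
).fit(X_tr, y_tr)

print(classification_report(y_te, hybrid.predict(X_te)))
```
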
7.
J Supercomput; 78(2): 1783-1806, 2022.
Article in English | MEDLINE | ID: mdl-34177116

ABSTRACT

The rapid transmission of viral diseases is an emerging public health issue worldwide, and COVID-19 is currently viewed as the most critical and novel infection among them. The current investigation presents an effective framework for the monitoring and prediction of COVID-19 virus infection (C-19VI). To the best of our knowledge, no existing research incorporates IoT technology for tracking C-19 spread over spatial-temporal patterns, and limited work has addressed predicting C-19 in humans to control its spread. The proposed framework includes a four-level architecture for the prediction and prevention of COVID-19 infection, comprising a COVID-19 Data Collection (C-19DC) level, a COVID-19 Information Classification (C-19IC) level, a COVID-19 Mining and Extraction (C-19ME) level, and a COVID-19 Prediction and Decision Modeling (C-19PDM) level. Specifically, the presented model empowers a person or community to intermittently screen the COVID-19 Fever Measure (C-19FM) and forecast it so that proactive measures can be taken in advance. For predictive purposes, the probabilistic assessment of C-19VI is quantified as a degree of membership, cumulatively characterized as the C-19FM. The prediction is realized using a temporal recurrent neural network, and, based on self-organizing maps, the presence of C-19VI is determined over a geographical area. Simulations over four challenging datasets show that, compared with other strategies, the introduced model registers substantially improved classification efficiency, prediction viability, and reliability.
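
A minimal sketch of the prediction step described above: a small recurrent network forecasts the next value of a fever-measure time series from a sliding window of past readings. The window length, layer sizes, and the noisy sine-wave stand-in for C-19FM data are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from tensorflow.keras import layers, models

WINDOW = 14  # days of past readings used per prediction (assumed)

# Synthetic daily fever-measure series in [0, 1] standing in for C-19FM data.
series = (np.sin(np.linspace(0, 12, 400)) + 1) / 2 \
         + np.random.normal(0, 0.02, 400)
X = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])[..., None]
y = series[WINDOW:]

# Temporal recurrent network: one recurrent layer, one next-day output.
model = models.Sequential([
    layers.Input(shape=(WINDOW, 1)),
    layers.SimpleRNN(32),
    layers.Dense(1, activation="sigmoid"),  # next-day fever measure in [0, 1]
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
print("next-day forecast:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```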
