1.
PLoS One; 18(10): e0286652, 2023.
Article in English | MEDLINE | ID: mdl-37844095

ABSTRACT

Recent years have witnessed a rapid proliferation of Internet of Things (IoT) and Industrial Internet of Things (IIoT) systems linked to Industry 4.0 technology. The growing use of IoT devices brings rising security risks from malicious network flows during data exchange between connected devices. Various security threats severely affect the availability, functionality, and usability of these devices; denial of service (DoS) and distributed denial of service (DDoS) attacks, which attempt to exhaust the capacity of the IoT network gateway and thereby cause system failure, have been the most pronounced. Machine learning and deep learning algorithms have been used to build intelligent intrusion detection systems (IDS) that mitigate these network threats. One concern is that, although deep learning algorithms have shown good accuracy on tabular data, not all of them perform well on tabular datasets, which are the most commonly available format for machine learning tasks. Model explainability and feature selection remain further challenges that affect model performance. We therefore propose TabNet-IDS, an IDS model that uses attentive mechanisms to automatically select salient features from a dataset, trains on those features, and provides explainable results. We implement the proposed model using the TabNet algorithm on PyTorch, a deep learning framework. The results show that the TabNet architecture can be applied to tabular IoT-security datasets and achieve accuracy comparable to that of other neural networks, reaching 97% on the CIC-IDS2017, 95% on the CSE-CICIDS2018, and 98% on the CIC-DDoS2019 datasets.
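To make the training-and-explainability workflow the abstract describes concrete, here is a minimal sketch using the open-source pytorch-tabnet package. The file path, the "Label" column name, and the hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch: training a TabNet classifier on a tabular network-flow dataset
# and extracting the attention masks used for explainability.
# Assumes a preprocessed CSV of flows (e.g., a cleaned CIC-IDS2017 export)
# with a 'Label' column; both the path and column name are hypothetical.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder
from pytorch_tabnet.tab_model import TabNetClassifier

df = pd.read_csv("flows.csv")
X = df.drop(columns=["Label"]).to_numpy(dtype=np.float32)
y = LabelEncoder().fit_transform(df["Label"])

X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# TabNet's sequential attention selects salient features at each decision step.
clf = TabNetClassifier()
clf.fit(
    X_train, y_train,
    eval_set=[(X_valid, y_valid)],
    eval_metric=["accuracy"],
    max_epochs=50,
    patience=10,
)

# Explainability: per-sample aggregated feature attributions and the
# per-step attention masks, plus global feature importances.
explain_matrix, masks = clf.explain(X_valid)
print(clf.feature_importances_)
```

The per-step masks are what make this kind of model self-explaining: each row shows which flow features the network attended to when classifying that sample, which is the property the abstract highlights for intrusion detection.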


Subject(s)
Deep Learning , Internet of Things , Algorithms , Internet , Neural Networks, Computer
2.
Sensors (Basel); 20(15), 2020 Jul 31.
Article in English | MEDLINE | ID: mdl-32751877

ABSTRACT

The growth of the Internet of Things (IoT) has led to the deployment of many applications that use wireless networks, such as smart cities and smart agriculture. Low Power Wide Area Networks (LPWANs) meet many IoT requirements, including energy efficiency, low cost, large coverage area, and large-scale deployment. Long Range Wide Area Network (LoRaWAN) networks are among the most studied and implemented LPWAN technologies, owing to the ease of building private networks on an open standard. Typical LoRaWAN networks are single-hop in a star topology, composed of end-devices that transmit data directly to gateways. Recently, several studies have proposed multihop LoRaWAN networks, thus forming wireless mesh networks. This article reviews the state-of-the-art multihop proposals for LoRaWAN. In addition, we carry out a comparative analysis and classification, considering technical characteristics, intermediate device functions, and network topologies. The paper also discusses open issues and future directions for realizing the full potential of multihop networking. We hope to encourage other researchers to work on improving the performance of LoRaWAN mesh networks through more theoretical and simulation analysis, as well as practical deployments.
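The review is narrative, but the star-versus-mesh coverage tradeoff it describes can be illustrated with a toy computation. The following sketch is not from the article: node placement, field size, and radio range are arbitrary assumptions used only to show why letting end-devices relay (mesh) can serve nodes a single-hop star topology cannot.

```python
# Toy comparison of single-hop (star) vs. multihop (mesh) coverage for
# randomly placed end-devices around one gateway. All parameters are
# illustrative assumptions, not measured LoRaWAN values.
import random
from collections import deque

random.seed(1)
RANGE = 2.0                          # assumed radio range (km)
gateway = (5.0, 5.0)                 # gateway at the center of a 10x10 km field
nodes = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(50)]

def in_range(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 <= RANGE ** 2

# Star topology: a device is served only if it reaches the gateway directly.
star_covered = sum(in_range(n, gateway) for n in nodes)

# Mesh topology: breadth-first search from the gateway, letting any already
# reached device relay traffic for its neighbors.
points = [gateway] + nodes
reached, queue = {0}, deque([0])
while queue:
    u = queue.popleft()
    for v in range(len(points)):
        if v not in reached and in_range(points[u], points[v]):
            reached.add(v)
            queue.append(v)

print(f"star: {star_covered}/50 devices, mesh: {len(reached) - 1}/50 devices")
```

Under these assumptions the mesh reaches devices outside the gateway's direct radio range via intermediate relays, which is exactly the role of the intermediate devices classified in the survey.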
