1.
Sci Rep ; 13(1): 20843, 2023 Nov 27.
Article in English | MEDLINE | ID: mdl-38012161

ABSTRACT

The Internet of Things (IoT) refers to the collection of devices that connect to the Internet in order to gather and share data. The application of IoT across different sectors, including health and industry, has grown considerably over the past few years. The IoT and, by extension, the Industrial IoT (IIoT) are highly susceptible to different types of threats and attacks owing to the nature of their networks, which in turn leads to poor outcomes such as an increasing error rate. Hence, it is critical to design attack detection systems that can secure IIoT networks. Because existing IIoT attack detection work fails to identify certain attacks and therefore yields limited detection performance, a reinforcement learning-based attack detection method called sliding principal component and dynamic reward reinforcement learning (SPC-DRRL) is introduced for detecting various IIoT network attacks. In the first stage of the methodology, the raw TON_IoT dataset is preprocessed with a min-max normalization scaling function to obtain values on a common scale. Next, using the processed samples, a robust log-likelihood sliding principal component-based feature extraction algorithm with an arbitrary-size sliding window is applied to data from multiple sources (i.e., the different service profiles in the dataset) to extract computationally efficient features. Finally, a dynamic reward reinforcement learning-based IIoT attack detection model is presented to control the error rate of the design. The dynamic reward function, together with an incident repository, not only generates the reward in an arbitrary fashion but also stores action results for the next round of training, thereby reducing the attack detection error rate. An IIoT attack detection system based on SPC-DRRL is then constructed and verified on the ToN_IoT dataset from the University of New South Wales, Australia. The experimental results show that attack detection time, overhead, and error rate are reduced considerably, with higher accuracy than traditional reinforcement learning methods.
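A minimal sketch of the two preprocessing steps described above, min-max scaling followed by sliding-window principal component extraction, using scikit-learn on synthetic data. The window size, step, component count, and array shapes are illustrative assumptions; this is not the paper's SPC-DRRL implementation.

```python
# Hedged sketch: min-max normalization + sliding-window PCA features.
# Not the paper's SPC-DRRL; window size and component count are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))          # stand-in for TON_IoT feature rows

# Step 1: scale every feature to [0, 1] so all columns share the same range.
X_norm = MinMaxScaler().fit_transform(X)

# Step 2: slide a window over the records and keep the top principal
# components of each window as a compact feature vector.
def sliding_pca_features(data, window=100, step=50, n_components=5):
    feats = []
    for start in range(0, len(data) - window + 1, step):
        block = data[start:start + window]
        pca = PCA(n_components=n_components).fit(block)
        feats.append(pca.transform(block).mean(axis=0))  # summarize the window
    return np.vstack(feats)

features = sliding_pca_features(X_norm)
print(features.shape)                     # (number of windows, n_components)
```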

2.
Sci Rep ; 13(1): 15681, 2023 Sep 21.
Article in English | MEDLINE | ID: mdl-37735185

ABSTRACT

Ensuring the privacy and trustworthiness of smart city Internet of Things (IoT) networks has recently remained a central problem. Cyborg intelligence is one of the most popular and advanced technologies suitable for securing smart city networks against cyber threats. Various machine learning and deep learning-based cyborg intelligence mechanisms have been developed to protect smart city networks by ensuring property, security, and privacy. However, existing approaches suffer from critical problems of high time complexity, computational cost, difficulty of interpretation, and a reduced level of security. Therefore, the proposed work implements a group of novel methodologies for developing an effective cyborg intelligence security model to secure smart city systems. Here, the Quantized Identical Data Imputation (QIDI) mechanism is applied first for data preprocessing and normalization. Then, the Conjugate Self-Organizing Migration (CSOM) optimization algorithm is deployed to select the most relevant features for training the classifier, which also supports increased detection accuracy. Moreover, the Reconciliate Multi-Agent Markov Learning (RMML) classification algorithm is used to predict the intrusion along with its appropriate class. The original contribution of this work is a novel cyborg intelligence framework for protecting smart city networks from modern cyber threats. In this system, a combination of unique and intelligent mechanisms is implemented to ensure the security of smart city networks: QIDI for data filtering, CSOM for feature optimization and dimensionality reduction, and RMML for categorizing the type of intrusion. By using these methodologies, the overall attack detection performance and efficiency of the proposed cyborg model are greatly increased. The main reason for using the CSOM methodology is to increase the learning speed and prediction performance of the classifier while detecting intrusions in smart city networks; CSOM provides an optimized set of features that improves the training and testing operations of the classifier with high accuracy and efficiency, and it offers the unique characteristics of increased search efficiency, high convergence, and fast processing speed. During the evaluation, different types of cyber threat datasets are considered for testing and validation, and the results are compared with recent state-of-the-art approaches.
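The QIDI, CSOM, and RMML components are specific to this paper and are not reproduced here. The sketch below only illustrates the three described stages (imputation and normalization, feature selection, classification) with standard scikit-learn substitutes on synthetic data; every component named in the code is a stand-in, not the paper's method.

```python
# Hedged sketch of the three-stage structure only (impute -> select -> classify).
# SimpleImputer/SelectKBest/RandomForest are stand-ins, not QIDI/CSOM/RMML.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=2000, n_features=40, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("impute", SimpleImputer(strategy="median")),   # stage 1: fill gaps, then scale
    ("scale", MinMaxScaler()),
    ("select", SelectKBest(f_classif, k=15)),       # stage 2: keep the most relevant features
    ("classify", RandomForestClassifier(n_estimators=200, random_state=0)),  # stage 3
])
pipeline.fit(X_train, y_train)
print("accuracy:", pipeline.score(X_test, y_test))
```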

3.
Sensors (Basel) ; 23(16)2023 Aug 16.
Article in English | MEDLINE | ID: mdl-37631743

ABSTRACT

The Internet of Things (IoT) enables everyday objects to connect to the Internet and to transmit and receive data for meaningful purposes. IoT has recently driven change across many sectors. Nonetheless, security risks to IoT networks and devices remain persistently disruptive as Internet technology grows. Phishing has become a common threat to Internet users, in which an attacker aims to fraudulently extract confidential data from a system or user through websites, fictitious emails, and similar means. Owing to the dramatic growth in IoT devices, hackers target IoT gadgets such as smart cars and security cameras and mount phishing attacks to gain control over vulnerable devices for malicious purposes. These scams have been increasing and growing more sophisticated over the last few years. To address these problems, this paper presents a binary hunter-prey optimization with machine learning-based phishing attack detection (BHPO-MLPAD) method for the IoT environment. The BHPO-MLPAD technique detects phishing attacks through feature selection and classification. In the presented technique, the BHPO algorithm first chooses an optimal subset of features, and a cascaded forward neural network (CFNN) model is then employed for phishing attack detection. To adjust the parameter values of the CFNN model, the variable-step fruit fly optimization (VFFO) algorithm is utilized. The performance of the BHPO-MLPAD method is assessed on a benchmark dataset, and the results show that it outperforms the compared approaches on different evaluation measures.
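The core of a binary wrapper-style feature selector such as the BHPO step is a fitness function that scores a 0/1 feature mask by the accuracy of a classifier trained on the selected columns. The sketch below shows only that evaluation loop, with a plain random search standing in for hunter-prey optimization and a standard MLP standing in for the paper's CFNN; the data and all parameter values are assumptions.

```python
# Hedged sketch: score binary feature masks by classifier accuracy.
# Random search stands in for BHPO; MLPClassifier stands in for the CFNN.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1500, n_features=30, n_informative=8, random_state=1)
rng = np.random.default_rng(1)

def fitness(mask):
    """Cross-validated accuracy of a classifier trained on the selected features."""
    if not mask.any():
        return 0.0
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=1)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

best_mask, best_score = None, -1.0
for _ in range(20):                       # a real optimizer would update masks adaptively
    mask = rng.random(X.shape[1]) < 0.5   # random binary mask over the feature columns
    score = fitness(mask)
    if score > best_score:
        best_mask, best_score = mask, score

print("selected features:", int(best_mask.sum()), "cv accuracy:", round(best_score, 3))
```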

4.
Neurosci Lett ; 809: 137313, 2023 07 13.
Article in English | MEDLINE | ID: mdl-37257682

ABSTRACT

Depression is a psychological condition that hampers day-to-day activity (thinking, feeling, or action). Early detection of this illness can help save many lives, because it is now recognized as a global problem that can even lead to suicide. Electroencephalogram (EEG) signals can be used to diagnose depression using machine learning techniques. The dataset studied is a public dataset consisting of 30 healthy people and 34 patients with depression. The methods used for detecting depression are decision tree, random forest, convolutional neural network (CNN), recurrent neural network (RNN), long short-term memory (LSTM), gated recurrent unit (GRU), bidirectional long short-term memory (Bi-LSTM), gradient boosting, and extreme gradient boosting (XGBoost), all combined with band power features. Among the deep learning techniques, the CNN model achieved the highest accuracy of 98.13%, with a specificity of 99% and a sensitivity of 97%, using band power features.


Subject(s)
Depression , Electroencephalography , Machine Learning , Humans , Depression/diagnosis , Depression/psychology , Case-Control Studies , Datasets as Topic , Neural Networks, Computer , Decision Trees , Random Forest , Depressive Disorder, Major/diagnosis , Depressive Disorder, Major/psychology
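A minimal sketch of the band power features named in the abstract above, computed for one EEG channel with Welch's method from SciPy. The sampling rate, band edges, and synthetic signal are illustrative assumptions, and the downstream classifiers (CNN, LSTM, etc.) are not reproduced here.

```python
# Hedged sketch: band-power features from an EEG epoch via Welch's method.
# Sampling rate, band edges, and the synthetic signal are assumptions.
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

FS = 256                                   # assumed sampling rate in Hz
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch, fs=FS):
    """Return absolute power in each canonical band for one EEG channel."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        idx = (freqs >= lo) & (freqs < hi)
        powers[name] = trapezoid(psd[idx], freqs[idx])   # integrate the PSD over the band
    return powers

rng = np.random.default_rng(0)
epoch = rng.normal(size=FS * 10)           # 10 s of synthetic single-channel EEG
print(band_powers(epoch))                  # these values would feed the classifiers
```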
5.
J Cloud Comput (Heidelb) ; 12(1): 38, 2023.
Article in English | MEDLINE | ID: mdl-36937654

ABSTRACT

The Industrial Internet of Things (IIoT) promises to deliver innovative business models across multiple domains by providing ubiquitous connectivity, intelligent data, predictive analytics, and decision-making systems for improved market performance. However, traditional IIoT architectures are highly susceptible to many security vulnerabilities and network intrusions, which bring challenges such as lack of privacy, integrity, and trust, as well as centralization. This research implements an Artificial Intelligence-based Lightweight Blockchain Security Model (AILBSM) to ensure the privacy and security of IIoT systems. The model addresses security and privacy issues that arise in cloud-based IIoT systems that handle data in the cloud or at the network edge (on-device). The novel contribution of this paper is that it combines the advantages of a lightweight blockchain with a Convivial Optimized Sprinter Neural Network (COSNN)-based AI mechanism, with simplified and improved security operations. Here, the impact of attacks is reduced by transforming features into encoded data using an Authentic Intrinsic Analysis (AIA) model. Extensive experiments are conducted to validate the system on various attack datasets. In addition, the results of the privacy-protection and AI mechanisms are evaluated separately and compared using various indicators. With the proposed AILBSM framework, execution time is minimized to 0.6 seconds, overall classification accuracy is improved to 99.8%, and detection performance is increased to 99.7%. Owing to the inclusion of autoencoder-based transformation and blockchain authentication, the anomaly detection performance of the proposed model is substantially better than that of other techniques.
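The AILBSM's lightweight blockchain, COSNN, and AIA components are specific to the paper and are not reproduced here. The sketch below only illustrates the generic idea underlying any blockchain-based authentication layer: hash-chained, tamper-evident records, using the standard library; the record contents are invented for illustration.

```python
# Hedged sketch: generic hash-chained records (tamper evidence only),
# not the paper's lightweight blockchain, COSNN, or AIA components.
import hashlib
import json
import time

def make_block(payload, prev_hash):
    """Create a record whose hash covers its payload and the previous block's hash."""
    block = {"timestamp": time.time(), "payload": payload, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify_chain(chain):
    """Recompute every hash and check the links; any tampering breaks the chain."""
    for prev, cur in zip(chain, chain[1:]):
        body = {k: cur[k] for k in ("timestamp", "payload", "prev_hash")}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if cur["prev_hash"] != prev["hash"] or cur["hash"] != recomputed:
            return False
    return True

chain = [make_block({"device": "sensor-1", "reading": 21.5}, prev_hash="0" * 64)]
chain.append(make_block({"device": "sensor-1", "reading": 21.7}, prev_hash=chain[-1]["hash"]))
print("chain valid:", verify_chain(chain))
```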

6.
Appl Bionics Biomech ; 2021: 4520450, 2021.
Article in English | MEDLINE | ID: mdl-34876924

ABSTRACT

The word radiomics, like all 'omics' domains, assumes the existence of a large amount of data. Using artificial intelligence, and in particular different machine learning techniques, is a necessary step for better exploiting these data. Classically, researchers in radiomics have used conventional machine learning techniques (random forests, for example). More recently, deep learning, a subdomain of machine learning, has emerged; its applications are increasing, and the results obtained so far have demonstrated remarkable effectiveness. Several previous studies have explored potential applications of radiomics in colorectal cancer, which can be grouped into several categories: evaluation of the reproducibility of texture data, prediction of response to treatment, prediction of the occurrence of metastases, and prediction of survival. Few studies, however, have explored the potential of radiomics for predicting recurrence-free survival. In this study, we evaluated and compared six conventional learning models and a deep learning model based on MRI textural analysis of patients with locally advanced rectal tumours, correlated with the risk of recurrence. For the conventional models, we compared 2D versus 3D image analysis, and models based on a textural analysis of the tumour alone versus models that also take the peritumoural environment into account. For deep learning, we built a 16-layer convolutional neural network model trained on a 2D MRI image database comprising both the native images and the bounding box corresponding to each image.
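A minimal sketch of the kind of 2D texture descriptors that MRI radiomics models commonly start from, using grey-level co-occurrence matrices from scikit-image on a synthetic patch. The distances, angles, and properties listed are illustrative choices, not this study's feature set.

```python
# Hedged sketch: GLCM texture descriptors for one 2D image patch.
# Distances, angles, and properties are illustrative, not the study's feature set.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in for an MRI ROI

# Co-occurrence of grey levels at two offsets and two directions.
glcm = graycomatrix(patch, distances=[1, 2], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)

# Average each texture property over the offset/direction combinations.
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```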

7.
Appl Bionics Biomech ; 2021: 9014559, 2021.
Article in English | MEDLINE | ID: mdl-34804200

ABSTRACT

Mobile edge computing (MEC) is a novel computing paradigm that promises dramatic reductions in latency and energy consumption by offloading computation-intensive tasks to edge clouds in close proximity to smart mobile users. In this research, offloading and latency between edge servers and multiple users are reduced for IoT applications in a 5G environment using the bald eagle search (BES) optimization algorithm. Deep learning approaches alone may incur high computational complexity and long run times; in an edge computing system, devices can instead offload their computation-intensive tasks to edge servers to save energy and shorten latency. BES is an advanced optimization algorithm that mimics the hunting strategy of eagles and proceeds through select, search, and swooping stages. Previously, the BES algorithm has been used to optimize energy and distance; in this research it is used to further improve energy use and reduce offloading latency. Because delays occur as the number of devices, and hence the demand on cloud data, increases, the algorithm is enhanced with a resource (ROS) estimation stage that selects better resources, so that the edge system offloads the most appropriate IoT subtasks to edge servers and the expected execution time is minimized. Based on multi-user offloading, we propose a BES optimization algorithm that effectively reduces end-to-end time and yields fast, near-optimal decisions for IoT devices. Latency is reduced by moving work from the cloud toward local edge servers, with deep learning providing faster and better results from the network. The proposed BES technique outperforms the conventional methods against which it is compared in minimizing offloading latency, and simulations demonstrate its efficiency and stability.
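A minimal sketch of the underlying trade-off such an offloading optimizer searches over: for each task, compare the local execution latency with the transmission-plus-edge execution latency and pick the smaller. The CPU frequencies, uplink bandwidth, and task sizes are assumed values, and the BES stages themselves are not reproduced.

```python
# Hedged sketch: the per-task latency comparison an offloading optimizer works with.
# Frequencies, bandwidth, and task sizes are assumptions; BES itself is not reproduced.
F_LOCAL = 1.0e9      # local CPU cycles per second
F_EDGE = 8.0e9       # edge-server CPU cycles per second
BANDWIDTH = 20e6     # uplink bits per second

def best_placement(cycles, data_bits):
    """Return whichever of local or edge execution finishes sooner for one task."""
    t_local = cycles / F_LOCAL
    t_edge = data_bits / BANDWIDTH + cycles / F_EDGE   # upload delay + remote compute
    return ("edge", t_edge) if t_edge < t_local else ("local", t_local)

tasks = [(2.0e9, 1e6), (0.2e9, 8e6), (5.0e9, 0.5e6)]   # (cpu cycles, input size in bits)
for cycles, bits in tasks:
    place, latency = best_placement(cycles, bits)
    print(f"{cycles:.1e} cycles, {bits:.1e} bits -> {place} ({latency:.3f} s)")
```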

8.
Front Public Health ; 8: 599550, 2020.
Article in English | MEDLINE | ID: mdl-33330341

ABSTRACT

In this paper, a data mining model on a hybrid deep learning framework is designed to diagnose the medical condition of patients infected with the coronavirus disease 2019 (COVID-19) virus. The hybrid deep learning model, named the DeepSense method, is designed as a combination of a convolutional neural network (CNN) and a recurrent neural network (RNN), built as a series of layers that extract and classify features of COVID-19 infection in the lungs. Computed tomography images are used as the input data, and the classifier is designed to ease the classification process by learning the multidimensional input data through expert hidden layers. The model is validated against medical image datasets to predict infections using deep learning classifiers. The results show that the DeepSense classifier achieves higher accuracy than conventional deep learning and machine learning classifiers. The proposed method is validated on three different datasets with training splits of 70%, 80%, and 90%, and the results specifically characterize the quality of the diagnostic method adopted for predicting COVID-19 infection in a patient.


Subject(s)
COVID-19/diagnosis , COVID-19/physiopathology , Lung/diagnostic imaging , SARS-CoV-2/pathogenicity , Symptom Assessment/methods , Tomography, X-Ray Computed/methods , Algorithms , Deep Learning , Humans , Machine Learning , Neural Networks, Computer , Sensitivity and Specificity
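The DeepSense architecture itself is not specified here in enough detail to reproduce. The sketch below only shows one common way of combining convolutional and recurrent layers over a 2D image in Keras; the layer sizes, input shape, and class count are assumptions, not the paper's model.

```python
# Hedged sketch: a generic CNN + LSTM hybrid image classifier in Keras.
# Layer sizes and input shape are assumptions; this is not the paper's DeepSense.
from tensorflow.keras import layers, models

def build_hybrid(input_shape=(128, 128, 1), n_classes=2):
    inp = layers.Input(shape=input_shape)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inp)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.MaxPooling2D()(x)
    # Read the CNN feature map row by row so a recurrent layer can model spatial context.
    h, w, c = x.shape[1], x.shape[2], x.shape[3]
    x = layers.Reshape((h, w * c))(x)
    x = layers.LSTM(64)(x)
    out = layers.Dense(n_classes, activation="softmax")(x)
    return models.Model(inp, out)

model = build_hybrid()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```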
9.
Sensors (Basel) ; 20(18)2020 Sep 07.
Article in English | MEDLINE | ID: mdl-32906665

ABSTRACT

Monitoring which application, or type of application, is running on a computer or a cluster without violating the privacy of its users can be challenging, especially when we may not have operator access to these devices or specialized software. Smart grids and Internet of Things (IoT) devices can provide power consumption data for connected individual devices or groups of devices. This research attempts to provide insight into which applications are running based on the power consumption of the machines and clusters. It is therefore assumed that there is a correlation between electric power and the software application that is running. Additionally, it is believed that it is possible to create power consumption profiles for various software applications and even for normal and abnormal behavior (e.g., a virus). To achieve this, an experiment was organized to collect 48 h of continuous real power consumption data from two PCs that were part of a university computer lab. Data were collected with a one-second sample period, during class as well as idle time, from each machine and from their cluster. During the second half of the recording period, one of the machines was infected with a custom-made virus, allowing comparison of power consumption data before and after infection. The data were analyzed using different approaches: descriptive analysis, the F-test of two sample variances, two-way analysis of variance (ANOVA), and the autoregressive integrated moving average (ARIMA) model. The results show that it is possible to detect which type of application is running and whether an individual machine or its cluster is infected. Additionally, we can determine whether the lab is in use, making this research an ideal management tool for administrators.
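A minimal sketch of two of the analyses named above, the two-sample F-test of variances and an ARIMA fit, applied to synthetic per-second power traces. The traces, the significance interpretation, and the ARIMA order are assumptions, not the study's configuration or data.

```python
# Hedged sketch: F-test of two sample variances and an ARIMA fit on synthetic
# power traces. Model order (1, 0, 1) and the data are assumptions.
import numpy as np
from scipy import stats
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
before = 60 + rng.normal(0, 2.0, size=3600)   # 1 h of per-second watts, pre-infection
after = 65 + rng.normal(0, 3.5, size=3600)    # 1 h of per-second watts, post-infection

# Two-sample F-test: ratio of sample variances compared against the F distribution.
f_stat = np.var(before, ddof=1) / np.var(after, ddof=1)
p_value = 2 * min(stats.f.cdf(f_stat, len(before) - 1, len(after) - 1),
                  stats.f.sf(f_stat, len(before) - 1, len(after) - 1))
print(f"F = {f_stat:.3f}, two-sided p = {p_value:.4f}")

# ARIMA fit on the pre-infection trace; large residuals on new data can flag anomalies.
model = ARIMA(before, order=(1, 0, 1)).fit()
forecast = model.forecast(steps=60)            # predict the next minute of consumption
print("next-minute mean forecast:", forecast.mean())
```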

10.
Article in English | MEDLINE | ID: mdl-31905999

ABSTRACT

Algorithms for measuring semantic similarity between Gene Ontology (GO) terms have become a popular area of research in bioinformatics, as they can help detect functional associations between genes and potential impacts on the health and well-being of humans, animals, and plants. While research focuses on the design and improvement of GO semantic similarity algorithms, such algorithms still need to be implemented before they can be used to solve actual biological problems. This can be challenging given that the potential users usually come from a biology background and are not programmers. A number of implementations exist for some well-established algorithms, but these implementations are not generic enough to support any algorithm other than the ones they were designed for. The aim of this paper is to shift the focus away from implementation, allowing researchers to concentrate on algorithm design and execution rather than implementation. This is achieved by an implementation approach capable of understanding and executing user-defined GO semantic similarity algorithms, where questions and answers are used to define the user's algorithm. Additionally, the approach understands any directed acyclic graph in an Open Biomedical Ontologies (OBO)-like format, together with its annotations. Software developers of similar applications can also benefit by using this work as a template for their own applications.


Subject(s)
Algorithms , Gene Ontology , Semantics , Animals , Computational Biology , Humans , Software
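A minimal sketch of one well-established GO semantic similarity measure (Resnik similarity: the information content of the most informative common ancestor) over a toy is_a hierarchy. The ontology fragment and term frequencies are invented for illustration; this is not the generic, user-defined execution engine the paper describes.

```python
# Hedged sketch: Resnik-style similarity over a toy is_a DAG.
# The ontology fragment and term frequencies are invented for illustration only.
import math

# child -> set of parents (a tiny stand-in for an OBO-style is_a hierarchy)
PARENTS = {
    "GO:root": set(),
    "GO:A": {"GO:root"},
    "GO:B": {"GO:root"},
    "GO:A1": {"GO:A"},
    "GO:A2": {"GO:A"},
    "GO:B1": {"GO:A", "GO:B"},
}
# made-up annotation counts, used to estimate each term's information content
FREQ = {"GO:root": 100, "GO:A": 60, "GO:B": 50, "GO:A1": 20, "GO:A2": 15, "GO:B1": 10}
TOTAL = FREQ["GO:root"]

def ancestors(term):
    """All terms reachable through is_a links, including the term itself."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(PARENTS[t])
    return seen

def information_content(term):
    return -math.log(FREQ[term] / TOTAL)

def resnik(term_a, term_b):
    """Information content of the most informative common ancestor."""
    common = ancestors(term_a) & ancestors(term_b)
    return max(information_content(t) for t in common)

print(resnik("GO:A1", "GO:B1"))   # shared ancestor GO:A dominates here
```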