1.
2.
Life (Basel) ; 12(11)2022 Nov 19.
Article in English | MEDLINE | ID: mdl-36431068

ABSTRACT

Background and Objective: Coronary artery disease (CAD) is one of the most prevalent causes of death worldwide. Early diagnosis and timely medical care of cardiovascular patients can greatly reduce mortality and the cost of treatments associated with CAD. In this study, we propose a new model for early CAD diagnosis that works from clinical data alone, without any invasive procedure. Methods: Machine-learning (ML) techniques were applied to a CAD dataset known as Z-Alizadeh Sani for the early detection of CAD. Since this dataset has 54 features, Pearson correlation feature selection was used to identify the most effective ones. Six machine-learning techniques, including decision tree, deep learning, logistic regression, random forest, support vector machine (SVM), and XGBoost, were then employed within a semi-random-partitioning framework. Results: Applying Pearson feature selection to the dataset showed that only eight features were the most effective for CAD diagnosis. Running the six machine-learning models on the selected features showed that logistic regression and SVM had the same performance, with 95.45% accuracy, 95.91% sensitivity, 91.66% specificity, and a 96.90% F1 score. In addition, the ROC curve indicates a similar result regarding the AUC (0.98). Conclusions: Prediction is an important component of medical decision support systems. The results of the present study showed that feature selection has a high impact on machine-learning performance and that, regardless of the evaluation metrics of the models, determining the effective features is very important. Based on our selected features, SVM and logistic regression were designated as the best models.
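The Pearson-based selection step described above can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's actual dataset, code, or chosen features: rank each feature by the absolute value of its Pearson correlation with the target and keep the top k.

```python
import numpy as np

def pearson_select(X, y, k):
    """Rank features by |Pearson correlation| with the target and keep the top k."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    r = (Xc * yc[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((yc ** 2).sum())
    )
    return np.argsort(-np.abs(r))[:k]

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200).astype(float)        # binary "diagnosis" labels
noise = rng.normal(size=(200, 5))                     # irrelevant features
signal = y[:, None] + 0.1 * rng.normal(size=(200, 2)) # informative features
X = np.hstack([noise, signal])                        # columns 5 and 6 carry the signal

selected = pearson_select(X, y, k=2)
print(sorted(int(i) for i in selected))  # -> [5, 6]
```

On the real 54-feature dataset, k would be set so that the eight most correlated features survive, as the study reports.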

3.
Sensors (Basel) ; 22(20)2022 Oct 18.
Article in English | MEDLINE | ID: mdl-36298266

ABSTRACT

The number of unsecured and portable Internet of Things (IoT) devices in the smart industry is growing exponentially. A diversity of centralized and distributed platforms have been implemented to defend against security attacks; however, these platforms are insecure because of their low storage capacities, high power utilization, single points of failure, underutilized resources, and high end-to-end delay. Blockchain and Software-Defined Networking (SDN) are emerging technologies for creating a secure system and ensuring safe network connectivity. Blockchain technology offers a strong and trustworthy foundation for dealing with threats and problems, including safety, privacy, adaptability, scalability, and security. The integration of blockchain with SDN, which can provide efficient resource allocation and reduced latency to overcome the issues of industrial IoT networks, is still at the implementation stage. We propose an energy-efficient blockchain-integrated software-defined networking architecture for the Industrial IoT (IIoT) to overcome these challenges. We present a framework for implementing decentralized blockchain integrated with SDN for IIoT applications to achieve efficient energy utilization and cluster-head selection. Additionally, the blockchain-enabled distributed ledger ensures data consistency throughout the SDN controller network and keeps a record of the nodes enforced in the controller. The simulation results show that the proposed model achieves lower energy consumption, lower end-to-end latency, and higher overall throughput than existing works.
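The two mechanisms the abstract combines, energy-aware cluster-head selection and a hash-chained ledger that records it, can be sketched in a few lines. This is an illustrative toy, not the paper's architecture; the node fields and the energy-only selection rule are assumptions for the sketch.

```python
import hashlib
import json

def select_cluster_head(nodes):
    """Energy-efficient selection: the node with the highest residual energy leads."""
    return max(nodes, key=lambda n: n["energy"])

def append_block(chain, payload):
    """Append a block whose hash chains to the previous one (ledger consistency)."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "payload": payload}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    chain.append(block)
    return block

nodes = [{"id": "n1", "energy": 0.4}, {"id": "n2", "energy": 0.9}, {"id": "n3", "energy": 0.7}]
head = select_cluster_head(nodes)
chain = []
append_block(chain, {"event": "cluster_head", "node": head["id"]})
print(head["id"])  # -> n2
```

Recording the election on the chain is what lets every SDN controller verify the same cluster-head history without a central coordinator.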

4.
Micromachines (Basel) ; 13(9)2022 Sep 05.
Article in English | MEDLINE | ID: mdl-36144094

ABSTRACT

Predicting bearing failures is a vital component of machine health monitoring, since bearings are essential parts of rotary machines, particularly large motor machines. In addition, determining the degree of bearing degeneration will aid firms in scheduling maintenance. Maintenance engineers may be gradually supplanted by automated detection techniques in identifying motor issues as improvements in the extraction of useful information from vibration signals are made. State-of-the-art deep learning approaches, in particular, have made a considerable contribution to automatic defect identification. This research presents a novel approach for identifying bearing defects and their degree of degradation under variable shaft speed. In the proposed approach, vibration signals are pre-processed with the short-time Fourier transform (STFT) and represented as spectrograms, to which deep learning methods are applied. A convolutional neural network (CNN), VGG16, is then used to extract features and classify health status. After this, remaining useful life (RUL) prediction is carried out using regression. Explainable AI with LIME was used to identify the parts of the image that the CNN relies on to produce its output. According to numerous experiments, our proposed method achieves very high accuracy and robustness for bearing faults.
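The STFT pre-processing step, turning a vibration signal into the spectrogram image fed to the CNN, can be sketched with NumPy alone. This is a minimal illustration with assumed window and hop sizes, not the paper's actual pre-processing pipeline.

```python
import numpy as np

def stft_spectrogram(signal, win_len=64, hop=32):
    """Magnitude spectrogram via a Hann-windowed short-time Fourier transform."""
    window = np.hanning(win_len)
    frames = [signal[i:i + win_len] * window
              for i in range(0, len(signal) - win_len + 1, hop)]
    # rows = frequency bins, columns = time frames
    return np.abs(np.fft.rfft(np.stack(frames), axis=1)).T

fs = 1024                              # assumed sampling rate for the sketch
t = np.arange(fs) / fs
vib = np.sin(2 * np.pi * 60 * t)       # toy "vibration" with a 60 Hz component
spec = stft_spectrogram(vib)
print(spec.shape)                      # -> (33, 31)
peak_bin = int(np.argmax(spec.mean(axis=1)))  # near bin 60 * 64 / 1024 = 3.75
```

The resulting 2-D magnitude array is what gets resized and stacked into the three-channel image VGG16 expects.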

5.
Comput Intell Neurosci ; 2022: 1419360, 2022.
Article in English | MEDLINE | ID: mdl-35769276

ABSTRACT

In recent years, the Internet of Things (IoT) has been industrialized in various real-world applications, including the smart industry and smart grids, to make human existence more reliable. An overwhelming volume of sensing data is produced by numerous sensor devices as the Industrial IoT (IIoT) becomes more industrialized. Artificial Intelligence (AI) plays a vital part in big data analysis as a powerful analytic tool that provides flexible and reliable information insights in real time. However, there are difficulties in designing and developing a useful big data analysis tool using machine learning, such as a centralized approach, security, privacy, resource limitations, and a lack of sufficient training data. On the other hand, Blockchain promotes a decentralized architecture for IIoT applications. It enables the secure exchange of data and resources among the various nodes of the IoT network, removing centralized control and overcoming the industry's current challenges. The goal of our proposed approach is to design and implement a consensus mechanism that incorporates Blockchain and AI to allow successful big data analysis. This work presents an improved Delegated Proof of Stake (DPoS) algorithm-based IIoT network that combines Blockchain and AI for real-time data transmission. To accelerate IIoT block generation, nodes use the improved DPoS to reach a consensus for selecting delegates and store block information in the trading node. The proposed approach is evaluated in terms of energy consumption and transaction efficiency against existing consensus mechanisms. The evaluation results reveal that the proposed consensus algorithm reduces energy consumption and addresses current security issues.
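The core of any DPoS scheme, which the abstract's improved variant builds on, is stake-weighted delegate election followed by round-robin block production. The sketch below shows only that standard baseline with made-up candidates and stakes; the paper's actual improvements to DPoS are not reproduced here.

```python
from collections import Counter

def elect_delegates(votes, n_delegates):
    """Tally stake-weighted votes and return the top-n candidates as delegates."""
    tally = Counter()
    for voter_stake, candidate in votes:
        tally[candidate] += voter_stake
    return [c for c, _ in tally.most_common(n_delegates)]

def producer_for_slot(delegates, slot):
    """Elected delegates take turns producing blocks in round-robin order."""
    return delegates[slot % len(delegates)]

# (stake, candidate) pairs -- A: 50, B: 55, C: 40, D: 10
votes = [(50, "A"), (30, "B"), (40, "C"), (25, "B"), (10, "D")]
delegates = elect_delegates(votes, n_delegates=3)
print(delegates)                        # -> ['B', 'A', 'C']
print(producer_for_slot(delegates, 4))  # -> A
```

Because only the few elected delegates produce blocks, DPoS avoids the brute-force hashing of Proof of Work, which is what makes it attractive for energy-constrained IIoT nodes.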


Subject(s)
Internet of Things , Artificial Intelligence , Consensus , Conservation of Energy Resources , Humans , Industry
6.
Big Data ; 10(3): 186-203, 2022 06.
Article in English | MEDLINE | ID: mdl-34747652

ABSTRACT

In recent years, the growth of the internet of things (IoT) has been immense, and observation of its evolution needs to be carried out effectively. The development of the IoT has been broadly adopted in the construction of intelligent environments. There are various challenging IoT issues, such as routing messages, addressing, localizing nodes, data blending, etc. Extracting meaningful information from big data systems to construct a data-gathering setup in an IoT environment is also challenging. Among many viable data sources, the IoT is a rich big data source: various IoT nodes produce a massive quantity of data. Localization is one of the crucial problems that makes a significant impact within the IoT system. It needs to be addressed with proper and effective procedures to collect all sorts of data without noise. Numerous localization procedures and schemes have been proposed that deploy IoT sensors with wireless sensor networks for both indoor and outdoor environments. Achieving higher localization accuracy at lower cost for a large volume of data is considered a demanding task in the IoT sensor environment. Given the nature of the IoT, merging different technologies such as the internet, WiFi, etc., can aid diverse ways of acquiring information about various objects' locations. Location-based service is an exceptional service of the IoT, in which localization accuracy is a significant issue. The data generated from the sensors are available in both static and dynamic forms. In this article, a high-accuracy localization scheme for big data is proposed with an optimization approach that can effectively produce proper and effective outcomes for IoT environments. The theme of the article is to develop an enriched Swarm Intelligence algorithm based on the Artificial Bee Colony, using the Extended Kalman Filter (EKF) data-blending technique, to improve localization in the IoT for uncertain environments. The performance of the proposed algorithm is evaluated in terms of communication consumption and localization accuracy, together with its comparative advantage.
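The swarm-optimization half of such a scheme can be sketched as a bee colony minimizing range errors to known anchors. This toy omits the paper's enrichments and the EKF data-blending step entirely, and uses noise-free ranges and assumed anchor positions; it only illustrates the basic Artificial Bee Colony idea of neighbourhood search with greedy replacement.

```python
import numpy as np

def range_error(pos, anchors, ranges):
    """Localization objective: squared error between measured and implied ranges."""
    d = np.linalg.norm(anchors - pos, axis=1)
    return float(np.sum((d - ranges) ** 2))

def abc_localize(anchors, ranges, n_bees=30, iters=200, seed=0):
    """Toy artificial-bee-colony search: each bee keeps a candidate position and
    moves to a nearby trial point only when the trial improves the objective."""
    rng = np.random.default_rng(seed)
    bees = rng.uniform(0, 10, size=(n_bees, 2))  # candidate positions (food sources)
    for _ in range(iters):
        trial = bees + rng.normal(scale=0.3, size=bees.shape)  # neighbourhood search
        for i in range(n_bees):
            if range_error(trial[i], anchors, ranges) < range_error(bees[i], anchors, ranges):
                bees[i] = trial[i]  # greedy replacement
    return bees[np.argmin([range_error(b, anchors, ranges) for b in bees])]

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # assumed anchor layout
true_pos = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)  # noise-free for the sketch
est = abc_localize(anchors, ranges)
```

In the full scheme, the EKF would then fuse these position estimates over time to smooth out measurement noise in the dynamic case.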


Subject(s)
Big Data , Internet of Things , Algorithms , Information Storage and Retrieval
7.
Front Public Health ; 9: 821410, 2021.
Article in English | MEDLINE | ID: mdl-35004605

ABSTRACT

Over the last decade, the field of bioinformatics has been growing rapidly, and robust bioinformatics tools are going to play a vital role in future progress. Scientists working in the field conduct a large number of studies to extract knowledge from the available biological data. Several bioinformatics problems have emerged as a result of the creation of massive amounts of imbalanced data. The classification of precursor microRNA (pre-miRNA) from imbalanced RNA genome data is one such problem. Studies have shown that pre-miRNAs can serve as oncogenes or tumor suppressors in various cancer types. This paper introduces a Hybrid Deep Neural Network framework (H-DNN) for the classification of pre-miRNA in imbalanced data. The proposed H-DNN framework is an integration of Deep Artificial Neural Networks (Deep ANN) and Deep Decision Tree Classifiers. The Deep ANN in the proposed H-DNN helps to extract the meaningful features, and the Deep Decision Tree Classifier helps to classify the pre-miRNA accurately. H-DNN was tested on genomes of animals, plants, humans, and Arabidopsis with an imbalance ratio of up to 1:5000, and on viruses at a ratio of 1:400. Experimental results showed an accuracy of more than 99% in all cases, and the time complexity of the proposed H-DNN is also much lower than that of the other existing approaches.
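The key difficulty the framework addresses, classifying under extreme class imbalance, can be illustrated with a class-weighted decision stump standing in for the tree half of H-DNN. This sketch omits the Deep ANN feature extractor and uses one synthetic feature at a 1:500 imbalance; the weighting scheme is an assumption for the illustration, not the paper's method.

```python
import numpy as np

def fit_stump(feature, labels, weights):
    """Weighted decision stump: pick the threshold minimising weighted error."""
    best_err, best_thr = np.inf, None
    for t in np.unique(feature):
        pred = (feature >= t).astype(int)
        err = float(np.sum(weights * (pred != labels)))
        if err < best_err:
            best_err, best_thr = err, t
    return best_thr

rng = np.random.default_rng(1)
n_neg, n_pos = 5000, 10  # ~1:500 imbalance, in the spirit of the pre-miRNA setting
feature = np.concatenate([rng.normal(0, 1, n_neg), rng.normal(4, 1, n_pos)])
labels = np.concatenate([np.zeros(n_neg), np.ones(n_pos)]).astype(int)

# Give each class half the total weight so the rare positives are not ignored
weights = np.where(labels == 1, 0.5 / n_pos, 0.5 / n_neg)
thr = fit_stump(feature, labels, weights)
pred = (feature >= thr).astype(int)
recall = float(pred[labels == 1].mean())  # high despite only 10 positives
```

Without the reweighting, a classifier trained on such data can reach 99.8% accuracy by always predicting the majority class, which is exactly why accuracy alone is a poor guide in this setting.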


Subject(s)
MicroRNAs , Neural Networks, Computer , Animals , Computational Biology/methods , MicroRNAs/genetics