Results 1 - 6 of 6
1.
Front Big Data ; 7: 1349116, 2024.
Article in English | MEDLINE | ID: mdl-38638340

ABSTRACT

With the rapid growth of information and communication technologies, governments worldwide are embracing digital transformation to enhance service delivery and governance practices. In the rapidly evolving landscape of information technology (IT), secure data management stands as a cornerstone for organizations aiming to safeguard sensitive information. Robust data modeling techniques are pivotal in structuring and organizing data, ensuring its integrity, and facilitating efficient retrieval and analysis. As the world increasingly emphasizes sustainability, integrating eco-friendly practices into data management processes becomes imperative. This study focuses on the specific context of Pakistan and investigates the potential of cloud computing in advancing e-governance capabilities. Cloud computing offers scalability, cost efficiency, and enhanced data security, making it an ideal technology for digital transformation. Through an extensive literature review, analysis of case studies, and interviews with stakeholders, this research explores the current state of e-governance in Pakistan, identifies the challenges faced, and proposes a framework for leveraging cloud computing to overcome these challenges. The findings reveal that cloud computing can significantly enhance the accessibility, scalability, and cost-effectiveness of e-governance services, thereby improving citizen engagement and satisfaction. This study provides valuable insights for policymakers, government agencies, and researchers interested in the digital transformation of e-governance in Pakistan and offers a roadmap for leveraging cloud computing technologies in similar contexts. The findings contribute to the growing body of knowledge on e-governance and cloud computing, supporting the advancement of digital governance practices globally. This research identifies monitoring parameters necessary to establish a sustainable e-governance system incorporating big data and cloud computing. The proposed framework, Monitoring and Assessment System using Cloud (MASC), is validated through secondary data analysis and successfully fulfills the research objectives. By leveraging big data and cloud computing, governments can revolutionize their digital governance practices, driving transformative changes and enhancing efficiency and effectiveness in public administration.

2.
Sensors (Basel) ; 23(21)2023 Oct 27.
Article in English | MEDLINE | ID: mdl-37960453

ABSTRACT

Smart cities have emerged as a specialized domain encompassing various technologies, transitioning from civil engineering to technology-driven solutions. The accelerated development of technologies such as the Internet of Things (IoT), software-defined networks (SDN), 5G, artificial intelligence, cognitive science, and analytics has played a crucial role in providing solutions for smart cities. Smart cities heavily rely on devices, ad hoc networks, and cloud computing to integrate and streamline various activities towards common goals. However, the complexity arising from multiple cloud service providers offering myriad services necessitates a stable and coherent platform for sustainable operations. The Smart City Operational Platform Ecology (SCOPE) model has been developed to address these growing demands, and incorporates machine learning, cognitive correlates, ecosystem management, and security. SCOPE provides an ecosystem that establishes a balance for achieving sustainability and progress. In the context of smart cities, IoT devices play a significant role in enabling automation and data capture. This research paper focuses on a specific module of SCOPE, which deals with data processing and learning mechanisms for object identification in smart cities. Specifically, it presents a car parking system that utilizes smart identification techniques to identify vacant slots. The learning controller in SCOPE employs a two-tier approach, using two different models, namely AlexNet and YOLO, to ensure procedural stability and improvement.
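The abstract does not specify how detector output is mapped to slot vacancy, but the final decision step can be sketched with plain geometry. In this minimal sketch, the AlexNet/YOLO stage is stubbed out: `cars` stands in for detected vehicle bounding boxes, and `slots` for the known parking-slot rectangles (all coordinates are hypothetical); a slot is declared vacant when no detection overlaps it beyond a threshold.

```python
def iou(a, b):
    # intersection-over-union of two (x1, y1, x2, y2) boxes
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def vacant_slots(slots, detections, thr=0.3):
    # a slot is vacant if no detected car overlaps it above `thr`
    return [i for i, s in enumerate(slots)
            if all(iou(s, d) < thr for d in detections)]

# hypothetical layout: three slots, one detected car parked in slot 0
slots = [(0, 0, 10, 20), (12, 0, 22, 20), (24, 0, 34, 20)]
cars = [(1, 2, 9, 18)]
print(vacant_slots(slots, cars))  # → [1, 2]
```

The overlap threshold would in practice be tuned against camera geometry; the two-tier controller described in the paper sits upstream of this decision.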

3.
Comput Intell Neurosci ; 2022: 4515642, 2022.
Article in English | MEDLINE | ID: mdl-36238679

ABSTRACT

There is an increasing number of Internet of Things (IoT) devices connected to the network these days, and as technology advances, security threats and cyberattacks such as botnets are emerging and evolving rapidly into high-risk attacks. These attacks hinder IoT adoption by disrupting networks and services for IoT devices. Many recent studies have proposed machine learning (ML) and deep learning (DL) techniques for detecting and classifying botnet attacks in the IoT environment. This study proposes machine learning methods for binary classification, using the publicly available UNSW-NB15 dataset. The class imbalance in this dataset was resolved using the SMOTE oversampling technique. A complete machine learning pipeline is proposed, including exploratory data analysis, which provides detailed insights into the data, followed by preprocessing, during which the data passes through six fundamental steps. A decision tree, an XGBoost model, and a logistic regression model are proposed, trained, tested, and evaluated on the dataset. In addition to model accuracy, F1-score, recall, and precision are also considered. Across all experiments, the decision tree performed best, with 94% test accuracy.
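The core of such a pipeline — oversample the minority class, then train and score a tree — can be sketched as follows. This is not the paper's pipeline: UNSW-NB15 is replaced by synthetic two-class traffic features, and a minimal SMOTE (interpolating between a minority sample and one of its nearest minority neighbors) is implemented inline rather than taken from a library.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

def smote(X_min, n_new, k=5, rng=None):
    # minimal SMOTE: synthesize n_new points between minority samples
    if rng is None:
        rng = np.random.default_rng(0)
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        j = rng.choice(np.argsort(d)[1:k + 1])   # a near neighbor
        lam = rng.random()
        synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(synth)

# synthetic imbalanced stand-in for UNSW-NB15 (900 normal, 100 attack)
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0, 1, (900, 5)), rng.normal(3, 1, (100, 5))])
y = np.array([0] * 900 + [1] * 100)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# oversample the minority class until the training set is balanced
Xs = smote(Xtr[ytr == 1], (ytr == 0).sum() - (ytr == 1).sum(), rng=rng)
Xb = np.vstack([Xtr, Xs])
yb = np.concatenate([ytr, np.ones(len(Xs), int)])

clf = DecisionTreeClassifier(max_depth=5, random_state=0).fit(Xb, yb)
pred = clf.predict(Xte)
print(round(accuracy_score(yte, pred), 3), round(f1_score(yte, pred), 3))
```

On real data one would plug in the other two models (XGBoost, logistic regression) behind the same split and compare the four metrics the paper reports.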


Subject(s)
Machine Learning , Software , Data Analysis , Logistic Models
4.
Sensors (Basel) ; 22(16)2022 Aug 10.
Article in English | MEDLINE | ID: mdl-36015727

ABSTRACT

The digital transformation disrupts various professional domains in different ways, though one aspect is common: the unified platform known as cloud computing. Corporate solutions, IoT systems, analytics, business intelligence, and numerous other tools, solutions, and systems use cloud computing as a global platform. Migrations to the cloud are increasing, confronting it with new challenges and complexities. One of the essential segments is data storage. Data storage on the cloud is neither simplistic nor conventional; rather, it is becoming more and more complex due to the versatility and volume of data. This research develops a framework that provides a comprehensive solution for cloud storage based on replication; instead of formal recovery channels, erasure coding, which has proved a trustworthy mechanism for this job in the past, is proposed for recovery. The proposed framework takes a hybrid approach, combining the benefits of replication and erasure coding to attain an optimal storage solution focused specifically on reliability and recovery. Learning and training mechanisms were developed to provide dynamic structure building in the future and to test the data model. RAID architecture is used to formulate different configurations for the experiments: RAID-1 to RAID-6 are divided into two groups, with RAID-1 to RAID-4 in the first group and RAID-5 and RAID-6 in the second, further categorized by FTT, parity, failure range, and capacity. Reliability and recovery are evaluated for data at rest on the server side and for data in transit at the virtual level. The overall results show the significant impact of the proposed hybrid framework on cloud storage performance, with RAID-6c on the server side emerging as the best configuration. Mirroring for replication using RAID-6 and erasure coding for recovery work in complete coherence, providing good results for the current framework while highlighting interesting and challenging paths for future research.
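The recovery principle behind parity-based RAID and the simplest erasure codes can be shown in a few lines. This is a sketch of single-parity (RAID-5-style) recovery only, not the paper's hybrid framework: one parity block is the byte-wise XOR of the data blocks, so any single lost block is the XOR of the parity with the survivors. The block contents here are arbitrary placeholders.

```python
from functools import reduce

def xor_parity(blocks):
    # RAID-5-style parity: byte-wise XOR across equal-length data blocks
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*blocks))

def recover(surviving, parity):
    # XOR of the parity with the surviving blocks rebuilds the lost block
    return xor_parity(surviving + [parity])

data = [b"clouds__", b"storage_", b"replica_"]   # three 8-byte data blocks
p = xor_parity(data)

# simulate losing the middle block, then reconstruct it
rebuilt = recover([data[0], data[2]], p)
print(rebuilt)  # → b'storage_'
```

RAID-6 and general erasure codes extend this idea with a second, independent parity (e.g. Reed-Solomon over GF(2^8)) so that two simultaneous failures are survivable, which is what the paper's second experiment group exercises.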


Subject(s)
Cloud Computing , Information Storage and Retrieval , Computers , Reproducibility of Results
5.
Biomed Res Int ; 2022: 2636515, 2022.
Article in English | MEDLINE | ID: mdl-35707376

ABSTRACT

One of the most well-known methods for solving real-world and complex optimization problems is the gravitational search algorithm (GSA). However, GSA suffers from a sluggish convergence rate and weak local search capability when solving complicated optimization problems. To tackle this, a hybrid population-based strategy is designed that combines dynamic multi-swarm particle swarm optimization with the gravitational search algorithm (GSADMSPSO). In this manuscript, GSADMSPSO is used as a novel training technique for feedforward neural networks (FNNs) in order to test the algorithm's efficiency in reducing local-minima trapping and the poor convergence rate of existing evolutionary learning methods. GSADMSPSO distributes the primary population of masses into smaller subswarms and stabilizes them by offering a new neighborhood plan. Each agent (particle) then updates its position and velocity using the suggested algorithm's global search capability. The fundamental concept is to combine the abilities of GSA and DMSPSO to improve the algorithm's exploration and exploitation. The suggested algorithm's performance is compared against GSA and its variants on a range of well-known benchmark test functions. The experimental results suggest that the proposed method outperforms the other variants in convergence speed and in avoiding local minima while training FNNs.
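The position-and-velocity update that GSADMSPSO builds on is standard PSO; a minimal global-best PSO on a benchmark function illustrates it. This sketch omits everything the paper adds (GSA mass-based forces, dynamic subswarms, the neighborhood plan, FNN training) and uses the sphere function with hand-picked inertia and acceleration coefficients as assumptions.

```python
import numpy as np

def pso(f, dim=2, n=30, iters=200, seed=0):
    # minimal global-best PSO; GSADMSPSO layers GSA forces and
    # dynamic subswarms on top of updates like these
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))      # particle positions
    v = np.zeros((n, dim))                # particle velocities
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)   # personal best values
    g = pbest[pval.argmin()]              # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pval
        pbest[improved], pval[improved] = x[improved], fx[improved]
        g = pbest[pval.argmin()]
    return g, pval.min()

sphere = lambda z: float((z ** 2).sum())  # classic benchmark, minimum 0 at origin
best, val = pso(sphere)
print(val)  # should be close to 0
```

A GSA step would replace the `pbest`/`g` attraction terms with forces weighted by fitness-derived masses; the hybrid's claim is that mixing both update rules balances exploration and exploitation better than either alone.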


Subject(s)
Algorithms , Neural Networks, Computer , Biological Evolution , Computer Simulation , Gravitation
6.
Biomed Res Int ; 2022: 9809932, 2022.
Article in English | MEDLINE | ID: mdl-35711517

ABSTRACT

Many diseases affect the thyroid gland, such as hypothyroidism, hyperthyroidism, and thyroid cancer, and affect people all over the world; thyroid inefficiency can cause severe symptoms in patients. Effective classification with machine learning plays a significant role in the timely detection of thyroid diseases, which in turn affects the timely treatment of patients. Automatic and precise thyroid nodule detection in ultrasound images is critical for reducing effort and radiologists' error rate. Medical images have evolved into one of the most valuable and consistent data sources for machine learning. In this paper, various machine learning algorithms, such as decision tree, random forest, K-nearest neighbors (KNN), and artificial neural networks, are applied to the dataset in a comparative analysis to better predict the disease based on parameters established from the dataset. The dataset was also manipulated (resampled) for more accurate classification, and the classification was performed on both the sampled and unsampled datasets for comparison. After this dataset manipulation, the highest accuracy was obtained with the random forest algorithm: 94.8% accuracy and 91% specificity.
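The evaluation reported here — train a random forest, then read accuracy and specificity off the confusion matrix — can be sketched as below. The thyroid dataset is replaced by synthetic two-class features (the feature names and class separations are invented for illustration), so the printed numbers are not the paper's results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score

# synthetic stand-in for thyroid features (e.g. TSH, T3, TT4, age)
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 1, (400, 4)),    # class 0: healthy
               rng.normal(2, 1, (200, 4))])   # class 1: thyroid disease
y = np.array([0] * 400 + [1] * 200)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
pred = clf.predict(Xte)

tn, fp, fn, tp = confusion_matrix(yte, pred).ravel()
specificity = tn / (tn + fp)   # true-negative rate, as the paper reports
print(round(accuracy_score(yte, pred), 3), round(specificity, 3))
```

Swapping `RandomForestClassifier` for a decision tree, KNN, or an MLP behind the same split reproduces the comparative setup the abstract describes.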


Subject(s)
Machine Learning , Thyroid Nodule , Algorithms , Decision Trees , Humans , Neural Networks, Computer , Support Vector Machine , Thyroid Nodule/diagnostic imaging