1.
Soft comput ; 27(5): 2717-2727, 2023.
Article in English | MEDLINE | ID: mdl-34483721

ABSTRACT

A communicable disease pandemic is a severe disease outbreak that spreads across countries and continents. Swine flu, HIV/AIDS, and coronavirus disease 2019 (COVID-19) are examples of global pandemics. The major causes of a disease becoming pandemic are community transmission and a lack of social distancing. Recently, COVID-19 has been the largest such outbreak worldwide. It is a communicable disease that spreads rapidly through community transmission, in which infected people in a community infect healthy people. Governments impose social distancing in countries or states to control the impact of COVID-19. Social distancing can reduce the community transmission of COVID-19 by reducing the number of infected persons in an area. It is practised by staying at home and maintaining physical distance from other people, which lowers the density of people in an area and makes it difficult for the virus to spread from one person to another. In this work, community transmission is studied using simulations, which show how an infected person affects the healthy persons in an area and how social distancing can control the spread of COVID-19. The simulations are performed on the GNU Octave programming platform, with the number of infected persons and the number of healthy persons as parameters. Results show that with social distancing the number of infected persons can be reduced and the number of healthy persons increased. From the analysis it is therefore concluded that social distancing is an effective way to prevent community transmission.
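The abstract describes a simulation in GNU Octave; the minimal Python sketch below illustrates the same idea under assumed parameters (contact rate, transmission probability, and the distancing factor are all hypothetical, not taken from the paper): distancing scales down daily contacts, which reduces how many healthy people become infected.

```python
import random

def simulate(population=1000, initial_infected=5, steps=20,
             contacts_per_day=8, transmission_prob=0.1,
             distancing_factor=1.0, seed=42):
    """Toy community-transmission model. Each day every infected person
    meets some random people; distancing_factor < 1 scales contacts down.
    All parameters are illustrative assumptions."""
    random.seed(seed)
    infected = initial_infected
    healthy = population - initial_infected
    for _ in range(steps):
        effective_contacts = int(contacts_per_day * distancing_factor)
        new_cases = 0
        for _ in range(infected):
            for _ in range(effective_contacts):
                # the contact is a healthy person with probability healthy/population
                if random.random() < healthy / population:
                    if random.random() < transmission_prob:
                        new_cases += 1
        new_cases = min(new_cases, healthy)
        infected += new_cases
        healthy -= new_cases
    return infected, healthy

no_distancing = simulate(distancing_factor=1.0)    # full contact rate
with_distancing = simulate(distancing_factor=0.25) # 75% fewer contacts
print(no_distancing, with_distancing)
```

Running both scenarios with the same seed shows far fewer infected persons when the distancing factor is applied, matching the qualitative conclusion of the paper.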

2.
Multimed Tools Appl ; 81(29): 41995-42021, 2022.
Article in English | MEDLINE | ID: mdl-36090152

ABSTRACT

Coronavirus Disease-19 (COVID-19) is a major concern for the entire world in the current era. The coronavirus is a very dangerous infectious virus that spreads rapidly from person to person, growing in an exponential manner on a global scale. It affects the doctors, nurses, and other COVID-19 warriors who are actively involved in the treatment of COVID-19 infected (CI) patients. It is therefore essential to focus on automation and artificial intelligence (AI) in hospitals for the treatment of such infected patients, and everyone should be very careful to break the chain of transmission of this novel virus. In this paper, a novel patient service robot (PSR) assignment framework and a priority-based (PB) method using a fuzzy rule-based (FRB) approach are proposed for assigning PSRs to CI patients in hospitals, in order to protect both the COVID-19 warriors and the CI patients. This approach mainly focuses on lowering the active involvement of COVID-19 warriors in the treatment of high asymptotic COVID-19 infected (HACI) patients in this tough situation. In this work, we focus on HACI and low asymptotic COVID-19 infected (LACI) patients; higher priority is given to HACI patients than to LACI patients in order to increase their survival probability. The proposed method deals with situations that practically arise during the assignment of PSRs for the treatment of such patients. The simulation of the work is carried out using MATLAB R2015b.
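The paper's simulation was done in MATLAB and its actual fuzzy rules are not given in the abstract; the Python sketch below only illustrates the general shape of a fuzzy rule-based priority assignment. The membership ranges, the fever/oxygen inputs, and the patient data are all hypothetical assumptions for illustration.

```python
def fuzzy_high(value, low, high):
    """Linear 'high' membership: 0 at or below `low`, 1 at or above `high`."""
    if value <= low:
        return 0.0
    if value >= high:
        return 1.0
    return (value - low) / (high - low)

def severity(temp_c, spo2):
    """Toy fuzzy severity: OR (max) of a fever rule and a low-oxygen rule.
    Thresholds are illustrative, not from the paper."""
    fever = fuzzy_high(temp_c, 37.0, 40.0)
    hypoxia = fuzzy_high(100 - spo2, 2.0, 12.0)  # lower SpO2 -> higher membership
    return max(fever, hypoxia)

def assign_psrs(patients, num_robots):
    """Give the available patient service robots to the most severe patients first."""
    ranked = sorted(patients, key=lambda p: severity(p[1], p[2]), reverse=True)
    return [p[0] for p in ranked[:num_robots]]

# (patient_id, body temperature in Celsius, SpO2 %) -- hypothetical data
patients = [("P1", 39.5, 88), ("P2", 37.2, 97), ("P3", 38.5, 92)]
assigned = assign_psrs(patients, 2)
print(assigned)
```

With two robots available, the two patients whose fuzzy severity is highest receive them, which is the priority-over-arrival-order behaviour the abstract describes.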

3.
Neural Comput Appl ; 34(14): 11361-11382, 2022.
Article in English | MEDLINE | ID: mdl-33526959

ABSTRACT

Coronavirus disease-19 (COVID-19) is a very dangerous infectious disease for the entire world in the current scenario. The coronavirus spreads from one person to another very rapidly, growing exponentially throughout the globe, and everyone should be cautious to avoid spreading this novel disease. In this paper, a fuzzy rule-based approach using a priority-based method is proposed for the management of hospital beds for COVID-19 infected patients in the worst-case scenario, where the number of hospital beds is far smaller than the number of COVID-19 infected patients. This approach mainly attempts to minimize the number of hospital beds and emergency beds required for the treatment of COVID-19 infected patients in such a critical situation. In this work, higher priority is given to severe COVID-19 infected patients than to mild COVID-19 infected patients, so that the survival probability of the COVID-19 infected patients can be increased. The proposed method is compared with a first-come first-serve (FCFS) method to analyze the practical problems that arise during the assignment of hospital beds and emergency beds for the treatment of COVID-19 patients. The simulation of this work is carried out using MATLAB R2015b.
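The comparison the abstract draws between FCFS and priority-based bed assignment can be sketched in a few lines. This is a Python illustration of the general idea only (the paper's simulation is in MATLAB, and its actual rules are not in the abstract); the patient list is hypothetical.

```python
def fcfs_beds(arrivals, beds):
    """First-come first-serve: admit in arrival order until beds run out."""
    return [p["id"] for p in arrivals[:beds]]

def priority_beds(arrivals, beds):
    """Priority-based: severe patients are admitted first; Python's sort is
    stable, so ties keep arrival order."""
    ranked = sorted(arrivals, key=lambda p: 0 if p["severe"] else 1)
    return [p["id"] for p in ranked[:beds]]

# Hypothetical arrivals: a mild case arrives before two severe cases,
# and only two beds are available.
arrivals = [
    {"id": "A", "severe": False},
    {"id": "B", "severe": True},
    {"id": "C", "severe": True},
]
print(fcfs_beds(arrivals, 2), priority_beds(arrivals, 2))
```

FCFS gives a scarce bed to the mild case that happened to arrive first, while the priority method admits both severe cases, which is exactly the survival-probability argument made in the abstract.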

4.
Comput Intell Neurosci ; 2022: 6967938, 2022.
Article in English | MEDLINE | ID: mdl-36590844

ABSTRACT

Fog computing provides a multitude of services to end-based IoT systems. End IoT devices exchange information with fog nodes and the cloud to handle client tasks. During data collection between the fog layer and the cloud, there is a high chance of critical attacks such as DDoS and many other security attacks being launched through compromised IoT end devices. These network (NW) threats must be spotted early. Deep learning (DL) plays a prominent role in predicting end-user behaviour by extracting features and classifying the adversary in the network. However, because of the constrained computation and storage capacity of IoT devices, DL cannot be run on them directly. Here, a framework for fog-based attack detection is proposed, and different attacks are predicted using long short-term memory (LSTM). End IoT device behaviour can be predicted by installing a trained LSTMDL model in the computation module of the fog node. The simulations are performed in Python, comparing the LSTMDL model with a deep neural multilayer perceptron (DNMLP), bidirectional LSTM (Bi-LSTM), gated recurrent units (GRU), a hybrid ensemble model (HEM), and a hybrid deep learning model (CNN + LSTM) comprising a convolutional neural network (CNN) and LSTM, on the DDoS-SDN (Mendeley dataset), NSL-KDD, UNSW-NB15, and IoTID20 datasets. To evaluate the performance of the binary classifier, metrics such as accuracy, precision, recall, F1-score, and ROC-AUC curves are considered on these datasets. The LSTMDL model outperforms the others in binary classification, with accuracies of 99.70%, 99.12%, 94.11%, and 99.88% on the respective datasets. The network simulation further shows the communication behaviour detection time (CBDT) of the different DL models at the fog layer; DNMLP detects communication behaviour (CB) faster than the other models, but the LSTMDL model predicts attacks better.
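The binary-classifier metrics the abstract evaluates (accuracy, precision, recall, F1) have standard definitions that can be computed directly from true and predicted labels. The sketch below shows those definitions in plain Python; the label vectors are hypothetical, not from any of the named datasets.

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall and F1 for binary labels (1 = attack).
    Standard confusion-matrix definitions; guards avoid division by zero."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# Hypothetical labels: three attacks, two benign flows; one attack missed.
acc, prec, rec, f1 = binary_metrics([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])
print(acc, prec, rec, f1)
```

On this toy example precision is perfect (no false alarms) while recall drops because one attack was missed, which is why papers such as this one report all four metrics rather than accuracy alone.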


Subject(s)
Benchmarking , Communication , Humans , Computer Simulation , Data Collection , Intelligence
5.
PeerJ Comput Sci ; 7: e578, 2021.
Article in English | MEDLINE | ID: mdl-34239972

ABSTRACT

The traditional irrigation process consumes a huge amount of water, which leads to water wastage. To reduce the water wasted on this tedious task, an intelligent irrigation system is urgently needed. The era of machine learning (ML) and the Internet of Things (IoT) brings a great advantage: an intelligent system can be built that performs this task automatically with minimal human effort. In this study, an IoT-enabled, ML-trained recommendation system is proposed for efficient water usage with minimal intervention by farmers. IoT devices are deployed in the crop field to precisely collect ground and environmental details. The gathered data are forwarded to and stored in a cloud-based server, which applies ML approaches to analyze the data and suggest irrigation to the farmer. To make the system robust and adaptive, an inbuilt feedback mechanism is added to the recommendation system. The experimentation reveals that the proposed system performs quite well on our own collected dataset and on the National Institute of Technology (NIT) Raipur crop dataset.
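The abstract does not specify the ML model or the feedback rule, so the Python sketch below is only a minimal stand-in for the recommend-then-adapt loop it describes: a recommender with a soil-moisture threshold that nudges itself toward the farmer's decisions. The threshold values and update step are hypothetical.

```python
class IrrigationRecommender:
    """Toy recommendation loop with a farmer-feedback mechanism.
    Threshold and step size are illustrative assumptions."""

    def __init__(self, moisture_threshold=30.0):
        self.threshold = moisture_threshold  # soil moisture %, assumed unit

    def recommend(self, soil_moisture):
        """Suggest irrigation when the field is drier than the threshold."""
        return soil_moisture < self.threshold

    def feedback(self, soil_moisture, farmer_irrigated):
        """Adapt: nudge the threshold toward what the farmer actually did."""
        if farmer_irrigated and not self.recommend(soil_moisture):
            self.threshold += 1.0  # we were too conservative
        elif not farmer_irrigated and self.recommend(soil_moisture):
            self.threshold -= 1.0  # we recommended unnecessary watering

rec = IrrigationRecommender(30.0)
print(rec.recommend(25.0))  # dry field
rec.feedback(35.0, farmer_irrigated=True)  # farmer watered anyway
print(rec.threshold)
```

Over repeated seasons this kind of feedback moves the decision boundary toward local conditions, which is the "robust and adaptive" behaviour the study aims for; the actual system replaces the fixed rule with a trained ML model.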

6.
Int Sch Res Notices ; 2014: 653131, 2014.
Article in English | MEDLINE | ID: mdl-27433485

ABSTRACT

Selecting junctions intelligently for data transmission provides better intelligent transportation system (ITS) services. The main problem in vehicular communication is highly unstable link connectivity caused by vehicle mobility and low vehicle density. If link conditions are predicted earlier, there is less chance of performance degradation. In this paper, an intelligent junction selection based routing protocol (IJS) is proposed to transmit data along the quickest path, in which vehicles are mostly connected and link connectivity problems are minimal. In this protocol, a helping vehicle is placed at every junction to control communication by predicting link failures or network gaps in a route. The helping vehicle at a junction produces a score for every neighbouring junction, based on current traffic information, and forwards the data towards the destination through the junction with the minimum score. The IJS protocol is implemented and compared with the GyTAR, A-STAR, and GSR routing protocols. Simulation results show that IJS performs better in terms of average end-to-end delay, network gap encounters, and number of hops.
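The abstract's minimum-score junction selection can be sketched as follows. The actual scoring function of IJS is not given in the abstract, so the weights and the inputs (distance towards the destination and vehicle density on the connecting segment, as proxies for travel time and connectivity) are illustrative assumptions.

```python
def junction_score(distance_to_dest, vehicle_density, w_dist=0.6, w_conn=0.4):
    """Hypothetical score: lower is better. Shorter distance and denser
    (better-connected) traffic both lower the score."""
    connectivity_penalty = 1.0 / (1.0 + vehicle_density)
    return w_dist * distance_to_dest + w_conn * connectivity_penalty

def select_junction(neighbors):
    """Pick the neighbouring junction with the minimum score, as the
    helping vehicle in IJS does with its traffic information."""
    return min(neighbors, key=lambda j: junction_score(*neighbors[j]))

# neighbour id -> (distance to destination in km, vehicles on the segment)
neighbors = {
    "J1": (2.0, 15),  # farther but busy
    "J2": (1.5, 2),   # closest, sparse traffic
    "J3": (3.0, 40),  # farthest, very busy
}
print(select_junction(neighbors))
```

Here the closest junction wins despite its sparse traffic because the distance term dominates; with a heavier connectivity weight a denser but longer segment could win instead, which is the trade-off a junction-scoring protocol has to tune.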
