Results 1 - 5 of 5

1.
Physica A ; 557: 124991, 2020 Nov 01.
Article in English | MEDLINE | ID: mdl-32834435

ABSTRACT

In this article we want to show the potential of an evolutionary algorithm called Topological Weighted Centroid (TWC). This algorithm can obtain new and relevant information from extremely limited and poor datasets. In a world dominated by the concept of big (fat?) data, we want to show that it is possible, by necessity or by choice, to work profitably even on small data. This peculiarity of the algorithm means that even in the early stages of an epidemic process, when the data are too few for sufficient statistics, it is possible to obtain important information. To prove our theory, we addressed one of the most central issues of the moment: the COVID-19 epidemic. In particular, the cases recorded in Italy were selected, as Italy seems to have a central role in this epidemic because of the high number of measured infections. Through this innovative artificial intelligence algorithm, we have tried to analyze the evolution of the phenomenon and to predict its future steps using a dataset that contained only the geospatial coordinates (longitude and latitude) of the first recorded cases. After collecting the coordinates of the places where at least one case of contagion had been officially diagnosed up to February 26th, 2020, research and analysis were carried out on: the outbreak point and related heat map (TWC alpha); the probability distribution of the contagion on February 26th (TWC beta); the possible spread of the phenomenon in the immediate future and then in the future of the future (TWC gamma and TWC theta); and how this passage occurred in terms of paths and mutual influence (Theta paths and Markov Machine). Finally, a heat map of the possible situation towards the end of the epidemic, in terms of the infectiousness of the areas, was drawn up. The analyses with TWC confirm the assumptions made at the beginning.
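
As a rough illustration of the kind of computation this abstract works with, the sketch below builds a Gaussian-kernel heat map and a distance-weighted centroid from nothing but a handful of (longitude, latitude) pairs. It is a minimal placeholder for the idea, not the published TWC algorithm: the kernel choice, the bandwidth and the sample coordinates are assumptions made only for the example.

```python
# Minimal sketch of a heat map and a distance-weighted centroid computed from
# nothing but (longitude, latitude) pairs. This is an illustrative placeholder
# for the flavor of the analysis, NOT the published TWC algorithm; the Gaussian
# kernel, the bandwidth and the case coordinates below are assumptions.
import numpy as np

def heat_map(points, grid_lon, grid_lat, bandwidth=0.5):
    """Sum of Gaussian bumps (in degrees) centred on each case location."""
    lon, lat = np.meshgrid(grid_lon, grid_lat)
    density = np.zeros_like(lon)
    for px, py in points:
        density += np.exp(-((lon - px) ** 2 + (lat - py) ** 2) / (2 * bandwidth ** 2))
    return density / density.max()

def weighted_centroid(points, bandwidth=0.5):
    """Centroid whose weights decay with distance from the plain mean."""
    center = points.mean(axis=0)
    d2 = np.sum((points - center) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    return (w[:, None] * points).sum(axis=0) / w.sum()

# Hypothetical (lon, lat) case locations; real input would be the officially
# recorded cases up to February 26th, 2020 mentioned in the abstract.
cases = np.array([[9.19, 45.46], [9.67, 45.05], [11.12, 45.07], [9.70, 45.31]])
grid = heat_map(cases, np.linspace(8.5, 12.0, 200), np.linspace(44.5, 46.0, 200))
print(weighted_centroid(cases))   # a candidate "outbreak point" estimate
```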

2.
Comput Methods Programs Biomed ; 191: 105401, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32146212

ABSTRACT

BACKGROUND AND OBJECTIVE: Atrial fibrillation (AF) is the most common cardiac arrhythmia in clinical practice and has been recognized as a true cardiovascular epidemic. In this paper, a new methodology for computer-aided diagnosis of AF, based on a special kind of artificial adaptive system, is developed. METHODS: Following the extraction of data from the PhysioNet repository, a new dataset composed of the R/R distances of 73 patients was created. To avoid redundancy, the training set was created by randomly selecting 50% of the subjects from the entire sample, thus making the choice by patient and not by record. The remaining 50% of subjects were randomly split by record into testing and prediction sets. The original ECG data were transformed according to the following four orders of abstraction: a) sequence of R/R intervals; b) composition of the ECG data into a moving window; c) training of different machine learning systems to abstract the function governing AF; d) fuzzy transformation of the machine learning estimations. In parallel with the classic method of windowing, we propose a variant based on a system of progressive moving averages. RESULTS: The best performing machine learning system, the Supervised Contractive Map (SVCm), reached an overall mean accuracy of 95%. SVCm is a new deep neural network based on a principle different from the usual gradient descent: the error is minimized by means of a decomposition into contracted sine functions. CONCLUSIONS: In this research, atrial fibrillation is considered from a completely different point of view than in classical methods. It is seen as the stable process, i.e. the function, that manages the irregularity of the irregularities of the R/R intervals. The idea, therefore, is to abstract from mere physiology and to investigate fibrillation as a mathematical object that handles irregularities. The attained results seem to open new perspectives for the use of powerful artificial adaptive systems for the automatic detection of atrial fibrillation, with accuracy rates that are extremely promising for real-world applications.


Subject(s)
Atrial Fibrillation/diagnosis; Diagnosis, Computer-Assisted; Machine Learning; Algorithms; Databases, Factual; Humans
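
The abstract above contrasts a classic moving-window composition of the R/R sequence with a variant built on progressive moving averages. The sketch below shows one plausible reading of those two feature constructions; the window width, the spans and the alignment are assumptions for illustration, not the paper's implementation, and the SVCm classifier itself is not reproduced.

```python
# Sketch of the two windowing ideas mentioned in the abstract: composing the
# R/R interval sequence into fixed-length moving windows, and a variant built
# from moving averages over progressively longer spans. The window width, the
# spans and the alignment are assumptions for illustration only.
import numpy as np

def moving_windows(rr, width=20, step=1):
    """Stack overlapping windows of R/R intervals (seconds) into a matrix."""
    starts = np.arange(0, len(rr) - width + 1, step)
    return np.stack([rr[i:i + width] for i in starts])

def progressive_moving_averages(rr, spans=(2, 4, 8, 16)):
    """Moving averages over progressively longer spans, cut to a common length."""
    n = len(rr) - max(spans) + 1
    cols = [np.convolve(rr, np.ones(s) / s, mode="valid")[:n] for s in spans]
    return np.column_stack(cols)

# Usage with a synthetic R/R series (seconds); real input would come from the
# PhysioNet recordings cited in the abstract.
rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * rng.standard_normal(500)
X_windows = moving_windows(rr)                 # shape (481, 20)
X_averages = progressive_moving_averages(rr)   # shape (485, 4)
```
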
3.
Chaos ; 28(5): 055914, 2018 May.
Article in English | MEDLINE | ID: mdl-29857650

ABSTRACT

In this paper, we introduce an innovative approach to the fusion of datasets in terms of both attributes and observations, even when the datasets are not related at all. With our technique, starting from datasets representing independent worlds, it is possible to analyze a single global dataset and to transfer each dataset onto the others. This procedure allows a deeper perspective on a problem by offering the chance to look at it from other, independent points of view. Even unrelated datasets create a metaphoric representation of the problem, useful in terms of speed of convergence and predictive results, while preserving the fundamental relationships in the data. In order to extract such knowledge, we propose a new learning rule named double backpropagation, by which an auto-encoder concurrently codifies all the different worlds. We test our methodology on different datasets and different problems to underline the power and flexibility of the Theory of Impossible Worlds.
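
As a toy reading of the idea of one auto-encoder that concurrently codifies several independent worlds, the numpy sketch below trains a linear model whose shared code receives the backpropagated reconstruction error of two unrelated datasets at once. It is only an interpretation for illustration: the architecture, the learning rate and the random data are assumptions, and the paper's double backpropagation rule is not reproduced here.

```python
# Toy reading of one auto-encoder that codifies two independent "worlds" at
# once: the reconstruction errors of both datasets are backpropagated into a
# single shared code. Linear, numpy-only, with made-up data; this is an
# interpretation for illustration, not the paper's double backpropagation rule.
import numpy as np

rng = np.random.default_rng(0)
n, da, db, k = 200, 8, 5, 3                # observations, world sizes, code size (assumed)
Xa = rng.normal(size=(n, da))              # world A: one dataset's attributes
Xb = rng.normal(size=(n, db))              # world B: unrelated attributes

Wa = rng.normal(scale=0.1, size=(da, k))   # encoders into the shared code
Wb = rng.normal(scale=0.1, size=(db, k))
Va = rng.normal(scale=0.1, size=(k, da))   # decoders back into each world
Vb = rng.normal(scale=0.1, size=(k, db))

lr = 0.01
for _ in range(4000):
    Z = Xa @ Wa + Xb @ Wb                  # one shared code built from both worlds
    Ga = Z @ Va - Xa                       # reconstruction error, world A
    Gb = Z @ Vb - Xb                       # reconstruction error, world B
    dZ = (Ga @ Va.T + Gb @ Vb.T) / n       # both errors flow back into the code
    Va -= lr * (Z.T @ Ga) / n
    Vb -= lr * (Z.T @ Gb) / n
    Wa -= lr * Xa.T @ dZ
    Wb -= lr * Xb.T @ dZ

# "Transfer": project world A alone through the shared code into world B's space.
Xb_from_a = (Xa @ Wa) @ Vb
print(Xb_from_a.shape)                     # (200, 5)
```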

4.
Artif Intell Med ; 64(1): 59-74, 2015 May.
Article in English | MEDLINE | ID: mdl-25997573

ABSTRACT

OBJECTIVE: This paper proposes a new, complex algorithm for the blind classification of the original electroencephalogram (EEG) tracing of each subject, without any preliminary pre-processing. The medical need in this field is to reach an early differential diagnosis between subjects affected by mild cognitive impairment (MCI), early Alzheimer's disease (AD) and healthy elderly controls (CTR) using only the recording and analysis of a few minutes of their EEG. METHODS AND MATERIAL: This study analyzed the EEGs of 272 subjects, recorded at the Neurology Unit of the Policlinico Campus Bio-Medico in Rome. The EEG recordings were performed using 19 electrodes, in a 0.3-70 Hz bandpass, positioned according to the International 10-20 System. Many powerful learning machines and algorithms have been proposed during the last 20 years to resolve this complex problem, with different and interesting outcomes. Among these algorithms, a new artificial adaptive system, named implicit function as squashing time (I-FAST), is able to diagnose with high accuracy, from a few minutes of the subject's EEG track, whether it manifests an AD, MCI or CTR condition. An update of this system, obtained by adding a new algorithm named multi-scale ranked organizing maps (MS-ROM) to I-FAST, is presented in order to classify the unprocessed EEGs of AD, MCI and control subjects with greater accuracy. RESULTS: The proposed system was evaluated on three independent pattern recognition tasks using unprocessed EEG tracks from a sample of AD subjects, MCI subjects and CTR: (a) AD compared with CTR; (b) AD compared with MCI; (c) CTR compared with MCI. While the accuracy of the previous system in distinguishing between AD and MCI was around 92%, the new proposed system reaches values between 94% and 98%. Similarly, the overall accuracy with the best artificial neural networks (ANNs) is 98.25% for distinguishing between CTR and MCI. CONCLUSIONS: This new version of I-FAST makes several steps forward: (a) it avoids the pre-processing and filtering of EEG data, since the algorithm is able to process an unprocessed EEG directly; (b) it eliminates noise through a training variant with input selection and a testing system based on a naïve Bayes classifier; (c) it provides a more robust classification phase, showing the stability of the results across nine well-known learning machine algorithms; (d) it extracts the spatial invariants of an EEG signal using, in addition to the unsupervised ANN, principal component analysis and multi-scale entropy together with the MS-ROM, yielding a more accurate performance in this specific task.


Subject(s)
Alzheimer Disease/diagnosis; Pattern Recognition, Automated/methods; Bayes Theorem; Cognitive Dysfunction/diagnosis; Diagnosis, Differential; Electroencephalography/methods; Female; Humans; Male; Neural Networks, Computer
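
Among the ingredients listed in the abstract above, multi-scale entropy is a standard, well-documented computation. The sketch below gives a compact sample-entropy and coarse-graining implementation with common default parameters (m = 2, r = 0.2 times the signal standard deviation, five scales); these settings and the synthetic one-channel trace are assumptions for the example, not the paper's choices, and I-FAST and MS-ROM themselves are not reproduced.

```python
# Compact multi-scale (sample) entropy sketch: one of the ingredients the
# abstract lists for extracting invariants from an EEG signal. Parameters
# (m = 2, r = 0.2 * std, five scales) are common defaults, not the paper's
# settings; the synthetic trace stands in for a real EEG channel.
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D signal (plain, unoptimized implementation)."""
    x = np.asarray(x, dtype=float)
    r = 0.2 * x.std() if r is None else r
    def match_pairs(length):
        # ordered pairs of distinct templates closer than r (Chebyshev distance)
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d < r) - len(t)          # drop the self-matches
    b, a = match_pairs(m), match_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6), m=2):
    """Sample entropy of coarse-grained (block-averaged) copies of the signal."""
    x = np.asarray(x, dtype=float)
    values = []
    for tau in scales:
        n = (len(x) // tau) * tau
        coarse = x[:n].reshape(-1, tau).mean(axis=1)
        values.append(sample_entropy(coarse, m=m))
    return np.array(values)

# Usage on a synthetic one-channel trace (a real pipeline would loop over the
# 19 electrodes mentioned in the abstract):
rng = np.random.default_rng(1)
trace = np.cumsum(rng.standard_normal(600)) * 0.1 + rng.standard_normal(600)
print(multiscale_entropy(trace))
```
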
5.
Microsc Microanal ; 19(4): 988-95, 2013 Aug.
Article in English | MEDLINE | ID: mdl-23721654

ABSTRACT

Duplex stainless steels (DSS) may be defined as a category of steels with a two-phase ferritic-austenitic microstructure, which combines good mechanical and corrosion properties. However, these steels can undergo significant microstructural modification as a consequence of either thermo-mechanical treatments (ferrite decomposition, which causes σ- and χ-phase formation and nitride precipitation) or plastic deformation at room temperature (transformation of austenite into strain-induced martensite, SIM). These secondary phases noticeably affect the properties of DSS and are therefore of huge industrial interest. In the present work, SIM formation was investigated in a 2101 lean DSS. The material was subjected to cold rolling at various degrees of deformation (from 10 to 80% thickness reduction), and the microstructure developed after plastic deformation was investigated by electron backscatter diffraction, X-ray diffraction measurements, and hardness and magnetic tests. SIM formed as a consequence of deformations higher than ~20%, while residual austenite was still present at 80% thickness reduction. Furthermore, a direct relationship was found between microstructure and magnetic properties.
