An incremental neural network with a reduced architecture.
Neural Netw; 35: 70-81, 2012 Nov.
Article in English | MEDLINE | ID: mdl-22954480
This paper proposes a technique, called Evolving Probabilistic Neural Network (ePNN), that presents several interesting features: incremental learning, an evolving architecture, the capacity to learn continually throughout its existence, and the requirement that each training sample be used only once in the training phase, without reprocessing. A series of experiments was performed on public-domain data sets; the results indicate that ePNN is superior or equal to the other incremental neural networks evaluated in this paper. These results also demonstrate the advantage of the small ePNN architecture and show that it is more stable than the architectures of the other incremental neural networks evaluated. ePNN thus appears to be a promising alternative for a quick-learning system and a fast classifier with a low computational cost.
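The abstract highlights single-pass, incremental learning with an architecture that evolves as samples arrive. The record contains no code, so the sketch below is only an illustrative Python analogue of those properties (prototype units with Gaussian activations, a new unit created when no nearby unit of the same class exists); it is not the authors' ePNN algorithm, and the class name, the threshold tau, and the kernel width sigma are assumptions introduced for illustration.

# Illustrative sketch only: single-pass incremental learning with an evolving
# set of prototype units and PNN-style Gaussian activations. This is NOT the
# ePNN algorithm from the paper; `tau` and `sigma` are assumed parameters.
import numpy as np

class IncrementalPrototypeClassifier:
    def __init__(self, tau=1.0, sigma=0.5):
        self.tau = tau        # distance threshold for spawning a new unit
        self.sigma = sigma    # Gaussian kernel width
        self.units = []       # each unit: [center, label, sample_count]

    def partial_fit(self, x, y):
        """Consume one training sample exactly once (no reprocessing)."""
        x = np.asarray(x, dtype=float)
        # Nearest existing unit of the same class, if any.
        candidates = [(np.linalg.norm(x - u[0]), u) for u in self.units if u[1] == y]
        if candidates:
            d, unit = min(candidates, key=lambda t: t[0])
            if d <= self.tau:
                # Absorb the sample into the unit via a running mean.
                unit[2] += 1
                unit[0] = unit[0] + (x - unit[0]) / unit[2]
                return
        # Otherwise the architecture evolves: add a new prototype unit.
        self.units.append([x, y, 1])

    def predict(self, x):
        """Classify by summing Gaussian activations per class."""
        x = np.asarray(x, dtype=float)
        scores = {}
        for center, label, _ in self.units:
            act = np.exp(-np.sum((x - center) ** 2) / (2 * self.sigma ** 2))
            scores[label] = scores.get(label, 0.0) + act
        return max(scores, key=scores.get)

A data stream would be processed by calling partial_fit(x, y) once per sample and predict(x) at any time. In this sketch the units only accumulate, whereas the paper emphasizes keeping the architecture small and stable.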
Collections: 01-internacional
Database: MEDLINE
Main subject: Algorithms / Artificial Intelligence / Neural Networks, Computer / Fuzzy Logic / Nerve Net
Language: En
Journal: Neural Netw
Journal subject: Neurology
Year of publication: 2012
Document type: Article
Country of affiliation: Brazil
Country of publication: United States