Results 1 - 4 of 4
1.
IEEE Trans Neural Netw Learn Syst ; 24(4): 673-8, 2013 Apr.
Article in English | MEDLINE | ID: mdl-24808387

ABSTRACT

Ensemble pruning aims to increase efficiency by reducing the number of base classifiers, without sacrificing, and preferably while enhancing, performance. In this brief, a novel pruning paradigm is proposed. Two-class supervised learning problems are pruned using a combination of first- and second-order Walsh coefficients. A comparison is made with other ordered-aggregation pruning methods, using multilayer perceptron base classifiers. The Walsh pruning method is analyzed with the help of a model that shows the relationship between second-order coefficients and the added classification error with respect to the Bayes error.
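The first- and second-order Walsh coefficients referred to above can be estimated directly from the base-classifier decisions once both decisions and targets are mapped to ±1. A minimal sketch (the function and variable names are illustrative, not from the paper):

```python
import numpy as np

def walsh_coefficients(decisions, targets):
    """Estimate first- and second-order Walsh coefficients of the
    Boolean mapping from base-classifier decisions to the target label.

    decisions : (n_patterns, n_classifiers) array of {0, 1} votes
    targets   : (n_patterns,) array of {0, 1} class labels
    """
    X = 2 * decisions - 1          # map {0, 1} -> {-1, +1}
    y = 2 * targets - 1
    n = X.shape[0]
    first = X.T @ y / n            # w_i  ~ E[x_i * y]
    second = (X.T * y) @ X / n     # w_ij ~ E[x_i * x_j * y]
    return first, second

# toy example: 4 patterns, 2 base classifiers
dec = np.array([[1, 0], [1, 1], [0, 0], [1, 1]])
tgt = np.array([1, 1, 0, 1])
w1, w2 = walsh_coefficients(dec, tgt)
```

A large |w_i| indicates a base classifier strongly correlated with the target, while the w_ij terms capture pairwise structure; an ordered-aggregation pruner could rank classifiers on a combination of the two.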

2.
IEEE Trans Neural Netw ; 22(8): 1334-9, 2011 Aug.
Article in English | MEDLINE | ID: mdl-21813360

ABSTRACT

Two-class supervised learning in the context of a classifier ensemble may be formulated as learning an incompletely specified Boolean function, and the associated Walsh coefficients can be estimated without knowledge of the unspecified patterns. Using an extended version of the Tumer-Ghosh model, the relationship between added classification error and second-order Walsh coefficients is established. In this brief, the ensemble is composed of multilayer perceptron base classifiers, with the number of hidden nodes and training epochs systematically varied. Experiments demonstrate that the mean second-order coefficients peak at the same number of training epochs as that at which the ensemble test error reaches its minimum.


Subject(s)
Algorithms; Pattern Recognition, Automated/methods; Databases, Factual/classification
3.
IEEE Trans Neural Netw ; 22(6): 988-94, 2011 Jun.
Article in English | MEDLINE | ID: mdl-21606023

ABSTRACT

A feature ranking scheme for multilayer perceptron (MLP) ensembles is proposed, along with a stopping criterion based upon the out-of-bootstrap estimate. To solve multi-class problems, feature ranking is combined with modified error-correcting output coding. Experimental results on benchmark data demonstrate the versatility of the MLP base classifier in removing irrelevant features.
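The out-of-bootstrap estimate rests on the fact that a bootstrap sample of n patterns, drawn with replacement, leaves roughly a third of the patterns out; those left-out patterns form a validation set for the base classifier trained on that sample. A minimal sketch of the index bookkeeping (variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                                    # number of training patterns
boot = rng.integers(0, n, size=n)         # bootstrap sample, drawn with replacement
oob = np.setdiff1d(np.arange(n), boot)    # out-of-bootstrap patterns

# each base classifier would be trained on the patterns indexed by `boot`
# and scored on those indexed by `oob`; averaging the scores across the
# ensemble yields an error estimate usable as a stopping criterion
```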


Subject(s)
Algorithms; Models, Theoretical; Neural Networks, Computer; Pattern Recognition, Automated/methods; Computer Simulation
4.
IEEE Trans Neural Netw ; 17(5): 1194-211, 2006 Sep.
Article in English | MEDLINE | ID: mdl-17001981

ABSTRACT

The difficulties of tuning the parameters of multilayer perceptron (MLP) classifiers are well known. In this paper, a measure is described that is capable of predicting the number of training epochs required for optimal performance in an ensemble of MLP classifiers. The measure is computed between pairs of patterns on the training data and is based on a spectral representation of a Boolean function. This representation characterizes the mapping from classifier decisions to target label and allows accuracy and diversity to be incorporated within a single measure. Results on many benchmark problems, including the Olivetti Research Laboratory (ORL) face database, demonstrate that the measure is well correlated with base-classifier test error and may be used to predict the optimal number of training epochs. While the correlation with ensemble test error is not quite as strong, it is shown that the measure may also be used to predict the number of epochs for optimal ensemble performance. Although the technique is only applicable to two-class problems, it is extended here to multiclass problems through output coding. For the output-coding technique, a random code matrix is shown to give better performance than a one-per-class code, even when the base classifier is well tuned.
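The output-coding extension assigns each class a binary codeword (a row of the code matrix); each column defines a two-class problem for one base classifier, and a test pattern is decoded to the class whose codeword is nearest in Hamming distance to the ensemble's bit predictions. A minimal sketch, with a small fixed code matrix standing in for the randomly generated one used in the paper:

```python
import numpy as np

# rows are class codewords; the paper draws the matrix at random,
# but a fixed toy matrix keeps this demo deterministic
code = np.array([[0, 0, 0, 0, 0],
                 [1, 1, 1, 0, 0],
                 [0, 0, 1, 1, 1],
                 [1, 1, 0, 1, 1]])

def decode(bits, code):
    # assign the class whose codeword is nearest in Hamming distance
    return int(np.argmin((code != bits).sum(axis=1)))

# bit predictions one flip away from class 1's codeword still decode to class 1
pred = decode(np.array([1, 1, 1, 0, 1]), code)
```

The error-correcting property comes from the spacing between rows: as long as the number of bit errors is below half the minimum Hamming distance between codewords, decoding recovers the correct class.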


Subject(s)
Algorithms; Models, Theoretical; Neural Networks, Computer; Pattern Recognition, Automated/methods; Artificial Intelligence; Cluster Analysis; Computer Simulation; Computing Methodologies