IEEE Trans Neural Netw ; 11(3): 668-79, 2000.
Article in English | MEDLINE | ID: mdl-18249794

ABSTRACT

A novel neural-network-based technique, called "data strip mining," extracts predictive models from data sets that have a large number of potential inputs and comparatively few data points. The methodology uses neural network sensitivity analysis to determine which predictors are most significant for the problem. Neural network sensitivity analysis holds all but one input to a trained neural network constant while varying that input over its entire range to determine its effect on the output. The least sensitive variables are iteratively removed from the input set. At each iteration, model cross-validation uses multiple splits of training and validation data to estimate the model's ability to predict the output for data points not used during training. Eliminating variables through neural network sensitivity analysis while gauging performance through model cross-validation allows the analyst to reduce the number of inputs and improve the model's predictive ability at the same time. This paper illustrates the technique on a cartoon problem from classical physics, then demonstrates its effectiveness on a pair of challenging problems from combinatorial chemistry, each with over 400 potential inputs. For these data sets, variable selection by neural sensitivity analysis outperformed other variable selection methods, including forward selection and a genetic algorithm.
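The core of the sensitivity step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the tiny fixed-weight network standing in for a trained model, and the helper names `predict` and `sensitivity`, are assumptions for the example; the technique only requires some trained model exposing a forward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3 inputs; the third is irrelevant by construction.
X = rng.uniform(-1.0, 1.0, size=(200, 3))

# Fixed weights standing in for a trained network (hypothetical example).
W1 = np.array([[1.5, -0.5],
               [0.8,  1.2],
               [0.0,  0.0]])   # zero row: input 2 cannot affect the output
W2 = np.array([1.0, -1.0])

def predict(X):
    """Stand-in for a trained neural network's forward pass."""
    return np.tanh(X @ W1) @ W2

def sensitivity(predict, X, i, n_steps=50):
    """Hold every input at its mean except input i, sweep input i over
    its observed range, and return the resulting output range."""
    probe = np.tile(X.mean(axis=0), (n_steps, 1))
    probe[:, i] = np.linspace(X[:, i].min(), X[:, i].max(), n_steps)
    y = predict(probe)
    return y.max() - y.min()

# One pass of the iterative loop: score every input, flag the least
# sensitive one as the candidate for removal, then (in the full method)
# retrain and re-estimate performance via cross-validation.
scores = [sensitivity(predict, X, i) for i in range(X.shape[1])]
least = int(np.argmin(scores))
print(scores, least)
```

In the full procedure this scoring step alternates with cross-validated retraining until removing further inputs degrades the validation estimate.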
