Improving dimensionality reduction with spectral gradient descent.
Neural Netw; 18(5-6): 702-10, 2005.
Article in En | MEDLINE | ID: mdl-16112551
We introduce spectral gradient descent, a way of improving iterative dimensionality reduction techniques. The method uses information contained in the leading eigenvalues of a data affinity matrix to modify the steps taken during a gradient-based optimization procedure. We show that the approach is able to speed up the optimization and to help dimensionality reduction methods find better local minima of their objective functions. We also provide an interpretation of our approach in terms of the power method for finding the leading eigenvalues of a symmetric matrix and verify the usefulness of the approach in some simple experiments.
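As a rough illustration of the two ingredients the abstract mentions (the power method for finding the leading eigenpair of a symmetric affinity matrix, and using that spectral information to modify gradient steps), here is a minimal Python sketch. The affinity matrix `W`, the embedding `Y`, the mixing coefficient, and the specific update rule are assumptions made for illustration only; the paper's actual modification of the optimization step is not reproduced here.

```python
import numpy as np

def power_iteration(A, num_iters=100, tol=1e-8):
    """Leading eigenvalue/eigenvector of a symmetric matrix A via the power method."""
    v = np.random.default_rng(0).standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(num_iters):
        w = A @ v
        lam_new = v @ w          # Rayleigh quotient estimate of the eigenvalue
        v = w / np.linalg.norm(w)
        if abs(lam_new - lam) < tol:
            break
        lam = lam_new
    return lam, v

def spectral_step(Y, grad, W, lr=0.1):
    """Hypothetical update: mix the plain gradient step with the component of the
    gradient aligned with the leading eigenvector of the affinity matrix W.
    (Illustrative choice of mixing rule, not the paper's formulation.)"""
    lam, v = power_iteration(W)
    proj = np.outer(v, v) @ grad          # gradient component along the leading eigen-direction
    return Y - lr * (grad + (lam / (1.0 + lam)) * proj)
```

In practice `grad` would come from differentiating the dimensionality-reduction objective (e.g. a stress or divergence cost) with respect to the current embedding `Y` at each iteration.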
Collection: 01-internacional
Database: MEDLINE
Main subject: Artificial Intelligence / Data Interpretation, Statistical
Type of study: Prognostic_studies / Risk_factors_studies
Language: En
Journal: Neural Netw
Journal subject: Neurology
Year: 2005
Document type: Article
Affiliation country: Canada
Country of publication: United States