1.
Learning Eigenfunctions Links Spectral Embedding and Kernel PCA.
Neural Comput; 16(10): 2197-2219, 2004 Oct.
Article in English | MEDLINE | ID: mdl-15333211

ABSTRACT

In this letter, we show a direct relation between spectral embedding methods and kernel principal components analysis and how both are special cases of a more general learning problem: learning the principal eigenfunctions of an operator defined from a kernel and the unknown data-generating density. Whereas spectral embedding methods provided only coordinates for the training points, the analysis justifies a simple extension to out-of-sample examples (the Nyström formula) for multidimensional scaling (MDS), spectral clustering, Laplacian eigenmaps, locally linear embedding (LLE), and Isomap. The analysis provides, for all such spectral embedding methods, the definition of a loss function, whose empirical average is minimized by the traditional algorithms. The asymptotic expected value of that loss defines a generalization performance and clarifies what these algorithms are trying to learn. Experiments with LLE, Isomap, spectral clustering, and MDS show that this out-of-sample embedding formula generalizes well, with a level of error comparable to the effect of small perturbations of the training set on the embedding.
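
As a concrete illustration of the out-of-sample extension described in the abstract, the sketch below applies the Nyström formula in the kernel PCA case: for a new point x, the k-th embedding coordinate is y_k(x) = (1/sqrt(lambda_k)) * sum_i v_ik K~(x, x_i), where (lambda_k, v_k) are the top eigenpairs of the centered training Gram matrix K~. This is a minimal Python/NumPy sketch under stated assumptions; the Gaussian kernel, the bandwidth sigma, and all function names are illustrative choices, not details taken from the article, whose general formulation covers a data-dependent kernel for each spectral method.

import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian kernel values between rows of A and rows of B.
    # (Assumed kernel; the paper's analysis applies to other kernels too.)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_embedding(X, dim=2, sigma=1.0):
    # Eigendecompose the doubly centered Gram matrix of the training set.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    lam, V = np.linalg.eigh(H @ K @ H)
    top = np.argsort(lam)[::-1][:dim]        # keep the top 'dim' eigenpairs
    return K, lam[top], V[:, top]

def nystrom_embed(x_new, X, K, lam, V, sigma=1.0):
    # Nystrom formula: center the new point's kernel row consistently with
    # the training Gram matrix, then project it onto the eigenvectors,
    # scaled by 1/sqrt(lambda_k).
    k_x = gaussian_kernel(x_new[None, :], X, sigma).ravel()
    k_c = k_x - k_x.mean() - K.mean(axis=0) + K.mean()
    return (k_c @ V) / np.sqrt(lam)

# Consistency check mirroring the abstract's claim: applying the formula to
# a training point reproduces that point's training embedding.
X = np.random.RandomState(0).randn(100, 3)
K, lam, V = fit_embedding(X)
y_train = V * np.sqrt(lam)                  # training-set coordinates
y_0 = nystrom_embed(X[0], X, K, lam, V)     # matches y_train[0]

The same projection pattern extends, with method-specific kernels, to the MDS, spectral clustering, Laplacian eigenmaps, LLE, and Isomap cases treated in the article.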


Subject(s)
Algorithms , Artificial Intelligence , Learning/physiology , Models, Statistical , Neural Networks, Computer , Cluster Analysis , Generalization, Psychological , Humans