Many settings of unsupervised learning can be viewed as quantization problems: the minimization of the expected quantization error subject to some restrictions. This view allows tools from the theory of (supervised) risk minimization, such as regularization, to be applied in unsupervised settings. Moreover, it is closely related to both principal curves and the generative topographic mapping. We explore this connection in two ways: (1) we propose an algorithm for finding principal manifolds that can be regularized in a variety of ways, and experimental results demonstrate the feasibility of the approach; (2) we derive uniform convergence bounds and hence bounds on the learning rates of the algorithm. In particular, we give good bounds on the covering numbers, which allows us to obtain a nearly optimal learning rate of order O(m^{-1/2+α}) for certain types of regularization operators, where m is the sample size and α is an arbitrary positive constant.
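As a minimal sketch of the quantization view the abstract starts from, the snippet below minimizes the empirical quantization error for the simplest case: a finite codebook with no regularizer, i.e. plain k-means via Lloyd iterations. This is only an illustration of the special case, not the paper's regularized principal-manifold algorithm; the function names and the stopping rule (a fixed iteration count) are my own choices.

```python
import numpy as np

def quantization_error(X, codebook):
    """Empirical quantization error: mean squared distance of each
    sample to its nearest codebook vector."""
    d = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d.min(axis=1).mean()

def lloyd(X, k, n_iter=50, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment and
    centroid recomputation, monotonically decreasing the error."""
    rng = np.random.default_rng(seed)
    codebook = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        d = ((X[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(axis=1)
        for j in range(k):
            members = X[assign == j]
            if len(members):           # keep empty clusters unchanged
                codebook[j] = members.mean(axis=0)
    return codebook
```

Replacing the finite codebook by a smoothly parametrized map from a low-dimensional latent space into the data space, and adding a regularization term on that map, yields the principal-manifold setting the abstract describes.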