In practical applications such as machine learning, data mining, and pattern recognition, one commonly deals with noisy data lying near some low-dimensional manifold. A well-established tool for extracting the intrinsically low-dimensional structure from such data is principal component analysis (PCA). Owing to the inherent limitations of this linear method, its extensions to the extraction of nonlinear structures have attracted increasing research interest in recent years. Assuming a generative model for noisy data, we develop a probabilistic approach for separating the data-generating nonlinear functions from noise. We demonstrate that ridges of the marginal density induced by the model are viable estimators of the generating functions. To project a given point onto a ridge of its estimated marginal density, we develop a generalized trust region Newton method and prove its convergence to a ridge point. The accuracy of the model and the computational efficiency of the projection method are assessed via numerical experiments in which we use Gaussian kernels for nonparametric estimation of the underlying densities of the test datasets.
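To make the ridge-projection idea concrete, the following is a minimal sketch in NumPy. It does not implement the paper's generalized trust region Newton method; instead it uses the simpler and widely known subspace-constrained mean-shift iteration as an illustration of projecting a point onto a ridge of a Gaussian-kernel density estimate. The function name, the fixed bandwidth `h`, and all parameter choices are illustrative assumptions, not part of the paper.

```python
import numpy as np

def scms_project(x, data, h, d=1, n_iter=200, tol=1e-8):
    """Move x onto a d-dimensional ridge of a Gaussian-kernel density
    estimate of `data` via subspace-constrained mean shift (not the
    paper's trust region Newton method; an illustrative stand-in)."""
    D = x.size
    for _ in range(n_iter):
        diff = data - x                                    # (n, D) offsets
        w = np.exp(-0.5 * np.sum(diff**2, axis=1) / h**2)  # kernel weights
        # Hessian of the (unnormalized) kernel density estimate at x
        H = (np.einsum('i,ij,ik->jk', w, diff, diff) / h**2
             - w.sum() * np.eye(D)) / h**2
        vals, vecs = np.linalg.eigh(H)                     # ascending eigenvalues
        V = vecs[:, :D - d]        # directions of strongest negative curvature
        m = (w[:, None] * data).sum(axis=0) / w.sum() - x  # mean-shift vector
        step = V @ (V.T @ m)       # constrain the step to those directions
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x
```

On a dataset concentrated near a curve, iterating this step from a nearby point converges to a ridge point of the kernel density estimate: the component of the mean-shift vector along the ridge is discarded, so the point moves only across the ridge, not along it.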