We present a novel approach for representing shape knowledge in terms of example views of 3D objects. Typically, such data sets exhibit a highly nonlinear structure with distinct clusters in the shape vector space, preventing the usual encoding by linear principal component analysis (PCA). For this reason, we propose a nonlinear Mercer-kernel PCA scheme which takes into account both the projection distance and the within-subspace distance in a high-dimensional feature space. A comparison of our approach with supervised mixture models indicates that the statistics of example views of distinct 3D objects can be learned and represented fairly well in a completely unsupervised way.
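The two distance terms mentioned above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it builds a kernel PCA subspace from training shape vectors and, for a test vector, evaluates a within-subspace (Mahalanobis-type) distance plus the squared distance to the subspace, both measured in the feature space induced by a kernel. The choice of a Gaussian kernel, the width `sigma`, the subspace dimension `r`, and all function names are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    # Gaussian (Mercer) kernel matrix between rows of A and rows of B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_pca_energy(X, z, sigma=1.0, r=2):
    """Toy energy for a test shape z: within-subspace distance plus
    distance to the r-dimensional kernel-PCA subspace, both in the
    feature space of a Gaussian kernel (illustrative, not the paper's code)."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # centred Gram matrix
    lam, A = np.linalg.eigh(Kc)
    lam, A = lam[::-1][:r], A[:, ::-1][:, :r]    # top-r eigenpairs (descending)
    A = A / np.sqrt(lam)                         # unit-norm feature-space eigenvectors

    kz = gaussian_kernel(X, z[None, :], sigma).ravel()
    kz_c = kz - kz.mean() - K.mean(0) + K.mean() # centre the test kernel vector
    beta = A.T @ kz_c                            # coordinates in the subspace

    kzz_c = 1.0 - 2.0 * kz.mean() + K.mean()     # centred k(z,z); Gaussian => k(z,z)=1
    d_within = (beta ** 2 / lam).sum()           # within-subspace (Mahalanobis) term
    d_proj = max(kzz_c - (beta ** 2).sum(), 0.0) # squared distance to the subspace
    return d_within + d_proj
```

Evaluating this energy on candidate shapes gives low values for shapes resembling the training views and high values otherwise, which is how such a term can act as a nonlinear shape prior.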