The temporal coherence principle is an attractive, biologically inspired learning rule for extracting slowly varying features from quickly varying input data. In this paper we develop a new Nonlinear Neighborhood Preserving (NNP) technique that uses the temporal coherence principle to find an optimal low-dimensional representation of the original high-dimensional data. NNP is based on a nonlinear expansion of the original input, such as polynomials of a given degree, and is solved as an eigenvalue problem, so it requires no gradient descent and is guaranteed to find the global optimum. NNP can be viewed as a nonlinear dimensionality reduction framework that covers both time series and data sets without an obvious temporal structure. For these different settings we introduce three NNP algorithms, named NNP-1, NNP-2, and NNP-3. The objective function of NNP-1 is equivalent to that of Slow Feature Analysis (SFA), and it works well for time series such as image sequences. NNP-2 artificially constructs time series from neighboring points for data sets without a clear temporal structure, such as image data. NNP-3 is designed for classification tasks: it simultaneously minimizes the distances between neighboring points in the embedding space and keeps the remaining points as far apart as possible. A kernel extension of NNP is also discussed. Compared with other methods, the proposed algorithms perform well on several image sequences and image data sets. We also carry out classification on the MNIST handwritten digit database using the supervised NNP algorithms. The experimental results demonstrate that NNP is an effective technique for nonlinear dimensionality reduction.
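To make the NNP-1 / SFA connection concrete, the following is a minimal sketch of the procedure described above: expand the input nonlinearly, then solve a generalized eigenvalue problem so that the projected outputs vary as slowly as possible over time. The quadratic expansion, the function names, and the use of NumPy/SciPy are illustrative assumptions, not the authors' implementation.

```python
# Sketch of an SFA-style slow-feature extraction (the NNP-1 setting):
# nonlinear expansion of the input followed by a generalized eigenvalue
# problem.  All names and parameter choices here are hypothetical.

import numpy as np
from scipy.linalg import eigh


def quadratic_expansion(X):
    """Expand each row x into [x, all products x_i * x_j with i <= j]."""
    n, d = X.shape
    idx_i, idx_j = np.triu_indices(d)
    return np.hstack([X, X[:, idx_i] * X[:, idx_j]])


def slow_feature_projection(X, n_components=2):
    """X: (T, d) time series whose rows are ordered in time.

    Returns a projection matrix W whose columns give the slowest
    (most temporally coherent) directions in the expanded space.
    """
    H = quadratic_expansion(X)
    H = H - H.mean(axis=0)              # center the expanded signal

    dH = np.diff(H, axis=0)             # temporal differences h(t+1) - h(t)
    A = dH.T @ dH / len(dH)             # covariance of temporal differences
    B = H.T @ H / len(H)                # covariance enforcing unit variance

    # Generalized eigenvalue problem A w = lambda B w; the smallest
    # eigenvalues correspond to the slowest-varying output signals.
    eigvals, eigvecs = eigh(A, B + 1e-8 * np.eye(B.shape[0]))
    return eigvecs[:, :n_components]


if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 500)
    X = np.column_stack([np.sin(t) + 0.05 * np.random.randn(500),
                         np.cos(11 * t) + 0.05 * np.random.randn(500)])
    W = slow_feature_projection(X, n_components=1)
    H = quadratic_expansion(X)
    slow = (H - H.mean(axis=0)) @ W
    print(slow.shape)                   # (500, 1): the slowest extracted feature
```

In this reading, NNP-2 would replace the consecutive-frame differences with differences between artificially paired neighboring points, and NNP-3 would add a term pushing non-neighbors apart; both keep the same eigenvalue-problem structure.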