Connectionist learning procedures. Artificial Intelligence.
Learning invariance from transformation sequences. Neural Computation.
Slow feature analysis: unsupervised learning of invariances. Neural Computation.
Best practices for convolutional neural networks applied to visual document analysis. Proceedings of the Seventh International Conference on Document Analysis and Recognition (ICDAR '03), Volume 2.
Learning viewpoint invariant object representations using a temporal coherence principle. Biological Cybernetics.
A trainable feature extractor for handwritten digit recognition. Pattern Recognition.
Deformation models for image recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Removing time variation with the anti-Hebbian differential synapse. Neural Computation.
Slow Feature Analysis (SFA) is an unsupervised algorithm that extracts the slowly varying features from a time series and has been applied successfully to pattern recognition. Building on SFA, this paper develops a new algorithm, Slow Feature Discriminant Analysis (SFDA), which simultaneously maximizes the temporal variation of between-class time series and minimizes the temporal variation of within-class time series. By incorporating this discriminative power, SFDA improves recognition performance over SFA. Experimental results on the MNIST handwritten digit database show that the proposed algorithm is particularly attractive.
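To make the underlying slowness objective concrete, here is a minimal sketch of linear SFA, the unsupervised base algorithm the abstract starts from: whiten the signal, then find the whitened directions whose finite-difference (temporal derivative) variance is smallest. This is a generic NumPy illustration, not the paper's SFDA; the discriminant variant would additionally split the derivative covariance into within-class and between-class terms, which is not reproduced here.

```python
import numpy as np

def sfa(X):
    """Linear Slow Feature Analysis (illustrative sketch).

    X: array of shape (T, n), rows are consecutive time steps.
    Returns a projection matrix W (n, n) whose columns map the
    centered input to features ordered from slowest to fastest.
    """
    X = X - X.mean(axis=0)
    # Whitening: rotate/scale so the input covariance is identity.
    cov = np.cov(X, rowvar=False)
    d, U = np.linalg.eigh(cov)
    S = U / np.sqrt(d)              # whitening matrix, shape (n, n)
    Z = X @ S
    # Slowness is measured by the variance of the temporal derivative,
    # approximated here with finite differences.
    Zdot = np.diff(Z, axis=0)
    dcov = np.cov(Zdot, rowvar=False)
    _, V = np.linalg.eigh(dcov)     # ascending: slowest direction first
    return S @ V                    # centered input -> slow features
```

As a quick sanity check, mixing a slow sinusoid with a much faster one and applying `sfa` should recover the slow component as the first output feature (up to sign).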