We introduce an incremental version of slow feature analysis (IncSFA), combining candid covariance-free incremental principal component analysis (CCIPCA) and covariance-free incremental minor component analysis (CIMCA). IncSFA's feature-updating complexity is linear in the input dimensionality, whereas that of batch SFA (BSFA) is cubic. IncSFA does not need to store, or even compute, any covariance matrices. Its drawback is data efficiency: it does not use each data point as effectively as BSFA. But IncSFA allows SFA to be applied tractably, with just a few parameters, directly to high-dimensional input streams (e.g., the visual input of an autonomous agent), whereas BSFA must resort to hierarchical receptive-field-based architectures when the input dimension is too high. Further, IncSFA's updates have simple Hebbian and anti-Hebbian forms, extending the biological plausibility of SFA. Experimental results show that IncSFA learns the same set of features as BSFA and can handle a few cases where BSFA fails.
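To make the update structure concrete, below is a minimal sketch of the IncSFA loop in Python. It assumes CCIPCA-style principal component updates for incremental whitening, followed by an anti-Hebbian minor-component rule with Gram-Schmidt-like deflation applied to the derivative of the whitened signal. The class name, learning rates, initialization, and toy data are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

class IncSFA:
    """Sketch of incremental slow feature analysis: CCIPCA whitening
    followed by anti-Hebbian minor-component extraction (CIMCA-style)."""

    def __init__(self, input_dim, n_pcs, n_slow,
                 eta_pca=0.01, eta_mca=0.01, gamma=2.0):
        self.mean = np.zeros(input_dim)
        # Unnormalized principal component estimates; each row's norm
        # doubles as the corresponding eigenvalue estimate (CCIPCA trick).
        self.V = 0.1 * np.random.randn(n_pcs, input_dim)
        # Unit-norm slow-feature (minor component) estimates.
        self.W = np.random.randn(n_slow, n_pcs)
        self.W /= np.linalg.norm(self.W, axis=1, keepdims=True)
        self.eta_pca, self.eta_mca, self.gamma = eta_pca, eta_mca, gamma
        self.prev_z = None
        self.t = 0

    def _ccipca(self, x):
        """One Hebbian CCIPCA step on a centered sample, no covariance matrix."""
        u = x.copy()
        for i in range(len(self.V)):
            v = self.V[i]
            n = np.linalg.norm(v) + 1e-12
            self.V[i] = (1 - self.eta_pca) * v + self.eta_pca * (u @ v / n) * u
            v = self.V[i]
            n = np.linalg.norm(v) + 1e-12
            u = u - (u @ v / n) * (v / n)  # deflate residual for next component

    def _whiten(self, x):
        norms = np.linalg.norm(self.V, axis=1) + 1e-12
        U = self.V / norms[:, None]        # unit eigenvector estimates
        return (U @ x) / np.sqrt(norms)    # scale by 1/sqrt(eigenvalue)

    def _cimca(self, zdot):
        """Anti-Hebbian minor-component step on the whitened derivative."""
        for i in range(len(self.W)):
            w = self.W[i]
            # Lateral deflation keeps component i away from components < i.
            lateral = sum((self.W[j] @ w) * self.W[j] for j in range(i))
            w = w - self.eta_mca * ((zdot @ w) * zdot + self.gamma * lateral)
            self.W[i] = w / (np.linalg.norm(w) + 1e-12)

    def update(self, x):
        self.t += 1
        self.mean += (x - self.mean) / self.t  # running mean estimate
        xc = x - self.mean
        self._ccipca(xc)
        z = self._whiten(xc)
        if self.prev_z is not None:
            self._cimca(z - self.prev_z)  # slow features = minor components of z'
        self.prev_z = z
        return self.W @ z                 # current slow-feature outputs

if __name__ == "__main__":
    # Toy stream: one slowly varying source hidden among faster noise.
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 50.0, 5000)
    X = np.column_stack([np.sin(0.2 * t) + 0.1 * rng.standard_normal(t.size),
                         rng.standard_normal(t.size)])
    sfa = IncSFA(input_dim=2, n_pcs=2, n_slow=1)
    for x in X:
        y = sfa.update(x)  # tracks the slow component as estimates converge
```

The sketch reflects why no covariance matrices are needed: the slowest features are the minor components of the whitened derivative signal, so a subtractive (anti-Hebbian) update on individual samples suffices, and per-step cost stays linear in the input dimensionality.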