Sparse activity and sparse connectivity in supervised learning
The Journal of Machine Learning Research
Sparsely connected Multi-Layer Perceptrons (MLPs) differ from conventional MLPs in that only a small fraction of the entries in their weight matrices are nonzero. Exploiting this structure with sparse matrix-vector multiplication algorithms reduces the computational complexity of classification. Sparsely connected MLPs are trained in two consecutive stages. In the first stage, initial values for the network's parameters are obtained by solving an unsupervised matrix factorization problem that minimizes the reconstruction error. In the second stage, a modified version of the supervised backpropagation algorithm optimizes the MLP's parameters with respect to the classification error. Experiments on the MNIST database of handwritten digits show that the proposed approach matches the classification performance of a densely connected MLP while speeding up classification by a factor of seven.
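To illustrate where the classification speedup comes from, the following is a minimal sketch (not the authors' code) of a sparse-MLP forward pass using SciPy's compressed sparse row format. The layer sizes, sparsity level, activation function, and helper names are illustrative assumptions, not values taken from the paper; the point is only that a CSR matrix-vector product costs time proportional to the number of nonzero weights rather than to the full matrix size.

```python
# Minimal sketch, assuming a one-hidden-layer MLP on 784-dimensional
# (MNIST-sized) inputs. Architecture and density are hypothetical.
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)

def random_sparse_weights(n_out, n_in, density):
    """Weight matrix in CSR format with only a fraction of nonzero entries."""
    return sp.random(n_out, n_in, density=density, format="csr",
                     random_state=rng, data_rvs=rng.standard_normal)

W1 = random_sparse_weights(256, 784, density=0.1)  # sparse hidden layer
W2 = random_sparse_weights(10, 256, density=0.1)   # sparse output layer
b1 = np.zeros(256)
b2 = np.zeros(10)

def forward(x):
    # Each CSR matrix-vector product runs in O(nnz) time rather than
    # O(rows * cols), which is the source of the classification speedup.
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

x = rng.standard_normal(784)   # stand-in for a flattened digit image
scores = forward(x)
print(scores.argmax())         # index of the predicted class
```

In this sketch the weights are random stand-ins; in the paper's scheme their initial values would instead come from the unsupervised matrix factorization stage, after which the modified backpropagation stage tunes them for classification.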