Humans and animals learn much better when examples are not presented randomly but organized in a meaningful order that gradually introduces more concepts, and progressively more complex ones. Here, we formalize such training strategies in the context of machine learning and call them "curriculum learning". Motivated by recent research on the difficulty of training with non-convex criteria (for deep deterministic and stochastic neural networks), we explore curriculum learning in several setups. The experiments show that significant improvements in generalization can be achieved. We hypothesize that curriculum learning affects both the speed of convergence of the training process to a minimum and, in the case of non-convex criteria, the quality of the local minima obtained: curriculum learning can be seen as a particular form of continuation method (a general strategy for global optimization of non-convex functions).
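As a minimal sketch of the idea, one can train on an easy subset first and gradually admit harder examples. The difficulty measure (distance to the decision boundary), the three-stage schedule, and the toy perceptron below are illustrative assumptions, not the paper's actual experimental setup:

```python
# Curriculum-learning sketch: stage the training pool from easy to hard.
# Assumed toy setup: learn y = sign(x1 + x2) with a perceptron, where
# examples near the decision boundary are treated as "harder".
import random

random.seed(0)

# Build toy data with a per-example difficulty score.
data = []
for _ in range(200):
    x1, x2 = random.uniform(-1, 1), random.uniform(-1, 1)
    y = 1 if x1 + x2 >= 0 else -1
    difficulty = 1.0 - abs(x1 + x2)  # small margin => harder example
    data.append((difficulty, (x1, x2), y))

data.sort(key=lambda d: d[0])  # easiest examples first


def train_perceptron(examples, epochs=20, lr=0.1, w=None):
    """Plain perceptron updates over the given example pool."""
    w = w if w is not None else [0.0, 0.0]
    for _ in range(epochs):
        for _, (x1, x2), y in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 >= 0 else -1
            if pred != y:  # mistake-driven update
                w[0] += lr * y * x1
                w[1] += lr * y * x2
    return w


# Curriculum: three stages, each enlarging the pool with harder examples,
# carrying the learned weights forward between stages.
w = None
for frac in (0.3, 0.6, 1.0):
    pool = data[: int(frac * len(data))]
    w = train_perceptron(pool, w=w)

accuracy = sum(
    1 for _, (x1, x2), y in data
    if (1 if w[0] * x1 + w[1] * x2 >= 0 else -1) == y
) / len(data)
print(f"final training accuracy: {accuracy:.2f}")
```

The staged loop is the essential point: early stages solve a smoothed, easier version of the problem, and each later stage starts from the previous stage's solution, mirroring how a continuation method tracks a solution from a simple objective to the true non-convex one.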