Multi-class problems are everywhere: given an input, the goal is to predict one of several possible classes. Most previous work reduces learning to minimizing the empirical loss over some training set plus a regularization term that promotes simple models or encodes other prior knowledge. Many regularizations promote sparsity, that is, small models or a small number of features, as in the group LASSO. Yet such models do not always represent the classes well. In some problems there is, for each class, a small set of features that represents it well, yet the union of these sets is not small. We propose regularizations that promote this type of sparsity, analyze the generalization properties of such formulations, and show empirically that these regularizations not only perform well but also promote the desired sparsity structure.
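To make the contrast between the two sparsity structures concrete, the following is a minimal sketch (not from the paper; the weight matrix and function names are illustrative assumptions). For a linear multi-class model with weight matrix W whose rows are classes and columns are features, a group-LASSO penalty sums the l2 norms of feature columns and therefore favors a small *union* of active features, whereas a plain entry-wise l1 penalty lets each class keep its own small feature set even when the union is large:

```python
import numpy as np

# Hypothetical weight matrix W for a linear multi-class model:
# rows = classes, columns = features (sizes chosen for illustration).
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 10))
W[:, 5:] = 0.0  # features 5..9 are unused by every class

def group_lasso_penalty(W):
    # l1/l2 mixed norm over feature columns: sum_j ||W[:, j]||_2.
    # Zeroing an entire column drops the feature for *all* classes,
    # so minimizing this promotes a small union of used features.
    return np.linalg.norm(W, axis=0).sum()

def per_class_l1_penalty(W):
    # Entry-wise l1 norm: sum_{r, j} |W[r, j]|.
    # Each class (row) can be sparsified independently, so the
    # per-class feature sets stay small even if their union is large.
    return np.abs(W).sum()

print(group_lasso_penalty(W), per_class_l1_penalty(W))
```

Since each column satisfies ||x||_2 <= ||x||_1, the group-LASSO penalty never exceeds the entry-wise l1 penalty on the same matrix; the two differ in *which* zero patterns they reward, not in scale alone.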