The Journal of Machine Learning Research
We present a learning algorithm for decision lists that uses features constructed from the data and permits a trade-off between accuracy and complexity. We provide bounds on the generalization error of this learning algorithm in terms of the number of errors and the size of the classifier it achieves on the training data. We also compare its performance on several natural data sets with that of the set covering machine and the support vector machine. Furthermore, we show that the proposed generalization error bounds provide effective guides for model selection.
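To make the setting concrete, the following is a minimal sketch of how a decision list over data-constructed features can be learned greedily and then used for prediction. The greedy scoring rule, the rule budget, and the threshold-style features are illustrative assumptions, not the paper's actual algorithm; the paper's accuracy-complexity trade-off would enter through how rules are scored and when growth stops.

```python
# Hedged sketch: greedily build a decision list of (feature, label) rules.
# The scoring rule (net correctly covered examples) and the max_rules budget
# are assumptions for illustration, not taken from the paper.

def learn_decision_list(X, y, features, max_rules=10):
    """X: list of examples; y: list of 0/1 labels;
    features: list of (name, predicate) pairs with predicate(x) -> bool.
    Returns (rules, default) where rules is a list of (name, predicate, label)."""
    remaining = list(range(len(X)))
    rules = []
    while remaining and len(rules) < max_rules:
        best = None
        for name, f in features:
            covered = [i for i in remaining if f(X[i])]
            if not covered:
                continue
            for label in (0, 1):
                correct = sum(1 for i in covered if y[i] == label)
                score = correct - (len(covered) - correct)  # net correct on covered
                if best is None or score > best[0]:
                    best = (score, name, f, label, covered)
        if best is None or best[0] <= 0:
            break  # no feature improves on the default; stop growing
        _, name, f, label, covered = best
        rules.append((name, f, label))
        covered_set = set(covered)
        remaining = [i for i in remaining if i not in covered_set]
    # Default label: majority over uncovered examples (or over all, if none remain).
    pool = remaining if remaining else range(len(y))
    default = 1 if 2 * sum(y[i] for i in pool) >= len(list(pool)) else 0
    return rules, default

def predict(rules, default, x):
    """Output the label of the first rule whose feature fires on x."""
    for _, f, label in rules:
        if f(x):
            return label
    return default
```

A usage example on a one-dimensional toy set: with threshold features such as `("x>=5", lambda v: v >= 5)`, the learner picks the rule that covers the most examples net of mistakes, removes the covered examples, and repeats, which mirrors the set-covering flavor the abstract alludes to.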