Boosting with side information
ACCV'12 Proceedings of the 11th Asian conference on Computer Vision - Volume Part I
This paper considers a modification of the standard PAC learning problem in which each training instance is supplemented with side information: the image of the instance under a side-information map is also classified. The learning algorithm, however, needs only to classify a new instance, not both the instance and its image under the side-information map. Side information can improve learning rates, but it does not always do so. This paper shows that side information improves the standard PAC sample-complexity bounds, under restrictions on the probable overlap between concepts and their images under the side-information map.
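The protocol described above can be illustrated with a minimal sketch. This is not code from the paper: the threshold concept class, the shift map `g`, and all numeric parameters are assumptions chosen only to make the training/test asymmetry concrete (each training instance also contributes a labelled image under the side-information map, while test instances are classified alone).

```python
import random

random.seed(0)
THETA = 0.6   # unknown target threshold (assumption for this demo)
DELTA = 0.1   # hypothetical side-information shift

def concept(x):
    """Target concept: a threshold function on [0, 1]."""
    return int(x >= THETA)

def side_map(x):
    """Hypothetical side-information map g(x) = x + DELTA."""
    return x + DELTA

def training_sample(m):
    """Each drawn instance x yields TWO labelled points:
    (x, c(x)) and (g(x), c(g(x))) -- the side information."""
    data = []
    for _ in range(m):
        x = random.random()
        data.append((x, concept(x)))
        gx = side_map(x)
        data.append((gx, concept(gx)))
    return data

def learn_threshold(data):
    """Consistent learner for thresholds: the smallest positive point."""
    positives = [x for x, y in data if y == 1]
    return min(positives) if positives else 1.0

data = training_sample(50)
theta_hat = learn_threshold(data)

# Test error is measured on plain instances only: the learner is never
# asked to classify the image of a test point under the side-information map.
test = [random.random() for _ in range(1000)]
err = sum(concept(x) != int(x >= theta_hat) for x in test) / len(test)
print(theta_hat, err)
```

The side information effectively doubles the labelled sample near the decision boundary here, which is the intuition behind the improved rate bounds; the paper's contribution is to quantify when such an improvement is guaranteed.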