The human ability to learn difficult object categories from just a few views is often explained by an extensive use of knowledge from related classes. In this work we study the use of feature relevance as prior information from similar binary classification tasks. We present an approach that uses this information to improve recognition performance when learning a new binary classification task from only a few examples. Feature relevance probabilities are estimated by a randomized decision forest trained on a related task and then used as a prior distribution over features in the construction of a new forest. Experiments in an image categorization scenario show a significant performance gain in the case of few training examples.
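The transfer scheme described above can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the use of scikit-learn forests, the smoothing constant `eps`, the feature-subset size, and the majority-vote combination are all assumptions made for the sake of a runnable example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic "related" task: the first 5 of 20 features are informative.
n_feat = 20
X_rel = rng.normal(size=(500, n_feat))
y_rel = (X_rel[:, :5].sum(axis=1) > 0).astype(int)

# Step 1: estimate feature relevance with a random forest on the related task.
source_forest = RandomForestClassifier(n_estimators=100, random_state=0)
source_forest.fit(X_rel, y_rel)

# Step 2: smooth the (possibly zero) importances into a prior distribution,
# so every feature keeps a nonzero chance of being selected.
eps = 1e-3
prior = source_forest.feature_importances_ + eps
prior /= prior.sum()

# New binary task with only a handful of examples, sharing relevant features.
X_new = rng.normal(size=(10, n_feat))
y_new = (X_new[:, :5].sum(axis=1) > 0).astype(int)

# Step 3: grow each tree of the new forest on a feature subset drawn
# according to the prior instead of uniformly at random.
trees = []
for seed in range(25):
    feats = rng.choice(n_feat, size=5, replace=False, p=prior)
    tree = DecisionTreeClassifier(random_state=seed).fit(X_new[:, feats], y_new)
    trees.append((feats, tree))

def predict(X):
    # Majority vote across the prior-guided trees.
    votes = np.stack([t.predict(X[:, f]) for f, t in trees])
    return (votes.mean(axis=0) > 0.5).astype(int)

X_test = rng.normal(size=(50, n_feat))
preds = predict(X_test)
```

With the smoothed prior, trees on the new task tend to split on features that were already found relevant for the related task, which is the mechanism the abstract credits for the performance gain with few training examples.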