Better subset regression using the nonnegative garrote. Technometrics.
An introduction to ROC analysis. Pattern Recognition Letters, Special Issue: ROC Analysis in Pattern Recognition.
Kodak's consumer video benchmark data set: concept definition and annotation. Proceedings of the International Workshop on Multimedia Information Retrieval.
Consistency of the Group Lasso and Multiple Kernel Learning. The Journal of Machine Learning Research.
Local invariant feature detectors: a survey. Foundations and Trends® in Computer Graphics and Vision.
Group lasso with overlap and graph lasso. Proceedings of the 26th Annual International Conference on Machine Learning (ICML '09).
NUS-WIDE: a real-world web image database from National University of Singapore. Proceedings of the ACM International Conference on Image and Video Retrieval.
MSRA-MM 2.0: a large-scale web multimedia dataset. Proceedings of the 2009 IEEE International Conference on Data Mining Workshops (ICDMW '09).
Multi-label boosting for image annotation by structural grouping sparsity. Proceedings of the International Conference on Multimedia.
Heterogeneous feature selection by group lasso with logistic regression. Proceedings of the International Conference on Multimedia.
Image annotation by composite kernel learning with group structure. Proceedings of the 19th ACM International Conference on Multimedia (MM '11).
Towards multi-semantic image annotation with graph regularized exclusive group lasso. Proceedings of the 19th ACM International Conference on Multimedia (MM '11).
Logistic tensor regression for classification. Proceedings of the Third Sino-Foreign-Interchange Conference on Intelligent Science and Intelligent Data Engineering (IScIDE '12).
Because image feature vectors are high-dimensional, selecting the right features plays a fundamental role in Web image annotation. Most existing approaches either select features individually, which leads to local optima, or use a convex penalty, which leads to inconsistency. To address these difficulties, in this paper we propose a new sparsity-based approach, NOVA (NOn-conVex group spArsity). To the best of our knowledge, NOVA is the first to introduce a non-convex penalty for group selection in a high-dimensional heterogeneous feature space. Because it is a group-sparsity approach, it approximately reaches the global optimum; because it uses a non-convex penalty, it achieves consistency. We demonstrate the superior performance of NOVA in three ways. First, we present a theoretical proof that NOVA is consistent, satisfying unbiasedness, sparsity, and continuity. Second, we show that NOVA converges to the true underlying model in a generative-model simulation where the ground truth is available. Third, we report extensive experimental results on three diverse and widely used data sets: Kodak, MSRA-MM 2.0, and NUS-WIDE. We also compare NOVA against state-of-the-art approaches and report superior results.
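The abstract does not spell out NOVA's penalty, but the three properties it cites (unbiasedness, sparsity, continuity) are exactly those of classical non-convex penalties such as SCAD. As an illustrative sketch only, and not the paper's actual method, the snippet below applies a SCAD-style penalty at the group level: the group-wise thresholding operator zeroes out weak feature groups (sparsity) while leaving strong groups unshrunk (near-unbiasedness), in contrast to the convex group lasso, which shrinks every group. The function names and the choice a = 3.7 are conventional, not taken from the paper.

```python
import numpy as np

def scad_penalty(t, lam, a=3.7):
    """SCAD penalty value at a nonnegative group norm t.

    l1-like near zero (induces sparsity), constant for large t
    (near-unbiasedness), and continuous throughout.
    """
    t = abs(t)
    if t <= lam:
        return lam * t
    if t <= a * lam:
        return (2 * a * lam * t - t ** 2 - lam ** 2) / (2 * (a - 1))
    return lam ** 2 * (a + 1) / 2

def group_scad_prox(v, lam, a=3.7):
    """Group-wise SCAD thresholding applied to the l2 norm of group v.

    Small groups are zeroed out entirely; large groups pass through
    with no shrinkage, avoiding the bias of the convex group lasso.
    """
    norm = np.linalg.norm(v)
    if norm <= 2 * lam:                  # soft-threshold region
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
    elif norm <= a * lam:                # interpolation region
        scale = ((a - 1) - a * lam / norm) / (a - 2)
    else:                                # no-shrinkage region
        scale = 1.0
    return scale * v

# A weak group is dropped; a strong group is kept without bias.
weak = np.array([0.1, -0.05])            # norm ~0.11, below threshold
strong = np.array([3.0, 4.0])            # norm 5, above a*lam
print(group_scad_prox(weak, lam=0.5))    # -> [0. 0.]
print(group_scad_prox(strong, lam=0.5))  # -> [3. 4.]
```

Embedding such a group-level thresholding step inside a proximal-gradient loop on a logistic loss yields a simple non-convex group-sparse learner of the general family the abstract describes.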