Nonnegative sparse coding for discriminative semi-supervised learning

  • Authors:
  • Ran He; Wei-Shi Zheng; Bao-Gang Hu; Xiang-Wei Kong

  • Affiliations:
  • Inst. of Autom., Chinese Acad. of Sci., Beijing, China; Sch. of Inf. Sci. & Technol., Sun Yat-sen Univ., Guangzhou, China; Inst. of Autom., Chinese Acad. of Sci., Beijing, China; Dept. of Electron. Eng., Dalian Univ. of Technol., Dalian, China

  • Venue:
  • CVPR '11 Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition
  • Year:
  • 2011

Abstract

An informative and discriminative graph plays an important role in graph-based semi-supervised learning methods. This paper introduces a nonnegative sparse algorithm, together with an approximate algorithm based on the l^0-l^1 equivalence theory, to compute the nonnegative sparse weights of a graph. The proposed method is hence termed the sparse probability graph (SPG). The nonnegative sparse weights in the graph naturally serve as clustering indicators, which benefits semi-supervised learning. More importantly, the approximate algorithm speeds up the computation of nonnegative sparse coding, which has been a bottleneck in previous attempts at sparse nonnegative graph learning, and it is much more efficient than l^1-norm sparsity techniques for learning large-scale sparse graphs. Finally, for discriminative semi-supervised learning, an adaptive label propagation algorithm is proposed to iteratively predict the labels of data on the SPG. Promising experimental results show that nonnegative sparse coding is efficient and effective for discriminative semi-supervised learning.
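To make the two-stage pipeline concrete, here is a minimal sketch of the general idea: build a nonnegative sparse weight graph by coding each sample over the others, then propagate labels on that graph. This is not the paper's algorithm; it substitutes plain nonnegative least squares (`scipy.optimize.nnls`) for the paper's l^0-l^1 approximate coding, and a standard clamped label-propagation iteration for the paper's adaptive version. All function names (`build_spg`, `propagate_labels`) and parameters are illustrative assumptions.

```python
# Sketch: nonnegative sparse graph construction + label propagation.
# Assumes NNLS as a stand-in for the paper's nonnegative sparse coding.
import numpy as np
from scipy.optimize import nnls

def build_spg(X):
    """Code each sample as a nonnegative combination of all other samples.

    X: (n_samples, n_features). Returns an (n, n) nonnegative weight matrix
    whose row i holds the NNLS coefficients of sample i over the others.
    NNLS solutions have at most rank(A) nonzeros, so the weights are
    naturally sparse; in the paper they also act as clustering indicators.
    """
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        others = np.delete(np.arange(n), i)   # exclude sample i itself
        coefs, _ = nnls(X[others].T, X[i])    # nonnegative least squares
        W[i, others] = coefs
    return W

def propagate_labels(W, y, n_iter=50):
    """Simple iterative label propagation on the weight graph.

    y: integer labels, -1 for unlabeled samples. Rows of W are normalized
    into transition probabilities; labeled samples are re-clamped each
    iteration so their labels never drift.
    """
    classes = np.unique(y[y >= 0])
    F = np.zeros((len(y), len(classes)))
    for k, c in enumerate(classes):
        F[y == c, k] = 1.0
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
    clamp = F.copy()
    for _ in range(n_iter):
        F = P @ F
        F[y >= 0] = clamp[y >= 0]             # keep known labels fixed
    return classes[F.argmax(axis=1)]
```

In practice the graph is usually symmetrized (e.g. `W + W.T`) before propagation so that weight flows both ways along each coding relation.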