Fast sparse multinomial regression applied to hyperspectral data

  • Authors:
  • Janete S. Borges; José M. Bioucas-Dias; André R. S. Marçal

  • Affiliations:
  • Faculdade de Ciências, Universidade do Porto, DMA, Porto, Portugal; Instituto de Telecomunicações – Instituto Superior Técnico, Lisboa, Portugal; Faculdade de Ciências, Universidade do Porto, DMA, Porto, Portugal

  • Venue:
  • ICIAR'06: Proceedings of the Third International Conference on Image Analysis and Recognition, Part II
  • Year:
  • 2006

Abstract

Methods for learning sparse classifiers are among the state of the art in supervised learning. Sparsity, essential for good generalization, can be enforced by placing heavy-tailed priors/regularizers on the weights of the linear combination of functions. These priors/regularizers favour a few large weights while driving many others to exactly zero. The Sparse Multinomial Logistic Regression (SMLR) algorithm [1] is one such method; it adopts a Laplacian prior to enforce sparseness. Applying it to large datasets remains computationally delicate, and is sometimes infeasible. This work implements an iterative procedure to compute the weights of the decision function that is O(m²) faster than the original method introduced in [1], where m is the number of classes. The benchmark Indian Pines dataset is used to test this modification. Results over subsets of this dataset are presented and compared with results obtained with support vector machines.
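To illustrate the idea behind the abstract, note that a Laplacian prior on the weights is equivalent to an ℓ1 penalty on the multinomial log-loss, which drives many weights to exactly zero. The sketch below is not the authors' bound-optimization procedure from [1], only a minimal NumPy illustration of ℓ1-penalized multinomial logistic regression fitted by proximal gradient descent (soft-thresholding) on synthetic data; all names and parameter values are illustrative assumptions.

```python
import numpy as np

def softmax(Z):
    # Numerically stable row-wise softmax.
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def sparse_mlr(X, y, n_classes, lam=0.1, lr=0.1, n_iter=500):
    """l1-penalized multinomial logistic regression via proximal
    gradient descent (an illustration of the sparsity mechanism,
    not the SMLR bound-optimization algorithm of [1])."""
    n, d = X.shape
    W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]              # one-hot labels
    for _ in range(n_iter):
        P = softmax(X @ W)
        grad = X.T @ (P - Y) / n          # gradient of mean log-loss
        W = W - lr * grad                 # gradient step
        # Proximal (soft-threshold) step: this is what zeroes weights.
        W = np.sign(W) * np.maximum(np.abs(W) - lr * lam, 0.0)
    return W

# Synthetic data: only the first 2 of 10 features carry class information.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
true_W = np.zeros((10, 3))
true_W[:2] = 3.0 * rng.normal(size=(2, 3))
y = np.argmax(X @ true_W + 0.1 * rng.normal(size=(200, 3)), axis=1)

W = sparse_mlr(X, y, n_classes=3)
print("nonzero weights:", np.count_nonzero(W), "of", W.size)
```

With the uninformative features, the soft-threshold step leaves most of their weights at exactly zero, which is the sparsity behaviour the Laplacian prior is meant to induce.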