Multiplicative Updates for Nonnegative Quadratic Programming

  • Authors:
  • Fei Sha; Yuanqing Lin; Lawrence K. Saul; Daniel D. Lee

  • Affiliations:
  • Fei Sha: Computer Science Division, University of California, Berkeley, Berkeley, CA 94720, U.S.A. feisha@cs.berkeley.edu
  • Yuanqing Lin: Department of Electrical and Systems Engineering, University of Pennsylvania, Philadelphia, PA 19104, U.S.A. linyuanq@seas.upenn.edu
  • Lawrence K. Saul: Department of Computer Science and Engineering, University of California, San Diego, La Jolla, CA 92093, U.S.A. saul@cs.ucsd.edu
  • Daniel D. Lee: Department of Electrical and Systems Engineering, University of Pennsylvania, Philadelphia, PA 19104, U.S.A. ddlee@seas.upenn.edu

  • Venue:
  • Neural Computation
  • Year:
  • 2007

Abstract

Many problems in neural computation and statistical learning involve optimizations with nonnegativity constraints. In this article, we study convex problems in quadratic programming where the optimization is confined to an axis-aligned region in the nonnegative orthant. For these problems, we derive multiplicative updates that improve the value of the objective function at each iteration and converge monotonically to the global minimum. The updates have a simple closed form and do not involve any heuristics or free parameters that must be tuned to ensure convergence. Despite their simplicity, they differ strikingly in form from other multiplicative updates used in machine learning. We provide complete proofs of convergence for these updates and describe their application to problems in signal processing and pattern recognition.
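The abstract does not reproduce the update rule itself. For the quadratic program minimize F(v) = (1/2) v'Av + b'v subject to v >= 0, with A symmetric positive definite, the closed-form multiplicative update published in the paper splits A into its elementwise positive and negative parts and rescales each component of v. The following is a minimal NumPy sketch under that reading; the function name multiplicative_qp, the epsilon guard, and all variable names are illustrative choices of ours, not from the paper.

```python
import numpy as np

def multiplicative_qp(A, b, v0, n_iter=500):
    """Sketch of the multiplicative update for min 0.5*v'Av + b'v, v >= 0.

    Each iteration applies, elementwise,
        v_i <- v_i * (-b_i + sqrt(b_i^2 + 4 a_i^+ a_i^-)) / (2 a_i^+),
    where a^+ = (A^+ v) and a^- = (A^- v), and A^+ / A^- are the
    elementwise positive and negative parts of A.
    """
    A_plus = np.maximum(A, 0.0)    # positive part of A
    A_minus = np.maximum(-A, 0.0)  # magnitude of negative part of A
    v = np.asarray(v0, dtype=float).copy()
    for _ in range(n_iter):
        a_plus = A_plus @ v
        a_minus = A_minus @ v
        # The multiplicative factor is nonnegative, so v stays in the
        # nonnegative orthant; the small epsilon is only a numerical
        # guard against division by zero in degenerate cases.
        v *= (-b + np.sqrt(b * b + 4.0 * a_plus * a_minus)) / (2.0 * a_plus + 1e-16)
    return v

# Example: a small strictly convex QP with a random positive definite A.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 0.1 * np.eye(5)     # symmetric positive definite
b = rng.standard_normal(5)
v = multiplicative_qp(A, b, v0=np.ones(5))
print(v)                          # approximate nonnegative minimizer of F
```

Note the behavior the abstract highlights: the update has no step size or other free parameters to tune, and components driven to zero simply stay at zero, which is how the iteration settles onto the active constraints.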