A Note on Lewicki-Sejnowski Gradient for Learning Overcomplete Representations

  • Authors:
  • Zhaoshui He, Shengli Xie, Liqing Zhang, Andrzej Cichocki

  • Affiliations:
  • Sch. of Electr. and Info. Eng., South China Univ. of Technology, Guangzhou, 510640, China, and Lab. for Advanced Brain Signal Processing, RIKEN Brain Science Inst., Wako-shi, Saitama 351-0198, Jap ...
  • School of Electronics and Information Engineering, South China University of Technology, Guangzhou, 510640, China. adshlxie@scut.edu.cn
  • Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai, 200240, China. zhang-lq@cs.sjtu.edu.cn
  • Lab. for Adv. Brain Signal Proc., RIKEN Brain Sci. Inst., Wako-shi, Saitama 351-0198, Japan / Sys. Res. Inst., Polish Acad. of Sci. (PAN), Warsaw, Poland / and Dept. of Elec. Eng., Warsaw Univ. of T ...

  • Venue:
  • Neural Computation
  • Year:
  • 2008

Abstract

Overcomplete representations are more robust in noisy environments and offer greater flexibility in matching the structure of the data. Lewicki and Sejnowski (2000) proposed an efficient extended natural gradient for learning an overcomplete basis and developed an overcomplete representation approach. However, their derivation relies on a number of approximations, and the resulting proof is complicated. To put this gradient on a firmer theoretical footing, this note provides a brief and more rigorous mathematical proof of it. In addition, we propose a more robust constrained Lewicki-Sejnowski gradient.
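
For readers who want a concrete picture of the update discussed in the abstract, below is a minimal NumPy sketch of one Lewicki-Sejnowski-style basis update. It assumes, beyond what the abstract states, the commonly cited form ΔA ∝ -A(z ŝᵀ + I) from Lewicki and Sejnowski (2000), a Laplacian coefficient prior, and an ISTA-style MAP estimate of the coefficients; all function names and parameter values are illustrative, not taken from the paper.

```python
# A minimal sketch of one Lewicki-Sejnowski-style basis update (assumptions:
# Laplacian prior p(s) ~ exp(-|s|), ISTA-based MAP coefficients, and the
# extended natural-gradient form dA ∝ -A (z s^T + I) with z = d log p(s)/ds).
import numpy as np

def map_coefficients(A, x, lam=0.1, sigma2=0.01, n_iter=200):
    """Approximate MAP estimate of s for x ≈ A s under a Laplacian prior (ISTA)."""
    n = A.shape[1]
    s = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2 / sigma2 + 1e-12        # Lipschitz constant of the data-fit term
    for _ in range(n_iter):
        grad = A.T @ (A @ s - x) / sigma2                  # gradient of ||x - A s||^2 / (2 sigma2)
        s = s - grad / L
        s = np.sign(s) * np.maximum(np.abs(s) - lam / L, 0.0)  # soft threshold from the L1 prior
    return s

def lewicki_sejnowski_step(A, x, eta=0.01):
    """One illustrative extended natural-gradient update of the overcomplete basis A."""
    s_hat = map_coefficients(A, x)
    z = -np.sign(s_hat)                                    # d/ds log p(s) for a Laplacian prior
    dA = -A @ (np.outer(z, s_hat) + np.eye(A.shape[1]))
    return A + eta * dA

# Toy usage: 4-dimensional data, 8 basis vectors (overcomplete).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))
A /= np.linalg.norm(A, axis=0)                             # unit-norm columns
x = rng.standard_normal(4)
A = lewicki_sejnowski_step(A, x)
```

In the square, invertible case this update reduces to the familiar natural-gradient ICA rule for the mixing matrix, which is why the overcomplete form is referred to as an extended natural gradient.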