An unsupervised neural model for oriented principal component extraction

  • Authors:
  • K. I. Diamantaras; S. Y. Kung

  • Affiliations:
  • Dept. of Electr. Eng., Princeton Univ., NJ, USA (both authors)

  • Venue:
  • ICASSP '91: Proceedings of the 1991 International Conference on Acoustics, Speech, and Signal Processing (ICASSP-91)
  • Year:
  • 1991

Abstract

The concept of oriented principal component (OPC) analysis is introduced. It extends the GSVD (generalized singular value decomposition) concept to the case of random processes, much as principal component analysis extends the SVD to stochastic signals. In the random-signal case, OPC analysis is equivalent to matched filtering and is useful in many classification and detection applications. The authors propose a corresponding neural model, equipped with an efficient training algorithm, for estimating the oriented principal component of two stochastic processes without assuming explicit knowledge of their statistics. The algorithm is based on a normalized version of the Hebbian learning rule for training the synaptic weights of a network of neurons. Both the theoretical justification and the numerical performance are presented, including an explicit estimate of the learning-rate parameter that yields the best convergence speed.
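
The oriented principal component of two zero-mean processes x(t) and y(t) can be characterized as the direction w maximizing the power ratio (wᵀR_x w)/(wᵀR_y w), i.e. the dominant generalized eigenvector of the pair (R_x, R_y). The sketch below is a minimal illustration of that idea, not the authors' exact algorithm: it compares a batch generalized-eigenvector solution against a simple Hebbian-type stochastic update driven only by the instantaneous neuron outputs. The toy data, variable names, and the specific update rule (stochastic ascent on the power ratio using running power estimates) are all assumptions made for illustration.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

# --- Hypothetical toy data: two zero-mean vector processes ---------------
# x(t) plays the role of the "signal" process, y(t) of the "reference"
# process; both are synthetic and chosen only for illustration.
n, T = 4, 20000
A = rng.standard_normal((n, n))
x = A @ rng.standard_normal((n, T))                      # correlated samples
y = rng.standard_normal((n, T)) * np.array([1.0, 0.5, 2.0, 1.5])[:, None]

# --- Batch reference: OPC as a generalized eigenvector --------------------
Rx = x @ x.T / T
Ry = y @ y.T / T
_, vecs = eigh(Rx, Ry)            # generalized EVD of the pair (Rx, Ry)
w_ref = vecs[:, -1]               # dominant oriented principal component

# --- Online Hebbian-type estimate (illustrative rule, not the paper's) ----
w = rng.standard_normal(n)
eta, rho = 1e-3, 0.01             # learning rate, power-smoothing factor
pa = pb = 1.0                     # running estimates of E[a^2], E[b^2]
for t in range(T):
    a = w @ x[:, t]               # output of the "signal" neuron
    b = w @ y[:, t]               # output of the "reference" neuron
    pa = (1 - rho) * pa + rho * a * a
    pb = (1 - rho) * pb + rho * b * b
    # Hebbian term (a * x) pushes w toward high signal power; the
    # ratio-weighted anti-Hebbian term (b * y) penalizes reference power.
    w += eta * (a * x[:, t] - (pa / pb) * b * y[:, t])

# Directions agree up to sign/scale if the stochastic rule has converged.
cos = abs(w @ w_ref) / (np.linalg.norm(w) * np.linalg.norm(w_ref))
print("cosine similarity with batch OPC:", round(cos, 3))
```

In the paper the learning-rate parameter is chosen analytically for best convergence speed; in this sketch `eta` and `rho` are simply fixed small constants.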