A Theoretical Analysis of On-Line Learning Using Correlated Examples

  • Authors:
  • Chihiro Seki; Shingo Sakurai; Masafumi Matsuno; Seiji Miyoshi

  • Venue:
  • IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
  • Year:
  • 2008

Abstract

In this paper, we analytically investigate the generalization performance of on-line learning from correlated inputs, using a statistical-mechanical method. We consider a model composed of linear perceptrons with Gaussian noise. First, we analyze the gradient method. We show analytically that the larger the correlation among the inputs, or the larger the number of inputs used in an update, the stricter the condition the learning rate must satisfy and the slower the learning. Second, we treat block orthogonal projection learning as an alternative learning rule and derive the corresponding theory. In the noiseless case, the learning speed does not depend on the correlation and is proportional to the number of inputs used in an update; it is identical to that of the gradient method with uncorrelated inputs. When noise is present, however, the larger the correlation among the inputs, the slower the learning and the larger the residual generalization error.
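
As a rough illustration of the setting described in the abstract, the sketch below simulates on-line gradient learning of a noisy linear perceptron in a teacher-student scenario, where each update uses K pairwise-correlated inputs. The dimension, learning rate, correlation strength, noise level, and the exact form of the update (a plain sum over the K examples) are illustrative assumptions, not taken from the paper; a block orthogonal projection style update is indicated in a comment as an alternative rule.

```python
import numpy as np

# Minimal sketch (assumptions, not the paper's exact formulation): on-line
# gradient learning of a noisy linear perceptron from K correlated inputs
# per update.
rng = np.random.default_rng(0)

N = 2000        # input dimension (assumed; large so self-averaging roughly holds)
K = 3           # number of inputs used in one update
q = 0.5         # assumed pairwise correlation among the K inputs of an update
sigma = 0.1     # standard deviation of the Gaussian output noise
eta = 0.2       # learning rate (must be small enough for stable learning)

B = rng.standard_normal(N)   # teacher weight vector
J = np.zeros(N)              # student weight vector

# Within-update covariance of the K inputs: 1 on the diagonal, q off it.
L = np.linalg.cholesky(q * np.ones((K, K)) + (1.0 - q) * np.eye(K))

for t in range(20 * N):
    # K correlated inputs; each component has variance 1/N so that |x_k| ~ 1.
    X = (L @ rng.standard_normal((K, N))) / np.sqrt(N)
    y = X @ B + sigma * rng.standard_normal(K)   # noisy teacher outputs

    # Gradient-method update over the K correlated examples (assumed form).
    J += eta * (y - X @ J) @ X

    # Block orthogonal projection alternative (sketch): enforce all K
    # constraints y_k = J . x_k exactly within the span of the inputs:
    #   G = X @ X.T
    #   J += X.T @ np.linalg.solve(G, y - X @ J)

    if t % N == 0:
        # Noise-free part of the generalization error of a linear perceptron,
        # (1/2) E_x[(B.x - J.x)^2] ~ |B - J|^2 / (2 N).
        print(f"t/N = {t // N:3d}   e_g ~ {0.5 * np.dot(B - J, B - J) / N:.4f}")
```

With these assumed parameters, increasing q or K in the sketch should require a smaller eta for stable learning, consistent with the abstract's qualitative conclusions for the gradient method.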