A framework for the development of globally convergent adaptive learning rate algorithms

  • Authors:
  • G. D. Magoulas; V. P. Plagianakos; G. S. Androulakis; M. N. Vrahatis

  • Affiliations:
  • G. D. Magoulas: Department of Informatics, University of Athens, GR-157.84 Athens, Greece, and University of Patras Artificial Intelligence Research Center, GR-261.10 Patras, Greece
  • V. P. Plagianakos, G. S. Androulakis, M. N. Vrahatis: Department of Mathematics, University of Patras, GR-261.10 Patras, Greece, and University of Patras Artificial Intelligence Research Center, GR-261.10 Patras, Greece

  • Venue:
  • Progress in computer research
  • Year:
  • 2001

Abstract

In this paper we propose a framework for developing globally convergent batch training algorithms with an adaptive learning rate. The framework provides conditions under which global convergence is guaranteed for adaptive learning rate training algorithms. To this end, the learning rate is appropriately tuned along the given descent direction. By imposing conditions on the search direction and the corresponding stepsize, the framework can also guarantee global convergence for training algorithms that use a different learning rate for each weight. Simulation results on various training algorithms are provided to illustrate the effectiveness of the proposed approach.
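The core idea of tuning the learning rate along a descent direction until a sufficient-decrease condition holds can be sketched with a standard Armijo backtracking line search. This is an illustrative stand-in for the kind of stepsize tuning the abstract describes, not the authors' specific framework; all function names and constants below are assumptions for the sketch.

```python
import numpy as np

def armijo_step(f, grad_f, w, direction, lr0=1.0, beta=0.5, sigma=1e-4,
                max_halvings=50):
    """Shrink the learning rate along a descent direction until the
    Armijo sufficient-decrease condition holds (illustrative sketch)."""
    fw = f(w)
    slope = np.dot(grad_f(w), direction)  # directional derivative; < 0 for descent
    lr = lr0
    for _ in range(max_halvings):
        if f(w + lr * direction) <= fw + sigma * lr * slope:
            return lr
        lr *= beta  # halve the stepsize and try again
    return lr

def train(f, grad_f, w0, iters=100):
    """Batch gradient descent with an adaptively tuned learning rate."""
    w = np.asarray(w0, dtype=float)
    for _ in range(iters):
        d = -grad_f(w)                     # steepest-descent direction
        lr = armijo_step(f, grad_f, w, d)  # tune the learning rate along d
        w = w + lr * d
    return w

# Toy example: minimize a simple quadratic "error" function.
f = lambda w: 0.5 * np.sum(w ** 2)
grad_f = lambda w: w
w_star = train(f, grad_f, [3.0, -2.0])
```

Under standard assumptions on the objective, accepting only stepsizes that satisfy such a sufficient-decrease condition is what yields the global convergence guarantee; a per-weight variant would apply a similar test to each coordinate's learning rate.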