Model selection for regularized least-squares classification

  • Authors: Hui-Hua Yang, Xing-Yu Wang, Yong Wang, Hai-Hua Gao

  • Affiliations: Department of Computer Science, Guilin University of Electronic Technology, Guilin, China (Hui-Hua Yang, Yong Wang); School of Information Science and Engineering, East China University of Science and Technology, Shanghai, China (Xing-Yu Wang, Hai-Hua Gao)

  • Venue: ICNC'05, Proceedings of the First International Conference on Advances in Natural Computation, Part I

  • Year: 2005


Abstract

Regularized Least-Squares Classification (RLSC) can be regarded as a two-layer neural network that uses a regularized square loss function and the kernel trick. Poggio and Smale recently reformulated it within the mathematical foundations of learning and called it a key algorithm of learning theory. The generalization performance of RLSC depends heavily on the setting of its kernel parameters and hyperparameter. We therefore present a novel two-step approach to optimal parameter selection: first, the optimal kernel parameters are selected by maximizing the kernel-target alignment; then, the optimal hyperparameter is determined by minimizing RLSC's leave-one-out bound. Unlike traditional grid search, our method needs no independent validation set. Experiments on the IDA benchmark datasets with the Gaussian kernel demonstrate that the method is feasible and time efficient.
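The two-step procedure described in the abstract lends itself to a short illustration. Below is a minimal Python sketch, assuming a Gaussian kernel, labels in {-1, +1}, and grids of candidate parameters; the function names and the grid-search interface are illustrative assumptions, not the paper's own code. Step 2 uses the standard closed-form leave-one-out residual of regularized least squares, which is one common way to realize a LOO bound without retraining.

```python
# Minimal sketch of the two-step parameter selection (illustrative only).
import numpy as np

def gaussian_kernel(X, sigma):
    """Gram matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_target_alignment(K, y):
    """A(K, yy^T) = <K, yy^T>_F / (||K||_F ||yy^T||_F).
    For +/-1 labels, ||yy^T||_F = n."""
    n = len(y)
    return (y @ K @ y) / (n * np.linalg.norm(K, "fro"))

def loo_error(K, y, lam):
    """LOO misclassification rate via the closed-form residual of
    regularized least squares: with H = K (K + lam I)^{-1},
    f_{-i}(x_i) = (f(x_i) - H_ii y_i) / (1 - H_ii).
    (The scaling of lam, e.g. lam vs. n*lam, is a convention choice.)"""
    n = len(y)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    f = H @ y
    f_loo = (f - np.diag(H) * y) / (1.0 - np.diag(H))
    return np.mean(y * f_loo < 0)

def select_parameters(X, y, sigmas, lams):
    # Step 1: kernel width that maximizes kernel-target alignment.
    best_sigma = max(
        sigmas,
        key=lambda s: kernel_target_alignment(gaussian_kernel(X, s), y),
    )
    # Step 2: regularization parameter that minimizes the LOO error,
    # with the kernel now fixed.
    K = gaussian_kernel(X, best_sigma)
    best_lam = min(lams, key=lambda l: loo_error(K, y, l))
    return best_sigma, best_lam
```

Because the kernel is fixed after step 1, step 2 only scans the single regularization parameter, and the closed-form LOO residual avoids refitting the model n times; this is what makes the approach cheaper than a full grid search over both parameters with a held-out validation set.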