Boosting method for local learning in statistical pattern recognition

  • Authors:
  • Masanori Kawakita; Shinto Eguchi

  • Affiliations:
  • Masanori Kawakita: Department of Computer Science and Communication Engineering, Kyushu University, Fukuoka 819-0395, Japan. kawakita@csce.kyushu-u.ac.jp
  • Shinto Eguchi: Institute of Statistical Mathematics, and Department of Statistical Science, The Graduate University for Advanced Studies, Tokyo 106-8569, Japan. eguchi@ism.ac.jp

  • Venue:
  • Neural Computation
  • Year:
  • 2008

Abstract

We propose a local boosting method for classification problems, borrowing an idea from the local likelihood method. Our proposal, local boosting, includes a simple localization device that keeps the computation feasible. We prove the Bayes risk consistency of local boosting in the framework of probably approximately correct (PAC) learning. Inspection of the proof provides a useful viewpoint for comparing ordinary boosting and local boosting with respect to the estimation error and the approximation error. Both boosting methods are Bayes risk consistent if their approximation errors decrease to zero. Compared with ordinary boosting, local boosting may perform better by controlling the trade-off between the estimation error and the approximation error. Ordinary boosting with complicated base classifiers, or other strong classification methods such as kernel machines, may achieve classification performance comparable to that of local boosting with simple base classifiers such as decision stumps. Local boosting, however, has an advantage in interpretability: with simple base classifiers, it offers a straightforward way to identify which features are informative and how their values contribute to the classification rule, even if only locally. Several numerical studies on real data sets confirm these advantages of local boosting.
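To make the idea concrete, here is a minimal sketch of how such a localized boosting rule might look: for each query point, the training sample is reweighted by a kernel centered at the query (the device borrowed from local likelihood), and boosting with decision stumps is then run on those localized weights. The Gaussian kernel, the plain AdaBoost update, and all function names (local_boost_predict, stump_fit) are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np


def stump_fit(X, y, w):
    """Return the decision stump (feature, threshold, sign) minimizing weighted error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, thr, sign)
    return best, best_err


def stump_predict(stump, X):
    j, thr, sign = stump
    return sign * np.where(X[:, j] <= thr, 1, -1)


def local_boost_predict(X, y, x0, h=1.0, T=50):
    """Classify query point x0 with boosted stumps on a kernel-localized sample.

    Hypothetical sketch: the Gaussian localization weights and the plain
    AdaBoost updates are assumptions, not necessarily the paper's scheme.
    """
    # Localization: weight each training example by its kernel proximity to
    # x0, in the spirit of the local likelihood method.
    k = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2.0 * h ** 2))
    w = k / k.sum()
    F = 0.0
    for _ in range(T):
        stump, err = stump_fit(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        F += alpha * stump_predict(stump, x0.reshape(1, -1))[0]
        # AdaBoost reweighting; the kernel localization persists in the weights.
        w *= np.exp(-alpha * y * stump_predict(stump, X))
        w /= w.sum()
    return int(np.sign(F))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)   # XOR-like: stumps fail globally
    print(local_boost_predict(X, y, np.array([0.5, 0.5]), h=0.7, T=30))  # expect 1
```

In this reading, the bandwidth h is the knob behind the estimation/approximation trade-off discussed in the abstract: a smaller h lets simple stumps track the local decision boundary (smaller approximation error) but leaves fewer effectively weighted examples (larger estimation error). The fitted stumps also illustrate the interpretability claim, since each one names a single feature and threshold that matter near the query point.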