Ent-Boost: Boosting using entropy measures for robust object detection

  • Authors:
  • Duy-Dinh Le; Shin'ichi Satoh

  • Affiliations:
  • The Graduate University for Advanced Studies, Department of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan; The Graduate University for Advanced Studies, Department of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan and National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku ...

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2007


Abstract

Recently, boosting has come to be widely used in object-detection applications because of its impressive speed and accuracy. However, learning weak classifiers, one of the most significant tasks in applying boosting, is left to users. In Discrete AdaBoost, weak classifiers with binary output are too weak to boost when the training data are complex. Meanwhile, determining the appropriate number of bins for weak classifiers learned by Real AdaBoost is challenging: too few bins may not accurately approximate the real distribution, while too many may cause over-fitting, increase computation time, and waste storage space. We have developed Ent-Boost, a novel boosting scheme for efficiently learning weak classifiers using entropy measures. Class entropy information is used to automatically estimate the optimal number of bins through a discretization process. The Kullback-Leibler divergence, i.e., the relative entropy between the probability distributions of positive and negative samples, is then used to select the best weak classifier from the weak classifier set. Experiments show that strong classifiers learned by Ent-Boost achieve good performance while occupying compact storage space. A robust face detector built with Ent-Boost further demonstrates the effectiveness of the scheme.
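The KL-based selection step described in the abstract can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the `n_bins` parameter stands in for the bin count that Ent-Boost estimates automatically from class entropy, the feature matrices and the `select_weak_classifier` helper are hypothetical names, and symmetric KL divergence between the binned positive and negative histograms is one plausible reading of the selection criterion.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    # Relative entropy D(p || q) between two discrete distributions.
    # A small epsilon avoids log(0) and division by zero in empty bins.
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))

def select_weak_classifier(features_pos, features_neg, n_bins=8):
    """Pick the feature whose binned positive/negative response
    distributions are most separated under symmetric KL divergence.
    `n_bins` is fixed here; Ent-Boost would estimate it from class
    entropy during discretization."""
    n_features = features_pos.shape[1]
    best_idx, best_score = -1, -np.inf
    for j in range(n_features):
        # Common bin edges over the pooled range of both classes.
        lo = min(features_pos[:, j].min(), features_neg[:, j].min())
        hi = max(features_pos[:, j].max(), features_neg[:, j].max())
        edges = np.linspace(lo, hi, n_bins + 1)
        hp, _ = np.histogram(features_pos[:, j], bins=edges)
        hn, _ = np.histogram(features_neg[:, j], bins=edges)
        # Symmetrized KL: large when the two class histograms differ.
        score = (kl_divergence(hp.astype(float), hn.astype(float))
                 + kl_divergence(hn.astype(float), hp.astype(float)))
        if score > best_score:
            best_idx, best_score = j, score
    return best_idx, best_score
```

In a boosting round, this selection would run over the weak-classifier (feature) pool on the current sample weighting; a weighted histogram (`np.histogram(..., weights=w)`) would replace the plain counts in that setting.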