LoGID: An adaptive framework combining local and global incremental learning for dynamic selection of ensembles of HMMs

  • Authors:
  • Paulo R. Cavalin; Robert Sabourin; Ching Y. Suen

  • Affiliations:
  • Universidade Federal do Tocantins (UFT), École de Technologie Supérieure (ETS), Quadra 109 Norte Av. NS15 s/n Bl. II sala 21, Palmas (TO) 77001-090, Brazil; École de Technologie Supérieure (ETS), 1100 Notre-Dame Ouest, Montréal (QC), Canada H3C-1K3; Centre for Pattern Recognition and Machine Intelligence (CENPARMI), Concordia University, 1455 de Maisonneuve Blvd West, Montréal (QC), Canada H3G-1M8

  • Venue:
  • Pattern Recognition
  • Year:
  • 2012

Abstract

In this work, we propose the LoGID (Local and Global Incremental Learning for Dynamic Selection) framework, the main goal of which is to adapt hidden Markov model-based pattern recognition systems during both the generalization and learning phases. Given that the baseline system is composed of a pool of base classifiers, adaptation during generalization is performed through the dynamic selection of the members of this pool that best recognize each test sample. This is achieved by the proposed K-nearest output profiles algorithm, while adaptation during learning consists of gradually updating the knowledge embedded in the base classifiers, by processing previously unobserved data. This phase employs two types of incremental learning: local and global. Local incremental learning involves updating the pool of base classifiers by adding new members to this set. The new members are created with the Learn++ algorithm. Global incremental learning, in contrast, consists of updating the set of output profiles used during generalization. The proposed framework has been evaluated on a diversified set of databases. The results indicate that LoGID is promising. For most databases, the recognition rates achieved by the proposed method are higher than those achieved by other state-of-the-art approaches, such as batch learning. Furthermore, the simulated incremental learning setting demonstrates that LoGID can effectively improve the performance of systems created with small training sets as more data are observed over time.
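The dynamic-selection step described above can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it assumes a simplified representation in which each sample's output profile is a fixed-length vector of concatenated classifier scores, and a validation set records which base classifiers recognized each validation sample correctly. For a test sample, the K nearest output profiles are retrieved and the classifiers that performed best on that neighborhood are selected. The function name `knop_select` and the fallback-to-whole-pool behavior are illustrative choices, not details from the paper.

```python
import numpy as np

def knop_select(test_profile, val_profiles, val_correct, k=3):
    """Simplified K-nearest output profiles (KNOP) dynamic selection.

    test_profile : (d,) output profile of the test sample.
    val_profiles : (n, d) output profiles of the validation samples.
    val_correct  : (n, m) boolean matrix; entry (i, j) is True when base
                   classifier j recognized validation sample i correctly.
    Returns the indices of the classifiers selected for this test sample.
    """
    # Find the k validation samples whose output profiles are closest.
    dists = np.linalg.norm(val_profiles - test_profile, axis=1)
    neighbors = np.argsort(dists)[:k]
    # Count, for each classifier, how many neighbors it got right.
    votes = val_correct[neighbors].sum(axis=0)
    # Keep the classifiers that did best on the neighborhood; when no
    # classifier was correct there (max vote 0), this keeps the whole pool.
    return np.flatnonzero(votes == votes.max())
```

A short usage example: with three validation profiles and two classifiers, a test profile near the first two neighbors selects whichever classifier was correct on more of them; the selected subset would then be combined (e.g., by voting) to label the test sample.

```python
val_profiles = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
val_correct = np.array([[True, False], [True, True], [False, True]])
print(knop_select(np.array([0.1, 0.1]), val_profiles, val_correct, k=2))  # [0]
```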