Several speed-up variants of cascade generalization

  • Authors: Zhipeng Xie
  • Affiliations: Department of Computing and Information Technology, Fudan University, Shanghai, China
  • Venue: FSKD'06, Proceedings of the Third International Conference on Fuzzy Systems and Knowledge Discovery
  • Year: 2006


Abstract

Cascade generalization sequentially composes different classification methods into a single framework. However, each subsequent classification method must cope with an enlarged attribute set. As a result, its learning process slows down, especially on data sets with many class labels and with learning algorithms whose computational complexity grows quickly with the number of attributes. This paper proposes several variants of the original cascade generalization, whose basic idea is to reduce the number of augmented attributes added at each step of the Cascade framework. Extensive experimental results show that, as expected, all the variants are much faster than the original framework. In addition, all the variants achieve a slight reduction in error rates compared with the original Cascade framework.
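
To make the mechanism concrete, the following is a minimal Python sketch (assuming scikit-learn and NumPy) of cascade generalization: a level-1 classifier's class probabilities are appended to the original attributes before a level-2 classifier is trained. The reduced option, which keeps only the predicted class and its probability instead of the full probability vector, illustrates the general idea of shrinking the augmented attribute block; it is not the exact reduction scheme proposed in the paper.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier


def augment(X, proba, reduced=False):
    # Original Cascade: append the full class-probability vector, i.e. one new
    # attribute per class label. Reduced variant (illustrative only): append
    # just the predicted class index and its probability -- 2 attributes
    # instead of n_classes, which matters when there are many class labels.
    if reduced:
        pred = np.argmax(proba, axis=1)
        extra = np.column_stack([pred, proba[np.arange(len(pred)), pred]])
    else:
        extra = proba
    return np.hstack([X, extra])


X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Level-1 learner: fit on the original attributes, emit class probabilities.
level1 = GaussianNB().fit(X_tr, y_tr)

# Level-2 learner: fit on original attributes plus the augmented attributes.
Z_tr = augment(X_tr, level1.predict_proba(X_tr), reduced=True)
level2 = DecisionTreeClassifier(random_state=0).fit(Z_tr, y_tr)

Z_te = augment(X_te, level1.predict_proba(X_te), reduced=True)
print(f"cascade accuracy: {level2.score(Z_te, y_te):.3f}")

With reduced=True the level-2 learner sees only two augmented attributes regardless of the number of class labels, which is the kind of saving the speed-up variants aim for; with reduced=False the sketch behaves like the original Cascade framework.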