Supervised learning with minimal effort

  • Authors:
  • Eileen A. Ni; Charles X. Ling

  • Affiliations:
  • Department of Computer Science, The University of Western Ontario, London, Ontario, Canada (both authors)

  • Venue:
  • PAKDD'10: Proceedings of the 14th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining - Volume Part II
  • Year:
  • 2010

Abstract

Traditional supervised learning learns from whatever training examples are given to it. This is dramatically different from human learning: humans learn simple examples before conquering hard ones, so as to minimize their effort. Effort can equate to energy consumption, and it is important for machine learning modules to use minimal energy in real-world deployments. In this paper, we propose a novel, simple and effective machine learning paradigm that explicitly exploits this important simple-to-complex (S2C) human learning strategy, and implement it efficiently based on C4.5. Experimental results show that S2C has several distinctive advantages over the original C4.5. First, S2C does indeed take much less effort in learning the training examples than C4.5, which selects examples randomly. Second, with minimal effort, the learning process is much more stable. Finally, even though S2C only locally updates the model with minimal effort, we show that it is as accurate as the global learner C4.5. Applications of this simple-to-complex learning strategy to real-world learning tasks, especially cognitive learning tasks, will be fruitful.
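
The abstract gives no implementation details, so the sketch below illustrates only the general simple-to-complex ordering idea, not the authors' C4.5-based algorithm. It uses scikit-learn's DecisionTreeClassifier, which retrains from scratch each round rather than locally updating the tree as the paper describes, and the difficulty measure (the current model's confidence in an example's true label) is an assumption made purely for illustration.

    # A minimal sketch of simple-to-complex (S2C) example ordering.
    # NOT the paper's algorithm: the tree is rebuilt each round, and the
    # "simplicity" heuristic (model confidence) is assumed for illustration.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=10, random_state=0)

    # Seed the learner with a small random batch; the rest form the pool.
    seed = rng.choice(len(X), size=20, replace=False)
    pool = np.setdiff1d(np.arange(len(X)), seed)
    train_idx = list(seed)

    tree = DecisionTreeClassifier(random_state=0)
    tree.fit(X[train_idx], y[train_idx])

    batch = 20
    while len(pool) > 0:
        # Score pool examples by the model's confidence in the true label;
        # high confidence is treated as "simple" and is learned first.
        proba = tree.predict_proba(X[pool])
        conf = proba[np.arange(len(pool)), y[pool]]
        order = np.argsort(-conf)            # simplest examples first
        chosen = pool[order[:batch]]
        train_idx.extend(chosen)
        pool = np.setdiff1d(pool, chosen)
        tree.fit(X[train_idx], y[train_idx])  # retrain on the grown set

    print("final training accuracy:", tree.score(X, y))

Under this reading, each round grows the training set with the examples the current model already finds easy, deferring hard ones to later rounds, which mirrors the simple-to-complex progression the abstract attributes to human learning.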