More efficient windowing

  • Authors: Johannes Fürnkranz

  • Affiliations: Austrian Research Institute for Artificial Intelligence, Wien, Austria

  • Venue: AAAI'97/IAAI'97: Proceedings of the Fourteenth National Conference on Artificial Intelligence and Ninth Conference on Innovative Applications of Artificial Intelligence

  • Year: 1997

Abstract

Windowing has been proposed as a procedure for efficient memory use in the ID3 decision tree learning algorithm. However, previous work has shown that windowing often leads to a decrease in performance. In this work, we argue that rule learning algorithms are better suited to windowing than decision tree algorithms, because the former typically learn and evaluate rules independently and are thus less susceptible to changes in class distributions. Most importantly, we present a new windowing algorithm that achieves additional gains in efficiency by saving promising rules and removing the examples covered by these rules from the learning window. While the presented algorithm is only suitable for redundant, noise-free data sets, we also briefly discuss the problem of noisy data for windowing algorithms.
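The loop the abstract describes (learn on a small window, keep rules that are consistent, shrink the window by removing covered examples, and grow it with misclassified outside examples) can be sketched as follows. This is an illustrative reconstruction under the paper's noise-free assumption, not the paper's actual implementation; the function and parameter names (`windowing`, `learn_rules`, `init_size`, `max_new`) and the dict-based rule representation are our own choices.

```python
import random

def covers(rule, feats):
    """A rule is a dict of attribute -> required value (our toy encoding)."""
    return all(feats.get(a) == v for a, v in rule.items())

def classify(theory, feats):
    """An example is classified positive iff some saved rule covers it."""
    return any(covers(r, feats) for r in theory)

def windowing(examples, learn_rules, init_size=4, max_new=4, seed=0):
    """Sketch of windowing for a rule learner.

    `examples` is a list of (features_dict, label) pairs;
    `learn_rules(window)` returns candidate rules for the positive class.
    """
    rng = random.Random(seed)
    pool = list(examples)
    rng.shuffle(pool)
    window, outside = pool[:init_size], pool[init_size:]
    theory = []  # promising rules saved across iterations
    while True:
        for rule in learn_rules(window):
            # Keep a rule only if it covers no negative example in the
            # full training set (the noise-free consistency test) ...
            if all(lbl for feats, lbl in pool if covers(rule, feats)):
                if rule not in theory:
                    theory.append(rule)
        # ... and shrink the window by dropping the positives the saved
        # rules cover: the additional efficiency gain on redundant data.
        window = [(f, l) for f, l in window
                  if not (l and classify(theory, f))]
        # Grow the window with examples the current theory misclassifies.
        errors = [(f, l) for f, l in outside
                  if classify(theory, f) != bool(l)]
        if not errors:
            return theory
        window += errors[:max_new]
        # Rules are never retracted, so examples classified correctly
        # once stay correct and need not be re-checked.
        outside = errors[max_new:]
```

With a deliberately redundant data set and a trivial learner that memorizes one maximally specific rule per positive window example, the loop converges after a few iterations without ever loading the whole training set into the learner at once.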