Tolerating Concept and Sampling Shift in Lazy Learning Using Prediction Error Context Switching

  • Authors:
  • Marcos Salganicoff

  • Affiliations:
  • Applied Science and Engineering Laboratories (ASEL), University of Delaware/Alfred I. duPont Institute, 1600 Rockland Road, Wilmington, Delaware, USA. E-mail: salganic@asel.udel.edu

  • Venue:
  • Artificial Intelligence Review - Special issue on lazy learning
  • Year:
  • 1997

Abstract

In their unmodified form, lazy-learning algorithms may have difficulty learning and tracking time-varying input/output function maps such as those that occur in concept shift. Extensions of these algorithms, such as Time-Windowed Forgetting (TWF), can permit learning of time-varying mappings by deleting older exemplars, but have decreased classification accuracy when the input-space sampling distribution of the learning set is time-varying. Additionally, TWF suffers from lower asymptotic classification accuracy than equivalent non-forgetting algorithms when the input sampling distributions are stationary. Other shift-sensitive algorithms, such as Locally-Weighted Forgetting (LWF), avoid the negative effects of time-varying sampling distributions, but still have lower asymptotic classification accuracy in non-varying cases. We introduce Prediction Error Context Switching (PECS), which allows lazy-learning algorithms to achieve good classification accuracy under time-varying function mappings and input sampling distributions, while still maintaining their asymptotic classification accuracy in static tasks. PECS works by selecting and re-activating previously stored instances based on their most recent consistency record. The classification accuracy and active learning set sizes for the above algorithms are compared in a set of learning tasks that illustrate the differing time-varying conditions described above. The results show that the PECS algorithm has the best overall classification accuracy over these differing time-varying conditions, while still having asymptotic classification accuracy competitive with unmodified lazy learners intended for static environments.
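
The abstract's description of PECS (stored instances that are de-activated or re-activated according to a recent record of their predictive consistency) can be illustrated with a minimal sketch. The 1-nearest-neighbour base learner, the sliding-window consistency record, the neighbourhood radius, and the activation/de-activation thresholds below are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of a PECS-style lazy learner, based only on the abstract:
# instances are never deleted, only moved between active and inactive
# "contexts" according to their recent consistency with observed outputs.
from collections import deque
import math


class PECSLazyLearner:
    def __init__(self, window=10, act_thresh=0.7, deact_thresh=0.4, radius=1.0):
        self.window = window              # length of each instance's consistency record (assumed)
        self.act_thresh = act_thresh      # re-activate when recent consistency exceeds this (assumed)
        self.deact_thresh = deact_thresh  # de-activate when recent consistency falls below this (assumed)
        self.radius = radius              # squared-distance radius defining "nearby" instances (assumed)
        self.instances = []               # each: {"x", "y", "record", "active"}

    def _nearest_active(self, x):
        best, best_d = None, math.inf
        for inst in self.instances:
            if not inst["active"]:
                continue
            d = sum((a - b) ** 2 for a, b in zip(inst["x"], x))
            if d < best_d:
                best, best_d = inst, d
        return best

    def predict(self, x):
        # 1-NN prediction over the active learning set only.
        inst = self._nearest_active(x)
        return inst["y"] if inst is not None else None

    def learn(self, x, y):
        # Update the consistency records of nearby stored instances against the
        # newly observed label, then switch their context (active/inactive).
        for inst in self.instances:
            d = sum((a - b) ** 2 for a, b in zip(inst["x"], x))
            if d < self.radius:
                inst["record"].append(1 if inst["y"] == y else 0)
                rate = sum(inst["record"]) / len(inst["record"])
                if inst["active"] and rate < self.deact_thresh:
                    inst["active"] = False   # recently inconsistent: retire, but keep stored
                elif not inst["active"] and rate > self.act_thresh:
                    inst["active"] = True    # consistent again: re-activate
        self.instances.append(
            {"x": x, "y": y, "record": deque([1], maxlen=self.window), "active": True}
        )
```

Under a concept shift, instances encoding the old mapping accumulate poor consistency records and drop out of the active set, while under a stationary task nothing is discarded, which is consistent with the abstract's claim that PECS retains the asymptotic accuracy of unmodified lazy learners.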