Incremental learning in fuzzy pattern matching

  • Authors:
  • Moamar Sayed Mouchaweh;Arnaud Devillez;Gerard Villermain Lecolier;Patrice Billaudel

  • Affiliations:
  • Laboratoire d'Automatique et de Microélectronique, IFTS, 7, Boulevard Jean Delautre, 08000 Charleville-Mézières, France (all authors)

  • Venue:
  • Fuzzy Sets and Systems - Possibility theory and fuzzy logic
  • Year:
  • 2002


Abstract

We use learning methods to build classifiers from a set of training samples. A classifier can assign a new sample to one of the learnt classes. In a non-stationary working environment, a classifier must be retrained each time a new sample is classified, in order to acquire new knowledge from it. This is impractical, since it requires storing all available samples and a considerable computation time. A classifier that can acquire new knowledge from each newly classified sample, while preserving its current knowledge, is said to perform incremental learning. Our research team, "Diagnosis of Industrial Processes", works on diagnosis using fuzzy classification methods for data from the industrial and medical sectors. We use Fuzzy Pattern Matching (FPM) as the classification method, together with the probability-possibility transformation of Dubois and Prade to construct possibility densities. These densities are used to assign each new sample to its most suitable class. When FPM operates in a non-stationary environment, it must update its possibility densities after the classification of each new sample, so that the classifier adapts to possible changes in the environment. As the number of samples grows, so do the update time and the memory required to store all the samples. To our knowledge, no published work integrates incremental learning into FPM. In this paper we propose a method to do so, and we show that the resulting update time and memory size are constant and independent of the number of samples. Finally, we illustrate the advantages of the method with several examples.
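The constant-cost update claimed in the abstract can be sketched as follows. This is a hypothetical, simplified illustration (not the authors' implementation): each class keeps one histogram of bin *counts* per feature, so learning from a newly classified sample only increments counters, and the cost per update depends on the number of bins, never on the number of samples seen. The possibility densities are derived from the histograms with a simplified, tie-ignoring form of the Dubois-Prade probability-possibility transformation; all names and parameters below are illustrative assumptions.

```python
import numpy as np

class IncrementalFPM:
    """Sketch of histogram-based Fuzzy Pattern Matching with
    incremental learning. Storing bin counts (not raw samples) makes
    both update time and memory independent of the sample count."""

    def __init__(self, n_classes, n_features, n_bins=10, feature_range=(0.0, 1.0)):
        self.n_bins = n_bins
        self.lo, self.hi = feature_range
        # counts[c, f] is the histogram of feature f for class c
        self.counts = np.zeros((n_classes, n_features, n_bins))

    def _bin(self, value):
        # map a feature value to a histogram bin, clipping to the range
        idx = int((value - self.lo) / (self.hi - self.lo) * self.n_bins)
        return min(max(idx, 0), self.n_bins - 1)

    def learn(self, x, c):
        """Incremental update: one counter increment per feature."""
        for f, v in enumerate(x):
            self.counts[c, f, self._bin(v)] += 1

    def _possibility(self, probs):
        """Simplified Dubois-Prade transform (ties not treated
        specially): pi_i = sum of all p_j with rank j >= rank of i."""
        order = np.argsort(probs)[::-1]            # descending probabilities
        tail = probs[order][::-1].cumsum()[::-1]   # tail[i] = sum over ranks >= i
        pi = np.empty_like(probs)
        pi[order] = tail
        return pi

    def classify(self, x):
        """Score each class by fusing per-feature possibilities with
        min (conjunctive fusion); return the best class and all scores."""
        n_classes = self.counts.shape[0]
        scores = np.ones(n_classes)
        for c in range(n_classes):
            for f, v in enumerate(x):
                total = self.counts[c, f].sum()
                probs = (self.counts[c, f] / total if total
                         else np.full(self.n_bins, 1.0 / self.n_bins))
                pi = self._possibility(probs)
                scores[c] = min(scores[c], pi[self._bin(v)])
        return int(np.argmax(scores)), scores

# Usage: two one-dimensional classes clustered near 0.2 and 0.8
clf = IncrementalFPM(n_classes=2, n_features=1)
for v in (0.15, 0.2, 0.25):
    clf.learn([v], 0)
for v in (0.75, 0.8, 0.85):
    clf.learn([v], 1)
label, _ = clf.classify([0.22])   # falls in class 0's dense bins
```

The design choice to persist only histograms is what the paper's claim rests on: each `learn` call touches a fixed number of counters, and `classify` rebuilds the possibility densities from those counters on demand, so neither operation needs the original samples.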