We use learning methods to build classifiers from a set of training samples. A classifier is capable of assigning a new sample to one of the learnt classes. In a non-stationary working environment, a classifier must be retrained every time a new sample is classified, in order to extract new knowledge from it. This is an impractical solution, since it requires storing all available samples and a considerable computation time. The development of a classifier capable of acquiring new knowledge from each newly classified sample, while preserving the knowledge already learnt, is known as incremental learning. Our research team, "Diagnosis of Industrial Processes", works on diagnosis using fuzzy classification methods for data coming from the industrial and medical sectors. We use Fuzzy Pattern Matching (FPM) as the classification method, together with the probability-possibility transformation of Dubois and Prade to construct possibility densities. These densities are used to assign each new sample to its most suitable class. When FPM operates in a non-stationary environment, it must update its possibility densities after the classification of each new sample, so that the classifier adapts to possible changes in the working environment. As the number of samples increases, the update time also increases, and so does the memory required to store all the samples. To our knowledge, no published work integrates incremental learning into FPM. In this paper, we therefore propose a method to integrate it. We then show that the update time and the memory size are constant and independent of the number of samples. Finally, we illustrate the advantages of this method on several examples.
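To make the two ingredients of the abstract concrete, the following is a minimal illustrative sketch, not the authors' actual method: the Dubois-Prade probability-possibility transformation (pi_i = sum_j min(p_i, p_j)), and an incremental FPM-style classifier that keeps one fixed-size histogram per (class, feature). All names (`prob_to_poss`, `IncrementalFPM`), the equal-width binning, and the min-based matching degree are assumptions made for illustration; the key property shown is that each update touches a single histogram bin, so update time and memory do not grow with the number of samples.

```python
import numpy as np

def prob_to_poss(p):
    """Dubois-Prade probability-possibility transformation:
    pi_i = sum_j min(p_i, p_j).  The most probable event gets
    possibility 1; less probable events get smaller values."""
    p = np.asarray(p, dtype=float)
    return np.minimum.outer(p, p).sum(axis=1)

class IncrementalFPM:
    """Toy incremental fuzzy-pattern-matching classifier (illustrative).
    One fixed-size histogram per (class, feature): memory is constant,
    and each update increments a single bin, so the update cost is
    independent of how many samples have been seen."""

    def __init__(self, n_classes, n_features, n_bins, lo, hi):
        self.counts = np.zeros((n_classes, n_features, n_bins))
        self.n_bins, self.lo, self.hi = n_bins, lo, hi

    def _bin(self, x):
        # Map each feature value to an equal-width histogram bin index.
        idx = (np.asarray(x, dtype=float) - self.lo) / (self.hi - self.lo)
        return np.clip((idx * self.n_bins).astype(int), 0, self.n_bins - 1)

    def update(self, x, label):
        # Incremental learning step: one bin per feature is incremented.
        for f, b in enumerate(self._bin(x)):
            self.counts[label, f, b] += 1

    def classify(self, x):
        bins = self._bin(x)
        scores = []
        for c in range(self.counts.shape[0]):
            poss = []
            for f, b in enumerate(bins):
                total = self.counts[c, f].sum()
                if total == 0:
                    poss.append(0.0)
                    continue
                # Histogram frequencies -> possibility density for class c.
                poss.append(prob_to_poss(self.counts[c, f] / total)[b])
            # FPM-style conjunctive merge of the per-feature possibilities.
            scores.append(min(poss))
        return int(np.argmax(scores))
```

A usage example: after feeding a few one-dimensional samples of two well-separated classes via `update`, `classify` assigns a new sample to the class whose possibility density is highest at its bin, and further updates never require revisiting past samples.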