Recently, pattern recognition techniques have been applied to fault diagnosis. Principal component analysis (PCA) and kernel principal component analysis (KPCA) have been introduced for feature extraction; however, these unsupervised learning methods do not incorporate prior knowledge of process patterns. This paper proposes a novel fault diagnosis system to improve diagnostic performance. Kernel Fisher discriminant analysis (KFDA) is first used for feature extraction; a Gaussian mixture model (GMM) and k-nearest neighbor (kNN) classification are then applied for fault detection and isolation in the KFDA subspace. Because errors made at the fault detection stage degrade the overall system, the proposed system performs fault detection and identification in a holistic manner, without an intermediate detection step. A case study on the Tennessee Eastman (TE) benchmark process shows that the proposed methods are more efficient than traditional ones. Furthermore, since GMM and kNN perform comparably, the data structure of the process should be examined beforehand so that the better-suited classifier can be selected.
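The pipeline described above can be sketched in scikit-learn. This is a minimal illustration on synthetic data, not the paper's implementation: KFDA is approximated here by kernel PCA followed by linear discriminant analysis (a common surrogate, since scikit-learn ships no KFDA estimator), and the synthetic classes stand in for TE process faults.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.mixture import GaussianMixture
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for process data (the paper uses the TE benchmark).
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# Feature extraction: kernel map + Fisher projection (KFDA surrogate).
extract = make_pipeline(StandardScaler(),
                        KernelPCA(n_components=10, kernel="rbf"),
                        LinearDiscriminantAnalysis(n_components=2))
Z_tr = extract.fit_transform(X_tr, y_tr)
Z_te = extract.transform(X_te)

# Classifier 1: kNN on the discriminant subspace.
knn = KNeighborsClassifier(n_neighbors=5).fit(Z_tr, y_tr)
acc_knn = knn.score(Z_te, y_te)

# Classifier 2: one GMM per class; assign each sample to the class
# whose mixture gives it the highest log-likelihood.
gmms = {c: GaussianMixture(n_components=2, random_state=0).fit(Z_tr[y_tr == c])
        for c in np.unique(y_tr)}
scores = np.column_stack([gmms[c].score_samples(Z_te) for c in sorted(gmms)])
acc_gmm = (scores.argmax(axis=1) == y_te).mean()
print(f"kNN accuracy: {acc_knn:.2f}  GMM accuracy: {acc_gmm:.2f}")
```

Comparing `acc_knn` and `acc_gmm` mirrors the paper's observation: which classifier wins depends on the data structure, so it should be checked before committing to one.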