Linear discriminant analysis (LDA) is one of the most widely used supervised dimensionality reduction algorithms. Standard LDA operates in batch mode, requiring all data to be available before learning. In many real-world applications, however, data arrives continuously over time and sometimes undergoes concept drift, so it is often preferable to keep only the most recent data within a sliding window. Several incremental LDA algorithms have been developed with success, but they do not handle the case where an instance is deleted, and they incur large computational cost. In this paper, we propose a new online LDA algorithm, LS-OLDA, based on the least-squares solution of LDA. When an instance is inserted or deleted, it dynamically updates the least-squares solution of LDA. Our analysis shows that this algorithm produces the exact least-squares solution of batch LDA, while its computational cost is O(min(n, d)·d + nk) per update on a dataset of n instances in d-dimensional space with k classes. Experimental results show that the proposed algorithm achieves high accuracy with low time cost.
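To make the idea concrete, the sketch below shows (i) a batch least-squares formulation of LDA, in which the projection is obtained by regressing the centered data onto a class-indicator target matrix, and (ii) a Sherman-Morrison rank-one update of an inverse, the standard tool for updating a least-squares solution when a single instance is inserted or deleted. This is a minimal illustration of the general least-squares LDA framework, not the authors' exact LS-OLDA update rules; the target encoding and function names are our own assumptions.

```python
import numpy as np

def ls_lda(X, y):
    """Batch least-squares LDA sketch (assumed encoding, not the paper's exact method).

    X: (n, d) data matrix; y: (n,) integer class labels.
    Returns W: (d, k) projection obtained by least squares.
    """
    n, d = X.shape
    classes = np.unique(y)
    k = len(classes)
    Xc = X - X.mean(axis=0)          # center the data
    # Class-indicator target matrix, scaled per class size (one common choice)
    Y = np.zeros((n, k))
    for j, c in enumerate(classes):
        nc = np.sum(y == c)
        Y[y == c, j] = np.sqrt(n / nc)
    Y -= Y.mean(axis=0)
    # Least-squares solution W = (Xc^T Xc)^+ Xc^T Y
    W, *_ = np.linalg.lstsq(Xc, Y, rcond=None)
    return W

def sherman_morrison_update(A_inv, u, v):
    """Rank-one update of a matrix inverse: returns (A + u v^T)^{-1}.

    Updating (X^T X)^{-1} this way when a row of X is inserted (or deleted,
    with u = -x) is the standard trick behind incremental least squares.
    """
    Au = A_inv @ u
    vA = v @ A_inv
    denom = 1.0 + v @ Au
    return A_inv - np.outer(Au, vA) / denom
```

For an inserted instance x, one would call `sherman_morrison_update(G_inv, x, x)` on the cached Gram inverse and correct the right-hand side accordingly; deletion uses the negated rank-one term. The per-update cost of such updates is quadratic in min(n, d), consistent with the O(min(n, d)·d + nk) complexity stated above.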