This paper presents a pattern classification system in which feature extraction and classifier learning are carried out simultaneously, not only online but also in one pass, i.e., each training sample is presented only once. For this purpose, we previously extended incremental principal component analysis (IPCA) and effectively combined several classifier models with it. However, that approach had the drawback that training samples must be learned one by one, owing to a limitation of IPCA. To overcome this problem, we propose a further extension of IPCA, called chunk IPCA, in which a chunk of training samples is processed at a time. In the experiments, we evaluate the classification performance on several large-scale data sets to examine the scalability of chunk IPCA in one-pass incremental learning environments. The results suggest that chunk IPCA reduces the training time effectively compared with IPCA unless the number of input attributes is too large. We study the influence of the size of the initial training data and of the given data chunks on classification accuracy and learning time, and we show that chunk IPCA obtains the major eigenvectors with fairly good approximation.
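The idea of processing a chunk of samples per update, rather than one sample at a time, can be sketched with a standard chunk-wise merge of running mean and scatter statistics followed by an eigendecomposition. This is a hedged illustration only: the paper's chunk IPCA updates the reduced eigenspace directly without storing a full covariance matrix, whereas the sketch below (function name `chunk_update` and the chunk size are our own choices) accumulates full d-by-d statistics for clarity.

```python
import numpy as np

def chunk_update(n, mean, scatter, X_chunk):
    """Merge a chunk of samples into running mean/scatter statistics.

    Uses the standard pairwise (Chan et al.) covariance update. This is a
    simple stand-in for chunk-wise eigenspace maintenance, not the paper's
    chunk IPCA algorithm itself.
    """
    m = X_chunk.shape[0]
    mean_c = X_chunk.mean(axis=0)
    centered = X_chunk - mean_c
    scatter_c = centered.T @ centered          # within-chunk scatter
    delta = mean_c - mean
    total = n + m
    new_mean = mean + delta * (m / total)
    # between-group correction term links the old and new means
    new_scatter = scatter + scatter_c + np.outer(delta, delta) * (n * m / total)
    return total, new_mean, new_scatter

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                  # simulated training stream

# one pass over the stream, 50 samples per chunk
n, mean, scatter = 0, np.zeros(8), np.zeros((8, 8))
for start in range(0, len(X), 50):
    n, mean, scatter = chunk_update(n, mean, scatter, X[start:start + 50])

# eigenvectors of the accumulated covariance match batch PCA directions
eigvals, eigvecs = np.linalg.eigh(scatter / (n - 1))
```

Because the merge is exact, the one-pass chunk-wise statistics reproduce the batch mean and covariance, so the resulting eigenvectors agree with batch PCA; the trade-off in a genuine chunk IPCA is doing this in the reduced eigenspace to keep memory and time low.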