Vector quantization and signal compression
Machine Learning
Self-organizing maps
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC'94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
`Neural-gas' network for vector quantization and its application to time-series prediction
IEEE Transactions on Neural Networks
In this paper, we propose VQ methods based on the ensemble learning algorithms Bagging and AdaBoost. Each proposed method consists of multiple weak learners, trained either in parallel or sequentially. In Bagging, the weak learners are trained in parallel on data randomly selected from a given data set, and the output is the average over the weak learners. In AdaBoost, the weak learners are trained sequentially. The first weak learner is trained on data randomly selected from a given data set; for the second and subsequent weak learners, the probability distribution over the learning data is modified so that each weak learner focuses on the data for which the previous weak learner had higher error. The output of AdaBoost is a weighted average over the weak learners. The presented simulation results show that the proposed methods achieve good performance with shorter learning times than conventional methods such as K-means and neural gas (NG).
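As a rough illustration of the Bagging variant described above, the following sketch trains several K-means weak learners in parallel on bootstrap samples and averages their nearest codewords at query time. This is an assumption-laden reconstruction, not the authors' implementation; the function names and the plain Lloyd-style K-means weak learner are illustrative choices.

```python
import numpy as np

def kmeans(data, k, iters=20, rng=None):
    """Simple K-means weak learner: returns a codebook of k codewords."""
    if rng is None:
        rng = np.random.default_rng(0)
    # Initialize codewords from randomly chosen data vectors.
    codebook = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        # Assign each vector to its nearest codeword, then update codewords.
        dists = np.linalg.norm(data[:, None] - codebook[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = data[labels == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook

def bagging_vq(data, k, n_learners=5, seed=0):
    """Bagging: train weak learners in parallel on bootstrap samples."""
    rng = np.random.default_rng(seed)
    codebooks = []
    for _ in range(n_learners):
        sample = data[rng.choice(len(data), len(data), replace=True)]
        codebooks.append(kmeans(sample, k, rng=rng))
    return codebooks

def quantize(x, codebooks):
    """Ensemble output: average of each weak learner's nearest codeword."""
    nearest = [cb[np.linalg.norm(cb - x, axis=1).argmin()] for cb in codebooks]
    return np.mean(nearest, axis=0)
```

The AdaBoost variant would differ mainly in the training loop: learners would be fitted one after another, with the sampling distribution reweighted toward high-error vectors, and `quantize` would take a weighted rather than uniform average.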