Bagging, boosting, and the random subspace method are three popular ensemble learning methods that have proven effective in many practical classification problems. However, for the electroencephalogram (EEG) signal classification tasks arising in recent brain-computer interface (BCI) research, there are almost no reports investigating their feasibility. This paper systematically evaluates the performance of these three ensemble methods in their new application to EEG signal classification. Experiments are conducted on data from three BCI subjects, with k-nearest neighbor and decision tree classifiers as base learners. Several valuable conclusions are drawn about the feasibility and performance of ensemble methods for classifying EEG signals.
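To make the experimental setup concrete, the three ensemble schemes with the two base learners named above can be sketched as follows. This is a minimal illustration using scikit-learn, not the authors' implementation: the synthetic data generated by `make_classification` is a hypothetical stand-in for extracted EEG feature vectors, and all hyperparameters (ensemble size, subspace fraction) are assumptions for demonstration only.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic two-class data standing in for EEG feature vectors
# (hypothetical; the paper uses features from three BCI subjects).
X, y = make_classification(n_samples=400, n_features=30,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

ensembles = {
    # Bagging: each tree is trained on a bootstrap resample of the data.
    "bagging (tree)": BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=50, random_state=0),
    # Random subspace: each k-NN member sees a random subset of features.
    "subspace (kNN)": BaggingClassifier(
        KNeighborsClassifier(), n_estimators=50,
        bootstrap=False, max_features=0.5, random_state=0),
    # Boosting: reweights misclassified examples between rounds;
    # shown with tree stumps, since AdaBoost needs a base learner
    # that supports sample weights (plain k-NN does not).
    "AdaBoost (stump)": AdaBoostClassifier(
        DecisionTreeClassifier(max_depth=1),
        n_estimators=50, random_state=0),
}

scores = {name: clf.fit(X_tr, y_tr).score(X_te, y_te)
          for name, clf in ensembles.items()}
for name, acc in scores.items():
    print(f"{name}: accuracy {acc:.3f}")
```

In practice the accuracies would be compared per subject against each base classifier trained alone, which is how the feasibility of each ensemble scheme for EEG data is judged.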