Spectral coefficients and classifier correlation
MCS'03 Proceedings of the 4th international conference on Multiple classifier systems
Various methods have been proposed for reducing correlation between classifiers in a multiple classifier framework. Here we propose a recursive partitioning technique for analysing the feature space of multiple classifier decisions. Spectral summation of individual pattern components in the intermediate feature space enables each training pattern to be rated according to its contribution to separability, measured as k-monotonic constraints. A constructive algorithm sequentially extracts maximally separable subsets of patterns, from which an inconsistently classified set (ICS) is derived. Leaving out random subsets of ICS patterns from the training sets of the individual (base) classifiers is shown to improve the performance of the combined classifiers. In the experiments reported here, on both artificial and real data, the constituent classifiers are identical single-hidden-layer MLPs with fixed parameters.
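The final step described in the abstract, building base classifier training sets that each omit a random subset of the ICS patterns, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `leave_out_ics_subsets`, the `leave_out_frac` parameter, and the uniform sampling of the left-out subset are all assumptions; the abstract states only that random subsets of ICS patterns are left out of each base classifier's training set.

```python
import random

def leave_out_ics_subsets(n_patterns, ics, n_classifiers,
                          leave_out_frac=0.5, seed=0):
    """Build one training-index list per base classifier, each omitting a
    random subset of the inconsistently classified set (ICS).

    Assumptions (not specified in the abstract): the fraction of ICS
    patterns removed per classifier (`leave_out_frac`) and the uniform
    random choice of which ICS patterns to remove.
    """
    rng = random.Random(seed)
    ics = set(ics)
    n_remove = int(leave_out_frac * len(ics))
    training_sets = []
    for _ in range(n_classifiers):
        # Each base classifier leaves out its own random ICS subset;
        # all non-ICS patterns are always retained.
        removed = set(rng.sample(sorted(ics), n_remove))
        training_sets.append(
            [i for i in range(n_patterns) if i not in removed])
    return training_sets

# Hypothetical example: 20 training patterns, ICS = {3, 7, 11, 15},
# an ensemble of 5 identical base MLPs.
sets_ = leave_out_ics_subsets(20, {3, 7, 11, 15}, 5)
```

Each base MLP would then be trained on its own index list, so the ensemble members differ only in which ICS patterns they never see, which is the source of reduced correlation between them.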