A modular reduction method for k-NN algorithm with self-recombination learning
ISNN'06 Proceedings of the Third International Conference on Advances in Neural Networks - Volume Part I
This paper presents a Min-Max modular k-nearest neighbor (M3-k-NN) classification method for massively parallel text categorization. The basic idea behind the method is to decompose a large-scale text categorization problem into a number of smaller two-class subproblems, train a modular k-NN classifier on each subproblem, and combine all of the individual classifiers into a single M3-k-NN classifier. Our text categorization experiments demonstrate that M3-k-NN is much faster than conventional k-NN, while its classification accuracy is slightly better. In practice, M3-k-NN is closely related to the high-order k-NN algorithm; this relationship lends theoretical support, to some extent, to the reliability of M3-k-NN.
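The min-max combination described above can be illustrated with a minimal sketch, not the authors' implementation: the positive and negative training sets of a two-class subproblem are split into chunks, one k-NN module is trained on each (positive-chunk, negative-chunk) pair, a MIN unit combines the modules sharing a positive chunk, and a MAX unit combines the MIN units. The chunking scheme, the choice of k, and the 0.5 voting threshold here are all illustrative assumptions.

```python
import numpy as np

def knn_predict(train_X, train_y, x, k=3):
    """Plain k-NN vote: fraction of the k nearest training
    points that carry the positive label (+1)."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    return np.mean(nearest == 1)

def m3_knn_predict(pos_chunks, neg_chunks, x, k=3):
    """Min-Max modular combination (illustrative sketch):
    one k-NN module per (positive-chunk, negative-chunk) pair,
    MIN over modules that share a positive chunk, then MAX
    over the resulting MIN units."""
    unit_outputs = []
    for P in pos_chunks:
        module_outputs = []
        for N in neg_chunks:
            # Each module sees only its small two-class subproblem.
            X = np.vstack([P, N])
            y = np.concatenate([np.ones(len(P)), -np.ones(len(N))])
            module_outputs.append(knn_predict(X, y, x, k))
        unit_outputs.append(min(module_outputs))     # MIN unit
    # MAX unit; 0.5 is an assumed majority-vote threshold.
    return 1 if max(unit_outputs) >= 0.5 else -1

# Toy two-class problem: positives near (5,5), negatives near (0,0),
# each class split into two chunks of two samples.
pos = [np.array([[5., 5.], [5., 6.]]), np.array([[6., 5.], [6., 6.]])]
neg = [np.array([[0., 0.], [0., 1.]]), np.array([[1., 0.], [1., 1.]])]
print(m3_knn_predict(pos, neg, np.array([5.5, 5.5])))  # -> 1
print(m3_knn_predict(pos, neg, np.array([0.5, 0.5])))  # -> -1
```

Because each module only ever touches one small chunk pair, the modules are independent and can be trained and queried in parallel, which is where the speedup over a single conventional k-NN on the full training set comes from.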