Machine learning, neural and statistical classification
We construct a hybrid (composite) classifier by combining two classifiers in common use: classification trees and k-nearest-neighbor (k-NN). In our scheme, a classification tree first partitions the feature space; a test item is then classified by the k-NN rule applied only to the training items that fall in the same leaf as the test item. This somewhat reduces the computational load of k-NN, and the resulting classification rule outperforms both plain trees and the usual k-NN on a number of well-known data sets.
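The scheme above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn for the tree, Euclidean distance for the k-NN step, and majority vote within each leaf; the function names (`fit_hybrid`, `predict_hybrid`) and the parameter choices (`min_leaf`, `k`) are illustrative.

```python
# Sketch of the hybrid tree + k-NN classifier: a decision tree partitions
# the feature space, then each test item is classified by k-NN restricted
# to the training items in the same leaf. (Assumes scikit-learn; names and
# parameters are hypothetical.)
import numpy as np
from collections import Counter, defaultdict
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def fit_hybrid(X_train, y_train, min_leaf=20):
    # Grow a tree whose leaves keep enough training items for k-NN voting.
    tree = DecisionTreeClassifier(min_samples_leaf=min_leaf, random_state=0)
    tree.fit(X_train, y_train)
    # Index the training items by the leaf each one falls into.
    by_leaf = defaultdict(list)
    for i, leaf in enumerate(tree.apply(X_train)):
        by_leaf[leaf].append(i)
    return tree, by_leaf

def predict_hybrid(tree, by_leaf, X_train, y_train, X_test, k=5):
    preds = []
    for x, leaf in zip(X_test, tree.apply(X_test)):
        idx = np.array(by_leaf[leaf])
        # k-NN only among training items sharing the test item's leaf.
        dists = np.linalg.norm(X_train[idx] - x, axis=1)
        nearest = idx[np.argsort(dists)[:k]]
        preds.append(Counter(y_train[nearest]).most_common(1)[0][0])
    return np.array(preds)

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree, by_leaf = fit_hybrid(X_tr, y_tr)
y_hat = predict_hybrid(tree, by_leaf, X_tr, y_tr, X_te)
accuracy = (y_hat == y_te).mean()
```

Because distances are computed only against the items in one leaf rather than the whole training set, the per-query cost drops roughly in proportion to the leaf size, which is the computational saving the abstract refers to.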