Improved Boosting Algorithms Using Confidence-rated Predictions
Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Classification with Nonmetric Distances: Image Retrieval and Class Representation
IEEE Transactions on Pattern Analysis and Machine Intelligence
Locally Adaptive Metric Nearest-Neighbor Classification
IEEE Transactions on Pattern Analysis and Machine Intelligence
Robust Real-Time Face Detection
International Journal of Computer Vision
Computer Manual in MATLAB to Accompany Pattern Classification, Second Edition
AAAI'96 Proceedings of the Thirteenth National Conference on Artificial Intelligence - Volume 1
Learning distance functions for image retrieval
CVPR'04 Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Image retrieval: Ideas, influences, and trends of the new age
ACM Computing Surveys (CSUR)
Improving Performance of a Binary Classifier by Training Set Selection
ICANN '08 Proceedings of the 18th International Conference on Artificial Neural Networks, Part I
Feature selection based-on genetic algorithm for image annotation
Knowledge-Based Systems
Using Boosting to prune Double-Bagging ensembles
Computational Statistics & Data Analysis
Boosting k-nearest neighbor classifier by means of input space projection
Expert Systems with Applications: An International Journal
Probability-Based Distance Function for Distance-Based Classifiers
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
Pattern Recognition Letters
Boosting k-NN for Categorization of Natural Scenes
International Journal of Computer Vision
In this work we introduce a new distance estimation technique based on boosting and apply it to the K-Nearest Neighbor classifier (K-NN). Instead of applying AdaBoost to a typical classification problem, we use it to learn a distance function, and the resulting distance is then used in K-NN. The proposed method (Boosted Distance with Nearest Neighbor) outperforms the AdaBoost classifier when the training set is small. It also outperforms the K-NN classifier used with several different distances, as well as distances obtained with other estimation methods such as Relevant Component Analysis (RCA) [Duda, R.O., Hart, P.E., Stork, D.G., 2001. Pattern Classification, John Wiley and Sons Inc.]. Furthermore, our distance estimation performs dimensionality reduction, yielding substantially higher classification accuracy than classical techniques such as PCA, LDA, and NDA. The method has been thoroughly tested on 13 standard databases from the UCI repository, a standard gender recognition database, and the MNIST database.
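The general idea described in the abstract can be sketched in code. The following is a minimal illustrative reconstruction, not the authors' implementation: training points are paired up, each pair is represented by its per-dimension absolute differences and labeled "same class" (+1) or "different class" (-1), AdaBoost with decision stumps is trained on these pairs, and the negated ensemble margin is used as the learned distance inside K-NN. All function names, parameters (e.g. the number of rounds `T`), and the toy data are assumptions for illustration.

```python
import math

def stump_predict(s, z):
    # s = (dim, thresh, sign): vote +1 ("same class") when
    # sign * (z[dim] - thresh) < 0, else -1 ("different class")
    return 1 if s[2] * (z[s[0]] - s[1]) < 0 else -1

def best_stump(Z, y, w):
    # exhaustive search for the stump with lowest weighted error
    best, best_err = None, float("inf")
    for d in range(len(Z[0])):
        for t in sorted({z[d] for z in Z}):
            for sign in (1, -1):
                s = (d, t, sign)
                err = sum(wi for zi, yi, wi in zip(Z, y, w)
                          if stump_predict(s, zi) != yi)
                if err < best_err:
                    best, best_err = s, err
    return best, best_err

def boost_distance(X, labels, T=5):
    # build the pair problem: abs-difference features, same/different labels
    Z, y = [], []
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            Z.append([abs(a - b) for a, b in zip(X[i], X[j])])
            y.append(1 if labels[i] == labels[j] else -1)
    w = [1.0 / len(Z)] * len(Z)
    ensemble = []
    for _ in range(T):  # standard AdaBoost rounds on the pair problem
        s, err = best_stump(Z, y, w)
        err = max(min(err, 1 - 1e-10), 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, s))
        w = [wi * math.exp(-alpha * yi * stump_predict(s, zi))
             for wi, zi, yi in zip(w, Z, y)]
        tot = sum(w)
        w = [wi / tot for wi in w]

    def distance(a, b):
        z = [abs(p - q) for p, q in zip(a, b)]
        # negated boosted margin: large when the ensemble votes "different"
        return -sum(alpha * stump_predict(s, z) for alpha, s in ensemble)
    return distance

def knn_predict(X, labels, dist, query, k=3):
    # plain K-NN, but ranking neighbors by the learned distance
    order = sorted(range(len(X)), key=lambda i: dist(X[i], query))[:k]
    votes = {}
    for i in order:
        votes[labels[i]] = votes.get(labels[i], 0) + 1
    return max(votes, key=votes.get)
```

On a toy two-cluster problem, `boost_distance` learns stumps that separate small within-class differences from large between-class ones, and `knn_predict` then classifies queries with that distance; the real method is evaluated on UCI, gender recognition, and MNIST data rather than toy points.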