Ensemble methods (EMs) have become increasingly popular in data mining because of their effectiveness. These methods generate a set of classifiers using one or several machine learning algorithms (MLAs) and aggregate them into a single classifier (meta-classifier, MC). Among MLAs, k-nearest neighbors (kNN) is one of the best known used in the context of EMs. However, choosing the parameter k can be difficult, a drawback shared by all instance-based MLAs. Here, we propose an approach based on neighborhood graphs as an alternative. Thanks to related graphs such as relative neighborhood graphs (RNGs) and Gabriel graphs (GGs), we provide a more general approach with fewer arbitrary parameters. Neighborhood graphs have never been introduced into EM approaches before. The results of our algorithm, Neighborhood Random Classification, are very promising, as they match those of the best EM approaches such as Random Forest or those based on SVMs. In this exploratory and experimental work, we present the methodological approach and extensive comparative results.
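To make the idea concrete, here is a minimal Python sketch, not the authors' implementation, of a Gabriel-graph classifier wrapped in an ensemble: a query point's Gabriel neighbors vote on its label, so there is no k to tune. The function names, the Euclidean metric, integer class labels, the bootstrap resampling scheme, and the member count of 25 are all illustrative assumptions rather than details taken from the paper.

import numpy as np

def gabriel_neighbors(x, X):
    # Indices i such that X[i] is a Gabriel neighbor of the query x:
    # no third point r falls inside the ball whose diameter is (x, X[i]),
    # i.e. d(x,r)^2 + d(X[i],r)^2 >= d(x,X[i])^2 for every r != i.
    d_x = np.sum((X - x) ** 2, axis=1)          # squared distances to x
    neighbors = []
    for i in range(len(X)):
        d_i = np.sum((X - X[i]) ** 2, axis=1)   # squared distances to X[i]
        mask = np.arange(len(X)) != i
        if np.all(d_x[mask] + d_i[mask] >= d_x[i]):
            neighbors.append(i)
    return neighbors

def predict_one(x, X, y):
    # Majority vote among the Gabriel neighbors of x -- no k to choose.
    idx = gabriel_neighbors(x, X)
    return np.argmax(np.bincount(y[idx]))

def ensemble_predict(x, X, y, n_members=25, seed=0):
    # Aggregate the votes of members built on bootstrap samples, in the
    # spirit of Neighborhood Random Classification (the sampling scheme
    # and member count here are assumptions, not the paper's settings).
    rng = np.random.default_rng(seed)
    votes = []
    for _ in range(n_members):
        idx = rng.choice(len(X), size=len(X))   # bootstrap resample
        votes.append(predict_one(x, X[idx], y[idx]))
    return np.argmax(np.bincount(votes))

# Toy usage: two well-separated clusters; the query lies in the class-0 one.
X = np.array([[0., 0.], [1., 0.], [0., 1.], [5., 5.], [6., 5.], [5., 6.]])
y = np.array([0, 0, 0, 1, 1, 1])
print(ensemble_predict(np.array([0.4, 0.4]), X, y))  # expected: 0

Because the nearest training point is always a Gabriel neighbor, the vote is never empty; the neighborhood size instead adapts to the local geometry of the data, which is exactly what removes the arbitrary k of kNN-based ensemble members.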