Algorithms for clustering data
A training algorithm for optimal margin classifiers
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
Normalized Cuts and Image Segmentation
IEEE Transactions on Pattern Analysis and Machine Intelligence
Neural Networks: A Comprehensive Foundation
The Image Foresting Transform: Theory, Algorithms, and Applications
IEEE Transactions on Pattern Analysis and Machine Intelligence
Pattern Classification (2nd Edition)
Combining Pattern Classifiers: Methods and Algorithms
Semi-supervised graph clustering: a kernel approach
ICML '05 Proceedings of the 22nd international conference on Machine learning
How boosting the margin can also boost classifier complexity
ICML '06 Proceedings of the 23rd international conference on Machine learning
Graph-Theoretical Methods for Detecting and Describing Gestalt Clusters
IEEE Transactions on Computers
Semisupervised Clustering with Metric Learning using Relative Comparisons
IEEE Transactions on Knowledge and Data Engineering
A New Variant of the Optimum-Path Forest Classifier
ISVC '08 Proceedings of the 4th International Symposium on Advances in Visual Computing
A discrete approach for supervised pattern recognition
IWCIA'08 Proceedings of the 12th international conference on Combinatorial image analysis
Which is the best multiclass SVM method? an empirical study
MCS'05 Proceedings of the 6th international conference on Multiple Classifier Systems
IWCIA'11 Proceedings of the 14th international conference on Combinatorial image analysis
Graph-based approaches for pattern recognition are commonly designed for unsupervised and semi-supervised problems. Recently, a novel collection of supervised pattern recognition techniques based on optimum-path forest (OPF) computation in a feature space induced by graphs was presented: the OPF-based classifiers. They have some advantages over widely used supervised classifiers: they make no assumptions about the shape or separability of the classes, and their training phase runs faster. Currently, there exist two versions of OPF-based classifiers: OPF cpl (the first one) and OPF knn . Here, we introduce a learning algorithm for the latter and show that a classifier can learn from its own errors without increasing its training set.
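The learning-from-errors idea described above can be illustrated generically: misclassified samples from an evaluation set are swapped with same-class samples in the training set, so the classifier improves while the training-set size stays fixed. The sketch below is a minimal, hypothetical illustration of that loop using a plain k-NN classifier as a stand-in; it is not the OPF knn algorithm itself, and the function names, the random swap policy, and all parameters are assumptions for illustration only.

```python
import numpy as np

def knn_predict(train_X, train_y, X, k=3):
    """Classify each row of X by majority vote among its k nearest
    training samples (Euclidean distance). A simple stand-in classifier."""
    preds = []
    for x in X:
        dists = np.linalg.norm(train_X - x, axis=1)
        nearest = train_y[np.argsort(dists)[:k]]
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)

def learn_from_errors(train_X, train_y, eval_X, eval_y, k=3, iters=5, seed=0):
    """Error-driven learning sketch: on each iteration, every evaluation
    sample that the classifier gets wrong is swapped with a randomly chosen
    training sample of the same class. The training set therefore never
    grows; it only becomes more representative. (Swap policy is a
    hypothetical simplification, not the published OPF learning procedure.)"""
    rng = np.random.default_rng(seed)
    train_X, train_y = train_X.copy(), train_y.copy()
    eval_X, eval_y = eval_X.copy(), eval_y.copy()
    for _ in range(iters):
        preds = knn_predict(train_X, train_y, eval_X, k)
        wrong = np.flatnonzero(preds != eval_y)
        if wrong.size == 0:
            break  # no more errors to learn from
        for i in wrong:
            same = np.flatnonzero(train_y == eval_y[i])
            if same.size == 0:
                continue
            j = rng.choice(same)
            # swap: the misclassified sample enters the training set,
            # the displaced training sample moves to the evaluation set
            train_X[j], eval_X[i] = eval_X[i].copy(), train_X[j].copy()
    return train_X, train_y
```

Because swaps only ever exchange samples of the same class, the label vector `train_y` and the training-set size are invariant across iterations, which mirrors the abstract's claim of learning without enlarging the training set.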