In this work, we address several tasks of structured prediction and propose a new method for handling them. Structured prediction is becoming increasingly important as data mining deals with ever more complex data (images, videos, sound, graphs, text, ...). Our method, k-NN for structured prediction (kNN-SP), is an extension of the well-known k-nearest neighbours method and can handle three different structured prediction problems: multi-target prediction, hierarchical multi-label classification, and prediction of short time-series. We evaluate the performance of kNN-SP on several datasets for each task and compare it to the performance of other structured prediction methods (predictive clustering trees and rules). We show that, despite its simplicity, the kNN-SP method performs satisfactorily on all tested problems.
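The core idea described above can be illustrated with a minimal sketch: for a query point, find the k nearest training examples and aggregate their structured targets. The sketch below assumes Euclidean distance over the input features and simple averaging of multi-target vectors; the function name `knn_sp_predict`, the toy data, and these choices are illustrative assumptions, not the paper's actual implementation (kNN-SP covers further tasks, such as hierarchical multi-label classification, with task-specific aggregation not shown here).

```python
import numpy as np

def knn_sp_predict(X_train, Y_train, x_query, k=3):
    """Illustrative sketch of k-NN for multi-target prediction:
    average the target vectors of the k nearest training examples.
    Assumes Euclidean distance; not the paper's actual implementation."""
    dists = np.linalg.norm(X_train - x_query, axis=1)  # distance to each training example
    nearest = np.argsort(dists)[:k]                    # indices of the k closest examples
    return Y_train[nearest].mean(axis=0)               # aggregate their target vectors

# Toy data: 2 input features, 2 target variables per example.
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
Y = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
print(knn_sp_predict(X, Y, np.array([0.05, 0.0]), k=2))  # → [1. 0.]
```

For classification-style targets (e.g. label hierarchies), the averaged vector can be thresholded per component instead of used directly as a regression output.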