The k-nearest neighbor algorithm (kNN) is one of the best-known techniques in standard supervised learning. It can be adapted to the multiple-instance learning (MIL) setting by using set-based distance metrics, as in Citation-kNN and Bayesian-kNN. However, kNN suffers from several drawbacks, including high storage requirements, slow classification response, and low noise tolerance. These drawbacks become particularly significant in MIL, where every example is a bag (set) of instances. One of the most promising remedies is prototype selection, and many prototype selection methods have been proposed for standard supervised learning. In this paper, we propose an efficient salience-based prototype selection method, MISPS, to tackle the above problems in MIL. We then present two variants of Citation-kNN and Bayesian-kNN based on MISPS, called MISPS-CkNN and MISPS-BkNN. Experimental results on five benchmark datasets show that MISPS is effective and that our MISPS-based algorithms are competitive with the state of the art.
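As a rough illustration of the kind of set-based distance such adaptations rely on (not the MISPS method itself), the Python sketch below computes the minimal Hausdorff distance between two bags, the bag-level metric used by Citation-kNN, and plugs it into a plain kNN vote over bags. The function names `min_hausdorff` and `knn_predict`, the default `k`, and the simple majority vote are assumptions for illustration; Citation-kNN additionally counts "citer" bags, which this sketch omits.

```python
import numpy as np

def min_hausdorff(bag_a, bag_b):
    """Minimal Hausdorff distance between two bags (arrays of instance vectors):
    the smallest Euclidean distance between any instance of bag_a and any of bag_b."""
    diffs = bag_a[:, None, :] - bag_b[None, :, :]        # pairwise differences
    return np.sqrt((diffs ** 2).sum(axis=-1)).min()       # smallest pairwise distance

def knn_predict(train_bags, train_labels, query_bag, k=3):
    """Plain k-nearest-neighbor vote over bags using the bag-level distance.
    (Illustrative sketch only; Citation-kNN also weighs 'citer' bags.)"""
    dists = np.array([min_hausdorff(query_bag, b) for b in train_bags])
    nearest = np.argsort(dists)[:k]                       # indices of k closest bags
    votes = np.asarray(train_labels)[nearest]
    return int(round(votes.mean()))                       # majority vote for 0/1 labels

# Toy usage: two positive and two negative bags with random instances.
rng = np.random.default_rng(0)
bags = [rng.normal(loc=m, size=(5, 2)) for m in (0, 0, 3, 3)]
labels = [0, 0, 1, 1]
print(knn_predict(bags, labels, rng.normal(loc=3, size=(4, 2))))
```

Prototype selection would act on top of such a classifier by keeping only a salient subset of the training bags, which reduces both the storage cost and the number of bag-distance evaluations per query.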