In multiple-instance learning (MIL), an object is represented as a bag of feature vectors called instances. In the training set, bag labels are given, while the uncertainty lies in the unknown labels of the instances inside each bag. In this paper, we study MIL under the assumption that instances are drawn from a mixture of a concept and a non-concept distribution, which reduces MIL to a classifier combining problem. We show that instances can be classified with any standard supervised classifier by re-weighting the classification posteriors. Given the instance labels, the bag label is then obtained by combining the instance-level decisions: we derive an optimal decision rule that thresholds the fraction of instances in a bag assigned to the concept class. We provide estimators for the two parameters of the model. The method is tested on a toy data set and several benchmark data sets, and yields results comparable to state-of-the-art MIL methods.