Communications of the ACM
STOC '87 Proceedings of the nineteenth annual ACM symposium on Theory of computing
A Necessary Condition for Learning from Positive Examples
Machine Learning
Inductive inference from positive data is powerful
COLT '90 Proceedings of the third annual workshop on Computational learning theory
Probably approximate learning of sets and functions
SIAM Journal on Computing
SIAM Journal on Computing
Efficient noise-tolerant learning from statistical queries
STOC '93 Proceedings of the twenty-fifth annual ACM symposium on Theory of computing
An introduction to computational learning theory
Learning distributions by their density levels: a paradigm for learning without a teacher
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC'94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
Machine Learning
Machine Learning
ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
A Guided Tour Across the Boundaries of Learning Recursive Languages
Algorithmic Learning for Knowledge-Based Systems, GOSLER Final Report
PAC Learning from Positive Statistical Queries
ALT '98 Proceedings of the 9th International Conference on Algorithmic Learning Theory
Positive and Unlabeled Examples Help Learning
ALT '99 Proceedings of the 10th International Conference on Algorithmic Learning Theory
Learning from Positive and Unlabeled Examples
ALT '00 Proceedings of the 11th International Conference on Algorithmic Learning Theory
Building Text Classifiers Using Positive and Unlabeled Examples
ICDM '03 Proceedings of the Third IEEE International Conference on Data Mining
Learning to Classify Documents with Only a Small Positive Training Set
ECML '07 Proceedings of the 18th European conference on Machine Learning
Cool Blog Classification from Positive and Unlabeled Examples
PAKDD '09 Proceedings of the 13th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining
Negative training data can be harmful to text classification
EMNLP '10 Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing
Semi-supervised learning from only positive and unlabeled data using entropy
WAIM'10 Proceedings of the 11th international conference on Web-age information management
Rough set and ensemble learning based semi-supervised algorithm for text classification
Expert Systems with Applications: An International Journal
Semi-Supervised Novelty Detection
The Journal of Machine Learning Research
A survey of grammatical inference methods for natural language learning
Artificial Intelligence Review
TAMC'11 Proceedings of the 8th annual conference on Theory and applications of models of computation
Extracting initial and reliable negative documents to enhance classification performance
KDLL'06 Proceedings of the 2006 international conference on Knowledge Discovery in Life Science Literature
Comparison of documents classification techniques to classify medical reports
PAKDD'06 Proceedings of the 10th Pacific-Asia conference on Advances in Knowledge Discovery and Data Mining
Learning to filter junk e-mail from positive and unlabeled examples
IJCNLP'04 Proceedings of the First international joint conference on Natural Language Processing
Identifying Web Spam with the Wisdom of the Crowds
ACM Transactions on the Web (TWEB)
Learning from positive and unlabeled examples with different data distributions
ECML'05 Proceedings of the 16th European conference on Machine Learning
A new PU learning algorithm for text classification
MICAI'05 Proceedings of the 4th Mexican international conference on Advances in Artificial Intelligence
Estimate unlabeled-data-distribution for semi-supervised PU learning
APWeb'12 Proceedings of the 14th Asia-Pacific international conference on Web Technologies and Applications
Positive unlabeled learning for time series classification
IJCAI'11 Proceedings of the Twenty-Second international joint conference on Artificial Intelligence - Volume Two
Named entity disambiguation in streaming data
ACL '12 Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics: Long Papers - Volume 1
Learning to predict from textual data
Journal of Artificial Intelligence Research
Learning from data streams with only positive and unlabeled data
Journal of Intelligent Information Systems
Theoretical Computer Science
Learning from positive examples occurs very frequently in natural learning. Valiant's PAC learning model takes many features of natural learning into account, but in most cases it fails to describe this kind of learning. We show that, in order to make learning from positive data possible, extra information about the underlying distribution must be provided to the learner. We define a PAC learning model from positive and unlabeled examples, as well as a PAC learning model from positive and unlabeled statistical queries. We study the relations with the PAC model ([Val84]), the statistical query model ([Kea93]), and the constant-partition classification noise model ([Dec97]). We show that k-DNF formulas and k-decision lists are learnable in both models, i.e., with far less information than is assumed by previously used algorithms.
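The key idea behind learning from positive and unlabeled statistical queries can be sketched as follows: a statistical query Pr_D[chi(x, f(x))] decomposes into a part involving positive examples and a part involving the unlabeled distribution, provided the weight p = Pr_D[f(x) = 1] of the target concept is known or estimated. The snippet below is a minimal illustrative sketch, not the paper's exact algorithm; the names (`estimate_sq`, `target`, the choice of concept, and the value of `p`) are assumptions made for the example.

```python
import random

random.seed(0)
N = 5  # number of boolean attributes

def target(x):
    # Hidden concept (illustrative): the 2-conjunction x1 AND x2.
    return x[0] == 1 and x[1] == 1

def draw():
    # Uniform distribution D over {0,1}^N.
    return tuple(random.randint(0, 1) for _ in range(N))

# Unlabeled sample drawn from D; positive sample drawn from D conditioned
# on the target concept (simulated here by rejection sampling).
unlabeled = [draw() for _ in range(20000)]
positive = [x for x in (draw() for _ in range(200000)) if target(x)][:20000]
p = 0.25  # Pr_D[f(x) = 1] for this concept under the uniform distribution

def estimate_sq(chi):
    """Estimate Pr_{x~D}[chi(x, f(x))] from positive + unlabeled data only:
       Pr[chi(x,1) & f(x)=1] + Pr[chi(x,0) & f(x)=0]
         = p * Pr_pos[chi(x,1)] + (Pr_D[chi(x,0)] - p * Pr_pos[chi(x,0)])."""
    pos1 = sum(chi(x, 1) for x in positive) / len(positive)
    pos0 = sum(chi(x, 0) for x in positive) / len(positive)
    unl0 = sum(chi(x, 0) for x in unlabeled) / len(unlabeled)
    return p * pos1 + (unl0 - p * pos0)

# Example query: "x3 = 1 and the label is 1". Its true value is
# Pr[x1 = x2 = x3 = 1] = 1/8 under the uniform distribution.
est = estimate_sq(lambda x, y: x[2] == 1 and y == 1)
```

An SQ-based learner for k-DNF or k-decision lists only needs such estimates within a tolerance, which is why positive and unlabeled data (plus the weight p) can substitute for labeled examples in this setting.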