We present a general framework for discriminative estimation based on the maximum entropy principle and its extensions. All calculations involve distributions over structures and/or parameters rather than specific settings, and reduce to relative entropy projections. This holds even when the data are not separable within the chosen parametric class, in the context of anomaly detection rather than classification, or when the labels in the training set are uncertain or incomplete. Support vector machines are naturally subsumed under this class, and we provide several extensions. Within this framework we can also exactly and efficiently estimate discriminative distributions over tree structures of class-conditional models. Preliminary experimental results indicate the potential of these techniques.
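The core computation the abstract describes — finding the distribution over parameters closest (in relative entropy) to a prior, subject to expected classification-margin constraints — can be illustrated with a toy sketch. Everything here is an illustrative assumption, not the paper's actual setup: a discrete family of unit-norm linear discriminants indexed by an angle grid, a uniform prior, and plain projected gradient ascent on the dual variables.

```python
import numpy as np

# Toy maximum-entropy discrimination sketch (illustrative assumptions only):
# find P(theta) minimizing KL(P || P0) subject to the expected margin
# constraints  E_P[y_t f(x_t; theta)] >= gamma  for every training point t.
# The solution has the form P(theta) ∝ P0(theta) exp(sum_t lam_t y_t f(x_t; theta)),
# with lam_t >= 0 found by ascending the concave dual.

# Linearly separable toy data with labels +-1.
X = np.array([[2.0, 1.0], [1.0, 2.0], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

# Parameter family: unit-norm linear discriminants on an angle grid.
angles = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
W = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # (K, 2)
F = X @ W.T                                              # f(x_t; theta_k), shape (T, K)
log_prior = np.full(len(angles), -np.log(len(angles)))   # uniform prior P0

gamma = 0.5               # required expected margin
lam = np.zeros(len(y))    # one Lagrange multiplier per margin constraint

for _ in range(20000):
    # Exponential-family form of the projected distribution.
    logits = log_prior + (lam * y) @ F
    logits -= logits.max()                   # numerical stability
    p = np.exp(logits)
    p /= p.sum()
    margins = (y[:, None] * F) @ p           # E_P[y_t f(x_t; theta)] per point
    # Projected gradient ascent on the dual; multipliers stay nonnegative.
    lam = np.maximum(0.0, lam + 0.02 * (gamma - margins))

print(margins.min())   # expected margins approach gamma from below
```

At convergence the posterior concentrates mass on discriminants near the symmetry axis of the data (here, 45 degrees) just enough to satisfy the margin constraints, rather than collapsing to a single parameter setting — the distributional analogue of the SVM solution the abstract alludes to.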