A new family of boosting algorithms, denoted TaylorBoost, is proposed. It supports any combination of loss function and first- or second-order optimization, and includes classical algorithms such as AdaBoost, GradientBoost, and LogitBoost as special cases. Its restriction to the set of canonical losses makes it possible to design boosting algorithms with explicit margin control. A new large family of losses with this property, based on the set of cumulative distribution functions of zero-mean random variables, is then proposed. A novel loss function in this family, the Laplace loss, is finally derived. The combination of this loss and second-order TaylorBoost produces a boosting algorithm with explicit margin control.
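The core idea of combining an arbitrary margin loss with first- or second-order optimization can be illustrated with a minimal sketch. This is not the authors' reference implementation: the function names (`taylor_boost`, `fit_stump`) and the choice of logistic loss and regression stumps are assumptions made purely for illustration. Each round Taylor-expands the loss around the current predictions; a first-order step fits a stump to the negative gradient (GradientBoost-style), while a second-order step fits the Newton residual `-g/h` with Hessian weights (LogitBoost-style).

```python
import numpy as np

def logistic_grad_hess(y, f):
    """First and second derivatives of log(1 + exp(-y f)) w.r.t. f, for y in {-1, +1}."""
    p = 1.0 / (1.0 + np.exp(-y * f))   # sigmoid(y f)
    g = -y * (1.0 - p)                 # dL/df
    h = p * (1.0 - p)                  # d^2L/df^2 (uses y^2 = 1)
    return g, h

def fit_stump(x, target, weights):
    """Weighted least-squares regression stump on 1-D features (illustrative weak learner)."""
    best = None
    for t in np.unique(x):
        left, right = x <= t, x > t
        pred = np.zeros_like(target)
        for mask in (left, right):
            if mask.any():
                pred[mask] = np.average(target[mask], weights=weights[mask])
        err = np.sum(weights * (target - pred) ** 2)
        if best is None or err < best[0]:
            cl = np.average(target[left], weights=weights[left]) if left.any() else 0.0
            cr = np.average(target[right], weights=weights[right]) if right.any() else 0.0
            best = (err, t, cl, cr)
    _, t, cl, cr = best
    return lambda z: np.where(z <= t, cl, cr)

def taylor_boost(x, y, rounds=20, lr=0.5, order=2):
    """Generic boosting loop: plug in any smooth loss via its first two derivatives."""
    f = np.zeros_like(x, dtype=float)
    stumps = []
    for _ in range(rounds):
        g, h = logistic_grad_hess(y, f)
        if order == 2:   # Newton-style step: fit -g/h, weighted by the Hessian
            target, w = -g / np.maximum(h, 1e-12), h
        else:            # gradient step: fit -g with uniform weights
            target, w = -g, np.ones_like(g)
        s = fit_stump(x, target, w)
        stumps.append(s)
        f = f + lr * s(x)
    return lambda z: np.sign(sum(lr * s(z) for s in stumps))
```

Swapping `logistic_grad_hess` for the derivatives of another loss (e.g. the exponential loss, recovering AdaBoost-like behavior) changes only that one function, which is the modularity the abstract attributes to the TaylorBoost family.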