We address the problem of estimating the difference between two probability densities. A naive approach is a two-step procedure: first estimate the two densities separately, then compute their difference. However, this procedure does not necessarily work well, because the first step is performed without regard to the second, so a small estimation error in the first step can cause a large error in the second. In this letter, we propose a single-shot procedure that directly estimates the density difference without separately estimating the two densities. We derive a nonparametric finite-sample error bound for the proposed single-shot density-difference estimator and show that it achieves the optimal convergence rate. We then show how the proposed density-difference estimator can be used for L2-distance approximation. Finally, we experimentally demonstrate the usefulness of the proposed method in robust distribution comparison tasks such as class-prior estimation and change-point detection.
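To make the idea concrete, below is a minimal sketch of a direct, least-squares density-difference estimator of the kind the abstract describes: the difference f(x) = p(x) - q(x) is modeled as a linear combination of Gaussian kernels and fitted by minimizing the squared L2 error, which admits a closed-form solution. The kernel centers, bandwidth `sigma`, and regularization `lam` are illustrative assumptions, not prescriptions from the paper (in practice they would be tuned, e.g. by cross-validation).

```python
import numpy as np

def density_difference(xp, xq, sigma=1.0, lam=1e-3):
    """Directly estimate f(x) = p(x) - q(x) by least squares.

    xp, xq : (n, d) and (m, d) samples from p and q.
    sigma, lam : assumed bandwidth and regularization hyperparameters.
    Returns kernel weights theta and an L2-distance estimate.
    """
    C = np.vstack([xp, xq])          # kernel centers: pooled samples
    d = C.shape[1]

    def gram(A, B):
        # Gaussian kernel matrix between rows of A and rows of B
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2 * sigma ** 2))

    # H_ij = integral of k_i(x) k_j(x) dx, available in closed form
    # for Gaussian kernels (product of two Gaussians integrates analytically)
    sq_cc = ((C[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    H = (np.pi * sigma ** 2) ** (d / 2) * np.exp(-sq_cc / (4 * sigma ** 2))

    # h_i = mean_j k_i(x_j^p) - mean_j k_i(x_j^q): empirical version of
    # the integral of k_i against the density difference
    h = gram(C, xp).mean(axis=1) - gram(C, xq).mean(axis=1)

    # Closed-form regularized least-squares solution
    theta = np.linalg.solve(H + lam * np.eye(len(C)), h)

    # Plug-in L2-distance approximation: 2 h^T theta - theta^T H theta
    L2 = 2 * h @ theta - theta @ H @ theta
    return theta, max(L2, 0.0)
```

Because the difference is fitted in one shot, the estimator never has to model regions where p and q agree, which is exactly where a two-step approach wastes its accuracy budget. For identical sample sets the vector `h` vanishes, so the estimated distance is zero, while a genuine shift between p and q yields a strictly positive estimate.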