Bioinformatics: The Machine Learning Approach.
Algebraic Analysis for Singular Statistical Estimation. ALT '99: Proceedings of the 10th International Conference on Algorithmic Learning Theory.
Algebraic Analysis for Nonidentifiable Learning Machines. Neural Computation.
Asymptotic Bayesian Generalization Error When Training and Test Distributions Are Different. Proceedings of the 24th International Conference on Machine Learning.
Covariate Shift Adaptation by Importance Weighted Cross Validation. The Journal of Machine Learning Research.
In the standard setting of statistical learning theory, the training and test data are assumed to be generated from the same distribution. However, this assumption does not hold in many practical cases, e.g., brain-computer interfacing and bioinformatics. In particular, a change of the input distribution in regression problems occurs frequently and is known as covariate shift. Many studies have addressed adapting to this change, since ordinary machine learning methods do not work properly under the shift. Asymptotic theory has also been developed for Bayesian inference. Although many effective results have been reported for statistically regular models, non-regular models have received little attention. This paper focuses on the behavior of non-regular models under covariate shift. In an earlier study [1], we formally identified the factors that change the generalization error and established an upper bound on it. Here we report experimental results that support those theoretical findings. Moreover, we observe that the choice of basis function in the model plays an important role in some cases.
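To make the setting concrete, the following is a minimal sketch of covariate shift in regression: the conditional p(y|x) is fixed, but the training and test input densities differ, and importance weights w(x) = p_test(x)/p_train(x) (assumed known here; the distributions, target function, and parameter values are illustrative, not taken from the paper) reweight an ordinary least-squares fit toward the test region.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed conditional: y = sin(pi * x) + Gaussian noise (illustrative target).
def true_fn(x):
    return np.sin(np.pi * x)

# Covariate shift: training inputs ~ N(0, 0.5^2), test inputs ~ N(1.5, 0.5^2),
# while p(y|x) stays the same for both.
x_tr = rng.normal(0.0, 0.5, 200)
y_tr = true_fn(x_tr) + rng.normal(0.0, 0.1, 200)
x_te = rng.normal(1.5, 0.5, 200)
y_te = true_fn(x_te) + rng.normal(0.0, 0.1, 200)

def gauss_pdf(x, mu, s):
    return np.exp(-((x - mu) ** 2) / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

# Importance weights w(x) = p_test(x) / p_train(x); both densities are
# known in this toy setup (in practice they must be estimated).
w = gauss_pdf(x_tr, 1.5, 0.5) / gauss_pdf(x_tr, 0.0, 0.5)

def fit_linear(x, y, weights=None):
    """(Weighted) least squares for the linear basis [1, x]."""
    X = np.column_stack([np.ones_like(x), x])
    if weights is None:
        weights = np.ones_like(x)
    W = np.diag(weights)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

def mse(beta, x, y):
    return float(np.mean((beta[0] + beta[1] * x - y) ** 2))

b_ols = fit_linear(x_tr, y_tr)      # ignores the shift
b_iw = fit_linear(x_tr, y_tr, w)    # importance-weighted fit

print("test MSE, plain OLS:", mse(b_ols, x_te, y_te))
print("test MSE, weighted :", mse(b_iw, x_te, y_te))
```

Because the linear model here is misspecified for sin(pi * x), the two fits generally differ: the weighted estimator emphasizes training points near the test region. The model's basis functions determine how strongly this reweighting matters, echoing the paper's observation about the role of the basis function.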