A conditional density function, which describes the relationship between a response and its explanatory variables, plays an important role in many analysis problems. In this paper, we propose a new kernel-based parametric method for estimating conditional densities. An exponential function is employed to approximate the unknown density, and its parameters are computed from the given explanatory variable via a nonlinear mapping using kernel principal component analysis (KPCA). We also develop a new kernel function, a variant of the polynomial kernel, for use in KPCA. The proposed method is compared with the Nadaraya-Watson estimator on both simulated and real data. Experimental results show that the proposed method outperforms the Nadaraya-Watson estimator in terms of revised mean integrated squared error (RMISE), making it an effective approach to conditional density estimation.
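The abstract does not specify the proposed estimator in enough detail to implement, but the Nadaraya-Watson baseline it compares against is standard: weight a kernel density estimate in the response variable by each sample's kernel proximity in the explanatory variable. A minimal NumPy sketch, assuming Gaussian kernels, hand-picked bandwidths `hx`, `hy`, and toy data (all illustrative choices, not from the paper):

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel K(u) = exp(-u^2/2) / sqrt(2*pi)."""
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def nw_conditional_density(x_train, y_train, x0, y_grid, hx, hy):
    """Nadaraya-Watson conditional density estimate f(y | x = x0):

        f(y|x0) = sum_i K((x0-x_i)/hx) K((y-y_i)/hy) / (hy * sum_i K((x0-x_i)/hx))
    """
    wx = gaussian_kernel((x0 - x_train) / hx)              # weights in x
    ky = gaussian_kernel((y_grid[:, None] - y_train[None, :]) / hy) / hy
    return ky @ wx / wx.sum()

# Toy data (an assumption for illustration): y = sin(x) + Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, 500)
y = np.sin(x) + rng.normal(0.0, 0.3, 500)

y_grid = np.linspace(-2.0, 2.0, 201)
dens = nw_conditional_density(x, y, x0=1.0, y_grid=y_grid, hx=0.5, hy=0.3)

# The estimate should integrate to roughly 1 over the grid and
# peak in the vicinity of sin(1.0) ~ 0.84.
print(np.trapz(dens, y_grid))
print(y_grid[np.argmax(dens)])
```

The paper's RMISE comparison would then be computed between such an estimate and the true conditional density on a simulated design, where the true density is known.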