In this paper, we extend the Hilbert space embedding approach to handle conditional distributions. We derive a kernel estimate for the conditional embedding and show its connection to ordinary embeddings. Conditional embeddings greatly extend our ability to manipulate distributions in Hilbert spaces, and as an example, we derive a nonparametric method for modeling dynamical systems in which the belief state of the system is maintained as a conditional embedding. Our method is very general in terms of both the domains and the types of distributions that it can handle, and we demonstrate its effectiveness on various dynamical systems. We expect that conditional embeddings will find wider applications beyond modeling dynamical systems.
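As a rough illustration only, not the paper's exact procedure, the sketch below computes the standard empirical form of a conditional mean embedding: given paired samples (x_i, y_i), the embedding of P(Y | X = x) is represented by weights beta = (K + n*lambda*I)^{-1} k_x over the output samples, so that any RKHS function of Y can be conditionally averaged as a weighted sum. A Gaussian RBF kernel, the regularization value, the function names, and the toy data are all assumptions made for this example.

```python
import numpy as np

def rbf_gram(A, B, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def conditional_embedding_weights(X, lam, x_query, gamma=1.0):
    """Weights beta such that the estimated conditional embedding of Y given
    X = x_query is sum_i beta_i * phi(y_i), i.e. beta = (K + n*lam*I)^{-1} k_x."""
    n = X.shape[0]
    K = rbf_gram(X, X, gamma)                    # Gram matrix on the inputs
    k_x = rbf_gram(X, x_query[None, :], gamma)   # kernel evaluations at the query point
    return np.linalg.solve(K + n * lam * np.eye(n), k_x).ravel()

# Toy usage: with the identity feature of Y, the weighted sum recovers E[Y | X = x].
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
Y = np.sin(2 * X) + 0.1 * rng.normal(size=(200, 1))

beta = conditional_embedding_weights(X, lam=1e-2, x_query=np.array([0.5]))
print(beta @ Y)   # approximately sin(1.0) for this toy data
```

The same weight vector can be reused to take conditional expectations of any kernel feature of Y, which is what allows a belief state over Y to be carried forward purely through embeddings rather than explicit densities.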