We propose Dirichlet process mixtures of generalized linear models (DP-GLM), a new class of methods for nonparametric regression. Given a data set of input-response pairs, the DP-GLM produces a model of the joint distribution through a mixture of local generalized linear models. DP-GLMs allow both continuous and categorical inputs and can model the same class of responses that can be modeled with a generalized linear model. We study the properties of the DP-GLM and show why it provides better predictions and density estimates than existing Dirichlet process mixture regression models. We give conditions for weak consistency of the joint distribution and pointwise consistency of the regression estimate.