A valuation-based language for expert systems
International Journal of Approximate Reasoning
On the Optimality of the Simple Bayesian Classifier under Zero-One Loss
Machine Learning - Special issue on learning with probabilistic representations
Computational Geometry for Design and Manufacture
Bézier and B-Spline Techniques
Mixtures of Truncated Exponentials in Hybrid Bayesian Networks
ECSQARU '01 Proceedings of the 6th European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty
Axioms for probability and belief-function propagation
UAI '88 Proceedings of the Fourth Annual Conference on Uncertainty in Artificial Intelligence
Pattern Classification (2nd Edition)
Knot selection by boosting techniques
Computational Statistics & Data Analysis
Bivariate Lagrange interpolation at the Padua points: the ideal theory approach
Numerische Mathematik
Hyperinterpolation in the cube
Computers & Mathematics with Applications
Bayesian classifiers based on kernel density estimation: Flexible classifiers
International Journal of Approximate Reasoning
Maximum Likelihood Learning of Conditional MTE Distributions
ECSQARU '09 Proceedings of the 10th European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty
Learning hybrid Bayesian networks using mixtures of truncated exponentials
International Journal of Approximate Reasoning
IDA'07 Proceedings of the 7th international conference on Intelligent data analysis
Parameter estimation and model selection for mixtures of truncated exponentials
International Journal of Approximate Reasoning
Inference in hybrid Bayesian networks using mixtures of polynomials
International Journal of Approximate Reasoning
Discriminative Learning of Bayesian Networks via Factorized Conditional Log-Likelihood
The Journal of Machine Learning Research
Estimating continuous distributions in Bayesian classifiers
UAI'95 Proceedings of the Eleventh conference on Uncertainty in artificial intelligence
Mixtures of truncated basis functions
International Journal of Approximate Reasoning
Nonparametric multivariate density estimation: a comparative study
IEEE Transactions on Signal Processing
Two issues in using mixtures of polynomials for inference in hybrid Bayesian networks
International Journal of Approximate Reasoning
No free lunch theorems for optimization
IEEE Transactions on Evolutionary Computation
A Survey of Discretization Techniques: Taxonomy and Empirical Analysis in Supervised Learning
IEEE Transactions on Knowledge and Data Engineering
Non-parametric density estimation is an important technique in probabilistic modeling and reasoning with uncertainty. We present a method for learning mixture of polynomials (MoP) approximations of one-dimensional and multidimensional probability densities from data. The method is based on basis spline (B-spline) interpolation, where a density is approximated as a linear combination of basis splines. We compute maximum likelihood estimates of the mixing coefficients of the linear combination. The Bayesian information criterion is used as the score function to select the order of the polynomials and the number of pieces of the MoP. The method is evaluated in two ways. First, we assess the approximation fit: we sample artificial datasets from known one-dimensional and multidimensional densities, learn MoP approximations from these datasets, analyze the quality of the approximations according to several criteria, and compare the new proposal with MoPs learned by Lagrange interpolation and with mixtures of truncated basis functions. Second, we use the proposed method as a non-parametric density estimation technique in Bayesian classifiers. We implement and compare two of the most widely studied Bayesian classifiers, the naive Bayes and tree-augmented naive Bayes classifiers. Results on real datasets show that the non-parametric Bayesian classifiers using MoPs are comparable to kernel density-based Bayesian classifiers. We provide a free R package implementing the proposed methods.
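The core idea of the abstract can be sketched in a few lines: approximate a one-dimensional density as a convex combination of B-spline basis functions and fit the mixing coefficients by maximum likelihood. The sketch below is an illustration under assumptions, not the authors' exact algorithm (it uses a fixed knot grid and a plain EM update for the mixture weights, and omits the BIC-based selection of the polynomial order and number of pieces). All names and parameter choices here are the example's own.

```python
# Hedged sketch: 1-D density estimation as a mixture of normalized B-spline
# basis functions, mixing weights fitted by EM (fixed-component mixture ML).
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=500)          # artificial sample from a known density
a, b = data.min() - 0.5, data.max() + 0.5      # support of the approximation

degree = 3                                     # cubic splines -> MoP pieces of order 4
n_pieces = 8                                   # number of sub-intervals (chosen by hand here,
                                               # by BIC in the paper's method)
inner = np.linspace(a, b, n_pieces + 1)
knots = np.concatenate([[a] * degree, inner, [b] * degree])  # clamped knot vector
n_basis = len(knots) - degree - 1

# Evaluate each basis spline at the data points, normalizing it to integrate
# to 1 over [a, b] so each basis function is itself a valid density.
cols = []
for i in range(n_basis):
    coef = np.zeros(n_basis)
    coef[i] = 1.0
    spl = BSpline(knots, coef, degree, extrapolate=False)
    mass = spl.integrate(a, b)
    cols.append(np.nan_to_num(spl(data)) / mass)
B = np.column_stack(cols)                      # shape (n_samples, n_basis)

# EM for the mixing coefficients: the standard update for a mixture whose
# component densities are fixed and only the weights are learned.
w = np.full(n_basis, 1.0 / n_basis)
for _ in range(200):
    dens = B @ w                               # f(x_j) at each sample
    resp = B * w / dens[:, None]               # responsibilities, rows sum to 1
    w = resp.mean(axis=0)                      # weights stay on the simplex

print(w.round(3))                              # fitted mixing coefficients
```

Because each normalized basis spline is a polynomial piece with unit mass, the fitted combination `sum_i w[i] * b_i(x)` is automatically a nonnegative density integrating to 1, which is one reason B-spline bases are convenient for MoP learning.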