A new efficient technique for estimating probability densities from data, based on the approximate global maximum likelihood (AGML) approach, is proposed. It employs a composition of kernel functions to capture the behavior of the parameters appearing in the expression of the unknown probability density. Convergence to the optimal solution is guaranteed within a deterministic learning framework when low-discrepancy sequences are used to generate the kernel centers. Trials on mixtures of Gaussians show that the proposed semi-local technique efficiently approximates the maximum likelihood solution even in complex situations where implementations based on standard neural networks incur excessive computational cost.
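The core ingredients described above — kernel centers drawn from a low-discrepancy sequence and a likelihood-maximizing fit of the kernel combination — can be sketched as follows. This is a minimal illustration, not the AGML algorithm itself: the bandwidth `h`, the number of kernels `m`, and the EM-style multiplicative weight update are simplifying assumptions introduced here, whereas the actual method learns the kernel parameters within its deterministic framework.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)
# Illustrative data: a two-component Gaussian mixture.
data = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(1.5, 0.8, 500)])

# Kernel centers from a low-discrepancy (Halton) sequence over the data range,
# instead of random or data-driven placement.
m = 16
lo, hi = data.min(), data.max()
centers = lo + (hi - lo) * qmc.Halton(d=1, seed=0).random(m).ravel()

h = 0.5  # fixed kernel bandwidth (an assumption; the method learns parameters)
# Gaussian kernel matrix: K[i, j] = N(data[i]; centers[j], h^2)
K = np.exp(-0.5 * ((data[:, None] - centers[None, :]) / h) ** 2)
K /= h * np.sqrt(2.0 * np.pi)

# Maximize the log-likelihood over the mixture weights with EM-style
# multiplicative updates; a simplified stand-in for the AGML optimization.
w = np.full(m, 1.0 / m)
for _ in range(200):
    resp = K * w                            # unnormalized responsibilities
    resp /= resp.sum(axis=1, keepdims=True)
    w = resp.mean(axis=0)                   # re-estimate mixture weights

density = K @ w  # fitted density evaluated at the data points
print("avg log-likelihood:", np.log(density).mean())
```

Because the centers are deterministic, two runs with the same data and the same Halton seed produce identical estimates, which is the practical appeal of the deterministic learning framework over stochastic center placement.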