This paper presents a general and efficient framework for probabilistic inference and learning from arbitrary uncertain information. It exploits the computational properties of finite mixture models, conjugate families, and factorization. Both the joint probability density of the variables and the likelihood function of the (objective or subjective) observations are approximated by a special mixture model, in such a way that any desired conditional distribution can be obtained directly, without numerical integration. We have developed an extended version of the expectation-maximization (EM) algorithm to estimate the parameters of mixture models from uncertain training examples (indirect observations). As a consequence, any piece of exact or uncertain information about both input and output values is handled consistently in both the inference and learning stages. This ability, extremely useful in certain situations, is not found in most alternative methods. The proposed framework is formally justified from standard probabilistic principles, and illustrative examples are provided in the fields of nonparametric pattern classification, nonlinear regression, and pattern completion. Finally, experiments on a real application and comparative results over standard databases provide empirical evidence of the utility of the method in a wide range of applications.
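As context for the extended EM algorithm described above, the sketch below shows the *standard* EM baseline it generalizes: maximum-likelihood estimation of a one-dimensional Gaussian mixture from exact (fully observed) examples. This is an illustrative reimplementation, not the paper's extended algorithm for indirect observations; the function name and initialization scheme are our own choices.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50):
    """Standard EM for a 1-D Gaussian mixture fitted to exact observations x.

    The paper's extended EM generalizes this to uncertain training
    examples (indirect observations); here each x[i] is observed exactly.
    """
    n = len(x)
    # Initialize: equal weights, quantile-spread means, pooled variance.
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, np.var(x))
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        diff = x[:, None] - mu[None, :]
        logp = -0.5 * (np.log(2 * np.pi * var) + diff**2 / var) + np.log(w)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form re-estimation of weights, means, variances.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Two well-separated clusters; EM should recover means near 0 and 10.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 500), rng.normal(10, 1, 500)])
w, mu, var = em_gmm_1d(data)
```

In the framework above, the E-step's point evaluation of each component density would be replaced by an integral of the component density against the likelihood of the uncertain observation, which the paper keeps tractable through conjugate families.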