In this paper we present a sequential expectation-maximization (EM) algorithm that adapts a Gaussian mixture model for a classification problem in an unsupervised manner. The goal is to adapt the Gaussian mixture model to cope with non-stationarity in the data to be classified, and hence to preserve classification accuracy. Experimental results on synthetic data show that the method learns the time-varying statistical features of the data by adapting a Gaussian mixture model online. To control the adaptation and ensure the stability of the adapted model, we introduce an index that detects when the adaptation would fail.
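The abstract does not spell out the update equations, but a sequential EM adaptation of a Gaussian mixture can be sketched as a stepwise update of running sufficient statistics with a decaying step size. The following is a minimal illustration under that assumption; the function name `online_em_step`, the step-size schedule `eta = (t + 2) ** -alpha`, and the small diagonal regularizer are choices made here for the sketch, not details taken from the paper.

```python
import numpy as np

def online_em_step(x, weights, means, covs, stats, t, alpha=0.6):
    """One stepwise-EM update of a Gaussian mixture with a new sample x.

    stats[k] holds running sufficient statistics (s0, s1, s2) for
    component k; t is the sample index; alpha controls the decaying
    step size eta_t = (t + 2)**(-alpha)  (an assumed schedule).
    """
    K, d = means.shape

    # E-step: responsibility of each component for the new sample
    resp = np.empty(K)
    for k in range(K):
        diff = x - means[k]
        inv = np.linalg.inv(covs[k])
        norm = ((2 * np.pi) ** d * np.linalg.det(covs[k])) ** -0.5
        resp[k] = weights[k] * norm * np.exp(-0.5 * diff @ inv @ diff)
    resp /= resp.sum()

    eta = (t + 2) ** -alpha  # decaying step size
    for k in range(K):
        s0, s1, s2 = stats[k]
        # Interpolate old statistics toward the new sample's contribution
        s0 = (1 - eta) * s0 + eta * resp[k]
        s1 = (1 - eta) * s1 + eta * resp[k] * x
        s2 = (1 - eta) * s2 + eta * resp[k] * np.outer(x, x)
        stats[k] = (s0, s1, s2)
        # M-step: re-estimate parameters from the running statistics;
        # the small diagonal term keeps the covariance well conditioned
        weights[k] = s0
        means[k] = s1 / s0
        covs[k] = s2 / s0 - np.outer(means[k], means[k]) + 1e-6 * np.eye(d)
    weights /= weights.sum()
    return weights, means, covs
```

Fed a stream of samples whose cluster centers drift over time, repeated calls to `online_em_step` track the moving mixture without storing past data, which is the essential property an adaptive classifier needs under non-stationarity.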