An outlier-aware data clustering algorithm in mixture models
ICICS'09 Proceedings of the 7th international conference on Information, communications and signal processing
This paper presents a novel method, based on deterministic annealing, that circumvents the sensitivity to atypical observations of the maximum likelihood (ML) estimator obtained via the conventional EM algorithm for mixture models. To learn mixture models robustly, the model parameters are estimated with the trimmed likelihood estimator (TLE), and the learning process is controlled by a temperature parameter derived from the principle of maximum entropy. We further apply the proposed method to single-trial electroencephalography (EEG) classification. This work is motivated by the need to eliminate the negative effects of artifacts, which commonly contaminate EEG data recorded in real-life environments, and the experimental results demonstrate that the proposed method successfully detects outliers and therefore achieves more reliable classification results.
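The two ingredients named in the abstract — trimmed likelihood estimation and temperature-controlled (deterministic-annealing) learning — can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the function name, the temperature schedule, the use of a fixed annealed variance as the temperature, and the distance-based trimming score are all illustrative assumptions for a one-dimensional two-component problem.

```python
import numpy as np

def da_trimmed_clustering(x, k=2, trim_frac=0.05,
                          temps=(50.0, 10.0, 2.0, 1.0),
                          n_iter=30, seed=0):
    """Illustrative sketch (not the paper's exact method): soft mixture
    learning by deterministic annealing, where the temperature T plays the
    role of a fixed component variance annealed downward, combined with
    trimming: at each step the trim_frac fraction of points least supported
    by the current model is excluded from the parameter updates, in the
    spirit of the trimmed likelihood estimator (TLE)."""
    x = np.asarray(x, dtype=float)
    rng = np.random.default_rng(seed)
    mu = np.full(k, x.mean())            # start fully collapsed (high temperature)
    pi = np.full(k, 1.0 / k)
    keep = np.ones(x.size, dtype=bool)
    for T in temps:                      # cooling schedule, high T -> low T
        mu = mu + rng.normal(0.0, 1e-3, k)   # tiny jitter so components can split
        for _ in range(n_iter):
            # E-step at temperature T: Gibbs responsibilities
            logp = np.log(pi) - (x[:, None] - mu) ** 2 / (2.0 * T)
            logp -= logp.max(axis=1, keepdims=True)
            r = np.exp(logp)
            r /= r.sum(axis=1, keepdims=True)
            # Trimming: squared distance to the nearest component as a
            # simple proxy for "low mixture likelihood"
            score = ((x[:, None] - mu) ** 2).min(axis=1)
            keep = score <= np.quantile(score, 1.0 - trim_frac)
            # M-step on the retained points only
            rk = r[keep]
            nk = rk.sum(axis=0) + 1e-12
            mu = (rk * x[keep, None]).sum(axis=0) / nk
            pi = nk / nk.sum()
    return mu, pi, ~keep   # ~keep flags the trimmed (candidate outlier) points
```

At high temperature the responsibilities are nearly uniform and the component means coincide; as T falls below the data variance the symmetric solution becomes unstable and the means split toward the cluster centers, while the trimming step keeps gross outliers out of every update — the mechanism by which artifacts would be prevented from biasing the estimates.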