In this communication we present a procedure that extends ICA mixture models (ICAMM) to the case of sequential dependence in the record of feature observations. We call it sequential ICAMM (SICAMM). We present the algorithm, essentially a sequential Bayes processor, which can be used to sequentially classify each input feature vector into one of a given set of possible classes. Estimates of the class-transition probabilities are used in conjunction with the classical ICAMM parameters: mixture matrices, centroids, and source probability densities. Some simulations are presented to verify the improvement of SICAMM with respect to ICAMM. Moreover, a real-data case is considered: the computation of hypnograms to help in the diagnosis of sleep disorders. Both the simulated and the real-data analyses suggest the potential interest of including sequential dependence in the implementation of an ICAMM classifier.
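The sequential Bayes processor described above can be sketched as a forward recursion: at each time step the previous class posterior is propagated through the class-transition matrix and then updated with the likelihood of the current observation. The sketch below is a minimal illustration under simplifying assumptions; in SICAMM the per-class likelihoods would come from the ICAMM parameters (mixture matrix, centroid, and source densities of each class), whereas here they are simply passed in as a precomputed array, and all names are hypothetical.

```python
import numpy as np

def sequential_bayes_classify(likelihoods, trans, prior):
    """Forward recursion for sequential classification.

    likelihoods : (T, K) array, p(x_t | class k) for each time step
                  (in SICAMM these would be the ICAMM class likelihoods)
    trans       : (K, K) array, trans[i, j] = P(class j at t | class i at t-1)
    prior       : (K,) initial class probabilities
    Returns the MAP class label per step and the (T, K) posterior.
    """
    T, K = likelihoods.shape
    post = np.zeros((T, K))
    belief = np.asarray(prior, dtype=float)
    for t in range(T):
        if t > 0:
            # Predict step: propagate through class-transition probabilities.
            belief = belief @ trans
        # Update step: weight by the observation likelihood and normalize.
        b = belief * likelihoods[t]
        post[t] = b / b.sum()
        belief = post[t]
    return post.argmax(axis=1), post
```

With a "sticky" transition matrix (high self-transition probability), the recursion smooths over momentarily ambiguous observations, which is the expected benefit over a memoryless ICAMM classifier that treats each feature vector independently.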