Factoring Gaussian precision matrices for linear dynamic models
Pattern Recognition Letters
Most HMM-based speech recognition systems use Gaussian mixtures as observation probability density functions. An important goal in all such systems is to improve parsimony. One method is to adjust the type of covariance matrix used. In this work, factored sparse inverse covariance matrices are introduced. Based on a U'DU factorization, the inverse covariance matrix can be represented by linear regression coefficients that 1) correspond to sparse patterns in the inverse covariance matrix (and therefore encode conditional independence properties of the Gaussian), and 2) allow partial tying of the covariance matrices without requiring non-linear EM update equations. Results show that the performance of full-covariance Gaussians can be matched by factored sparse inverse covariance Gaussians with significantly fewer parameters.
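The U'DU factorization mentioned in the abstract can be illustrated numerically. The sketch below (an assumption about the construction, not code from the paper) factors a symmetric positive-definite precision matrix K as U'DU with U unit upper-triangular, by rescaling the rows of the upper Cholesky factor. For a chain-structured (tridiagonal) precision matrix, the zeros in K reappear as zeros in U, which is the link between sparse regression coefficients and conditional independence that the abstract describes.

```python
import numpy as np

def udu_factor(K):
    """Factor a symmetric positive-definite K as U^T D U.

    U is unit upper-triangular; D is returned as the vector of positive
    diagonal entries. Obtained from the upper Cholesky factor R
    (K = R^T R) by pulling R's diagonal out into D.
    """
    R = np.linalg.cholesky(K).T          # K = R^T R, R upper triangular
    d = np.diag(R)
    U = R / d[:, None]                   # rescale rows -> unit diagonal
    return U, d ** 2

# Illustrative tridiagonal precision matrix of a first-order
# Gauss-Markov chain (a hypothetical example, not from the paper).
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])

U, D = udu_factor(K)

# Exact reconstruction: K = U^T diag(D) U.
assert np.allclose(U.T @ np.diag(D) @ U, K)
# The zero in K (variables 1 and 3 conditionally independent given 2)
# survives as a zero regression coefficient in U.
assert abs(U[0, 2]) < 1e-12
```

The off-diagonal entries of each row of U act as (negated) coefficients for linearly regressing one variable on the later ones, so tying or zeroing entries of U directly sparsifies the precision matrix without leaving the Gaussian family.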