Undirected graphs are often used to describe high-dimensional distributions. Under sparsity conditions, the graph can be estimated using ℓ1-penalization methods. We propose and study the following method, which combines a multiple regression approach with ideas of thresholding and refitting: first, we infer a sparse undirected graphical model structure by thresholding each of many ℓ1-norm penalized regression functions; second, we estimate the covariance matrix and its inverse by maximum likelihood restricted to this structure. We show that, under suitable conditions, this approach yields consistent estimation of the graphical structure and fast convergence rates with respect to the operator and Frobenius norms for the covariance matrix and its inverse. We also derive an explicit bound for the Kullback-Leibler divergence.
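The graph-estimation stage described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the tuning parameters `alpha` and `tau` are illustrative choices, scikit-learn's `Lasso` stands in for any ℓ1-penalized regression solver, and the final refitting step (maximum likelihood subject to the estimated zero pattern) is omitted for brevity.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulate n samples from a sparse Gaussian whose precision matrix is
# tridiagonal, so the true undirected graph is a chain 0-1-...-9.
rng = np.random.default_rng(0)
p, n = 10, 500
Omega = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(Omega), size=n)

# Step 1: nodewise l1-penalized regressions -- regress each variable
# on all the others and record the coefficient vectors.
B = np.zeros((p, p))
for j in range(p):
    others = [k for k in range(p) if k != j]
    fit = Lasso(alpha=0.05).fit(X[:, others], X[:, j])  # alpha: illustrative
    B[j, others] = fit.coef_

# Step 2: threshold small coefficients and symmetrize ("OR" rule)
# to obtain the estimated edge set of the graphical model.
tau = 0.15  # illustrative threshold
E = (np.abs(B) > tau) | (np.abs(B) > tau).T

# Edges present in the estimated graph (upper triangle).
est_edges = {(i, j) for i in range(p) for j in range(i + 1, p) if E[i, j]}
```

In a final step, one would refit the covariance matrix and its inverse by maximizing the Gaussian likelihood with the precision-matrix entries outside `est_edges` constrained to zero.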