Given a sample covariance matrix, we solve a maximum likelihood problem penalized by the number of nonzero coefficients in the inverse covariance matrix. Our objective is to find a sparse representation of the sample data and to highlight conditional independence relationships between the sample variables. We first formulate a convex relaxation of this combinatorial problem; we then detail two efficient first-order algorithms with low memory requirements to solve large-scale, dense problem instances.
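The standard convex relaxation replaces the nonzero-count penalty with an l1 norm, giving the penalized log-likelihood problem min_X -log det X + tr(SX) + rho*||X||_1 over positive definite X. As a hedged illustration (not the authors' specific algorithms), the sketch below solves this relaxation with a plain proximal gradient iteration in NumPy; the step size, penalty `rho`, and toy data are illustrative choices, not values from the text.

```python
import numpy as np

def soft_threshold(A, tau):
    # Elementwise soft-thresholding: the prox operator of tau * ||.||_1.
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def sparse_precision(S, rho=0.1, step=0.05, n_iter=500):
    """Proximal-gradient sketch (illustrative, not the paper's method) for
       min_X  -log det X + tr(S X) + rho * ||X||_1   over symmetric PD X."""
    X = np.diag(1.0 / np.diag(S))            # feasible diagonal start
    for _ in range(n_iter):
        grad = S - np.linalg.inv(X)          # gradient of the smooth part
        X = soft_threshold(X - step * grad, step * rho)
        X = 0.5 * (X + X.T)                  # re-symmetrize after the update
    return X

# Toy example: sample covariance drawn from a known sparse precision matrix.
rng = np.random.default_rng(0)
Theta = np.array([[2.0, 0.6, 0.0],
                  [0.6, 2.0, 0.6],
                  [0.0, 0.6, 2.0]])          # true sparse inverse covariance
Sigma = np.linalg.inv(Theta)
Z = rng.multivariate_normal(np.zeros(3), Sigma, size=2000)
S = np.cov(Z, rowvar=False)

X_hat = sparse_precision(S, rho=0.1)
```

The soft-thresholding step is what drives small entries of the estimate to exact zero, which is precisely how the l1 relaxation exposes conditional independence (a zero in the inverse covariance of a Gaussian means the corresponding pair of variables is conditionally independent given the rest). A small fixed step keeps the iterates inside the positive definite cone for well-conditioned inputs; a careful implementation would add a line search or check.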