A nonmonotone line search technique for Newton's method. SIAM Journal on Numerical Analysis.
Determinant Maximization with Linear Matrix Inequality Constraints. SIAM Journal on Matrix Analysis and Applications.
Nonmonotone Spectral Projected Gradient Methods on Convex Sets. SIAM Journal on Optimization.
Sparse graphical models for exploring gene expression data. Journal of Multivariate Analysis.
Smooth minimization of non-smooth functions. Mathematical Programming.
Factored sparse inverse covariance matrices. Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '00), Volume 2.
First-Order Methods for Sparse Covariance Selection. SIAM Journal on Matrix Analysis and Applications.
Covariance selection for nonchordal graphs via chordal embedding. Optimization Methods & Software (special issue on mathematical programming in data mining and machine learning).
Smooth Optimization Approach for Sparse Covariance Selection. SIAM Journal on Optimization.
Topology Selection in Graphical Models of Autoregressive Processes. Journal of Machine Learning Research.
Solving Log-Determinant Optimization Problems by a Newton-CG Primal Proximal Point Algorithm. SIAM Journal on Optimization.
Exact covariance thresholding into connected components for large-scale graphical lasso. Journal of Machine Learning Research.
In this paper we consider estimating the sparse inverse covariance matrix of a Gaussian graphical model whose conditional independence structure is assumed to be partially known. As in [A. d'Aspremont, O. Banerjee, and L. El Ghaoui, SIAM J. Matrix Anal. Appl., 30 (2008), pp. 56-66; M. Yuan and Y. Lin, Biometrika, 94 (2007), pp. 19-35], we formulate this task as an $l_1$-norm penalized maximum likelihood estimation problem. We then propose an algorithmic framework and, within it, develop two first-order methods, namely, the adaptive spectral projected gradient (ASPG) method and the adaptive Nesterov's smooth (ANS) method, for solving this estimation problem. Finally, we compare the performance of these two methods with glasso [J. Friedman, T. Hastie, and R. Tibshirani, Biostatistics, 9 (2008), pp. 432-441; J. Friedman, T. Hastie, and R. Tibshirani, Glasso: Graphical Lasso for R, Software package, Department of Statistics, Stanford University, Stanford, CA, 2007] on a set of randomly generated instances. Our computational results demonstrate that our methods can solve problems of dimension at least one thousand with nearly half a million constraints within a reasonable amount of time, and that the ASPG method generally outperforms both the ANS method and glasso.
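The penalized maximum likelihood problem described above can be sketched in a standard form; the notation here (sample covariance $\hat{\Sigma}$, penalty parameter $\rho > 0$, and index set $\Omega$ of entry pairs known to be conditionally independent) is an assumed illustration, not necessarily the paper's exact notation:

```latex
\max_{X \succ 0} \quad \log\det X \;-\; \langle \hat{\Sigma}, X \rangle
  \;-\; \rho \sum_{(i,j) \notin \Omega} |X_{ij}|
\qquad \text{subject to} \qquad X_{ij} = 0, \;\; (i,j) \in \Omega.
```

Under this reading, each pair of variables known to be conditionally independent contributes one equality constraint $X_{ij} = 0$ on the inverse covariance, which is how instances with nearly half a million constraints can arise at dimension around one thousand.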