Better subset regression using the nonnegative garrote
Technometrics
MiniMax Methods for Image Reconstruction
A well-conditioned estimator for large-dimensional covariance matrices
Journal of Multivariate Analysis
First-Order Methods for Sparse Covariance Selection
SIAM Journal on Matrix Analysis and Applications
High-dimensional Covariance Estimation Based On Gaussian Graphical Models
The Journal of Machine Learning Research
Model selection and estimation in the matrix normal graphical model
Journal of Multivariate Analysis
CODA: high dimensional copula discriminant analysis
The Journal of Machine Learning Research
Self-learning K-means clustering: a global optimization approach
Journal of Global Optimization
Edge detection in sparse Gaussian graphical models
Computational Statistics & Data Analysis
Sparse matrix inversion with scaled Lasso
The Journal of Machine Learning Research
A joint convex penalty for inverse covariance matrix estimation
Computational Statistics & Data Analysis
This paper considers the problem of estimating a high-dimensional inverse covariance matrix that can be well approximated by "sparse" matrices. Taking advantage of the connection between multivariate linear regression and the entries of the inverse covariance matrix, we propose an estimation procedure that can effectively exploit such "sparsity". The proposed method can be computed using linear programming and therefore has the potential to be used in very high-dimensional problems. Oracle inequalities are established for the estimation error in terms of several operator norms, showing that the method is adaptive to different types of sparsity of the problem.
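To make the linear-programming idea concrete, here is a minimal sketch of a columnwise LP estimator in the same spirit as the abstract (the constrained-l1 style of CLIME-type methods), not a reproduction of the paper's exact procedure. Each column of the inverse covariance is recovered by minimizing its l1 norm subject to an l-infinity constraint tying it to the sample covariance; the function names, the tuning parameter `lam`, and the symmetrization rule are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def lp_column(S, j, lam):
    """Solve min ||b||_1  s.t.  ||S b - e_j||_inf <= lam  as an LP.

    Split b = u - v with u, v >= 0 so the objective sum(u) + sum(v)
    equals the l1 norm at the optimum.
    """
    p = S.shape[0]
    e = np.zeros(p)
    e[j] = 1.0
    c = np.ones(2 * p)                       # objective: sum(u) + sum(v)
    A_ub = np.vstack([
        np.hstack([S, -S]),                  #  S(u - v) <= lam + e_j
        np.hstack([-S, S]),                  # -S(u - v) <= lam - e_j
    ])
    b_ub = np.concatenate([lam + e, lam - e])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * p), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v

def sparse_precision(X, lam=0.1):
    """Columnwise LP estimate of the inverse covariance matrix.

    Symmetrized by keeping, for each (i, j) pair, the entry with the
    smaller absolute value (a common post-processing convention).
    """
    S = np.cov(X, rowvar=False)
    p = S.shape[1]
    Omega = np.column_stack([lp_column(S, j, lam) for j in range(p)])
    keep = np.abs(Omega) <= np.abs(Omega.T)
    return np.where(keep, Omega, Omega.T)
```

Because each column is an independent LP, the procedure parallelizes trivially across columns, which is one reason LP formulations scale to high dimensions.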