Testing the nullspace property using semidefinite programming
Mathematical Programming: Series A and B, Special Issue on "Optimization and Machine Learning" (Alexandre d'Aspremont, Francis Bach, Inderjit S. Dhillon, Bin Yu)
We study a weaker formulation of the nullspace property, which guarantees recovery of sparse signals from linear measurements by ℓ1 minimization. We require this condition to hold only with high probability, given a distribution on the nullspace of the coding matrix A. Under some assumptions on the distribution of the reconstruction error, we show that testing these weaker conditions amounts to bounding the optimal values of two classical graph partitioning problems: k-Dense-Subgraph and MaxCut. Both problems admit efficient, relatively tight relaxations, and we use a randomization argument to produce new approximation bounds for k-Dense-Subgraph. We test the performance of our results on several families of coding matrices.
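The recovery procedure the abstract refers to, ℓ1 minimization (basis pursuit), can be sketched as a linear program: minimize ||x||_1 subject to Ax = b by splitting x into nonnegative parts x+ and x-. The snippet below is a minimal illustration, not the paper's testing procedure; the function name `l1_recover` and the Gaussian test matrix are our own choices for the example.

```python
import numpy as np
from scipy.optimize import linprog

def l1_recover(A, b):
    """Basis pursuit: min ||x||_1 s.t. Ax = b, written as an LP in (x+, x-)."""
    n = A.shape[1]
    c = np.ones(2 * n)                 # objective: sum(x+) + sum(x-) = ||x||_1
    A_eq = np.hstack([A, -A])          # enforce A(x+ - x-) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    return res.x[:n] - res.x[n:]

# Empirical check: a random Gaussian matrix typically recovers a sparse signal
# when the sparsity k is small relative to the number of measurements m.
rng = np.random.default_rng(0)
m, n, k = 40, 80, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
x_hat = l1_recover(A, A @ x0)
print(np.max(np.abs(x_hat - x0)))      # small residual when recovery succeeds
```

The nullspace property (and the weaker, high-probability variant studied here) is exactly the condition on A under which this LP is guaranteed to return the sparse signal x0.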