Abstract
In this paper, we develop verifiable sufficient conditions and computable performance bounds for ℓ1-minimization-based sparse recovery algorithms in both the noise-free and noisy cases. We define a family of quality measures for arbitrary sensing matrices as a set of optimization problems, and we design polynomial-time algorithms with theoretical global convergence guarantees to compute these quality measures. The proposed algorithms solve a series of second-order cone programs or linear programs. We derive performance bounds on the recovery errors in terms of these quality measures. We also demonstrate analytically that the developed quality measures are non-degenerate for a large class of random sensing matrices, provided the number of measurements is relatively large. Numerical experiments show that, compared with restricted-isometry-based performance bounds, our error bounds apply to a wider range of problems and are tighter when the sparsity levels of the signals are relatively low.
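For context, the sketch below illustrates the kind of ℓ1-minimization recovery the bounds apply to in the noise-free case: basis pursuit, min ‖x‖₁ subject to Ax = y, recast as a linear program via the split x = u − v with u, v ≥ 0. This is not the paper's own algorithm for computing the quality measures; the sensing matrix, dimensions, and sparsity level are arbitrary assumptions chosen for illustration.

```python
# Minimal sketch: noise-free l1-minimization (basis pursuit) as a linear program.
# min ||x||_1  s.t.  Ax = y   <=>   min 1^T [u; v]  s.t.  [A, -A][u; v] = y,  u, v >= 0
# All problem sizes and the Gaussian sensing matrix below are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 40, 100, 5                                  # measurements, signal length, sparsity

A = rng.standard_normal((m, n)) / np.sqrt(m)          # random Gaussian sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                        # noise-free measurements

c = np.ones(2 * n)                                    # objective: sum of u and v entries
A_eq = np.hstack([A, -A])                             # equality constraint [A, -A][u; v] = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")

x_hat = res.x[:n] - res.x[n:]                         # recover x = u - v
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With a Gaussian sensing matrix and a sufficiently small sparsity level relative to the number of measurements, this linear program typically recovers the sparse signal exactly in the noise-free setting.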
Original language | English |
---|---|
Article number | 6939687 |
Pages (from-to) | 132-141 |
Number of pages | 10 |
Journal | IEEE Transactions on Signal Processing |
Volume | 63 |
Issue number | 1 |
DOIs | |
State | Published - Jan 1 2015 |
Keywords
- Compressive sensing
- Computable performance bounds
- Linear programming
- Second-order cone programming
- Sparse recovery