The algorithm performs an embedded re-weighting inside the subroutines of the SPGL1 algorithm, updating the Pareto curve with every re-weighted subproblem. The result is improved sparse recovery performance compared with standard ℓ1 minimization, at no additional computational cost.

Sparse signal recovery has been dominated by the basis pursuit denoise (BPDN) problem formulation for over a decade. In this paper, we propose an algorithm that outperforms BPDN in finding sparse solutions to underdetermined linear systems of equations at no additional computational cost. Our algorithm, called WSPGL1, is a modification of the spectral projected gradient for ℓ1 minimization (SPGL1) algorithm in which the sequence of LASSO subproblems is replaced by a sequence of weighted LASSO subproblems with constant weights applied to a support estimate. The support estimate is derived from the data and is updated at every iteration. The algorithm also modifies the Pareto curve at every iteration to reflect the new weighted ℓ1 minimization problem that is being solved. We demonstrate through extensive simulations that the sparse recovery performance of our algorithm is superior to that of ℓ1 minimization and approaches the recovery performance of iterative re-weighted ℓ1 (IRWL1) minimization of Candès, Wakin, and Boyd, although it does not match it in general. Moreover, our algorithm has the computational cost of a single BPDN problem.
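The core idea described above can be illustrated with a small sketch: solve a sequence of weighted LASSO problems, where after each solve a support estimate is formed from the largest entries of the current solution and a constant weight ω < 1 is applied to that estimate. This is not the SPGL1-based implementation of the paper; the proximal-gradient (ISTA) inner solver, the parameter names `lam`, `omega`, and the support size `k` are illustrative assumptions.

```python
import numpy as np

def weighted_lasso_ista(A, b, weights, lam=0.1, iters=300):
    """Solve min_x 0.5*||Ax - b||^2 + lam * sum_i w_i |x_i|
    by proximal gradient descent (ISTA) with weighted soft-thresholding.
    This stands in for the LASSO subproblem solver; WSPGL1 itself uses
    SPGL1's root-finding machinery instead."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - A.T @ (A @ x - b) / L      # gradient step on the data-fit term
        t = lam * weights / L              # per-coordinate threshold
        x = np.sign(z) * np.maximum(np.abs(z) - t, 0.0)  # weighted soft-threshold
    return x

def wspgl1_sketch(A, b, k, omega=0.3, outer=5):
    """Sequence of weighted LASSO solves with a data-driven support estimate:
    constant weight omega on the estimated support, weight 1 elsewhere."""
    n = A.shape[1]
    weights = np.ones(n)
    x = np.zeros(n)
    for _ in range(outer):
        x = weighted_lasso_ista(A, b, weights)
        support = np.argsort(np.abs(x))[-k:]   # k largest entries in magnitude
        weights = np.ones(n)
        weights[support] = omega               # down-weight the estimated support
    return x
```

A usage example on a synthetic compressed-sensing problem: draw a Gaussian measurement matrix with unit-norm columns, a k-sparse signal, and recover it from m < n measurements.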