Co-author: Clayton Webster (UTK/ORNL).

This talk is concerned with the compressed sensing approach to reconstructing high-dimensional functions from a limited amount of data. In this approach, uniform boundedness of the underlying global polynomial bases has often been relied on for complexity analysis and algorithm development. We prove a new, improved recovery condition that does not use this uniform boundedness assumption and that applies to multidimensional Legendre approximations. Specifically, our sample complexity is established using the unbounded envelope of all polynomials, and is thus independent of the choice of polynomial subspace. Some simple, consequent criteria for choosing good random sample sets will also be discussed.

In the second part, I will discuss recovery guarantees for nonconvex optimizations. These minimizations are generally closer to the l_0 penalty than to the l_1 norm, so it is widely accepted (and demonstrated computationally in UQ) that they can enhance the sparsity and accuracy of the approximations. However, beyond a few specific cases, no theory has been available proving that nonconvex penalties perform as well as or better than l_1 minimization in sparse reconstruction. We aim to fill this gap by establishing new recovery guarantees through unified null space properties that encompass most of the nonconvex functionals currently proposed in the literature, verifying that they are truly superior to l_1.
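As a toy illustration of the l_1 vs. nonconvex l_p comparison the abstract refers to (not the talk's actual algorithms or analysis), the sketch below recovers a sparse vector from underdetermined Gaussian measurements using a Chartrand/Yin-style iteratively reweighted least squares (IRLS) scheme for min ||x||_p^p subject to Ax = b. All parameter choices (problem sizes, the eps schedule, warm-starting the nonconvex p = 1/2 run from the l_1 solution) are illustrative assumptions of mine:

```python
import numpy as np

def irls_recovery(A, b, p=1.0, iters=100, x0=None):
    """IRLS sketch for min ||x||_p^p subject to Ax = b.

    Uses a smoothed weight (x_i^2 + eps)^(1 - p/2) ~ |x_i|^(2-p),
    shrinking eps over the iterations.
    """
    # Start from the minimum-norm least-squares solution unless warm-started.
    x = np.linalg.lstsq(A, b, rcond=None)[0] if x0 is None else x0.copy()
    eps = 1.0
    for _ in range(iters):
        # W = diag((x_i^2 + eps)^{1 - p/2}) approximates diag(|x_i|^{2-p}).
        W = np.diag((x**2 + eps) ** (1.0 - p / 2.0))
        # Weighted least-norm solution of Ax = b: x = W A^T (A W A^T)^{-1} b.
        x = W @ A.T @ np.linalg.solve(A @ W @ A.T, b)
        eps = max(eps / 10.0, 1e-12)  # tighten the smoothing parameter
    return x

rng = np.random.default_rng(0)
m, n, s = 50, 100, 6                        # measurements, dimension, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
b = A @ x_true

x_l1 = irls_recovery(A, b, p=1.0)           # convex l_1 surrogate
x_lp = irls_recovery(A, b, p=0.5, x0=x_l1)  # nonconvex l_1/2, warm-started
err_l1 = np.linalg.norm(x_l1 - x_true)
err_lp = np.linalg.norm(x_lp - x_true)
print(f"l_1 error: {err_l1:.2e}, l_1/2 error: {err_lp:.2e}")
```

Warm-starting the nonconvex run from the l_1 minimizer is a common practical hedge against the local minima that p < 1 penalties introduce; the theory in the talk concerns conditions (null space properties) under which such penalties provably recover the sparse solution.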