Simone Brugiapaglia, Rick Archibald and I have been investigating the robustness of constrained l1-minimization regularization to unknown measurement errors. The vast majority of existing theory for such problems requires an a priori bound on the noise level, yet in many, if not most, real-world applications such a bound is unknown. Our work provides the first recovery guarantees for a large class of practical measurement matrices, thereby extending existing results for subgaussian random matrices. It also sheds new light on sparse regularization in practice, addressing questions about the relationship between model fidelity, noise-parameter estimation and reconstruction error.
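As a rough illustration of the kind of sparse regularization at issue, here is a minimal sketch, not the method of the work itself, that solves the unconstrained Lagrangian form of l1 minimization via iterative soft-thresholding (ISTA) in plain NumPy. The regularization parameter lam plays the role of the noise-level knowledge that the constrained formulation presupposes; the Gaussian measurement matrix, sparsity level and all parameter values below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """ISTA for min_x 0.5 * ||A x - y||_2^2 + lam * ||x||_1."""
    # Step size 1/L, where L is the Lipschitz constant of the gradient,
    # i.e. the largest eigenvalue of A^T A.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x

# Illustrative demo: recover a 3-sparse vector from noisy random measurements.
rng = np.random.default_rng(0)
n, N = 40, 100
A = rng.standard_normal((n, N)) / np.sqrt(n)
x_true = np.zeros(N)
x_true[[5, 37, 80]] = [2.0, -1.5, 1.0]
y = A @ x_true + 0.01 * rng.standard_normal(n)
x_hat = ista(A, y, lam=0.02)
```

In the constrained formulation one would instead impose ||A x - y||_2 <= eta for a known noise bound eta; the point of the theory discussed above is precisely what happens when no such eta is available and lam (or eta) must be chosen without it.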
In this work we also demonstrate how the structured sparsity of the polynomial coefficients of high-dimensional functions can be exploited via weighted l1-minimization techniques. This yields approximation algorithms whose sample complexity is essentially independent of the dimension d, thereby mitigating the curse of dimensionality to a significant extent.
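The weighted variant can be sketched in the same ISTA style: each coefficient gets its own threshold lam * w_j, so larger weights penalize (and suppress) the corresponding coefficients more strongly. The increasing weight sequence below, mimicking weights that grow with polynomial degree, and all problem sizes are illustrative assumptions, not the actual setup of the work.

```python
import numpy as np

def weighted_ista(A, y, lam, w, n_iter=500):
    """ISTA for min_x 0.5 * ||A x - y||_2^2 + lam * sum_j w_j * |x_j|.

    The proximal step is a weighted soft-threshold with per-coefficient
    threshold lam * w_j / L.
    """
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    t = lam * w / L  # per-coefficient thresholds
    for _ in range(n_iter):
        z = x - (A.T @ (A @ x - y)) / L
        x = np.sign(z) * np.maximum(np.abs(z) - t, 0.0)
    return x

# Illustrative demo: weights grow with the index, favouring "low-degree"
# coefficients, and the true signal is supported on low indices.
rng = np.random.default_rng(1)
n, N = 40, 100
A = rng.standard_normal((n, N)) / np.sqrt(n)
w = np.sqrt(np.arange(1, N + 1))
x_true = np.zeros(N)
x_true[[2, 10, 25]] = [1.5, -1.0, 2.0]
y = A @ x_true + 0.01 * rng.standard_normal(n)
x_hat = weighted_ista(A, y, lam=0.01, w=w)
```

The design choice here is that the weights encode prior structure: coefficients expected to be small (e.g. those of high polynomial degree) are thresholded more aggressively, which is one way weighted l1 minimization exploits structured sparsity.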