Robustness to unknown error in sparse regularization

Simone Brugiapaglia, Rick Archibald and I have been investigating the robustness of constrained l1-minimization regularization to unknown measurement errors.  The vast majority of existing theory for such problems requires an a priori bound on the noise level, yet in many, if not most, real-world applications no such bound is available.  Our work provides the first recovery guarantees in this setting for a large class of practical measurement matrices, extending existing results for subgaussian random matrices.  It also sheds new light on sparse regularization in practice, addressing the relationship between model fidelity, estimation of the noise parameter and the resulting reconstruction error.
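
To make the setting concrete, the optimization problem in question is quadratically constrained l1 minimization, in which a parameter eta encodes the assumed bound on the measurement noise.  The following is a minimal Python sketch of this problem (using numpy and cvxpy; the dimensions, measurement matrix and choice of eta are illustrative placeholders, not taken from the paper):

    import numpy as np
    import cvxpy as cp

    # Quadratically constrained l1 minimization:
    #     min ||x||_1  subject to  ||A x - y||_2 <= eta,
    # where eta is an assumed a priori bound on the noise level.
    # All sizes and parameters below are illustrative only.
    rng = np.random.default_rng(0)
    m, n, s = 40, 128, 5                           # measurements, dimension, sparsity
    A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
    x_true = np.zeros(n)
    x_true[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
    noise = 0.01 * rng.standard_normal(m)          # true noise, unknown in practice
    y = A @ x_true + noise

    eta = 0.05                                     # guessed noise bound; may be wrong
    x = cp.Variable(n)
    prob = cp.Problem(cp.Minimize(cp.norm1(x)), [cp.norm2(A @ x - y) <= eta])
    prob.solve()
    print("reconstruction error:", np.linalg.norm(x.value - x_true))

When the chosen eta under- or overestimates the true noise level, the standard guarantees no longer apply directly; quantifying the resulting reconstruction error is one of the questions our work addresses.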

Simone will be presenting this work at SPARS 2017 and SampTA 2017.  In the meantime, our conference paper is available here:

Recovery guarantees for compressed sensing with unknown errors

Approximating high-dimensional functions via compressed sensing

Simone Brugiapaglia, Clayton Webster and I have written a chapter that surveys recent trends in high-dimensional approximation using the theory and techniques of compressed sensing:

Polynomial approximation of high-dimensional functions via compressed sensing

In it we demonstrate how the structured sparsity of the polynomial coefficients of high-dimensional functions can be exploited via weighted l1-minimization techniques.  This yields approximation algorithms whose sample complexities are essentially independent of the dimension d, thereby mitigating the curse of dimensionality to a significant extent.
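
As a rough illustration (not taken from the chapter), a weighted l1-minimization program for computing polynomial coefficients has the form sketched below.  In practice the matrix A would contain evaluations of the polynomial basis at the sample points and the weights would be chosen according to the structured sparsity model; here both are placeholders:

    import numpy as np
    import cvxpy as cp

    # Weighted l1 minimization:
    #     min sum_j w_j |c_j|  subject to  ||A c - y||_2 <= eta,
    # where c collects the polynomial coefficients.  The weights, matrix and
    # data below are placeholders, not the choices made in the chapter.
    rng = np.random.default_rng(1)
    N, m = 64, 30                                  # basis functions, sample points
    A = rng.standard_normal((m, N)) / np.sqrt(m)   # stand-in for basis evaluations
    c_true = np.zeros(N)
    c_true[[2, 7]] = [1.0, 0.5]                    # two active coefficients
    y = A @ c_true                                 # synthetic (noiseless) samples

    w = 1.0 + 0.1 * np.arange(N)                   # illustrative increasing weights
    eta = 1e-3                                     # noise tolerance
    c = cp.Variable(N)
    objective = cp.Minimize(cp.sum(cp.multiply(w, cp.abs(c))))
    prob = cp.Problem(objective, [cp.norm2(A @ c - y) <= eta])
    prob.solve()
    print("largest recovered coefficients:", np.argsort(-np.abs(c.value))[:5])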