When approximating a multivariate function defined on an irregular domain, a good choice of sampling points is critical. In this paper, my PhD student Juan and I develop new, practical sampling strategies whose sample complexity is near-optimal: specifically, it is linear (up to a log factor) in the degree of the approximation. This improves on previous approaches, whose sample complexity was at best quadratic in the degree. Here’s the paper:
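As a rough illustration of the general idea (not the exact algorithm from our paper), one can sample from a density proportional to the inverse Christoffel function of the polynomial space and solve a weighted least-squares problem. The sketch below does this for Legendre polynomials in one dimension; the degree, grid size, and oversampling factor are all hypothetical choices made for illustration.

```python
import numpy as np

# Rough sketch of Christoffel-function-based weighted least squares in 1D.
# All parameter choices here are hypothetical, for illustration only.
rng = np.random.default_rng(0)
N = 21                                   # basis size (degree 20)

def legendre_basis(x, N):
    """Orthonormal Legendre basis on [-1, 1] evaluated at x."""
    V = np.polynomial.legendre.legvander(x, N - 1)
    return V / np.sqrt(2.0 / (2 * np.arange(N) + 1))

grid = np.linspace(-1, 1, 4000)          # fine candidate grid
B = legendre_basis(grid, N)
K = np.sum(B**2, axis=1)                 # inverse Christoffel function
prob = K / K.sum()                       # near-optimal sampling density

m = 3 * N                                # oversampling linear in the degree
idx = rng.choice(grid.size, size=m, replace=False, p=prob)
x = grid[idx]
w = np.sqrt(N / K[idx])                  # weights compensate for the density

f = lambda t: np.exp(t) * np.cos(4 * t)  # example target function
A = w[:, None] * legendre_basis(x, N)
c, *_ = np.linalg.lstsq(A, w * f(x), rcond=None)

err = np.max(np.abs(legendre_basis(grid, N) @ c - f(grid)))
```

The key point is that the number of samples m scales linearly (here, a factor of 3) in the basis size, rather than quadratically.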
I am very pleased to say that my postdoc Simone Brugiapaglia has recently accepted a tenure-track position at Concordia University. Congratulations!
Simone was also featured in an in-depth interview conducted by PIMS. Check it out here:
I am pleased to announce the arrival of three new members to my group this fall:
- Nick Dexter is a PIMS Postdoctoral Fellow, joining from the University of Tennessee in the USA.
- Juan Manuel Cárdenas is a PhD student, joining from the University of Concepcion in Chile. He previously visited my group in Spring 2017.
- Matthew King-Roskamp is an NSERC MSc student, and a former undergraduate honours student at SFU. He was previously an undergraduate researcher in my group.
I am pleased to announce we will be organizing the 2020 Foundations of Computational Mathematics conference at Simon Fraser University. My organizing team and I are looking forward to hosting the conference attendees in Vancouver in June 2020. Stay tuned for more information!
Multivariate polynomials are an excellent means of approximating high-dimensional functions on tensor-product domains. But what about approximations on irregular domains, which arise frequently in applications? In our new paper, Daan Huybrechs and I tackle this question using tools from frame theory and approximation theory:
We establish a series of results on approximation rates and sample complexity, deriving bounds that scale well with dimension in a variety of cases.
Matt King-Roskamp – an undergraduate student in my group co-supervised with Simone Brugiapaglia – was awarded runner-up in the poster competition at the recent SIAM PNW Conference for his poster Optimal Sampling Strategies for Compressive Imaging. He beat out a competitive field of graduate and undergraduate students from across the Pacific Northwest region.
In it we explain how recent progress in the theory of compressed sensing allows one to significantly enhance the performance of sparse recovery techniques in imaging applications.
My former undergraduate student Anyi (Casie) Bao was a winner of the SFU Department of Mathematics Undergraduate Research Prize. Her work involved the development and analysis of compressed sensing-based strategies for correcting corrupted measurements in uncertainty quantification. A draft version of the resulting paper can be found here:
Matt King-Roskamp – an undergraduate student in my group co-supervised with Simone Brugiapaglia – was awarded runner-up in two poster competitions this summer:
- The SFU Science Undergraduate Research Journal (SURJ) poster competition
- The SFU Symposium on Mathematics & Computation
His work, entitled Optimal Sampling Strategies for Compressive Imaging, presents new, theoretically optimal sampling techniques for imaging using compressed sensing.
Simone Brugiapaglia, Rick Archibald and I have been investigating the robustness of constrained l1-minimization to unknown measurement errors. The vast majority of existing theory for such problems requires an a priori bound on the noise level, yet in many, if not most, real-world applications no such bound is available. Our work provides the first recovery guarantees for a large class of practical measurement matrices, thereby extending existing results for subgaussian random matrices. Beyond this, it sheds new light on sparse regularization in practice, addressing the relationship between model fidelity, noise parameter estimation and reconstruction error.
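As a toy illustration of sparse regularization without an a priori noise bound (this is not the analysis in our work), the following sketch recovers a sparse vector from noisy Gaussian measurements via the unconstrained LASSO, solved with plain iterative soft thresholding (ISTA). The regularization parameter lam is a hypothetical choice, not an estimate of the noise level.

```python
import numpy as np

# Toy sparse recovery without an a priori noise bound: unconstrained
# LASSO solved by iterative soft thresholding (ISTA). The parameter
# lam is a hypothetical choice, not an estimated noise level.
rng = np.random.default_rng(2)
m, n, s = 80, 200, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)   # subgaussian measurements
x0 = np.zeros(n)
x0[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
y = A @ x0 + 1e-3 * rng.standard_normal(m)     # noise of unknown level

lam = 0.01
L = np.linalg.norm(A, 2) ** 2                  # Lipschitz const. of gradient
x = np.zeros(n)
for _ in range(3000):
    g = x - A.T @ (A @ x - y) / L              # gradient step
    x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold

err = np.linalg.norm(x - x0)
```

Note that no bound on the noise enters the optimization problem; only the fixed parameter lam does, which is exactly the practical setting the theory needs to address.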