I am very pleased to say that my postdoc Simone Brugiapaglia has recently accepted a tenure-track position at Concordia University. Congratulations!
Simone was also featured in an in-depth interview conducted by PIMS. Check it out here:
I am pleased to announce the addition of three new members to my group this fall:
Welcome!
I am pleased to announce we will be organizing the 2020 Foundations of Computational Mathematics conference at Simon Fraser University. My organizing team and I are looking forward to hosting the conference attendees in Vancouver in June 2020. Stay tuned for more information!
Multivariate polynomials are an excellent means of approximating high-dimensional functions on tensor-product domains. But what about approximation on the irregular domains that commonly arise in applications? In our new paper, Daan Huybrechs and I tackle this question using tools from frame theory and approximation theory:
Approximating smooth, multivariate functions on irregular domains
We establish a series of results on approximation rates and sample complexity, deriving bounds that scale well with dimension in a variety of cases.
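For readers curious about the basic mechanism, here is a minimal sketch of the frame idea in two dimensions: restrict a tensor-product Legendre basis to sample points in an irregular domain (the unit disc here, my choice purely for illustration) and solve the resulting ill-conditioned least-squares problem with SVD truncation. This is a toy illustration of the general approach, not the algorithm from the paper.

```python
# Sketch: least-squares polynomial approximation on an irregular domain.
# Illustrates the general frame idea (a tensor-product basis restricted
# to an irregular subdomain), not the specific method of the paper.
import numpy as np

rng = np.random.default_rng(0)

# Irregular domain: the unit disc, viewed as a subset of [-1,1]^2.
def in_domain(x, y):
    return x**2 + y**2 <= 1.0

# A smooth target function on the disc.
f = lambda x, y: np.exp(x * y)

# Draw random samples from the bounding box; keep those in the domain.
pts = rng.uniform(-1, 1, size=(4000, 2))
mask = in_domain(pts[:, 0], pts[:, 1])
x, y = pts[mask, 0], pts[mask, 1]

# Tensor-product Legendre basis of degree <= 10 in each variable,
# restricted to the domain samples. On the disc this restricted system
# is a frame rather than a basis, so the least-squares problem is
# ill-conditioned; lstsq's rcond performs SVD-truncation regularization.
deg = 10
A = np.polynomial.legendre.legvander2d(x, y, [deg, deg])
coef, *_ = np.linalg.lstsq(A, f(x, y), rcond=1e-8)

# Measure the error at fresh points inside the domain.
test = rng.uniform(-1, 1, size=(2000, 2))
tm = in_domain(test[:, 0], test[:, 1])
xt, yt = test[tm, 0], test[tm, 1]
At = np.polynomial.legendre.legvander2d(xt, yt, [deg, deg])
err = np.max(np.abs(At @ coef - f(xt, yt)))
print(f"max error on {tm.sum()} test points: {err:.2e}")
```

The truncation parameter plays the role of the regularization needed whenever one works with a frame rather than a basis; the paper's analysis concerns how accuracy and sample complexity behave in this setting.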
Matt King-Roskamp – an undergraduate student in my group co-supervised with Simone Brugiapaglia – was awarded runner-up in the poster competition at the recent SIAM PNW Conference for his poster Optimal Sampling Strategies for Compressive Imaging. He beat out a competitive field of graduate and undergraduate students from across the Pacific Northwest region.
Alex Bastounis, Anders C. Hansen and I have an article in this month’s edition of SIAM News:
From Global to Local: Getting More from Compressed Sensing
In it we explain how recent progress in the theory of compressed sensing allows one to significantly enhance the performance of sparse recovery techniques in imaging applications.
My former undergraduate student Anyi (Casie) Bao was a winner of the SFU Department of Mathematics Undergraduate Research Prize. Her work involved the development and analysis of compressed sensing-based strategies for correcting corrupted measurements in uncertainty quantification. A draft version of the resulting paper can be found here:
Compressed sensing with sparse corruptions: Fault-tolerant sparse collocation approximations
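For context, a common way to formalize this kind of problem is to treat the corruptions as an additional sparse unknown and penalize both unknowns in the l1 norm. The sketch below illustrates that generic idea, not the paper's specific collocation setup, and the weight lam is an assumption of mine rather than a value from the paper.

```python
# Sketch: jointly recovering a sparse signal x and sparse corruptions e
# from y = A x + e, via l1 minimization on both unknowns. A generic toy
# illustration; the weight lam is an assumption, not from the paper.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)
m, n, s, k = 80, 200, 5, 4   # measurements, dimension, sparsity, corruptions

A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
e0 = np.zeros(m)
e0[rng.choice(m, k, replace=False)] = rng.standard_normal(k)
y = A @ x0 + e0

x, e = cp.Variable(n), cp.Variable(m)
lam = 1.0   # illustrative weight balancing the two l1 terms
prob = cp.Problem(cp.Minimize(cp.norm1(x) + lam * cp.norm1(e)),
                  [A @ x + e == y])
prob.solve()
print("signal error:    ", np.linalg.norm(x.value - x0))
print("corruption error:", np.linalg.norm(e.value - e0))
```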
Matt King-Roskamp – an undergraduate student in my group co-supervised with Simone Brugiapaglia – was awarded runner-up in two poster competitions this summer:
His work, entitled Optimal Sampling Strategies for Compressive Imaging, presents new, theoretically optimal sampling techniques for imaging using compressed sensing.
Simone Brugiapaglia, Rick Archibald and I have been investigating the robustness of constrained l1 minimization to unknown measurement errors. The vast majority of existing theory for such problems requires an a priori bound on the noise level, yet in many, if not most, real-world applications no such bound is available. Our work provides the first such recovery guarantees for a large class of practical measurement matrices, extending existing results for subgaussian random matrices. It also sheds new light on sparse regularization in practice, addressing the relationship between model fidelity, noise parameter estimation and reconstruction error.
Simone will be presenting this work at SPARS2017 and SampTA 2017. In the meantime our conference paper is available here:
Recovery guarantees for compressed sensing with unknown errors
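To make the issue concrete: the problem in question is quadratically-constrained basis pursuit, min ||x||_1 subject to ||Ax - y||_2 <= eta, where eta is the noise parameter that existing theory assumes faithfully bounds the true noise level. The toy sketch below (my own illustration, not code from the paper) runs the same recovery with eta set below, at and above the true noise norm.

```python
# Sketch: quadratically-constrained basis pursuit,
#     min ||x||_1  subject to  ||A x - y||_2 <= eta,
# solved for several values of eta to illustrate the question studied in
# the paper: what happens when eta misestimates the true noise level.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(2)
m, n, s = 60, 150, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
noise = 0.01 * rng.standard_normal(m)
y = A @ x0 + noise
true_eta = np.linalg.norm(noise)

for eta in [0.1 * true_eta, true_eta, 10 * true_eta]:
    x = cp.Variable(n)
    prob = cp.Problem(cp.Minimize(cp.norm1(x)),
                      [cp.norm(A @ x - y, 2) <= eta])
    prob.solve()
    err = np.linalg.norm(x.value - x0)
    print(f"eta = {eta:.3e} (true noise {true_eta:.3e})  error = {err:.3e}")
```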
Simone Brugiapaglia, Clayton Webster and I have written a chapter that surveys recent trends in high-dimensional approximation using the theory and techniques of compressed sensing:
Polynomial approximation of high-dimensional functions via compressed sensing
In it we demonstrate how the structured sparsity of polynomial coefficients of high-dimensional functions can be exploited via weighted l1 minimization techniques. This yields approximation algorithms whose sample complexities are essentially independent of the dimension d. Hence the curse of dimensionality is mitigated to a significant extent.
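As a toy illustration of the weighted l1 mechanism (in one dimension for readability; the weights below, the uniform norms of the orthonormal Legendre basis functions, are one common choice in this literature and an assumption on my part, not necessarily those used in the chapter):

```python
# Sketch: weighted l1 minimization for recovering the Legendre
# coefficients of a smooth function from few random samples. A 1-D toy
# version of the weighted l1 approach; the weights are an assumption.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(3)
n, m = 64, 30                          # basis size, number of samples
f = lambda t: 1.0 / (1.0 + 25 * t**2)  # Runge function

# Legendre basis, orthonormal w.r.t. the uniform probability measure
# on [-1,1], sampled at uniformly distributed random points.
t = rng.uniform(-1, 1, m)
scale = np.sqrt(2 * np.arange(n) + 1)  # normalization factors
A = np.polynomial.legendre.legvander(t, n - 1) * scale
y = f(t)

# Weights w_j = ||psi_j||_inf = sqrt(2j+1) for orthonormal Legendre.
w = scale
c = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(cp.multiply(w, c))),
                  [cp.norm(A @ c - y, 2) <= 1e-6])
prob.solve()

tt = np.linspace(-1, 1, 1000)
At = np.polynomial.legendre.legvander(tt, n - 1) * scale
print("max approximation error:", np.max(np.abs(At @ c.value - f(tt))))
```

The weighting penalizes higher-degree coefficients more heavily, which is what steers the minimizer towards the lower-order, structured sparsity patterns that smooth functions exhibit.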