Opportunities

Overview

I am always looking for talented undergraduate, graduate, and postdoctoral researchers to join my group. I take on students with degrees in mathematics, computer science, engineering, or related disciplines. The main requirement is a strong background in mathematics.

If you are a prospective student or postdoc, please take a look at my publications to get an overview of the type of research my group does, as well as the list of recent projects below.

You are welcome to email me to discuss your application. However, due to the volume of email I receive, I am unable to respond to all inquiries. Please make sure your email clearly explains (i) why you are interested in working with me, and (ii) your relevant experience.

Current Openings

Current openings, broken down by category:

Postdocs

None at this time. Please check back later.

Graduate Students

Prospective MSc and PhD students who wish to work in my group need to apply to the SFU Mathematics graduate program. The application deadline is typically in January of each year.

Undergraduate Students

Mitacs Globalink: I regularly advertise projects through the Mitacs Globalink Research Internship program. This program funds 12-week summer research projects for senior undergraduate students.

SFU USRA/VPR: I regularly host one or more undergraduate projects through the USRA/VPR program (open to students from both SFU and elsewhere). These are normally advertised on the SFU Mathematics website in December of each year.

Recent and Ongoing Student Projects and Theses

Christoffel Adaptive Sampling for Sparse Random Feature Expansions

Generative Priors for Inverse Problems: Bayesian Sampling and Theoretical Guarantees

Efficient machine learning with redundant dictionaries

Active learning for deep operator learning

Overcoming Silent Data Corruptions using Structured Sparsity and Machine Learning

Optimal Active Learning with General Data

Active Learning for Scientific Machine Learning

On the Approximation of Gaussian Sobolev Functionals

Optimal and Efficient Algorithms for Learning High-Dimensional, Banach-Valued Functions from Limited Samples

Optimal approximation of holomorphic functions on unbounded domains

Adaptive Sampling Strategies for Function Approximation in High Dimensions

Unrolled NESTA: Constructing Stable, Accurate and Efficient Neural Networks for Inverse Problems

Topics in combinatorial optimization using quantum computing

Deep Learning Techniques for Inverse Problems in Imaging

Global guarantees from local knowledge: stable and robust recovery of sparse in levels vectors