Yazid Janati El Idrissi


Google Scholar / Github / Email


I am currently a postdoctoral researcher at CMAP, École polytechnique, working with Eric Moulines and Alain Durmus. Previously, I was a PhD candidate in Statistics at Télécom SudParis - Institut Polytechnique de Paris, advised by Sylvain Le Corff (LPSM, Sorbonne Université) and Yohan Petetin (CITI, Télécom SudParis).

My PhD work focused on developing new Monte Carlo algorithms and studying their theoretical properties. I was, and still am, particularly interested in the interplay between Monte Carlo and deep learning methods. At the moment, I work on solving inverse problems with denoising diffusion models.


Research

Entropic Mirror Monte Carlo
YJEL, A. Durmus, S. Le Corff, Y. Petetin, J. Stoehr.
Under review.

Monte Carlo guided Diffusion for Bayesian linear inverse problems
G. Cardoso, YJEL, S. Le Corff, E. Moulines.
International Conference on Learning Representations (ICLR) 2024. Oral, top 1.2%.

State and parameter learning with PaRISian Particle Gibbs
G. Cardoso, YJEL, S. Le Corff, E. Moulines, J. Olsson.
International Conference on Machine Learning (ICML) 2023.

Variance estimation for Sequential Monte Carlo Algorithms: a backward sampling approach
YJEL, S. Le Corff, Y. Petetin.
Accepted for publication in Bernoulli.

NEO: Non Equilibrium Sampling on the Orbit of a Deterministic Transform
A. Thin, YJEL, S. Le Corff, C. Ollion, A. Doucet, A. Durmus, E. Moulines, C. Robert.
Advances in Neural Information Processing Systems (NeurIPS) 34 (2021): 17060-17071.

Structured variational Bayesian inference for Gaussian state-space models with regime switching
Y. Petetin, YJEL, F. Desbouvries.
IEEE Signal Processing Letters 28 (2021): 1953-1957.

Talks and posters

Monte Carlo guided Diffusion for Bayesian linear inverse problems
An iterative scheme for backward Kullback-Leibler minimization
Variance estimation for Sequential Monte Carlo Algorithms
Non Equilibrium Sampling on the Orbit of a Deterministic Transform