Kamélia Daudel
Schedule: 9H30-10H20
Title: Learning with Importance Weighted Variational Inference
Joint work with François Roueff
Abstract: Several variational bounds involving importance weighting ideas have been proposed to generalize and improve on the Evidence Lower BOund (ELBO) in the context of maximum likelihood optimization, such as the Importance Weighted Auto-Encoder (IWAE), Variational Rényi (VR) and VR-IWAE bounds. Learning the parameters of interest using these bounds typically involves stochastic gradient-based variational inference procedures. Yet, it remains unclear how the joint choice of bound and gradient estimator impacts the behavior of the resulting algorithms.
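For orientation, these bounds admit the following standard forms, written here with importance weights $w(z) = p(x,z)/q(z)$ and $z_1,\dots,z_K$ drawn i.i.d. from $q$ (a sketch using common conventions; the talk's notation may differ):
\begin{align*}
\mathrm{ELBO}(q) &= \mathbb{E}_{z \sim q}\big[\log w(z)\big],\\
\mathcal{L}^{\mathrm{VR}}_{\alpha}(q) &= \frac{1}{1-\alpha}\,\log \mathbb{E}_{z \sim q}\big[w(z)^{1-\alpha}\big], \qquad \alpha \neq 1,\\
\mathcal{L}^{\mathrm{VR\text{-}IWAE}}_{\alpha,K}(q) &= \frac{1}{1-\alpha}\,\mathbb{E}\Big[\log \frac{1}{K}\sum_{k=1}^{K} w(z_k)^{1-\alpha}\Big],
\end{align*}
with the IWAE bound recovered from the VR-IWAE bound at $\alpha = 0$ and the ELBO at $K = 1$.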
In this talk, we study reparameterized and doubly-reparameterized gradient estimators tied to the IWAE, VR and VR-IWAE bounds. Our asymptotic analyses provide a unified comparison of these estimators under mild assumptions, allowing us to identify their respective strengths. Additional asymptotic analyses reveal a new perspective on challenging regimes where the variational approximation deteriorates: even in such settings, importance-weighted gradient estimators can still be used to learn the parameters of interest. Consequently, our work motivates further exploration of importance weighting as a principle for designing and analyzing variational inference algorithms. In addition, our proof techniques establish general theoretical tools that apply more broadly within importance weighting and are of independent interest. We complement our theoretical contributions with experiments illustrating our findings.
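As a concrete illustration of the (singly) reparameterized estimator on a toy model, here is a minimal sketch in PyTorch, assuming a prior $z \sim \mathcal{N}(0,1)$, a likelihood $x \mid z \sim \mathcal{N}(z,1)$, and a Gaussian variational family; all names (iwae_bound, mu, log_sigma) are illustrative, and the doubly-reparameterized and VR-IWAE variants studied in the talk are not shown:

import math
import torch

# Reparameterized K-sample IWAE objective for the toy Gaussian model above.
def iwae_bound(x, mu, log_sigma, K=8):
    eps = torch.randn(K)                     # reparameterization: z_k = mu + sigma * eps_k
    z = mu + log_sigma.exp() * eps
    log_p = -0.5 * z**2 - 0.5 * (x - z)**2   # log p(x, z_k), additive constants dropped
    log_q = -0.5 * eps**2 - log_sigma        # log q(z_k), additive constants dropped
    log_w = log_p - log_q                    # log importance weights (dropped constants do not affect gradients)
    return torch.logsumexp(log_w, dim=0) - math.log(K)

mu = torch.zeros((), requires_grad=True)
log_sigma = torch.zeros((), requires_grad=True)
loss = -iwae_bound(torch.tensor(1.0), mu, log_sigma)
loss.backward()                              # reparameterized gradients land in mu.grad, log_sigma.grad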
Jimmy Olsson
Schedule: 10H50-11H40
Title: to be completed.
Julien Stoehr
Schedule: 11H40-12H30
Title: Entropic Mirror Monte Carlo
Abstract: Importance sampling is a well-known Monte Carlo method used to estimate expectations under a target distribution by drawing weighted samples from a proposal distribution. However, for intricate target distributions, such as multi-modal distributions in high-dimensional spaces, the method becomes inefficient unless the proposal distribution is carefully designed.
In this talk, we introduce an adaptive framework for constructing efficient proposal distributions related to the recent framework of mirror descent. Our algorithm enhances exploration of the target distribution by combining global sampling strategies with a delayed weighting mechanism. This delayed weighting is essential, as it enables immediate resampling in regions where the proposal distribution is poorly suited. We establish that the proposed scheme exhibits geometric convergence under mild assumptions.
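To fix notation, the classical self-normalized estimator recalled in the first paragraph can be sketched in a few lines of Python; the target, proposal, and names below are toy illustrations, and the adaptive proposals and delayed weighting mechanism of the talk are not shown:

import numpy as np

rng = np.random.default_rng(0)
log_pi = lambda x: -0.5 * x**2                     # unnormalized toy target (standard normal)
h = lambda x: x**2                                 # test function, E_pi[h] = 1 here

x = rng.normal(loc=0.0, scale=2.0, size=10_000)    # fixed proposal N(0, 2^2)
log_q = -0.5 * (x / 2.0) ** 2 - np.log(2.0)        # proposal log-density, shared constants dropped
log_w = log_pi(x) - log_q                          # log importance weights
w = np.exp(log_w - log_w.max())                    # stabilized, unnormalized weights
estimate = np.sum(w * h(x)) / np.sum(w)            # self-normalized estimate of E_pi[h]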
François Portier
Schedule: 14H00-14H50
Title: Stochastic mirror descent for nonparametric adaptive importance sampling
Joint work with Pascal Bianchi, Bernard Delyon and Victor Priser
Abstract: This paper addresses the problem of approximating an unknown probability distribution with density $f$, which can only be evaluated up to an unknown scaling factor, with the help of a sequential algorithm that produces at each iteration $n\geq 1$ an estimated density $q_n$. The proposed method optimizes the Kullback-Leibler divergence using a mirror descent (MD) algorithm directly on the space of density functions, while a stochastic approximation technique helps to manage the trade-off between algorithm complexity and variability. One of the key innovations of this work is the theoretical guarantee provided for an algorithm with a fixed MD learning rate $\eta \in (0,1)$. The main result is that the sequence $q_n$ converges almost surely to the target density $f$ uniformly on compact sets. Through numerical experiments, we show that fixing the learning rate $\eta \in (0,1)$ significantly improves the algorithm's performance, particularly for multi-modal target distributions, where a small value of $\eta$ increases the chance of finding all modes. Additionally, we propose a particle subsampling method to enhance computational efficiency and compare our method against other approaches through numerical experiments.
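For intuition, idealized entropic mirror descent on $q \mapsto \mathrm{KL}(q \,\|\, f)$ with a fixed step $\eta$ yields a multiplicative update (a sketch of the exact, non-stochastic update; the algorithm of the talk replaces it with a nonparametric stochastic approximation):
\[
q_{n+1}(x) \;\propto\; q_n(x)\,\exp\Big(-\eta \log \frac{q_n(x)}{f(x)}\Big) \;=\; q_n(x)^{1-\eta}\, f(x)^{\eta}, \qquad \eta \in (0,1),
\]
so each iteration geometrically interpolates between the current density and the target, and an unknown scaling factor on $f$ only affects the normalization.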
Yazid Janati
Schedule: 14H50-15H40
Title: Guiding Diffusion Models at Inference
Abstract: Denoising diffusion models have driven significant progress in the field of Bayesian inverse problems. Recent approaches use pre-trained diffusion models as priors to solve a wide range of such problems, only leveraging inference-time compute and thereby eliminating the need to retrain task-specific models on the same dataset. To approximate the posterior of a Bayesian inverse problem, a diffusion model samples from a sequence of intermediate posterior distributions, each with an intractable likelihood function. This work proposes a novel mixture approximation of these intermediate distributions. Since direct gradient-based sampling of these mixtures is infeasible due to intractable terms, we propose a practical method based on Gibbs sampling. We validate our approach through extensive experiments on image inverse problems, utilizing both pixel- and latent-space diffusion priors, as well as on source separation with an audio diffusion model.
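The alternation behind such a Gibbs step can be illustrated on a toy problem: sampling a fixed one-dimensional Gaussian mixture by cycling between the component index and the state. This is a sketch with illustrative names only; the mixtures of the talk approximate intermediate posteriors and contain intractable terms handled differently:

import numpy as np

rng = np.random.default_rng(0)
w = np.array([0.3, 0.7])                    # mixture weights
m = np.array([-2.0, 3.0])                   # component means
s = np.array([1.0, 0.5])                    # component standard deviations

x, samples = 0.0, []
for _ in range(5_000):
    # 1) sample the component index given x (responsibilities)
    log_r = np.log(w) - 0.5 * ((x - m) / s) ** 2 - np.log(s)
    r = np.exp(log_r - log_r.max())
    r /= r.sum()
    k = rng.choice(2, p=r)
    # 2) sample x given the index (exact Gaussian conditional)
    x = rng.normal(m[k], s[k])
    samples.append(x)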
François Bertholom
Schedule: 16H10-16H40
Title: Limit behavior of the alpha-divergence and strong minimality of exponential families
Joint work with Randal Douc and François Roueff.
Abstract: Minimizing the alpha-divergence is a compelling way to approximate an unnormalized density with an exponential family distribution. To establish convergence properties for monotonic alpha-divergence minimization algorithms, it is helpful to understand how the objective behaves. In particular, we would like to know if its level sets are compact. This presentation investigates the behavior of the alpha-divergence as the parameter approaches the boundary of the parameter space, and as its norm goes to infinity. We connect this limit behavior to a key property of the approximating exponential family, which we call “strong minimality”. This property is sufficient to guarantee the compactness of the level sets.
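For reference, one common normalization of the alpha-divergence between densities $p$ and $q$ is (conventions vary, and the talk's may differ by constants):
\[
D_\alpha(p \,\|\, q) \;=\; \frac{1}{\alpha(\alpha-1)}\left(\int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}x \;-\; 1\right), \qquad \alpha \notin \{0,1\},
\]
which recovers $\mathrm{KL}(q \,\|\, p)$ as $\alpha \to 0$ and $\mathrm{KL}(p \,\|\, q)$ as $\alpha \to 1$.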
Yvann Le Fay
Schedule: 16H40-17H10
Title:
Joint work