~~NOCACHE~~
{{page>.:indexheader}}

\\
==== Next talk ====

[[en:seminaires:StatP6P7:index|Statistics seminar]]\\
Tuesday May 12, 2026, 10:45 AM, Sophie Germain, room 2018\\
**François Roueff** (Télécom Paris) //Variational Inference with Rényi divergence and importance weights//
\\
Variational Inference (VI) is now a well-established approach in statistical learning. It has given rise to many variants and algorithms for optimizing the variational parameter or the model parameter. The common first ingredient of these variants is to exhibit a convenient lower bound on the model likelihood. We will first present recent work on the inference of the variational parameter based on the monotonic optimization of the alpha-divergence. Many algorithms, however, rely on stochastic gradient descent through reparameterization strategies, in which case controlling the bias and variance of the stochastic gradient is of particular interest. Several variational bounds built on importance-weighting ideas have been proposed to generalize and improve on the Evidence Lower BOund (ELBO) in the context of marginal likelihood optimization, such as the Importance-Weighted Auto-Encoder (IWAE), Variational Rényi (VR) and VR-IWAE bounds. Yet it remains unclear how the joint choice of bound and gradient estimator affects the behavior of the resulting algorithms. In this context, we will present recent work studying reparameterized and doubly-reparameterized gradient estimators tied to the IWAE, VR and VR-IWAE bounds.

{{page>.:info}}
\\
==== Previous talks ====
\\
=== Year 2026 ===
{{page>.:statp6p72026}}
\\
=== Year 2025 ===
{{page>.:statp6p72025}}
\\
=== Year 2024 ===
{{page>.:statp6p72024}}
\\
=== Year 2023 ===
{{page>.:statp6p72023}}
\\
=== Year 2022 ===
{{page>.:statp6p72022}}
\\
=== Year 2021 ===
{{page>.:statp6p72021}}
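For reference, the variational bounds named in the abstract above can be sketched in standard notation; these are the usual definitions from the literature, not taken from the talk itself, and the speaker's conventions may differ. Writing $w_i = p_\theta(x, z_i)/q_\phi(z_i \mid x)$ for the importance weights with $z_1, \dots, z_N \sim q_\phi(\cdot \mid x)$ i.i.d.:

```latex
% ELBO: the classical single-sample lower bound on log p_theta(x)
\mathrm{ELBO}(\theta,\phi)
  = \mathbb{E}_{q_\phi(z\mid x)}\!\left[\log \frac{p_\theta(x,z)}{q_\phi(z\mid x)}\right]

% IWAE bound with N importance samples (tightens as N grows)
\mathcal{L}^{\mathrm{IWAE}}_{N}(\theta,\phi)
  = \mathbb{E}_{z_{1:N}}\!\left[\log \frac{1}{N}\sum_{i=1}^{N} w_i\right]

% Variational Renyi (VR) bound of order alpha (alpha -> 1 recovers the ELBO)
\mathcal{L}^{\mathrm{VR}}_{\alpha}(\theta,\phi)
  = \frac{1}{1-\alpha}\,\log \mathbb{E}_{q_\phi(z\mid x)}\!\left[w^{\,1-\alpha}\right]

% VR-IWAE bound: alpha = 0 gives the IWAE bound; N = 1 gives the ELBO for any alpha != 1
\mathcal{L}^{\mathrm{VR\text{-}IWAE}}_{\alpha,N}(\theta,\phi)
  = \frac{1}{1-\alpha}\,\mathbb{E}_{z_{1:N}}\!\left[\log \frac{1}{N}\sum_{i=1}^{N} w_i^{\,1-\alpha}\right]
```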