Welcome to Randal Douc's wiki

A collaborative site on maths but not only!

  * **Joint work**
</note>
  * **Abstract**: Sequential Monte Carlo (SMC) methods are general iterative stochastic algorithms that generate samples from a sequence of probability distributions. They proceed by recursively moving samples through the bridge via three crucial steps: importance sampling, resampling, and rejuvenation. The last step usually consists of running independent Markov chains and keeping only the final states for the next iterations.

Waste-free SMC [Dau and Chopin, 2022] avoids discarding the intermediary samples.

We establish a finite-sample complexity bound for waste-free SMC; our proof generalises the approach of Marion et al. [2023]. This bound prescribes sufficient conditions on the size of the particle system for the sampler to return estimated expectations $\hat{\pi}_T(f)$ such that $|\hat{\pi}_T(f) - \pi_T(f)| \leq \varepsilon$ with probability at least $1 - \eta$, for any bounded test function $f$.

We demonstrate that waste-free SMC enjoys lower complexity than standard SMC, differing by a logarithmic factor $O(\log(T\eta^{-1}\varepsilon^{-2}))$.

We show that our analysis allows practitioners to tune the particle budget according to the specific statistical objective at hand. In particular, when the goal is to control errors for expectation estimates under $\hat{\pi}_T$ of bounded functions, the sampler can be run with an intentionally reduced number of particles in earlier iterations while still preserving finite-sample guarantees on the estimates. In doing so, we highlight an important distinction between these two objectives: estimating normalisation constants is more challenging than estimating expectations of bounded functions.
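The three-step recursion described above can be illustrated on a toy problem. The sketch below is our own minimal example, not code from the paper: a sequence of 1-D Gaussian targets $N(0, \sigma_t^2)$ with a random-walk Metropolis rejuvenation kernel; the function names, tempering schedule, and step sizes are illustrative assumptions. The waste-free idea appears in step 3, where all intermediate chain states are kept rather than only the final ones.

```python
import numpy as np

def log_target(x, sigma):
    # Unnormalised log-density of the toy target N(0, sigma^2)
    return -0.5 * x**2 / sigma**2

def rw_metropolis(x, sigma, step, rng):
    # One random-walk Metropolis step, vectorised over particles
    prop = x + step * rng.standard_normal(x.shape)
    accept = np.log(rng.random(x.shape)) < log_target(prop, sigma) - log_target(x, sigma)
    return np.where(accept, prop, x)

def waste_free_smc(sigmas, n_particles=2000, chain_len=10, seed=0):
    # Toy waste-free SMC sketch (our illustration, not the authors' code)
    rng = np.random.default_rng(seed)
    m = n_particles // chain_len              # number of resampled "seed" particles
    x = sigmas[0] * rng.standard_normal(n_particles)
    for s_prev, s_next in zip(sigmas[:-1], sigmas[1:]):
        # 1) importance sampling: reweight from pi_{t-1} to pi_t
        log_w = log_target(x, s_next) - log_target(x, s_prev)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # 2) resampling: keep only m seeds (multinomial resampling)
        seeds = rng.choice(x, size=m, p=w)
        # 3) rejuvenation: run chains of length chain_len and keep ALL
        #    intermediate states (waste-free), not only the final ones
        states = [seeds]
        for _ in range(chain_len - 1):
            states.append(rw_metropolis(states[-1], s_next, 0.5 * s_next, rng))
        x = np.concatenate(states)            # back to n_particles samples
    return x

samples = waste_free_smc(sigmas=[3.0, 2.0, 1.5, 1.0])
```

Here the final `samples` approximate draws from $N(0,1)$; a standard SMC sampler would instead discard the first `chain_len - 1` states of each chain and rejuvenate all `n_particles` particles.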
  
</hidden>
  
  
world/std2025_abstract.txt · Last modified: 2025/11/21 17:07 by rdouc