In this talk, we study reparameterized and doubly-reparameterized gradient estimators associated with the importance-weighted auto-encoder (IWAE), Variational Rényi (VR), and VR-IWAE bounds. Our asymptotic analyses provide a unified comparison of these estimators under mild assumptions, allowing us to identify their respective strengths. Further asymptotic analyses shed new light on challenging regimes in which the variational approximation deteriorates: even in such settings, importance-weighted gradient estimators can still be used to learn the parameters of interest. Our work thus motivates further exploration of importance weighting as a principle for designing and analyzing variational inference algorithms. In addition, our proof techniques establish general theoretical tools that apply more broadly to importance-weighted methods and are of independent interest. We complement our theoretical contributions with experiments illustrating our findings.
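
For context, the objective at the center of this line of work is the VR-IWAE bound, which interpolates between the bounds named above. As a sketch, in notation standard in this literature (the symbols \theta, \phi, N, and \alpha below are ours, not necessarily those of the talk), it reads

\mathcal{L}^{(\alpha)}_{N}(\theta, \phi; x) = \mathbb{E}_{z_{1:N} \,\overset{\mathrm{i.i.d.}}{\sim}\, q_\phi(\cdot \mid x)} \left[ \frac{1}{1-\alpha} \log \frac{1}{N} \sum_{i=1}^{N} w_{\theta,\phi}(z_i; x)^{1-\alpha} \right], \qquad w_{\theta,\phi}(z; x) = \frac{p_\theta(x, z)}{q_\phi(z \mid x)},

with \alpha \in [0, 1). Setting \alpha = 0 recovers the IWAE bound, taking N = 1 recovers the standard ELBO, and letting N \to \infty recovers the VR bound of order \alpha.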
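
To make the estimators concrete, the following is a minimal, illustrative Python (PyTorch) sketch of the reparameterized gradient estimator of the VR-IWAE bound for a diagonal Gaussian variational family. It is a toy under stated assumptions, not the implementation used in the talk; the names encode, log_joint, N, and alpha are placeholders. Setting alpha = 0 yields the reparameterized IWAE estimator; the doubly-reparameterized variants differ in how gradients flow through the weights (e.g., via stop-gradients) and are not shown.

    import math
    import torch

    def vr_iwae_bound(x, encode, log_joint, N=8, alpha=0.5):
        # Variational parameters of a diagonal Gaussian q_phi(z | x).
        mu, log_sigma = encode(x)
        sigma = log_sigma.exp()
        # Reparameterization trick: z_i = mu + sigma * eps_i with eps_i ~ N(0, I),
        # so gradients with respect to phi flow through the samples themselves.
        eps = torch.randn(N, *mu.shape)
        z = mu + sigma * eps
        # Log importance weights: log w_i = log p_theta(x, z_i) - log q_phi(z_i | x).
        log_q = torch.distributions.Normal(mu, sigma).log_prob(z).sum(-1)
        log_w = log_joint(x, z) - log_q   # log_joint is assumed to return shape (N,)
        # One Monte Carlo sample of the VR-IWAE objective, for alpha != 1:
        # (1 / (1 - alpha)) * log( (1/N) * sum_i w_i^{1 - alpha} ).
        return (torch.logsumexp((1.0 - alpha) * log_w, dim=0) - math.log(N)) / (1.0 - alpha)

Backpropagating through the returned value (e.g., calling .backward() on its negation) produces the reparameterized gradient estimator in both the model parameters theta (through log_joint) and the variational parameters phi (through encode and the reparameterized samples).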