{{page>:defs}}

====== A necessary condition for a Markov kernel to be geometrically ergodic ======

This result is taken from Roberts and Tweedie, Theorem 5.1 (Biometrika, 1996): //Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms//.

**Proposition.** Let $P$ be an irreducible Markov kernel with invariant distribution $\pi$ which is not concentrated on a single point, such that $x \mapsto P(x,\{x\})$ is measurable and
$$ \mathrm{ess\,sup}_{x \in \Xset}\, P(x,\{x\}) = 1 \;, $$
where the essential supremum is taken with respect to the measure $\pi$. Then the Markov kernel $P$ is not geometrically ergodic.

==== Proof ====

The proof works by contradiction. Assume that $P$ is geometrically ergodic. Then there exist an $m$-small set $C$, i.e. a set satisfying $P^m(x,A) \geq \epsilon \nu(A)$ for all $x \in C$ and all measurable $A$, where $\epsilon>0$ and $\nu$ is a nontrivial measure, and a constant $\beta>1$ such that $\sup_{x\in C} \PE_x[\beta^{\sigma_C}]<\infty$.

Now, for any $\eta<1$, define $A_\eta=\set{x \in \Xset}{P(x,\{x\})\geq \eta}$. Note that $\pi(A_\eta)>0$ for every $\eta<1$, since the essential supremum in the assumption is taken with respect to $\pi$. Moreover, if $x\in A_\eta$, then for every $j \in \nset$,
\begin{equation*}
\PP_x(X_1=\ldots=X_j=x) \geq \eta^j \;.
\end{equation*}

We make two observations.
  * Since $C$ is a small set, $\sup_{x \in C} P(x,\{x\})<1$. Indeed, if $C\cap A_\eta$ contained two distinct elements $x,x'$ for $\eta$ arbitrarily close to $1$, then $\epsilon \nu(\{x\}^c) \leq P^m(x,\{x\}^c) \leq 1-\eta^m$, and similarly for $x'$; this would make $\nu(\{x\}^c) + \nu(\{x'\}^c)$ arbitrarily small, which is impossible since $\{x\}^c \cup \{x'\}^c = \Xset$ bounds this sum from below by $\nu(\Xset)>0$. (A single point $x_0 \in C$ with $P(x_0,\{x_0\})=1$ is also excluded: $\{x_0\}$ would then be absorbing, contradicting irreducibility since $\pi$ is not concentrated on a single point.)
  * Consequently, we may choose $B=A_\eta$ with $\eta$ sufficiently close to $1$ so that $B \cap C=\emptyset$.

Then, there exist $w_0\in C$ and $k \in \nset$ such that $\PP_{w_0}(X_k \in B, \sigma_C >k)>0$. (Otherwise, the chain started from any point of $C$ would almost surely avoid $B$ during each excursion away from $C$, hence never visit $B$ at all, contradicting irreducibility since $\pi(B)>0$.) Now, for the constant $\beta>1$ fixed above, using the identity $\beta^{\sigma_C}-1=(\beta-1)\sum_{i=0}^{\sigma_C-1}\beta^i$, write
\begin{align*}
\sup_{x\in C} \PE_x[\beta^{\sigma_C}] &\geq \PE_{w_0}[\beta^{\sigma_C}-1]+1=(\beta-1) \sum_{i=0}^{\infty} \beta^i \PP_{w_0}(\sigma_C > i ) +1\\
& \geq (\beta-1) \sum_{j=0}^{\infty} \beta^{k+j} \PP_{w_0}(X_k\in B,\sigma_C > k+j ) +1 \\
& \geq (\beta-1) \sum_{j=0}^{\infty} \beta^{k+j} \PP_{w_0}(X_k\in B,\sigma_C > k, X_k=X_{k+1}= \ldots=X_{k+j} ) +1 \\
& \geq (\beta-1) \sum_{j=0}^{\infty} \beta^{k+j} \PP_{w_0}(X_k\in B,\sigma_C > k)\, \eta^{j} +1 \;,
\end{align*}
where the third inequality uses $B \cap C = \emptyset$ (staying at $X_k \in B$ keeps the chain outside $C$) and the fourth uses the Markov property together with $P(x,\{x\}) \geq \eta$ for $x \in B$. The last series diverges as soon as $\beta\eta>1$; choosing $\eta$ close enough to $1$ that both $B\cap C=\emptyset$ and $\beta\eta>1$ hold yields a contradiction with $\sup_{x\in C} \PE_x[\beta^{\sigma_C}]<\infty$, which completes the proof.
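
==== Numerical illustration ====

A classical situation in which the hypothesis of the proposition holds is the independence sampler whose proposal has lighter tails than the target: the rejection probability $P(x,\{x\})$ then approaches $1$ in the tails, so the proposition rules out geometric ergodicity. The Python sketch below is not from the paper; the standard Cauchy target, standard Normal proposal and the helper name ''rejection_prob'' are illustrative choices. It estimates $P(x,\{x\}) = 1 - \int \min\{1, w(y)/w(x)\}\, q(y)\, \mathrm{d}y$, with $w = \pi/q$, by Monte Carlo at increasingly remote states $x$.

<code python>
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative (assumed) setup: target pi = standard Cauchy (heavy tails),
# proposal q = standard Normal (light tails), so w = pi/q is unbounded.
log_pi = stats.cauchy.logpdf   # log target density
log_q = stats.norm.logpdf      # log proposal density

def rejection_prob(x, n_prop=100_000):
    """Monte Carlo estimate of P(x,{x}) = 1 - E_q[min(1, w(Y)/w(x))]
    for the independence sampler (hypothetical helper, for illustration only)."""
    y = rng.standard_normal(n_prop)                              # proposals Y ~ q
    log_ratio = (log_pi(y) - log_q(y)) - (log_pi(x) - log_q(x))  # log w(Y) - log w(x)
    accept = np.exp(np.minimum(0.0, log_ratio))                  # min(1, w(Y)/w(x))
    return 1.0 - accept.mean()

for x in [0.0, 2.0, 5.0, 10.0, 20.0]:
    print(f"x = {x:5.1f}   estimated P(x,{{x}}) ~ {rejection_prob(x):.4f}")
</code>

As $|x|$ grows, the estimated rejection probability climbs towards $1$, matching the condition $\mathrm{ess\,sup}\, P(x,\{x\}) = 1$ under which the proposition excludes geometric ergodicity.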