If $X$ follows a standard normal distribution, then, for all $t \ge 0$,
\[
\mathbb{P}(X \ge t) \le e^{-t^2/2} .
\]
Assume that $X_1, \dots, X_n$ are iid. Denote $M_n = \max_{1 \le i \le n} X_i$. In the case where the distribution of $X_1$ is standard normal, a union bound yields, for all $t \ge 0$,
\begin{equation}\label{eq:bound:max}
\mathbb{P}(M_n \ge t) \le n\, \mathbb{P}(X_1 \ge t) \le n\, e^{-t^2/2} .
\end{equation}
The bound is not bad in $t$ but not very nice in $n$.
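As a quick numerical sanity check of the union bound, here is a small Monte Carlo sketch; the choices $n = 10$, $t = 3$ and the number of trials are arbitrary illustration parameters.

```python
import math
import random

random.seed(0)

n, t, trials = 10, 3.0, 20000

# Monte Carlo estimate of P(M_n >= t), where M_n is the max of n iid
# standard Gaussian random variables.
hits = sum(
    max(random.gauss(0.0, 1.0) for _ in range(n)) >= t
    for _ in range(trials)
)
freq = hits / trials

# Union bound: P(M_n >= t) <= n * exp(-t^2 / 2).
bound = n * math.exp(-t * t / 2)

print(f"empirical P(M_n >= t) = {freq:.4f}, union bound = {bound:.4f}")
assert freq <= bound
```

As expected, the empirical frequency sits well below the bound, which is loose by roughly a factor of the Gaussian density's polynomial prefactor.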
Let $X_1, \dots, X_n$ be iid standard gaussian random variables. Then, by Jensen's inequality, for all $\lambda > 0$,
\[
e^{\lambda \mathbb{E}[M_n]} \le \mathbb{E}\big[e^{\lambda M_n}\big] \le \sum_{i=1}^n \mathbb{E}\big[e^{\lambda X_i}\big] = n\, e^{\lambda^2/2} .
\]
Taking the $\log$ and dividing by $\lambda$, we get:
\[
\mathbb{E}[M_n] \le \frac{\log n}{\lambda} + \frac{\lambda}{2} .
\]
Choosing $\lambda$ such that $\log n / \lambda = \lambda / 2$, that is $\lambda = \sqrt{2 \log n}$, yields for $n \ge 1$,
\[
\mathbb{E}[M_n] \le \sqrt{2 \log n} .
\]
With a similar argument, we can show that
\[
\mathbb{E}\Big[ \max_{1 \le i \le n} |X_i| \Big] \le \sqrt{2 \log (2n)} .
\]
Finally, a Markov inequality yields for all $t \ge 0$,
\[
\mathbb{P}\big( M_n \ge \sqrt{2 \log n} + t \big) \le e^{-\lambda (\sqrt{2 \log n} + t)}\, \mathbb{E}\big[e^{\lambda M_n}\big] \le e^{-t \sqrt{2 \log n}} ,
\]
which is better than the previous bound \eqref{eq:bound:max} wrt $n$ but dramatic wrt $t$…
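The expectation bound above can also be checked numerically; this is a sketch with arbitrary choices $n = 100$ and $5000$ Monte Carlo trials.

```python
import math
import random

random.seed(0)

n, trials = 100, 5000

# Monte Carlo estimate of E[M_n], the expected max of n iid standard Gaussians.
mean_max = sum(
    max(random.gauss(0.0, 1.0) for _ in range(n)) for _ in range(trials)
) / trials

# The bound sqrt(2 log n), about 3.035 for n = 100.
bound = math.sqrt(2 * math.log(n))

print(f"empirical E[M_n] = {mean_max:.3f}, bound sqrt(2 log n) = {bound:.3f}")
assert mean_max <= bound
```

For $n = 100$ the empirical mean is around $2.5$, so the bound $\sqrt{2 \log 100} \approx 3.04$ is correct but not tight at this scale.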
Let $Y$ be a non-negative random variable on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ and assume that there exists a constant $c \ge 1$ such that for all $\lambda > 0$,
\[
\mathbb{E}\big[ e^{\lambda Y} \big] \le c\, e^{\lambda^2/2} .
\]
Then,
\[
\mathbb{E}[Y] \le \sqrt{2 \log c} .
\]
By Jensen's inequality, for all $\lambda > 0$, $\mathbb{E}[Y] \le \log c / \lambda + \lambda / 2$. The proof is completed by noting that
\[
\inf_{\lambda > 0} \Big\{ \frac{\log c}{\lambda} + \frac{\lambda}{2} \Big\} = \sqrt{2 \log c} .
\]
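The value of the infimum can be verified numerically; this is a sketch where the constant $c = 5$ and the grid for $\lambda$ are arbitrary illustration choices.

```python
import math

c = 5.0  # an arbitrary constant c >= 1 for illustration
target = math.sqrt(2 * math.log(c))

# Minimise f(lam) = log(c)/lam + lam/2 over a fine grid of lam > 0.
lams = [0.01 * (k + 1) for k in range(1000)]  # 0.01, 0.02, ..., 10.00
f_min = min(math.log(c) / lam + lam / 2 for lam in lams)

print(f"grid minimum = {f_min:.6f}, sqrt(2 log c) = {target:.6f}")
# The grid minimum can only overshoot the true infimum, and only slightly,
# since the minimiser lam* = sqrt(2 log c) lies inside the grid range.
assert target - 1e-9 <= f_min < target + 1e-3
```

The minimiser is $\lambda^\ast = \sqrt{2 \log c}$, at which both terms $\log c / \lambda$ and $\lambda / 2$ are equal.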
It may seem a bit convoluted to bound $\mathbb{E}[M_n]$ using a bound of $\mathbb{E}[e^{\lambda M_n}]$. I tried using a direct proof. For any $\delta > 0$,
\begin{align*}
\mathbb{E}[M_n] &\le \delta + \int_\delta^\infty \mathbb{P}(M_n \ge t)\, \mathrm{d}t \\
&\le \delta + n \int_\delta^\infty e^{-t^2/2}\, \mathrm{d}t \\
&\le \delta + \frac{n}{\delta}\, e^{-\delta^2/2} ,
\end{align*}
and choosing $\delta = \sqrt{2 \log n}$ gives
\[
\mathbb{E}[M_n] \le \sqrt{2 \log n} + \frac{1}{\sqrt{2 \log n}} .
\]
The bound is less sharp because on the second line, we only apply a rough bound on the survival function of a Gaussian distribution. And not surprisingly, the resulting bound is less sharp than the previous one because: $\sqrt{2 \log n} \le \sqrt{2 \log n} + 1/\sqrt{2 \log n}$.
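The comparison between the two routes is deterministic and easy to tabulate; this sketch simply evaluates both bounds for a few arbitrary values of $n$.

```python
import math

# Compare the two bounds on E[M_n]: sqrt(2 log n) (Jensen/Chernoff route)
# versus sqrt(2 log n) + 1/sqrt(2 log n) (direct integration of the tail).
for n in (10, 100, 1000, 10**6):
    chernoff = math.sqrt(2 * math.log(n))
    direct = chernoff + 1 / chernoff
    print(f"n = {n:>7}: {chernoff:.3f} <= {direct:.3f}")
    assert chernoff <= direct
    # The gap between the two bounds is exactly 1/sqrt(2 log n),
    # which vanishes as n grows.
    assert abs((direct - chernoff) - 1 / math.sqrt(2 * math.log(n))) < 1e-12
```

So the direct proof loses only a vanishing additive term of order $1/\sqrt{\log n}$.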
Let $(M_n)_{n \ge 0}$ be a square integrable $(\mathcal{F}_n)$-martingale. Then, for all $\lambda > 0$,
\[
\mathbb{P}\Big( \max_{0 \le k \le n} |M_k| \ge \lambda \Big) \le \frac{\mathbb{E}[M_n^2]}{\lambda^2} .
\]
Let $T = \inf\{ k \ge 0 : |M_k| \ge \lambda \}$ with the convention that $\inf \emptyset = +\infty$. Then,
\begin{equation}\label{eq:kolm:one}
\mathbb{P}\Big( \max_{0 \le k \le n} |M_k| \ge \lambda \Big) = \mathbb{P}(T \le n) \le \frac{\mathbb{E}\big[ M_{T \wedge n}^2 \big]}{\lambda^2} .
\end{equation}
We first rewrite the rhs using that $(M_{T \wedge n})_{n \ge 0}$ is also a $(\mathcal{F}_n)$-martingale. To see this last property, write $M_{T \wedge n} = M_0 + \sum_{k=1}^n \mathbb{1}\{T \ge k\} (M_k - M_{k-1})$, which implies
\[
\mathbb{E}\big[ M_{T \wedge (n+1)} - M_{T \wedge n} \,\big|\, \mathcal{F}_n \big] = \mathbb{1}\{T \ge n+1\}\, \mathbb{E}\big[ M_{n+1} - M_n \,\big|\, \mathcal{F}_n \big] = 0 .
\]
Now, the rhs of \eqref{eq:kolm:one} can be written using
\[
\mathbb{E}[M_n^2] = \mathbb{E}\big[ M_{T \wedge n}^2 \big] + 2\, \mathbb{E}\big[ M_{T \wedge n} (M_n - M_{T \wedge n}) \big] + \mathbb{E}\big[ (M_n - M_{T \wedge n})^2 \big] \ge \mathbb{E}\big[ M_{T \wedge n}^2 \big] ,
\]
where the cross term vanishes by the martingale property of $(M_{T \wedge n})_{n \ge 0}$ and the last term is non-negative. Combining this with \eqref{eq:kolm:one} completes the proof.
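Kolmogorov's maximal inequality can be illustrated numerically on the simplest square-integrable martingale; this sketch uses a simple symmetric random walk, with $n = 100$, $\lambda = 25$ and the trial count as arbitrary choices.

```python
import random

random.seed(0)

n, lam, trials = 100, 25.0, 20000

# (M_k) is a simple symmetric random walk (iid +/-1 increments): a
# square-integrable martingale with E[M_n^2] = n.
hits = 0
for _ in range(trials):
    m, running_max = 0, 0
    for _ in range(n):
        m += random.choice((-1, 1))
        running_max = max(running_max, abs(m))
    hits += running_max >= lam
freq = hits / trials

# Kolmogorov's maximal inequality: P(max |M_k| >= lam) <= E[M_n^2] / lam^2.
bound = n / lam**2

print(f"empirical P(max |M_k| >= lam) = {freq:.4f}, bound = {bound:.4f}")
assert freq <= bound
```

Here $\mathbb{E}[M_n^2]/\lambda^2 = 100/625 = 0.16$, while the crossing event is much rarer in practice, since $\lambda = 25$ is $2.5$ standard deviations of $M_n$.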
A similar stopping argument yields a maximal inequality for non-negative supermartingales. Let $(X_n)_{n \ge 0}$ be a non-negative $(\mathcal{F}_n)$-supermartingale and let $\lambda > 0$. Define $T = \inf\{ k \ge 0 : X_k \ge \lambda \}$ with the convention that $\inf \emptyset = +\infty$. Then,
\[
\lambda\, \mathbb{P}(T \le n) \le \mathbb{E}\big[ X_{T \wedge n} \mathbb{1}\{T \le n\} \big] \le \mathbb{E}\big[ X_{T \wedge n} \big] \le \mathbb{E}[X_0] ,
\]
where we used in the second inequality that $X_{T \wedge n} \mathbb{1}\{T > n\}$ is non-negative and in the third inequality that $(X_{T \wedge n})_{n \ge 0}$ is a supermartingale. The proof then follows by letting $n$ go to infinity, which gives
\[
\lambda\, \mathbb{P}\big( \exists\, n \ge 0 : X_n \ge \lambda \big) \le \mathbb{E}[X_0] .
\]
To check the supermartingale property of $(X_{T \wedge n})_{n \ge 0}$, write
\[
\mathbb{E}\big[ X_{T \wedge (n+1)} - X_{T \wedge n} \,\big|\, \mathcal{F}_n \big] = \mathbb{1}\{T \ge n+1\}\, \mathbb{E}\big[ X_{n+1} - X_n \,\big|\, \mathcal{F}_n \big] \le 0 ,
\]
where we used in the last inequality that $\mathbb{1}\{T \ge n+1\}$ is non-negative. The proof is completed.
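The supermartingale maximal inequality can also be checked by simulation; this is a sketch with a hypothetical example of a non-negative supermartingale, $X_k = \exp(S_k - 0.6\,k)$ for $S_k$ a sum of iid standard Gaussians (supermartingale since $\mathbb{E}[e^{Z - 0.6}] = e^{-0.1} < 1$), and arbitrary parameters $n = 50$, $\lambda = 10$.

```python
import math
import random

random.seed(0)

n, lam, trials = 50, 10.0, 20000

# Hypothetical non-negative supermartingale: X_k = exp(S_k - 0.6 k), X_0 = 1,
# where S_k is a sum of k iid standard Gaussians.
hits = 0
for _ in range(trials):
    s, sup_x = 0.0, 1.0
    for k in range(1, n + 1):
        s += random.gauss(0.0, 1.0)
        sup_x = max(sup_x, math.exp(s - 0.6 * k))
    hits += sup_x >= lam
freq = hits / trials

# Maximal inequality: P(exists k: X_k >= lam) <= E[X_0] / lam = 1 / lam.
bound = 1.0 / lam

print(f"empirical P(max X_k >= lam) = {freq:.4f}, bound = {bound:.4f}")
assert freq <= bound
```

The extra drift $-0.6\,k$ (rather than the martingale case $-k/2$) gives the bound some visible slack in this finite-horizon simulation.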