This is an interesting application of a symmetrisation argument. This article is almost directly taken from Terence Tao's webpage (Theorem 8 there; see also Theorem 2.1.12 in "Topics in Random Matrix Theory" by Terence Tao).
Theorem. Let $X_1,\dots,X_n$ be iid such that $X_i \sim \mathcal{N}(0,1)$ and set $X = (X_1,\dots,X_n)$. Let $F:\mathbb{R}^n \to \mathbb{R}$ be a $1$-Lipschitz function with respect to the Euclidean distance on $\mathbb{R}^n$, i.e., $|F(x)-F(y)| \le \|x-y\|_2$ for all $x,y \in \mathbb{R}^n$. Then for every $\lambda>0$,
\begin{equation}\label{eq:two}
\mathbb{P}\big(|F(X)-\mathbb{E} F(X)| \ge \lambda\big) \le 2\, e^{-2\lambda^2/\pi^2}.
\end{equation}
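For instance (a standard illustration, added here and not part of the original note), the Euclidean norm $F(x) = \|x\|_2$ is $1$-Lipschitz by the triangle inequality, so the theorem gives
$$ \mathbb{P}\big(\big|\, \|X\|_2 - \mathbb{E}\|X\|_2 \,\big| \ge \lambda\big) \le 2\, e^{-2\lambda^2/\pi^2}, $$
a bound which does not depend on the dimension $n$. Since $\mathbb{E}\|X\|_2 \le \sqrt{\mathbb{E}\|X\|_2^2} = \sqrt{n}$ by Jensen (and in fact $\mathbb{E}\|X\|_2 = \sqrt{n} + O(1)$), the norm of a standard Gaussian vector concentrates around $\sqrt{n}$ with fluctuations of order $1$.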
The original argument was given by Maurey and Pisier. We only prove the assertion under the more restrictive assumption that $F$ is smooth, in which case the $1$-Lipschitz condition amounts to $\|\nabla F(x)\| \le 1$ for all $x \in \mathbb{R}^n$. Let $Y$ be an independent copy of $X$; then for every $t>0$,
$$ \mathbb{E}\, e^{t(F(X)-\mathbb{E} F(X))} = \mathbb{E}\, e^{t(F(X)-\mathbb{E} F(Y))} \le \mathbb{E}\, e^{t(F(X)-F(Y))}, $$
where the last inequality follows from the Jensen inequality. We can see that the symmetrisation argument comes from a Jensen inequality, and this now allows us to use an infinitesimal writing of the difference $F(X)-F(Y)$: setting, for $\theta \in [0,\pi/2]$,
$$ X_\theta := Y\cos\theta + X\sin\theta, \qquad X'_\theta := \frac{d X_\theta}{d\theta} = -Y\sin\theta + X\cos\theta, $$
we have
$$ F(X)-F(Y) = \int_0^{\pi/2} \frac{d}{d\theta} F(X_\theta)\, d\theta = \int_0^{\pi/2} \nabla F(X_\theta)\cdot X'_\theta\, d\theta, $$
where, for each fixed $\theta$, $X_\theta$ and $X'_\theta$ are standard Gaussian vectors and independent! This is an amazing trick: we do not use the classical linear interpolation but a "polar" interpolation, and this yields a product of uncorrelated terms (and therefore independent ones, since we are in the context of Gaussian vectors). We again make use of Jensen's inequality, with respect to the uniform probability measure $\frac{2}{\pi}\, d\theta$ on $[0,\pi/2]$, in order to make appear, with a tower property, a term of the form $\mathbb{E}\, e^{sG} = e^{s^2\sigma^2/2}$ where $G \sim \mathcal{N}(0,\sigma^2)$ is Gaussian:
$$ \mathbb{E}\, e^{t(F(X)-F(Y))} \le \frac{2}{\pi}\int_0^{\pi/2} \mathbb{E}\Big[ \mathbb{E}\big[ e^{\frac{\pi t}{2}\, \nabla F(X_\theta)\cdot X'_\theta} \,\big|\, X_\theta \big] \Big]\, d\theta = \frac{2}{\pi}\int_0^{\pi/2} \mathbb{E}\Big[ e^{\frac{\pi^2 t^2}{8}\, \|\nabla F(X_\theta)\|^2} \Big]\, d\theta \le e^{\pi^2 t^2/8}, $$
since, conditionally on $X_\theta$, the random variable $\nabla F(X_\theta)\cdot X'_\theta$ is centred Gaussian with variance $\|\nabla F(X_\theta)\|^2 \le 1$. The rest of the proof is now standard. By Markov's inequality, this proves that for every $t>0$ and $\lambda>0$,
$$ \mathbb{P}\big(F(X)-\mathbb{E} F(X) \ge \lambda\big) \le e^{-t\lambda}\, \mathbb{E}\, e^{t(F(X)-\mathbb{E} F(X))} \le e^{\pi^2 t^2/8 - t\lambda}. $$
Choosing now $t$ such that $\frac{\pi^2 t}{4} = \lambda$, i.e. $t = \frac{4\lambda}{\pi^2}$, yields
$$ \mathbb{P}\big(F(X)-\mathbb{E} F(X) \ge \lambda\big) \le e^{-2\lambda^2/\pi^2}, $$
and applying the previous inequality with $F$ replaced by $-F$ and using the union bound $\mathbb{P}(|F(X)-\mathbb{E} F(X)| \ge \lambda) \le \mathbb{P}(F(X)-\mathbb{E} F(X) \ge \lambda) + \mathbb{P}(-(F(X)-\mathbb{E} F(X)) \ge \lambda)$ finishes the proof.
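As a quick numerical sanity check of the bound just proved (an added sketch, not part of the argument: the choice $F(x)=\|x\|_2$ and the use of the empirical mean as a proxy for $\mathbb{E} F(X)$ are illustration choices made here), one can compare empirical tail frequencies with $2\, e^{-2\lambda^2/\pi^2}$:

```python
import numpy as np

# Monte Carlo check of P(|F(X) - E F(X)| >= lambda) <= 2 exp(-2 lambda^2 / pi^2)
# for the 1-Lipschitz function F(x) = ||x||_2 of a standard Gaussian vector.
rng = np.random.default_rng(0)
n, trials = 50, 200_000

X = rng.standard_normal((trials, n))
F = np.linalg.norm(X, axis=1)      # F(X) for each sample
F_centered = F - F.mean()          # empirical proxy for F(X) - E F(X)

for lam in (1.0, 2.0, 3.0):
    empirical = np.mean(np.abs(F_centered) >= lam)
    bound = 2.0 * np.exp(-2.0 * lam**2 / np.pi**2)
    print(f"lambda = {lam:.1f}: empirical tail = {empirical:.5f}, bound = {bound:.5f}")
```

The empirical frequencies should sit well below the bound, which is not tight for this particular choice of $F$.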
We can see \eqref{eq:two} as
$$ 1 - G(\lambda) \le 2\, e^{-2\lambda^2/\pi^2} \quad \text{for all } \lambda > 0, $$
where $G$ is the cumulative distribution function of $|F(X)-\mathbb{E} F(X)|$ and $G^{-1}(u) := \inf\{\lambda \ge 0 : G(\lambda) \ge u\}$ is its generalized inverse. Then, for all $u \in (0,1)$, we have
$$ G^{-1}(u) \le H^{-1}(u) = \pi\sqrt{\tfrac{1}{2}\Big(\log 2 + \log\tfrac{1}{1-u}\Big)}, $$
where $H^{-1}$ is the generalized inverse of the function $H(\lambda) := 1 - 2\, e^{-2\lambda^2/\pi^2}$. Therefore, writing $|F(X)-\mathbb{E} F(X)| = G^{-1}(U)$ with $U$ uniform on $(0,1)$ and setting $\xi := -\log(1-U)$, there exists an exponential random variable $\xi$ with parameter $1$ such that
$$ |F(X)-\mathbb{E} F(X)| \le \pi\sqrt{\tfrac{1}{2}\,(\log 2 + \xi)} \quad \text{almost surely}. $$
We can improve this bound with a more intricate proof, see Thm B.6 in "Introduction to High Dimensional Statistics" by C. Giraud, which shows that one can find an exponential random variable $\xi$ with parameter $1$ such that
$$ F(X)-\mathbb{E} F(X) \le \sqrt{2\,\xi} \quad \text{almost surely}. $$
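As a consistency check (added here), the exponential-variable formulations translate directly back into tail bounds: since $\mathbb{P}(\xi \ge t) = e^{-t}$ for $\xi \sim \mathrm{Exp}(1)$ and $t \ge 0$,
$$ \mathbb{P}\Big( \pi\sqrt{\tfrac{1}{2}(\log 2 + \xi)} \ge \lambda \Big) = \mathbb{P}\Big( \xi \ge \tfrac{2\lambda^2}{\pi^2} - \log 2 \Big) = \min\big(1,\ 2\, e^{-2\lambda^2/\pi^2}\big), $$
so the first representation is exactly equivalent to \eqref{eq:two}, while Giraud's version gives back the stronger one-sided tail bound $\mathbb{P}\big(F(X) \ge \mathbb{E} F(X) + \lambda\big) \le e^{-\lambda^2/2}$.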