Let $\b{X}$ be a $n \times p$ matrix with real-valued entries. We define $\img(\b{X}) = \{\b{X} a,\ a \in \mathbb{R}^p\}$ and $\ker(\b{X}^T) = \{u \in \mathbb{R}^n,\ \b{X}^T u = 0\}$. We can note that $\img(\b{X})$ and $\ker(\b{X}^T)$ are subspaces of $\mathbb{R}^n$.
Given $y \in \mathbb{R}^n$, the orthogonal projection of $y$ on $\img(\b{X})$ is the unique $x \in \img(\b{X})$ such that $(y - x) \perp \img(\b{X})$. Otherwise stated, $x$ is the orthogonal projection of $y$ on $\img(\b{X})$ if and only if we have the two properties $x \in \img(\b{X})$ and $(y-x) \in \ker(\b{X}^T)$. The fact that $\ker(\b{X}^T)$ is the orthogonal complement of $\img(\b{X})$ stems from the following remark: $u \in \ker(\b{X}^T)$ is equivalent to the fact that $u^T X_j = 0$ for all $j = 1, \dots, p$, where $X_1, \dots, X_p$ are the column vectors of $\b{X}$, and this in turn is equivalent to $u \perp \img(\b{X})$.
Denote by $\Pi_{\b{X}}$ the matrix of the orthogonal projection on $\img(\b{X})$. By abuse of notation, we also write $\Pi_{\b{X}}$ for the projection itself. We have $\Pi_{\b{X}}^2 = \Pi_{\b{X}}$ and we can also note that $\Pi_{\b{X}} = \mathrm{Id}$ on $\img(\b{X})$.
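These defining properties of the projector can be checked numerically. The following minimal NumPy sketch (dimensions and random seed are arbitrary) realizes $\Pi_{\b{X}}$ as `X @ np.linalg.pinv(X)`, a standard consequence of the Moore-Penrose pseudo-inverse, and verifies idempotence, symmetry, the identity behaviour on $\img(\b{X})$, and that residuals lie in $\ker(\b{X}^T)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 6x3 design matrix (full column rank almost surely).
X = rng.standard_normal((6, 3))

# Orthogonal projector onto img(X); X @ pinv(X) is a standard way
# to realize it (pinv is the Moore-Penrose pseudo-inverse).
P = X @ np.linalg.pinv(X)

# Idempotent and symmetric.
assert np.allclose(P @ P, P)
assert np.allclose(P.T, P)

# P acts as the identity on img(X): P (X a) == X a.
a = rng.standard_normal(3)
assert np.allclose(P @ (X @ a), X @ a)

# For any y, the residual y - P y lies in ker(X^T).
y = rng.standard_normal(6)
assert np.allclose(X.T @ (y - P @ y), 0)
```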
An orthogonal projection is uniquely determined by the subspace on which it projects. This implies in particular the following property. Assume that $\b{X}$ is a $n \times p$ matrix and $\b{Z}$ is a $n \times q$ matrix, where we can possibly have $p \neq q$. As soon as $\img(\b{X}) = \img(\b{Z})$, we have $\Pi_{\b{X}} = \Pi_{\b{Z}}$.
By the Pythagorean identity, for all $z \in \img(\b{X})$, we have $\|y - z\|^2 = \|y - \Pi_{\b{X}} y\|^2 + \|\Pi_{\b{X}} y - z\|^2$ ($\star$), showing that $\|y - \Pi_{\b{X}} y\| \leq \|y - z\|$ with equality only if $z = \Pi_{\b{X}} y$.
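The identity and the resulting best-approximation property can be illustrated numerically. A minimal sketch (random data, arbitrary dimensions), again realizing the projector with `np.linalg.pinv`:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((8, 3))
y = rng.standard_normal(8)

P = X @ np.linalg.pinv(X)      # orthogonal projector onto img(X)
proj = P @ y                   # orthogonal projection of y

# Pythagorean identity: for any z = X b in img(X),
# ||y - z||^2 = ||y - proj||^2 + ||proj - z||^2.
b = rng.standard_normal(3)
z = X @ b
lhs = np.sum((y - z) ** 2)
rhs = np.sum((y - proj) ** 2) + np.sum((proj - z) ** 2)
assert np.allclose(lhs, rhs)

# Hence proj is the closest point of img(X) to y.
assert np.sum((y - proj) ** 2) <= lhs + 1e-12
```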
A pseudo-inverse of $\b{X}$, which we denote by $\b{X}^-$, is a matrix such that $\b{X}\b{X}^-\b{X} = \b{X}$ or, equivalently, such that $\b{X}\b{X}^- y = y$ for all $y \in \img(\b{X})$ (indeed, any $y \in \img(\b{X})$ writes $y = \b{X} a$, so $\b{X}\b{X}^- y = \b{X}\b{X}^-\b{X} a$). We admit that a pseudo-inverse always exists.
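As a concrete instance, the Moore-Penrose pseudo-inverse provided by `np.linalg.pinv` satisfies the defining property above, even for a rank-deficient matrix (the example below, with an arbitrary seed, makes the third column a sum of the first two):

```python
import numpy as np

rng = np.random.default_rng(2)

# A rank-deficient matrix: third column is the sum of the first two.
A = rng.standard_normal((5, 2))
X = np.column_stack([A, A[:, 0] + A[:, 1]])

Xm = np.linalg.pinv(X)         # Moore-Penrose pseudo-inverse

# Defining property X X^- X = X.
assert np.allclose(X @ Xm @ X, X)

# Equivalently, X X^- acts as the identity on img(X).
a = rng.standard_normal(3)
y = X @ a                      # an arbitrary element of img(X)
assert np.allclose(X @ Xm @ y, y)
```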
Let $y$ be a vector of size $n$ and $\b{X}$ a $n \times p$ matrix. Since $\Pi_{\b{X}} y$ is in $\img(\b{X})$, it can be written as $\b{X}\hat\beta$ for some $\hat\beta \in \mathbb{R}^p$.
The fundamental result. Theorem. The following properties are equivalent:

- $\b{X}\hat\beta = \Pi_{\b{X}} y$;
- $\hat\beta$ minimizes $\beta \mapsto \|y - \b{X}\beta\|^2$ over $\mathbb{R}^p$;
- $\hat\beta$ solves the normal equations $\b{X}^T\b{X}\hat\beta = \b{X}^T y$.

Moreover, $\Pi_{\b{X}} = \b{X}(\b{X}^T\b{X})^-\b{X}^T$ for any choice of the pseudo-inverse $(\b{X}^T\b{X})^-$.
A side effect of this theorem is that $\b{X}(\b{X}^T\b{X})^-\b{X}^T$ does not depend on the choice of the pseudo-inverse $(\b{X}^T\b{X})^-$.
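This independence can be tested numerically. The sketch below (random rank-deficient design, arbitrary seed) builds a second pseudo-inverse from the Moore-Penrose one using the standard observation that if $M G M = M$ then $G' = G + (\mathrm{Id} - GM)W$ also satisfies $M G' M = M$ for any $W$, and checks that both choices give the same projector:

```python
import numpy as np

rng = np.random.default_rng(3)

# Rank-deficient design: third column = first + second, so X^T X is
# singular and admits many pseudo-inverses.
A = rng.standard_normal((6, 2))
X = np.column_stack([A, A[:, 0] + A[:, 1]])
M = X.T @ X
G1 = np.linalg.pinv(M)                     # one pseudo-inverse of X^T X

# If G is a pseudo-inverse of M, so is G + (I - G M) W for any W:
# M (G + (I - G M) W) M = M G M + (M - M G M) W M = M.
W = rng.standard_normal((3, 3))
G2 = G1 + (np.eye(3) - G1 @ M) @ W
assert np.allclose(M @ G2 @ M, M)          # G2 is indeed a pseudo-inverse
assert not np.allclose(G1, G2)             # and differs from G1

# Yet both give the same matrix X (X^T X)^- X^T.
P1 = X @ G1 @ X.T
P2 = X @ G2 @ X.T
assert np.allclose(P1, P2)
```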
Proof of the fundamental result
The only difficult part is to show that the matrix defined in the last statement is the one of the orthogonal projector onto $\img(\b{X})$. Set $A = \b{X}(\b{X}^T\b{X})^-\b{X}^T$ and let $y \in \mathbb{R}^n$. We show that $A y = \Pi_{\b{X}} y$ by checking successively that $A y \in \img(\b{X})$ and $(y - A y) \in \ker(\b{X}^T)$. The first statement is straightforward: $A y = \b{X}\left[(\b{X}^T\b{X})^-\b{X}^T y\right] \in \img(\b{X})$. The second statement is in two steps:

- First, $\ker(\b{X}^T\b{X}) = \ker(\b{X})$ (if $\b{X}^T\b{X} u = 0$ then $\|\b{X} u\|^2 = u^T\b{X}^T\b{X} u = 0$), and taking orthogonal complements, $\img(\b{X}^T\b{X}) = \img(\b{X}^T)$.
- Second, since $(\b{X}^T\b{X})^-$ is a pseudo-inverse of $\b{X}^T\b{X}$, we have $\b{X}^T\b{X}(\b{X}^T\b{X})^- v = v$ for all $v \in \img(\b{X}^T\b{X})$.
Finally, $\b{X}^T\b{X}(\b{X}^T\b{X})^- = \mathrm{Id}$ on $\img(\b{X}^T\b{X})$ and $\img(\b{X}^T\b{X}) = \img(\b{X}^T)$, and therefore $\b{X}^T A = \b{X}^T\b{X}(\b{X}^T\b{X})^-\b{X}^T = \b{X}^T$ on $\mathbb{R}^n$, so that $\b{X}^T(y - A y) = 0$, that is, $(y - A y) \in \ker(\b{X}^T)$. This concludes the proof.
We can now give the general solutions of the normal equations.
Solving the Normal equations. Theorem. The following equivalence holds true: $\beta$ solves the normal equations iff there exists $u \in \ker(\b{X})$ such that $\beta = (\b{X}^T\b{X})^-\b{X}^T y + u$ for some pseudo-inverse $(\b{X}^T\b{X})^-$.
Solving the Normal equations
Indeed, if $\beta$ can be written as in the statement of the Theorem, then applying $\b{X}$ we get $\b{X}\beta = \b{X}(\b{X}^T\b{X})^-\b{X}^T y + \b{X} u = \Pi_{\b{X}} y$, showing that $\beta$ solves the normal equations by the fundamental result. Conversely, assume that $\beta$ solves the normal equations. Then choosing any pseudo-inverse $(\b{X}^T\b{X})^-$, we can write $\beta = (\b{X}^T\b{X})^-\b{X}^T y + u$ with $u = \beta - (\b{X}^T\b{X})^-\b{X}^T y$, and $\b{X} u = \b{X}\beta - \Pi_{\b{X}} y = 0$ since $\b{X}\beta = \Pi_{\b{X}} y$ by the fundamental result, which completes the proof.
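The structure of the solution set can be checked on a rank-deficient example (random data, arbitrary seed; the third column of the design equals the sum of the first two, so $\ker(\b{X})$ is nontrivial):

```python
import numpy as np

rng = np.random.default_rng(4)

# Rank-deficient design so the normal equations have many solutions.
A = rng.standard_normal((7, 2))
X = np.column_stack([A, A[:, 0] + A[:, 1]])
y = rng.standard_normal(7)

# Particular solution (X^T X)^- X^T y, here with the Moore-Penrose choice.
beta0 = np.linalg.pinv(X.T @ X) @ X.T @ y

# u = (1, 1, -1) lies in ker(X) by construction of the columns.
u = np.array([1.0, 1.0, -1.0])
assert np.allclose(X @ u, 0)

# Every beta0 + t u solves the normal equations X^T X beta = X^T y.
for t in (0.0, -2.5, 3.0):
    beta = beta0 + t * u
    assert np.allclose(X.T @ X @ beta, X.T @ y)
```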
We now consider the model $Y = \b{X}\beta + \epsilon$ where $\epsilon$ is a zero mean random vector with covariance matrix $\sigma^2\,\mathrm{Id}_n$. We say that $a^T\beta$ is estimable if there exists $\lambda \in \mathbb{R}^n$ such that $\lambda^T Y$ is an unbiased estimator of $a^T\beta$, which is equivalent to $\lambda^T\b{X}\beta = a^T\beta$ for all $\beta \in \mathbb{R}^p$ and therefore to $\lambda^T\b{X} = a^T$.
The Gauss-Markov Theorem. For any linear unbiased estimator $\lambda^T Y$ of $a^T\beta$, $\mathrm{Var}(a^T\hat\beta) \leq \mathrm{Var}(\lambda^T Y)$, where $\hat\beta$ is the least squares estimator of $\beta$. We then say that $a^T\hat\beta$ is BLUE (Best Linear Unbiased Estimator). Moreover, the equality only holds if $\lambda^T Y = a^T\hat\beta$ a.s., saying that the BLUE is unique.
A short proof of the Gauss-Markov Theorem
Note that since $\lambda^T Y$ is an unbiased estimator of $a^T\beta$, we have $\lambda^T\b{X} = a^T$. This implies that $a^T\hat\beta = \lambda^T\b{X}\hat\beta = \lambda^T\Pi_{\b{X}} Y$. Then, $\mathbb{E}[a^T\hat\beta] = \lambda^T\Pi_{\b{X}}\b{X}\beta = \lambda^T\b{X}\beta = a^T\beta$, showing that $a^T\hat\beta$ is unbiased. Moreover, using again $a^T\hat\beta = \lambda^T\Pi_{\b{X}} Y$, we get $\mathrm{Var}(a^T\hat\beta) = \sigma^2\|\Pi_{\b{X}}\lambda\|^2$ while $\mathrm{Var}(\lambda^T Y) = \sigma^2\|\lambda\|^2$, so that $\mathrm{Var}(a^T\hat\beta) \leq \mathrm{Var}(\lambda^T Y)$ (see $\star$ applied with $y = \lambda$ and $z = 0$) with equality only if $\Pi_{\b{X}}\lambda = \lambda$, which implies $\lambda^T Y = \lambda^T\Pi_{\b{X}} Y = a^T\hat\beta$.
In the course of the proof, we have seen that if $a^T\beta$ is estimable, then choosing $\lambda$ such that $\lambda^T\b{X} = a^T$, we have $a^T\hat\beta = \lambda^T\Pi_{\b{X}} Y$, and a side effect is that $a^T\hat\beta$ does not depend on the chosen pseudo-inverse whenever $\hat\beta = (\b{X}^T\b{X})^-\b{X}^T Y$.
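The variance comparison in the Gauss-Markov theorem can be verified exactly, without simulation, since $\mathrm{Var}(\lambda^T Y) = \sigma^2\|\lambda\|^2$. The sketch below (random full-rank design, arbitrary seed and choice of $a$) builds the BLUE weights, perturbs them by an element of $\ker(\b{X}^T)$ to get another linear unbiased estimator, and checks that the BLUE has the smallest variance:

```python
import numpy as np

rng = np.random.default_rng(5)

n, p = 20, 3
X = rng.standard_normal((n, p))
a = np.array([1.0, -1.0, 2.0])
sigma2 = 1.0                            # noise variance (Cov(eps) = sigma2 * I)

# The BLUE a^T beta_hat equals lam_blue^T Y with lam_blue = X (X^T X)^{-1} a,
# obtained here as pinv(X).T @ a; it satisfies lam_blue^T X = a^T.
lam_blue = np.linalg.pinv(X).T @ a
assert np.allclose(lam_blue @ X, a)

# Any other linear unbiased estimator is (lam_blue + mu)^T Y with X^T mu = 0;
# such a mu is produced by projecting a random vector onto ker(X^T).
P = X @ np.linalg.pinv(X)
mu = (np.eye(n) - P) @ rng.standard_normal(n)
lam_other = lam_blue + mu
assert np.allclose(lam_other @ X, a)    # still unbiased

# Var(lam^T Y) = sigma2 * ||lam||^2; the BLUE has the smallest one.
var_blue = sigma2 * lam_blue @ lam_blue
var_other = sigma2 * lam_other @ lam_other
assert var_blue <= var_other + 1e-12

# The gap is exactly sigma2 * ||mu||^2, since lam_blue is orthogonal to mu.
assert np.allclose(var_other - var_blue, sigma2 * mu @ mu)
```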