\sum_{i=1}^n \norm{\projorth{\rset w}X_i}^2 &\leq \lambda_1 \sum_{j=1}^d (w^T w_j)^2=\lambda_1 \norm{w}^2=\lambda_1
\end{align*}
Note that from \eqref{eq:dim1}, we have for any eigenvector $w_k$ of $S_n$,
\begin{equation} \label{eq:eigenvector}
\sum_{i=1}^n \norm{\projorth{\rset w_k}X_i}^2=w_k^T S_n w_k=\lambda_k.
\end{equation}
In particular, $\sum_{i=1}^n \norm{\projorth{\rset w_1}X_i}^2=\lambda_1$. Therefore, \eqref{eq:vp} holds true for $p=1$.
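
As a quick illustration of the base case, the identity \eqref{eq:eigenvector} and the bound by $\lambda_1$ can be checked numerically. The sketch below assumes that $S_n$ stands for the scatter matrix $\sum_{i=1}^n X_i X_i^T$ of the observations, so that $\sum_{i=1}^n \norm{\projorth{\rset w}X_i}^2=w^T S_n w$ for any unit vector $w$; the data and variable names are illustrative only.
<code python>
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.standard_normal((n, d))   # rows play the role of the X_i in R^d
S = X.T @ X                       # assumed convention: S_n = sum_i X_i X_i^T

lam, W = np.linalg.eigh(S)        # eigenvalues ascending, eigenvectors in columns
lam, W = lam[::-1], W[:, ::-1]    # reorder so lam[0] = lambda_1 >= ... >= lambda_d

# eq:eigenvector -- sum_i ||proj_{R w_k} X_i||^2 = w_k^T S_n w_k = lambda_k
for k in range(d):
    w_k = W[:, k]
    assert np.isclose(((X @ w_k) ** 2).sum(), lam[k])

# base case p = 1 -- for any unit vector w, sum_i ||proj_{R w} X_i||^2 <= lambda_1
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
assert ((X @ w) ** 2).sum() <= lam[0] + 1e-9
</code>
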
Assume now that \eqref{eq:vp} holds true for some $p \in [1:d-1]$ and let $H \in \Hset_{p+1}$. Set $G=\mathrm{Span}(w_1,\ldots,w_p)^\perp$. Since $\mathrm{dim}(G)=d-p$ and $\mathrm{dim}(H)=p+1$, we must have $G \cap H \neq \{0\}$ (otherwise the subspace $G+H$ would be of dimension $d-p+p+1=d+1$, which is not possible). Let $w_0$ be a unit vector of $G \cap H$. Then, we have the orthogonal decomposition $H=\rset w_0 \stackrel{\perp}{+} H_0$, where $H_0$ is of dimension $p$. Applying \eqref{eq:dim1},
\begin{align*}
\sum_{i=1}^n \norm{\projorth{H}X_i}^2&= \sum_{i=1}^n \left(\norm{\projorth{\rset w_0}X_i}^2+\norm{\projorth{H_0}X_i}^2\right)\\
&=\sum_{j=1}^d \lambda_j (w_0^T w_j)^2 +\sum_{i=1}^n \norm{\projorth{H_0}X_i}^2\\
&=\sum_{j=p+1}^d \lambda_j (w_0^T w_j)^2 +\sum_{i=1}^n \norm{\projorth{H_0}X_i}^2\\
\end{align*}
where we used that $w_0 \in G=\mathrm{Span}(w_1,\ldots,w_p)^\perp$. Applying the induction assumption and then \eqref{eq:eigenvector},
\begin{align*}
\sum_{i=1}^n \norm{\projorth{H}X_i}^2& \leq \lambda_{p+1} \sum_{j=p+1}^d (w_0^T w_j)^2 +\sum_{i=1}^n \norm{\projorth{\mathrm{Span}(w_1,\ldots,w_p)}X_i}^2\\