{{page>:defs}}

====== Quiz 2 ======

Let $(y_i,x_{i,1})_{i=1,\ldots,n}$ be the observations. The true model is given by $y_i=\beta_1 x_{i,1} + \epsilon_i$, where the $(\epsilon_i)$ are i.i.d. with variance $\sigma^2$. Consider the OLS estimator $\hat\beta_1=(X_1'X_1)^{-1}X_1'Y$, where $X_1'=(x_{1,1},\ldots,x_{n,1})$. Let $X_2 \in \mathbb{R}^n$ be another vector such that $\mathrm{rank}(X)=2$, where $X=[X_1|X_2]$. Define $\begin{pmatrix}{\tilde \beta}_1\\ {\tilde \beta}_2 \end{pmatrix}=(X'X)^{-1}X'Y$.

Which of the following statements are always true?

  - $\tilde \beta_1$ is an unbiased estimator of $\beta_1$. TRUE
  - $\PE[\tilde \beta_2]=0$. TRUE
  - Using the Gauss-Markov theorem, we can show that $\mathrm{Var}(\hat \beta_1) \leq \mathrm{Var}(\tilde \beta_1)$. TRUE
  - $\mathrm{Var}(\hat \beta_1)= \frac{\sigma^2}{\sum_{i=1}^n x_{i,1}^2/n - (\sum_{i=1}^n x_{i,1}/n)^2 }$. FALSE
  - $\mathrm{Var}(\hat \beta_1)= \frac{\sigma^2}{\sum_{i=1}^n x_{i,1}^2}$. TRUE
  - $\mathrm{Var}(\hat \beta_1)= \frac{\sigma^2}{\sum_{i=1}^n x_{i,1}^2/n}$. FALSE
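The true statements above can be checked numerically. The sketch below (with made-up data: $n$, $\beta_1$, $\sigma$, and the design columns are arbitrary choices, not part of the quiz) computes the exact conditional-on-$X$ variances from the OLS formulas and runs a short Monte Carlo to confirm that $\mathrm{Var}(\hat\beta_1)=\sigma^2/\sum_i x_{i,1}^2$ and that $\PE[\tilde\beta_2]=0$.

```python
import numpy as np

# Assumed data for illustration only: n, beta1, sigma and the design
# columns are arbitrary choices, not taken from the quiz statement.
rng = np.random.default_rng(0)
n, beta1, sigma = 50, 2.0, 1.0
x1 = rng.normal(size=n)                      # design column X_1 (held fixed)
x2 = rng.normal(size=n)                      # extra column X_2, rank([X_1|X_2]) = 2
X = np.column_stack([x1, x2])

# Exact (conditional-on-X) variances from the OLS formulas.
var_hat = sigma**2 / (x1 @ x1)                        # sigma^2 / X_1'X_1
var_tilde = sigma**2 * np.linalg.inv(X.T @ X)[0, 0]   # sigma^2 [(X'X)^{-1}]_{11}

# Monte Carlo: variance of beta1_hat and mean of beta2_tilde.
hats, tilde2s = [], []
for _ in range(20000):
    y = beta1 * x1 + rng.normal(scale=sigma, size=n)  # true model uses only X_1
    hats.append(x1 @ y / (x1 @ x1))                   # (X_1'X_1)^{-1} X_1'y
    tilde2s.append(np.linalg.lstsq(X, y, rcond=None)[0][1])

print(var_hat <= var_tilde)                   # Gauss-Markov ordering
print(abs(np.var(hats) - var_hat) / var_hat)  # small relative error
print(abs(np.mean(tilde2s)))                  # near 0: beta2_tilde targets 0
```

The ordering $\mathrm{Var}(\hat\beta_1)\leq\mathrm{Var}(\tilde\beta_1)$ is checked on the exact formulas rather than on simulated draws, since it holds deterministically given the design matrix.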