{{page>:defs}}

====== Quiz 5 ======

Consider a multiple linear regression model $Y=X\beta +\epsilon$ such that $\mathrm{rank}(X)=p$, $\PE[\epsilon]=0$ and $\Var(\epsilon)=\sigma^2 I$. We write $X=[X_1|\ldots|X_p]$. Denote by $\hat \beta$ the OLS estimator of $\beta$ obtained from the regression of $Y$ on $\{X\beta,\ \beta \in \rset^p\}$. Define $P_X=X(X'X)^{-1}X'=[h_{i,j}]_{1\leq i, j\leq n}$. Which of the following statements are true?

  * 1. If $\epsilon$ is a Gaussian vector, $\hat Y$ and $\hat \sigma^2$ are independent. **TRUE**
  * 2. If $\epsilon$ is a Gaussian vector, then $\hat \beta$ and $\hat \sigma^2$ are not necessarily independent. **FALSE**
  * 3. If $\epsilon$ is not a Gaussian vector, then $\hat \beta$ and $\hat \sigma^2$ are not necessarily independent and $\Cov(\hat \beta,\hat \epsilon)$ may be different from $0$. **FALSE**
  * 4. If $\epsilon$ is not a Gaussian vector, then $\hat Y$ and $\hat \sigma^2$ are not necessarily independent, but we do have $\Cov(\hat Y,\hat \epsilon)=0$. **TRUE**
  * 5. We always have $\hat \beta = (X'X)^{-1}X' \hat Y$ and $\hat \beta = (X'X)^{-1}X' Y$. **TRUE**

Consider a logistic regression model, where $\PP(Y=1|X)=f( \pscal{X}{\beta})$ with $f(u)=\frac{\rme^u}{1+\rme^u}$. Which of the following statements are true?

  * 6. Once $\beta$ has been estimated by some $\hat \beta$, the associated classifier $h$ follows the rule: if $\pscal{X}{\hat \beta}<0$ we choose $h(X)=0$, and otherwise we choose $h(X)=1$. **TRUE**
  * 7. In a logistic regression, the law of $X$ may depend on $\beta$. **FALSE**
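
A brief sketch of why statements 1–5 hold (it assumes the usual definitions $\hat Y=P_X Y=X\hat \beta$ and $\hat \epsilon=(I-P_X)Y$, and that $\hat \sigma^2$ is a function of $\hat \epsilon$ alone, e.g. $\hat \sigma^2=\|\hat \epsilon\|^2/(n-p)$, which is not spelled out above):

  * Since $\Var(Y)=\Var(\epsilon)=\sigma^2 I$ and $P_X$ is symmetric and idempotent, $\Cov(\hat Y,\hat \epsilon)=P_X\,\Var(Y)\,(I-P_X)'=\sigma^2 P_X(I-P_X)=0$; similarly, $X'P_X=X'$ gives $\Cov(\hat \beta,\hat \epsilon)=\sigma^2 (X'X)^{-1}(X'-X'P_X)=0$.
  * If $\epsilon$ is Gaussian, then $(\hat Y,\hat \epsilon)$ is a Gaussian vector (a linear image of $Y$), so the zero covariance implies that $\hat Y$, and hence $\hat \beta=(X'X)^{-1}X'\hat Y$, is independent of $\hat \sigma^2$, which depends on $\hat \epsilon$ only (statements 1 and 2).
  * Without Gaussianity, zero covariance no longer implies independence, which is exactly the content of statements 3 and 4.
  * Finally, $X'\hat Y=X'P_X Y=X'Y$, so $(X'X)^{-1}X'\hat Y=(X'X)^{-1}X'Y=\hat \beta$ (statement 5).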
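
Similarly, a short check of statements 6 and 7 (a sketch, assuming the classifier thresholds the estimated probability at $1/2$, the usual plug-in rule):

  * The estimated probability is $\hat p(X)=f(\pscal{X}{\hat \beta})$, and $f(u)=\frac{\rme^u}{1+\rme^u}$ is increasing with $f(0)=1/2$, so $\hat p(X)\geq 1/2$ if and only if $\pscal{X}{\hat \beta}\geq 0$; thresholding at $1/2$ is therefore exactly the rule of statement 6.
  * The logistic model only specifies the conditional law of $Y$ given $X$; the marginal law of $X$ is left unspecified and does not involve $\beta$, hence statement 7 is false.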