world:quiz5 [2025/10/06 17:27] francois_bertholom ↷ Page moved from emines2024:quiz5 to world:quiz5
world:quiz5 [2025/10/06 23:27] (current) rdouc
Denote by $\hat \beta$ the OLS estimator of $\beta$ obtained from the regression of $Y$ on $\{X\beta,\ \beta \in \rset^p\}$. Define $P_X=X(X'X)^{-1}X'=[h_{i,j}]_{1\leq i, j\leq n}$.
Which of the following statements are true?
  * 1. If $\epsilon$ is a Gaussian vector, $\hat Y$ and $\hat \sigma^2$ are independent. **TRUE**
  * 2. If $\epsilon$ is a Gaussian vector, then $\hat \beta$ and $\hat \sigma^2$ are not necessarily independent. **FALSE**
  * 3. If $\epsilon$ is not a Gaussian vector, then $\hat \beta$ and $\hat \sigma^2$ are not necessarily independent and $\Cov(\hat \beta,\hat \epsilon)$ may be different from $0$. **FALSE**
  * 4. If $\epsilon$ is not a Gaussian vector, then $\hat Y$ and $\hat \sigma^2$ are not necessarily independent, but we have $\Cov(\hat Y,\hat \epsilon)=0$. **TRUE**
  * 5. We always have $\hat \beta = (X'X)^{-1}X' \hat Y$ and $\hat \beta = (X'X)^{-1}X' Y$. **TRUE**
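The purely algebraic identities behind statements 4 and 5 (orthogonality of residuals to the column space of $X$, and $\hat\beta$ computed equally from $Y$ or $\hat Y$) can be checked numerically. A minimal numpy sketch, with a made-up design matrix and coefficients (the dimensions and values are illustrative assumptions, not from the quiz):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3                          # hypothetical sample size and dimension
X = rng.normal(size=(n, p))           # hypothetical design matrix
beta = np.array([1.0, -2.0, 0.5])     # hypothetical true coefficients
Y = X @ beta + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y          # OLS estimator
P_X = X @ XtX_inv @ X.T               # projection matrix [h_{i,j}]
Y_hat = P_X @ Y                       # fitted values
eps_hat = Y - Y_hat                   # residuals

# Statement 5: (X'X)^{-1} X' Y_hat equals (X'X)^{-1} X' Y, since P_X X = X
assert np.allclose(XtX_inv @ X.T @ Y_hat, beta_hat)
# Behind statement 4: X' eps_hat = 0, hence Y_hat' eps_hat = 0
assert np.allclose(X.T @ eps_hat, 0)
assert np.isclose(Y_hat @ eps_hat, 0)
print("identities verified")
```

These identities hold for any error distribution; only the independence claims in statements 1 and 2 use Gaussianity.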
Consider a logistic regression model, where $\PP(Y=1|X)=f( \pscal{X}{\beta})$ with $f(u)=\frac{\rme^u}{1+\rme^u}$.
Which of the following statements are true?
  * 6. Once $\beta$ has been estimated by some $\hat \beta$, the associated classifier $h$ follows the rule: if $\pscal{X}{\hat \beta}<0$ we choose $h(X)=0$, and otherwise we choose $h(X)=1$. **TRUE**
  * 7. In a logistic regression, the law of $X$ may depend on $\beta$. **FALSE**
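The rule in statement 6 follows from $f(u)\geq 1/2 \iff u\geq 0$: thresholding the estimated probability at $1/2$ is the same as thresholding $\pscal{X}{\hat\beta}$ at $0$. A short sketch with hypothetical fitted coefficients (the values of `beta_hat` and the test points are illustrative assumptions):

```python
import numpy as np

def f(u):
    # logistic function f(u) = e^u / (1 + e^u), written in a numerically stable form
    return 1.0 / (1.0 + np.exp(-u))

def classify(x, beta_hat):
    # predict 1 when the estimated P(Y=1|X) = f(<x, beta_hat>) >= 1/2,
    # which holds exactly when <x, beta_hat> >= 0
    return int(x @ beta_hat >= 0)

beta_hat = np.array([2.0, -1.0])   # hypothetical estimated coefficients
x_pos = np.array([1.0, 0.5])       # <x_pos, beta_hat> = 1.5 >= 0  -> class 1
x_neg = np.array([0.0, 1.0])       # <x_neg, beta_hat> = -1.0 < 0  -> class 0
print(classify(x_pos, beta_hat), classify(x_neg, beta_hat))  # prints: 1 0
```

Note that thresholding never needs $f$ itself, only the sign of the linear score, which is why `classify` does not call `f`.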