world:quiz5

Denote by $\hat \beta$ the OLS estimator of $\beta$ obtained from the regression of $Y$ on $\{X\beta,\ \beta \in \rset^p\}$. Define $P_X=X(X'X)^{-1}X'=[h_{i,j}]_{1\leq i, j\leq n}$.
Which of the following statements are true?
  * 1. If $\epsilon$ is a Gaussian vector, $\hat Y$ and $\hat \sigma^2$ are independent. **TRUE**
  * 2. If $\epsilon$ is a Gaussian vector, then $\hat \beta$ and $\hat \sigma^2$ are not necessarily independent. **FALSE**
  * 3. If $\epsilon$ is not a Gaussian vector, then $\hat \beta$ and $\hat \sigma^2$ are not necessarily independent and $\Cov(\hat \beta,\hat \epsilon)$ may be different from $0$. **FALSE**
  * 4. If $\epsilon$ is not a Gaussian vector, then $\hat Y$ and $\hat \sigma^2$ are not necessarily independent but we have that $\Cov(\hat Y,\hat \epsilon)=0$. **TRUE**
  * 5. We always have $\hat \beta = (X'X)^{-1}X' \hat Y$ and $\hat \beta = (X'X)^{-1}X' Y$. **TRUE** (see the numerical sketch after this list)
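Below is a minimal numerical sketch (not part of the original quiz) illustrating statements 4 and 5. It assumes the standard definitions $\hat Y=P_X Y$ and $\hat \epsilon=Y-\hat Y$; the dimensions, the non-Gaussian noise and the variable names (`P_X`, `beta_hat_from_Y`, ...) are illustrative choices only.

<code python>
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
eps = rng.exponential(size=n) - 1.0       # centered but non-Gaussian noise (illustrative)
Y = X @ beta + eps

P_X = X @ np.linalg.inv(X.T @ X) @ X.T    # P_X = X (X'X)^{-1} X'
Y_hat = P_X @ Y                           # hat Y = P_X Y       (assumed definition)
eps_hat = Y - Y_hat                       # hat eps = Y - hat Y (assumed definition)

beta_hat_from_Y = np.linalg.solve(X.T @ X, X.T @ Y)
beta_hat_from_Y_hat = np.linalg.solve(X.T @ X, X.T @ Y_hat)

# Statement 5: both formulas give the same estimator, since X' P_X = X'.
print(np.allclose(beta_hat_from_Y, beta_hat_from_Y_hat))   # True

# Statement 4: hat Y and hat eps are orthogonal because P_X (I - P_X) = 0.
print(np.allclose(P_X @ (np.eye(n) - P_X), 0.0))           # True
print(np.isclose(Y_hat @ eps_hat, 0.0))                    # True
</code>

The algebraic fact behind both checks is $X'P_X=X'$ and $P_X(I-P_X)=0$, so that $\Cov(\hat Y,\hat \epsilon)=\sigma^2 P_X(I-P_X)=0$ whenever $\Cov(\epsilon)=\sigma^2 I$, Gaussian or not.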
  
Consider a logistic regression model, where $\PP(Y=1|X)=f( \pscal{X}{\beta})$ with $f(u)=\frac{\rme^u}{1+\rme^u}$.
Which of the following statements are true?
  * 6. Once $\beta$ has been estimated by some $\hat \beta$, the associated classifier $h$ follows the rule: if $\pscal{X}{\hat \beta}<0$ we choose $h(X)=0$ and otherwise we choose $h(X)=1$. **TRUE** (see the sketch after this list)
  * 7. In a logistic regression, the law of $X$ may depend on $\beta$. **FALSE**
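As a quick illustration of the threshold in statement 6, here is a sketch under the assumption that the classifier plugs $\hat \beta$ into $f$; the helper names `logistic` and `classify` are hypothetical. It rests on the equivalence $f(u)\geq 1/2 \iff u\geq 0$.

<code python>
import numpy as np

def logistic(u):
    # f(u) = e^u / (1 + e^u), the link used in the quiz
    return np.exp(u) / (1.0 + np.exp(u))

def classify(x, beta_hat):
    # Plug-in rule of statement 6: h(x) = 1 iff <x, beta_hat> >= 0
    return int(x @ beta_hat >= 0)

# Equivalence behind the rule: f(u) >= 1/2  <=>  u >= 0
u = np.linspace(-5.0, 5.0, 11)
print(np.array_equal(logistic(u) >= 0.5, u >= 0))   # True
</code>

Thresholding the score $\pscal{X}{\hat \beta}$ at $0$ is therefore the same as thresholding the estimated probability $f(\pscal{X}{\hat \beta})$ at $1/2$.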
  