# Welcome to Randal Douc's wiki

A collaborative site about maths, and more!

• Theatre
• Research
• Teaching

### Miscellaneous




# Concavity of the $\log \det$ function

Let $\Mplus$ be the space of real-valued $n \times n$ symmetric positive definite matrices. We show:

Lemma: The function $X \mapsto \log \det X$ is concave on $\Mplus$.

## Proof

Let $X,Y \in \Mplus$ and $\lambda \in [0,1]$. Since $X^{-1/2}YX^{-1/2} \in \Mplus$, it is diagonalisable in some orthonormal basis; write $\mu_1,\ldots, \mu_n$ for the (possibly repeated) diagonal entries, which are all positive. Note in particular that $\det \lr{X^{-1/2}YX^{-1/2}}=\prod_{i=1}^n \mu_i$. Then,
\begin{align*}
\log \det \lr{(1-\lambda)X+\lambda Y}&=\log \det \lrb{X^{1/2} \lr{(1-\lambda)I+\lambda X^{-1/2}YX^{-1/2}} X^{1/2}}\\
&=\log \det X + \log \det \lr{(1-\lambda)I+\lambda X^{-1/2}YX^{-1/2}} \\
&=\log \det X + \sum_{i=1}^n \log(1-\lambda+\lambda \mu_i)\\
& \geq \log \det X + \sum_{i=1}^n \lrb{(1-\lambda) \underbrace{\log(1)}_{=0}+\lambda \log( \mu_i)} =: D\eqsp,
\end{align*}
where the inequality follows from the concavity of the $\log$ applied to $1-\lambda+\lambda \mu_i=(1-\lambda)\cdot 1+\lambda \mu_i$. Now, rewrite the right-hand side $D$ as
\begin{align*}
D&=\log \det X + \lambda \log \det \lr{X^{-1/2}YX^{-1/2}}\\
&=(1-\lambda) \log \det X + \lambda \lr{\log \det X^{1/2}+ \log \det \lr{X^{-1/2}YX^{-1/2}} + \log \det X^{1/2}} \\
&=(1-\lambda) \log \det X + \lambda \log \det Y\eqsp.
\end{align*}
$\eproof$
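The inequality in the lemma can also be checked numerically. Below is a small sanity check (an illustration, not part of the proof) that draws random symmetric positive definite matrices and verifies $\log \det \lr{(1-\lambda)X+\lambda Y} \geq (1-\lambda)\log\det X + \lambda \log\det Y$; the helper names `random_spd` and `logdet` are ad hoc choices, not standard library functions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """Random symmetric positive definite matrix: A A^T + I."""
    a = rng.standard_normal((n, n))
    return a @ a.T + np.eye(n)

def logdet(m):
    # slogdet is numerically safer than log(det(m)) for large n
    sign, ld = np.linalg.slogdet(m)
    assert sign > 0, "matrix must have positive determinant"
    return ld

n = 5
for _ in range(100):
    x, y = random_spd(n), random_spd(n)
    lam = rng.uniform()
    # concavity inequality from the lemma (small tolerance for round-off)
    lhs = logdet((1 - lam) * x + lam * y)
    rhs = (1 - lam) * logdet(x) + lam * logdet(y)
    assert lhs >= rhs - 1e-10
```

The convex combination of two matrices in $\Mplus$ stays in $\Mplus$, so `slogdet` always returns a positive sign here.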

## Derivatives

Lemma: The derivative of the real-valued function $\Sigma \mapsto \log\mathrm{det}(\Sigma)$ defined on $\rset^{d\times d}$ is given, at any symmetric positive definite $\Sigma$, by $\partial_{\Sigma}\{\log\mathrm{det}(\Sigma)\}= \Sigma^{-1}\eqsp,$ where, for any real-valued function $f$ defined on $\rset^{d\times d}$, $\partial_{\Sigma}f(\Sigma)$ denotes the $\rset^{d\times d}$ matrix such that for all $1\leqslant i,j\leqslant d$, $\{\partial_{\Sigma}f(\Sigma)\}_{i,j}$ is the partial derivative of $f$ with respect to $\Sigma_{i,j}$.

## Proof

Recall that, for any fixed $i \in \{1,\ldots,d\}$, the cofactor expansion along row $i$ gives $\det(\Sigma)=\sum_{k=1}^d \Sigma_{i,k} \Delta_{i,k}$, where $\Delta_{i,j}$ is the $(i,j)$-cofactor of $\Sigma$. Since each cofactor $\Delta_{i,k}$ is computed from the submatrix obtained by deleting row $i$ and column $k$, the entry $\Sigma_{i,j}$ appears nowhere in the decomposition $\sum_{k=1}^d \Sigma_{i,k} \Delta_{i,k}$ except in the term $k=j$. This implies $$\frac{\partial \log \det(\Sigma)}{\partial \Sigma_{i,j}}=\frac{1}{\det \Sigma}\frac{\partial \det(\Sigma)}{\partial \Sigma_{i,j}}=\frac{\Delta_{i,j}}{\det \Sigma}\eqsp.$$ Recalling the identity $\Sigma\; [\Delta_{j,i}]_{1\leq i,j \leq d}=(\det \Sigma)\; I_d$, so that $\Sigma^{-1}=\frac{[\Delta_{i,j}]_{1\leq i,j \leq d}^T}{\det \Sigma}$, we finally get $$\lrb{\frac{\partial \log \det(\Sigma)}{\partial \Sigma_{i,j}}}_{1\leq i,j \leq d}=(\Sigma^{-1})^T=\Sigma^{-1}\eqsp,$$ where the last equality follows from the symmetry of $\Sigma$. $\eproof$
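This gradient formula is easy to verify numerically. The sketch below (an illustration only; the variable names are ad hoc) approximates each partial derivative $\partial \log\det(\Sigma)/\partial \Sigma_{i,j}$ by a central finite difference, perturbing the single entry $\Sigma_{i,j}$ as in the lemma's entrywise definition, and compares the result to $\Sigma^{-1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4
a = rng.standard_normal((d, d))
sigma = a @ a.T + d * np.eye(d)  # symmetric positive definite

def logdet(m):
    sign, ld = np.linalg.slogdet(m)
    return ld

# Central finite difference of log det w.r.t. each single entry Sigma_{i,j},
# matching the unsymmetrized partial derivative used in the lemma.
eps = 1e-6
grad = np.zeros((d, d))
for i in range(d):
    for j in range(d):
        e = np.zeros((d, d))
        e[i, j] = eps
        grad[i, j] = (logdet(sigma + e) - logdet(sigma - e)) / (2 * eps)

# The lemma predicts grad == Sigma^{-1}
assert np.allclose(grad, np.linalg.inv(sigma), atol=1e-5)
```

Note that perturbing a single entry breaks the symmetry of $\Sigma$, which is consistent with the proof: the partial derivatives are taken with all $d^2$ entries treated as independent coordinates, and the symmetry of $\Sigma$ is only used at the last step to drop the transpose.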