Monday 17 April 2017

matrices - Matrix Calculus: Derivative of Vectorized Symmetric Positive Definite Matrix w.r.t. its Vectorized Matrix Logarithm



Setup:



Let $k\in{}\mathbb{N},$ and let $\mathrm{M}_{k,k}(\mathbb{R})$ denote the set of $k\times{}k$ matrices over the field of real numbers.



Let $X\in{}\mathrm{M}_{k,k}(\mathbb{R})$ be a symmetric, positive definite matrix.




Then $X$ has $k$ positive eigenvalues $\lambda_{1},\dots{},\lambda_{k}$ with corresponding eigenvectors $v_{1},\dots{},v_{k}$.



The eigendecomposition/spectral decomposition of $X$ is:



$$X = V\Lambda{}V^{-1} = V\Lambda{}V',$$



where $\Lambda{}=\mathrm{diag}(\lambda_{1},\dots{},\lambda_{k})\in{}\mathrm{M}_{k,k}(\mathbb{R})$ is the diagonal matrix with the $k$ eigenvalues on the main diagonal and $V=(v_{1},\dots{},v_{k})\in{}\mathrm{M}_{k,k}(\mathbb{R})$ is the matrix whose $k$ columns are the orthonormal eigenvectors; since its columns are orthonormal, $V$ is orthogonal, which justifies $V^{-1}=V'$ above.



We define the natural matrix logarithm of $X$, denoted $\log{}(X)$, to be
$$\log{}(X)=V\log{}(\Lambda{})V',$$

where $\log{}(\Lambda{})=\mathrm{diag}(\log{}(\lambda_{1}),\dots{},\log{}(\lambda_{k}))\in{}\mathrm{M}_{k,k}(\mathbb{R})$.
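As a quick numerical sanity check of this definition (an illustrative sketch, not part of the original post; it assumes NumPy, and the matrix `X` is an arbitrary example), one can build an SPD matrix, take its spectral decomposition, and form $V\log(\Lambda)V'$:

```python
import numpy as np

# Illustrative sketch (not from the original post): matrix log of a
# symmetric positive definite X via its spectral decomposition X = V Λ V'.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
X = A @ A.T + 3.0 * np.eye(3)            # symmetric positive definite by construction

lam, V = np.linalg.eigh(X)               # for symmetric X, eigh returns orthonormal V
logX = V @ np.diag(np.log(lam)) @ V.T    # log(X) = V log(Λ) V'

# Exponentiating the eigenvalues of log(X) recovers X.
X_back = V @ np.diag(np.exp(np.log(lam))) @ V.T
```

Here `np.linalg.eigh` guarantees an orthogonal `V`, so `V.T` plays the role of $V^{-1}$, and `logX` is again symmetric.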



Question:



What, if it can be found, is the analytical form of the $[k(k+1)/2] \times{} [k(k+1)/2]$ Jacobian matrix



$$\frac{\partial{}\mathrm{vec}(X)}{\partial{}\mathrm{vec}(\log{}(X))'}$$



where $\mathrm{vec}(\cdot{})$ denotes the half-vectorization operator, which stacks the columns of the lower-triangular part of its square argument matrix.




(Background: This is a recurring problem in multivariate statistics when one adopts a "log-parameterization" of a covariance or precision matrix, both of which are, by definition, symmetric and positive (semi-)definite.)


Answer



Let $$\eqalign{
G &= \log(X) \implies X = e^G \cr
}$$

and find the differential of $X$ via the power series of the exponential
$$\eqalign{
dX=de^G &= d\,\Bigg[\sum_{i=0}^\infty \frac{G^i}{i!}\Bigg] \cr
&= \sum_{i=1}^\infty \frac{1}{i!}\,\sum_{j=0}^{i-1}\,G^{j}\,dG\,G^{i-j-1}\cr
}$$


Now apply vectorization
$$\eqalign{
dx &=\Bigg[\sum_{i=1}^\infty \frac{1}{i!}\,\sum_{j=0}^{i-1}G^{i-j-1}\otimes G^{j}\Bigg]\,dg \cr
\frac{\partial x}{\partial g} &= \sum_{i=1}^\infty \frac{1}{i!}\,\sum_{j=0}^{i-1}\,\Big[G^{i-j-1}\otimes G^{j}\,\Big] \cr\cr
}$$
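As a check of this series (an illustrative sketch, not part of the original answer; the function names are my own, and truncating at 30 terms is an assumption that suffices for small $\|G\|$), one can compare the truncated Kronecker sum against a central finite difference of the matrix exponential in a symmetric direction:

```python
import numpy as np
from math import factorial

def vec(M):
    return M.reshape(-1, order="F")          # column-stacking vec(.)

def dexp_jacobian(G, terms=30):
    # Truncated series: sum_{i>=1} 1/i! sum_{j=0}^{i-1} G^{i-j-1} ⊗ G^j
    # (valid for symmetric G, where the transposes can be dropped).
    k = G.shape[0]
    P = [np.linalg.matrix_power(G, n) for n in range(terms)]
    J = np.zeros((k * k, k * k))
    for i in range(1, terms + 1):
        c = 1.0 / factorial(i)
        for j in range(i):
            J += c * np.kron(P[i - j - 1], P[j])
    return J

def expm_sym(G):
    lam, V = np.linalg.eigh(G)
    return V @ np.diag(np.exp(lam)) @ V.T

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
G = (A + A.T) / 4.0                          # symmetric "log" matrix
S = np.eye(3); S[0, 1] = S[1, 0] = 1.0       # symmetric perturbation direction
eps = 1e-6

fd = (expm_sym(G + eps * S) - expm_sym(G - eps * S)) / (2.0 * eps)
J = dexp_jacobian(G)                         # then J @ vec(S) ≈ vec(fd)
```

Note the column-major (`order="F"`) convention, which is what makes $\mathrm{vec}(ABC)=(C'\otimes A)\,\mathrm{vec}(B)$ hold with `np.kron`.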

To change between vectorization and half-vectorization, multiply by duplication and elimination matrices of the appropriate dimensions, using the fact that $D_kL_k\,dg=dg$ because $G$ is symmetric:
$$\eqalign{
L_k\,dx &= L_k\Bigg[\frac{\partial x}{\partial g}\Bigg]D_kL_k\,dg \cr
dx^\prime &= L_k\Bigg[\frac{\partial x}{\partial g}\Bigg]D_k\,dg^\prime \cr
\frac{\partial x^\prime}{\partial g^\prime} &= L_k\Bigg[\frac{\partial x}{\partial g}\Bigg]D_k \cr
}$$

where $x^\prime=L_k\,x$ and $g^\prime=L_k\,g$ denote the half-vectorized quantities.
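To make this last step concrete, here is a small sketch (mine, not from the original answer) that constructs $L_k$ and $D_k$ explicitly for the column-major $\mathrm{vec}$ and lower-triangular $\mathrm{vech}$ conventions, and checks the identities $L_kD_k=I$ and $D_kL_k\,\mathrm{vec}(A)=\mathrm{vec}(A)$ (for symmetric $A$) that the derivation relies on:

```python
import numpy as np

def elimination(k):
    # L_k: vech(A) = L_k vec(A), lower-triangular vech, column-major vec.
    pairs = [(i, j) for j in range(k) for i in range(j, k)]
    L = np.zeros((len(pairs), k * k))
    for r, (i, j) in enumerate(pairs):
        L[r, j * k + i] = 1.0                # column-major index of A[i, j]
    return L

def duplication(k):
    # D_k: vec(A) = D_k vech(A) for symmetric A.
    pairs = [(i, j) for j in range(k) for i in range(j, k)]
    pos = {p: r for r, p in enumerate(pairs)}
    D = np.zeros((k * k, len(pairs)))
    for j in range(k):
        for i in range(k):
            D[j * k + i, pos[(i, j)] if i >= j else pos[(j, i)]] = 1.0
    return D

k = 3
L, D = elimination(k), duplication(k)
m = k * (k + 1) // 2                         # = 6 for k = 3
A0 = np.arange(9.0).reshape(3, 3)
A0 = A0 + A0.T                               # symmetric test matrix
v = A0.reshape(-1, order="F")                # L @ D = I_m and D @ L @ v = v hold
```

With these in hand, the half-vec Jacobian is simply `L @ J @ D`, where `J` denotes the full $k^2\times k^2$ Jacobian $\partial x/\partial g$.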

P.S.



The reason I used $G$ for the log is that I wanted to use $L$ for the elimination matrix.



In the vectorization of the power series I used the fact that $G$ is symmetric to omit some transpose operations.

