Wednesday 8 June 2016

linear algebra - Finding an eigenvalue decomposition of a $2m\times 2m$ Hermitian matrix




Let $A$ be an $m\times m$ matrix with entries in $\mathbb{C}$ and with a singular value decomposition $A=U\Sigma V^*$. Find an eigenvalue decomposition of the $2m \times 2m$ Hermitian matrix: $$\begin{bmatrix} O &A^* \\ A & O\end{bmatrix}.$$




What I did so far is to denote $M = \begin{bmatrix} O &A^* \\ A & O\end{bmatrix}.$ I know that I need to find a diagonal matrix $\Lambda$ with the eigenvalues of $M$ and a matrix $X$ with eigenvectors of $M$ such that $M = X \Lambda X^{-1}$.



So $Mx = \lambda x \implies \begin{bmatrix} O & A^* \\ A & O \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} =\lambda \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \implies A^*x_2 =\lambda x_1$ and $Ax_1 = \lambda x_2$.




From here, I don't really know how to find the eigenvectors, or how to relate them to the SVD of $A$.



Is this a viable approach, and how should I proceed?


Answer



Note that if the SVD of $A$ is given by $A = U \Sigma V^*$, then multiplying on the right by $V$, and multiplying its conjugate transpose $A^* = V \Sigma U^*$ on the right by $U$ (here $\Sigma^* = \Sigma$, since $\Sigma$ is diagonal with real entries), gives the system of equations
\begin{align*}
AV &= U \Sigma, \\
A^*U &= V\Sigma.
\end{align*}

We can write the above system in terms of block matrices:
\begin{align*}
\begin{bmatrix}
0 & A^* \\
A & 0
\end{bmatrix}
\begin{bmatrix}
V \\
U
\end{bmatrix} =
\begin{bmatrix}
V \Sigma \\
U \Sigma
\end{bmatrix}.
\end{align*}
It is also easy to verify that
\begin{align*}
\begin{bmatrix}
0 & A^* \\
A & 0
\end{bmatrix}
\begin{bmatrix}
V \\
-U
\end{bmatrix} =
\begin{bmatrix}
-V \Sigma \\
U \Sigma
\end{bmatrix}.
\end{align*}
Putting these together, we have
\begin{align*}
\begin{bmatrix}
0 & A^* \\
A & 0
\end{bmatrix}
\begin{bmatrix}
V & V \\
U & -U
\end{bmatrix} =
\begin{bmatrix}
V \Sigma & -V \Sigma \\
U \Sigma & U \Sigma
\end{bmatrix} =
\begin{bmatrix}
V & -V \\
U & U
\end{bmatrix}
\begin{bmatrix}
\Sigma & 0 \\
0 & \Sigma
\end{bmatrix}.
\end{align*}
Pushing the negative sign into the diagonal matrix of singular values, we conclude
\begin{align*}
\begin{bmatrix}
0 & A^* \\
A & 0
\end{bmatrix}
\begin{bmatrix}
V & V \\
U & -U
\end{bmatrix} =
\begin{bmatrix}
V & V \\
U & -U
\end{bmatrix}
\begin{bmatrix}
\Sigma & 0 \\
0 & -\Sigma
\end{bmatrix}.
\end{align*}
Multiplying on the right by $X^{-1}$, where $X = \begin{bmatrix}
V & V \\
U & -U
\end{bmatrix}$, yields the desired eigenvalue decomposition $M = X \Lambda X^{-1}$ with $\Lambda = \begin{bmatrix}
\Sigma & 0 \\
0 & -\Sigma
\end{bmatrix}$. In fact, since $U$ and $V$ are unitary, $\frac{1}{\sqrt{2}}X$ is unitary, so $X^{-1} = \frac{1}{2}X^*$; the eigenvalues of $M$ are precisely the singular values of $A$ together with their negatives.
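
As a sanity check, the decomposition can be verified numerically. A minimal sketch with NumPy, assuming a randomly generated $4 \times 4$ complex matrix $A$ (the matrix and size are illustrative, not from the original post):

```python
import numpy as np

# A random complex m x m matrix (illustrative example data).
rng = np.random.default_rng(0)
m = 4
A = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))

# Full SVD: A = U @ diag(s) @ V^*  (NumPy returns V^* as Vh).
U, s, Vh = np.linalg.svd(A)
V = Vh.conj().T
Sigma = np.diag(s)

# The Hermitian block matrix M = [[0, A^*], [A, 0]].
Z = np.zeros((m, m), dtype=complex)
M = np.block([[Z, A.conj().T], [A, Z]])

# Eigenvector matrix X = [[V, V], [U, -U]] and Lambda = diag(Sigma, -Sigma).
X = np.block([[V, V], [U, -U]])
Lam = np.block([[Sigma, Z], [Z, -Sigma]])

# Verify M X = X Lambda, and that X / sqrt(2) is unitary.
Xn = X / np.sqrt(2)
print(np.allclose(M @ X, X @ Lam))                 # True
print(np.allclose(Xn.conj().T @ Xn, np.eye(2 * m)))  # True
```

The second check confirms that $X^{-1} = \frac{1}{2}X^*$, so the diagonalization can be computed without an explicit matrix inverse.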

