Tuesday, 3 December 2013

linear algebra - Why are singular values of "complex" matrices always real and non-negative?



I've already read several related questions on math.SE.





The conclusion they seem to agree on is the following:



For $A \in \mathbb{R}^{m\times n}$, the singular values are real and non-negative.




I do not see, however, how this can be the case for $A \in \mathbb{C}^{m\times n}$, even though most answers say it is the same as for real matrices.



The usual argument that the singular values of $A \in \mathbb{R}^{m\times n}$ are real and non-negative:



The SVD of $A$ is:
$$
A = U S V^T
$$

We have
$$
B = A^T A = V S^T U^T U S V^T = V S^T S V^T
$$

where $S^T S$ is diagonal with entries $\sigma_i^2$, the $\sigma_i$ being the singular values of $A$.
Now, the $\sigma_i^2$ are real and non-negative because they can be seen as the eigenvalues $\lambda_i=\sigma_i^2$ of the symmetric, positive semi-definite matrix $B = V \Lambda V^T$ (i.e., $\Lambda = S^T S$).



Using $\lambda_i=\sigma_i^2 \ge 0$ to solve for $\sigma_i$, we find that:




  • $\sigma_i$ is real: if $\sigma_i$ were complex, the only way $\sigma_i^2$ could be real is for $\sigma_i$ to be either real or purely imaginary, and in the latter case $\sigma_i^2$ is negative, which contradicts $\sigma_i^2 \ge 0$.

  • $\sigma_i$ is a square root of $\lambda_i$, and this square root can be taken positive or negative. By convention, however, we take the non-negative square root.




Hence, the $\sigma_i$ are themselves real and non-negative.
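As a sanity check on this argument, here is a minimal NumPy sketch (the matrix size and random seed are arbitrary choices) comparing the singular values of a real matrix with the eigenvalues of $B = A^T A$:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))          # arbitrary real 5x3 matrix

# Singular values as returned by the SVD (sorted in descending order).
sigma = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of B = A^T A, sorted descending to match the singular values.
lam = np.linalg.eigvalsh(A.T @ A)[::-1]

print(np.allclose(sigma**2, lam))        # True: sigma_i^2 = lambda_i
print(np.all(sigma >= 0))                # True: the sigma_i are non-negative
```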



An attempt at a similar argument for the singular values of $A \in \mathbb{C}^{m\times n}$:



The SVD of $A$ is:

$$
A = U S V^H
$$

We have
$$
B = A^H A = V S^H U^H U S V^H = V S^H S V^H
$$

where $S^H S$ is diagonal with entries $|\sigma_i|^2$, the $\sigma_i$ being the singular values of $A$.
Now, the $|\sigma_i|^2$ are real and non-negative because they are squared moduli, and they can also be seen as the eigenvalues $\lambda_i=|\sigma_i|^2$ of the Hermitian, positive semi-definite matrix $B = V \Lambda V^H$ (i.e., $\Lambda = S^H S$).
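The same numerical check works for a complex matrix; this is only a sketch with an arbitrary random matrix, but NumPy's SVD does return real, non-negative singular values whose squares match the eigenvalues of $B = A^H A$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))  # arbitrary complex matrix

sigma = np.linalg.svd(A, compute_uv=False)       # real, non-negative, descending
lam = np.linalg.eigvalsh(A.conj().T @ A)[::-1]   # eigenvalues of the Hermitian B = A^H A

print(sigma.dtype)                   # float64: the returned sigma_i are real
print(np.allclose(sigma**2, lam))    # True: sigma_i^2 = lambda_i
print(np.all(sigma >= 0))            # True
```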




Using $\lambda_i=|\sigma_i|^2 \ge 0$, how can one solve for $\sigma_i$ and prove that it is real and non-negative?



In particular, what prevents $\sigma_i$ from being complex?


Answer



The result is that a complex matrix $A$ has a factorization of the form $A=USV^H$, where $S$ is non-negative real diagonal and $U$ and $V$ are "unitary" in the sense that they may be rectangular but $U^HU$ and $V^HV$ are identity matrices. To produce it, one takes the positive square roots of the eigenvalues of $AA^H$ and $A^HA$.
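A minimal sketch of that construction (assuming, for simplicity, that $A$ has full column rank so every eigenvalue of $A^HA$ is strictly positive; the matrix itself is an arbitrary random choice):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))  # assumed full column rank

# Eigendecomposition of the Hermitian matrix A^H A: real eigenvalues >= 0.
lam, V = np.linalg.eigh(A.conj().T @ A)
lam, V = lam[::-1], V[:, ::-1]                 # descending order, as in the usual SVD convention

S = np.diag(np.sqrt(lam))                      # positive square roots: the singular values
U = A @ V @ np.diag(1.0 / np.sqrt(lam))        # needs all lam > 0, i.e. full column rank

print(np.allclose(U.conj().T @ U, np.eye(3)))  # True: U is "unitary" in the rectangular sense
print(np.allclose(U @ S @ V.conj().T, A))      # True: A = U S V^H with S non-negative real
```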



No one asserts that every factorization of $A$ of the form $USV^H$, with "unitary" $U$ and $V$ and diagonal $S$, must have all entries of $S$ non-negative real.



Take, for instance, factorizations of the form $A=UDD^{-1}SV^H$, where $D$ is diagonal with complex numbers of unit magnitude on the diagonal. If $U$ is "unitary", so is $UD$; if $S$ is diagonal, so is $D^{-1}S$. So if there is a standard SVD $A=USV^H$, there are non-standard factorizations too, whose diagonal middle factors are not non-negative real.
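For instance, here is a minimal NumPy sketch (arbitrary random matrix and phases) of such a non-standard factorization: the middle factor $D^{-1}S$ is still diagonal, but its entries are no longer non-negative real numbers.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

U, s, Vh = np.linalg.svd(A, full_matrices=False)     # standard SVD: s is real and non-negative

phases = np.exp(1j * rng.uniform(0, 2 * np.pi, size=s.size))  # unit-magnitude diagonal of D
U2 = U @ np.diag(phases)                 # U D: still "unitary"
S2 = np.diag(phases.conj() * s)          # D^{-1} S: diagonal, but with complex entries

print(np.allclose(U2.conj().T @ U2, np.eye(s.size)))  # True
print(np.allclose(U2 @ S2 @ Vh, A))                   # True: same A, different middle factor
print(np.round(np.diag(S2), 3))                       # generally complex, not non-negative real
```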




More concretely: suppose $A=USV^H$ is an SVD in the standard sense. Then $A=(-U)(-S)V^H$ also holds, and $-U$ is "unitary" since $U$ is; but $-S$ is not non-negative on its diagonal. The factorization $A=(-U)(-S)V^H$ is valid, but it is not a standard SVD.
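The same check for this concrete example (again only a sketch with an arbitrary random matrix):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

U, s, Vh = np.linalg.svd(A, full_matrices=False)
S = np.diag(s)

print(np.allclose((-U) @ (-S) @ Vh, A))              # True: the factorization still holds
print(np.allclose((-U).conj().T @ (-U), np.eye(3)))  # True: -U is still "unitary"
print(np.all(np.diag(-S) <= 0))                      # True: -S is not a valid SVD middle factor
```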



So the answer boils down to this: the middle factor in an SVD is defined to be nonnegative real diagonal.

