Wednesday, 10 January 2018

linear algebra - Clarification on the meaning of "uniqueness" in the (linked) Matrix diagonalization theorem



I came across this source yesterday, where they say: given a square real matrix $S$ of size $n \times n$ with $n$ linearly independent eigenvectors, there exists an eigendecomposition $S = UDU^{-1}$. If the eigenvalues on the diagonal of $D$ are distinct (which I suppose means of multiplicity 1, correct?), this decomposition is unique.
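Just to make sure I understand the statement itself, here is a small NumPy sanity check I wrote (the matrix $S$ below is an arbitrary symmetric example of my own choosing, used only for illustration):

```python
import numpy as np

# An arbitrary real matrix with n = 3 linearly independent eigenvectors
# (symmetric, so its eigenvalues are guaranteed to be real).
S = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigvals, U = np.linalg.eig(S)   # columns of U are eigenvectors of S
D = np.diag(eigvals)            # eigenvalues on the diagonal of D

# Verify the decomposition S = U D U^{-1}
print(np.allclose(S, U @ D @ np.linalg.inv(U)))  # True
```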



I don't understand the claim that "this decomposition is unique". What do they mean by unique?



For example, take a square matrix and consider, in particular, the case where it is symmetric. If all eigenvalues have multiplicity 1, then each eigenvector associated with a given eigenvalue can be written as a scalar multiple of the normalized eigenvector for that eigenvalue. So, denoting by $x$ the normalized eigenvector, $cx$ with $c$ a nonzero scalar is still an eigenvector, and it remains linearly independent from the eigenvectors of the other eigenvalues (when all eigenvalues have multiplicity 1, the normalized eigenvectors are unique only up to sign, i.e. up to multiplication by $\pm 1$). So in that case the decomposition $S = UDU^{-1}$ is not unique: there exist multiple matrices $U$ such that $S = UDU^{-1}$, obtained by multiplying any eigenvector in the columns of $U$ by a nonzero scalar. Please correct me if I am wrong.
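To sanity-check this claim, let $C$ be any invertible diagonal matrix (a symbol I introduce here, whose diagonal entries are the scalars I multiply the columns of $U$ by). Since diagonal matrices commute,

$$(UC)\,D\,(UC)^{-1} = U\,C D C^{-1}\,U^{-1} = U D U^{-1} = S,$$

so $UC$ seems to diagonalize $S$ just as well as $U$, with the same $D$.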



In that source they often talk about standardized eigenvectors, so maybe they are referring to the special case where we consider only the standardized version of the eigenvectors, as specified in the theorem further down the same page about symmetric square matrices? Otherwise, what am I misinterpreting?



Answer



You are right that the decomposition is not unique, and the referenced text is wrong in claiming that it is. Certainly the change-of-basis matrix $U$ is not unique: its columns can be independently multiplied by nonzero scalars without changing the validity of the decomposition. The diagonal matrix has more claim to uniqueness, but its entries can still be permuted (provided the columns of $U$ are correspondingly permuted). However, the linked text takes that into account by requiring the diagonal entries to be decreasing. Note that this can only be done because the text assumes the existence of (distinct) real eigenvalues; if one wants to allow complex eigenvalues, this method will not work so easily.
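To make the non-uniqueness concrete, here is a small NumPy sketch (the matrix, the scale factors and the permutation are arbitrary choices of mine, used only for illustration): it rescales the columns of $U$, and separately permutes the eigenvalue/eigenvector pairs, and in both cases the product $UDU^{-1}$ still reproduces $S$.

```python
import numpy as np

# Arbitrary symmetric example matrix with distinct eigenvalues.
S = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

eigvals, U = np.linalg.eig(S)
D = np.diag(eigvals)

# 1) Rescale each column of U by an arbitrary nonzero scalar.
C = np.diag([2.0, -0.5, 3.0])
U_scaled = U @ C
print(np.allclose(S, U_scaled @ D @ np.linalg.inv(U_scaled)))  # True

# 2) Permute the eigenvalues and the corresponding columns of U together.
perm = [2, 0, 1]
U_perm = U[:, perm]
D_perm = np.diag(eigvals[perm])
print(np.allclose(S, U_perm @ D_perm @ np.linalg.inv(U_perm)))  # True
```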

