Thursday 22 December 2016

linear algebra - When is matrix multiplication commutative?



I know that matrix multiplication in general is not commutative. So, in general:




$A, B \in \mathbb{R}^{n \times n}: A \cdot B \neq B \cdot A$



But for some matrices this equation does hold, e.g. $A = I$ (the identity matrix) or $A = 0$ (the zero matrix), for all $B \in \mathbb{R}^{n \times n}$.



I think I remember that there is a group of special matrices (was it $O(n)$, the group of orthogonal matrices?) for which matrix multiplication is commutative.
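As a quick numerical sanity check (a sketch of my own, not part of the original question; the two rotation matrices are just illustrative examples), orthogonality alone does not seem to be enough: two rotations about different axes in $\mathbb{R}^3$ are orthogonal but do not commute.

```python
import numpy as np

# Two 90-degree rotations: about the x-axis and about the z-axis.
Rx = np.array([[1.0, 0.0,  0.0],
               [0.0, 0.0, -1.0],
               [0.0, 1.0,  0.0]])
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])

# Both are orthogonal: R R^T = I.
print(np.allclose(Rx @ Rx.T, np.eye(3)), np.allclose(Rz @ Rz.T, np.eye(3)))  # True True

# Yet they do not commute.
print(np.allclose(Rx @ Rz, Rz @ Rx))  # False
```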



For which matrices $A, B \in \mathbb{R}^{n \times n}$ is $A\cdot B = B \cdot A$?


Answer



Two matrices that are simultaneously diagonalizable always commute.




Proof: Let $A$, $B$ be two such $n \times n$ matrices over a base field $\mathbb{K}$, and let $v_1, \ldots, v_n$ be a basis of eigenvectors for $A$. Since $A$ and $B$ are simultaneously diagonalizable, such a basis exists and is also a basis of eigenvectors for $B$. Denote the corresponding eigenvalues of $A$ by $\lambda_1, \ldots, \lambda_n$ and those of $B$ by $\mu_1, \ldots, \mu_n$.



Then there is an invertible matrix $T$ whose columns are $v_1, \ldots, v_n$ such that $T^{-1} A T =: D_A$ and $T^{-1} B T =: D_B$ are diagonal matrices. Since $D_A$ and $D_B$ are diagonal, they commute: the off-diagonal entries of both products vanish, and $(D_A D_B)_{ii} = \lambda_i \mu_i = \mu_i \lambda_i = (D_B D_A)_{ii}$. Hence $$AB = T D_A T^{-1} T D_B T^{-1} = T D_A D_B T^{-1} = T D_B D_A T^{-1} = T D_B T^{-1} T D_A T^{-1} = BA.$$
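To make the argument concrete, here is a minimal numerical sketch (NumPy, with names chosen to mirror the proof; the random matrices are only illustrative) that builds $A = T D_A T^{-1}$ and $B = T D_B T^{-1}$ from a shared eigenvector matrix $T$ and checks that $AB = BA$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Shared eigenvector matrix T (a random Gaussian matrix is almost surely invertible),
# and two diagonal matrices of eigenvalues lambda_i and mu_i.
T = rng.standard_normal((n, n))
D_A = np.diag(rng.standard_normal(n))
D_B = np.diag(rng.standard_normal(n))

T_inv = np.linalg.inv(T)
A = T @ D_A @ T_inv
B = T @ D_B @ T_inv

# The two products agree up to floating-point error.
print(np.allclose(A @ B, B @ A))  # True
```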

