Friday 18 October 2019

linear algebra - Prove that a square matrix commutes with its inverse

The Question:

This is a very fundamental and commonly used result in linear algebra, but I haven't been able to find a proof or prove it myself. The statement is as follows:

Let $A$ be an $n\times n$ square matrix, and suppose that $B=\operatorname{LeftInv}(A)$ is a matrix such that $BA=I$. Prove that $AB=I$; that is, prove that a matrix commutes with its inverse, so that a left inverse is also a right inverse.

My thoughts so far:

This is particularly annoying to me because it seems like it should be easy.

We have a similar statement for group multiplication, but there the two-sidedness of inverses is usually part of the definition. Does this property necessarily follow from the associativity of multiplication? I've noticed that associativity gives
$$
\left(A\operatorname{LeftInv}(A)\right)A=A\left(\operatorname{LeftInv}(A)A\right).
$$
But is that enough?
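In fact, a standard example shows that associativity alone is not enough: on the space of infinite sequences, the shift maps
$$
L(x_1,x_2,x_3,\dots)=(x_2,x_3,\dots),\qquad R(x_1,x_2,x_3,\dots)=(0,x_1,x_2,\dots)
$$
satisfy $LR=I$ but $RL\neq I$, so a proof will have to use the fact that $A$ is a finite square matrix.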

It might help to talk about generalized inverses.

Answer

Your notation $A^{-1}$ is confusing because it makes you think of a two-sided inverse, when all we know so far is that it is a left inverse.

Let's call $B$ the matrix such that $BA=I$. You want to prove $AB=I$.

First, you need to prove that there is a $C$ such that $AC=I$. One way to do that is with the determinant, though it is not the only one. [EDIT] There are several methods here. The simplest (imo) is the one using the fact that the matrix has full rank. [/EDIT]
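To spell out the full-rank argument: if $Ax=0$, then
$$
x=Ix=(BA)x=B(Ax)=B0=0,
$$
so $A$ has trivial kernel and, by the rank-nullity theorem, $\operatorname{rank}A=n$. Hence $A$ is surjective, so each standard basis vector can be written as $e_i=Ac_i$; the matrix $C$ with columns $c_1,\dots,c_n$ then satisfies $AC=I$.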

Then you have $B=BI=B(AC)=(BA)C=IC=C$, so $B=C$ and therefore $AB=AC=I$.
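As a quick numerical illustration (a minimal NumPy sketch, not part of the original answer), one can compute a left inverse of a random square matrix and check that it also works as a right inverse:

```python
import numpy as np

# Sanity check (not a proof): compute a left inverse B of a random
# square matrix A, then verify that it is also a right inverse.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # a random n x n matrix, almost surely invertible

# BA = I is equivalent to A^T B^T = I, so solve A^T X = I and transpose.
B = np.linalg.solve(A.T, np.eye(n)).T

assert np.allclose(B @ A, np.eye(n))  # B is a left inverse of A
assert np.allclose(A @ B, np.eye(n))  # ...and also a right inverse
```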

