Monday, 9 May 2016

linear algebra - proof: inverse of a unit lower triangular matrix

As you know, to invert a unit lower triangular matrix of this kind (the identity plus multipliers in a single column, as in Gauss elimination), it is enough to negate the entries below the diagonal.




For example:



$$A = \left(\begin{matrix}
1 & 0 & 0 & 0 \\
3 & 1 & 0 & 0 \\
-1 & 0 & 1 & 0 \\
2 & 0 & 0 & 1 \\
\end{matrix}\right)
$$




The inverse of $A$ is then



$$A' = \left(
\begin{matrix}
1 & 0 & 0 & 0 \\
-3 & 1 & 0 & 0 \\
1 & 0 & 1 & 0 \\
-2 & 0 & 0 & 1 \\
\end{matrix}\right)
$$
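For this concrete example the claim can be checked directly by multiplying the two matrices:

$$A'A = \left(\begin{matrix}
1 & 0 & 0 & 0 \\
-3 & 1 & 0 & 0 \\
1 & 0 & 1 & 0 \\
-2 & 0 & 0 & 1 \\
\end{matrix}\right)
\left(\begin{matrix}
1 & 0 & 0 & 0 \\
3 & 1 & 0 & 0 \\
-1 & 0 & 1 & 0 \\
2 & 0 & 0 & 1 \\
\end{matrix}\right)
= \left(\begin{matrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1 \\
\end{matrix}\right) = I
$$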




I just need to prove it.



My question is not related to any software; it is a general linear algebra question. It is not enough to say that if $A'$ is the inverse of $A$, then the product of $A'$ and $A$ should be $I$ (the identity matrix), because that only verifies a single case. I need a general proof.



This is related to Gaussian elimination and LU decomposition.
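One standard sketch, assuming (as in the example above and in each step of Gauss elimination) that all nonzero below-diagonal entries of $A$ sit in a single column $k$: write $A = I + \ell\,e_k^T$, where $e_k$ is the $k$-th standard basis vector and $\ell$ is the column vector of multipliers, with $\ell_1 = \dots = \ell_k = 0$. Then

$$(I - \ell\,e_k^T)(I + \ell\,e_k^T)
= I + \ell\,e_k^T - \ell\,e_k^T - \ell\,(e_k^T\ell)\,e_k^T
= I,
$$

because $e_k^T\ell = \ell_k = 0$. Hence $A^{-1} = I - \ell\,e_k^T$, which is exactly $A$ with its below-diagonal entries negated. In the example above, $k = 1$ and $\ell = (0, 3, -1, 2)^T$.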



Thank you for any help.

