Friday 25 November 2016

linear algebra - Decompose $A$ to the product of elementary matrices. Use this to find $A^{-1}$



Decompose $A$ to the product of elementary matrices. Use this to find $A^{-1}$
$$
A =
\begin{bmatrix}
3 & 1\\
1 & -2
\end{bmatrix}
$$



I understand how to reduce this to row echelon form, but I'm not sure what it means to decompose it into a product of elementary matrices. I sort of know what elementary matrices are (a matrix with a single row operation applied to it), but I'm not sure what a product of them means. Could someone demonstrate with an example, please? It'd be very helpful.


Answer



An elementary matrix $E$ is a square matrix obtained by performing a single elementary row operation on $I$. For example, $\left[ \begin{matrix} 1 & 0 \\ 1 & 1 \end{matrix}\right]$ is an elementary matrix because adding the first row of $I$ to its second row gives this matrix. Moreover, multiplying on the left by an elementary matrix is equivalent to performing the corresponding elementary row operation. Thus, with $A$ as in your question and $E$ as above, we have that
$$EA=\left[ \begin{matrix} 1& 0 \\ 1 & 1 \end{matrix}\right]\left[ \begin{matrix} 3 & 1 \\ 1 & -2 \end{matrix}\right]=\left[ \begin{matrix} 3 & 1 \\ 3+1 & 1+(-2) \end{matrix}\right] =\left[ \begin{matrix} 3 & 1 \\ 4 & -1 \end{matrix}\right]$$
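As a quick numerical sanity check (a minimal sketch, assuming NumPy is available; the names E and A below are just labels for the matrices above), left-multiplying by $E$ does exactly what the row operation does:

```python
import numpy as np

# E encodes "add row 1 to row 2"; A is the matrix from the question.
E = np.array([[1, 0],
              [1, 1]])
A = np.array([[3, 1],
              [1, -2]])

print(E @ A)
# [[ 3  1]
#  [ 4 -1]]  -- the same result as applying the row operation directly to A
```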



A square matrix $A$ is invertible if and only if it can be reduced to the identity matrix, which is to say that multiplying on the left by finitely many elementary matrices gives the identity: $$E_nE_{n-1}\cdots E_2E_1A=I,$$ so that $$A=E_1^{-1}E_2^{-1}\cdots E_{n-1}^{-1}E_n^{-1}.$$ In particular, $A^{-1}=E_nE_{n-1}\cdots E_2E_1$, which is how the decomposition is used to find the inverse.

This works because every elementary matrix is invertible: its inverse is the elementary matrix of the row operation that undoes the original one.
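For instance, with $E$ as above (which adds the first row to the second), the inverse is the elementary matrix that subtracts the first row from the second:
$$E^{-1}=\left[ \begin{matrix} 1 & 0 \\ -1 & 1 \end{matrix}\right],\qquad EE^{-1}=\left[ \begin{matrix} 1 & 0 \\ 1 & 1 \end{matrix}\right]\left[ \begin{matrix} 1 & 0 \\ -1 & 1 \end{matrix}\right]=\left[ \begin{matrix} 1 & 0 \\ 0 & 1 \end{matrix}\right]=I.$$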



Thus, the first step is to row-reduce $A$ to the identity $I$, keeping track of the operations used, and then to multiply the corresponding inverse elementary matrices in the opposite order, as indicated above.



For example, take $A=\left[ \begin{matrix} 1 & 2 \\ 2 & 1 \end{matrix}\right]$. We can reduce this to $I$ by subtracting two times the second row from the first row, giving us $\left[ \begin{matrix} -3 & 0 \\ 2 & 1 \end{matrix}\right]$. We can then add $2/3$ of the first row to the second to get $\left[ \begin{matrix} -3 & 0 \\ 0 & 1 \end{matrix}\right]$. Finally, we multiply the first row by $-1/3$ to reach $I$. Inverting these three operations and multiplying in the opposite order gives the decomposition
$$\left[ \begin{matrix} 1 & 2 \\ 2 & 1 \end{matrix}\right] =\left[ \begin{matrix} 1 & 2 \\ 0 & 1 \end{matrix}\right] \left[ \begin{matrix} 1 & 0 \\ -2/3 & 1 \end{matrix}\right] \left[ \begin{matrix} -3 & 0 \\ 0 & 1 \end{matrix}\right]$$
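As an optional check (a minimal sketch, assuming SymPy is available; E1, E2, E3 are just labels for the three row operations above), we can verify both the decomposition and the resulting inverse:

```python
from sympy import Matrix, Rational, eye

# The three row operations above, written as elementary matrices:
# E1: R1 -> R1 - 2*R2,   E2: R2 -> R2 + (2/3)*R1,   E3: R1 -> (-1/3)*R1
E1 = Matrix([[1, -2], [0, 1]])
E2 = Matrix([[1, 0], [Rational(2, 3), 1]])
E3 = Matrix([[Rational(-1, 3), 0], [0, 1]])

A = Matrix([[1, 2], [2, 1]])

print(E3 * E2 * E1 * A == eye(2))           # True: the operations reduce A to I
print(E1.inv() * E2.inv() * E3.inv() == A)  # True: the decomposition of A
print(E3 * E2 * E1 == A.inv())              # True: the product of the E's is A^{-1}
```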

