Thursday 27 December 2018

Proof of determinant properties for matrices of any order



I was told that the determinant of a square matrix can be expanded along any row or column, and I was shown a proof (by expanding in all possible ways) only for square matrices of order 2 and 3.





  • Is a general proof for any order even possible?

  • If so, how is this done?

  • On a similar note, how can we prove the various properties of determinants for square matrices of any order, such as the following:




    • Swapping two rows/columns only changes the sign of the determinant.

    • $R_1 \to R_1+ aR_2$ does not change the determinant.

    • Determinant of the transpose is the same as the determinant of the original matrix.




Answer



Here is one possible path. We define the determinant recursively:




  • if $A$ is $1\times 1$, let $\det A=A_{11}$;


  • If $A$ is $(n+1)\times (n+1)$, let
    $$
    \det A=\sum_{k=1}^{n+1} (-1)^{k+1}A_{k1}\,M_{k1}^A,
    $$
    where $M_{k1}^A$ is the determinant of the $n\times n$ matrix obtained by removing the $k^{\rm th}$ row and the first column of $A$.
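This recursive definition transcribes almost verbatim into code. Here is a minimal sketch (the function name and use of nested lists are my own choices, not part of the answer); with 0-based indexing the sign $(-1)^{k+1}$ becomes $(-1)^k$:

```python
def det(A):
    """Determinant by cofactor expansion along the first column,
    following the recursive definition above. A is a list of rows."""
    n = len(A)
    if n == 1:
        return A[0][0]  # base case: 1x1 matrix
    total = 0
    for k in range(n):
        # minor M_{k1}: delete row k and the first column
        minor = [row[1:] for i, row in enumerate(A) if i != k]
        total += (-1) ** k * A[k][0] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))  # 1*4 - 3*2 = -2
```

On integer (or `fractions.Fraction`) entries this is exact, which makes it handy for checking the properties below, though the recursion costs $O(n!)$ and is impractical beyond small $n$.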




Now,




  1. Show that if $B$ is obtained from $A$ by multiplying a row by $\alpha$, then $$\det B=\alpha\,\det A.$$ This is done by induction very easily.


  2. Show that if we have $A,B,C$ with $A_{rj}=B_{rj}+C_{rj}$ for all $j$, and $A_{kj}=B_{kj}=C_{kj}$ when $k\ne r$ and for all $j$, then
    $$\det A=\det B+\det C.$$ Again this is done by induction. When $r=1$ the equality follows trivially from the definition of determinant (as the minors of $A,B,C$ will all be equal), and when $r\ne 1$ we use induction.



  3. Show that if $B$ is obtained from $A$ by swapping two rows, then $$\det B=-\det A.$$ Here one first swaps rows $1$ and $r$, and then any other swapping of two rows $r$ and $s$ can be achieved by three swaps ($r$ to $1$, $s$ to $1$, $r$ to $1$). This can be used to show that one can calculate the determinant along any row (swap it with row 1).


  4. It now follows that if $A$ has two equal rows, then $\det A=0$ (because $\det A=-\det A$).


  5. If $B_{rj}=A_{rj}+\alpha A_{sj}$, and $B_{kj}=A_{kj}$ when $k\ne r$, then by 1. and 2.,
    $$\det B=\det A+\alpha\det C,$$ where $C$ is the matrix equal to $A$ but with the $s^{\rm th}$ row in place of the $r^{\rm th}$ row; by 4., $\det C=0$, so $$\det B=\det A.$$


  6. Now one considers the elementary matrices, and checks directly (using the above properties) that for any elementary matrix $E$, $$\det EA=\det E\,\det A.$$


  7. If $B$ is invertible, then $B$ can be written as a product of elementary matrices, $B=E_1E_2\cdots E_m$, and so
    \begin{align}
    \det BA&=\det (E_1E_2\cdots E_m A)=\det E_1\det E_2\cdots\det E_m\det A\\
    &=\det (E_1\cdots E_m)\det A=\det B\det A.
    \end{align}

    Similarly, $\det AB=\det A\det B$.


  8. If $B$ is not invertible: then $AB$ is not invertible either. For a non-invertible matrix, its reduced row echelon form has a row of zeroes, and so its determinant is zero; as we can move back to $B$ by row operations, which by the properties above only change the determinant by a sign or a nonzero factor, it follows that $\det B=0$; similarly, $\det AB=0$. So $$\det AB=\det A\det B$$ also when $B$ is not invertible.


  9. Knowing that det is multiplicative, we immediately get that, when $A$ is invertible, $$\det A^{-1}=\frac1{\det A}.$$


  10. An arbitrary matrix $A$ (working over $\mathbb C$) is similar to its Jordan form: $A=PJP^{-1}$. Then
    $$
    \det A=\det (PJP^{-1})=\det P\,\det J\,\frac1{\det P}=\det J.
    $$
    As $J$ is triangular with the eigenvalues of $A$ (counting multiplicities) in its diagonal, we get that
    $$
    \det A=\lambda_1\cdots\lambda_n,
    $$
    where $\lambda_1,\ldots,\lambda_n$ are the eigenvalues of $A$, counting multiplicities.


  11. Since the eigenvalues of $A^T$ are the same as those of $A$, we get
    $$
    \det A^T=\det A.
    $$


  12. Now, everything we did for rows, we can do for columns by working on the transpose. In particular, we can calculate the determinant along any column.
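The proved properties are easy to sanity-check numerically. This sketch (a check, not a proof; all helper names are my own) re-implements the recursive determinant in exact integer arithmetic and tests the row-swap sign (step 3), invariance under $R_1\to R_1+\alpha R_2$ (step 5), multiplicativity (steps 7–8), and the transpose property (step 11) on random integer matrices:

```python
import random

def det(A):
    # cofactor expansion along the first column, as in the definition
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** k * A[k][0]
               * det([r[1:] for i, r in enumerate(A) if i != k])
               for k in range(n))

def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

random.seed(0)
n = 4
A = [[random.randint(-3, 3) for _ in range(n)] for _ in range(n)]
B = [[random.randint(-3, 3) for _ in range(n)] for _ in range(n)]

# step 3: swapping two rows flips the sign
S = [row[:] for row in A]
S[0], S[2] = S[2], S[0]
assert det(S) == -det(A)

# step 5: R_1 -> R_1 + 5*R_2 leaves the determinant unchanged
C = [row[:] for row in A]
C[0] = [C[0][j] + 5 * C[1][j] for j in range(n)]
assert det(C) == det(A)

# steps 7-8: det is multiplicative
assert det(matmul(A, B)) == det(A) * det(B)

# step 11: the transpose has the same determinant
assert det(transpose(A)) == det(A)
print("all properties hold on this sample")
```

Since the entries are integers, every comparison is exact; no floating-point tolerance is needed.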
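Step 10 (the determinant as the product of the eigenvalues) can likewise be checked in floating point, e.g. with NumPy; for a real matrix the complex eigenvalues come in conjugate pairs, so their product is real up to rounding error:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))          # random real 5x5 matrix

eigs = np.linalg.eigvals(A)              # eigenvalues, with multiplicity
# product of eigenvalues should equal det A (step 10)
assert np.isclose(np.prod(eigs), np.linalg.det(A))
```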


