Thursday, 6 April 2017

matrices - Determining linear independence

I am having trouble with the dozens and dozens of rules for determining dependence, independence, generating sets, and whether a system is consistent or inconsistent. I know that these are all very closely related, and I know that generating sets and independence are also very close.



I have a list of rules for independence that I wrote down during class, but I am not sure if they are correct.



For an $n \times k$ matrix $A$:



A) The columns of $A$ are linearly independent if the rank of $A$ is equal to $k$.




B) If $n > k$, the columns can't span $R^n$. I am not sure what this means. (For instance, $k = 2$ column vectors in $R^3$ can span at most a plane, never all of $R^3$.)



C) If $k > n$, then the columns are linearly dependent (i.e., $Ax = 0$ has a non-zero solution).
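To make rule C concrete, here is a tiny worked instance of my own (not one from the class notes): take $k = 3$ column vectors in $R^2$, so $k > n$.

$$A = \begin{bmatrix} 1 & 0 & 1 \\[0.3em] 0 & 1 & 1 \end{bmatrix}, \qquad A \begin{bmatrix} 1 \\[0.3em] 1 \\[0.3em] -1 \end{bmatrix} = \begin{bmatrix} 1 + 0 - 1 \\[0.3em] 0 + 1 - 1 \end{bmatrix} = \begin{bmatrix} 0 \\[0.3em] 0 \end{bmatrix},$$

so $x = (1, 1, -1)$ is a non-zero solution of $Ax = 0$ and the three columns are dependent.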



So from this I think I can conclude that the columns of a square matrix are independent if, in its reduced row echelon form, each column has a pivot. Is this wrong?
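As a sanity check on that criterion, here is a minimal sketch using SymPy (my choice of tool, not something from the course). `Matrix.rref()` returns the reduced row echelon form together with the indices of the pivot columns, so the criterion becomes a one-line test.

```python
from sympy import Matrix

# A toy 3x3 matrix of my own (not the one from the example below).
A = Matrix([[1, 0, 2],
            [0, 1, 3],
            [0, 0, 1]])

rref_form, pivot_cols = A.rref()   # RREF plus the tuple of pivot column indices
print(rref_form)                   # the 3x3 identity matrix
print(pivot_cols)                  # (0, 1, 2): a pivot in every column
print(len(pivot_cols) == A.cols)   # True -> the columns are independent
```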



For example, I have a square matrix (rows = columns):



$$\begin{bmatrix}
1 & -1 & 1 \\[0.3em]
-1 & 0 & 2 \\[0.3em]
-2 & 1 & 1
\end{bmatrix}$$



To get the reduced row echelon form I perform the following row operations:



$R_2 \to R_1 + R_2$:



$$\begin{bmatrix}
1 & -1 & 1 \\[0.3em]
0 & -1 & 3 \\[0.3em]
-2 & 1 & 1
\end{bmatrix}$$
$R_3 \to 2R_1 + R_3$:



$$\begin{bmatrix}
1 & -1 & 1 \\[0.3em]
0 & -1 & 3 \\[0.3em]
0 & 0 & 2
\end{bmatrix}$$



And from here it is fairly trivial to get it into reduced row echelon form, so my rank is 3 and I have 3 columns. Why is it not independent?
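For questions like this, a machine cross-check of the rank is a useful guard against arithmetic slips in the row reduction. A minimal sketch, again assuming SymPy (it works over exact rationals, so there is no floating-point fuzz in the rank):

```python
from sympy import Matrix

# The matrix from the question, entered exactly.
A = Matrix([[1, -1, 1],
            [-1, 0, 2],
            [-2, 1, 1]])

print(A.rank())  # prints 2, not 3
print(A.det())   # prints 0: independent columns would force a nonzero determinant
```

If the printed rank disagrees with the hand computation, one of the row operations is worth redoing; note that $2R_1 + R_3$ applied to the third row gives $(2 - 2,\ -2 + 1,\ 2 + 1) = (0, -1, 3)$ rather than $(0, 0, 2)$.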
