Sunday, 22 December 2013

linear algebra - Prove that if a set of vectors is linearly independent, then any subset of it is linearly independent as well.



This one is quite straightforward, but I just want to make sure that my reasoning is clear.



I have the following proposition:




Proposition. If $S = \{v_1, v_2, \dots, v_n\}$ is linearly independent, then any subset $T = \{v_1, v_2, \dots, v_m\}$, where $m < n$, is also linearly independent.





My attempt:



We prove the proposition by proving its contrapositive.



Suppose $T$ is linearly dependent. Then there exist scalars $k_1, k_2, \dots, k_m$, not all zero, such that

$$k_1v_1 + k_2v_2 + \dots + k_mv_m = O. \tag{1}$$

In particular, there is at least one scalar, call it $k_j$, with $k_j = a$ and $a \neq 0$.

Since $T$ is a subset of $S$, a linear combination of the vectors in $S$ has the form

$$(k_1v_1 + k_2v_2 + \dots + k_mv_m) + k_{m+1}v_{m+1} + \dots + k_nv_n = O.$$

Keep the scalars $k_1, \dots, k_m$ from (1) and set all the remaining scalars $k_{m+1} = \dots = k_n = 0$:

$$\underbrace{(k_1v_1 + k_2v_2 + \dots + k_mv_m)}_{=\,O \text{ by } (1)} + \underbrace{0v_{m+1} + \dots + 0v_n}_{\text{all scalars } =\, 0} = O.$$



We can see that this linear combination of the vectors of $S$ equals zero while at least one scalar, $k_j = a$, is non-zero, which implies that $S$ is not linearly independent, which is a contradiction. Therefore, if $S$ is linearly independent, an arbitrary subset $T$ must be linearly independent as well.
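The proposition can also be sanity-checked numerically for a concrete example (an illustrative sketch, not part of the proof): a set of column vectors is linearly independent exactly when the matrix they form has full column rank, so NumPy's `matrix_rank` lets us confirm that a subset of an independent set stays independent.

```python
import numpy as np

# Columns of S are three linearly independent vectors in R^3
# (chosen here purely for illustration).
S = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Independence <=> rank equals the number of vectors (columns).
print(np.linalg.matrix_rank(S))   # full set S: rank 3

# The subset T = {v1, v2} (first two columns) remains independent.
T = S[:, :2]
print(np.linalg.matrix_rank(T))   # subset T: rank 2
```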







Is it correct?


Answer



I don't see anything wrong with your proof. Just be careful with the claim that you get a contradiction. The contrapositive of a statement is logically equivalent to the statement itself, so you don't get any contradiction whatsoever when proving a contrapositive.



If you were to use a proof by contradiction, you would start off by assuming that S is linearly independent but T is not, and show that it leads to some impossibility.
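To make the distinction concrete (an illustrative schematic, not part of the original answer), write $P$ for "$S$ is linearly independent" and $Q$ for "$T$ is linearly independent":

$$\text{contrapositive:}\quad (P \Rightarrow Q) \;\equiv\; (\neg Q \Rightarrow \neg P)$$

$$\text{contradiction:}\quad \text{assume } P \wedge \neg Q \text{ and derive } \bot.$$

The proof above assumes $\neg Q$ and derives $\neg P$ directly, so it is a proof of the contrapositive; no contradiction is involved.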

