Monday, 7 September 2015

linear algebra - Linearly independent subset of V ⊆ basis of V



The Theorem states:




Let V be a vector space that has a finite spanning set, and let S be a linearly independent subset of V. Then there exists a basis S' of V with S ⊆ S'.




I don't necessarily need a proof; I just want to build some intuition for it. Also, how would I use this theorem to solve questions like this one:



a) Find a basis of $\mathbb{R}^4$ containing the linearly independent set $S = \{(1,2,3,4),(-1,0,0,0)\}$.


Answer



The idea is that $S$ is a basis for some subspace $W_1\subseteq V$. If $W_1=V$, then $S$ is already a basis and we are done. Otherwise, choose any $v_1\in V\backslash W_1$; then $S\cup\{v_1\}$ is linearly independent and spans a subspace $W_2$ with $W_1\subset W_2\subseteq V$. If $W_2=V$ we are done; otherwise, we can find $v_2\in V\backslash W_2$ so that $S\cup\{v_1,v_2\}$ is linearly independent. Keep going until you have constructed a basis; since $V$ has a finite spanning set, the process terminates after finitely many steps.
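
To make the procedure concrete, here is a minimal computational sketch (the helper name extend_to_basis and the use of numpy's matrix_rank are illustrative assumptions, not part of the original answer): it specializes the argument to $\mathbb{R}^n$ and greedily appends standard basis vectors whenever they lie outside the current span, i.e. whenever appending them increases the rank.

    import numpy as np

    def extend_to_basis(vectors, n):
        # Greedily extend a linearly independent list of vectors in R^n to a basis
        # by appending standard basis vectors that lie outside the current span.
        basis = [np.asarray(v, dtype=float) for v in vectors]
        for i in range(n):
            e_i = np.zeros(n)
            e_i[i] = 1.0
            # e_i lies outside span(basis) exactly when appending it raises the rank
            if np.linalg.matrix_rank(np.vstack(basis + [e_i])) > len(basis):
                basis.append(e_i)
            if len(basis) == n:
                break
        return basis

    S = [(1, 2, 3, 4), (-1, 0, 0, 0)]
    for v in extend_to_basis(S, 4):
        print(v)

For the set $S$ above this sketch skips $(1,0,0,0)$, which already lies in the span of $S$, and then appends $(0,1,0,0)$ and $(0,0,1,0)$.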




In your particular example, find a vector that is not in the span of $S$ (e.g. $(0,1,0,0)$ is an easy choice). Then $S'=\{(1,2,3,4),(-1,0,0,0),(0,1,0,0)\}$ is linearly independent and spans a $3$-dimensional subspace. Now find a vector not in the span of $S'$; adding it gives four linearly independent vectors in $\mathbb{R}^4$, and the resulting set $S''$ is therefore a basis.
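
For instance (one possible choice among many), the vector $(0,0,1,0)$ is not in the span of $S'$, and one can confirm directly that the resulting set is a basis, since the matrix with the four vectors as rows has nonzero determinant:
$$\det\begin{pmatrix}1&2&3&4\\-1&0&0&0\\0&1&0&0\\0&0&1&0\end{pmatrix}=4\neq 0,$$
so $S''=\{(1,2,3,4),(-1,0,0,0),(0,1,0,0),(0,0,1,0)\}$ is a basis of $\mathbb{R}^4$.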


