Saturday, 6 December 2014

linear algebra - Does a set of basis vectors have to be linearly independent?



The definition of a basis for $R^n$ is a set of vectors such that 1) the set spans $R^n$, meaning any vector in $R^n$ can be written as a linear combination of vectors in the set, and 2) the set is linearly independent.
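For instance, the standard basis $\{(1,0), (0,1)\}$ of $R^2$ satisfies both conditions: every vector $(x,y)$ equals $x(1,0) + y(0,1)$, and neither vector is a scalar multiple of the other.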



Extending this to general vector spaces $V$, the article below states: "a set B of elements (vectors) in a vector space V is called a $\textbf{basis}$, if every element of V may be written in a unique way as a (finite) linear combination of elements of B."




Then, it goes on to say "B is a $\textbf{basis}$ if its elements are linearly independent and every element of V is a linear combination of elements of B. In more general terms, a basis is a linearly independent spanning set."



https://en.wikipedia.org/wiki/Basis_(linear_algebra)



So, is a set $B$ a basis only if every element of $V$ can be written as a linear combination of elements of $B$ and $B$ is also linearly independent? Or is $B$ a basis solely because every element of $V$ can be written as a linear combination of elements of $B$?


Answer



To remove any confusion:



A basis $B$ of a vector space $V$ is a subset $B \subset V$ such that





  • $\operatorname{span}(B) = V$ and

  • $B$ is linearly independent
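
A quick example shows why the second condition cannot be dropped. The set $B = \{(1,0),\,(0,1),\,(1,1)\}$ spans $R^2$, but it is not a basis: the vector $(1,1)$ has two different representations,
$$(1,1) = 1\cdot(1,0) + 1\cdot(0,1) = 1\cdot(1,1),$$
so representations are not unique. This is exactly the equivalence the Wikipedia article is using: requiring that every element of $V$ have a unique representation as a linear combination of elements of $B$ is the same as requiring that $B$ be a spanning set that is also linearly independent. So spanning alone is not enough; $B$ must also be linearly independent.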

