Thursday, 24 August 2017

linear algebra - Vector space basics: scalar times a nonzero vector = zero implies scalar = zero?



I'm working through a linear algebra text, starting right from the axioms. So far I've understood and proved for myself that, for any vector space $V$ over a field $F$:





  • the additive identity (zero vector) $\mathbf{0}$ in a vector space is unique

  • the additive inverses in a vector space are unique

  • the scalar zero times any vector is the zero vector: $\forall v \in V:\ 0v = \mathbf{0}$ (a derivation is sketched just after this list)

  • any scalar times the zero vector is the zero vector: $\forall a \in F:\ a\mathbf{0} = \mathbf{0}$
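
For reference, a standard derivation of the third fact, using only the field's additive identity, distributivity, and additive inverses in $V$:

$$\begin{aligned}
0v &= (0+0)v && (0 \text{ is the additive identity of } F)\\
&= 0v + 0v && \text{(distributivity)}\\
\mathbf{0} &= 0v && (\text{add } -(0v) \text{ to both sides})
\end{aligned}$$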



However, I'm stuck on the converse of the third statement: suppose $av = \mathbf{0}$ and $v \neq \mathbf{0}$; show that $a$ must equal $0$. In other words, show that the scalar zero is the only element of $F$ that can send a nonzero vector to $\mathbf{0}$, as in statement 3 of the list above.



It seems like such a simple thing but I'm not used to proving super basic statements like this axiomatically. My attempt so far is something like:




$$\begin{aligned}
av &= \mathbf{0} && (1)\\
av + \mathbf{0} &= \mathbf{0} && \text{(vector additive identity)}\\
av + av &= \mathbf{0} && \text{(substitute from (1))}\\
(a+a)v &= \mathbf{0} && \text{(distributive property)}\\
(2a)v &= \mathbf{0}\\
(2a)v &= av && \text{(substitute from (1))}\\
\therefore\ 2a &= a \implies a = 0
\end{aligned}$$




But I'm not sure I'm "allowed" to do that second-to-last step yet, given the things proved so far. I think it might just be a circular argument. Is it?
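
One way to make the worry precise: the step from $(2a)v = av$ to $2a = a$ silently cancels $v$, i.e., it assumes that

$$xv = yv \implies (x - y)v = \mathbf{0} \implies x - y = 0,$$

and the last implication is exactly the statement being proved, so the argument is circular as written.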



EDIT:



I figured it out with some prodding; it turned out I had all the pieces in front of me already but didn't realize it. Here it is for completeness:



Suppose $a \in F$, $v \in V$, and $av = \mathbf{0}$. Either $a = 0$ or $a \neq 0$. In the case where $a \neq 0$:



$$\begin{aligned}
av &= \mathbf{0}\\
\tfrac{1}{a}(av) &= \tfrac{1}{a}\,\mathbf{0}\\
\left(\tfrac{1}{a}\,a\right)v &= \mathbf{0}\\
1v &= \mathbf{0}\\
v &= \mathbf{0}
\end{aligned}$$



But then suppose further that $v \neq \mathbf{0}$. Then $a = 0$ by the contrapositive: the case above shows $a \neq 0 \implies v = \mathbf{0}$, which is equivalent to $v \neq \mathbf{0} \implies a = 0$.
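
Putting the two cases together, the full statement proved is

$$\forall a \in F,\ \forall v \in V:\quad av = \mathbf{0} \implies a = 0 \ \text{ or } \ v = \mathbf{0}.$$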


Answer



Hint: There was that axiom that said that $1v = v$, was there not?
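
Spelled out, the hint leads to the same computation as in the edit above: assuming $a \neq 0$,

$$v = 1v = \left(\tfrac{1}{a}\,a\right)v = \tfrac{1}{a}(av) = \tfrac{1}{a}\,\mathbf{0} = \mathbf{0}.$$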



