Edit: I really like the approach taken by $\mathbb{R}^n$ in the comments below. I posted this late last night and didn't get to ask why $\bf R(a \times b) = Ra \times Rb$ for any rotation matrix $\bf R$. I'm not sure how to prove that: besides trying to evaluate the determinant directly, I looked in a linear algebra book and tried using the "norm-preserving" property of rotation matrices to evaluate $\bf |R(a \times b) - Ra \times Rb|^2$, but didn't succeed. How do you do this?
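Not a proof, of course, but here is a quick numerical sanity check of the identity $\bf R(a \times b) = Ra \times Rb$, using a random proper rotation built via QR decomposition (a sketch in NumPy; the construction of $\bf R$ is just one convenient way to get a rotation matrix):

```python
import numpy as np

# Numerical sanity check (not a proof) that rotations commute with the
# cross product: R(a x b) = Ra x Rb for a proper rotation matrix R.
rng = np.random.default_rng(0)

# Build a random rotation: QR of a random matrix gives an orthogonal Q;
# flip one column if needed so that det(Q) = +1 (a proper rotation).
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1

a = rng.standard_normal(3)
b = rng.standard_normal(3)

lhs = Q @ np.cross(a, b)
rhs = np.cross(Q @ a, Q @ b)
print(np.allclose(lhs, rhs))  # True
```

Note that if you drop the determinant fix-up and allow $\det \mathbf R = -1$ (a reflection), the check fails: the two sides then differ by a sign, which is exactly why the identity is special to rotations.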
In one of my courses, a professor briefly summarized properties of the cross product at the start of last class. I realized that the assertion
$$ \bf a \times (b + c) = a\times b + a\times c$$
was actually surprising to me. Clearly this is fundamental (if you don't accept this, you can't derive a way of computing the cross product), but the proof is a bit slippery.
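To be clear about which part is easy: under the coordinate (determinant) definition, distributivity is a mechanical computation; the hard part is reconciling that definition with the geometric one. A symbolic check of the mechanical half, as a SymPy sketch:

```python
import sympy as sp

# Symbolic check that, under the coordinate definition, the cross
# product distributes over addition: a x (b + c) = a x b + a x c.
a = sp.Matrix(sp.symbols('a1 a2 a3'))
b = sp.Matrix(sp.symbols('b1 b2 b3'))
c = sp.Matrix(sp.symbols('c1 c2 c3'))

lhs = a.cross(b + c)
rhs = a.cross(b) + a.cross(c)
print(sp.simplify(lhs - rhs))  # the zero vector
```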
If you start from $\bf a \times b := \hat n |a| |b| \sin\theta$, where $\bf \hat n$ comes from the right-hand rule, and you can prove as a lemma that $\bf a \cdot (b \times c) = (a \times b) \cdot c$, then there's a neat little proof which I found here. But the only argument I've seen for the needed lemma is about the volume of a parallelepiped, which only convinces me that $\bf |a \cdot (b \times c)| = |(a \times b) \cdot c|$.
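In coordinates, the sign issue in the lemma disappears: both triple products expand to the same $3 \times 3$ determinant $\det[\bf a\ b\ c]$, so they agree exactly, not just in absolute value. A symbolic check of that (SymPy sketch):

```python
import sympy as sp

# Both triple products equal det([a b c]) under the coordinate
# definition, so they agree including sign, not just in absolute value.
a = sp.Matrix(sp.symbols('a1 a2 a3'))
b = sp.Matrix(sp.symbols('b1 b2 b3'))
c = sp.Matrix(sp.symbols('c1 c2 c3'))

lhs = a.dot(b.cross(c))          # a . (b x c)
rhs = (a.cross(b)).dot(c)        # (a x b) . c
det = sp.Matrix.hstack(a, b, c).det()

print(sp.simplify(lhs - rhs))  # 0
print(sp.simplify(lhs - det))  # 0
```

Of course this only settles the lemma for the coordinate definition; connecting it to the geometric definition is exactly the gap the question is about.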
I think I prefer the approach in one of my textbooks, which starts by defining the cross product by determinants - so, distributivity holds - and proves most of the needed properties. But it wusses out at a crucial point: "it can be proven that the orthogonal vector obtained from this matrix obeys the Right Hand Rule".
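One standard way to finish the textbook's claim is roughly a continuity argument: the determinant formula and the right-hand-rule vector both vary continuously as $\bf a$ and $\bf b$ are rotated, and any right-handed configuration can be rotated into the standard basis, so it suffices to check a single concrete case such as $\bf e_1 \times e_2 = e_3$. A minimal check of that base case (assuming NumPy's `cross` implements the determinant definition):

```python
import numpy as np

# Base case for the right-hand-rule claim: under the determinant
# definition, e1 x e2 = e3, matching the right-hand rule.
e1, e2, e3 = np.eye(3)
print(np.cross(e1, e2))  # [0. 0. 1.]
```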
Could somebody either prove that lemma, or the textbook claim? (Preferably the latter.)