I have the following question:
Let $A$ and $B$ be $3 × 3$ matrices with $det(A) = 3$ and $det(B) = 2$. Let $C = \frac{1}{2}A^{-1}B^3$ and let $D$ be the reduced row echelon form of $C$. Then:
$(a)$ $det(C)=\frac{4}{3}$, $det(D)=1$
$(b)$ $det(C)=\frac{1}{3}$, $det(D)=1$
$(c)$ $det(C)=\frac{4}{3}$, $det(D)=\frac{4}{3}$
$(d)$ $det(C)=\frac{1}{3}$, $det(D)=3$
$(e)$ $det(C)=\frac{1}{3}$, $det(D)=\frac{1}{3}$
The answer is supposed to be $(b)$. I know $det(C)=\frac{1}{3}$ just from determinant properties; that part was easy. I'm not 100% sure how the RREF $D$ comes into play here. I know that elementary row operations affect the determinant, but HOW do they affect it here?
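For reference, my computation of $det(C)$ was roughly this, using only two standard facts: $\det(MN)=\det(M)\det(N)$ and $\det(kM)=k^{n}\det(M)$ for an $n \times n$ matrix:

$$\det(C) = \det\left(\tfrac{1}{2}A^{-1}B^{3}\right) = \left(\tfrac{1}{2}\right)^{3}\det(A)^{-1}\det(B)^{3} = \tfrac{1}{8}\cdot\tfrac{1}{3}\cdot 8 = \tfrac{1}{3}.$$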
Can someone provide any guidance as to how I would calculate $det(D)$?
Answer
Since $det(C) = \frac{1}{3} \neq 0$, the matrix $C$ is non-singular, so its reduced row echelon form is just the identity $I$.
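In other words, you never need to track how the individual row operations scale the determinant; invertibility alone pins down $D$:

$$D = \operatorname{rref}(C) = I_3 \quad\Longrightarrow\quad \det(D) = \det(I_3) = 1,$$

which matches choice $(b)$.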