Let $L:V \rightarrow V$ be a linear map such that $L^3 = 0$ (i.e. $L^3$ is the zero map). Show that $I-L$ is invertible and find $(I-L)^{-1}$ as a polynomial in $L$.
This question is giving me fits. How do I show this? Furthermore, how do I find the inverse as a polynomial in $L$? I know, by the Invertible Matrix Theorem, that the following are equivalent for an $n \times n$ square matrix $A$:
- $A$ is an invertible matrix.
- $A$ is row equivalent to the $n \times n$ identity matrix.
- $A$ has $n$ pivot positions.
- $Ax=0$ has only the trivial solution.
- The columns of $A$ form a linearly independent set.
- The linear transformation $x \mapsto Ax$ is one-to-one.
- The columns of $A$ span $\mathbb{R}^n$.
- The linear transformation $x \mapsto Ax$ maps $\mathbb{R}^n$ onto $\mathbb{R}^n$.
- There is an $n \times n$ matrix $C$ such that $CA=I$.
- There is an $n \times n$ matrix $D$ such that $AD=I$.
- $A^T$ is an invertible matrix.
and so on.
I'm new to linear algebra; usually I can give a bit more in my questions.
Any help is appreciated.
Answer
Suppose
$L^k = 0, \; k \ge 1; \tag 1$
then consider the identity, which holds for any $m \ge 1$,
$L^m - I = (L - I)(\displaystyle \sum_{j = 0}^{m - 1} L^j) = (L - I)(L^{m - 1} + L^{m - 2} + \ldots + L + I); \tag 2$
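In fact, (2) can be seen directly, without induction: multiplying out the right-hand side makes the sum telescope,

$(L - I)\displaystyle \sum_{j = 0}^{m - 1} L^j = \sum_{j = 0}^{m - 1} L^{j + 1} - \sum_{j = 0}^{m - 1} L^j = L^m - I,$

since each of the powers $L, L^2, \ldots, L^{m - 1}$ occurs once with each sign and so cancels, leaving only $L^m$ and $-I$.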
this telescoping computation (or, if you prefer, induction on $m$) proves (2), which is quite likely familiar to the reader either from high-school algebra or the study of roots of unity in field theory. Be that as it may, with (1) in place we see that (2) becomes, with $m = k$,
$-I = (L - I)(L^{k - 1} + L^{k - 2} + \ldots + L + I), \tag 3$
which, after multiplying both sides by $-1$ and noting that $-(L - I) = I - L$, reads $I = (I - L)(L^{k - 1} + \ldots + L + I)$; since both factors are polynomials in $L$ and hence commute, this shows that $I - L$ is invertible with inverse
$(I - L)^{-1} = L^{k - 1} + L^{k - 2} + \ldots + L + I. \tag 4$
The particular case at hand may be resolved by taking $k = 3$, which gives $(I - L)^{-1} = L^2 + L + I$.
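If you want to convince yourself numerically, here is a minimal sanity check in Python (a sketch assuming NumPy; the $3 \times 3$ shift matrix used below is just one convenient nilpotent example with $L^3 = 0$, not part of the original problem):

```python
import numpy as np

# A concrete nilpotent example: the 3x3 "shift" matrix satisfies L^3 = 0.
L = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
assert np.allclose(np.linalg.matrix_power(L, 3), 0)  # check L^3 = 0

I = np.eye(3)
candidate = I + L + L @ L  # the claimed inverse I + L + L^2

# (I - L)(I + L + L^2) telescopes to I - L^3 = I, and the factors commute.
assert np.allclose((I - L) @ candidate, I)
assert np.allclose(candidate @ (I - L), I)

# Cross-check against NumPy's general-purpose inverse.
assert np.allclose(np.linalg.inv(I - L), candidate)
print("All checks passed: (I - L)^{-1} = I + L + L^2")
```

The same script works for any nilpotent $L$ you substitute in, provided you adjust the power in the first assertion and the polynomial accordingly.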