I do not know where to start with this exercise, although I have the "official" solution. In the solution I see that some variables are exchanged, but I cannot connect the steps into a coherent "story". Some direct help would be highly appreciated. I do know how to obtain the poles from a characteristic equation.
Exercise
Consider a matrix $ A \in \mathbb{R}^{n \times n} $ with the characteristic polynomial
$$ \det(sI-A)=a(s)=s^n+a_{n-1}s^{n-1}+ \cdots + a_1 s+a_0 $$
a) Show that if $ A $ has distinct eigenvalues $(\lambda_1, \lambda_2, \ldots , \lambda_n)$, the following relationship holds:
$$ \Lambda^n + a_{n-1}\Lambda^{n-1}+ \cdots + a_0 I = 0 $$
with $ \Lambda = \begin{pmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & \lambda_n
\end{pmatrix} $
b) Now show that
$$ A^n + a_{n-1}A^{n-1}+ \cdots + a_0 I = 0 $$
(This proves the Cayley-Hamilton Theorem for distinct eigenvalues.)
Hint: Use the fact that a matrix $A$ with distinct eigenvalues can be written as
$A = T \Lambda T^{-1}$, where $\Lambda$ is diagonal.
Solution:
a) The characteristic equation holds for every eigenvalue $\lambda_1, \ldots, \lambda_n$ of $A$:
$$ \lambda^n + a_{n-1}\lambda^{n-1}+\cdots + a_0 = 0 $$
$$ \Lambda = \begin{pmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & \lambda_n
\end{pmatrix} $$
Since $\Lambda$ is diagonal, $\Lambda^k$ is diagonal with entries $\lambda_i^k$, so the $i$-th diagonal entry of $ \Lambda^n + a_{n-1} \Lambda^{n-1}+ \cdots + a_0 I $ equals $a(\lambda_i)=0$; hence $ \Lambda^n + a_{n-1} \Lambda^{n-1}+ \cdots + a_0 I = 0 $
This is the matrix characteristic equation.
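As a quick sanity check (a concrete example of my own, not part of the official solution), take $n = 2$ with eigenvalues $\lambda_1 = 1$, $\lambda_2 = 2$, so that $a(s) = (s-1)(s-2) = s^2 - 3s + 2$. Then
$$ \Lambda^2 - 3\Lambda + 2I = \begin{pmatrix} 1 & 0 \\ 0 & 4 \end{pmatrix} - \begin{pmatrix} 3 & 0 \\ 0 & 6 \end{pmatrix} + \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} a(1) & 0 \\ 0 & a(2) \end{pmatrix} = 0. $$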
b) With distinct eigenvalues and diagonal $\Lambda$ we have
$$ A=T\Lambda T^{-1} \\ A^2 = T \Lambda T^{-1} T \Lambda T^{-1} = T \Lambda^2 T^{-1} \\ \vdots \\ A^m=T \Lambda^m T^{-1}$$
Multiply the matrix characteristic equation by $T$ from the left and by $T^{-1}$ from the right to obtain $$T \Lambda^n T^{-1} + a_{n-1}T \Lambda^{n-1} T^{-1}+\cdots + a_0 TT^{-1}= 0 $$
Substituting $T\Lambda^k T^{-1} = A^k$ and $TT^{-1} = I$ gives
$$ A^n + a_{n-1}A^{n-1}+\cdots+a_0 I = 0 $$
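To make b) concrete with the same illustrative polynomial (again my own example, not from the official solution): the matrix $A = \begin{pmatrix} 0 & 1 \\ -2 & 3 \end{pmatrix}$ also has $\det(sI-A) = s^2 - 3s + 2$, with distinct eigenvalues $1$ and $2$, and indeed
$$ A^2 - 3A + 2I = \begin{pmatrix} -2 & 3 \\ -6 & 7 \end{pmatrix} - \begin{pmatrix} 0 & 3 \\ -6 & 9 \end{pmatrix} + \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}. $$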
Answer
It seems that you have essentially done it already. Let
$ \det(sI-A)=a(s)=s^n+a_{n-1}s^{n-1}+ \cdots + a_1 s+a_0 $ be the characteristic polynomial. We know by assumption that it has exactly $n$ distinct roots, namely: $\lambda_1,\dots,\lambda_n$.
Let $ \Lambda = \begin{pmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & \lambda_n
\end{pmatrix} $ be the diagonal matrix whose diagonal entries are the roots of $a(s)$.
We want to show that $ \Lambda^n + a_{n-1}\Lambda^{n-1}+ \cdots + a_0 I = 0. $ Since
$$\Lambda^k = \begin{pmatrix}
\lambda_1^k & 0 & \cdots & 0 \\
0 & \lambda_2^k & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & \lambda_n^k
\end{pmatrix},$$
we get $$\Lambda^n + a_{n-1}\Lambda^{n-1}+ \cdots + a_0 I=
\begin{pmatrix}
a(\lambda_1) & 0 & \cdots & 0 \\
0 & a(\lambda_2) & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & a(\lambda_n)
\end{pmatrix}=
\begin{pmatrix}
0 & 0 & \cdots & 0 \\
0 & 0 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & \cdots & 0 & 0
\end{pmatrix}$$
because the $\lambda_i$ are roots of $a(s)$.
The second part asks us to prove the theorem for a matrix $A$ similar to $\Lambda$, i.e.
$$ A^n + a_{n-1}A^{n-1}+ \cdots + a_0 I = 0 .$$
Thus $A = T \Lambda T^{-1}$, where $\Lambda$ is diagonal, by definition of similarity, and
$$ A^k = T\Lambda^k T^{-1} \quad \forall k\in \Bbb N.$$
Finally: $$A^n + a_{n-1}A^{n-1}+\cdots+a_0 I = 0\Longleftrightarrow T \Lambda^n T^{-1} + a_{n-1}T \Lambda^{n-1} T^{-1}+\cdots + a_0 TT^{-1}= 0 $$
$$\Longleftrightarrow T( \Lambda^n + a_{n-1} \Lambda^{n-1} +\cdots + a_0I)T^{-1}= 0 \Longleftrightarrow \Lambda^n + a_{n-1} \Lambda^{n-1} +\cdots + a_0I =T^{-1}0T=0.$$
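To see the similarity argument at work on the small example above (again my own illustration, not part of the original answer): for $A = \begin{pmatrix} 0 & 1 \\ -2 & 3 \end{pmatrix}$ the eigenvectors are $(1,1)^T$ and $(1,2)^T$, so
$$ T = \begin{pmatrix} 1 & 1 \\ 1 & 2 \end{pmatrix}, \quad T^{-1} = \begin{pmatrix} 2 & -1 \\ -1 & 1 \end{pmatrix}, \quad \Lambda = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}, \quad T\Lambda T^{-1} = A, $$
and conjugating $\Lambda^2 - 3\Lambda + 2I = 0$ by $T$ gives back $A^2 - 3A + 2I = 0$.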