I've been thinking about this problem:
Let M be a symmetric matrix. Recall that the eigenvalues of M are the roots of the characteristic polynomial of M:
$$p(x) := \det(xI - M) = \prod_{i=1}^{n} (x - \mu_i)$$
Write
$$p(x) = \sum_{k=0}^{n} (-1)^k c_k \, x^{n-k}$$
Prove that
$$c_k = \sum_{S \subseteq [n],\ |S| = k} \det\big(M(S,S)\big).$$
Here, we write [n] to denote the set {1,2,…,n}, and M(S,S) to denote the submatrix of M with rows and columns indexed by S.
I am a little confused about how to relate the coefficients back to the determinants of submatrices of M.
I think it's not too tricky to show, using the product form of the characteristic polynomial, that each coefficient $c_k$ is the sum of all $k$-wise products of the eigenvalues. Each $c_k$ is then the sum of the determinants of all $k \times k$ principal submatrices of $\Lambda$, where $M = U \Lambda U^T$ by the spectral theorem. Since $U^T = U^{-1}$, $M$ is similar to $\Lambda$. Can I then say that because the identity holds for the diagonal matrix, it holds for $M$ itself by similarity? I know the eigenvalues don't necessarily correspond to the respective principal submatrices of $M$ (cf. Cauchy's interlacing theorem), so it's unclear to me how to jump back to talking about submatrices of $M$.
Or do you need to use some formulation of the determinant to show this?
Thanks in advance!
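For what it's worth, here is a quick numerical sanity check of the claimed identity that I put together (a sketch using numpy; `np.poly` returns the coefficients of $\det(xI - M)$ with leading term first, so its $k$-th entry is $(-1)^k c_k$ in the convention above):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
M = (A + A.T) / 2  # a random symmetric matrix

# coeffs[k] is the coefficient of x^(n-k) in det(xI - M), i.e. (-1)^k c_k.
coeffs = np.poly(M)
c = [(-1) ** k * coeffs[k] for k in range(n + 1)]

# Compare each c_k with the sum of the k x k principal minors of M.
# (numpy's det of a 0x0 matrix is 1.0, matching c_0 = 1.)
ok = all(
    np.isclose(
        c[k],
        sum(np.linalg.det(M[np.ix_(S, S)]) for S in combinations(range(n), k)),
    )
    for k in range(n + 1)
)
print(ok)
```

The identity checks out numerically, but of course that is no substitute for a proof.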
Answer
The detour through the eigenvalues will not help.
Using the notation of your link, recall that $[n] = \{1, \dots, n\}$ and let $M(S,T)$ denote the matrix obtained from $M$ by keeping only the rows with index in $S$ and the columns with index in $T$. The first thing you need is
$$\frac{\partial}{\partial M_{ij}} \det(M) = (-1)^{i+j} \det\big( M([n] \setminus \{i\},\ [n] \setminus \{j\}) \big),$$
which follows from the cofactor expansion with respect to, say, the $j$-th column.
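This cofactor formula is easy to verify numerically (a sketch of my own; since $\det$ is affine-linear in any single entry, a central finite difference recovers the derivative essentially exactly — the step size `h` and the indices `i, j` are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))

# Perturb only the (i, j) entry and difference the determinant.
i, j, h = 1, 2, 1e-6
E = np.zeros((n, n))
E[i, j] = 1.0
numeric = (np.linalg.det(M + h * E) - np.linalg.det(M - h * E)) / (2 * h)

# Signed cofactor: delete row i and column j.
minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
cofactor = (-1) ** (i + j) * np.linalg.det(minor)

match = np.isclose(numeric, cofactor)
print(match)
```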
Then express the $c_k$'s as derivatives of the characteristic polynomial at zero. Finally, compute these derivatives using the multivariate chain rule and the formula above.
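Spelling out that last step (a sketch, with signs matched to the convention $p(x) = \sum_{k=0}^{n} (-1)^k c_k x^{n-k}$): since only the diagonal entries of $xI - M$ depend on $x$, the chain rule and the cofactor formula give

$$p'(x) = \sum_{i=1}^{n} \det\big( (xI - M)([n] \setminus \{i\},\ [n] \setminus \{i\}) \big).$$

Iterating, every subset $S \subseteq [n]$ with $|S| = k$ arises from $(n-k)!$ ordered deletions of the remaining indices, and $(xI - M)(S,S) = xI - M(S,S)$, so

$$p^{(n-k)}(x) = (n-k)! \sum_{|S| = k} \det\big( xI - M(S,S) \big), \qquad p^{(n-k)}(0) = (n-k)! \, (-1)^k \sum_{|S| = k} \det\big( M(S,S) \big).$$

On the other hand, differentiating the expansion of $p$ term by term gives $p^{(n-k)}(0) = (n-k)! \, (-1)^k c_k$. Comparing the two expressions yields $c_k = \sum_{|S| = k} \det(M(S,S))$.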