Monday 21 December 2015

linear algebra - Proof of minimum eigenvalue of non-symmetric matrix with real eigenvalues



I am wondering if this is true: $\lambda_{\min}(A) \ge \lambda_{\min}\left(\frac{A+A^T}{2}\right)$, given that $A$ is non-symmetric but has real eigenvalues. I came across this inequality in one of the math.stackexchange posts and wonder why it is true. I ran MATLAB simulations with many rand(2,2) matrices and it seems to hold up, but that is not sufficient to take it as a fact. Please let me know.
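For reference, here is a minimal NumPy sketch of the numerical experiment described above (my own translation of the MATLAB rand(2,2) test, not code from the post); it keeps only the samples whose eigenvalues are real and checks the claimed inequality against the symmetric part:

```python
import numpy as np

rng = np.random.default_rng(0)
checked = 0
for _ in range(100_000):
    A = rng.random((2, 2))                    # analogue of MATLAB's rand(2,2)
    eigs_A = np.linalg.eigvals(A)
    if np.any(np.abs(eigs_A.imag) > 1e-12):   # skip samples with non-real eigenvalues
        continue
    S = (A + A.T) / 2
    # claimed: min eigenvalue of A >= min eigenvalue of its symmetric part
    assert eigs_A.real.min() >= np.linalg.eigvalsh(S).min() - 1e-10
    checked += 1
print(f"inequality held on {checked} sampled matrices")
```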


Answer



Fact. If $S$ is a real symmetric matrix (of size $n$), then
$$\forall X\in\mathbb{R}^n,\ X^T\,S X\geq\lambda_{\min}\|X\|^2,$$
where $\lambda_{\min}$ is the minimum eigenvalue of $S$.



Proof. We know that a real symmetric matrix is diagonalizable in an orthonormal basis. Let $(X_1,\ldots,X_n)$ be an orthonormal basis of eigenvectors of $S$, with $X_k$ associated with the eigenvalue $\lambda_k$. Now let $X\in\mathbb{R}^n$ and decompose it in this basis: there exist $x_1,\ldots,x_n\in\mathbb{R}$ such that $X=x_1X_1+\cdots+x_nX_n$. Since the $X_k$ are orthonormal, $\|X\|^2=x_1^2+\cdots+x_n^2$ and
$$X^T SX=\lambda_1x_1^2\|X_1\|^2+\cdots+\lambda_nx_n^2\|X_n\|^2=\lambda_1x_1^2+\cdots+\lambda_nx_n^2\geq\lambda_{\min}\|X\|^2.$$
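As a quick numerical sanity check of this fact (my own sketch, not part of the original answer), one can draw a random symmetric matrix and verify the quadratic-form bound on random vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
S = (B + B.T) / 2                       # a real symmetric matrix
lam_min = np.linalg.eigvalsh(S).min()   # its minimum eigenvalue
for _ in range(1000):
    X = rng.standard_normal(5)
    # X^T S X >= lambda_min * ||X||^2, up to floating-point tolerance
    assert X @ S @ X >= lam_min * (X @ X) - 1e-10
```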

Now let $A$ be a square matrix with real coefficients. Let $\lambda$ be a real eigenvalue of $A$ and let $X_\lambda$ be an associated eigenvector. Then
$$X_\lambda^T AX_\lambda=\lambda\|X_\lambda\|^2.$$
Now, transposing this $1\times1$ matrix (a scalar is equal to its own transpose) yields
$$X_\lambda^TA^TX_\lambda=\lambda\|X_\lambda\|^2$$
too, hence
$$X_\lambda^T\left(\frac{A+A^T}2\right)X_\lambda=\lambda\|X_\lambda\|^2.$$
Hence, from the preliminary fact, and since $S=\dfrac{A+A^T}2$ is a real symmetric matrix, we must have

$$\lambda\|X_\lambda\|^2\geq\lambda_{\min}\|X_\lambda\|^2$$
where $\lambda_{\min}$ is the minimal eigenvalue of $S$. Since $X_\lambda\neq0$, we conclude that $\lambda\geq\lambda_{\min}$, i.e., that:




every real eigenvalue of $A$ is at least $\lambda_{\min}$.
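As a concrete illustration (not part of the original answer), take the non-symmetric matrix
$$A=\begin{pmatrix}2&1\\0&1\end{pmatrix},\qquad S=\frac{A+A^T}2=\begin{pmatrix}2&\tfrac12\\\tfrac12&1\end{pmatrix}.$$
The eigenvalues of $A$ are $2$ and $1$, while those of $S$ are $\frac{3\pm\sqrt2}2\approx2.207$ and $0.793$, so indeed $\lambda_{\min}(A)=1\geq0.793\approx\lambda_{\min}(S)$.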

You can generalize this slightly to the non-real eigenvalues of $A$ as well: let $\lambda\in\mathbb{C}$ be an eigenvalue of $A$ and let $X_\lambda\in\mathbb{C}^n$ be an eigenvector of $A$ associated with $\lambda$. Then:

$$\overline{X_\lambda^T}AX_\lambda=\lambda\|X_\lambda\|^2,$$
and also (transpose and take the conjugate, using the fact that $A$ has real coefficients):
$$\overline{X_\lambda^T}A^TX_\lambda=\overline{\lambda}\|X_\lambda\|^2.$$
Adding these two identities and using $\frac{\lambda+\overline{\lambda}}2=\Re(\lambda)$, we get
$$\overline{X_\lambda^T}\left(\frac{A+A^T}2\right)X_\lambda=\Re(\lambda)\|X_\lambda\|^2.$$
Extending the preliminary fact to complex vectors (using the associated Hermitian inner product) yields $\Re(\lambda)\geq\lambda_{\min}$, i.e.,




the real part of every (complex) eigenvalue of $A$ is at least $\lambda_{\min}$.
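A quick numerical check of this stronger statement (again my own NumPy sketch, not from the original answer), this time without any restriction on the eigenvalues of $A$:

```python
import numpy as np

rng = np.random.default_rng(2)
for _ in range(1000):
    A = rng.standard_normal((4, 4))       # arbitrary real matrix
    S = (A + A.T) / 2
    lam_min = np.linalg.eigvalsh(S).min()
    # the real part of every eigenvalue of A should be at least lambda_min(S)
    assert np.linalg.eigvals(A).real.min() >= lam_min - 1e-10
```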



