Monday, 10 March 2014

matrices - Understanding Kantorovich's inequality

I'm looking for a proof of the Kantorovich inequality, namely:

$$ \langle Ax,x\rangle \,\langle A^{-1}x,x\rangle \leq \frac{1}{4}\left(\sqrt{K(A)}+\frac{1}{\sqrt{K(A)}}\right)^{2}$$
where $K(A)= \lVert A\rVert_2 \lVert A^{-1}\rVert_2$ is the spectral condition number, $A$ is a Hermitian positive definite matrix, and $x$ is a unit vector of compatible dimension. Or, equivalently,



$$ \langle Ax,x\rangle \,\langle A^{-1}x,x\rangle \leq \frac{1}{4}\left(\sqrt{\frac{\beta}{\alpha}}+\sqrt{\frac{\alpha}{\beta}}\right)^{2} = \frac{(\alpha+\beta)^{2}}{4\alpha\beta}$$
where $0 < \alpha = \lambda_1 \leq \cdots \leq \lambda_n = \beta$ are the eigenvalues of $A$, so that $K(A)=\beta/\alpha$.
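As a quick numerical sanity check of the bound (not a proof), one can test it with NumPy; this is a minimal sketch where the matrix size, the random seed, and the construction $A = BB^* + I$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Random Hermitian positive definite matrix: A = B B* + I.
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B @ B.conj().T + np.eye(n)

# Random unit vector x.
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
x /= np.linalg.norm(x)

# Left-hand side: <Ax, x> <A^{-1}x, x> (both factors are real for Hermitian A).
lhs = np.real(np.vdot(x, A @ x)) * np.real(np.vdot(x, np.linalg.solve(A, x)))

# K(A) = beta/alpha, the ratio of extreme eigenvalues (eigvalsh sorts ascending).
eigs = np.linalg.eigvalsh(A)
K = eigs[-1] / eigs[0]
rhs = 0.25 * (np.sqrt(K) + 1.0 / np.sqrt(K)) ** 2

print(lhs <= rhs)  # expected: True
```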



There are many proofs on the internet, but this is the one I found easiest to understand. Nevertheless, I'm stuck on one point:



They use $f,g: [\alpha,\beta]\to \mathbb{R}$, two convex functions with $f$ positive and $f(t)\leq g(t)$ for every $t \in [\alpha, \beta]$. Then they claim that $F=f(A)$ and $G=g(A)$ are well defined and Hermitian positive definite.




I don't even understand what they mean by $f(A)$.
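My best guess is that $f(A)$ is defined through the spectral decomposition: if $A = U \Lambda U^*$ with $\Lambda = \operatorname{diag}(\lambda_1,\dots,\lambda_n)$, then $f(A) = U \operatorname{diag}(f(\lambda_1),\dots,f(\lambda_n))\, U^*$. Here is a small NumPy sketch of that candidate definition, checked against $f(t)=1/t$, which should reproduce $A^{-1}$:

```python
import numpy as np

def apply_to_hermitian(f, A):
    """Apply a scalar function f to a Hermitian matrix A
    via the spectral decomposition A = U diag(lams) U*."""
    lams, U = np.linalg.eigh(A)
    return U @ np.diag(f(lams)) @ U.conj().T

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = B @ B.T + np.eye(4)  # real symmetric positive definite

# f(t) = 1/t should reproduce the matrix inverse.
F = apply_to_hermitian(lambda t: 1.0 / t, A)
print(np.allclose(F, np.linalg.inv(A)))  # expected: True
```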



If anyone can suggest alternative references (another proof) or help me with this one, I would be very grateful.

