A question from Introduction to Analysis by Arthur Mattuck:
Prove that $\Gamma (x)=\int_0^\infty t^{x-1}e^{-t}dt$ is continuous at $x=1^+$.
(Method: consider $|\Gamma(1+h)-\Gamma(1)|$. To estimate it, break up the interval $[0,\infty)$ into two parts. Remember that you can estimate differences of the form $|f(a)-f(b)|$ by using the Mean-Value Theorem, if $f(x)$ is differentiable on the relevant interval.)
My attempt so far: $|\Gamma(1+h)-\Gamma(1)|=\left|\int_0^\infty (t^h-1)e^{-t}\,dt\right|\le\int_0^\infty |t^h-1|\,e^{-t}\,dt$. I don't know how to apply the Mean-Value Theorem here, and I haven't learned differentiation under the integral sign.
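Here is a sketch of how the hint might be applied (my own attempt, not the book's solution). Fix $0<h<1$ and $t>0$, and apply the Mean-Value Theorem to $f(x)=t^x=e^{x\ln t}$ on the interval from $0$ to $h$:

$$t^h-1=h\,(\ln t)\,t^{\theta h}\quad\text{for some }\theta=\theta(t)\in(0,1).$$

Now split the integral at $t=1$. For $0<t\le 1$ we have $t^{\theta h}\le 1$ and $e^{-t}\le 1$, so $|t^h-1|\le h|\ln t|$ and

$$\int_0^1|t^h-1|\,e^{-t}\,dt\le h\int_0^1(-\ln t)\,dt=h.$$

For $t\ge 1$ we have $t^{\theta h}\le t^h\le t$ (since $\theta h\le h\le 1$), so $|t^h-1|\le h\,t\ln t$ and

$$\int_1^\infty|t^h-1|\,e^{-t}\,dt\le h\int_1^\infty t\ln t\,e^{-t}\,dt=Ch,$$

where $C$ is a finite constant. Altogether $|\Gamma(1+h)-\Gamma(1)|\le(1+C)\,h\to 0$ as $h\to 0^+$, which would give continuity from the right at $x=1$.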