A question from Introduction to Analysis by Arthur Mattuck:
Prove that $\Gamma(x)=\int_0^\infty t^{x-1}e^{-t}\,dt$ is continuous at $x=1$.
(Method: consider $|\Gamma(1+h)-\Gamma(1)|$. To estimate it, break the interval $[0,\infty)$ into two parts. Remember that you can estimate differences of the form $|f(a)-f(b)|$ using the Mean-Value Theorem, provided $f$ is differentiable on the relevant interval.)
So far I have $|\Gamma(1+h)-\Gamma(1)|=\left|\int_0^\infty \bigl(t^h-1\bigr)e^{-t}\,dt\right|$. I don't know how to apply the Mean-Value Theorem here, and I haven't learned differentiation under the integral sign.
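One way to read the hint (a sketch under the assumption $|h|\le \tfrac12$, not a complete proof; the exponent $c$ below is supplied by the Mean-Value Theorem and depends on $t$ and $h$): fix $t>0$ and apply the MVT to $f(h)=t^h=e^{h\ln t}$ on the interval between $0$ and $h$.

```latex
% MVT applied to f(h) = t^h = e^{h ln t}, for each fixed t > 0:
t^h - 1 = f(h) - f(0) = h\,(\ln t)\,t^{c},
\qquad \text{for some } c \text{ between } 0 \text{ and } h.
% Since |c| \le |h| \le 1/2, we have t^c \le t^{-1/2} on (0,1]
% and t^c \le t^{1/2} on [1,\infty). Splitting the integral:
|\Gamma(1+h)-\Gamma(1)|
\le |h|\left(\int_0^1 |\ln t|\,t^{-1/2}e^{-t}\,dt
   + \int_1^\infty (\ln t)\,t^{1/2}e^{-t}\,dt\right)
= C\,|h| \xrightarrow[\;h\to 0\;]{} 0,
% both integrals converge, so C is a finite constant.
```

This is where the suggested split of $[0,\infty)$ into two parts (here $[0,1]$ and $[1,\infty)$) does its work: each piece needs a different bound on $t^c$, but both resulting integrals are finite.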