I'm studying for an exam in calculus and have come across this question which I can't seem to prove:
Let $C \subseteq V$ be a convex subset of a vector space $V$, and let $f : C \to \mathbb{R}$ be a convex function.
Now let $\alpha_1, \alpha_2, \ldots, \alpha_k > 0$ be scalars with $\sum_{i=1}^k \alpha_i = 1$, and let $x_1, x_2, \ldots, x_k \in C$ be points. Show that the following inequality holds:
$$f\left(\sum_{i=1}^{k} \alpha_i x_i\right) \le \sum_{i=1}^{k} \alpha_i f(x_i).$$
I thought about using induction to show the inequality, where the base case $k=2$ is just the definition of a convex function, i.e. $f(\lambda x + (1-\lambda)y) \le \lambda f(x) + (1-\lambda) f(y)$.
I've run into a problem using induction, since the scalars won't sum to 1 in the induction hypothesis. I also don't really use the fact that $C$ is convex, which I'm probably supposed to...
Either there is some algebraic trick that lets me use induction, or I'm doing something wrong (and induction is not the way to go with this inequality).
Any help will be appreciated, thank you!
Answer
Induction step:
$$\begin{aligned}
f\left(\sum_{i=1}^{n+1}\alpha_i x_i\right)
&= f\left(\sum_{i=1}^{n-1}\alpha_i x_i + (\alpha_n+\alpha_{n+1})\left(\frac{\alpha_n}{\alpha_n+\alpha_{n+1}}\,x_n + \frac{\alpha_{n+1}}{\alpha_n+\alpha_{n+1}}\,x_{n+1}\right)\right) \\
&\le \sum_{i=1}^{n-1}\alpha_i f(x_i) + (\alpha_n+\alpha_{n+1})\, f\left(\frac{\alpha_n}{\alpha_n+\alpha_{n+1}}\,x_n + \frac{\alpha_{n+1}}{\alpha_n+\alpha_{n+1}}\,x_{n+1}\right) \\
&\le \sum_{i=1}^{n-1}\alpha_i f(x_i) + (\alpha_n+\alpha_{n+1})\left(\frac{\alpha_n}{\alpha_n+\alpha_{n+1}}\,f(x_n) + \frac{\alpha_{n+1}}{\alpha_n+\alpha_{n+1}}\,f(x_{n+1})\right) \\
&= \sum_{i=1}^{n+1}\alpha_i f(x_i).
\end{aligned}$$
The first inequality above uses the induction hypothesis: the $n$ coefficients $\alpha_1, \ldots, \alpha_{n-1}, \alpha_n + \alpha_{n+1}$ are positive and sum to 1, which resolves your worry about the scalars. The second inequality is the $k=2$ definition of convexity applied to $x_n$ and $x_{n+1}$. The convexity of $C$ is there so that whenever $x_1, \ldots, x_{n+1}$ are in $C$, any convex combination of them is again in $C$, so every argument of $f$ above lies in its domain. (Recall the domain of $f$ is $C$.)
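Not part of the proof, but the inequality is easy to sanity-check numerically. Here's a quick sketch in Python using the convex function $f(x) = x^2$ on random convex combinations (the choice of $f$, the number of trials, and the tolerance are my own, not from the question):

```python
import random

def f(x):
    return x * x  # x^2 is convex on R

random.seed(0)
for _ in range(1000):
    k = random.randint(2, 6)
    w = [random.random() for _ in range(k)]
    s = sum(w)
    alphas = [a / s for a in w]  # positive weights normalized to sum to 1
    xs = [random.uniform(-10, 10) for _ in range(k)]
    lhs = f(sum(a * x for a, x in zip(alphas, xs)))          # f(sum a_i x_i)
    rhs = sum(a * f(x) for a, x in zip(alphas, xs))          # sum a_i f(x_i)
    assert lhs <= rhs + 1e-9, (lhs, rhs)
print("Jensen's inequality held in all 1000 random trials")
```

Of course, a numerical check is no substitute for the induction argument above; it only illustrates the statement.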