Let $f:[0,\infty)^n\to\mathbb{R}$ be a function that is continuously differentiable in the interior $(0,\infty)^n$ and satisfies $\frac{\partial f}{\partial x_j}(x)\to-\infty$ as $x_j\to 0^+$ for $j=1,\dots,n$.
Can it be shown rigorously that when this function is minimized over a set determined by a linear equation, say $\{x=(x_1,\dots,x_n):\sum_j a_j x_j=b,\ x_j\ge 0\}$, the minimizer does not have a zero entry at a position for which the constraint set allows nonzero entries?
Thanks.
Answer
This is false. A counterexample is given by
$$f(x,y)=\sqrt[4]{(x-1)^2+y^2}-1.1\sqrt{y}$$
with the linear constraint $x+y=1$. The partial derivative with respect to $y$ is
$$\frac{\partial f}{\partial y}=\frac{y}{2\left((x-1)^2+y^2\right)^{3/4}}-\frac{1.1}{2\sqrt{y}},$$
which goes to $-\infty$ as $y\to 0^+$ for fixed $x$, including $x=1$, where it equals $\frac{1-1.1}{2\sqrt{y}}=-\frac{0.1}{2\sqrt{y}}$. On the line $x+y=1$, we have $x=1-y$ and
$$f(1-y,y)=\sqrt[4]{y^2+y^2}-1.1\sqrt{y}=\left(\sqrt[4]{2}-1.1\right)\sqrt{y},$$
which is minimal for $y=0$, since $\sqrt[4]{2}\approx 1.19>1.1$.
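A quick numerical sanity check of the counterexample (a NumPy sketch; the helper names `f` and `dfdy` are mine, not from the answer) confirms both claims: the restricted function $f(1-y,y)$ attains its minimum at $y=0$, and $\partial f/\partial y$ at $x=1$ behaves like $-0.1/(2\sqrt{y})$, diverging to $-\infty$.

```python
import numpy as np

def f(x, y):
    # The counterexample: f(x, y) = ((x-1)^2 + y^2)^(1/4) - 1.1*sqrt(y)
    return ((x - 1)**2 + y**2)**0.25 - 1.1 * np.sqrt(y)

def dfdy(x, y):
    # Partial derivative of f with respect to y
    return y / (2 * ((x - 1)**2 + y**2)**0.75) - 1.1 / (2 * np.sqrt(y))

# Sample f along the constraint line x + y = 1 with y in [0, 1]:
# the restriction is (2^(1/4) - 1.1)*sqrt(y), an increasing function,
# so its minimum over the feasible segment sits at y = 0.
ys = np.linspace(0.0, 1.0, 100001)
vals = f(1.0 - ys, ys)
print(np.argmin(vals))          # index 0, i.e. y = 0

# At x = 1 the derivative equals -0.05/sqrt(y), unbounded below as y -> 0+
for y in (1e-2, 1e-4, 1e-6):
    print(dfdy(1.0, y))
```

Despite the derivative blowing up to $-\infty$ at the boundary, the constrained minimizer still lands at $y=0$, which is exactly what the counterexample is meant to exhibit.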