Let $f:[0,\infty)^n\to \mathbb{R}$ be continuously differentiable in the interior $(0,\infty)^n$, with $\frac{\partial}{\partial x_j}f(\mathbf{x})\to -\infty$ as $x_j\to 0^+$ for each $j=1,\dots,n$.
Can it be shown rigorously that, when $f$ is minimized over a set determined by a linear equation, say $\{\mathbf{x}=(x_1,\dots,x_n):\sum_j a_j x_j=b,\ x_j\ge 0\}$, the minimizer has no zero entry in any position where the constraint set allows a nonzero entry?
Thanks.
Answer
This is false. A counterexample is given by
$$f(x,y)=\sqrt[4]{(x-1)^2+y^2}-1.1\sqrt y$$
with the linear constraint $x+y=1$, $x,y\ge 0$. The partial derivative with respect to $y$ is
$$
\frac{\partial f}{\partial y} = \frac y{2\left((x-1)^2+y^2\right)^{3/4}}-\frac{1.1}{2\sqrt y}\;,
$$
which goes to $-\infty$ as $y\to0$ for fixed $x$ (including $x=1$). On the line $x+y=1$, we have $x=1-y$ and
$$
f(1-y,y)=\sqrt[4]{y^2+y^2}-1.1\sqrt y=\left(\sqrt[4]2-1.1\right)\sqrt y\;,
$$
which, since $\sqrt[4]2\approx1.19>1.1$, is increasing in $y$ and hence minimal at $y=0$, i.e. at the boundary point $(x,y)=(1,0)$, even though the constraint set allows $y>0$. (Strictly speaking, $\partial f/\partial x$ stays bounded as $x\to0^+$; to satisfy the hypothesis in $x$ as well, one can subtract a small term such as $0.1\sqrt x$, which does not move the minimizer, since $-0.1\sqrt{1-y}$ is also increasing in $y$ on the segment.)
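As a quick numerical sanity check (not part of the original answer), the short Python sketch below evaluates $f$ along the constraint segment and tabulates $\partial f/\partial y$ near $y=0$; the function names, grid size, and sample points are illustrative choices.

```python
import numpy as np

# Counterexample function and its y-derivative (valid for y > 0).
def f(x, y):
    return ((x - 1.0) ** 2 + y ** 2) ** 0.25 - 1.1 * np.sqrt(y)

def df_dy(x, y):
    return y / (2.0 * ((x - 1.0) ** 2 + y ** 2) ** 0.75) - 1.1 / (2.0 * np.sqrt(y))

# Restrict to the constraint x + y = 1 with x, y >= 0, i.e. x = 1 - y, y in [0, 1].
ys = np.linspace(0.0, 1.0, 100001)
vals = f(1.0 - ys, ys)
print("argmin of f(1-y, y) on [0, 1]:", ys[np.argmin(vals)])  # expect 0.0

# The y-derivative still blows down to -infinity as y -> 0 (shown here at x = 1).
for y in (1e-2, 1e-4, 1e-6):
    print(f"df/dy at (1, {y:g}) = {df_dy(1.0, y):.3e}")
```

The printed derivative values behave like $-0.05/\sqrt y$, confirming the blow-up, while the minimum along the segment still sits at the boundary.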