Suppose x_n is the only positive solution to the equation x^{-n}=\sum\limits_{k=1}^\infty (x+k)^{-n}. How can we show the existence of the limit \lim_{n\to \infty}\frac{x_n}{n}?
It is easy to see that \{x_n\} is increasing. In fact, the given equation is equivalent to
1=\sum_{k=1}^\infty(1+\frac{k}{x})^{-n} \tag{*}
If x_n\ge x_{n+1}, then notice that for any fixed k, (1+\frac{k}{x})^{-n} is increasing in x, so we get
\frac{1}{(1+\frac{k}{x_n})^n}\ge \frac{1}{(1+\frac{k}{x_{n+1}})^n}>\frac{1}{(1+\frac{k}{x_{n+1}})^{n+1}}
Summing over all k from 1 to \infty, we see that
\sum_{k=1}^\infty\frac{1}{(1+\frac{k}{x_n})^n}>\sum_{k=1}^\infty\frac{1}{(1+\frac{k}{x_{n+1}})^{n+1}}
Then from (*) we see that the two series in the above inequality both equal 1, which is a contradiction!
But it seems hard to show the existence of \lim_{n\to \infty}\frac{x_n}{n}. What I can get from the area principle is
\Big|\sum_{k=1}^\infty\frac{1}{(1+\frac{k}{x_n})^n}-\int_1^\infty \frac{1}{(1+\frac{x}{x_n})^n}dx\Big|<\frac{1}{(1+\frac1{x_n})^n}
or, since \int_1^\infty (1+\frac{x}{x_n})^{-n}dx = \frac{x_n}{n-1}(1+\frac{1}{x_n})^{1-n},
\Big|1-\frac{x_n}{n-1}(1+\frac{1}{x_n})^{1-n}\Big|<\frac{1}{(1+\frac1{x_n})^n}
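Numerically the limit looks plausible: below is a small mpmath sketch (the helper name series_star and the initial guess 1.5n are my own choices) that solves (*) for several n; the printed ratios x_n/n seem to increase toward roughly 1.44.

    from mpmath import mp, nsum, findroot, inf

    mp.dps = 30

    def series_star(n, x):
        # the series in (*): sum_{k>=1} (1 + k/x)^{-n}
        return nsum(lambda k: (1 + k / x)**(-n), [1, inf])

    for n in [2, 3, 5, 10, 20, 50, 100]:
        x_n = findroot(lambda x: series_star(n, x) - 1, 1.5 * n)  # heuristic initial guess
        print(n, x_n / n)  # the ratio x_n/n, apparently increasing toward ~1.44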
For any n \ge 2, consider the function \displaystyle\;\Phi_n(x) = \sum_{k=1}^\infty \left(\frac{x}{x+k}\right)^n.
It is easy to see \Phi_n(x) is an increasing function over (0,\infty).
For small x, it is bounded from above by x^n \zeta(n) and hence decreases to 0 as x \to 0.
For large x, we can approximate the sum by an integral, and \Phi_n(x) grows like \displaystyle\;\frac{x}{n-1} as x \to \infty. By definition, x_n is the unique root of \Phi_n(x) = 1. Let \displaystyle\;y_n = \frac{x_n}{n}.
For any \alpha > 0, applying AM \ge GM to n copies of 1 + \frac{\alpha}{n} and one copy of 1, we obtain
\left(1 + \frac{\alpha}{n}\right)^{n/(n+1)} < \frac1{n+1} \left[n\left(1 + \frac{\alpha}{n}\right) + 1 \right] = 1 + \frac{\alpha}{n+1}
The inequality is strict because the n+1 numbers are not identical. Raising both sides to the (n+1)^{th} power and taking reciprocals, we get
\left( \frac{n}{n + \alpha} \right)^n > \left(\frac{n+1}{n+1 + \alpha}\right)^{n+1}
Replacing \alpha by \displaystyle\;\frac{k}{y_n} for a generic positive integer k, we obtain
\left( \frac{x_n}{x_n + k} \right)^n = \left( \frac{n y_n}{n y_n + k} \right)^n > \left(\frac{(n+1)y_n}{(n+1)y_n + k}\right)^{n+1}
Summing over k and using the definition of x_n, we find
\Phi_{n+1}(x_{n+1}) = 1 = \Phi_n(x_n) > \Phi_{n+1}((n+1)y_n)
Since \Phi_{n+1} is increasing, we obtain x_{n+1} > (n+1)y_n \iff y_{n+1} > y_n.
This means y_n is an increasing sequence.
We are going to show that y_n is bounded from above by \frac32 (see the update below for a more elementary proof of a better upper bound).
For simplicity, let us abbreviate x_n and y_n as x and y. Adding x^{-n} to both sides of the defining equation, we have
\frac{2}{x^n} = \sum_{k=0}^\infty \frac{1}{(x+k)^n}
By the Abel-Plana formula (applied to f(k) = (x+k)^{-n}), we can transform the sum on the RHS into integrals; after moving the \frac12 f(0) = \frac{1}{2x^n} term to the left, the end result is
\begin{align}\frac{3}{2x^n} &= \int_0^\infty \frac{dk}{(x+k)^n} + i \int_0^\infty \frac{(x+it)^{-n} - (x-it)^{-n}}{e^{2\pi t} - 1} dt\\ &=\frac{1}{(n-1)x^{n-1}} + \frac{i}{x^{n-1}}\int_0^\infty \frac{(1+is)^{-n} - (1-is)^{-n}}{e^{2\pi x s}-1} ds \end{align}
Multiplying both sides by nx^{n-1} and replacing s by s/n, we obtain
\begin{align}\frac{3}{2y} - \frac{n}{n-1} &= i \int_0^\infty \frac{(1 + i\frac{s}{n})^{-n} - (1-i\frac{s}{n})^{-n}}{e^{2\pi ys} - 1} ds\\ &= 2\int_0^\infty \frac{\sin\left(n\tan^{-1}\left(\frac{s}{n}\right)\right)}{\left(1 + \frac{s^2}{n^2}\right)^{n/2}} \frac{ds}{e^{2\pi ys}-1}\tag{*1} \end{align}
For the integral on the RHS, the integrand can only become negative when
n\tan^{-1}\left(\frac{s}{n}\right) > \pi \implies \frac{s}{n} > \tan\left(\frac{\pi}{n}\right) \implies s > \pi
(the last implication uses \tan u > u for 0 < u < \frac{\pi}{2}).
By the time s reaches \pi, the factor \frac{1}{e^{2\pi ys} - 1} is already extremely small. Numerically, we know y_4 > 1, so for n \ge 4 and s \ge \pi, we have
\frac{1}{e^{2\pi ys} - 1} \le \frac{1}{e^{2\pi^2} - 1} \approx 2.675 \times 10^{-9}
This implies the integral is positive: whatever negative contribution the integrand picks up over s > \pi is dwarfed by the positive contribution over 0 < s < \pi. For n \ge 4, we can deduce
\frac{3}{2y} \ge \frac{n}{n-1} \implies y_n \le \frac32\left(1 - \frac1n\right) < \frac32
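Just as a sanity check on this sign argument (not part of the proof): the sketch below, again with mpmath, recomputes x_4 from \Phi_4(x)=1 and evaluates both sides of (*1) for n = 4; the helper names and the initial guess 6 are my own choices. Both numbers come out equal and positive, consistent with y_4 \le \frac32(1-\frac14).

    from mpmath import mp, nsum, findroot, quad, atan, sin, expm1, pi, inf

    mp.dps = 30
    n = 4

    # recover x_4 (hence y_4 = x_4/4) from Phi_4(x) = 1
    phi = lambda x: nsum(lambda k: (x / (x + k))**n, [1, inf])
    x4 = findroot(lambda x: phi(x) - 1, 6)
    y = x4 / n

    # integrand of the last integral in (*1)
    g = lambda s: 2 * sin(n * atan(s / n)) / (1 + (s / n)**2)**(n / 2) / expm1(2 * pi * y * s)

    rhs = quad(g, [0, pi, inf])          # split at s = pi, where the sign could first flip
    lhs = 3 / (2 * y) - mp.mpf(n) / (n - 1)
    print(lhs, rhs)                      # equal and positive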
Since y_n is increasing and bounded from above by \frac32, the limit y_\infty \stackrel{def}{=} \lim_{n\to\infty} y_n exists and satisfies y_\infty \le \frac32.
For fixed y > 0, with the help of the dominated convergence theorem, one can show that as n \to \infty, the last integral in (*1) converges to \displaystyle\;\int_0^\infty \frac{\sin s}{e^{2\pi y s} - 1} ds.
This suggests that y_\infty is the root of the following equation near \frac32:
\frac{3}{2y} = 1 + 2\int_0^\infty \frac{\sin(s)}{e^{2\pi ys} - 1} ds
According to DLMF,
\int_0^\infty e^{-x} \frac{\sin(ax)}{\sinh x} dx = \frac{\pi}{2}\coth\left(\frac{\pi a}{2}\right) - \frac1a\quad\text{ for }\quad a \ne 0
Writing \frac{1}{e^{2\pi ys}-1} = \frac{e^{-\pi ys}}{2\sinh(\pi ys)}, substituting x = \pi y s and applying the above with a = \frac{1}{\pi y}, we can transform our equation into
\frac{3}{2y} = 1 + 2\left[\frac{1}{4y}\coth\left(\frac{1}{2y}\right) - \frac12\right] \iff \coth\left(\frac{1}{2y}\right) = 3
Since \coth u = 3 \iff e^{2u} = 2, this leads to \frac{1}{2y_\infty} = \frac{\log 2}{2}, i.e. \displaystyle\;y_\infty = \frac{1}{\log 2}.
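As a numerical cross-check (a sketch; quad and expm1 are mpmath functions, the variable names are mine), y = \frac{1}{\log 2} does satisfy the limiting equation above:

    from mpmath import mp, quad, sin, expm1, pi, log, inf

    mp.dps = 30
    y = 1 / log(2)

    lhs = 3 / (2 * y)
    rhs = 1 + 2 * quad(lambda s: sin(s) / expm1(2 * pi * y * s), [0, inf])
    print(lhs, rhs)  # both equal (3/2) log 2 ~ 1.0397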
This is consistent with the finding of another answer (currently deleted):
If L_\infty = \lim_{n\to\infty}\frac{n}{x_n} exists, then L_\infty = \log 2.
To summarize, the limit \displaystyle\;\lim_{n\to\infty}\frac{x_n}{n} exists and should equal \displaystyle\;\frac{1}{\log 2}.
Update
It turns out there is a more elementary proof that y_n is bounded from above by the optimal bound \displaystyle\;\frac{1}{\log 2}.
Recall that for any \alpha > 0, we have 1 + \alpha < e^\alpha. Substituting \alpha = \frac{k}{n}\log 2 for n \ge 2 and k \ge 1, we get
\frac{n}{n + k\log 2} = \frac{1}{1 + \frac{k}{n}\log 2} > e^{-\frac{k}{n}\log 2} = 2^{-\frac{k}{n}}
This leads to
\Phi_n\left(\frac{n}{\log 2}\right) = \sum_{k=1}^\infty \left(\frac{n}{n + k\log 2}\right)^n > \sum_{k=1}^\infty 2^{-k} = 1 = \Phi_n(x_n)
Since \Phi_n(x) is increasing, this means
\displaystyle\;\frac{n}{\log 2} > x_n and y_n is bounded from above by \displaystyle\;\frac{1}{\log 2}.
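A quick numerical sanity check of this bound (a sketch, not needed for the proof): evaluating \Phi_n\left(\frac{n}{\log 2}\right) directly gives a value strictly above 1 for every n tried.

    from mpmath import mp, nsum, log, inf

    mp.dps = 30
    for n in [2, 5, 20, 100]:
        x = n / log(2)
        print(n, nsum(lambda k: (x / (x + k))**n, [1, inf]))  # exceeds 1 in every case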