Thursday, 12 May 2016

number theory - least common multiple $\lim\sqrt[n]{[1,2,\dotsc,n]}=e$




Let $[1,2,\dotsc,n]$ denote the least common multiple of $1,2,\dotsc,n$. Then



$$\lim_{n\to\infty}\sqrt[n]{[1,2,\dotsc,n]}=e$$
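As a quick numerical sanity check (a sketch, not part of the question; it uses `math.lcm`, available since Python 3.9, and takes the root via $\exp(\log L/n)$ because the lcm quickly overflows a float):

```python
from math import lcm, log, exp
from functools import reduce

def root_of_lcm(n):
    """Compute the n-th root of lcm(1, 2, ..., n)."""
    L = reduce(lcm, range(1, n + 1))
    # L is a huge integer, so take exp(log(L)/n) instead of L ** (1/n)
    return exp(log(L) / n)

for n in (10, 100, 1000):
    print(n, root_of_lcm(n))  # slowly approaches e ≈ 2.71828
```

The convergence is slow, as one would expect from the error terms in the prime number theorem.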






We can show this using the prime number theorem, but I don't know how to start.



I have heard that this proposition can be found in G.H. Hardy's number theory book, but I could not find it there.




I would be really grateful for any help.


Answer



Let's look at how the least common multiple evolves as $n$ grows.



If $n > 1$ is a prime power, $n = p^k$ ($k \geqslant 1$), then no number $< n$ is divisible by $p^k$, but $p^{k-1} < n$, so $[1,2,\dotsc,n-1] = p^{k-1}\cdot m$, where $p\nmid m$. Then $[1,2,\dotsc,n] = p^k\cdot m$, since on the one hand, we see that $p^k\cdot m$ is a common multiple of $1,2,\dotsc,n$, and on the other hand, every common multiple of $1,2,\dotsc,n$ must be a multiple of $p^k$ as well as of $m$.



If $n > 1$ is not a prime power, it is divisible by at least two distinct primes, say $p$ is one of them. Let $k$ be the exponent of $p$ in the factorisation of $n$, and $m = n/p^k$. Then $ 1 < p^k < n$ and $1 < m < n$, so $p^k\mid [1,2,\dotsc,n-1]$ and $m\mid [1,2,\dotsc,n-1]$, and since the two are coprime, also $n = p^k\cdot m \mid [1,2,\dotsc,n-1]$, which means that then $[1,2,\dotsc,n] = [1,2,\dotsc,n-1]$.
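This two-case update rule can be checked numerically. The sketch below (stdlib Python; `prime_power_part` is an ad-hoc helper introduced here) verifies that $[1,2,\dotsc,n]$ gains exactly one factor of $p$ when $n = p^k$ and is otherwise unchanged:

```python
from math import lcm

def prime_power_part(n):
    """Return (p, k) if n = p^k for a prime p and k >= 1, else None."""
    for p in range(2, n + 1):
        if n % p == 0:
            k = 0
            while n % p == 0:
                n //= p
                k += 1
            return (p, k) if n == 1 else None
    return None

# Incrementally build L = [1, 2, ..., n] and check the update rule:
# L_n = p * L_{n-1} when n = p^k, and L_n = L_{n-1} otherwise.
L = 1
for n in range(2, 200):
    L_new = lcm(L, n)
    pp = prime_power_part(n)
    if pp:
        assert L_new == pp[0] * L  # lcm grows by exactly one factor of p
    else:
        assert L_new == L          # lcm is unchanged
    L = L_new
```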



Taking logarithms, we see that for $n > 1$




$$\begin{align}
\Lambda (n) &= \log [1,2,\dotsc,n] - \log [1,2,\dotsc,n-1]\\
&= \begin{cases} \log p &, n = p^k\\ \;\: 0 &, \text{otherwise}.\end{cases}
\end{align}$$



$\Lambda$ is the von Mangoldt function, and we see that



$$\log [1,2,\dotsc,n] = \sum_{k\leqslant n} \Lambda(k) = \psi(n),$$



where $\psi$ is known as the second Chebyshev function.
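As a sanity check, both sides can be computed independently. The sketch below (with `mangoldt` an ad-hoc implementation of $\Lambda$, assumed only for this illustration) confirms $\log [1,2,\dotsc,n] = \psi(n)$ and shows $\psi(n)/n$ drifting toward $1$:

```python
from math import lcm, log

def mangoldt(n):
    """Von Mangoldt function: log p if n is a power of the prime p, else 0."""
    for p in range(2, n + 1):
        if n % p == 0:
            while n % p == 0:
                n //= p
            return log(p) if n == 1 else 0.0
    return 0.0

N = 300
L, psi = 1, 0.0
for n in range(2, N + 1):
    L = lcm(L, n)       # L = [1, 2, ..., n]
    psi += mangoldt(n)  # psi = sum of Lambda(k) for k <= n

# log of the lcm equals the second Chebyshev function
assert abs(log(L) - psi) < 1e-9
print(psi / N)  # close to 1, as the prime number theorem predicts
```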




With these observations, it is clear that



$$\lim_{n\to\infty} \sqrt[n]{[1,2,\dotsc,n]} = e\tag{1}$$



is equivalent to



$$\lim_{n\to\infty} \frac{\psi(n)}{n} = 1.\tag{2}$$



It is well-known and easy to see that $(2)$ is equivalent to the Prime Number Theorem (without error bounds)




$$\lim_{x\to\infty} \frac{\pi(x)\log x}{x} = 1.\tag{3}$$



To see the equivalence, we also introduce the first Chebyshev function,



$$\vartheta(x) = \sum_{p\leqslant x} \log p,$$



where the sum extends over the primes not exceeding $x$. We have



$$\vartheta(x) \leqslant \psi(x) = \sum_{n\leqslant x}\Lambda(n) = \sum_{p\leqslant x}\left\lfloor \frac{\log x}{\log p}\right\rfloor\log p \leqslant \sum_{p\leqslant x} \log x = \pi(x)\log x,$$
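This chain of inequalities can be verified directly for moderate $x$; the sketch below uses a basic sieve (introduced only for this check, not part of the argument):

```python
from math import log, floor

def primes_upto(x):
    """Sieve of Eratosthenes returning all primes <= x."""
    sieve = [True] * (x + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(x**0.5) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return [p for p in range(2, x + 1) if sieve[p]]

x = 10_000
ps = primes_upto(x)
theta = sum(log(p) for p in ps)                         # first Chebyshev function
psi = sum(floor(log(x) / log(p)) * log(p) for p in ps)  # second Chebyshev function
pi_logx = len(ps) * log(x)                              # pi(x) log x

assert theta <= psi <= pi_logx
```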




which shows (assuming the limits in question exist)



$$\lim_{x\to\infty} \frac{\vartheta(x)}{x} \leqslant \lim_{x\to\infty} \frac{\psi(x)}{x} \leqslant \lim_{x\to\infty} \frac{\pi(x)\log x}{x}.$$



For $x \geqslant 3$, we can split the sum at $y = \frac{x}{(\log x)^2}$ and obtain



$$\pi(x) \leqslant \pi(y) + \sum_{y < p \leqslant x} 1 \leqslant \pi(y) + \frac{1}{\log y}\sum_{y < p \leqslant x}\log p \leqslant y + \frac{\vartheta(x)}{\log y},$$



whence




$$\frac{\pi(x)\log x}{x} \leqslant \frac{y\log x}{x} + \frac{\log x}{\log y}\frac{\vartheta(x)}{x} = \frac{1}{\log x} + \frac{1}{1 - 2\frac{\log \log x}{\log x}}\frac{\vartheta(x)}{x}.$$



Since $\frac{1}{\log x}\to 0$ and $\frac{\log\log x}{\log x} \to 0$ as $x\to \infty$, it follows that (once again assuming the existence of the limits)



$$\lim_{x\to\infty} \frac{\pi(x)\log x}{x} \leqslant \lim_{x\to\infty} \frac{\vartheta(x)}{x},$$



and the proof of the equivalence of $(1)$ and $(3)$ is complete.

