Sunday, 27 July 2014

calculus - Is there a series where the terms tend to zero faster than the harmonic series but it still diverges?



I know that for the harmonic series

\lim_{n\to\infty} \frac1n = 0 and
\sum_{n=1}^{\infty} \frac1n = \infty.



I was just wondering: is there a sequence (a_n) that converges to 0 "faster" (I am not entirely sure what the exact definition is here, but I think you know what I mean...) than \frac1n, but whose series \sum_{n=1}^{\infty} a_n = \infty?



If not, is there a proof of that?


Answer



There is no slowest divergent series. Let me take this to mean that given any sequence a_n of positive numbers converging to zero whose series diverges, there is a sequence b_n that converges to zero faster and the series also diverges, where "faster" means that \lim b_n/a_n=0. In fact, given any sequences of positive numbers (a_{1,n}), (a_{2,n}),\dots with each \sum_n a_{i,n}=\infty and \lim_n a_{i+1,n}/a_{i,n}=0, there is (a_n) with \sum a_n=\infty and \lim_n a_n/a_{i,n}=0 for all i.



To see this, given a_1,a_2,\dots, first define b_1=a_1,b_2=a_2,\dots,b_k=a_k until a_1+\dots+a_k>1. Second, let b_{k+1}=a_{k+1}/2,b_{k+2}=a_{k+2}/2,\dots,b_n=a_n/2 until a_{k+1}+\dots+a_n>2, etc. That is, we proceed recursively: if we have defined b_1,\dots,b_m with b_m=a_m/2^r and b_1+\dots+b_m>r+1, let b_{m+1}=a_{m+1}/2^{r+1},\dots,b_l=a_l/2^{r+1} until a_{m+1}+\dots+a_l>2^{r+1}. The outcome is that \sum b_i=\infty (each block of b-terms contributes more than 2^{r+1}/2^{r+1}=1 to the sum) and \lim b_i/a_i=0 (the ratio is 1/2^r throughout the r-th block).
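As a sanity check, the block construction above can be sketched in a few lines of Python (the function name and the truncation parameter are mine, for illustration; the real construction of course runs forever):

```python
def slower_divergent(a_terms, max_blocks=3):
    """Given an iterator of positive terms a_n whose series diverges,
    emit b_n = a_n / 2^r, closing block r once its a-terms sum past 2^r.
    Each block of b-terms then contributes more than 2^r / 2^r = 1,
    so sum(b) diverges, while b_n / a_n = 1/2^r -> 0.
    (max_blocks truncates the construction so the sketch terminates.)"""
    b, r, block_sum = [], 0, 0.0
    for a_n in a_terms:
        b.append(a_n / 2 ** r)
        block_sum += a_n
        if block_sum > 2 ** r:       # block r complete
            r, block_sum = r + 1, 0.0
            if r >= max_blocks:
                break
    return b

# With the harmonic series, three completed blocks already push sum(b) past 3:
b = slower_divergent((1 / n for n in range(1, 10 ** 5)))
```

Note how slowly the blocks close even for the harmonic series: the third block alone already needs on the order of a thousand terms.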




Similarly, given (a_{1,n}),(a_{2,n}),\dots, with each (a_{k+1,n}) converging to zero faster than (a_{k,n}), and all of them diverging, let a_i=a_{1,i} for i\le n_1, where n_1 is such that a_{1,1}+\dots+a_{1,n_1}>1; then a_i=a_{2,i} for n_1<i\le n_2, where n_2 is such that a_{2,n_1+1}+\dots+a_{2,n_2}>1 and for any k>n_2/2 we have a_{2,k}/a_{1,k}<1/2, etc. That is, if we have defined n_k, we let a_i=a_{k+1,i} for n_k<i\le n_{k+1}, where n_{k+1} is chosen so that a_{k+1,n_k+1}+\dots+a_{k+1,n_{k+1}}>1 and for all l>n_{k+1}/2 we have a_{k+1,l}/a_{i,l}<1/2^{k+1} for all i\le k+1.
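In the same spirit, the splicing against countably many divergent series can be sketched as follows (the helper name and the cutoff n_max are hypothetical, and the +1 shifts inside the logarithms just keep the early terms positive):

```python
import math

def spliced_series(seqs, n_max=10 ** 5):
    """seqs[k](n) plays the role of a_{k+1,n}: each series diverges and
    each converges to zero faster than the previous one.  Use seqs[k]
    on block k, ending the block once its partial sum exceeds 1.  The
    spliced series diverges (every block adds more than 1), while its
    terms eventually fall under any fixed seqs[i].  We truncate at n_max
    so the sketch stops; past the last series we just keep using it."""
    a, k, block_sum = [], 0, 0.0
    for n in range(1, n_max):
        t = seqs[min(k, len(seqs) - 1)](n)
        a.append(t)
        block_sum += t
        if block_sum > 1.0:          # block k complete, move to next series
            k, block_sum = k + 1, 0.0
    return a

seqs = [
    lambda n: 1 / n,
    lambda n: 1 / (n * math.log(n + 1)),
    lambda n: 1 / (n * math.log(n + 1) * math.log(math.log(n + 2) + 1)),
]
a = spliced_series(seqs)
```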

We can modify the above slightly so that given any sequences (a_{i,n}) with \sum_n a_{i,n}<\infty and \lim_n a_{i+1,n}/a_{i,n}=\infty for all i, we can find (a_n) with \sum_n a_n<\infty and \lim_n a_n/a_{i,n}=\infty, so there is no fastest convergent series, and not even considering a sequence of faster and faster convergent series is enough. (In modern terms, there is no (0,\omega)-gap.)
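The dual construction (no fastest convergent series) admits a similar sketch: scale a convergent series up by 2^r on the blocks where its tail has dropped below 4^{-r}, so the blown-up sum stays finite while the ratio blows up. Again, the function name and the finite truncation are mine:

```python
def faster_convergent(a):
    """Given a finite truncation of a sequence of positive terms with
    convergent sum, return b_n = 2^r * a_n, where block r consists of
    the indices whose tail sum lies in [4^{-(r+1)}, 4^{-r}).  Block 0
    contributes at most sum(a), and block r >= 1 at most
    2^r * 4^{-r} = 2^{-r}, so sum(b) is finite even though
    b_n / a_n = 2^r -> infinity."""
    tails = [0.0] * (len(a) + 1)
    for i in range(len(a) - 1, -1, -1):      # suffix sums: tails[i] = a[i] + a[i+1] + ...
        tails[i] = tails[i + 1] + a[i]
    b, r = [], 0
    for i, a_i in enumerate(a):
        while tails[i] < 4.0 ** -(r + 1):    # tail small enough: start next block
            r += 1
        b.append(2.0 ** r * a_i)
    return b

a = [1 / n ** 2 for n in range(1, 10001)]
b = faster_convergent(a)
```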



What we cannot do in general is, given (a_{i,n}) with all \sum_n a_{i,n}=\infty, find (a_n) with \sum a_n=\infty and a_n/a_{i,n}\to0 for all i, if the a_{i,n} are not ordered so that a_{i+1,n} converges to zero faster than a_{i,n}. For example, take a_n=1/n if n is odd and a_n=1/n^2 if n is even, and b_n=1/n^2 if n is odd and b_n=1/n if n is even. If c_n converges to zero faster than both, then eventually c_n\le\min(a_n,b_n)=1/n^2, so \sum c_n converges by comparison. (These exceptions can typically be fixed by requiring monotonicity of the sequences, which is how these results are usually presented in the literature.)



Note that the argument I gave is completely general, no matter what series are involved. For specific series, of course, nice "formulas" are possible. For example, given a_n=1/n, we can let b_n=1/(n\log n) for n>1. Or c_n=1/(n\log n\log\log n) for n\ge 3. Or ... And we can then "diagonalize" against all these sequences as indicated above.
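Divergence cannot be "seen" numerically, of course, but computing a few partial sums gives a feel for how slowly these series grow, roughly like \log N, \log\log N, and \log\log\log N respectively. A quick illustration, not a proof:

```python
import math

# Partial sums up to N = 10^5 of the three divergent series in the text:
# 1/n, 1/(n log n), and 1/(n log n log log n).
N = 10 ** 5
s1 = sum(1 / n for n in range(1, N + 1))
s2 = sum(1 / (n * math.log(n)) for n in range(2, N + 1))
s3 = sum(1 / (n * math.log(n) * math.log(math.log(n))) for n in range(3, N + 1))
```

Here s1 is about \log(10^5)+\gamma\approx 12, while s2 has barely reached 3 after a hundred thousand terms.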



By the way, the first person to seriously study the boundary between convergence and divergence was Paul du Bois-Reymond. He proved a version of the result I just showed above: no "decreasing" sequence of divergent series "exhausts" the divergent series, in that we can always find one diverging with terms going to zero faster than the terms of any of them. A nice account of some of his work can be found in the book Orders of Infinity by Hardy. Du Bois-Reymond's work was extended by Hadamard and others. What Hadamard proved is that given (a_i) and (b_i) with \sum a_i=\infty, \sum b_i<\infty, and b_i/a_i\to 0, we can find (c_i),(d_i) with c_i/a_i\to0, b_i/d_i\to 0, d_i/c_i\to0, \sum c_i=\infty, \sum d_i<\infty. More generally:





If we have two sequences of series, (a_{1,n}), (a_{2,n}),\dots and (b_{1,n}),(b_{2,n}),\dots, such that




  • Each (a_{i+1,n}) converges to zero faster than the previous one,

  • Each (b_{i+1,n}) converges to zero slower than the previous one,

  • Each (a_{i,n}) converges to zero slower than all the (b_{j,n}),

  • Each \sum_n a_{i,n} diverges, and

  • Each \sum_n b_{i,n} converges,




then we can find sequences (c_n),(d_n) strictly "in between" (each converging to zero faster than every (a_{i,n}) and slower than every (b_{j,n})), with \sum_n c_n=\infty and \sum_n d_n<\infty.




In modern language, we say that there are no (\omega,\omega)-gaps, and similarly, there are no (\omega,1)- or (1,\omega)-gaps. This line of research led to some of Hausdorff's deep results in set theory, such as the existence of so-called (\omega_1,\omega_1)- or Hausdorff gaps. What Hausdorff proved is that this "interpolation" process, which can be iterated countably many times, cannot in general be carried out \omega_1 times, where \omega_1 is the first uncountable ordinal.

