
Conclude if the given series converges or diverges and justify the conclusion

Test the following series for convergence or divergence. Justify the decision.

    \[ \sum_{n=2}^{\infty}  \frac{1}{(\log n)^s}. \]


The series diverges. To show this we use the limit comparison test (more precisely, the comment following the proof of the limit comparison test: if \lim_{n \to \infty} \frac{a_n}{b_n} = 0, then the convergence of \sum b_n implies the convergence of \sum a_n) with

    \[ a_n = \frac{1}{n}, \qquad b_n = \frac{1}{(\log n)^s}. \]

Then we have

    \begin{align*}  \lim_{n \to \infty} \frac{a_n}{b_n} &= \lim_{n \to \infty} \frac{\frac{1}{n}}{\frac{1}{(\log n)^s}} \\[9pt]  &= \lim_{n \to \infty} \frac{(\log n)^s}{n} \\[9pt]  &= 0. \end{align*}
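For completeness, here is one way to justify that final limit for any fixed s (a sketch, substituting n = e^t, so that t \to \infty as n \to \infty):

    \[ \lim_{n \to \infty} \frac{(\log n)^s}{n} = \lim_{t \to \infty} \frac{t^s}{e^t} = 0, \]

since e^t grows faster than any fixed power of t (for s > 0, apply L'Hopital's rule repeatedly; for s \leq 0, the quotient is eventually bounded above by \frac{1}{n}).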

Hence, the convergence of \sum b_n would imply the convergence of \sum a_n. But we know \sum a_n diverges, so \sum b_n must diverge as well.

5 comments

  1. Giovanni says:

    Hello sir, and thank you so much for posting these solutions, you are really a life-saver.

    But I’m having some problems related to your use of the limit comparison method.

    The theorem in Apostol says that the convergence of \sum a_n implies the convergence of \sum b_n when the quotient equals 1, but you used it to some extent when the quotient equals 0. What is your logic?

    It is easy to extend the theorem to any constant c > 0, but not to 0.

    Could you elaborate?

    Thx in advance

    • RoRi says:

      Hi, I’m using the comment after Theorem 10.9 on page 396 of Apostol. The comment tells us that if

          \[ \lim_{n \to \infty} \frac{a_n}{b_n} = 0 \]

      then we can conclude that the convergence of \sum b_n implies the convergence of \sum a_n. The idea of this (without giving a formal proof) is just that the a_n must eventually be smaller than the b_n, say for all n \geq N (otherwise the limit couldn’t go to 0). As you point out, the full theorem doesn’t hold in this case: \sum a_n might converge, but this would not imply the convergence of \sum b_n.

      In this case though we are saying that if \sum b_n converged it would imply that \sum a_n converges. Since we know \sum a_n does not converge, this means \sum b_n cannot converge either. Does that make sense?

      • Giovanni says:

        Oh, it makes sense now (I did not notice the comment on that theorem).

        I googled a lot and couldn’t find a formal proof, but as far as “hand-waving” goes this is really intuitive.

        Thx a lot man, I must say I am a big fan of yours (I just started my Calculus 3 course and this is helping a LOT). You are a really bright person, hahaha.

      • RoRi says:

        Ha, thanks, and no problem. I think you can get to a formal proof of the comment by following Apostol’s proof of the theorem, but leaving off one side of the inequality. Since \lim_{n \to \infty} \frac{a_n}{b_n} = 0, we know there is some N such that for all n \geq N we have \frac{a_n}{b_n} < \frac{1}{2} (just taking \varepsilon = \frac{1}{2} in the definition of the limit), and so a_n < \frac{1}{2} b_n < b_n for all n \geq N. So, then by Theorem 10.8 the convergence of \sum b_n implies the convergence of \sum a_n. (This still isn’t totally rigorous, but it’s closer.)
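        Displayed as a single chain, the key step is: with \varepsilon = \frac{1}{2}, there is an N such that

            \[ n \geq N \implies \frac{a_n}{b_n} < \frac{1}{2} \implies a_n < \frac{1}{2} b_n < b_n, \]

        so from the Nth term on, \sum a_n is dominated by \sum b_n, and Theorem 10.8 (the comparison test) applies.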
