Let $\sum a_n$ be a convergent series of nonnegative terms (i.e., $a_n \geq 0$ for all $n$). Prove that

$$ \sum \frac{\sqrt{a_n}}{n^p} $$

converges for all $p > \frac{1}{2}$. Provide an example to show this fails in the case $p = \frac{1}{2}$.

(**Note.** Thanks to Giovanni in the comments for pointing out the problems with the original proof.)

*Proof.* We apply the Cauchy-Schwarz inequality. (See this exercise for some more proofs regarding the Cauchy-Schwarz inequality.) The Cauchy-Schwarz inequality establishes that

$$ \left( \sum_{n=1}^N b_n c_n \right)^2 \leq \left( \sum_{n=1}^N b_n^2 \right) \left( \sum_{n=1}^N c_n^2 \right). $$

So, to apply this to the current problem, we let

$$ b_n = \sqrt{a_n}, \qquad c_n = \frac{1}{n^p}. $$

Then, since $p > \frac{1}{2}$ we have $2p > 1$; hence, $\sum \frac{1}{n^{2p}}$ converges. So, we have

$$ \left( \sum_{n=1}^N \frac{\sqrt{a_n}}{n^p} \right)^2 \leq \left( \sum_{n=1}^N a_n \right) \left( \sum_{n=1}^N \frac{1}{n^{2p}} \right). $$

But, on the right we know $\sum a_n$ converges (by hypothesis) and we know $\sum \frac{1}{n^{2p}}$ converges for $2p > 1$. Hence, each of the sums on the right converges, so the term on the right is some finite number $M$. This means the partial sums of $\sum \frac{\sqrt{a_n}}{n^p}$ are bounded by the constant $\sqrt{M}$. By Theorem 10.7 (page 395 of Apostol) this implies the convergence of the series

$$ \sum_{n=1}^{\infty} \frac{\sqrt{a_n}}{n^p}. \qquad \blacksquare $$
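As an aside, the inequality above is easy to sanity-check numerically. The sketch below is my own illustration (the choices $a_n = 1/n^2$, $p = 3/4$, and the cutoff $N$ are arbitrary, not part of the problem); it compares the two sides of the Cauchy-Schwarz bound on a large partial sum:

```python
import math

# Illustrative choices (not from the problem): a_n = 1/n^2, p = 3/4.
N = 10_000
p = 0.75

a = [1.0 / n**2 for n in range(1, N + 1)]

# Left side: (sum of sqrt(a_n)/n^p)^2 over the partial sum.
lhs = sum(math.sqrt(a[n - 1]) / n**p for n in range(1, N + 1)) ** 2
# Right side: (sum of a_n) * (sum of 1/n^(2p)).
rhs = sum(a) * sum(1.0 / n**(2 * p) for n in range(1, N + 1))

# Cauchy-Schwarz guarantees lhs <= rhs for every partial sum.
assert lhs <= rhs
print(f"lhs = {lhs:.4f}, rhs = {rhs:.4f}")
```

Of course this checks only one partial sum of one series; the proof is what guarantees the bound in general.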
**Example.** Let $p = \frac{1}{2}$ and let $a_n = \frac{1}{n (\log n)^2}$. Then, we know $\sum \frac{1}{n (\log n)^2}$ converges (by Example #2 on page 398 of Apostol, established with the integral test). However,

$$ \sum_{n=2}^{\infty} \frac{\sqrt{a_n}}{n^{1/2}} = \sum_{n=2}^{\infty} \frac{1}{n \log n}. $$

We know

$$ \sum_{n=2}^{\infty} \frac{1}{n \log n} $$

diverges by the same example on page 398. Therefore, we conclude the theorem fails in the case $p = \frac{1}{2}$.
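No finite computation can prove divergence, but partial sums do illustrate the contrast: the convergent series' tail contributions die off quickly, while $\sum 1/(n \log n)$ keeps creeping upward like $\log \log N$. A small check (my own illustration; sums start at $n = 2$ since $\log 1 = 0$):

```python
import math

def partial(f, N):
    # Partial sum from n = 2 to N (n = 1 would divide by log(1) = 0).
    return sum(f(n) for n in range(2, N + 1))

conv = lambda n: 1.0 / (n * math.log(n) ** 2)  # convergent series
div = lambda n: 1.0 / (n * math.log(n))        # divergent, grows like log(log(N))

c1, c2 = partial(conv, 10**4), partial(conv, 10**5)
d1, d2 = partial(div, 10**4), partial(div, 10**5)

# Between N = 10^4 and 10^5 the convergent series barely moves,
# while the divergent one still gains several times as much.
assert d2 - d1 > 5 * (c2 - c1)
print(f"convergent gain: {c2 - c1:.4f}, divergent gain: {d2 - d1:.4f}")
```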

Very nice proof.

I came up with a more basic one.

Since $\sum a_n$ converges, then $a_n < \frac{1}{n}$ for all sufficiently large $n$, since otherwise $\sum a_n$ would diverge.

Taking the square root and multiplying by $\frac{1}{n^p}$ we have that the sum $\sum \frac{\sqrt{a_n}}{n^p}$ converges by comparison with $\sum \frac{1}{n^{p+1/2}}$, since $p + \frac{1}{2} > 1$.

I don’t think we know that a_n < 1/n for all n > N, for some N. My intuition tells me it should be true, but I have no proof of this.

If a_n > c/n for some n, it doesn’t exclude that a_{n+1} < c/(n+1), so I am not convinced your proof is correct.

We know that a_n < c/n for some n > N, for all N in R (because otherwise sum(a_n) would dominate sum(1/n)), which implies that the limit of a_n/(c/n) as n approaches infinity must be less than one if it exists, since we can always find an n where a_n < c/n, implying a_n/(c/n) < 1. We also know that both a_n and c/n head to zero as n approaches infinity. However, I don't know if there's a way to prove the limit exists (maybe the fact that 1/n is monotonic proves this somehow, but I couldn't think of anything so far)

Bit of a typo in the first sentence. I meant to say that we know a_n > 1/n, for some n > N, for all N.

a_n < 1/n *
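For what it's worth, the worry in this subthread is justified: $\sum a_n$ convergent does not force $a_n < c/n$ for all large $n$. A standard "spike" counterexample (my own illustration, not from the thread) puts mass $1/\sqrt{n}$ at powers of two and zero elsewhere:

```python
import math

def a(n):
    # Spikes: a_n = 1/sqrt(n) when n is a power of two, else 0.
    # (n & (n - 1)) == 0 tests whether n >= 1 is a power of two.
    return 1.0 / math.sqrt(n) if n & (n - 1) == 0 else 0.0

# The series converges: its nonzero terms form the geometric series
# sum over k >= 0 of 2^(-k/2).
partial = sum(a(n) for n in range(1, 2**20 + 1))
geometric = sum(2.0 ** (-0.5 * k) for k in range(21))
assert abs(partial - geometric) < 1e-9

# Yet a_n > 1/n infinitely often: at n = 2^k, a_n / (1/n) = sqrt(n).
for k in (4, 10, 16):
    n = 2**k
    assert a(n) > 1.0 / n
print("series converges, but a_n > 1/n at every power of two")
```

So any comparison with $1/n$ has to be argued more carefully than "otherwise $\sum a_n$ would diverge".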

The proof stated above is interesting, but I think it can be proven even more easily, first by noting that a_n < 1/n for sufficiently large n, so that for p > 1/2 we have sqrt(a_n)/n^p < 1/n^(p + 1/2). Since p + 1/2 > 1, it converges, hence by Comparison Test, we have sqrt(a_n)/n^p converges as well, which completes the proof.

I don’t know why but my previous comment was somehow cut in the middle. So the proof is as follows: we first note that a_n < 1/n for sufficiently large n, so for p > 1/2 we get sqrt(a_n)/n^p < 1/n^(p + 1/2). Since p + 1/2 > 1, by Comparison Test, sqrt(a_n)/n^p converges, which completes the proof.

Ok, just one more thing and I’m sorry for taking your time. When you say: “since squaring the sum is finite, the sum itself must also be finite” aren’t you saying:

$$ \sum b_n^2 \text{ converges} \quad \implies \quad \sum b_n \text{ converges,} $$

which is false?

This is the part where I don’t follow you, everything else makes total sense

No worries (I’m waiting for my bus anyway). I am definitely not saying

$$ \sum b_n^2 \text{ converges} \quad \implies \quad \sum b_n \text{ converges.} $$

(Since that is not true.) What the proof is claiming is

$$ \left( \sum b_n \right)^2 \text{ is finite} \quad \implies \quad \sum b_n \text{ is finite.} $$

Which really is just saying that

$$ \left( \sum b_n \right) \left( \sum c_n \right) $$

converges implies that both $\sum b_n$ and $\sum c_n$ converge. We probably have a theorem that says this (I don’t have my book with me), but it must be true since if either of them diverges then the product diverges as well.

In our proof we showed that $\left( \sum \frac{\sqrt{a_n}}{n^p} \right)^2$ was convergent (by the comparison test). So then the claim is that $\sum \frac{\sqrt{a_n}}{n^p}$ converges. Does that help?

I think I got it now, thx.

So I can say that if a series $\sum b_n$ converges and $\sum c_n$ converges, then $\sum b_n c_n$ converges? I thought of using Abel’s or Dirichlet’s tests but I couldn’t fit these sequences into their hypotheses.

Sorry, I’m being really sloppy with this. I need to restate this proof more carefully. I’m not sure we’ve really defined what it means to multiply two infinite series. So, it is true that if $\sum b_n$ converges and $\sum c_n$ converges then $\left( \sum b_n \right) \left( \sum c_n \right)$ is a finite number, but we haven’t really defined what it means to multiply two infinite series. (Be careful though, $\sum b_n$ converges and $\sum c_n$ converges definitely does not imply $\sum b_n c_n$ converges… take $b_n = c_n = \frac{(-1)^n}{\sqrt{n}}$, then each converges by Leibniz, but the product is $\sum \frac{1}{n}$ so does not converge). Anyway, we can fix the proof by taking square roots. So from Cauchy-Schwarz we have

$$ \left( \sum_{n=1}^N b_n c_n \right)^2 \leq \left( \sum_{n=1}^N b_n^2 \right) \left( \sum_{n=1}^N c_n^2 \right) $$

implies

$$ \sum_{n=1}^N b_n c_n \leq \left( \sum_{n=1}^N b_n^2 \right)^{1/2} \left( \sum_{n=1}^N c_n^2 \right)^{1/2}, $$

which gives us

$$ \sum_{n=1}^N \frac{\sqrt{a_n}}{n^p} \leq \left( \sum_{n=1}^N a_n \right)^{1/2} \left( \sum_{n=1}^N \frac{1}{n^{2p}} \right)^{1/2}. $$

Since we know the two series on the right converge, the product of their square roots is just some finite number. Thus, the partial sums of the series on the left are bounded by some constant, so by Theorem 10.7, this means the series on the left is convergent. For the problem at hand, substitute the same $b_n = \sqrt{a_n}$ and $c_n = \frac{1}{n^p}$.
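The warning about termwise products is easy to see numerically. A quick sketch (my own illustration) with $b_n = c_n = \frac{(-1)^n}{\sqrt{n}}$: the alternating partial sums settle down, while the partial sums of the termwise product $b_n c_n = \frac{1}{n}$ keep growing like $\log N$:

```python
import math

def b(n):
    return (-1) ** n / math.sqrt(n)

def partial_b(N):
    # Partial sums of the alternating series (converges by Leibniz).
    return sum(b(n) for n in range(1, N + 1))

def partial_prod(N):
    # Partial sums of b_n * c_n with c_n = b_n, i.e. the harmonic series.
    return sum(b(n) * b(n) for n in range(1, N + 1))

# Alternating tail is bounded by the first omitted term, about 1/sqrt(50001).
assert abs(partial_b(100_000) - partial_b(50_000)) < 0.01
# Harmonic partial sums still gain about log(2) between those cutoffs.
assert partial_prod(100_000) - partial_prod(50_000) > 0.5
```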

Ok, I really think this is right now. Thanks for being persistent! I’ll post a fix to the proof later today. Hopefully this makes more sense than what I was saying before?

Ok it seems really great now, no objections, hahaha.

I thank you a lot for the attention, I really do.

Ha, no problem. Plus, it turns out my original proof wasn’t good! Thanks for taking the time to point out mistakes. I’m sure there are tons of them.

Could you please elaborate a little more on why you are proving it for $p > \frac{1}{2}$?

Because, to me, it feels like you are proving it just for $p > 1$.

Sorry if my doubt is basic, I just got really confused.

Hi, I think you’re right. There is a mistake in this proof, for exactly the reason you say. Let me think about it and I’ll try to post a fix as soon as I have one (hopefully by tomorrow). It’s not immediately obvious to me how to fix this (or what I was thinking when I wrote it… maybe I had an idea why this works, but I just don’t see it now).

Hey, I think it’s fixed now! The Cauchy-Schwarz strategy still worked, but somewhere in writing it up I’d mixed something up. I think the proof works now. Let me know if it makes sense.

Also, did you use the “subscribe to comments feature”, and did it function properly? (I think you’re the first person to comment since I set it up.)

It worked fine for me, I got an email when my comment was responded to, no problems.

Would your proof work for $\frac{\sqrt{a_n}}{n}$?

Awesome.

It will work for $\sum \frac{\sqrt{a_n}}{n}$ (the case $p = 1$). In that case we apply Cauchy-Schwarz with $b_n = \sqrt{a_n}$ and $c_n = \frac{1}{n}$. Then we have

$$ \sum_{n=1}^N \frac{\sqrt{a_n}}{n} \leq \left( \sum_{n=1}^N a_n \right)^{1/2} \left( \sum_{n=1}^N \frac{1}{n^2} \right)^{1/2}. $$

Since both of the series on the right converge, the series on the left must converge also, since its partial sums are bounded (Theorem 10.7 again).
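As with the general case, the $p = 1$ bound can be sanity-checked numerically. A sketch (my own illustration; the choice $a_n = n^{-3/2}$ is arbitrary and avoids the equality case $\sqrt{a_n} = 1/n$):

```python
import math

N = 10_000
a = [n ** -1.5 for n in range(1, N + 1)]  # illustrative convergent series

# Left side: partial sum of sqrt(a_n)/n.
lhs = sum(math.sqrt(a[n - 1]) / n for n in range(1, N + 1))
# Right side: sqrt(sum of a_n) * sqrt(sum of 1/n^2).
rhs = math.sqrt(sum(a)) * math.sqrt(sum(1.0 / n**2 for n in range(1, N + 1)))

# Cauchy-Schwarz guarantees the bound for every partial sum.
assert lhs <= rhs
print(f"lhs = {lhs:.4f}, rhs = {rhs:.4f}")
```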