Consider the series
$$\sum_{n=1}^{\infty} \left( \frac{1 \cdot 3 \cdot 5 \cdots (2n-1)}{2 \cdot 4 \cdot 6 \cdots (2n)} \right)^k.$$
Use Gauss' test (from the previous exercise, Section 10.16, Exercise #17) to prove that the series converges for $k > 2$ and diverges for $k \leq 2$.
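For reference, the statement of Gauss' test being invoked (paraphrased from the standard formulation of the cited exercise) is:

```latex
% Gauss' test: let a_n > 0 and suppose there exist constants A, s > 1,
% M > 0, and an integer N >= 1 such that
%   a_{n+1}/a_n = 1 - A/n + f(n)/n^s   for all n >= N,
% where |f(n)| <= M for all n. Then the series converges if A > 1
% and diverges if A <= 1.
\[
  \frac{a_{n+1}}{a_n} = 1 - \frac{A}{n} + \frac{f(n)}{n^{s}}
  \quad (n \ge N,\; s > 1,\; |f(n)| \le M)
  \;\Longrightarrow\;
  \sum a_n \text{ converges if } A > 1 \text{ and diverges if } A \le 1.
\]
```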
Proof. From the definition of the series we have the $n$th and $(n+1)$st terms,
$$a_n = \left( \frac{1 \cdot 3 \cdot 5 \cdots (2n-1)}{2 \cdot 4 \cdot 6 \cdots (2n)} \right)^k, \qquad a_{n+1} = \left( \frac{1 \cdot 3 \cdot 5 \cdots (2n+1)}{2 \cdot 4 \cdot 6 \cdots (2n+2)} \right)^k.$$
Therefore,
$$\frac{a_{n+1}}{a_n} = \left( \frac{2n+1}{2n+2} \right)^k = \left( 1 - \frac{1}{2(n+1)} \right)^k.$$
Using the Taylor expansion $(1-x)^k = 1 - kx + O(x^2)$ as $x \to 0$, with $x = \frac{1}{2(n+1)}$, we have
$$\frac{a_{n+1}}{a_n} = 1 - \frac{k}{2(n+1)} + O\left( \frac{1}{n^2} \right) = 1 - \frac{k/2}{n} + \frac{f(n)}{n^2},$$
where $f(n)$ is bounded (the difference $\frac{k}{2n} - \frac{k}{2(n+1)} = \frac{k}{2n(n+1)}$ is absorbed into the $\frac{f(n)}{n^2}$ term). Letting $A = \frac{k}{2}$ and $s = 2$, we apply Gauss' test to conclude that $\sum a_n$ converges if $k > 2$ and diverges if $k \leq 2$. $\blacksquare$
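As a numerical sanity check (not part of the proof), a short script can confirm that the quantity $n^2 \left( \frac{a_{n+1}}{a_n} - 1 + \frac{k/2}{n} \right)$ stays bounded for a sample value of $k$, which is exactly what Gauss' test requires. The function names below are illustrative, not from the original.

```python
def ratio(n: int, k: float) -> float:
    """The ratio a_{n+1}/a_n = ((2n+1)/(2n+2))^k of the series."""
    return ((2 * n + 1) / (2 * n + 2)) ** k


def g(n: int, k: float) -> float:
    """n^2 * (a_{n+1}/a_n - 1 + (k/2)/n); Gauss' test needs this bounded."""
    return n ** 2 * (ratio(n, k) - 1 + (k / 2) / n)


if __name__ == "__main__":
    # For k = 3 the values of g(n) settle down rather than blowing up.
    k = 3.0
    for n in (10, 100, 1000, 10000):
        print(n, g(n, k))
```

Running this for $k = 3$ shows $g(n)$ approaching a finite limit, consistent with $f(n)$ being bounded.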
hi, note that $\left( \frac{2n+1}{2n+2} \right)^k = \left( 1 - \frac{1}{2(n+1)} \right)^k$
Here is my proof:
First, shift the index of the series so that the ratio becomes $\left( 1 - \frac{1}{2n} \right)^k$, which is nicer to work with. Second, write the ratio as
$$1 - \frac{k/2}{n} + \frac{f(n)}{n^2}, \quad \text{where} \quad f(n) = n^2 \left( \left( 1 - \frac{1}{2n} \right)^k - 1 + \frac{k}{2n} \right).$$
Next, prove that $f(n)$ has the limit $\frac{k(k-1)}{8}$. This shows that $f$ is bounded: it must lie within $1$ of its limit for all sufficiently large $n$, and over the remaining finite set of indices it is bounded by its maximum and minimum. Finally, use Gauss' test to conclude that the series converges if and only if $k > 2$.
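The claimed limit $f(n) \to \frac{k(k-1)}{8}$ follows from the second-order Taylor term $\binom{k}{2} x^2$ of $(1-x)^k$ at $x = \frac{1}{2n}$, and it is easy to check numerically. A minimal illustration (the function name is mine, not the commenter's):

```python
def f(n: int, k: float) -> float:
    """n^2 times the remainder of (1 - 1/(2n))^k after the linear term."""
    return n ** 2 * ((1 - 1 / (2 * n)) ** k - 1 + k / (2 * n))


if __name__ == "__main__":
    # f(n) should approach k(k-1)/8 as n grows, for any fixed k.
    for k in (2.5, 3.0):
        print(k, k * (k - 1) / 8, f(10 ** 3, k), f(10 ** 4, k))
```

For $k = 3$ the limit is $\frac{3 \cdot 2}{8} = 0.75$, and the computed values converge to it.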
Your definition of $a_n$ appears to be wrong: you wrote the same expression for both $a_n$ and $a_{n+1}$.