Show that the series obtained from a generalization of the decimal expansion converges

We may generalize the decimal expansion of a number by replacing the integer 10 with any integer b > 1. If x > 0, let a_0 denote the largest integer less than or equal to x. Assuming the integers a_0, a_1, \ldots, a_{n-1} have been defined, let a_n denote the largest integer such that

    \[ \sum_{k=0}^n \frac{a_k}{b^k} \leq x. \]

Show that the series

    \[ \sum_{k=0}^{\infty} \frac{a_k}{b^k} \]

converges and has sum x.
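As a concrete illustration (not part of the original exercise), the greedy definition of the digits a_k can be carried out with exact rational arithmetic. The function name digits and the sample values x = 10/3, b = 7 below are my own choices. Since 1/3 = 2/7 + 2/49 + \cdots, the base-7 digits of 10/3 come out 3, 2, 2, 2, \ldots.

```python
from fractions import Fraction

def digits(x, b, n):
    """Greedy base-b digits a_0, ..., a_n of x > 0: each a_k is the
    largest integer keeping sum_{j<=k} a_j / b**j at most x."""
    x = Fraction(x)
    a, s = [], Fraction(0)
    for k in range(n + 1):
        # The largest integer a_k with s + a_k / b**k <= x is
        # floor((x - s) * b**k); int() equals floor here since x - s >= 0.
        a_k = int((x - s) * b**k)
        a.append(a_k)
        s += Fraction(a_k, b**k)
    return a, s

# x = 10/3 in base b = 7: integer part 3, then 1/3 = 0.222..._7
a, s = digits(Fraction(10, 3), 7, 5)
print(a)                                         # [3, 2, 2, 2, 2, 2]
print(Fraction(10, 3) - s < Fraction(1, 7**5))   # True
```

By construction the remainder x - s_n stays below 1/b^n, which is what forces the partial sums to converge to x.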


Proof. First we show 0 \leq a_k \leq b-1 for all k \geq 1. Since \sum_{j=0}^{k-1} \frac{a_j}{b^j} \leq x, the value a_k = 0 satisfies the defining inequality, so a_k \geq 0. If we had a_k \geq b, then

    \[ \sum_{j=0}^{k-1} \frac{a_j}{b^j} + \frac{1}{b^{k-1}} \leq \sum_{j=0}^{k-1} \frac{a_j}{b^j} + \frac{a_k}{b^k} \leq x, \]

so a_{k-1} + 1 would also satisfy the defining inequality, contradicting the maximality of a_{k-1}. Hence a_k \leq b-1, and

    \[ \sum_{k=1}^{\infty} \frac{a_k}{b^k} \leq \sum_{k=1}^{\infty} \frac{b-1}{b^k} = (b-1) \cdot \frac{1/b}{1 - \frac{1}{b}} = 1. \]

Thus the partial sums are nondecreasing and bounded above by a_0 + 1, so the series

    \[ \sum_{k=0}^{\infty} \frac{a_k}{b^k} \]

converges.

It remains to show the sum is x. Let s_n = \sum_{k=0}^n \frac{a_k}{b^k}. By definition s_n \leq x, and since a_n is the largest integer with this property, replacing a_n by a_n + 1 must violate the inequality; that is, s_n + \frac{1}{b^n} > x. Hence

    \[ x - \frac{1}{b^n} < s_n \leq x \qquad \text{for all } n. \]

Since b > 1, we have \frac{1}{b^n} \to 0 as n \to \infty, so s_n \to x; that is,

    \[ \sum_{k=0}^{\infty} \frac{a_k}{b^k} = x. \qquad \blacksquare \]
