Prove properties of polynomials of a complex variable with real coefficients

Consider the polynomial f(z) of a complex variable with real coefficients.

  1. Prove that

        \[ \overline{f(z)} = f (\overline{z}) \]

    for all z \in \mathbb{C}.

  2. Using part (a) show that if f has any nonreal zeros, they must occur in pairs of a complex number and its conjugate.

  1. Proof. Let

        \[ f(z) = a_0 + a_1 z + a_2 z^2 + \cdots + a_n z^n \qquad a_i \in \mathbb{R}. \]

    Then, using the properties of conjugation we have,

        \begin{align*}  \overline{f(z)} &= \overline{a_0 + a_1 z + \cdots + a_n z^n} \\  &= \overline{a_0} + \overline{a_1 z} + \cdots + \overline{a_n z^n} &(\text{since } \overline{z_1 + z_2} = \overline{z_1} + \overline{z_2}) \\  &= \overline{a_0} + \overline{a_1} \, \overline{z} + \cdots + \overline{a_n} \, \overline{z^n} &(\text{since } \overline{z_1 z_2} = \overline{z_1} \overline{z_2})\\  &= a_0 + a_1 \overline{z} + a_2 \overline{z^2} + \cdots + a_n \overline{z^n} &(\text{since } \overline{a} = a \text{ for } a \in \mathbb{R}) \\  &= a_0 + a_1 \overline{z} + a_2 (\overline{z})^2 + \cdots + a_n (\overline{z})^n \\  &= f(\overline{z}). \end{align*}

    (The final line follows since \overline{z^2} = \overline{z \cdot z} = \overline{z} \cdot \overline{z} = (\overline{z})^2 and then induction to get \overline{z^n} = (\overline{z})^n for all n.) This completes the proof. \qquad \blacksquare

  2. Proof. If z is a non-real zero of f then

        \[ f(z) = 0 \quad \implies \quad \overline{f(z)} = 0 \quad \implies \quad f(\overline{z}) = 0, \]

    and z \neq \overline{z} (since z is not real by assumption). Hence, \overline{z} is also a zero of f, so the non-real zeros come in pairs. \qquad \blacksquare
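Both parts can be spot-checked in sympy; this is just a sketch, and the cubic below is an arbitrary example with real coefficients, not one from the text.

```python
# Sketch verifying parts (1) and (2) with sympy on a sample polynomial.
import sympy as sp

z = sp.symbols('z')
f = sp.Poly(z**3 - 2*z**2 + z - 2, z)  # real coefficients, chosen arbitrarily

# Part (1): conjugate(f(z)) == f(conjugate(z)) at a sample point.
w = 1 + 2*sp.I
lhs = sp.conjugate(f.eval(w))
rhs = f.eval(sp.conjugate(w))
assert sp.simplify(lhs - rhs) == 0

# Part (2): the nonreal zeros occur in conjugate pairs.
roots = sp.roots(f, z)
nonreal = [r for r in roots if not r.is_real]
assert all(sp.conjugate(r) in nonreal for r in nonreal)
```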

Derive some properties of the product of e^x with a polynomial

Let

    \[ f(x) = e^x p(x) \qquad \text{where} \qquad p(x) = c_0 + c_1 x + c_2x^2. \]

  1. Prove that

        \[ f^{(n)} (0) = c_0 + nc_1 + n(n-1)c_2 \]

    where f^{(n)} denotes the nth derivative of f.

  2. Do part (a) in the case that p(x) is a cubic polynomial.
  3. Find a similar formula and prove it in the case that p(x) is a polynomial of degree m.

For all of these we recall from a previous exercise (Section 5.11, Exercise #4) that by Leibniz’s formula if h(x) = f(x)g(x) then the nth derivative h^{(n)}(x) is given by

    \[ h^{(n)} (x) = \sum_{k=0}^n \binom{n}{k} f^{(k)} (x) g^{(n-k)}(x). \]

So, in the case at hand we have f(x) = e^x p(x) and so

    \[ f^{(n)} (x) = \sum_{k=0}^n \binom{n}{k} p^{(k)}(x) e^x. \]

(Since the (n-k)th derivative of e^x is still e^x for all n and k.)

  1. Proof. From the formula above we have

        \[ f^{(n)}(0) = \sum_{k=0}^n \binom{n}{k} p^{(k)}(0) e^0 = \sum_{k=0}^n \binom{n}{k} p^{(k)}(0). \]

    But, since p(x) is a quadratic polynomial we have

        \begin{align*}  p(x) &= c_0 + c_1 x + c_2 x^2 \\[9pt]  p'(x) &= c_1 + 2c_2 x \\[9pt]  p''(x) &= 2c_2 \\[9pt]  p^{(k)} (x) &= 0 & \text{for all } k \geq 3. \end{align*}

    Hence, we have

        \begin{align*}   f^{(n)}(0) &= \binom{n}{0} p(0) + \binom{n}{1} p'(0) + \binom{n}{2} p''(0) \\[9pt]  &= c_0 + n c_1 + \frac{n(n-1)}{2} 2c_2 \\[9pt]  &= c_0 + n c_1 + n(n-1) c_2. \qquad \blacksquare \end{align*}

  2. If p(x) is a cubic polynomial we may write,

        \[ p(x) = c_0 + c_1 x + c_2 x^2 + c_3 x^3. \]

    Claim: If f(x) = e^x p(x) then

        \[ f^{(n)}(0) = c_0 + nc_1 + n(n-1)c_2 + n(n-1)(n-2)c_3. \]

    Proof. We follow the exact same procedure as part (a) except now we have the derivatives of p(x) given by

        \begin{align*}  p(x) &= c_0 + c_1 x + c_2x^2 + c_3x^3 \\[9pt]  p'(x) &= c_1 + 2c_2 x + 3c_3 x^2 \\[9pt]  p''(x) &= 2c_2 + 6c_3 x \\[9pt]  p^{(3)}(x) &= 6c_3 \\[9pt]  p^{(k)}(x) &= 0 & \text{for all } k \geq 4.  \end{align*}

    Therefore, we now have

        \begin{align*}  f^{(n)}(0) &= \sum_{k=0}^n \binom{n}{k} p^{(k)}(0) e^0 \\[9pt]  &= \sum_{k=0}^n \binom{n}{k} p^{(k)}(0) \\[9pt]  &= \binom{n}{0} p(0) + \binom{n}{1} p'(0) + \binom{n}{2} p''(0) + \binom{n}{3} p'''(0) \\[9pt]  &= c_0 + nc_1 + \frac{n(n-1)}{2} 2c_2 + \frac{n(n-1)(n-2)}{6} 6c_3 \\[9pt]  &= c_0 + nc_1 + n(n-1)c_2 + n(n-1)(n-2)c_3. \qquad \blacksquare \end{align*}

  3. Claim: Let p(x) be a polynomial of degree m,

        \[ p(x) = \sum_{k=0}^m c_k x^k. \]

    Let f(x) = e^x p(x). Then,

        \[ f^{(n)}(0) = \sum_{k=0}^m k! \binom{n}{k} c_k. \]

    Proof. Using Leibniz’s formula again (terms with k > m vanish since p^{(k)} = 0 there, and any terms with k > n vanish since \binom{n}{k} = 0, so we may index the sum by k = 0 to m), we have

        \[ f^{(n)}(0) = \sum_{k=0}^m \binom{n}{k} p^{(k)}(0). \]

    But for the degree m polynomial p(x), we know p^{(k)}(0) = k!c_k if 0 \leq k \leq m and p^{(k)} (0) = 0 for all k > m. Hence, we have

        \[ f^{(n)}(0) = \sum_{k=0}^m k! \binom{n}{k} c_k. \qquad \blacksquare \]
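The general formula is easy to test numerically; in this sketch the degree-4 polynomial p and the choice n = 6 are arbitrary test values, not from the text.

```python
# Checking f^(n)(0) = sum_k k! C(n,k) c_k for f(x) = e^x p(x) on a sample p.
import sympy as sp

x = sp.symbols('x')
c = [3, -1, 4, 1, -5]                     # coefficients c_0..c_4 of p (arbitrary)
p = sum(ck * x**k for k, ck in enumerate(c))
f = sp.exp(x) * p
n = 6

direct = sp.diff(f, x, n).subs(x, 0)      # the nth derivative, evaluated at 0
formula = sum(sp.factorial(k) * sp.binomial(n, k) * ck
              for k, ck in enumerate(c))  # the claimed closed form
assert sp.simplify(direct - formula) == 0
```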

Prove properties of the Bernoulli polynomials

The Bernoulli polynomials are defined by

    \[ P_0(x) = 1; \qquad P'_n(x) = n P_{n-1} (x) \qquad \text{and} \qquad \int_0^1 P_n (x) \, dx = 0 \qquad \text{if } n \geq 1. \]

  1. Find explicit formulas for the first Bernoulli polynomials in the cases n = 1,2,3,4,5.
  2. Use mathematical induction to prove that P_n(x) is a degree n polynomial in x, where the degree n term is x^n.
  3. For n \geq 2 prove that P_n (0) = P_n (1).
  4. For n \geq 1 prove that

        \[ P_n (x+1) - P_n (x) = nx^{n-1}. \]

  5. Prove that

        \[ \sum_{r=1}^{k-1} r^n = \int_0^k P_n (x) \, dx = \frac{P_{n+1}(k) - P_{n+1}(0)}{n+1} \]

    for n \geq 2.

  6. Prove that for n \geq 1,

        \[ P_n (1-x) = (-1)^n P_n (x). \]

  7. Prove that for n \geq 1,

        \[ P_{2n+1} (0) = 0 \qquad \text{and} \qquad P_{2n-1}\left( \frac{1}{2} \right) = 0. \]


  1. We start with the initial condition P_0(x) = 1. This gives us

        \[ P'_1 (x) = 1 \cdot P_0(x) = 1 \quad \implies \quad P_1 (x) = \int dx = x + C_1. \]

    Now, using the integral condition to find C_1,

        \[ \int_0^1 (x+C_1) \, dx = 0 \quad \implies \quad \left( \frac{1}{2}x^2 + C_1 x \right) \Bigr \rvert_0^1 = 0 \quad \implies \quad C_1 = -\frac{1}{2}. \]

    Thus,

        \[ P_1 (x) = x -\frac{1}{2}. \]

    Next, using this expression for P_1(x) we have

        \[ P'_2 (x) = 2 \cdot \left( x- \frac{1}{2} \right) = 2x - 1 \quad \implies \quad P_2 (x) = x^2 - x + C_2. \]

    Using the integral condition to find C_2,

        \[ \int_0^1 (x^2 - x + C_2) \, dx = 0 \quad \implies \quad \frac{1}{3} - \frac{1}{2} + C_2 = 0 \quad \implies \quad C_2 = \frac{1}{6}. \]

    Thus,

        \[ P_2 (x) = x^2 - x + \frac{1}{6}. \]

    Next, using this expression for P_2 (x) we have

        \[ P'_3 (x) = 3 \cdot \left( x^2 - x + \frac{1}{6} \right) = 3x^2 - 3x + \frac{1}{2} \quad \implies \quad P_3(x) = x^3 - \frac{3}{2}x^2 + \frac{1}{2}x + C_3. \]

    Using the integral condition to find C_3,

        \[ \int_0^1 \left( x^3 - \frac{3}{2}x^2 + \frac{1}{2}x + C_3\right) \, dx = 0 \quad \implies \quad \frac{1}{4} - \frac{1}{2} + \frac{1}{4} + C_3 = 0 \quad \implies \quad C_3 = 0. \]

    Thus,

        \[ P_3 (x) = x^3 - \frac{3}{2}x^2 + \frac{1}{2} x. \]

    Next, using this expression for P_3 (x) we have

        \[ P'_4(x) = 4 \cdot \left( x^3 - \frac{3}{2}x^2 + \frac{1}{2}x \right) = 4x^3 - 6x^2 + 2x \quad \implies \quad P_4 (x) = x^4 - 2x^3 + x^2 + C_4. \]

    Using the integral condition to find C_4,

        \[ \int_0^1 \left( x^4 - 2x^3 +  x^2 + C_4 \right) \, dx = 0 \quad \implies \quad \frac{1}{5} - \frac{1}{2} + \frac{1}{3} + C_4 = 0 \quad \implies \quad C_4 = -\frac{1}{30}. \]

    Thus,

        \[ P_4 (x) = x^4 - 2x^3 + x^2 - \frac{1}{30}. \]

    Finally, using this expression for P_4 (x) we have

        \[ P'_5(x) = 5 \cdot \left( x^4 - 2x^3 + x^2 - \frac{1}{30} \right) = 5x^4 - 10x^3 + 5x^2 - \frac{1}{6} \quad \implies \quad P_5 (x) = x^5 - \frac{5}{2} x^4 + \frac{5}{3}x^3 - \frac{1}{6}x + C_5. \]

    Using the integral condition to find C_5,

        \[ \int_0^1 \left( x^5 - \frac{5}{2} x^4 + \frac{5}{3} x^3 - \frac{1}{6} x + C_5 \right) \, dx = 0 \quad \implies \quad \frac{1}{6} - \frac{1}{2} + \frac{5}{12} - \frac{1}{12} + C_5 = 0 \quad \implies \quad C_5 = 0. \]

    Thus,

        \[ P_5 (x) = x^5 - \frac{5}{2}x^4 + \frac{5}{3}x^3 - \frac{1}{6}x. \]

  2. Proof. We have shown in part (a) that this statement is true for n = 0, 1, \ldots, 5. Assume then that the statement is true for some positive integer m, i.e.,

        \[ P_m (x) = x^m + \sum_{k=0}^{m-1} a_k x^k. \]

    Then, by the definition of the Bernoulli polynomials we have,

        \[ P_{m+1}' (x) = (m+1) \cdot \left( x^m + \sum_{k=0}^{m-1} a_k x^k \right) = (m+1)x^m + \sum_{k=0}^{m-1} b_k x^k, \]

    where b_k = (m+1)a_k for k = 0, \ldots, m-1. Then, taking the integral of this expression (the constant of integration is determined by the integral condition and does not affect the leading term),

        \[ P_{m+1} (x) = \int \left( (m+1)x^m + \sum_{k=0}^{m-1} b_k x^k \right) \, dx = x^{m+1} + \sum_{k=0}^{m-1} \frac{b_k}{k+1} x^{k+1} + C. \]

    Hence, the statement is true for the case m+1; hence, for all positive integers n. \qquad \blacksquare

  3. Proof. From the integral property in the definition of the Bernoulli polynomials we know for n \geq 1,

        \[ \int_0^1 P_n (x) \, dx = 0 \quad \implies \quad \int_0^1 (n+1) P_n (x) \, dx = 0. \]

    Then, using the first part of the definition we have P'_{n+1} (x) = (n+1) P_n (x); therefore,

        \[ 0 = \int_0^1 (n+1) P_n (x) \, dx = \int_0^1 P'_{n+1} (x) \, dx = P_{n+1}(1) - P_{n+1}(0). \]

    Thus, we indeed have

        \[ P_{n+1} (1) = P_{n+1}(0). \qquad \blacksquare \]

  4. Proof. The proof is by induction. For the case n = 1 we have

        \[ P_1 (x) = x - \frac{1}{2}, \qquad \text{and} \qquad P_1 (x+1) = x + \frac{1}{2}. \]

    Therefore,

        \[ P_1 (x+1) - P_1 (x) = 1. \]

    Since nx^{n-1} = 1 \cdot x^0 = 1, the stated difference equation holds for n =1. Assume then that the statement holds for some positive integer m. Then by the fundamental theorem of calculus, we have

        \[ P_{m+1} (x) = \int_0^x P'_{m+1}(t) \, dt = (m+1) \int_0^x P_m (t) \, dt. \]

    Therefore,

        \begin{align*}  P_{m+1}(x+1) - P_{m+1}(x) &= (m+1) \left( \int_0^{x+1} P_m (t) \, dt - \int_0^x P_m (t) \, dt \right) \\[9pt]  &= (m+1) \left( \int_0^1 P_m (t) \, dt + \int_1^{x+1} P_m (t) \, dt - \int_0^x P_m (t) \, dt \right) \\[9pt]  &= (m+1) \left( 0 + \int_0^x P_m (t+1) \, dt - \int_0^x P_m (t) \, dt \right) &( \text{Integral condition})\\[9pt]  &= (m+1) \left( \int_0^x (P_m (t+1) - P_m (t)) \, dt\right) \\[9pt]  &= (m+1) \left( \int_0^x mt^{m-1} \, dt \right)&(\text{Induction Hypothesis})\\[9pt]  &= (m+1) x^m.  \end{align*}

    Hence, the statement is true for the case m+1, and so it is true for all positive integers n. \qquad \blacksquare

  5. Proof. (Let’s assume Apostol means for k to be some positive integer.) First, we use the definition of the Bernoulli polynomials to compute the integral,

        \begin{align*}  \int_0^k P_n (x) \, dx &= \int_0^k \frac{1}{n+1} P'_{n+1} (x) \, dx \\[9pt]  &= \frac{P_{n+1}(k) - P_{n+1}(0)}{n+1}. \end{align*}

    Now, we want to express the numerator as a telescoping sum and use part (d),

        \begin{align*}   P_{n+1}(k) - P_{n+1}(0) &= \sum_{r = 0}^{k-1} \left( P_{n+1}(r + 1) - P_{n+1}(r) \right) \\[9pt]  &= \sum_{r=0}^{k-1} (n+1)r^n \\  &= (n+1) \sum_{r=1}^{k-1} r^n &(\text{since the } r = 0 \text{ term vanishes}). \end{align*}

    Thus, we indeed have

        \[ \sum_{r=1}^{k-1} r^n = \int_0^k P_n (x) \, dx = \frac{P_{n+1}(k) - P_{n+1}(0)}{n+1}. \qquad \blacksquare\]

  6. Incomplete. I’ll try to fix parts (f) and (g) soon(ish).
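Parts (a) and (e) can be spot-checked by running the defining recursion mechanically in sympy; a sketch, where the test values n = 3 and k = 10 in the part (e) check are arbitrary choices.

```python
# Recompute P_1..P_5 from P_0 = 1, P_n' = n P_{n-1}, and ∫_0^1 P_n dx = 0,
# then spot-check the sum-of-powers identity from part (e).
import sympy as sp

x, C = sp.symbols('x C')
P = [sp.Integer(1)]                                    # P_0(x) = 1
for n in range(1, 6):
    Pn = sp.integrate(n * P[n - 1], x) + C             # P_n' = n P_{n-1}
    Cn = sp.solve(sp.integrate(Pn, (x, 0, 1)), C)[0]   # integral condition
    P.append(sp.expand(Pn.subs(C, Cn)))

assert P[4] == sp.expand(x**4 - 2*x**3 + x**2 - sp.Rational(1, 30))

# Part (e): sum_{r=1}^{k-1} r^n == (P_{n+1}(k) - P_{n+1}(0)) / (n+1)
n, k = 3, 10
lhs = sum(r**n for r in range(1, k))
rhs = (P[n + 1].subs(x, k) - P[n + 1].subs(x, 0)) / (n + 1)
assert rhs == lhs
```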

Find a polynomial satisfying given conditions

  1. Find a polynomial satisfying

        \[ P'(x) - 3P(x) = 4 - 5x + 3x^2. \]

    Prove that there is only one such polynomial.

  2. Given a polynomial Q(x), prove there is exactly one polynomial P(x) such that

        \[ P'(x) - 3P(x) = Q(x). \]


  1. Proof. (Finding the polynomial will prove that it is unique since we will not have any choices to make while deriving the polynomial P(x).) First, we write

        \begin{align*}   P(x) = \sum_{k=0}^n c_k x^k && \implies && P'(x) &= \sum_{k=1}^n c_k (k) x^{k-1} \\  &&&&& = \sum_{k=0}^{n-1} (k+1)c_{k+1} x^k. \end{align*}

    Thus, we have

        \[ P'(x) - 3P(x) = \sum_{k=0}^{n-1} (k+1)c_{k+1} x^k - \sum_{k=0}^n 3c_k x^k = \sum_{k=0}^{n-1} ((k+1)c_{k+1} - 3c_k)x^k - 3c_n x^n. \]

    Setting this equal to 4 - 5x + 3x^2 we have

        \[ \sum_{k=0}^{n-1} ((k+1)c_{k+1} - 3c_k)x^k - 3c_n x^n = 4 - 5x + 3x^2. \]

    But, this implies n = 2 and c_2 = -1 since -3c_n x^n is the only x^n term on the left (so if n > 2, then x^2 could not be the largest power of x on the right). Therefore, P(x) is a degree 2 polynomial with c_2 = -1, so we have

        \[ P(x) = c_0 + c_1 x - x^2 \quad \implies \quad P'(x) = c_1 - 2x. \]

    Hence we have

        \begin{align*}  P'(x) - 3P(x) = 4 - 5x + 3x^2 && \implies && c_1 - 2x - 3c_0 - 3c_1 x + 3x^2 &= 4 - 5x + 3x^2 \\  && \implies && (c_1 - 3c_0) - (2+3c_1)x &= 4 - 5x. \end{align*}

    Thus we have the equations

        \[ c_1 - 3c_0 = 4 \quad \text{and} \quad 2+3c_1 = 5.\]

    These uniquely determine c_0 and c_1,

        \[ c_1 = 1,\quad c_0 = -1. \]

    Hence, there is a unique P(x) satisfying this equation,

        \[ P(x) = -1 + x - x^2.  \qquad \blacksquare \]

  2. Proof. Let Q(x) be a given polynomial and suppose there exist two polynomials P(x) and R(x) such that

        \[ P'(x) - 3P(x) = Q(x) \quad \text{and} \quad R'(x) - 3R(x) = Q(x). \]

    This implies

        \[ (P'(x) - R'(x)) - 3(P(x) - R(x)) = 0 \quad \implies \quad (P(x) - R(x))' - 3(P(x) - R(x)) = 0. \]

    Now, if P(x) - R(x) \neq 0 then it has degree n for some n \geq 0. If n = 0 then P(x) - R(x) is a nonzero constant, so (P(x) - R(x))' = 0 and the equation above forces -3(P(x) - R(x)) = 0, a contradiction. If n \geq 1, we know the derivative (P(x) - R(x))' has degree n-1 (Apostol, Page 166). But then, this would imply

        \[ (P(x) - R(x))' - 3(P(x) - R(x)) \]

    has degree n (the coefficient of x^n in (P(x) - R(x))' is zero since that derivative has degree n-1, while the coefficient of x^n in 3(P(x) - R(x)) is nonzero since P(x) - R(x) has degree n). But we know this expression is 0, which has no degree at all. Thus, we must have P(x) - R(x) = 0, i.e., P(x) = R(x). \qquad \blacksquare
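The polynomial found in part (a) is easy to verify directly; this sketch just substitutes it back into the equation.

```python
# Substituting the part (a) answer P(x) = -1 + x - x^2 into P'(x) - 3P(x).
import sympy as sp

x = sp.symbols('x')
P = -1 + x - x**2
assert sp.simplify(sp.diff(P, x) - 3*P - (4 - 5*x + 3*x**2)) == 0
```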

Find a quintic polynomial meeting given conditions

Find a polynomial P of degree \leq 5 satisfying the following conditions:

    \begin{align*}  P(0) &= 1, \\  P(1) &= 2, \\  P'(0) &= P''(0) = P'(1) = P''(1) = 0. \end{align*}


Since P must be a polynomial of degree \leq 5 we may write

    \[ P(x) = a_5 x^5 + a_4 x^4 + a_3 x^3 + a_2 x^2 + a_1 x + a_0 \]

where any of the a_i may be 0 (since we could have P a polynomial of degree strictly less than 5). First, let’s apply the condition P(0) = 1 to obtain

    \[ P(0) = a_0 = 1. \]

Now, let’s take the first two derivatives since we have conditions on P' and P''.

    \begin{align*}  P(x) &= a_5 x^5 + a_4 x^4 + a_3 x^3 + a_2 x^2 + a_1 x + 1 \\  P'(x) &= 5a_5 x^4 + 4a_4 x^3 + 3a_3 x^2 + 2a_2 x + a_1 \\  P''(x) &= 20a_5 x^3 + 12a_4 x^2 + 6a_3 x + 2a_2. \end{align*}

We can then apply the conditions P'(0) = 0 and P''(0) = 0 to obtain

    \begin{align*}  P'(0) &= a_1 = 0 \\  P''(0) &= 2a_2 = 0. \end{align*}

So now we have a_0 = 1 and a_1 = a_2 = 0 and so

    \[ P(x) = a_5 x^5 + a_4 x^4 + a_3 x^3 + 1. \]

Now we need to use the other three conditions

    \begin{align*}  P(1) &= 2 & \implies && a_5 + a_4 + a_3 &= 1 \\  P'(1) &= 0 & \implies && 5a_5 + 4a_4 + 3a_3 &= 0 \\  P''(1) &= 0 & \implies && 20a_5 + 12a_4 + 6a_3 &= 0. \end{align*}

(If you know some linear algebra feel free to solve this in a more efficient way.) From the first equation we have

    \[ a_3 = 1 - a_4 - a_5. \]

Plugging this into the second equation we have

    \[ 5a_5 + 4a_4 + 3(1 - a_4 - a_5) = 0 \quad \implies \quad 2a_5 + a_4 + 3 = 0 \quad \implies \quad a_4 = -3-2a_5. \]

Now plugging in our expressions for a_3 and a_4 into the third equation we have

    \[ 20a_5 + 12(-3-2a_5) + 6(1-(-3-2a_5)-a_5) = 0 \quad \implies \quad a_5 = 6. \]

Then using our expressions for a_3 and a_4 we have

    \begin{align*}  a_4 &= -15 \\  a_3 &= 10. \end{align*}

Now, we have computed all of the constants a_i so we can write down the formula for the polynomial

    \[ P(x) = 6x^5 - 15x^4 + 10x^3 + 1. \]
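A quick sympy sketch confirming this quintic meets all six stated conditions:

```python
# Check P(0) = 1, P(1) = 2, and P'(0) = P''(0) = P'(1) = P''(1) = 0.
import sympy as sp

x = sp.symbols('x')
P = 6*x**5 - 15*x**4 + 10*x**3 + 1
assert P.subs(x, 0) == 1 and P.subs(x, 1) == 2
for order in (1, 2):
    d = sp.diff(P, x, order)
    assert d.subs(x, 0) == 0 and d.subs(x, 1) == 0
```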

Compute the derivatives of g(x) = x^n f(x)

Assume f is a polynomial with f(0) = 1. Define

    \[ g(x) = x^n f(x). \]

Compute the values of g(0), g'(0), \ldots, g^{(n)}(0).


Assume n is a non-negative integer (otherwise g is undefined at x = 0). Then, we make the following claim:

Claim: The polynomial g(x) has derivatives at 0 given by the following

    \[ g^{(k)}(0) = \begin{cases} 0 & \text{if } 0 \leq k < n \\ n! & \text{if } k =n. \end{cases} \]

Proof. Since f is a polynomial we may write,

    \[ f(x) = a_m x^m + a_{m-1} x^{m-1} + \cdots + a_1 x + a_0. \]

Furthermore, since f(0) = 1 is given we know a_0 = 1. Now, multiplying by x^n we have

    \begin{align*}  g(x) &= x^n f(x) = x^n (a_m x^m + \cdots + a_1 x + 1) \\  &= a_m x^{m+n} + \cdots + a_1 x^{n+1} + x^n. \end{align*}

Next, we will use induction to prove that the kth derivative of g(x) for 0 \leq k \leq n is given by

    \[ g^{(k)}(x) = c_m x^{m+n-k} + c_{m-1} x^{m+n-k-1} + \cdots + c_1 x^{n+1-k} + \frac{n!}{(n-k)!} x^{n-k},\]

for constants c_1, \ldots, c_m. Since the derivative g'(x) is given by

    \begin{align*}   g'(x) &= (m+n) a_m x^{m+n-1} + (m+n-1) a_{m-1} x^{m+n-2} + \cdots + (n+1) a_1 x^n + n x^{n-1} \\  &= b_m x^{m+n-1} + b_{m-1} x^{m+n-2} + \cdots + b_1 x^n + \frac{n!}{(n-1)!}x^{n-1}, \end{align*}

for constants b_1, \ldots, b_m, we see that the formula holds for k = 1. Assume then that it holds for some k,

    \[ g^{(k)}(x) = b_m x^{m+n-k} + b_{m-1} x^{m+n-k-1} + \cdots + b_1 x^{n+1-k} + \frac{n!}{(n-k)!}x^{n-k}. \]

Then, taking the derivative of this we have,

    \begin{align*}   g^{(k+1)}(x) &= b_m (m+n-k) x^{m+n-k-1} + b_{m-1} (m+n-k-1) x^{m+n-k-2} + \cdots + b_1 (n+1-k) x^{n+1-k-1} + \frac{n!}{(n-k)!} (n-k) x^{n-k-1} \\  &= c_m x^{m+n-(k+1)} + c_{m-1} x^{m+n-(k+1)-1} + \cdots + c_1 x^{n+1-(k+1)} + \frac{n!}{(n-(k+1))!} x^{n-(k+1)}. \end{align*}

Hence, the formula holds for all 0 \leq k \leq n. But then, if 0 \leq k < n, every term of

    \[ g^{(k)}(x) = b_m x^{m+n-k} + b_{m-1} x^{m+n-k-1} + \cdots + b_1 x^{n+1-k} + \frac{n!}{(n-k)!}x^{n-k} \]

contains a positive power of x; hence, g^{(k)}(0) = 0.

If k = n then x^{n-k} = x^0 = 1 for all x; hence,

    \[ g^{(n)}(0) = \frac{n!}{0!} = n!. \qquad \blacksquare\]
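The claim is easy to check on an example; the polynomial f and the value n = 4 below are arbitrary choices with f(0) = 1, not from the text.

```python
# Checking g^(k)(0) = 0 for k < n and g^(n)(0) = n! on a sample f with f(0) = 1.
import sympy as sp

x = sp.symbols('x')
f = 1 + 2*x + 7*x**3          # any polynomial with f(0) = 1
n = 4
g = x**n * f

derivs = [sp.diff(g, x, k).subs(x, 0) for k in range(n + 1)]
assert derivs[:n] == [0] * n              # g(0) = g'(0) = ... = g^(n-1)(0) = 0
assert derivs[n] == sp.factorial(n)       # g^(n)(0) = n!
```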

Prove there is no polynomial with derivative 1/x

Prove that there is no polynomial f such that

    \[ f'(x) = \frac{1}{x}. \]


Proof. We know from Example 1 of Section 4.5 in Apostol (p. 166) that every polynomial is differentiable everywhere on \mathbb{R}. (In that example we show that the derivative of a polynomial is a polynomial, and we know that polynomials are defined everywhere on \mathbb{R}.) However, the function \frac{1}{x} is not defined for x = 0. Hence, this function cannot be the derivative of a polynomial. \qquad \blacksquare

Prove properties about the zeros of a polynomial and its derivatives

Consider a polynomial f. We say a number \alpha is a zero of multiplicity m if

    \[ f(x) = (x - \alpha)^m g(x), \]

where g(\alpha) \neq 0.

  1. Prove that if the polynomial f has r zeros in [a,b], then its derivative f' has at least r-1 zeros in [a,b]. More generally, prove that the kth derivative, f^{(k)} has at least r-k zeros in the interval.
  2. Assume the kth derivative f^{(k)} has exactly r zeros in the interval [a,b]. What can we say about the number of zeros of f in the interval?

  1. Proof. Let \alpha_1, \ldots, \alpha_k denote the k distinct zeros of f in [a,b] and m_1, \ldots, m_k their multiplicities, respectively. Thus, the total number of zeros is given by,

        \[ r = \sum_{i=1}^k m_i. \]

    By the definition given in the problem, if \alpha_i is a zero of f of multiplicity m_i then

        \[ f(x) = (x - \alpha_i)^{m_i} g(x) \qquad \text{where } g(\alpha_i) \neq 0. \]

    Taking the derivative (using the product rule), we have

        \begin{align*}   f'(x) &= m_i (x - \alpha_i)^{m_i - 1} g(x) + (x- \alpha_i)^{m_i} g'(x)\\  &= (x - \alpha_i)^{m_i - 1} (m_i g(x) + (x - \alpha_i) g'(x))  \end{align*}

    Thus, again using the definition given in the problem, \alpha_i is a zero of f'(x) of multiplicity m_i -1.
    Next, we know from Rolle’s theorem that for distinct zeros \alpha_i and \alpha_j of f there exists a number c \in (\alpha_i, \alpha_j) (assuming, without loss of generality, that \alpha_i < \alpha_j) such that f'(c) = 0. Hence, if f has k distinct zeros, then Rolle’s theorem guarantees k-1 such numbers c, each distinct from the \alpha_i. Thus, f' has at least:

        \[ \left(\sum_{i=1}^k (m_i -1)\right) + k - 1 = \left( \sum_{i=1}^k m_i \right) - 1 = r-1 \ \ \text{zeros}. \]

    By induction then, the kth derivative f^{(k)}(x) has at least r-k zeros.

  2. If the kth derivative f^{(k)} has exactly r zeros in [a,b], then we can conclude that f has at most r+k zeros in [a,b], since if f had more than r+k zeros then part (a) would force f^{(k)} to have more than r zeros.
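A small sympy illustration of the count in part (a); the polynomial below is an arbitrary example, not from the text.

```python
# f has r = 3 zeros in [0, 3] counted with multiplicity (1 twice, 2 once),
# so part (a) says f' should have at least r - 1 = 2 zeros there.
import sympy as sp

x = sp.symbols('x')
f = (x - 1)**2 * (x - 2)
r = 3

fp_roots = sp.roots(sp.diff(f, x), x)   # roots of f' with multiplicities
count = sum(m for root, m in fp_roots.items() if 0 <= root <= 3)
assert count >= r - 1
```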

Prove there is exactly one negative solution to an equation

Show there is exactly one b < 0 such that b^n = a for a < 0 and n an odd, positive integer.


Proof. Let f(x) = x^n, and choose c < -1 with c < a < 0. Then c^n \leq c for odd n, so c^n \leq c < a < 0. Since f(c) = c^n and f(0) = 0, the Intermediate Value Theorem guarantees that f takes every value between c^n and 0 at some point of [c,0]. Since c^n < a < 0, there exists b \in [c,0] such that f(b) = a, i.e., b^n = a; and b < 0 since f(b) = a \neq 0 = f(0).

We know this solution is unique since f is strictly increasing on the whole real line for odd n. \qquad \blacksquare
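For a concrete instance, sympy’s real_root picks out the unique real nth root; the values a = -32 and n = 5 below are arbitrary choices.

```python
# The unique negative b with b^5 = -32.
import sympy as sp

a, n = -32, 5
b = sp.real_root(a, n)
assert b < 0 and b**n == a
```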