Tag: Mean Value Theorem

Prove the integral from 0 to x of sin t / (t + 1) is nonnegative for nonnegative x

Prove that for all x \geq 0 we have

    \[ \int_0^x \frac{\sin t}{t+1} \, dt \geq 0. \]


Before proceeding with the proof, we recall the second mean-value theorem for integrals (Theorem 5.5 on p. 219 of Apostol): if g is continuous on the interval [a,b] and f has a continuous derivative which never changes sign on [a,b], then there exists a c \in [a,b] such that

    \[ \int_a^b f(x) g(x) \, dx = f(a) \int_a^c g(x) \, dx + f(b) \int_c^b g(x) \, dx. \]

Proof. Now, we want to apply the mean-value theorem above with g(t) = \sin t and f(t) = \frac{1}{t+1}. Since \sin t is continuous everywhere on \mathbb{R}, it is continuous on any interval [0,x]. Then,

    \[ f'(t) = \frac{-1}{(1+t)^2} \]

is continuous for all t \neq -1 (so, in particular, for all t \geq 0). Furthermore, since (1+t)^2 > 0 for all t \geq 0 we have that f'(t) < 0 for all t \geq 0. Thus, f'(t) is continuous and never changes sign in any interval [0,x]. Therefore, we can apply the mean-value theorem to conclude there exists a c \in [0,x] for any x \geq 0 such that

    \begin{align*}  \int_0^x \left(\frac{1}{1+t}\right) \sin t \, dt &= \frac{1}{1+0} \int_0^c \sin t \, dt + \frac{1}{1+x} \int_c^x \sin t \, dt \\  &= (- \cos t) \Big \rvert_0^c + \frac{1}{1+x} (-\cos t) \Big \rvert_c^x \\  &= (1 - \cos c) + \frac{1}{1+x} (\cos c - \cos x). \end{align*}

But since \cos x \leq 1 for all x, we have \cos c - \cos x \geq \cos c - 1. Moreover, 0 < \frac{1}{1+x} \leq 1 for x \geq 0 while \cos c - 1 \leq 0, so \frac{1}{1+x}(\cos c - \cos x) \geq \cos c - 1. Hence,

    \[ \int_0^x \frac{\sin t}{1+t} \, dt \geq (1 - \cos c) + (\cos c - 1) = 0. \qquad \blacksquare \]
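
As a quick numerical sanity check, here is a minimal Python sketch (the helper name and step count are just illustrative) that approximates the integral with a midpoint rule for several values of x:

    import math

    def integral(x, n=100000):
        """Composite midpoint approximation of the integral of sin(t)/(1+t) from 0 to x."""
        h = x / n
        return sum(math.sin((k + 0.5) * h) / (1 + (k + 0.5) * h) for k in range(n)) * h

    for x in [0.5, math.pi, 2 * math.pi, 10.0, 50.0]:
        val = integral(x)
        print(f"x = {x:7.3f}   integral = {val: .6f}   nonnegative: {val >= 0}")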

Application of the mean-value theorem for integrals

  1. Let \varphi be a function with second derivative \varphi'' continuous and nonzero on an interval [a,b]. Furthermore, let m > 0 be a constant such that

        \[ \varphi'(t) \geq m \qquad \text{for all } t \in [a,b]. \]

    Use the second mean-value theorem for integrals (Theorem 5.5 in Apostol) to prove the inequality

        \[ \left| \int_a^b \sin \varphi(t) \, dt \right| \leq \frac{4}{m}. \]

  2. If a > 0 prove that

        \[ \left| \int_a^x \sin (t^2) \, dt \right| \leq \frac{2}{a} \qquad \text{for } x > a. \]


  1. Proof. Since we have the assumption that \varphi'(t) \geq m > 0 for all t \in [a,b], we may multiply and divide by \varphi'(t) to obtain

        \[ \left| \int_a^b \sin \varphi(t) \, dt \right| = \left| \int_a^b \frac{\sin \varphi(t)}{\varphi'(t)} \cdot \varphi'(t) \, dt \right|. \]

    Then, to apply the second mean-value theorem for integrals (Theorem 5.5 of Apostol) we define functions

        \[ f(t) = \frac{1}{\varphi'(t)} \qquad \text{and} \qquad g(t) = \varphi'(t) \sin \varphi(t). \]

    The function g is continuous since \sin \varphi(t) is a composition of continuous functions (we know \varphi(t) is continuous since it is differentiable) and \varphi'(t) is continuous (\varphi' is differentiable since \varphi''(t) exists, and \varphi'' is continuous by assumption); the product of continuous functions is again continuous. We also know that f meets the conditions of the theorem since it has derivative given by

        \[ f'(t) = - \frac{\varphi''(t)}{(\varphi'(t))^2}. \]

    This derivative is continuous since \varphi''(t) and \varphi'(t) are continuous and \varphi'(t) is nonzero. Furthermore, this derivative does not change sign on [a,b] since \varphi''(t) is nonzero on [a,b] (and by Bolzano’s theorem we know that a continuous function that changes sign must have a zero). Therefore, we can apply the second mean-value theorem:

        \begin{align*}  \left| \int_a^b \frac{\varphi'(t) \sin \varphi(t)}{\varphi'(t)} \, dt \right| &= \left| \frac{1}{\varphi'(a)} \int_a^c \varphi'(t) \sin \varphi(t) \, dt + \frac{1}{\varphi'(b)} \int_c^b \varphi'(t) \sin \varphi(t) \, dt \right| \\  &\leq \frac{1}{\varphi'(a)} \left| \int_a^c \varphi'(t) \sin \varphi(t) \, dt \right| + \frac{1}{\varphi'(b)} \left| \int_c^b \varphi'(t) \sin \varphi(t) \, dt \right| \\ \intertext{(by the triangle inequality, since $\varphi'(a), \varphi'(b) > 0$)}  &\leq \frac{1}{m} \left| \int_a^c \varphi'(t) \sin \varphi(t) \, dt \right| + \frac{1}{m} \left| \int_c^b \varphi'(t) \sin \varphi(t) \, dt \right| \\  &= \frac{1}{m} \left| \cos \varphi(a) - \cos \varphi(c) \right| + \frac{1}{m} \left| \cos \varphi(c) - \cos \varphi(b) \right| \\  &\leq \frac{2}{m} + \frac{2}{m} \\  &= \frac{4}{m}. \qquad \blacksquare \end{align*}

  2. Proof. Using part (1), we take \varphi(t) = t^2, giving us

        \[ \varphi(t) = t^2 \quad \implies \quad \varphi'(t) = 2t \quad \implies \quad \varphi''(t) = 2. \]

    Thus, \varphi''(t) is continuous and never changes sign. Furthermore, on the interval [a,x] we have \varphi'(t) = 2t \geq 2a = m, and m = 2a > 0 since a > 0. Thus, applying part (1) on [a,x],

        \[ \left| \int_a^x \sin (t^2) \, dt \right| \leq \frac{4}{2a} = \frac{2}{a} \qquad \text{for all } x > a. \qquad \blacksquare \]
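
A minimal numerical sketch of part (2) (the sample values of x and the step count are arbitrary choices): approximate \int_a^x \sin(t^2) \, dt for a = 1 and compare against the bound 2/a = 2.

    import math

    def integral_sin_t2(a, x, n=200000):
        """Composite midpoint approximation of the integral of sin(t^2) from a to x."""
        h = (x - a) / n
        return sum(math.sin((a + (k + 0.5) * h) ** 2) for k in range(n)) * h

    a = 1.0
    for x in [2.0, 5.0, 20.0, 100.0]:
        val = integral_sin_t2(a, x)
        print(f"x = {x:6.1f}   |integral| = {abs(val):.6f}   bound 2/a = {2 / a:.1f}")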

Find a function with continuous second derivative satisfying given conditions

In each of the following cases find a function f with continuous second derivative f'' satisfying the given conditions.

  1. f''(x) > 0 for all x, f'(0) = 1, and f'(1) = 0.
  2. f''(x) > 0 for all x, f'(0) = 1, and f'(1) = 3.
  3. f''(x) > 0 for all x, f'(0) = 1, and f(x) \leq 100 for all x > 0.
  4. f''(x) > 0 for all x, f'(0) = 1, and f(x) \leq 100 for all x < 0.

  1. There can be no function meeting all of these conditions since f''(x) > 0 implies f'(x) is an increasing function (since its derivative, f'', is positive). But then f'(0) > f'(1) contradicts that f'(x) is increasing.
  2. Let f(x) = x^2 + x. Then

        \begin{align*}    f'(x) = 2x + 1 && \implies  f'(0) &= 1, \\  && \implies f'(1) &= 3 \end{align*}

    Furthermore, f''(x) = 2 > 0 for all x.

  3. There can be no function meeting all of these conditions. Again, f''(x) > 0 for all x implies that f'(x) is increasing for all x. Therefore, f'(0) = 1 implies f'(x) > 1 for all x > 0. Then, by the mean-value theorem, we know that for any b > 0 there exists some c \in (0,b) such that

        \[ f(b) - f(0) = f'(c) (b-0) \quad \implies \quad f(b) > b + f(0) \qquad (\text{since } f'(c) > 1). \]

    Now, choose b > \max(0, 100 - f(0)). Then f(b) > b + f(0) > 100, contradicting that f(x) \leq 100 for all x > 0.

  4. We’ll define f piecewise as follows

        \[ f(x) = \begin{dcases} 1 + x  + x^2 & \text{if } x \geq 0 \\ \frac{1}{1-x} & \text{if } x < 0. \end{dcases} \]

    Then, we can take the derivative of each piece (the one-sided derivatives agree at x = 0, so the derivative is defined there):

        \[ f'(x) = \begin{dcases} 1 + 2x &\text{if } x \geq 0 \\ \frac{1}{(1-x)^2} & \text{if } x < 0. \end{dcases} \]

    Taking the derivative again we find

        \[ f''(x) = \begin{dcases} 2 & \text{if } x \geq 0 \\ \frac{2}{(1-x)^3} & \text{if } x < 0. \end{dcases} \]

    Thus, f''(x) > 0 for all x and f'(0) = 1. Furthermore, for x < 0 we have 1 - x > 1, so

        \[ f(x) = \frac{1}{1-x} < 1 \leq 100 \text{ for all } x < 0. \]
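
As an informal check of part (4), the sketch below approximates f'(0) and f''(x) by central differences (the helper names and sample points are illustrative only):

    def f(x):
        """Candidate from part (4): f''(x) > 0 everywhere, f'(0) = 1, f(x) <= 100 for x < 0."""
        return 1 + x + x * x if x >= 0 else 1 / (1 - x)

    def d1(g, x, h=1e-6):
        """Central-difference first derivative."""
        return (g(x + h) - g(x - h)) / (2 * h)

    def d2(g, x, h=1e-4):
        """Central-difference second derivative."""
        return (g(x + h) - 2 * g(x) + g(x - h)) / (h * h)

    print("f'(0) ~", d1(f, 0.0))  # expect approximately 1
    for x in [-10.0, -1.0, -0.1, 0.1, 1.0, 10.0]:
        print(f"x = {x:6.1f}   f''(x) ~ {d2(f, x): .4f}   f(x) = {f(x): .4f}")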

Prove some inequalities using the mean value theorem

Prove the following inequalities using the mean value theorem:

  1. | \sin x - \sin y| \leq |x - y|.
  2. ny^{n-1} (x-y) \leq x^n - y^n \leq nx^{n-1} (x-y) \qquad \text{if } 0 < y \leq x, for n = 1, 2, 3, \ldots.

  1. Proof. Define f(t) = \sin t and g(t) = t. Then f and g are continuous and differentiable everywhere, so we may apply the mean value theorem on the interval with endpoints x and y (if x = y the inequality is trivial). We obtain

        \begin{align*}  &&f'(c) (g(x) - g(y)) &= g'(c) (f(x) - f(y))  & (\text{for some } c \text{ between } x \text{ and } y)\\ \implies && (\cos c) (x-y) &= \sin x - \sin y \\ \implies && | \cos c | | x-y| &= | \sin x - \sin y| \\ \implies && |x-y| &\geq | \sin x - \sin y|. \end{align*}

    The final step follows since | \cos c | \leq 1 for all c. \qquad \blacksquare

  2. Proof. Let f(t) = t^n and g(t) = t; then f'(t) = nt^{n-1} and g'(t) = 1. So, by the mean-value theorem there exists a c \in [y,x] such that

        \begin{align*}  && f'(c) (g(x) - g(y)) &= g'(c)(f(x) - f(y)) \\ \implies && nc^{n-1}(x-y) &= x^n - y^n. \end{align*}

    But, since t^{n-1} is a nondecreasing function on the positive real axis, and we have 0 < y \leq c \leq x, we know

        \[ y^{n-1} \leq c^{n-1} \leq x^{n-1}. \]

    Further, since (x-y) \geq 0 and n is positive we can multiply all of the terms in the inequality by n (x-y) without reversing inequalities to obtain,

        \[ ny^{n-1}(x-y) \leq nc^{n-1} (x-y) \leq nx^{n-1} (x-y). \]

    Therefore, substituting nc^{n-1} (x-y) = x^n - y^n from above we obtain the requested inequality:

        \[ ny^{n-1} (x-y) \leq x^n - y^n \leq nx^{n-1} (x-y). \qquad \blacksquare \]
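
A minimal numerical spot-check of both inequalities (the sampled points are arbitrary):

    import math
    import random

    random.seed(0)

    # (1)  |sin x - sin y| <= |x - y|
    for _ in range(5):
        x, y = random.uniform(-10, 10), random.uniform(-10, 10)
        assert abs(math.sin(x) - math.sin(y)) <= abs(x - y)

    # (2)  n y^(n-1) (x - y) <= x^n - y^n <= n x^(n-1) (x - y)  for 0 < y <= x
    for n in range(1, 6):
        for _ in range(5):
            y = random.uniform(0.1, 5.0)
            x = y + random.uniform(0.0, 5.0)
            lower = n * y ** (n - 1) * (x - y)
            upper = n * x ** (n - 1) * (x - y)
            assert lower <= x ** n - y ** n <= upper

    print("all sampled cases satisfy both inequalities")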

Prove properties about the zeros of a polynomial and its derivatives

Consider a polynomial f. We say a number \alpha is a zero of multiplicity m if

    \[ f(x) = (x - \alpha)^m g(x), \]

where g(\alpha) \neq 0.

  1. Prove that if the polynomial f has r zeros in [a,b], then its derivative f' has at least r-1 zeros in [a,b]. More generally, prove that the kth derivative, f^{(k)}, has at least r-k zeros in the interval.
  2. Assume the kth derivative f^{(k)} has exactly r zeros in the interval [a,b]. What can we say about the number of zeros of f in the interval?

  1. Proof. Let \alpha_1, \ldots, \alpha_k denote the k distinct zeros of f in [a,b] and m_1, \ldots, m_k their respective multiplicities. Thus, the total number of zeros, counted with multiplicity, is

        \[ r = \sum_{i=1}^k m_i. \]

    By the definition given in the problem, if \alpha_i is a zero of f of multiplicity m_i then

        \[ f(x) = (x - \alpha_i)^{m_i} g(x) \qquad \text{where } g(\alpha_i) \neq 0. \]

    Taking the derivative (using the product rule), we have

        \begin{align*}   f'(x) &= m_i (x - \alpha_i)^{m_i - 1} g(x) + (x- \alpha_i)^{m_i} g'(x)\\  &= (x - \alpha_i)^{m_i - 1} (m_i g(x) + (x - \alpha_i) g'(x))  \end{align*}

    Since the second factor equals m_i g(\alpha_i) \neq 0 at \alpha_i, the definition given in the problem shows that \alpha_i is a zero of f'(x) of multiplicity m_i - 1.
    Next, we know from Rolle’s theorem (a consequence of the mean-value theorem for derivatives) that for consecutive distinct zeros \alpha_i < \alpha_{i+1} of f there exists a number c \in (\alpha_i, \alpha_{i+1}) such that f'(c) = 0. Hence, if f has k distinct zeros, we obtain k-1 such numbers c, each strictly between two of the \alpha_i and therefore distinct from them. Thus, f' has at least:

        \[ \left(\sum_{i=1}^k (m_i -1)\right) + k - 1 = \left( \sum_{i=1}^k m_i \right) - 1 = r-1 \ \ \text{zeros}. \]

    By induction then, the kth derivative f^{(k)}(x) has at least r-k zeros.

  2. If the kth derivative f^{(k)} has exactly r zeros in [a,b], then we can conclude that f has at most r+k zeros in [a,b] (if f had more than r+k zeros there, part (1) would force f^{(k)} to have more than r zeros in the interval).
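
To illustrate part (1) with a concrete example, take f(x) = (x-1)^2 (x-2)(x-3), which has r = 4 zeros in [0,4] counted with multiplicity, so f' should have at least r - 1 = 3 zeros there. The sketch below (grid size and difference step are arbitrary) counts sign changes of a central-difference approximation of f':

    def f(x):
        """Example polynomial with a double zero at 1 and simple zeros at 2 and 3."""
        return (x - 1) ** 2 * (x - 2) * (x - 3)

    def fprime(x, h=1e-6):
        """Central-difference approximation of f'."""
        return (f(x + h) - f(x - h)) / (2 * h)

    n = 40000
    pts = [4.0 * k / n for k in range(n + 1)]
    vals = [fprime(x) for x in pts]
    zeros = sum(1 for a, b in zip(vals, vals[1:]) if a == 0.0 or a * b < 0)
    print("zeros of f' detected in [0, 4]:", zeros)  # expect at least 3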

Prove an alternate expression for the mean-value formula

Prove that the expression

    \[ f(x+h) = f(x) + hf'(x+ \theta h) \qquad \text{where } 0 < \theta < 1 \]

is an equivalent form of the mean-value theorem.
Find the value of \theta in terms of x and h when:

  1. f(x) = x^2;
  2. f(x) = x^3.

For parts (a) and (b) keep x fixed with x \neq 0 and find the limit of \theta as h tends to 0.


Proof. If f is continuous on [a,b] and differentiable on (a,b), then by the mean-value theorem we have

    \[ f(b) - f(a) = f'(c) (b-a) \qquad \text{for some } c \in (a,b). \]

Letting a = x and b = x + h for some h > 0 (since b > a), we have

    \[ c \in (a,b) \quad \implies \quad c = x + \theta h \qquad \text{for some } \theta \in (0,1). \]

(This follows since, from our definitions, h = b - a is the length of the interval. Then, since c lies strictly between a and b, its value must be a plus some fraction of the distance to b. This fraction is \theta, which is how we know 0 < \theta < 1.) Substituting a = x, b = x + h, and c = x + \theta h,

    \begin{align*}  f(b) - f(a) = f'(c) (b-a) && \implies && f(x+h) - f(x) &= f'(x+ \theta h)(x+h-x) \\  && \implies && f(x+h) &= f(x) + h f'(x + \theta h), \end{align*}

where 0 < \theta < 1 and h > 0. \qquad \blacksquare

Now for parts (a) and (b).

  1. If f(x) = x^2, we have f'(x) = 2x, so,

        \begin{align*}  f(x+h) = f(x) + hf'(x+ \theta h) && \implies && (x+h)^2 &= x^2 + 2h (x + \theta h) \\  && \implies && h^2 &= 2 \theta h^2 \\  && \implies && \theta &= \frac{1}{2} \\  && \implies && \lim_{h \to 0} \theta = \frac{1}{2}.  \end{align*}

  2. If f(x) = x^3, we have f'(x) = 3x^2. So,

        \begin{align*}  &&f(x+h) &= f(x) + hf'(x+ \theta h) \\  \implies && (x^3 + 3x^2 h + 3xh^2 + h^3) &= x^3 + 3hx^2 + 6 \theta x h^2 + 3 \theta^2 h^3 \\  \implies && 0 &= (3h^3)\theta^2 + (6xh^2) \theta - (3xh^2 + h^3) \\  \implies && 0 &= h \theta^2 + 2x \theta - \left( x + \frac{h}{3} \right) \\ \intertext{Using the quadratic formula to solve for $\theta$ and noting that for $x > 0$ only the positive root lies in $(0,1)$ (for $x < 0$ the other root is the relevant one and yields the same limit), we continue computing...}  \implies && \theta &= \frac{-2x + \sqrt{4x^2 + 4hx + 4h^2/3}}{2h} \\  \implies && \theta &= \frac{\sqrt{x^2 + xh + h^2/3} - x}{h} \\  \implies && \theta &= \frac{ \left( \sqrt{x^2 + xh + h^2/3} - x \right) \left( \sqrt{x^2 + xh + h^2/3} + x \right)}{h \left( \sqrt{x^2 + hx + h^2/3} + x \right)} \\  \implies && \theta &= \frac{x^2 + xh + \frac{h^2}{3} - x^2}{h \left( \sqrt{x^2 + hx + h^2/3} + x \right)} \\  \implies && \theta &= \frac{x + \frac{h}{3}}{x + \sqrt{x^2 + xh + h^2/3}} \\  \implies && \lim_{h \to 0} \theta &= \frac{1}{2}. \end{align*}
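
As a numerical illustration of part (2): for f(x) = x^3 and a fixed x > 0, the closed form for \theta derived above satisfies the defining identity and tends to 1/2 as h shrinks (a minimal sketch; the sample values of h are arbitrary):

    import math

    def theta(x, h):
        """Closed form for f(x) = x^3 with x > 0, as derived above."""
        return (x + h / 3) / (x + math.sqrt(x * x + x * h + h * h / 3))

    x = 2.0
    for h in [1.0, 0.1, 0.01, 0.001, 1e-6]:
        th = theta(x, h)
        lhs = (x + h) ** 3                        # f(x + h)
        rhs = x ** 3 + h * 3 * (x + th * h) ** 2  # f(x) + h f'(x + theta h)
        print(f"h = {h:10.6f}   theta = {th:.6f}   identity error = {abs(lhs - rhs):.2e}")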

Show that x^2 = x sin x + cos x for exactly two real numbers x

Consider the equation

    \[ x^2 = x \sin x + \cos x. \]

Show that there are exactly two values of x \in \mathbb{R} for which the equation is satisfied.


Proof. Let f(x) = x^2 - x \sin x - \cos x. (We then want to find the zeros of this function since these are exactly the points at which x^2 = x \sin x + \cos x.) Then,

    \[ f'(x) = 2x - \sin x - x \cos x + \sin x = x(2 - \cos x). \]

Since 2 - \cos x \neq 0 for any x (since \cos x \leq 1), we have

    \[ f'(x) = 0 \quad \iff \quad x = 0. \]

Then, f is continuous and differentiable everywhere, so we may apply Rolle’s theorem on any interval. By Rolle’s theorem we know f has at most two zeros (if there were three or more, say x_1, x_2, and x_3, then there would be distinct numbers c_1 and c_2 with x_1 < c_1 < x_2 < c_2 < x_3 such that f'(c_1) = f'(c_2) = 0; but we know there is only one c, namely c = 0, with f'(c) = 0).
Furthermore, f has at least two zeros since f(-\pi) = \pi^2 + 1 > 0, f(0) = -1 < 0, and f(\pi) = \pi^2 + 1 > 0; by Bolzano’s theorem there is a zero between each pair of consecutive points. Since the number of zeros of f is at most two and at least two, it must be exactly two. \qquad \blacksquare
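
A minimal bisection sketch locating the two roots numerically, one in (-\pi, 0) and one in (0, \pi) (the tolerance is an arbitrary choice):

    import math

    def f(x):
        return x * x - x * math.sin(x) - math.cos(x)

    def bisect(lo, hi, tol=1e-12):
        """Simple bisection; assumes f(lo) and f(hi) have opposite signs."""
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2

    r1 = bisect(-math.pi, 0.0)
    r2 = bisect(0.0, math.pi)
    print("roots:", r1, r2)  # symmetric about 0 since f is an even function
    print("residuals:", f(r1), f(r2))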

Explain why the absence of a zero does not violate Rolle’s theorem

Consider the function

    \[ f(x) = 1 - x^{\frac{2}{3}}. \]

Show that f(1) = 0 and f(-1) = 0, but that the derivative f'(x) \neq 0 for all x \in [-1,1]. Explain why this does not violate Rolle’s theorem.


Proof. First, we show that f(1) = 0 and f(-1) = 0 by a direct computation:

    \begin{align*}    f(1) = 1 - (1)^{\frac{2}{3}} = 1 - 1 &= 0, \\   f(-1) = 1 - (-1)^{\frac{2}{3}} = 1 - \left( (-1)^2 \right)^{\frac{1}{3}} = 1 - 1 &= 0. \end{align*}

Then, we compute the derivative,

    \[ f(x) = 1 - x^{\frac{2}{3}} \quad \implies \quad f'(x) = -\frac{2}{3} x^{-\frac{1}{3}}. \]

To show f'(x) \neq 0 for any x \in [-1,1] we consider three cases:

  • If x < 0 then x^{-\frac{1}{3}} < 0 implies f'(x) > 0 (since -\frac{2}{3} times a negative is positive).
  • If x > 0 then x^{-\frac{1}{3}} > 0 implies f'(x) < 0 (since -\frac{2}{3} times a positive is then negative).
  • If x = 0, then f'(x) is undefined (since x^{-\frac{1}{3}} = \frac{1}{x^{1/3}}).

Thus, f'(x) \neq 0 for any x \in [-1,1]. \qquad \blacksquare

This is not a violation of Rolle’s theorem since the theorem requires that f(x) be differentiable for all x on the open interval (-1,1). Since f'(x) is not defined at x = 0, we have f(x) is not differentiable on the whole interval. Hence, Rolle’s theorem does not apply.
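
A brief numerical look at the behavior of f' here (a sketch only; the sample points are arbitrary): f(\pm 1) = 0, yet f' is positive for x < 0, negative for x > 0, and undefined at x = 0, so it never vanishes on [-1,1].

    def f(x):
        # 1 - x^(2/3), using the real cube root so that negative x is handled
        return 1 - (abs(x) ** (1 / 3)) ** 2

    def fprime(x):
        # -(2/3) x^(-1/3) with the real cube root; undefined at x = 0
        cbrt = abs(x) ** (1 / 3) * (1 if x > 0 else -1)
        return -2 / (3 * cbrt)

    print("f(-1) =", f(-1), "  f(1) =", f(1))
    for x in [-1.0, -0.5, -0.01, 0.01, 0.5, 1.0]:
        print(f"x = {x:6.2f}   f'(x) = {fprime(x): .4f}")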

Show a function satisfies the hypotheses of the mean value theorem

Let

    \[ f(x) = \begin{dcases} \frac{3-x^2}{2} & \text{if } x \leq 1, \\ \frac{1}{x} & \text{if } x \geq 1. \end{dcases} \]

  1. Draw the graph of f for x \in [0,2].
  2. Show that the hypotheses of the mean value theorem are satisfied on [0,2] and find the mean values the theorem provides.

  1. The sketch is as follows: the graph consists of the parabolic arc y = \frac{3-x^2}{2} on [0,1], joined at the point (1,1) to the arc of y = \frac{1}{x} on [1,2].

  2. Since \frac{3-x^2}{2} and \frac{1}{x} are continuous on [0,1] and [1,2], respectively, and are differentiable on (0,1) and (1,2), the only point at which this piecewise function might fail to be continuous or differentiable is x = 1.

    At x = 1 we have

        \[ \lim_{x \to 1^-} \frac{3-x^2}{2} = 1, \qquad \lim_{x \to 1^+} \frac{1}{x} = 1. \]

    Thus, the left and right-hand limits are equal, so the limit exists and equals the function value f(1) = 1. Therefore, f is continuous at x = 1. Finally, we must check that the derivative exists at x = 1. The derivative of \frac{3-x^2}{2} is -x and the derivative of \frac{1}{x} is -\frac{1}{x^2}; both equal -1 at x = 1, so the one-sided derivatives of f agree there. Hence, f'(1) exists.

    Now, we have met the conditions of the mean-value theorem, so we can apply the theorem to conclude,

        \begin{align*}  f(2) - f(0) = f'(c)\cdot 2 &\implies \frac{1}{2} - \frac{3}{2} = 2 f'(c) \\  &\implies f'(c) = -\frac{1}{2}. \end{align*}

    Further, we know

        \[ f'(x) = \begin{dcases} -x & \text{if } x < 1 \\  -\frac{1}{x^2} & \text{if } x \geq 1. \end{dcases} \]

    So,

        \[ f'(c) = -c = -\frac{1}{2} \quad  \implies \quad c = \frac{1}{2} \]

    and

        \[ f'(c) = -\frac{1}{c^2} = -\frac{1}{2} \quad \implies \quad c = \sqrt{2} \qquad (\text{since } -\sqrt{2} \notin [1,2].) \]
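
Numerically checking the two mean values (a minimal sketch; the difference step is arbitrary): both c = 1/2 and c = \sqrt{2} should satisfy f'(c) = (f(2) - f(0))/2 = -1/2.

    import math

    def f(x):
        return (3 - x * x) / 2 if x <= 1 else 1 / x

    def fprime(x, h=1e-7):
        """Central-difference approximation of f'."""
        return (f(x + h) - f(x - h)) / (2 * h)

    slope = (f(2) - f(0)) / (2 - 0)
    print("secant slope:", slope)  # expect -0.5
    for c in [0.5, math.sqrt(2)]:
        print(f"c = {c:.6f}   f'(c) ~ {fprime(c):.6f}")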

Prove a property of the integral of the product of continuous functions

Let f be a continuous function on the interval [a,b] and assume

    \[ \int_a^b f(x) g(x) \, dx = 0 \]

for every function g which is continuous on the interval [a,b]. Prove that f(x) = 0 for all x \in [a,b].


Proof. Since \int_a^b f(x) g(x) \, dx = 0 must hold for every function g that is continuous on [a,b], it must hold for f itself (since f is continuous on [a,b] by hypothesis). Therefore we must have,

    \[ \int_a^b f(x) f(x) \, dx = \int_a^b (f(x))^2 \, dx = 0. \]

However, (f(x))^2 \geq 0 for all x \in [a,b]. We know from the previous exercise (Section 3.20, #7) that a non-negative function whose integral is zero on an interval must be zero at every point at which it is continuous. By hypothesis f is continuous at every point of [a,b]; hence, f^2 is also continuous at every point of [a,b] (since the product of continuous functions is continuous). Therefore,

    \[ (f(x))^2 = 0 \qquad \text{for all } x \in [a,b]. \]

This implies

    \[ f(x) = 0 \qquad \text{for all } x \in [a,b]. \qquad \blacksquare\]
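
As an informal illustration of the key step (a sketch with an arbitrarily chosen f): for a continuous f that is not identically zero, the choice g = f makes the integral strictly positive, so such an f cannot satisfy the hypothesis.

    import math

    def integral(g, a, b, n=100000):
        """Composite midpoint approximation of the integral of g over [a, b]."""
        h = (b - a) / n
        return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

    # A sample nonzero continuous function on [0, 1].
    f = lambda x: math.sin(3 * x) * (x - 0.5)
    print("integral of f^2 over [0, 1]:", integral(lambda x: f(x) ** 2, 0, 1))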