Prove that if the integral of a function is zero, then it is zero for at least one point

Let f be a continuous function on the interval [a,b], where b > a. Prove that if

    \[ \int_a^b f(x) \, dx = 0, \]

then f(c) = 0 for at least one c \in [a,b].


Proof. Since f is continuous on the interval [a,b], we may apply the mean value theorem for integrals (Theorem 3.15 in Apostol) to conclude

    \[ \int_a^b f(x) \, dx = f(c) (b-a) \qquad \text{for some } c \in [a,b]. \]

So,

    \[ \int_a^b f(x) \, dx = 0 \quad \implies \quad f(c) (b-a) = 0. \]

But since b > a, we know (b-a) \neq 0; hence, we must have f(c) = 0 for some c \in [a,b]. \qquad \blacksquare
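
As a quick sanity check (an illustration, not part of the original problem), take f(x) = x - \frac{1}{2} on [0,1]. Then

    \[ \int_0^1 \left( x - \frac{1}{2} \right) \, dx = \left[ \frac{x^2}{2} - \frac{x}{2} \right]_0^1 = \frac{1}{2} - \frac{1}{2} = 0, \]

and indeed f(c) = 0 at c = \frac{1}{2} \in [0,1], exactly as the theorem guarantees.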
