Relative Maximum

DEFINITION OF RELATIVE MAXIMUM. A function $f$ is said to have a relative maximum at a point $c$ if there is some $\epsilon >0$ such that
$$f(x)\le f(c)$$
for all $x$ satisfying $|x-c|<\epsilon .$

The concept of relative minimum is similarly defined by reversing the inequality.

DEFINITION OF EXTREMUM. A number which is either a relative maximum or a relative minimum of a function $f$ is called an extreme value or an extremum of $f.$

THEOREM (VANISHING OF THE DERIVATIVE AT AN INTERIOR EXTREMUM). Let $f$ be defined on an open interval $I$, and assume that $f$ has a relative maximum or a relative minimum at an interior point $c$ of $I$. If the derivative $f^{\prime }(c)$ exists, then $f^{\prime }(c)=0.$

Proof. Define a function $Q$ on $I$ as follows:
$$Q(x)=\begin{cases}\dfrac{f(x)-f(c)}{x-c} & \text{if }x\ne c,\\ f^{\prime }(c) & \text{if }x=c.\end{cases}$$
Since $f^{\prime }(c)$ exists, $Q(x)\rightarrow Q(c)$ as $x\rightarrow c$, so $Q$ is continuous at $c.$ We wish to prove that $Q(c)=0$. We shall do this by showing that each of the inequalities $Q(c)>0$ and $Q(c)<0$ leads to a contradiction.

Assume $Q(c)>0$. By the sign-preserving property of continuous functions, there is an interval about $c$ in which $Q(x)$ is positive. Therefore the numerator of the quotient $Q(x)$ has the same sign as the denominator for all $x\ne c$ in this interval. In other words, $f(x)>f(c)$ when $x>c$, and $f(x)<f(c)$ when $x<c$. This contradicts the assumption that $f$ has an extremum at $c$. Hence, the inequality $Q(c)>0$ is impossible. A similar argument shows that we cannot have $Q(c)<0$. Therefore $Q(c)=0$, as asserted. Since $Q(c)=f^{\prime }(c)$, this proves the theorem.
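
To illustrate the theorem, one convenient example is $f(x)=1-x^{2}$ on the open interval $I=(-1,1)$. Since $f(x)=1-x^{2}\le 1=f(0)$ for every $x$, the function has a relative maximum at the interior point $c=0$, and indeed
$$f^{\prime }(x)=-2x,\qquad f^{\prime }(0)=0.$$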

Rolle's Theorem

Let $f$ be a function which is continuous everywhere on a closed interval $[a,b]$ and has a derivative at each point of the open interval $(a,b)$. Also, assume that
$$f(a)=f(b).$$
Then there is at least one point $c$ in the open interval $(a,b)$ such that $f^{\prime }(c)=0$.

Proof. We assume that $f^{\prime }(x)\ne 0$ for every $x$ in the open interval $(a,b),$ and we arrive at a contradiction as follows: By the extreme-value theorem for continuous functions, $f$ must take on its absolute maximum $M$ and its absolute minimum $m$ somewhere in the closed interval $[a,b].$ The previous theorem tells us that neither extreme value can be taken at any interior point (otherwise the derivative would vanish there). Hence, both extreme values are taken on at the endpoints $a$ and $b$. But since $f(a)=f(b),$ this means that $m=M$, and hence $f$ is constant on $[a,b]$. This contradicts the fact that $f^{\prime }(x)\ne 0$ for all $x$ in $(a,b).$ It follows that $f^{\prime }(c)=0$ for at least one $c$ satisfying $a<c<b,$ which proves the theorem.
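
For instance, take $f(x)=x^{2}-x$ on $[0,1]$. The function is continuous on $[0,1]$, differentiable on $(0,1)$, and $f(0)=f(1)=0$, so the theorem guarantees at least one $c$ in $(0,1)$ with $f^{\prime }(c)=0$; here
$$f^{\prime }(c)=2c-1=0\qquad \text{gives}\qquad c=\tfrac{1}{2}.$$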

Mean-Value Theorem

MEAN-VALUE THEOREM FOR DERIVATIVES. Assume that $f$ is continuous everywhere on a closed interval $[a,b]$ and has a derivative at each point of the open interval $(a,b).$ Then there is at least one interior point $c$ of $(a,b)$ for which
$$f(b)-f(a)=f^{\prime }(c)(b-a).$$

Proof. To apply Rolle's theorem we need a function which has equal values at the endpoints $a$ and $b$. To construct such a function, we modify $f$ as follows. Let
$$h(x)=f(x)(b-a)-x[f(b)-f(a)].$$

Then $h(a)=h(b)=f(a)b-f(b)a.$ Also, $h$ is continuous on $[a,b]$ and has a derivative in the open interval $(a,b).$ Applying Rolle's theorem to $h$, we find that $h^{\prime }(c)=0$ for some $c$ in $(a,b).$ But
$$h^{\prime }(x)=f^{\prime }(x)(b-a)-[f(b)-f(a)].$$
When $x=c$, the equation $h^{\prime }(c)=0$ becomes $f(b)-f(a)=f^{\prime }(c)(b-a)$, which is the desired result.
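
As an illustration, take $f(x)=x^{2}$ on $[a,b]$. Then $f^{\prime }(c)=2c$ and $f(b)-f(a)=b^{2}-a^{2}=(b+a)(b-a)$, so the mean-value formula forces
$$2c=a+b,\qquad \text{that is,}\qquad c=\frac{a+b}{2},$$
the midpoint of the interval.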

Cauchy's Mean-Value Theorem

Theorem. Let $f$ and $g$ be two functions continuous on a closed interval $[a,b]$ and having derivatives in the open interval $(a,b).$ Then, for some $c$ in $(a,b)$, we have
$$f^{\prime }(c)[g(b)-g(a)]=g^{\prime }(c)[f(b)-f(a)].$$
Proof. We let
$$h(x)=f(x)[g(b)-g(a)]-g(x)[f(b)-f(a)].$$
Then
$$h(a)=h(b)=f(a)g(b)-g(a)f(b).$$
Applying Rolle's theorem to $h$, we find that $h^{\prime }(c)=0$ for some $c$ in $(a,b)$. Computing $h^{\prime }(c)$ from the formula defining $h$, we obtain Cauchy's mean-value formula. Note that the mean-value theorem is the special case obtained by taking $g(x)=x.$
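
For instance, with $f(x)=x^{2}$ and $g(x)=x^{3}$ on $[0,1]$, the formula reads $f^{\prime }(c)[g(1)-g(0)]=g^{\prime }(c)[f(1)-f(0)]$, that is,
$$2c\cdot 1=3c^{2}\cdot 1,$$
which is satisfied by $c=\tfrac{2}{3}$ in $(0,1)$.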

Geometric Properties of Functions

The mean-value theorem may be used to deduce properties of a function from a knowledge of the algebraic sign of its derivative. This is illustrated by the following theorem.

Theorem. Let $f$ be a function which is continuous on a closed interval $[a,b]$ and assume $f$ has a derivative at each point of the open interval $(a,b).$ Then we have:

(a) If $f^{\prime }(x)>0$ for every $x$ in $(a,b)$, $f$ is strictly increasing on $[a,b]$;

(b) If $f^{\prime }(x)<0$ for every $x$ in $(a,b)$, $f$ is strictly decreasing on $[a,b]$;

(c) If $f^{\prime }(x)=0$ for every $x$ in $(a,b)$, $f$ is constant throughout $[a,b].$

Proof. To prove (a) we must show that $f(x)<f(y)$ whenever $a\le x<y\le b.$ Therefore, suppose $x<y$ and apply the mean-value theorem to the closed subinterval $[x,y].$ We obtain
$$f(y)-f(x)=f^{\prime }(c)(y-x),\qquad \text{where }x<c<y.$$
Since both $f^{\prime }(c)$ and $y-x$ are positive, so is $f(y)-f(x)$, and this means $f(x)<f(y)$, as asserted. This proves (a), and the proof of (b) is similar. To prove (c), we again use the mean-value theorem:
$$f(y)-f(x)=f^{\prime }(c)(y-x)=0,\qquad \text{since }f^{\prime }(c)=0.$$
Hence $f(x)=f(y)$ for every $x,y$ in $[a,b]$, so $f$ is constant throughout $[a,b].$
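
As a simple application of (a), consider $f(x)=x^{3}+x$ on any closed interval $[a,b]$. Since
$$f^{\prime }(x)=3x^{2}+1>0\qquad \text{for every }x,$$
the theorem shows that $f$ is strictly increasing on $[a,b]$.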

Theorem. Assume $f$ is continuous on a closed interval $[a,b]$ and assume that the derivative $f^{\prime }$ exists everywhere in the open interval $(a,b),$ except possibly at a point $c.$

(a) If $f^{\prime }(x)$ is positive for all $x<c$ and negative for all $x>c$, then $f$ has a relative maximum at $c.$

(b) If, on the other hand, $f^{\prime }(x)$ is negative for all $x<c$ and positive for all $x>c$, then $f$ has a relative minimum at $c.$

Proof. In case (a), the previous theorem tells us that $f$ is strictly increasing on $[a,c]$ and strictly decreasing on $[c,b]$. Hence $f(x)<f(c)$ for all $x\ne c$ in $(a,b)$, so $f$ has a relative maximum at $c.$ This proves (a) and the proof of (b) is entirely analogous.
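
Note that $c$ need not be a point where $f^{\prime }$ exists. For example, $f(x)=-|x|$ on $[-1,1]$ satisfies $f^{\prime }(x)=1>0$ for $x<0$ and $f^{\prime }(x)=-1<0$ for $x>0$, so case (a) gives a relative maximum at $c=0$, even though $f^{\prime }(0)$ does not exist.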

Second-derivative Test for Extrema

If a function $f$ is continuous on a closed interval $[a,b]$, the extreme-value theorem tells us that it has an absolute maximum and an absolute minimum somewhere in $[a,b]$. If $f$ has a derivative at each interior point, then the only places where extrema can occur are:

1) at the endpoints $a$ and $b$;

2) at those interior points $x$ where $f^{\prime }(x)=0$.

Points of type (2) are often called critical points of $f$. To decide whether there is a maximum or minimum (or neither) at a critical point $c$, we need more information about $f.$ Usually the behavior of $f$ at a critical point can be determined from the algebraic sign of the derivative near $c$. The next theorem shows that a study of the sign of the second derivative near $c$ can also be helpful.
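
The qualification "or neither" is needed. For example, $f(x)=x^{3}$ has the critical point $c=0$, since $f^{\prime }(x)=3x^{2}$ vanishes there, yet $f(x)<f(0)$ for $x<0$ and $f(x)>f(0)$ for $x>0$, so $f$ has neither a relative maximum nor a relative minimum at $0$.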

Second-derivative Test

THEOREM. SECOND-DERIVATIVE TEST FOR AN EXTREMUM AT A CRITICAL POINT. Let $c$ be a critical point of $f$ in an open interval $(a,b)$; that is, assume $a<c<b$ and $f^{\prime }(c)=0$. Assume also that the second derivative $f^{\prime \prime }$ exists in $(a,b).$ Then we have the following:

(a) If $f^{\prime \prime }$ is negative in $(a,b)$, $f$ has a relative maximum at $c$.

(b) If $f^{\prime \prime }$ is positive in $(a,b)$, $f$ has a relative minimum at $c$.

Proof. Consider case (a), in which $f^{\prime \prime }<0$ in $(a,b).$ Since $f^{\prime \prime }$ is negative in $(a,b)$, the function $f^{\prime }$ is strictly decreasing in $(a,b)$. But $f^{\prime }(c)=0$, so $f^{\prime }$ changes its sign from positive to negative at $c$. Hence, by the preceding theorem, $f$ has a relative maximum at $c$. The proof in case (b) is entirely analogous.
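
For example, take $f(x)=x^{2}-2x$ on the interval $(0,2)$. The only critical point is $c=1$, since $f^{\prime }(x)=2x-2$, and $f^{\prime \prime }(x)=2>0$ throughout $(0,2)$, so case (b) gives a relative minimum at $c=1$; indeed $f(x)=(x-1)^{2}-1\ge f(1)$ for all $x$.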

Convex Function

DEFINITION OF A CONVEX FUNCTION. A function $g$ is said to be convex on an interval $[a,b]$ if, for all $x$ and $y$ in $[a,b]$ and for every $\alpha $ satisfying $0<\alpha <1$, we have
$$g(z)\le \alpha g(y)+(1-\alpha )g(x),$$
where
$$z=\alpha y+(1-\alpha )x.$$
We say $g$ is concave on $[a,b]$ if the reverse inequality holds,
$$g(z)\ge \alpha g(y)+(1-\alpha )g(x),$$
where
$$z=\alpha y+(1-\alpha )x.$$
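
As a direct check of the definition, consider $g(x)=x^{2}$ on any interval $[a,b]$. With $z=\alpha y+(1-\alpha )x$ we have
$$\alpha g(y)+(1-\alpha )g(x)-g(z)=\alpha y^{2}+(1-\alpha )x^{2}-[\alpha y+(1-\alpha )x]^{2}=\alpha (1-\alpha )(y-x)^{2}\ge 0,$$
so the convexity inequality holds for every $\alpha $ in $(0,1)$, and $g$ is convex.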

The sign of the second derivative also governs the convexity or the concavity of $f$. The next theorem shows that $f$ is convex on intervals where $f^{\prime \prime }$ is nonnegative. It suffices to discuss only the convex case, because if $f$ is convex, then $-f$ is concave.

THEOREM. DERIVATIVE TEST FOR CONVEXITY. Assume $f$ is continuous on $[a,b]$ and has a derivative in the open interval $(a,b)$. If $f^{\prime }$ is increasing on $(a,b)$, then $f$ is convex on $[a,b]$. In particular, $f$ is convex if $f^{\prime \prime }$ exists and is nonnegative in $(a,b).$

Proof. Take $x<y$ in $[a,b]$ and let $z=\alpha y+(1-\alpha )x$, where $0<\alpha <1.$ We wish to prove that $f(z)\le \alpha f(y)+(1-\alpha )f(x).$ Since $f(z)=\alpha f(z)+(1-\alpha )f(z)$, this is the same as proving that
$$(1-\alpha )[f(z)-f(x)]\le \alpha [f(y)-f(z)].$$
By the mean-value theorem (applied twice), there exist points $c$ and $d$ satisfying $x<c<z$ and $z<d<y$ such that
$$f(z)-f(x)=f^{\prime }(c)(z-x),\qquad f(y)-f(z)=f^{\prime }(d)(y-z).$$
Since $f^{\prime }$ is increasing, we have $f^{\prime }(c)\le f^{\prime }(d)$. Also, we have $(1-\alpha )(z-x)=\alpha (y-z)$, so we may write
$$(1-\alpha )[f(z)-f(x)]=(1-\alpha )f^{\prime }(c)(z-x)\le (1-\alpha )f^{\prime }(d)(z-x)=\alpha f^{\prime }(d)(y-z)=\alpha [f(y)-f(z)],$$
which proves the required inequality for convexity.
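
For example, $f(x)=e^{x}$ satisfies
$$f^{\prime \prime }(x)=e^{x}>0\qquad \text{for every }x,$$
so $f^{\prime }$ is increasing and the theorem shows that $e^{x}$ is convex on every closed interval $[a,b]$.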
